Aggregating a high-resolution (300m × 300m) global raster (raster::aggregate and velox cannot handle this resolution well)


I'm trying to aggregate a raster r of global extent from ~300m × 300m resolution (10 arc-seconds, 7.4GB) to ~10km resolution (0.083333 decimal degrees), i.e. by a factor of 30. Neither the aggregate function from the raster package nor the one from the velox package seems able to handle such a large dataset. I very much welcome recommendations!

library(raster)

# sample raster
r <- raster(extent(-180, 180, -90, 90))
res(r) <- c(0.5/6/30, 0.5/6/30)
r <- setValues(r, runif(ncell(r))) # Error: cannot allocate vector of size 62.6 Gb

# velox example
devtools::install_github('hunzikp/velox')
library(velox)
vx <- velox(r) # the process aborts on Linux
vx$aggregate(factor = 30, aggtype = 'mean')

# raster example
r_agg <- aggregate(r, fact = 30)

2 Answers

Robert Hijmans (accepted answer)

You say that raster cannot handle a large raster like that, but that is not true. The problem is that you are trying to create a very large dataset in memory, requiring more memory than your computer has available. You can use the init function instead. I show that below, but with a smaller extent rather than a global 300 m raster, so the example runs a bit faster.

library(raster)
# small extent (80-90N) so the example runs quickly
r <- raster(ymn = 80, res = 0.5/6/30)
r <- init(r, "col") # fill cells with their column number; processed in chunks, not all in RAM
r_agg <- aggregate(r, fact = 30)

You get better mileage with terra:

library(terra)
rr <- rast(ymin = 80, res = 0.5/6/30)
rr <- init(rr, "col")
rr_agg <- aggregate(rr, fact = 30)
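As a quick sanity check on the factor (plain arithmetic from the question's numbers), aggregating the 10 arc-second grid by 30 lands exactly on the 0.083333-degree target:

```r
fine <- 0.5/6/30    # 10 arc-seconds in decimal degrees (~300 m at the equator)
coarse <- fine * 30 # the aggregation factor from the question
stopifnot(isTRUE(all.equal(coarse, 0.083333, tolerance = 1e-5)))
```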
a_PhD_researcher

In addition to Robert's suggestion, I'd resample the aggregated raster against a template raster so the extent and CRS line up exactly.

library(terra)
library(magrittr) # for %>%

r <- terra::rast("your_rast.tif") %>%
  aggregate(., fact = 30) %>%
  resample(., template_rast, filename = "sth.tif",
           wopt = list(gdal = c("COMPRESS=LZW", "TFW=YES", "BIGTIFF=YES"),
                       tempdir = "somewhere_you_have_a_lot_of_space",
                       todisk = TRUE))

Those wopt options might help you a lot with large rasters.
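As a side note, similar disk-friendly settings can also be applied globally with terra's terraOptions(), so every subsequent terra call uses them. A minimal sketch (the tempdir path is a placeholder, as in the example above):

```r
library(terra)
# placeholder path; point this at a disk with plenty of free space
terraOptions(tempdir = "somewhere_you_have_a_lot_of_space",
             memfrac = 0.5,  # fraction of available RAM terra may use
             todisk  = TRUE) # force processing via temporary files on disk
```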