The lay abstract featured today (for Normalizing Basis Functions: Approximate Stationary Models for Large Spatial Data by Antony Sikorski, Daniel McKenzie and Douglas Nychka) is from Stat, with the full article now available to read here.
How to Cite
Sikorski, A., McKenzie, D. and Nychka, D. (2024), Normalizing Basis Functions: Approximate Stationary Models for Large Spatial Data. Stat, 13: e70015. https://doi.org/10.1002/sta4.70015
The analysis of large spatial data plays a key role in understanding phenomena like climate change, ecosystem dynamics, and public health trends. Traditional models often use a Gaussian Process (GP) to capture spatial relationships within the data. It is well known, however, that fitting a GP becomes computationally infeasible for large data volumes, necessitating approximate methods. A powerful class of methods approximates the GP as a sum of basis functions with random coefficients. Although these basis function methods are computationally efficient, they can introduce unwanted patterns and edge effects into the predicted surfaces, artifacts that are not physically representative of the data. One way to mitigate this issue is to “normalize” the basis functions, but the normalization step is itself computationally demanding. This research addresses the issue by introducing two fast and accurate algorithms for the normalization step. The practical value of these algorithms is showcased in a spatial analysis of a large dataset, where significant computational speedups are achieved. To make the methodology more accessible to practitioners, the algorithms have been incorporated into the LatticeKrig R package, and they can also be adapted to other basis function methods that operate on regular grids.
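For readers who want to see where normalization enters in practice, below is a minimal sketch of a LatticeKrig fit with basis function normalization switched on. The data are simulated purely for illustration, the specific argument values (nlevel, NC, alpha, a.wght) are arbitrary choices for this toy example, and how the paper's new fast normalization algorithms are selected internally may depend on the package version; normalize = TRUE in LKrigSetup is the documented switch for normalizing the basis functions.

```r
library(LatticeKrig)

# Simulated example data (purely illustrative): 500 locations on the
# unit square with a smooth signal plus observational noise.
set.seed(123)
s <- cbind(runif(500), runif(500))
y <- sin(4 * pi * s[, 1]) + cos(3 * pi * s[, 2]) + 0.1 * rnorm(500)

# Multiresolution basis setup. normalize = TRUE requests that the basis
# functions be normalized so that the marginal variance of the process
# is constant -- the step that suppresses the artificial patterns and
# edge effects described above.
alpha <- c(1, 0.5, 0.25)
LKinfo <- LKrigSetup(s,
                     nlevel    = 3,                  # three resolution levels
                     NC        = 16,                 # coarsest-level basis nodes along the longer side
                     alpha     = alpha / sum(alpha), # variance weights across levels
                     a.wght    = 4.5,                # lattice autoregression weight (> 4 in 2-D)
                     normalize = TRUE)

# Fit the model (the smoothing parameter is estimated by maximum
# likelihood) and plot the predicted surface.
fit <- LatticeKrig(s, y, LKinfo = LKinfo)
surface(fit)
```

Setting normalize = FALSE makes the fit cheaper but is exactly the situation in which the lattice-aligned patterns and edge effects can appear in the predicted surface; the algorithms introduced in the paper aim to make the normalized fit nearly as fast as the unnormalized one.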
More Details