Distance-Aware Error for Spline Networks: A Bottom-Up Approach to Uncertainty

arXiv:2501.04757v2 (replace-cross)

Abstract: We develop a new class of distance-aware error bounds that tightly characterize the approximation error of spline neural networks. Our bottom-up approach analyzes the error bound of each neuron (a spline) and then extends it to the full network. We begin with error bounds for Newton's polynomial, generalize them to arbitrary splines under higher-order Lipschitz continuity, and extend the result to function compositions, the core of deep networks such as Kolmogorov-Arnold networks. By analyzing error propagation through composed spline layers, we obtain error bounds for the entire network. These bounds are deterministic, do not rely on sampling or probabilistic assumptions, and hold under mild regularity conditions. We evaluate our method on object shape estimation from sparse laser scans and safe navigation in unstructured environments. Our method is faster than Gaussian-process and Monte Carlo approaches, and our bounds reliably enclose the true error. We also develop a metric for the distance-awareness of an uncertainty estimator and show that distance-aware uncertainty for Kolmogorov networks (DAREK) is distance-aware in more regions than the baselines.
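The bottom-up construction starts from the classical error bound for Newton's interpolating polynomial: for nodes x_0, ..., x_n and a function f whose (n+1)-th derivative is bounded by M on the interval, |f(x) - p(x)| <= M/(n+1)! * prod_i |x - x_i|. The bound is "distance-aware" because it grows with the product of distances from x to the interpolation nodes. A minimal sketch of that bound (the node placement, target function, and derivative bound M here are illustrative choices, not taken from the paper):

```python
import math

def newton_interp(xs, ys, x):
    """Evaluate the Newton-form interpolating polynomial via divided differences."""
    n = len(xs)
    coef = list(ys)
    # In-place divided-difference table: coef[i] becomes f[x_{i-j}, ..., x_i].
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    # Horner-style evaluation of the Newton form.
    result = coef[-1]
    for i in range(n - 2, -1, -1):
        result = result * (x - xs[i]) + coef[i]
    return result

def error_bound(xs, x, deriv_bound):
    """Distance-aware bound: M/(n+1)! times the product of distances to the nodes."""
    prod = 1.0
    for xi in xs:
        prod *= abs(x - xi)
    return deriv_bound / math.factorial(len(xs)) * prod

# Interpolate sin on [0, pi] with 4 nodes; |sin''''| <= 1, so M = 1.
xs = [0.0, math.pi / 3, 2 * math.pi / 3, math.pi]
ys = [math.sin(v) for v in xs]
for x in [0.5, 1.5, 2.5]:
    err = abs(math.sin(x) - newton_interp(xs, ys, x))
    bnd = error_bound(xs, x, 1.0)
    assert err <= bnd  # the deterministic bound encloses the true error
```

Note how the bound vanishes at the nodes themselves and widens as x moves away from the data, which is the behavior the abstract's propagation analysis extends from a single spline neuron to composed spline layers.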
