Ensemble-Based Gradient Inference for Particle Methods in Optimization and Sampling

Type of content
Journal Article
Date
2023
Authors
Schillings, Claudia
Totzeck, Claudia
Wacker, Philipp
Abstract

We propose an approach based on function evaluations and Bayesian inference to extract higher-order differential information about objective functions from a given ensemble of particles. Pointwise evaluations of a potential V over an ensemble contain implicit information about first- or higher-order derivatives, which can be made explicit with little computational effort (ensemble-based gradient inference, EGI). We suggest using this information to improve established ensemble-based numerical methods for optimization and sampling, such as consensus-based optimization and Langevin-based samplers. Numerical studies indicate that the augmented algorithms are often superior to their gradient-free variants; in particular, the augmented methods help the ensembles to escape their initial domain, to explore multimodal and non-Gaussian settings, and to speed up the collapse at the end of the optimization dynamics. The code for the numerical examples in this manuscript can be found in the paper's GitHub repository.
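To illustrate the core idea, a minimal sketch follows: pointwise evaluations of a potential over an ensemble implicitly encode gradient information, which can be recovered by fitting a local linear model via least squares. This is a simplified stand-in for the paper's Bayesian inference procedure, not the authors' implementation; the function name and the quadratic test potential are illustrative assumptions.

```python
import numpy as np

def ensemble_gradient_estimate(X, V):
    """Estimate the gradient of a potential at the ensemble mean by
    least-squares fitting a local linear model to pointwise evaluations.

    X: (J, d) array of particle positions.
    V: (J,) array of potential evaluations V(x_j).
    Returns: (d,) estimated gradient at the ensemble mean.
    """
    dX = X - X.mean(axis=0)          # center positions at the ensemble mean
    dV = V - V.mean()                # center the evaluations
    # Solve the least-squares problem dX @ g ≈ dV for the slope g,
    # which approximates the gradient of V at the ensemble mean.
    g, *_ = np.linalg.lstsq(dX, dV, rcond=None)
    return g

# Illustrative check with a quadratic potential V(x) = 0.5 * ||x||^2,
# whose exact gradient at a point m is m itself.
rng = np.random.default_rng(0)
X = rng.normal(loc=[1.0, -2.0], scale=0.1, size=(50, 2))
V = 0.5 * (X**2).sum(axis=1)
g = ensemble_gradient_estimate(X, V)
# g approximates the gradient at the ensemble mean, roughly [1.0, -2.0].
```

A derivative-free optimization or sampling step could then use `g` as a drift term, which is the sense in which the paper augments consensus-based and Langevin-type dynamics with inferred gradients.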

Citation
Schillings C, Totzeck C, Wacker P (2023). Ensemble-Based Gradient Inference for Particle Methods in Optimization and Sampling. 11(3), 757-787.
Keywords
optimization, sampling, Langevin dynamics, ensemble methods
ANZSRC fields of research
49 - Mathematical sciences::4903 - Numerical and computational mathematics::490304 - Optimisation
49 - Mathematical sciences::4905 - Statistics
Rights
All rights reserved unless otherwise stated