Please note that the linked preprints may differ considerably from the published articles. If you have trouble accessing any of the articles, please contact me.
Preprints
[6p] Liam Llamazares-Elias, Jonas Latz, Finn Lindgren (2024): A parameterization of anisotropic Gaussian fields with penalized complexity priors. (arXiv)
[5p] Jonas Latz (2024): The random timestep Euler method and its continuous dynamics. (arXiv)
[4p] Zihan Ding, Kexin Jin, Jonas Latz, Chenguang Liu (2024): How to beat a Bayesian adversary. (arXiv)
[3p] Liam Llamazares-Elias, Samir Llamazares-Elias, Jonas Latz, Stefan Klus (2024): Data-driven approximation of Koopman operators and generators: Convergence rates and error bounds. (arXiv)
[2p] Kexin Jin, Jonas Latz, Chenguang Liu, Alessandro Scagliotti (2022): Losing momentum in continuous-time stochastic optimisation. (arXiv)
[1p] Yury Korolev, Jonas Latz, Carola-Bibiane Schönlieb (2022): Gaussian random fields on non-separable Banach spaces. (arXiv)
Refereed journal articles
[22] Jonas Weidner, Ivan Ezhov, Michal Balcerak, Marie-Christin Metz, Sergey Litvinov, Sebastian Kaltenbach, Leonhard Feiner, Laurin Lux, Florian Kofler, Jana Lipkova, Jonas Latz, Daniel Rueckert, Bjoern Menze, Benedikt Wiestler (2024): A Learnable Prior Improves Inverse Tumor Growth Modeling. IEEE Transactions on Medical Imaging (early access), doi. (arXiv)
[21] Alix Leroy, Benedict Leimkuhler, Jonas Latz, Desmond J. Higham (2024): Adaptive stepsize algorithms for Langevin dynamics. SIAM Journal on Scientific Computing 46(6), pp. A3574–A3598, doi. (arXiv)
[20] Jonas Latz, Doris Schneider, Philipp Wacker (2024): Nested Sampling for Uncertainty Quantification and Rare Event Estimation. SIAM Journal on Scientific Computing 46(5), pp. A3305–A3329, doi. (arXiv, .bib)
[19] Jonas Latz (2024): Correction to: analysis of stochastic gradient descent in continuous time. Statistics and Computing 34, 146, doi. (open access, .bib)
[18] Tamara G. Grossmann, Urszula Julia Komorowska, Jonas Latz, Carola-Bibiane Schönlieb (2024): Can Physics-Informed Neural Networks beat the Finite Element Method? IMA Journal of Applied Mathematics 89(1), pp. 143–174, doi. (open access, .bib, arXiv)
[17] Kexin Jin, Jonas Latz, Chenguang Liu, Carola-Bibiane Schönlieb (2023): A Continuous-time Stochastic Gradient Descent Method for Continuous Data. Journal of Machine Learning Research 24(274), pp. 1–48. (open access, .bib, arXiv)
[16] Jonas Latz (2023): Bayesian Inverse Problems are Usually Well-posed. SIAM Review 65(3), pp. 831–865, doi. (open access, .bib)
[15] Matei Hanu, Jonas Latz, Claudia Schillings (2023): Subsampling in ensemble Kalman inversion. Inverse Problems 39, 094002, doi. (open access, .bib, arXiv)
[14] Jeremy Budd, Yves van Gennip, Jonas Latz, Simone Parisotto, Carola-Bibiane Schönlieb (2023): Joint reconstruction-segmentation on graphs. SIAM Journal on Imaging Sciences 16(2), pp. 911–947, doi. (.bib, arXiv)
[13] Jonas Latz (2022): Gradient flows and randomised thresholding: sparse inversion and classification. Inverse Problems 38, 124006, doi. (open access, .bib, arXiv)
[12] Jonas Latz, Juan P. Madrigal-Cianci, Fabio Nobile, Raul Tempone (2021): Generalized Parallel Tempering on Bayesian Inverse Problems. Statistics and Computing 31, 67, doi. (open access, .bib, arXiv)
[11] Felipe Uribe, Iason Papaioannou, Jonas Latz, Wolfgang Betz, Elisabeth Ullmann, Daniel Straub (2021): Bayesian inference with subset simulation in varying dimensions applied to the Karhunen–Loève expansion. International Journal for Numerical Methods in Engineering 122, pp. 5100–5127, doi. (open access, .bib)
[10] Jonas Latz (2021): Analysis of stochastic gradient descent in continuous time. Statistics and Computing 31, 39, doi. (open access, .bib, arXiv)
[9] Fabian Wagner, Jonas Latz, Iason Papaioannou, Elisabeth Ullmann (2021): Error analysis for probabilities of rare events with approximate models. SIAM Journal on Numerical Analysis 59(4), pp. 1948–1975, doi. (.bib, arXiv)
[8] Jeremy Budd, Yves van Gennip, Jonas Latz (2021): Classification and image processing with a semi-discrete scheme for fidelity forced Allen–Cahn on graphs. GAMM Mitteilungen 44(1), e202100004, doi. (open access, .bib, arXiv)
[7] Daniel Kressner, Jonas Latz, Stefano Massei, Elisabeth Ullmann (2020): Certified and fast computations with shallow covariance kernels. Foundations of Data Science 2(4), pp. 487–512, doi. (.bib, arXiv)
[6] Fabian Wagner, Jonas Latz, Iason Papaioannou, Elisabeth Ullmann (2020): Multilevel Sequential Importance Sampling for Rare Event Estimation. SIAM Journal on Scientific Computing 42(4), pp. A2062–A2087, doi. (.bib, arXiv)
[5] Jonas Latz (2020): On the Well-posedness of Bayesian Inverse Problems. SIAM/ASA Journal on Uncertainty Quantification 8(1), pp. 451–482, doi. (.bib, arXiv)
[4] Ionuţ-Gabriel Farcaş, Jonas Latz, Elisabeth Ullmann, Tobias Neckel, Hans-Joachim Bungartz (2020): Multilevel Adaptive Sparse Leja Approximations for Bayesian Inverse Problems. SIAM Journal on Scientific Computing 42(1), pp. A424–A451, doi. (.bib, arXiv)
[3] Christian Kahle, Kei Fong Lam, Jonas Latz, Elisabeth Ullmann (2019): Bayesian parameter identification in Cahn–Hilliard models for biological growth. SIAM/ASA Journal on Uncertainty Quantification 7(2), pp. 526–552, doi. (.bib, arXiv)
[2] Jonas Latz, Marvin Eisenberger, Elisabeth Ullmann (2019): Fast Sampling of parameterised Gaussian random fields. Computer Methods in Applied Mechanics and Engineering 348, pp. 978–1012, doi. (.bib, arXiv)
[1] Jonas Latz, Iason Papaioannou, Elisabeth Ullmann (2018): Multilevel Sequential² Monte Carlo for Bayesian Inverse Problems. Journal of Computational Physics 368, pp. 154–178, doi. (.bib, arXiv)
Refereed book chapters and articles in conference proceedings
[2b] Kexin Jin, Chenguang Liu, Jonas Latz (2024): Subsampling error in Stochastic Gradient Langevin Diffusions. Proceedings of the 27th International Conference on Artificial Intelligence and Statistics (AISTATS), Proceedings of Machine Learning Research 238, pp. 1414–1422. (open access, .bib, arXiv)
[1b] Matthieu Bulté, Jonas Latz, Elisabeth Ullmann (2020): A practical example for the non-linear Bayesian filtering of model parameters. in M. D’Elia, M. Gunzburger, G. Rozza (ed.): Quantification of Uncertainty: Improving Efficiency and Technology – QUIET selected contributions, Lecture Notes in Computational Science and Engineering, Vol. 137, Springer, Cham, Chpt. 11, pp. 241–272, doi. (github, .bib, arXiv)
Theses
[3t] Jonas Latz (2019): Exploring and exploiting hierarchies in Bayesian inverse problems. Doctoral thesis, Technical University of Munich. (.bib, full text)
[2t] Jonas Latz (2016): Bayes Linear Methods for Inverse Problems. Master’s thesis, University of Warwick. (.bib, full text)
[1t] Jonas Latz (2014): Äußere Hausdorff-Maße: Anwendungen und Eigenschaften [Outer Hausdorff Measures: Applications and Properties]. Bachelor’s thesis, University of Trier (in German).
Miscellaneous (non-refereed)
[4n] Jonas Latz (2023): Book review: ‘Bayesian Non-linear Statistical Inverse problems’ by Richard Nickl. zbMATH. (open access)
[3n] Jonas Latz (2022): Stochastic gradient in continuous time: discrete and continuous data. Oberwolfach Report 10/2022, pp. 16–17, doi.
[2n] Jonas Latz, Björn Sprungk (2022): Solving inverse problems with Bayes’ theorem. Snapshots of Modern Mathematics from Oberwolfach. (open access)
[1n] Jonas Latz (2019): On the well-posedness of Bayesian inverse problems: The Gaussian noise case. Oberwolfach Report 12/2019, pp. 35–36, doi.