33. Bibliography

This is a partial list of good references.

[BS06]

J.M. Bernardo and A.F.M. Smith. Bayesian Theory. Wiley Series in Probability and Statistics. John Wiley & Sons Canada, Limited, 2006. ISBN 9780470028735.

[BCKW15]

Charles Blundell, Julien Cornebise, Koray Kavukcuoglu, and Daan Wierstra. Weight uncertainty in neural networks. 2015. doi:10.48550/ARXIV.1505.05424.

[BGJM11]

S. Brooks, A. Gelman, G. Jones, and X.L. Meng. Handbook of Markov Chain Monte Carlo. Chapman & Hall/CRC Handbooks of Modern Statistical Methods. CRC Press, 2011. ISBN 9781420079425. URL: https://books.google.se/books?id=qfRsAIKZ4rIC.

[Cox61]

Richard Threlkeld Cox. The Algebra of Probable Inference. Johns Hopkins University Press, 1961.

[Cyb89]

G. Cybenko. Approximation by superpositions of a sigmoidal function. Math. Control Signals Systems, 2:303–314, 1989. doi:10.1007/BF02551274.

[DMF+22]

C. Drischler, J. A. Melendez, R. J. Furnstahl, A. J. Garcia, and Xilin Zhang. BUQEYE guide to projection-based emulators in nuclear physics. Front. in Phys., 10:1092931, 2022. arXiv:2212.04912, doi:10.3389/fphy.2022.1092931.

[DKPR87]

S. Duane, A. D. Kennedy, B. J. Pendleton, and D. Roweth. Hybrid Monte Carlo. Phys. Lett. B, 195:216–222, 1987. doi:10.1016/0370-2693(87)91197-X.

[DHS11]

John Duchi, Elad Hazan, and Yoram Singer. Adaptive subgradient methods for online learning and stochastic optimization. Journal of Machine Learning Research, 12(61):2121–2159, 2011. URL: http://jmlr.org/papers/v12/duchi11a.html.

[FHB09]

F Feroz, M P Hobson, and M Bridges. MultiNest: an efficient and robust Bayesian inference tool for cosmology and particle physics. Monthly Notices of the Royal Astronomical Society, 398(4):1601–1614, September 2009. doi:10.1111/j.1365-2966.2009.14548.x.

[FMHLG13]

Daniel Foreman-Mackey, David W. Hogg, Dustin Lang, and Jonathan Goodman. emcee: the MCMC hammer. Publications of the Astronomical Society of the Pacific, 125(925):306–312, March 2013. doi:10.1086/670067.

[FHI+18]

Dillon Frame, Rongzheng He, Ilse Ipsen, Daniel Lee, Dean Lee, and Ermal Rrapaj. Eigenvector continuation with subspace learning. Phys. Rev. Lett., 121(3):032501, 2018. arXiv:1711.07090, doi:10.1103/PhysRevLett.121.032501.

[GCS+13]

A. Gelman, J.B. Carlin, H.S. Stern, D.B. Dunson, A. Vehtari, and D.B. Rubin. Bayesian Data Analysis, Third Edition. Chapman & Hall/CRC Texts in Statistical Science. Taylor & Francis, 2013. URL: http://www.stat.columbia.edu/~gelman/book/BDA3.pdf.

[GR92]

Andrew Gelman and Donald B. Rubin. Inference from iterative simulation using multiple sequences. Statistical Science, 7(4):457–472, 1992. URL: http://www.jstor.org/stable/2246093.

[GW07]

M. Goldstein and D. Wooff. Bayes Linear Statistics: Theory and Methods. Wiley Series in Probability and Statistics. Wiley, 2007. ISBN 9780470015629.

[GW10]

Jonathan Goodman and Jonathan Weare. Ensemble samplers with affine invariance. Comm. App. Math. and Comp. Sci., 5(1):65–80, 2010. doi:10.2140/camcos.2010.5.65.

[Gre05]

Phil Gregory. Bayesian Logical Data Analysis for the Physical Sciences: A Comparative Approach with Mathematica® Support. Cambridge University Press, 2005. doi:10.1017/CBO9780511791277.

[Geron17]

A. Géron. Hands-on Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems. O'Reilly Media, 2017. ISBN 9781491962299. URL: https://books.google.se/books?id=I6qkDAEACAAJ.

[Hal21]

James Halverson. Building Quantum Field Theories Out of Neurons. December 2021. arXiv:2112.04527.

[HMS21]

James Halverson, Anindita Maiti, and Keegan Stoner. Neural Networks and Quantum Field Theory. Mach. Learn. Sci. Tech., 2(3):035002, 2021. arXiv:2008.08601, doi:10.1088/2632-2153/abeca3.

[HTF09]

T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer, 2009. ISBN 978-0-387-84858-7. URL: https://link.springer.com/book/10.1007/978-0-387-84858-7.

[HG+14]

Matthew D. Hoffman, Andrew Gelman, and others. The No-U-Turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo. J. Mach. Learn. Res., 15(1):1593–1623, 2014.

[H+22]

Baishan Hu and others. Ab initio predictions link the neutron skin of $^{208}$Pb to nuclear forces. Nature Phys., 18(10):1196–1200, 2022. arXiv:2112.01125, doi:10.1038/s41567-022-01715-8.

[Jay88]

E. T. Jaynes. How Does the Brain Do Plausible Reasoning?, pages 1–24. Springer Netherlands, Dordrecht, 1988. doi:10.1007/978-94-009-3049-0_1.

[Jay03]

E. T. Jaynes. Probability Theory: The Logic of Science. Cambridge University Press, 2003. doi:10.1017/CBO9780511790423.

[JForssen22]

Weiguang Jiang and Christian Forssén. Bayesian probability updates using sampling/importance resampling: Applications in nuclear theory. Front. in Phys., 10:1058809, 2022. arXiv:2210.02507, doi:10.3389/fphy.2022.1058809.

[KRGB15]

Alp Kucukelbir, Rajesh Ranganath, Andrew Gelman, and David M. Blei. Automatic variational inference in Stan. 2015. doi:10.48550/ARXIV.1506.03431.

[LWahlstromLSchon21]

A. Lindholm, N. Wahlström, F. Lindsten, and T. B. Schön. Machine Learning: A First Course for Engineers and Scientists. Cambridge University Press, 2021. ISBN 9781108843607. URL: http://smlbook.org/book/sml-book-draft-latest.pdf.

[Mac03]

D.J.C. MacKay. Information Theory, Inference and Learning Algorithms. Cambridge University Press, 2003. ISBN 9780521642989. URL: http://www.inference.org.uk/mackay/itila/.

[May22]

Eleanor May. Bayesian history matching of chiral effective field theory in the two-nucleon sector. Master's thesis, Chalmers, 2022.

[MBW+19]

Pankaj Mehta, Marin Bukov, Ching-Hao Wang, Alexandre G.R. Day, Clint Richardson, Charles K. Fisher, and David J. Schwab. A high-bias, low-variance introduction to machine learning for physicists. Phys. Rep., 810:1–124, May 2019. doi:10.1016/j.physrep.2019.03.001.

[MDF+22]

J. A. Melendez, C. Drischler, R. J. Furnstahl, A. J. Garcia, and Xilin Zhang. Model reduction methods for nuclear emulators. J. Phys. G, 49(10):102001, 2022. arXiv:2203.05528, doi:10.1088/1361-6471/ac83dd.

[Pol54a]

George Pólya. Mathematics and Plausible Reasoning, Volume 1: Induction and Analogy in Mathematics. Princeton University Press, 1954.

[Pol54b]

George Pólya. Mathematics and Plausible Reasoning, Volume 2: Patterns of Plausible Inference. Princeton University Press, 1954.

[Puk94]

Friedrich Pukelsheim. The three sigma rule. The American Statistician, 48(2):88–91, 1994. URL: http://www.jstor.org/stable/2684253.

[QMN15]

Alfio Quarteroni, Andrea Manzoni, and Federico Negri. Reduced Basis Methods for Partial Differential Equations. Springer International Publishing, 2015. ISBN 9783319154305. doi:10.1007/978-3-319-15431-2.

[RW05]

Carl Edward Rasmussen and Christopher K. I. Williams. Gaussian Processes for Machine Learning (Adaptive Computation and Machine Learning). The MIT Press, 2005. ISBN 026218253X.

[Rob21]

Daniel A. Roberts. Why is AI hard and Physics simple? March 2021. arXiv:2104.00008.

[RYH22]

Daniel A. Roberts, Sho Yaida, and Boris Hanin. The Principles of Deep Learning Theory. Cambridge University Press, May 2022. ISBN 978-1-00-902340-5. arXiv:2106.10165, doi:10.1017/9781009023405.

[Rub88]

Donald B. Rubin. Using the SIR algorithm to simulate posterior distributions. Bayesian Statistics, 3:395–402, 1988.

[SS06]

D. S. Sivia and J. Skilling. Data Analysis: A Bayesian Tutorial. Oxford Science Publications. Oxford University Press, 2nd edition, 2006.

[SG92]

A. F. M. Smith and A. E. Gelfand. Bayesian statistics without tears: a sampling-resampling perspective. Am. Stat., 46(2):84–88, May 1992.

[Sum21]

D. Sumpter. Ethics in Machine Learning, pages 309–326. Cambridge University Press, 2021.

[SEkstromForssen22]

Isak Svensson, Andreas Ekström, and Christian Forssén. Bayesian parameter estimation in chiral effective field theory using the Hamiltonian Monte Carlo method. Phys. Rev. C, 105(1):014004, 2022. arXiv:2110.04011, doi:10.1103/PhysRevC.105.014004.

[Tro08]

Roberto Trotta. Bayes in the sky: Bayesian inference and model selection in cosmology. Contemp. Phys., 49:71–104, 2008. arXiv:0803.4089, doi:10.1080/00107510802066753.

[vW15]

Wessel N. van Wieringen. Lecture notes on ridge regression. 2015. doi:10.48550/ARXIV.1509.09169.

[Van16]

J. VanderPlas. Python Data Science Handbook: Essential Tools for Working with Data. O'Reilly Media, 2016. ISBN 9781491912133. URL: https://books.google.se/books?id=6omNDQAAQBAJ.

[VGB14]

Ian Vernon, Michael Goldstein, and Richard Bower. Galaxy formation: Bayesian history matching for the observable universe. Statist. Sci., 29(1):81–90, February 2014. doi:10.1214/12-STS412.

[VGB10]

Ian Vernon, Michael Goldstein, and Richard G. Bower. Galaxy formation: a Bayesian uncertainty analysis. Bayesian Anal., 5(4):619–669, December 2010. doi:10.1214/10-BA524.

[VLG+18]

Ian Vernon, Junli Liu, Michael Goldstein, James Rowe, Jen Topping, and Keith Lindsey. Bayesian uncertainty analysis for complex systems biology models: emulation, global parameter searches and evaluation of gene functions. BMC Systems Biology, 12:1, January 2018. doi:10.1186/s12918-017-0484-3.

[WAK+17]

Meng Wang, G. Audi, F. G. Kondev, W.J. Huang, S. Naimi, and Xing Xu. The AME2016 atomic mass evaluation (II). Tables, graphs and references. Chinese Phys. C, 41(3):030003, March 2017. doi:10.1088/1674-1137/41/3/030003.

[KingmaBa14]

Diederik P. Kingma and Jimmy Ba. Adam: A Method for Stochastic Optimization. arXiv e-prints, December 2014. arXiv:1412.6980, doi:10.48550/arXiv.1412.6980.

[RumelhartHintonWilliams86]

David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams. Learning representations by back-propagating errors. Nature, 323(6088):533–536, October 1986. doi:10.1038/323533a0.

[vandSchootDK+21]

Rens van de Schoot, Sarah Depaoli, Ruth King, Bianca Kramer, Kaspar Märtens, Mahlet G. Tadesse, Marina Vannucci, Andrew Gelman, Duco Veen, Joukje Willemsen, and Christopher Yau. Bayesian statistics and modelling. Nat Rev Methods Primers, 1:1, 2021. doi:10.1038/s43586-020-00001-2.

[Zeiler12]

Matthew D. Zeiler. ADADELTA: An Adaptive Learning Rate Method. arXiv e-prints, December 2012. arXiv:1212.5701, doi:10.48550/arXiv.1212.5701.