
Wesley Maddox

I’m currently a quantitative researcher at Jump Trading.

I completed a PhD in Data Science at New York University, working under Andrew Gordon Wilson. I’m interested in probabilistic machine learning, Bayesian deep learning, Gaussian processes, Bayesian optimization, and generative models. More specifically, I’ve worked on methods to incorporate, use, and deploy uncertainty in machine learning models such as neural networks.

My PhD was partially supported by an NSF Graduate Research Fellowship, awarded in 2017.

See Publications for what I’ve worked on in the past, Preprints for what I’m currently working on, Talks for slides from my talks, and my resume (out of date; see also my academic CV). Feel free to drop me an email if you’re interested in chatting about any of these topics (or more). My email is wjm363 at nyu dot edu.

In summer 2021, I interned (remotely) at Facebook Core Data Science, mentored by Qing Feng and Max Balandat. In summer and fall 2020, I interned (remotely) at Facebook Core Data Science, mentored by Max Balandat and Eytan Bakshy. In summer 2019, I interned at Amazon in Cambridge, UK, mentored by Andreas Damianou and Pablo Garcia Moreno.

I spent two years as a PhD student in Statistics at Cornell University before transferring to NYU. Previously, I did an MS in Statistics and a BS in Systems Biology at Case Western Reserve University in Cleveland, OH. There, I was advised by Robin Snyder and Wojbor Woyczynski. I also worked under Thomas LaFramboise.

Outside of work, I enjoy playing tennis.

Publications

(^ denotes equal contribution.)

Preprints

Gregory Benton, Nate Gruver, Wesley Maddox, Andrew Gordon Wilson. “Deep Probabilistic Time Series Forecasting over Long Horizons.” pdf

Shuai Tang, Wesley Maddox, Charlie Dickens, Tom Diethe, Andreas Damianou. “Similarity of Neural Networks with Gradients.” pdf code

Wesley Maddox^, Gregory Benton^, Andrew Gordon Wilson. “Rethinking Parameter Counting in Deep Models: Effective Dimensionality Revisited.” pdf code

Ruqi Zhang^, Wesley Maddox^, Ben Athiwaratkun, Andrew Gordon Wilson. “An Exploration of Bayesian Methods for Autoencoders.” pdf code

Conference Papers

Samuel Stanton, Wesley Maddox, Andrew Gordon Wilson. “Bayesian Optimization with Conformal Coverage Guarantees.” To appear in Artificial Intelligence and Statistics (AISTATS), 2023. pdf code

Sanyam Kapoor^, Wesley Maddox^, Pavel Izmailov^, Andrew Gordon Wilson. “On Uncertainty, Tempering, and Data Augmentation in Bayesian Classification.” In Neural Information Processing Systems (NeurIPS), 2022. pdf code

Wesley Maddox^, Andres Potapczynski^, Andrew Gordon Wilson. “Low Precision Arithmetic for Fast Gaussian Processes.” In Uncertainty in Artificial Intelligence (UAI), 2022. pdf code

Samuel Stanton, Wesley Maddox, Nate Gruver, Phillip Maffettone, Emily Delaney, Peyton Greenside, Andrew Gordon Wilson. “Accelerating Bayesian Optimization for Biological Sequence Design with Denoising Autoencoders.” In International Conference on Machine Learning (ICML), 2022. pdf code tiktok

Gregory Benton, Wesley Maddox, Andrew Gordon Wilson. “Volatility Based Kernels and Moving Average Means for Accurate Forecasting with Gaussian Processes.” In International Conference on Machine Learning (ICML), 2022. pdf code

Wesley Maddox, Samuel Stanton, Andrew Gordon Wilson. “Conditioning Stochastic Variational Gaussian Processes for Online Decision-Making.” In Neural Information Processing Systems (NeurIPS), 2021. pdf code poster

Wesley Maddox, Maximilian Balandat, Andrew Gordon Wilson, Eytan Bakshy. “Bayesian Optimization with High-Dimensional Outputs.” In Neural Information Processing Systems (NeurIPS), 2021. pdf experimental code tutorial poster

Gregory Benton, Wesley Maddox, Sanae Lotfi, Andrew Gordon Wilson. “Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling.” In International Conference on Machine Learning (ICML), 2021. pdf code

Wesley Maddox, Shuai Tang, Pablo Moreno, Andrew Gordon Wilson, Andreas Damianou. “Fast Adaptation with Linearized Neural Networks.” In Artificial Intelligence and Statistics (AISTATS), 2021. Expanded and heavily revised version of “On Transfer Learning via Linearized Neural Networks.” pdf code slides poster proceedings

Samuel Stanton^, Wesley Maddox^, Ian Delbridge, Andrew Gordon Wilson. “Kernel Interpolation for Scalable Online Gaussian Processes.” In Artificial Intelligence and Statistics (AISTATS), 2021. pdf code Sam’s slides poster proceedings

Wesley Maddox^, Timur Garipov^, Pavel Izmailov^, Dmitry Vetrov, Andrew Gordon Wilson. “A Simple Baseline for Bayesian Uncertainty in Deep Learning.” In Neural Information Processing Systems (NeurIPS), 2019. Expanded and heavily revised version of “Fast Uncertainty Estimates and Bayesian Model Averaging of DNNs.” pdf code poster proceedings

Gregory Benton^, Wesley Maddox^, Jayson Salkey^, Julio Albinati, Andrew Gordon Wilson. “Function-Space Distributions over Kernels.” In Neural Information Processing Systems (NeurIPS), 2019. pdf Slides Poster Code proceedings theme song

Pavel Izmailov^, Wesley Maddox^, Polina Kirichenko^, Timur Garipov^, Dmitry Vetrov, Andrew Gordon Wilson. “Subspace Inference for Bayesian Deep Learning.” In Uncertainty in Artificial Intelligence (UAI), 2019. pdf code poster uai proceedings pmlr proceedings

Workshop Papers

Wesley Maddox, Qing Feng, Max Balandat. “Optimizing High-Dimensional Physics Simulations via Composite Bayesian Optimization.” In Workshop on Machine Learning and the Physical Sciences at NeurIPS 2021. pdf poster

Wesley Maddox, Sanyam Kapoor, Andrew Gordon Wilson. “When are Iterative Gaussian Processes Reliably Accurate?” In Beyond First Order Methods in ML (OPT-ML) Workshop at ICML 2021. pdf code demo video

Wesley Maddox, Shuai Tang, Pablo Garcia Moreno, Andrew Gordon Wilson, Andreas Damianou. “On Transfer Learning via Linearized Neural Networks.” In 3rd MetaLearning Workshop at NeurIPS 2019. Poster Code Malaria Global Atlas Dataset Andreas’ Slides

Gregory Benton, Wesley Maddox, Jayson Salkey, Julio Albinati, Andrew Gordon Wilson. “Function-Space Distributions over Kernels.” In ICML Time Series Workshop, 2019. (Won best paper award.) Slides Poster Code

Pavel Izmailov, Wesley Maddox, Polina Kirichenko, Timur Garipov, Dmitry Vetrov, Andrew Gordon Wilson. “Subspace Inference for Bayesian Deep Learning.” In ICML Workshop on Uncertainty & Robustness in Deep Learning, 2019. (Accepted as a contributed talk.) Slides Poster Code Polina’s Talk

Marc Finzi, Pavel Izmailov, Wesley Maddox, Polina Kirichenko, Andrew Gordon Wilson. “Invertible Convolutional Networks.” In ICML Workshop on Invertible Neural Networks and Normalizing Flows, 2019. (Accepted as a spotlight talk). Slides Poster

Wesley Maddox, Timur Garipov, Pavel Izmailov, Dmitry Vetrov, Andrew Gordon Wilson. “Fast Uncertainty Estimates and Bayesian Model Averaging of DNNs.” In UAI Workshop on Uncertainty in Deep Learning, Monterey, 2018. (Accepted as a contributed talk.) Slides Poster Code

Journal Articles

Sneha Grandhi, Colleen Bosworth, Wesley Maddox, Cole Sensiba, Sara Akhavanfard, Ying Ni, and Thomas LaFramboise. “Heteroplasmic shifts in tumor mitochondrial genomes reveal tissue-specific signals of relaxed and positive selection.” In: Human Molecular Genetics 26.15 (Aug. 2017), pp. 2912–2922. DOI: 10.1093/hmg/ddx172.

Alexandru Belu, Wesley Maddox, Wojbor A. Woyczynski. “Copulas and dependency measures for multivariate Linnik distributions.” In: International Journal of Statistics and Probability 7.6 (2018). DOI: 10.5539/ijsp.v7n6p154.

Talks

“Gaussian Processes and Bayesian Neural Networks for Decision Making,” NYU CDS Thesis defense, Slides

“Two Sampling Methods for Approximate Inference in Bayesian Deep Learning,” University of Cambridge, Slides Event Link

“Bayesian Neural Networks: A Tutorial,” CILVR Lab Tutorial, Slides

“Bayesian Neural Networks: A Tutorial,” NYU CDS Data Science Lunch Seminar, 02/26/20, Slides Event Link

Data

Pre-processed Malaria Global Atlas dataset, received from Max Balandat and originally used in the BoTorch paper. HDF5 file
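If you want to poke around in the file, here is a minimal sketch using h5py; the file name and dataset key are hypothetical placeholders, since the archive's layout isn't documented here. Listing the keys first shows what the file actually contains.

    # Minimal sketch for inspecting the HDF5 file with h5py.
    # "malaria_atlas.hdf5" and "train_x" are hypothetical placeholders,
    # not the actual file name or dataset key.
    import h5py

    with h5py.File("malaria_atlas.hdf5", "r") as f:
        print(list(f.keys()))       # discover which datasets the file stores
        # data = f["train_x"][:]    # load one dataset as a NumPy array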