Wesley Maddox

I’m a PhD student in Data Science at New York University, advised by Professor Andrew Gordon Wilson. I’m interested in statistical machine learning, Bayesian deep learning, Gaussian processes, and generative models. More specifically, I work on methods for incorporating and exploiting uncertainty in machine learning models such as neural networks.

I’m supported by an NSF Graduate Research Fellowship, awarded in 2017.

See Publications for my past and current work, Talks for slides from my talks, and CV for my current CV. Feel free to drop me an email if you’d like to chat about any of these topics (or others).

In Summer 2019, I interned at Amazon in Cambridge, UK, mentored by Andreas Damianou and Pablo Garcia Moreno.

In Spring 2018, I organized a reading group on Bayesian non-parametrics for the Statistics Graduate Student Association and served as a social planner.

I spent two years as a PhD student in Statistics at Cornell University before transferring to NYU when our group moved. Previously, I completed an MS in Statistics and a BS in Systems Biology at Case Western Reserve University in Cleveland, OH, where I was advised by Professor Robin Snyder and Professor Wojbor Woyczynski. I also worked under Professor Thomas LaFramboise.

Outside of work, I enjoy playing tennis, typically with the Cornell Club Tennis team.

Publications

(^ denotes equal contribution.)

Preprints

Ruqi Zhang^, Wesley Maddox^, Ben Athiwaratkun, Andrew Gordon Wilson. “An Exploration of Bayesian Methods for Autoencoders.” pdf code

Conference Papers

Wesley Maddox^, Timur Garipov^, Pavel Izmailov^, Dmitry Vetrov, Andrew Gordon Wilson. “A Simple Baseline for Bayesian Uncertainty in Deep Learning.” In Neural Information Processing Systems (NeurIPS), 2019. Expanded and heavily revised version of “Fast Uncertainty Estimates and Bayesian Model Averaging of DNNs.” pdf code theme song

Gregory Benton^, Wesley Maddox^, Jayson Salkey^, Julio Albinati, Andrew Gordon Wilson. “Function-Space Distributions over Kernels.” In Neural Information Processing Systems (NeurIPS), 2019. pdf slides poster code theme song

Pavel Izmailov^, Wesley Maddox^, Polina Kirichenko^, Timur Garipov^, Dmitry Vetrov, Andrew Gordon Wilson. “Subspace Inference for Bayesian Deep Learning.” In Uncertainty in Artificial Intelligence (UAI), 2019. pdf code poster UAI proceedings

Workshop Papers

Wesley Maddox, Shuai Tang, Pablo Garcia Moreno, Andrew Gordon Wilson, Andreas Damianou. “On Transfer Learning via Linearized Neural Networks.” To appear in the NeurIPS MetaLearn Workshop, 2019.

Gregory Benton, Wesley Maddox, Jayson Salkey, Julio Albinati, Andrew Gordon Wilson. “Function-Space Distributions over Kernels.” In ICML Time Series Workshop, 2019. (Won best paper award.) slides poster code

Pavel Izmailov, Wesley Maddox, Polina Kirichenko, Timur Garipov, Dmitry Vetrov, Andrew Gordon Wilson. “Subspace Inference for Bayesian Deep Learning.” In ICML Workshop on Uncertainty & Robustness in Deep Learning, 2019. (Accepted as a contributed talk.) slides poster code Polina’s talk

Marc Finzi, Pavel Izmailov, Wesley Maddox, Polina Kirichenko, Andrew Gordon Wilson. “Invertible Convolutional Networks.” In ICML Workshop on Invertible Neural Networks and Normalizing Flows, 2019. (Accepted as a spotlight talk.) slides poster

Wesley Maddox, Timur Garipov, Pavel Izmailov, Dmitry Vetrov, Andrew Gordon Wilson. “Fast Uncertainty Estimates and Bayesian Model Averaging of DNNs.” In UAI Workshop on Uncertainty in Deep Learning, Monterey, 2018. (Accepted as a contributed talk.) slides poster code

Journal Articles

Sneha Grandhi, Colleen Bosworth, Wesley Maddox, Cole Sensiba, Sara Akhavanfard, Ying Ni, and Thomas LaFramboise. “Heteroplasmic shifts in tumor mitochondrial genomes reveal tissue-specific signals of relaxed and positive selection.” Human Molecular Genetics 26.15 (Aug. 2017), pp. 2912–2922. DOI: 10.1093/hmg/ddx172.

Alexandru Belu, Wesley Maddox, Wojbor A. Woyczynski. “Copulas and dependency measures for multivariate Linnik distributions.” International Journal of Statistics and Probability 7.6 (2018). DOI: 10.5539/ijsp.v7n6p154.

Talks

“Two Sampling Methods for Approximate Inference in Bayesian Deep Learning,” University of Cambridge. slides event link