Papers
This list may be out of date; see my Google Scholar page for a more comprehensive list.

- Williamson, S.A., & Wu, L. (2024). Posterior Uncertainty Quantification in Neural Networks using Data Augmentation. AISTATS. [pdf]
- Turishcheva, P., Ramapuram, J., Williamson, S., Busbridge, D., Dhekane, E., & Webb, R. (2023). Bootstrap your own variance. NeurIPS Self-Supervised Learning Workshop. [pdf]
- Zhang, M.M., Dumitrascu, B., Williamson, S.A., & Engelhardt, B.E. (2023). Sequential Gaussian processes for online learning of nonstationary functions. IEEE Transactions on Signal Processing, 71, 1539–1550. [pdf]
- Ribero, M., Henderson, J., Williamson, S.A., & Vikalo, H. (2022). Federating recommendations using differentially private prototypes. Pattern Recognition, 129, 108746. [pdf]
- Namazi, R., Ghalebi, E., Williamson, S.A., & Mahyar, H. (2022). SMGRL: A scalable multi-resolution graph representation learning framework. Preprint arXiv:2201.12670.
- Zhang, M.M., Williamson, S.A., & Perez-Cruz, F. (2022). Accelerated parallel non-conjugate sampling for Bayesian non-parametrics. Statistics and Computing, 32(3), 50. [pdf]
- Williamson, S.A., & Henderson, J. (2021). Understanding collections of related datasets using dependent MMD coresets. Information, 12(10). [pdf]
- Consul, S., & Williamson, S.A. (2021). Balance is key: Private median splits yield high-utility random trees. Preprint arXiv:2006.08795.
- Ghalebi, E., Mahyar, H., Grosu, R., Taylor, G.W., & Williamson, S.A. (2021). A nonparametric Bayesian model for sparse dynamic multigraphs. Preprint arXiv:1905.11724.
- Dubey, A., Zhang, M.M., Xing, E.P., & Williamson, S.A. (2020). Distributed, partially collapsed MCMC for Bayesian nonparametrics. In Proceedings of the 23rd International Conference on Artificial Intelligence and Statistics (pp. 3685–3695). [pdf]
- Williamson, S.A., Zhang, M.M., & Damien, P. (2020). A new class of time dependent latent factor models with applications. Journal of Machine Learning Research, 21(27), 1–24. [pdf]
- Ni, Y., Müller, P., Diesendruck, M., Williamson, S., Zhu, Y., & Ji, Y. (2020). Scalable Bayesian nonparametric clustering and classification. Journal of Computational and Graphical Statistics, 29(1), 53–65. [pdf]
- Henderson, J., Sharma, S., Gee, A., Alexiev, V., Draper, S., Marin, C., Hinojosa, Y., Draper, C., Perng, M., Aguirre, L., Li, M., Rouhani, S., Consul, S., Michalski, S., Prasad, A., Chutani, M., Kumar, A., Alam, A., Kandarpa, P., Jesudasan, B., Lee, C., Criscolo, M., Williamson, S., Sanchez, M., & Ghosh, J. (2020). Certifai: A toolkit for building trust in AI systems. In International Joint Conference on Artificial Intelligence Demonstrations Track (pp. 5249–5251). [pdf]
- Cole, G.W., & Williamson, S.A. (2019). Avoiding resentment via monotonic fairness. Preprint arXiv:1905.10003.
- Lake, T., Williamson, S.A., Hawk, A.T., Johnson, C.C., & Wing, B.P. (2019). Large-scale collaborative filtering with product embeddings. Preprint arXiv:1901.04321.
- Cole, G.W., & Williamson, S.A. (2019). Stochastic blockmodels with edge information. Preprint arXiv:1904.02016.
- Zhang, M.M., & Williamson, S.A. (2019). Embarrassingly parallel inference for Gaussian processes. Journal of Machine Learning Research, 20(169), 1–26. [pdf]
- Diesendruck, M., Elenberg, E.R., Sen, R., Cole, G.W., Shakkottai, S., & Williamson, S.A. (2019). Importance weighted generative networks. In European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (pp. 249–265). [pdf]
- Williamson, S.A., & Tec, M. (2019). Random clique covers for graphs with local density and global sparsity. In Proceedings of the 35th Conference on Uncertainty in Artificial Intelligence (pp. 228–238). [pdf]
- Alawieh, M.B., Williamson, S.A., & Pan, D.Z. (2019). Rethinking sparsity in performance modeling for analog and mixed circuits using spike and slab models. In Proceedings of the 56th ACM/IEEE Design Automation Conference. [pdf]
- Joo, J., Williamson, S.A., Vazquez, A.I., Fernandez, J.R., & Bray, M.S. (2019). The influence of 15-week exercise training on dietary patterns among young adults. International Journal of Obesity, 43(9), 1681–1690. [pdf]
- Joo, J., Williamson, S.A., Vazquez, A.I., Fernandez, J.R., & Bray, M.S. (2018). Advanced dietary patterns analysis using sparse latent factor models in young adults. Journal of Nutrition, 148(12), 1984–1992. [pdf]
- Peters, M., Saar-Tsechansky, M., Ketter, W., Williamson, S.A., Groot, P., & Heskes, T. (2018). A scalable preference model for autonomous decision-making. Machine Learning, 107(6), 1039–1069. [pdf]
- Doshi-Velez, F., & Williamson, S.A. (2017). Restricted Indian buffet processes. Statistics and Computing, 27(5), 1205–1223. [pdf]
- Williamson, S.A. (2016). Nonparametric network models for link prediction. Journal of Machine Learning Research, 17(202), 1–21. [pdf]
- Dubey, A., Reddi, S.J., Poczos, B., Smola, A.J., Xing, E.P., & Williamson, S.A. (2016). Variance reduction in stochastic gradient Langevin dynamics. In Advances in Neural Information Processing Systems 29 (pp. 1154–1162). [pdf]
- Orbanz, P., & Williamson, S. (2016). Unit-rate Poisson representations of completely random measures. [preprint]
- Foti, N.J., & Williamson, S.A. (2015). A survey of non-exchangeable priors for Bayesian nonparametric models. IEEE Transactions on Pattern Analysis and Machine Intelligence, 37(2), 359–371. [pdf]
- Dubey, A., Ho, Q., Williamson, S.A., & Xing, E.P. (2014). Dependent nonparametric trees for dynamic hierarchical clustering. In Advances in Neural Information Processing Systems 27 (pp. 1152–1160). [pdf]
- Dubey, A., Williamson, S.A., & Xing, E.P. (2014). Parallel Markov chain Monte Carlo for Pitman-Yor mixture models. In Proceedings of the 30th Conference on Uncertainty in Artificial Intelligence (pp. 142–151). [pdf]
- Williamson, S.A., MacEachern, S.N., & Xing, E.P. (2013). Restricting exchangeable nonparametric distributions. In Advances in Neural Information Processing Systems 26 (pp. 2598–2606). [pdf]
- Dubey, A., Hefny, A., Williamson, S., & Xing, E.P. (2013). A nonparametric mixture model for topic modeling over time. In Proceedings of the SIAM International Conference on Data Mining (pp. 530–538). [pdf]
- Foti, N.J., Futoma, J.D., Rockmore, D.N., & Williamson, S.A. (2013). A unifying representation for a class of dependent random measures. In Proceedings of the 16th International Conference on Artificial Intelligence and Statistics (pp. 20–28). [pdf]
- Williamson, S.A., Dubey, A., & Xing, E.P. (2013). Parallel Markov chain Monte Carlo for nonparametric mixture models. In Proceedings of the 30th International Conference on Machine Learning (pp. 98–106). [pdf]
- Foti, N.J., & Williamson, S.A. (2012). Slice sampling normalized kernel-weighted completely random measure mixture models. In Advances in Neural Information Processing Systems 25 (pp. 2240–2248). [pdf]
- Hu, Y., Zhai, K., Williamson, S., & Boyd-Graber, J. (2012). Modeling images using transformed Indian buffet processes. In Proceedings of the 29th International Conference on Machine Learning (pp. 1511–1518). [pdf]
- Williamson, S., Wang, C., Heller, K.A., & Blei, D.M. (2010). The IBP compound Dirichlet process and its application to topic modeling. In Proceedings of the 27th International Conference on Machine Learning (pp. 1151–1158). [pdf]
- Williamson, S., Wang, C., Heller, K.A., & Blei, D.M. (2011). Nonparametric mixed membership modelling using the IBP compound Dirichlet process. In K.L. Mengersen, C.P. Robert, & D.M. Titterington (Eds.), Mixtures: Estimation and Applications (pp. 145–160).
- Williamson, S., Orbanz, P., & Ghahramani, Z. (2010). Dependent Indian buffet processes. In Proceedings of the 13th International Conference on Artificial Intelligence and Statistics (pp. 924–931). [pdf]
- Heller, K.A., Williamson, S., & Ghahramani, Z. (2008). Statistical models for partial membership. In Proceedings of the 25th International Conference on Machine Learning (pp. 392–399). [pdf]