I am a Computer Science Ph.D. student working on
Statistical Machine Learning, advised by Prof. S.V.N. Vishwanathan. My
interests lie at the intersection of Optimization and Machine
Learning, with applications in areas such as Ranking / Recommender
Systems, Extreme Classification (multi-class or multi-label problems
involving a huge number of classes/labels), and Deep Learning. Other interests include Graphical
Models and Scalable Bayesian Inference.
My thesis work has focused on:
- Developing reformulations of a wide spectrum of frequentist and Bayesian models
that distribute computation across machines more efficiently, decentralizing both the data and the
model parameters simultaneously to achieve Hybrid Parallelism, using novel algorithmic, statistical, and computational techniques.
- Developing and implementing asynchronous distributed stochastic optimizers to solve these reformulations.
I have experience designing and implementing parallel optimization and inference algorithms
in distributed-memory settings for a variety of large-scale
supervised and unsupervised machine learning models.
I am very excited about applying advances from algorithms, numerical linear
algebra, and systems / high-performance computing to develop computationally
efficient machine learning algorithms that scale to massive datasets.
News / Upcoming events
12/2019: Defended my Ph.D. dissertation on Hybrid-Parallel Parameter Estimation for Bayesian and
Frequentist Models, Slides
08/2019: Presented our work on Scaling Multinomial Logistic Regression via Hybrid-Parallelism
at KDD 2019, Slides
04/2019: Work on Scaling Multinomial Logistic Regression via Hybrid-Parallelism
accepted as Oral Presentation to KDD
2019, Anchorage, Alaska.
04/2019: Presented work on Extreme Stochastic Variational Inference (ESVI): Distributed
Inference for Large-Scale Mixture Models at AISTATS
2019, Okinawa, Japan.