PhD Student, Computer Science
University of California, Santa Cruz (transferred from Purdue University)
email: firstname.lastname@example.org, office: Engineering 2, 489
I am on the job market, looking for researcher/applied scientist positions
starting Spring 2020.
I am a Computer Science PhD student working on
Statistical Machine Learning, advised by Prof. S.V.N. Vishwanathan. My
interests lie at the intersection of Optimization and Machine
Learning, with applications in areas such as Ranking / Recommender
Systems, Extreme Classification (multi-class or multi-label problems
involving a very large number of classes/labels), and Deep Learning. Other interests include Graphical
Models and Scalable Bayesian Inference.
My thesis work has focused on:
- Developing reformulations of a wide spectrum of frequentist and Bayesian models
that distribute computation across machines more efficiently, decentralizing both the data and the
model parameters simultaneously (Hybrid Parallelism) via novel algorithmic, statistical, and computational techniques.
- Developing and implementing asynchronous distributed stochastic optimizers to solve these reformulations.
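The hybrid-parallelism idea above can be illustrated with a toy sketch: shard the data matrix by rows (data parallelism) and the parameter matrix by columns (model parallelism), so each worker touches only one block of each. This is a minimal single-process illustration; the shapes, shard counts, and in-memory simulation are my assumptions, not the actual distributed implementation from the thesis.

```python
import numpy as np

# Toy hybrid parallelism: partition both the data X (by rows) and the
# model W (by columns). Worker (i, j) would hold only X_i and W_j and
# compute one block of the logit matrix X @ W.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 4))    # 8 examples, 4 features
W = rng.standard_normal((4, 6))    # 6 classes (columns of the model)

data_shards = np.array_split(X, 2, axis=0)   # data parallelism
model_shards = np.array_split(W, 3, axis=1)  # model parallelism

# Each (i, j) pair computes an independent block; no worker needs the
# full data or the full model.
blocks = [[Xi @ Wj for Wj in model_shards] for Xi in data_shards]
logits = np.block(blocks)

assert np.allclose(logits, X @ W)  # blocks reassemble the full product
```

In an actual distributed run, the block computations happen asynchronously on separate machines and only small messages (stale parameter blocks, partial statistics) are exchanged, which is what makes the approach scale.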
I have experience designing and implementing parallel optimization/inference algorithms
in distributed memory settings, for a variety of large-scale
supervised and unsupervised machine learning models.
I am very excited about applying advances from algorithms, numerical linear
algebra, systems, and high-performance computing to develop computationally
efficient machine learning algorithms that can handle massive datasets.
Applied Scientist Intern, Amazon AI, Palo Alto (Summer 2017): Researched and implemented temporal video recommendation models using
Deep Neural Networks.
Research Intern, Adobe Research, San Jose (Summer 2016): Developed models to cluster user behavior in Adobe Analytics data using
both click information (user URLs) and content information.
Research Intern, Microsoft Research (Cloud Information and Services
Lab), Mountain View (Summer 2015): Researched the problem of extrapolating learning curves in machine
learning. Developed, implemented, and evaluated a new prototype for non-linear
curve-fitting of predictors using small amounts of data.
Research Intern, LinkedIn (Search Relevance), Mountain View (Summer 2014): Explored machine learning methods to mitigate the sample bias and position
bias present in learning-to-rank systems with implicit feedback. Proposed
a new ranking framework to combine models incrementally.
Software Engineer, Yahoo!, Sunnyvale (July 2011 - July 2013): Worked in the Personalization Group on projects related to Entity
Detection, Entity Matching and Resolution, and the
Knowledge Graph. Worked in the Hadoop Team, contributing to the open-source
project Oozie (Yahoo!'s Hadoop workflow scheduler).
Software Engineering Intern, Intel Corporation, Chandler (Summer
2010): Developed a search and indexing infrastructure to help silicon engineers find relevant product design
information. Gathered requirements, developed and tested the system, and
deployed it to production.
Application Developer, ThoughtWorks, Bangalore (June 2008 - July
2009): Designed and implemented web services for the UK train ticket retailing
system, thetrainline.com.
News / Upcoming events
12/2019: Defended my Ph.D. dissertation, Hybrid-Parallel Parameter Estimation for Bayesian and
Frequentist Models. Slides
08/2019: Presented our work on Scaling Multinomial Logistic Regression via Hybrid-Parallelism
at KDD 2019. Slides
04/2019: Our work on Scaling Multinomial Logistic Regression via Hybrid-Parallelism
was accepted as an oral presentation at KDD
2019, Anchorage, Alaska.
04/2019: Presented work on Extreme Stochastic Variational Inference (ESVI): Distributed
Inference for Large-Scale Mixture Models at AISTATS
2019, Okinawa, Japan.