निरुपम गुप्ता / Nirupam Gupta
Hello, and welcome to my personal website. I am a Post-Doctoral Fellow in the Department of Computer Science at
Georgetown University, sponsored by Nitin H. Vaidya.
Research Work: I mainly work on problems related to distributed optimization, machine learning, and control systems.
My focus areas are fault-tolerance, robustness, and privacy preservation. For details on my current and past research projects, please check the following links:
Teaching Experience: As a postdoc at Georgetown University, I taught a seminar course on Algorithms for Distributed Machine Learning. The course introduced state-of-the-art algorithms and open challenges in distributed supervised learning, and was offered to Computer Science PhD students in the spring semester of 2020. Further details on the course can be found here.
Some of my recent research work -
Byzantine Fault-Tolerance in Decentralized Optimization under Minimal Redundancy (arXiv, Sept'20),
with Thinh T. Doan, and Nitin Vaidya.
In this work, we extend our prior results on Byzantine fault-tolerant distributed optimization from the server-based system architecture to the decentralized peer-to-peer system architecture. This paper presents the first provably correct Byzantine fault-tolerant decentralized optimization algorithm for high-dimensional optimization problems.
Byzantine Fault-Tolerant Distributed Machine Learning Using Stochastic Gradient Descent (SGD) and Norm-Based Comparative Gradient Elimination (CGE) (arXiv, Aug'20), with Shuo Liu, and Nitin Vaidya.
In this work, we show that a norm-based gradient elimination technique we proposed earlier confers Byzantine fault-tolerance on the distributed stochastic gradient-descent method for distributed machine learning.
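The idea behind norm-based Comparative Gradient Elimination can be sketched as follows: the server sorts the gradients it receives by Euclidean norm, discards the f largest, and averages the rest. This is a minimal illustrative sketch, not the paper's actual code; the function name and signature are my own.

```python
import numpy as np

def cge_aggregate(gradients, f):
    """Norm-based Comparative Gradient Elimination (CGE) sketch:
    sort received gradients by Euclidean norm, drop the f largest,
    and average the remainder. Illustrative only."""
    ordered = sorted(gradients, key=np.linalg.norm)
    kept = ordered[:len(ordered) - f]  # eliminate the f largest-norm gradients
    return np.mean(kept, axis=0)

# Example: four honest gradients and one large (possibly Byzantine) gradient.
grads = [np.array([1.0, 1.0])] * 4 + [np.array([100.0, -100.0])]
agg = cge_aggregate(grads, f=1)  # the outlier gradient is filtered out
```

Because a faulty agent that reports an abnormally large gradient is simply dropped, its influence on the aggregate is bounded.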
Iterative Pre-Conditioning for Expediting the Gradient-Descent Method: The Distributed Linear Least-Squares Problem (arXiv, Aug'20),
with Kushal Chakraborty, and Nikhil Chopra.
In this work, we propose the first distributed iterative optimization method with a superlinear rate of convergence. Specifically, we show that the traditional gradient-descent method, when coupled with an iteratively computed pre-conditioner matrix, achieves a superlinear convergence rate, unequivocally superior to state-of-the-art accelerated methods, namely Nesterov's accelerated method, the heavy-ball method, and the quasi-Newton method BFGS. (Variants of this work, showing improved robustness to system noise, have been published in the proceedings of the 2020 American Control Conference and in the IEEE Control Systems Letters, 2021.)
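The flavor of iterative pre-conditioning can be illustrated on a centralized linear least-squares problem: alongside the gradient-descent iterate, a pre-conditioner matrix K is refined toward the inverse Hessian, so the update approaches a Newton step. This is an illustrative sketch under my own choice of update rule (a Richardson iteration for K), not the authors' exact distributed algorithm.

```python
import numpy as np

# Least-squares problem: minimize ||A x - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)
H = A.T @ A                          # Hessian of the least-squares cost

alpha = 1.0 / np.linalg.norm(H, 2)   # step size (illustrative choice)
K = alpha * np.eye(3)                # pre-conditioner estimate
x = np.zeros(3)

for _ in range(300):
    K = K + alpha * (np.eye(3) - H @ K)   # refine K toward H^{-1}
    x = x - K @ (A.T @ (A @ x - b))       # pre-conditioned gradient step

x_star = np.linalg.lstsq(A, b, rcond=None)[0]
```

As K approaches the inverse Hessian, the contraction factor of the x-update shrinks toward zero, which is the mechanism behind the superlinear rate.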
Preserving Statistical Privacy in Distributed Optimization, with Shripad Gade, Nikhil Chopra, and Nitin H. Vaidya.
We present a distributed optimization protocol that preserves the statistical privacy of agents' local cost functions against a passive adversary corrupting some of the agents in a peer-to-peer network, without affecting the correctness of the solution, unlike the more widely used differential-privacy protocols. The work has been published in the IEEE Control Systems Letters, 2021.
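A toy illustration of why correctness can survive such obfuscation: if agents mask their local contributions with perturbations that cancel across the network, each individual contribution is hidden while the aggregate, and hence the optimization outcome, is exact. This is only a conceptual sketch of zero-sum masking, not the protocol's actual construction.

```python
import numpy as np

rng = np.random.default_rng(1)

# Local (private) gradient contributions of three agents.
local_grads = [np.array([1.0, 2.0]), np.array([3.0, -1.0]), np.array([0.5, 0.5])]

# Zero-sum perturbations: the first two are random, the last cancels them,
# so individual gradients are masked but the sum is unchanged.
noise = [rng.standard_normal(2) for _ in range(2)]
noise.append(-sum(noise))

masked = [g + p for g, p in zip(local_grads, noise)]
agg = sum(masked) / len(masked)   # identical to the true average
```

No persistent noise enters the aggregate, which is the key contrast with differential-privacy mechanisms that perturb the output itself.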
Fault-tolerance in Distributed Optimization: The Case of Redundancy (pdf), with Nitin Vaidya.
Published in the proceedings of the 39th ACM Symposium on Principles of Distributed Computing (PODC'20) [presentation video]. In this work, we partially solve the problem of Byzantine fault-tolerance in high-dimensional distributed optimization. Specifically, we derive a tight impossibility result for Byzantine fault-tolerant distributed optimization, and propose a computationally efficient fault-tolerance mechanism that renders the traditional distributed gradient-descent method resilient against Byzantine faulty agents in the system. Extensions of this work can be found in the technical reports (1) Resilience in Collaborative Optimization: Redundant and Independent Cost Functions (arXiv, March'20), and (2) Byzantine Fault Tolerant Distributed Linear Regression (arXiv, March'19).
Some highlights of my education -
I am humbly grateful for having received some decent education in this life. Besides literacy in three languages, namely Hindi (native), English, and Gujarati, I have also managed to obtain a couple of academic degrees: