Bharat Runwal
I graduated from IIT Delhi with a B.Tech in Electrical Engineering (Power and Automation). At IITD, I was a member of the MISN group, where I completed my B.Tech thesis, "Robustifying GNN against Poisoning Adversarial Attacks using Weighted Laplacian," under the guidance of Prof. Sandeep Kumar.
My research interests span Computer Vision, Multimodal Learning, Compositionality, Efficient ML, and Continual Learning. However, I'm always open to exploring new research directions and developing interesting real-world projects.
In the past, I have been fortunate to work with Prof. Gerard de Melo from HPI Potsdam, Germany, in the field of NLP, and with Prof. Özgür B. Akan and Dr. Oktay Cetinkaya in the domain of Molecular Communication.
Email / CV / Google Scholar / Twitter / LinkedIn / GitHub
Research Experience
Research Intern, Jan 2023 - Ongoing
Collaborators: Yilun Du, Prof. Josh Tenenbaum
Research Topic: Continual Generative Modeling
Visiting Scholar, Jan 2022 - March 2023
CERC-AAI Lab, Mila - Quebec AI Institute
Collaborators: Diganta Misra, Irina Rish
Research Topic: Sparsity, Continual Learning
Research Intern, June 2021 - Oct. 2021
Internet of Everything (IoE) Group, University of Cambridge
Collaborators: Arunava Das, Dr. Oktay Cetinkaya, Prof. Özgür B. Akan
Research Topic: Received Signal Modeling and BER Analysis for Molecular SISO Communications
Accepted to ACM NanoCom 2022.
Research Intern, Oct. 2020 - May 2021
Deep Data Lab, HPI Potsdam, Germany
Supervisor: Prof. Gerard de Melo
Research Area: NLP
APP: Anytime Progressive Pruning  
Diganta Misra*,
Bharat Runwal*,
Tianlong Chen,
Zhangyang Wang,
Irina Rish
DyNN Workshop at ICML, 2022
SNN, 2022
CLL Workshop at ACML, 2022
SlowDNN Workshop, 2023
project /
paper /
webpage /
abstract /
bibtex
With the latest advances in deep learning, there has been much focus on the online learning paradigm due to its relevance in practical settings. Although many methods have been investigated for optimal learning in scenarios where the data stream is continuous over time, sparse network training in such settings has often been overlooked. In this paper, we explore the problem of training a neural network with a target sparsity in a particular case of online learning: the anytime learning at macroscale paradigm (ALMA). We propose a novel form of progressive pruning, referred to as Anytime Progressive Pruning (APP); the proposed approach significantly outperforms the baseline dense and Anytime OSP models across multiple architectures and datasets under short, moderate, and long-sequence training. Our method, for example, shows an improvement in accuracy of approximately 7% and a reduction in the generalization gap of approximately 22%, while being roughly one-third the size of the dense baseline model in few-shot Restricted ImageNet training. We further observe interesting non-monotonic transitions in the generalization gap in high-megabatch ALMA settings. The code and experiment dashboards can be accessed at https://github.com/landskape-ai/Progressive-Pruning and https://wandb.ai/landskape/APP, respectively.
@misc{misra2022app,
title={APP: Anytime Progressive Pruning},
author={Diganta Misra and Bharat Runwal and Tianlong Chen and Zhangyang Wang and Irina Rish},
year={2022},
eprint={2204.01640},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
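As a rough illustration of the progressive-pruning idea described in the abstract, here is a minimal NumPy sketch of magnitude pruning with a sparsity level that ramps toward a target as megabatches arrive. The function names and the cubic ramp schedule are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of smallest-magnitude weights."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # The k-th smallest magnitude becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    return weights * (np.abs(weights) > threshold)

def progressive_sparsity_schedule(target_sparsity, num_megabatches):
    """Ramp sparsity toward the target over megabatches
    (cubic ramp, an illustrative assumption)."""
    return [target_sparsity * (1.0 - (1.0 - (t + 1) / num_megabatches) ** 3)
            for t in range(num_megabatches)]

# Prune more aggressively with each incoming megabatch.
weights = np.random.randn(100)
for sparsity in progressive_sparsity_schedule(0.9, num_megabatches=5):
    weights = magnitude_prune(weights, sparsity)
    # ... train on the current megabatch here ...
```

The schedule reaches the full target sparsity only on the final megabatch, so early megabatches are trained with a denser, more expressive network.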
Robustifying GNN Via Weighted Laplacian (Best Student Paper Award)
Bharat Runwal,
Vivek Dahiya,
Sandeep Kumar
SPCOM, 2022 
Received signal modeling and BER analysis for molecular SISO communications
Arunava Das, Bharat Runwal, O. Tansel Baydas, Dr. Oktay Cetinkaya, Prof. Özgür B. Akan
ACM NanoCom 2022
Pruning CodeBERT for Improved Code-to-Text Efficiency
Alex Gu, Ria Sonecha, Saaketh Vedantam, Bharat Runwal, Diganta Misra
Sparsity in Neural Networks (SNN) Workshop, ICLR 2023
Uncovering the Hidden Cost of Model Compression
Diganta Misra*,
Agam Goyal* (UW-Madison),
Bharat Runwal*,
Pin-Yu Chen (IBM)
From PEFT to DEFT: Parameter Efficient Finetuning for Reducing Activation Density in Transformers   New!
Bharat Runwal,
Tejaswini Pedapati (IBM),
Pin-Yu Chen (IBM)