
Bharat Runwal

I graduated from IIT Delhi with a B.Tech in Electrical Engineering (Power and Automation). At IITD, I was a member of the MISN group, where I completed my B.Tech thesis, "Robustifying GNN against Poisoning Adversarial Attacks using Weighted Laplacian," under the guidance of Prof. Sandeep Kumar.

My research interests span Computer Vision, Multimodal Learning, Compositionality, Efficient ML, and Continual Learning. However, I'm always open to exploring new research directions and developing interesting real-world projects.

I am currently working as an independent researcher. If you're interested in my work or would like to collaborate, feel free to reach out.

Email  /  Google Scholar  /  Twitter  /  LinkedIn  /  GitHub


News




Research Experience

Research Intern, Jan 2023 - Ongoing

Collaborators: Yilun Du, Prof. Josh Tenenbaum

Research Topic: Continual Generative Modeling


Visiting Scholar, Jan 2022 - March 2023
CERC-AAI Lab, Mila - Quebec AI Institute
Collaborators: Diganta Misra, Irina Rish

Research Topic: Sparsity, Continual Learning


Research Intern, June 2021 - Oct. 2021
Internet of Everything (IoE) Group, University of Cambridge
Collaborators: Arunava Das, Dr. Oktay Cetinkaya, Prof. Özgür B. Akan

Research Topic: Received Signal Modeling and BER Analysis for Molecular SISO Communications
Accepted at ACM NanoCom 2022.


Research Intern, Oct. 2020 - May 2021
Deep Data Lab, HPI Potsdam, Germany
Supervisor: Prof. Gerard de Melo
Research Area: NLP

Work Experience

Research Engineer, March 2024 - August 2024
Simbian
Spearheaded the development of the Security Accelerator, improving threat hunting and detection in the cybersecurity domain.


AI Research Intern, June 2021 - August 2021
AlphaICs
Research Area: Quantization of Neural Networks and Graph Neural Networks (GNNs)


Junior Machine Learning Engineer, June 2021 - August 2021
Omdena
Project: Helping People with Visual Impairment to Easily Use Buses through Computer Vision


NLP Intern, May 2021 - June 2021
Zevi
Worked on building a vernacular search engine for e-commerce applications, with features such as price-tag detection from queries, autocomplete, and spell check.




Publications
* indicates equal contribution

APP: Anytime Progressive Pruning
Diganta Misra*, Bharat Runwal*, Tianlong Chen, Zhangyang Wang, Irina Rish

DyNN Workshop at ICML, 2022
SNN Workshop, 2022
CLL Workshop at ACML, 2022
SlowDNN Workshop, 2023
project / paper / webpage / abstract / bibtex

With the latest advances in deep learning, there has been a lot of focus on the online learning paradigm due to its relevance in practical settings. Although many methods have been investigated for optimal learning settings in scenarios where the data stream is continuous over time, sparse network training in such settings has often been overlooked. In this paper, we explore the problem of training a neural network with a target sparsity in a particular case of online learning: the anytime learning at macroscale paradigm (ALMA). We propose a novel way of progressive pruning, referred to as Anytime Progressive Pruning (APP); the proposed approach significantly outperforms the baseline dense and Anytime OSP models across multiple architectures and datasets under short, moderate, and long-sequence training. For example, our method shows an improvement in accuracy of ≈7% and a reduction in the generalization gap by ≈22%, while being ≈1/3rd the size of the dense baseline model in few-shot restricted ImageNet training. We further observe interesting nonmonotonic transitions in the generalization gap in ALMA with a high number of megabatches. The code and experiment dashboards can be accessed at https://github.com/landskape-ai/Progressive-Pruning and https://wandb.ai/landskape/APP, respectively.

@misc{misra2022app,
  title={APP: Anytime Progressive Pruning},
  author={Diganta Misra and Bharat Runwal and Tianlong Chen and Zhangyang Wang and Irina Rish},
  year={2022},
  eprint={2204.01640},
  archivePrefix={arXiv},
  primaryClass={cs.LG}
}
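For intuition, here is a minimal, schematic magnitude-pruning step in the spirit of the abstract above (simplified and hypothetical; the exact criterion and schedule live in the linked repository):

```python
# Schematic sketch of one progressive-pruning step (illustrative only;
# see https://github.com/landskape-ai/Progressive-Pruning for the real method).
import torch
import torch.nn.utils.prune as prune

def prune_step(model: torch.nn.Module, amount: float) -> None:
    """Globally prune `amount` of the remaining weights by L1 magnitude."""
    params = [(m, "weight") for m in model.modules()
              if isinstance(m, (torch.nn.Linear, torch.nn.Conv2d))]
    prune.global_unstructured(
        params, pruning_method=prune.L1Unstructured, amount=amount)

# In ALMA, data arrives as a stream of megabatches; pruning is applied
# progressively, e.g. after training on each megabatch:
#   for megabatch in stream:
#       train(model, megabatch)          # hypothetical training loop
#       prune_step(model, amount=0.2)    # illustrative per-step ratio
```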
Robustifying GNN Via Weighted Laplacian (Best Student Paper Award)
Bharat Runwal, Vivek Dahiya, Sandeep Kumar

SPCOM, 2022 
Received Signal Modeling and BER Analysis for Molecular SISO Communications
Arunava Das, Bharat Runwal, O. Tansel Baydas, Dr. Oktay Cetinkaya, Prof. Özgür B. Akan

ACM NanoCom 2022
Paper
Pruning CodeBERT for Improved Code-to-Text Efficiency
Alex Gu, Ria Sonecha, Saaketh Vedantam, Bharat Runwal, Diganta Misra

Sparsity in Neural Networks (SNN) Workshop, ICLR 2023
Paper [Preprint Soon!]  
Uncovering the Hidden Cost of Model Compression
Diganta Misra*, Muawiz Chaudhary, Agam Goyal*, Bharat Runwal*, Pin-Yu Chen

PiV @ CVPR, 2024
Paper / Code
From PEFT to DEFT: Parameter Efficient Finetuning for Reducing Activation Density in Transformers  New!
Bharat Runwal, Tejaswini Pedapati (IBM), Pin-Yu Chen (IBM)

Paper / Code
SOUL: Unlocking the Power of Second-Order Optimization for LLM Unlearning  New!
Jinghan Jia, Yihua Zhang, Yimeng Zhang, Jiancheng Liu, Bharat Runwal, James Diffenderfer, Bhavya Kailkhura, Sijia Liu

EMNLP Main Conference, 2024
Paper / Code
TaskGen: A Task-Based, Memory-Infused Agentic Framework using StrictJSON  New!
John Chong Min Tan, Prince Saroj, Bharat Runwal, Hardik Maheshwari, Brian Lim Yi Sheng, Richard Cottrill, Alankrit Chona, Ambuj Kumar, Mehul Motani

Paper / Code / Video
Projects
Continual-Diffusers
September'24

A PyTorch library for continual learning with diffusion models.

Energy-Based Diffusion Model Training
April'23

Re-implementation of energy-based diffusion model training in PyTorch with various MCMC samplers (reference work: "Reduce, Reuse, Recycle: Compositional Generation with Energy-Based Diffusion Models and MCMC").
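As a rough illustration of the core idea (product-of-experts composition sampled with Langevin MCMC), here is a minimal sketch; `energy_a` and `energy_b` are placeholder energy networks and the step sizes are illustrative, not the library's actual sampler:

```python
# Minimal Langevin MCMC sketch for sampling from the product of two
# energy-based models (illustrative only).
import torch

def langevin_sample(energy_a, energy_b, x, n_steps=50, step_size=1e-2):
    for _ in range(n_steps):
        x = x.detach().requires_grad_(True)
        # Composing EBMs: energies add, so their gradients (scores) add too.
        energy = energy_a(x).sum() + energy_b(x).sum()
        grad, = torch.autograd.grad(energy, x)
        # Gradient step down the composed energy plus Gaussian noise.
        x = x - 0.5 * step_size * grad + (step_size ** 0.5) * torch.randn_like(x)
    return x.detach()
```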

Weighted Signed Graph Attention Networks
Nov'21

Enhanced the learned embeddings of the network nodes by adapting the loss function of the SiGAT model to weighted signed graphs. The learned embeddings show better inter-class separability in the embedding space.
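A hypothetical sketch of the kind of edge-weighted signed-link objective involved (the project's actual adaptation of the SiGAT loss may differ):

```python
# Hypothetical edge-weighted signed-link loss (illustrative sketch only).
import torch
import torch.nn.functional as F

def weighted_sign_loss(z_src, z_dst, signs, weights):
    """z_src, z_dst: embeddings of each edge's endpoints, shape (E, d);
    signs: 1.0 for positive edges, 0.0 for negative, shape (E,);
    weights: absolute edge weights, shape (E,)."""
    logits = (z_src * z_dst).sum(dim=-1)  # dot-product score per edge
    per_edge = F.binary_cross_entropy_with_logits(logits, signs, reduction="none")
    return (weights * per_edge).mean()    # heavier edges contribute more
```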

Abstractive Summarization Methods Analysis on the AMI Meeting Corpus
May'21

This project involves generating summaries of AMI meeting transcripts. It analyzes different abstractive summarization methods based on SOTA language models and also tackles the problem of summarizing longer documents, such as the AMI meeting transcripts.
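A minimal chunk-then-summarize sketch for long transcripts (the model and chunk size here are illustrative choices, not necessarily what the project used):

```python
# Chunk a long transcript, summarize each chunk, and join the partial
# summaries (illustrative; model and lengths are placeholder choices).
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

def summarize_long(transcript: str, chunk_words: int = 700) -> str:
    words = transcript.split()
    chunks = [" ".join(words[i:i + chunk_words])
              for i in range(0, len(words), chunk_words)]
    parts = [summarizer(c, max_length=80, min_length=20)[0]["summary_text"]
             for c in chunks]
    return " ".join(parts)
```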

Anomaly Detection in Time-Series Data of the S&P 500
May'20

This project detects anomalies (sudden price changes) in the closing prices of the S&P 500 (stock market index) time series using an LSTM autoencoder. Since LSTM networks are well suited to time-series data, I trained an LSTM autoencoder using the Keras API with TensorFlow 2 as the backend.
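For reference, a minimal sketch of such an LSTM autoencoder in Keras (window size and layer widths are illustrative, not the project's actual values):

```python
# Minimal LSTM autoencoder for reconstruction-based anomaly detection
# (illustrative hyperparameters).
import numpy as np
from tensorflow.keras import layers, models

TIMESTEPS, FEATURES = 30, 1  # e.g. windows of 30 normalized closing prices

model = models.Sequential([
    layers.LSTM(64, input_shape=(TIMESTEPS, FEATURES)),  # encode window
    layers.RepeatVector(TIMESTEPS),                      # repeat latent code
    layers.LSTM(64, return_sequences=True),              # decode sequence
    layers.TimeDistributed(layers.Dense(FEATURES)),      # per-step output
])
model.compile(optimizer="adam", loss="mae")
# model.fit(x_train, x_train, epochs=20, batch_size=32)

# Windows whose reconstruction error exceeds a threshold are flagged:
# errors = np.mean(np.abs(model.predict(x_test) - x_test), axis=(1, 2))
# anomalies = errors > np.percentile(errors, 99)
```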

Face Generation using GAN
Apr'21

This project uses two networks: a generator, which takes random noise as inspiration and tries to generate a face sample, and a discriminator, which takes a face sample and tries to tell whether it is real or fake, i.e., it predicts the probability of the input image being a real face. A snippet of faces generated by the trained model after 15k iterations is attached.
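A minimal sketch of that two-network setup in PyTorch (fully connected here for brevity; the actual project likely used convolutional architectures):

```python
# Toy generator/discriminator pair (illustrative sizes only).
import torch
import torch.nn as nn

LATENT_DIM, IMG_DIM = 100, 64 * 64 * 3

generator = nn.Sequential(            # noise -> fake face (flattened)
    nn.Linear(LATENT_DIM, 512), nn.ReLU(),
    nn.Linear(512, IMG_DIM), nn.Tanh(),
)
discriminator = nn.Sequential(        # image -> probability it is real
    nn.Linear(IMG_DIM, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 1), nn.Sigmoid(),
)

bce = nn.BCELoss()
z = torch.randn(16, LATENT_DIM)
fake = generator(z)
# The generator is trained to make the discriminator call its fakes real:
g_loss = bce(discriminator(fake), torch.ones(16, 1))
# The discriminator is trained with real images labeled 1 and fakes labeled 0.
```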

Deep Learning Projects

Implementations of other projects: Fake News Detection using LSTM, Image Captioning, Image-Steganography-using-lsb.





Education

B.Tech in Electrical Engineering (Power and Automation), 2018 - 2022
Indian Institute of Technology, Delhi
Advisor: Asst. Prof. Sandeep Kumar
Delhi, India


Updated on: 7th February, 2022 Merci, Jon Barron!