
I am Rachit, a PhD student at Harvard University advised by Prof. Sham Kakade and Prof. David Alvarez-Melis.
Over the past few years, I took my first baby steps as a researcher, thanks to some wonderful people and collaborations. Most recently, I was a pre-doctoral researcher at Google DeepMind, working on modularizing LLMs with Partha and Prateek. Before that, I pursued my bachelor’s thesis research with Yonatan at the Technion in Israel, studying the information-theoretic properties of neural networks. Earlier, I was a research intern at Adobe’s Media and Data Science Research Lab, where I worked on commonsense reasoning for large language models.
I was fortunate to collaborate with Danish on evaluating neural attribution methods[^1]. I also had an amazing time working with Naomi studying mode connectivity in the loss surfaces of language models.
I also spent a couple of wonderful summers in the Google Summer of Code program with the Cuneiform Digital Library Initiative (CDLI), advised by Jacob and Niko.
News and Timeline
2026
- January Our work at Meta on test-time training for long-context LLMs and RL scaling laws was accepted at ICLR 2026!
2025
- May Interning at Meta Superintelligence Lab (MSL) this summer!
2024
- August Starting my doctorate at Harvard University!
- May Presenting our work on composing LLMs at ICLR 2024 in Vienna!
2023
- May Presenting our work on linear mode connectivity at ICLR 2023 in Kigali! With Jeevesh and Naomi.
2022
- September My bachelor’s thesis work done at the Technion was accepted at NeurIPS 2022!
- August Joining Google DeepMind as a pre-doctoral researcher!
- May Two papers on commonsense and factual reasoning done at Adobe MDSR accepted at NAACL 2022!
- January Starting my bachelor’s thesis with Yonatan at the Technion, Israel!
2021
- November Our work evaluating model explanations was accepted at TACL! In collaboration with Danish and others at CMU.
[^1]: Started with a meek, awe-inspired email.