I am a doctoral student in the Language Technologies Institute at Carnegie Mellon University, where I am advised by Yonatan Bisk and Emma Strubell.
My research interests are in characterizing and evaluating the impact of machine learning workloads, with the goal of designing computationally and energy-efficient systems for artificial intelligence. My research is generously supported by the NSF Graduate Research Fellowship.
Previously, I received my B.S. in Computer Science and B.S. in Electrical Engineering from Northwestern University, where I worked with Doug Downey and Thrasos Pappas. I've also spent time at Google, Meta FAIR, and the Allen Institute for Artificial Intelligence.
Email: jaredfern [at] cmu.edu
Out-of-Date News:
[April 2025] Presented joint work on low-power distributed training at SIGBOVIK!
[Nov. 2024] Presented our paper on continual pretraining for LLMs at EMNLP!
[June 2024] Interning at Meta FAIR studying the efficiency of distributed training!
[Dec. 2023] Presented work on deep learning framework efficiency at EMNLP!