I'm a fifth-year Computer Science PhD candidate at Stanford, co-advised by Chris Ré and Kayvon Fatahalian, and affiliated with the Stanford AI Lab, Stanford CRFM, the Stanford Machine Learning Group, DAWN, and the Stanford Computer Graphics Lab. I'm also an academic partner with Together. In my research, I develop ML and systems algorithms to break crucial modeling bottlenecks. Recently, I've been particularly interested in enabling longer-sequence models, with a line of work starting with H3 and FlashAttention and continuing with Hyena. This blog post summarizes our past two years of work on increasing the context length of foundation models.