OLMo: Everything You Need to Train an Open Source LLM with Akshita Bhagia
EPISODE 674 | MARCH 4, 2024
About this Episode
Today we’re joined by Akshita Bhagia, a senior research engineer at the Allen Institute for AI. Akshita joins us to discuss OLMo, a new open source language model available in 7-billion- and 1-billion-parameter variants, which differs from similar models offered by Meta, Mistral, and others in a key way: AI2 has also published the dataset and key tools used to train the model. In our chat with Akshita, we dig into the OLMo models and the various projects falling under the OLMo umbrella, including Dolma, an open three-trillion-token corpus for language model pretraining, and Paloma, a benchmark and tooling for evaluating language model performance across a variety of domains.
About the Guest
Akshita Bhagia
Allen Institute for AI (AI2)
Resources
- Paper: Paloma: A Benchmark for Evaluating Language Model Fit
- Open Language Model: OLMo
- Paper: Dolma: an Open Corpus of Three Trillion Tokens for Language Model Pretraining Research
- Hugging Face 🤗 BLOOM
- Hugging Face 🤗 Llama
- Paper: What's In My Big Data?
- Paper: LLaMA: Open and Efficient Foundation Language Models
- Paper: Llama 2: Open Foundation and Fine-Tuned Chat Models
- BloombergGPT – an LLM for Finance with David Rosenberg - #639
