Anthropic Head of Pretraining on Scaling Laws, Compute, and the Future of AI

Episode: 291 of 304
Duration: 1h 4min
Language: English
Category: Non-fiction

Ever wonder what it actually takes to train a frontier AI model? YC General Partner Ankit Gupta sits down with Nick Joseph, Anthropic's Head of Pretraining, to explore the engineering challenges behind training Claude—from managing thousands of GPUs and debugging cursed bugs to balancing compute between pretraining and RL. They cover scaling laws, data strategies, team composition, and why the hardest problems in AI are often infrastructure problems, not ML problems.
