GPUs Required for Training Large Models: GPT-4, LLaMA, Falcon, Inflection

How Many GPUs Are Needed?

  • GPT-4 was likely trained on somewhere between 10,000 and 25,000 A100s.
  • Meta has about 21,000 A100s.
  • Tesla has about 7,000 A100s.
  • Stability AI has about 5,000 A100s.
  • Falcon-40B was trained on 384 A100s.
  • Inflection used 3,500 H100s for their GPT-3.5-equivalent model.
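The counts above can be sanity-checked with a back-of-envelope estimate. A common approximation is that training a transformer takes about 6 × (parameters) × (tokens) FLOPs; dividing that budget by the effective throughput per GPU and the wall-clock time gives a rough GPU count. The sketch below uses this approximation with hypothetical numbers (70B parameters, 2T tokens, 30 days, A100 peak BF16 throughput of ~312 TFLOP/s, 40% utilization) that are assumptions for illustration, not figures from the article:

```python
def estimate_gpus(params: float, tokens: float, days: float,
                  peak_flops_per_gpu: float, mfu: float) -> float:
    """Rough GPU count needed to finish training in the given time.

    Uses the common ~6*N*D FLOPs approximation for transformer training.
    mfu is model FLOPs utilization (fraction of peak actually achieved).
    """
    total_flops = 6 * params * tokens          # total training compute budget
    seconds = days * 86_400                    # wall-clock training time
    effective = peak_flops_per_gpu * mfu       # sustained FLOP/s per GPU
    return total_flops / (seconds * effective)


# Hypothetical example: 70B-parameter model, 2T tokens, 30 days on A100s
# (312 TFLOP/s BF16 peak) at 40% utilization -- roughly a few thousand GPUs,
# broadly consistent with the cluster sizes listed above.
n = estimate_gpus(params=70e9, tokens=2e12, days=30,
                  peak_flops_per_gpu=312e12, mfu=0.4)
print(f"~{n:.0f} GPUs")
```

This ignores many real-world factors (checkpoint restarts, data/pipeline parallel inefficiencies, interconnect limits), so treat it as an order-of-magnitude estimate only.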


Reposted from blog.csdn.net/u013250861/article/details/132156836