What does OpenAI's Strawberry mean for AI hardware?

Sun Sep 15 2024


There has been talk that Strawberry uses reinforcement learning to reason through multiple steps on its own, and that this will reduce GPU usage. That is only half right. Strawberry may indeed use fewer GPUs for training than GPT-4, but mainly because it is not a general-purpose tool: Strawberry does not target multi-modal image/video generation, and instead focuses on reasoning through complex tasks at inference time and giving correct answers.
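To make that distinction concrete, here is a minimal sketch of why inference-time reasoning shifts the compute burden: a reasoning model generates many intermediate "thinking" tokens before the final answer, so each query costs noticeably more GPU work than a standard chat completion. All token counts below are made-up assumptions for illustration, not measured figures for GPT-4 or Strawberry/o1.

```python
# Hypothetical illustration: inference-time reasoning inflates the number of
# tokens generated (and therefore the GPU work) per user query.
# Every number here is an assumption chosen only to show the shape of the argument.

answer_tokens = 300          # visible answer length for a typical query (assumption)
reasoning_tokens = 3_000     # hidden chain-of-thought tokens a reasoning model
                             # might generate before answering (assumption)

standard_model_tokens = answer_tokens
reasoning_model_tokens = reasoning_tokens + answer_tokens

print(f"standard model:  {standard_model_tokens} tokens/query")
print(f"reasoning model: {reasoning_model_tokens} tokens/query "
      f"(~{reasoning_model_tokens / standard_model_tokens:.0f}x more inference work)")
```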

Because Strawberry focuses on reasoning at inference time, rising usage of Strawberry would create a massive increase in demand for inference compute, which Nvidia confirms as well.

As such, on training alone it is possible that Strawberry uses fewer GPUs, but that doesn't mean much to the infrastructure industry, because:

  1. The emergence of Strawberry expands the audience rather than competing with GPT-4 for the same audience.
  2. Combining training and inference, Strawberry could actually consume more hardware, depending on the number of users (see the back-of-envelope sketch after this list).
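A rough back-of-envelope sketch of point 2, using the common approximations of ~6 × parameters FLOPs per training token and ~2 × parameters FLOPs per generated token at inference. The model size, training-set size, tokens per query, and query volumes below are all assumptions, not figures from OpenAI; the point is only that cumulative inference compute scales with the user base and can eventually dwarf any one-off training saving.

```python
# Back-of-envelope sketch: total GPU work = one-off training + per-query
# inference * number of queries. All concrete numbers are assumptions.

PARAMS = 200e9               # assumed model size (parameters)
TRAIN_TOKENS = 5e12          # assumed training-set size (tokens)
TOKENS_PER_QUERY = 3_300     # assumed reasoning + answer tokens per query

train_flops = 6 * PARAMS * TRAIN_TOKENS          # ~6 * params FLOPs per training token
flops_per_query = 2 * PARAMS * TOKENS_PER_QUERY  # ~2 * params FLOPs per generated token

for daily_queries in (1e6, 1e8, 1e9):
    # cumulative inference compute over one year of usage
    yearly_inference_flops = flops_per_query * daily_queries * 365
    ratio = yearly_inference_flops / train_flops
    print(f"{daily_queries:>10.0e} queries/day -> "
          f"inference/training compute ratio over a year: {ratio:.1f}x")
```

With these assumed numbers, inference compute stays small at a million queries a day but overtakes training by roughly an order of magnitude at a billion queries a day, which is the sense in which more users means more hardware, not less.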

Jim Fan at Nvidia also comes to a similar conclusion: https://www.linkedin.com/posts/drjimfan_openai-strawberry-o1-is-out-we-are-finally-activity-7240045907814051840-YjVF

That's it for this week!