Sorry Nvidia, TPUs from Google now power part of OpenAI’s ChatGPT workloads
  • OpenAI adds Google TPUs to reduce dependence on Nvidia GPUs
  • TPU adoption highlights OpenAI’s push to diversify compute options
  • Google Cloud wins OpenAI as customer despite competitive dynamics

OpenAI has reportedly begun using Google’s tensor processing units (TPUs) to power ChatGPT and other products.

A Reuters report, citing a source familiar with the move, describes this as OpenAI’s first significant shift away from Nvidia hardware, which has until now formed the backbone of the company’s compute stack.

Wayne Williams
