Part 5/14:
He also highlighted the ongoing expansion of AI compute capacity, with over 35,000 H100 GPUs active—a figure expected to reach roughly 85,000 by year's end. This compute power underpins not only the refinement of FSD but also the potential of "distributed inference," in which Tesla's fleet of vehicles could serve as a massive, decentralized AI cloud, opening up both safety improvements and new revenue streams.