Part 2/14:
Ross challenged the conventional view of computational constraints in AI. He argued that current limitations, such as model inference capacity, are not mere bottlenecks but opportunities for exponential growth if addressed correctly. As he put it, "If OpenAI or Anthropic were given twice the inference compute, their revenue would almost double within a month." In other words, scaling up compute isn't just a performance question; it translates directly into revenue and ecosystem expansion.