Part 2/9:
Dean noted that the roots of the current AI landscape trace back to foundational work on neural networks around 2012 and 2013. This period saw the introduction of large neural networks capable of tackling diverse problems, from visual recognition to speech and language processing. Dean recounted a pivotal moment when Google trained a neural network 60 times larger than its predecessors, leveraging 16,000 CPU cores. The result affirmed a pattern that has held up over the years, captured in the motto "bigger models, more data, better results."