Part 3/15:
One view holds that GPT-4 isn't radically bigger than GPT-3—perhaps a trillion to low tens of trillions of parameters—a gradual evolution rather than a colossal leap.
A competing rumor puts GPT-4 at around 100 trillion parameters—a massive scale-up that would transform capabilities but would also pose enormous computational challenges.
Estimating from growth trends, it seems plausible GPT-4 falls somewhere between 1 trillion and 20 trillion parameters—a substantial leap, but far short of the 100 trillion figure some suggest. That range works out to roughly a 6x to 100x increase over GPT-3's 175 billion parameters, in line with the progression seen historically.
Historically, OpenAI's major leaps:
- GPT-2 (1.5 billion) to GPT-3 (175 billion) was roughly a 100x increase within just over a year.
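The multipliers above are easy to sanity-check with back-of-the-envelope arithmetic. The sketch below uses the public parameter counts for GPT-2 and GPT-3 and the speculated 1T–20T range for GPT-4; the GPT-4 numbers are estimates, not confirmed figures:

```python
# Sanity-check the scale-up multipliers discussed above.
# Parameter counts in billions; GPT-4 range is speculative.
gpt2 = 1.5
gpt3 = 175.0
gpt4_low, gpt4_high = 1_000.0, 20_000.0  # 1T to 20T (rumored)

print(f"GPT-2 -> GPT-3: {gpt3 / gpt2:.0f}x")              # ~117x
print(f"GPT-3 -> GPT-4 (low estimate):  {gpt4_low / gpt3:.1f}x")
print(f"GPT-3 -> GPT-4 (high estimate): {gpt4_high / gpt3:.0f}x")
```

Running this shows the GPT-2-to-GPT-3 jump was roughly 100x, while even the high end of the speculated GPT-4 range is closer to a 100x step than the 500x-plus implied by the 100 trillion rumor.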