It’s also that AGI (even at merely human level) would be faster: an AGI equivalent to a human mathematician but thinking 1000x faster should achieve 1000 years of progress in a single year. Then there’s scalability: we have a couple of million scientists and engineers in the world today, but once AGI at that level is scalable, agentic, and (where needed) physically embodied, we could have billions or trillions of them (again limited by the rate of recursive self-improvement), all communicating and transferring acquired knowledge at electronic speeds. Humans, the current drivers of progress, are not subject to recursive self-improvement beyond tool use and specialization. Kurzweil’s 2045 estimate is therefore likely far too conservative, and we might get AGI/ASI even before his 2029 expectation (which he himself acknowledged recently).