The AI Stack And Where The Money Resides

AI is going to generate fortunes like we have never seen before.

That said, not everything related to AI is going to make money. This is the lesson from the Dotcom era. Many thought that simply because a company was "on the Internet" or "developing an app," it was going to be worth a fortune.

In reality, most of what was out there was garbage.

For that reason, we will take a look at the AI stack to see where the money is going to be made. In my mind, it shapes up like this.


(Image made using Ideogram)

Hardware

Here is the money printing machine.

It is the world of NVIDIA and whoever else is making AI chips. For now, GPUs dominate, which makes NVIDIA the clear-cut winner. Others, like Tesla and Google, design their own chips. Newer technology could also emerge here and cause a massive breakthrough.

Whatever it is, this is the engine that drives everything. Compute, aka processing power, is going to be required, and those who make this hardware are going to do well.

Infrastructure

This might not be as lucrative as chip making, yet it is easier to get into. Whereas the hardware layer will have only a few major players, this one is more open.

What are some examples?

We have agent orchestration tools. This also includes software designed to evaluate what the models are doing. Then there is deployment.

Basically, we are looking at everything that sits on top of the hardware to help produce the outcomes.

LLMs

Next up are the Large Language Models themselves. This is the level where things can get interesting.

Personally, I foresee a time when these are commoditized. At some point, separating one model from the rest is going to be difficult. They are all basically training on the same data, which will then lead to similar synthetic data, resulting in similar systems.

At present, we see an arms race because we are early in the game. However, as more people play around with the technology, especially with open source systems out there, it is easy to see how this could flip to a commodity.

Applications

As with the first two layers, we can see enormous value here.

What I find fascinating is that AI companies that already have working products could see that value enhanced by creating their own LLMs. Nevertheless, it is the integration where the money will be made.

For Meta, this is integrating Llama into all its products. Google is doing something similar. We are also likely to see some of these companies building robots, like Tesla and OpenAI, which will utilize the LLMs.

This is what most will focus upon, including those in Web 3.0.

Web 3.0 Applications

The key, to me, is to build as much of this stack as we can.

Obviously, the hardware layer is going to be difficult. Perhaps we could end up seeing distributed computing come into play here. However, can this be used to efficiently train these models? We are looking at a ton of compute to even begin to rival what Big Tech is doing.

That said, the rest of the structure can be addressed to varying degrees.

The central premise is going to be eventually offering services. AI is like anything else: people will not use it if there is no need.

So far, with Web 3.0, that is the case. What services are being offered? That is the billion dollar question, one to which there really isn't an answer yet.

This could quickly change, especially as AI penetrates, well, everything we do in the digital world. These are the very early stages, and blockchain could be a great ally in this endeavor.

For this reason, Web 3.0 applications can certainly leverage the technology. An example of this is AI personal assistants. Naturally, not all of this is going to sit on public databases. However, these could be offered within Web 3.0 applications that incorporate different aspects, including cryptocurrency and decentralized data.

Each layer of the stack offers some potential except, ironically, the LLMs. If things like Grok and Llama become truly open source and rival what else is out there, then we could see something completely different in a few years.

The race might be to become the standard AI operating system and let everyone else build value for the model, which is then integrated into that company's products.

Interesting times to say the least.



Comments

This is a great way to look at things. Cutting it up and looking at the different parts makes it easy to identify where opportunities are available. Hardware and Infrastructure are a bit difficult since it can cost a lot to be competitive. LLMs and Applications, on the other hand, are much easier to get into.

That is the only way to do it. There are companies that are focusing on each level, investing billions. That is how they are looking at it.

Therefore, so should we in my opinion.

I am so far off when it comes to AI. I understand the basic concept but am not sure how all the layers work. I feel like existing tech companies will leverage their power and resources to eat up all the innovative work that is happening now and will happen in the future.

Understanding each layer is crucial. There are companies spending billions in their chosen area to make this all come to fruition.

Having knowledge of what it all entails is crucial, especially if we want to disrupt things.