You are viewing a single comment's thread from:

RE: Tesla Inference Computing And The Problem It Solves

in LeoFinance · 2 months ago

Haven't heard any details. That is a long way away.

As for access to the computers, it would use the same method as they do now. I am not sure what the over-the-air updates use, for example, but the computers are connected to the Tesla network.
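Just to make that concrete, here is a toy sketch (in Python) of how a dispatcher might pick vehicles for a job, assuming each car reports a status heartbeat over that same network connection. Everything here (the VehicleStatus fields, pick_candidates, the selection rules) is invented for illustration; it is not Tesla's actual API.

```python
from dataclasses import dataclass

# Hypothetical status report a vehicle might send over its existing
# network connection (cellular or the owner's WiFi). All fields and
# names are invented for illustration; this is not Tesla's actual API.
@dataclass
class VehicleStatus:
    vehicle_id: str
    online: bool      # currently reachable over the Tesla network
    plugged_in: bool  # charging, so inference doesn't drain the pack
    idle: bool        # parked; the FSD computer is free
    on_wifi: bool     # cheap bandwidth, like the OTA updates use

def pick_candidates(fleet: list[VehicleStatus]) -> list[str]:
    """Select vehicles that could safely run an inference job."""
    return [
        v.vehicle_id
        for v in fleet
        if v.online and v.plugged_in and v.idle and v.on_wifi
    ]

fleet = [
    VehicleStatus("car-001", online=True, plugged_in=True, idle=True, on_wifi=True),
    VehicleStatus("car-002", online=True, plugged_in=False, idle=True, on_wifi=True),
]
print(pick_candidates(fleet))  # ['car-001']
```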

> Would Tesla pay the vehicle owners for the use of their computers and the battery power consumed?

I would think it would be more the applications using the computers for inference that pay. With an "Inference-as-a-Service" model, different applications could use the infrastructure, and they would have to pay for the service.
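To put rough numbers on "they would have to pay for the service", here is a back-of-envelope sketch of how the money might flow: the application pays a metered rate per compute-second, the platform keeps a cut, and the rest goes to the vehicle owner. Every rate below is a made-up placeholder, not a real price.

```python
# Hypothetical revenue split for "Inference-as-a-Service".
# Every rate here is an invented placeholder, not a real price.
APP_RATE_PER_SECOND = 0.0005  # what an application pays per compute-second
PLATFORM_CUT = 0.30           # share the platform keeps

def settle(compute_seconds: float) -> dict[str, float]:
    """Split an application's metered bill between platform and owner."""
    billed = compute_seconds * APP_RATE_PER_SECOND
    platform_share = billed * PLATFORM_CUT
    owner_share = billed - platform_share  # covers electricity and wear
    return {
        "billed": round(billed, 4),
        "platform": round(platform_share, 4),
        "owner": round(owner_share, 4),
    }

# One hour of inference on a single vehicle:
print(settle(3600))  # {'billed': 1.8, 'platform': 0.54, 'owner': 1.26}
```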


Ah, thank you.

I'm pretty sure the OTA updates use the owner's home WiFi... which is likely to have a better connection than in-car cellular service and is less likely to have data caps (in the US, at least).

Oh... I hadn't realized that the applications themselves could potentially pay the owners for the cloud computing. That's a really interesting concept... I think it could be hard to pull off technically, but I'll be curious about the pricing to see how competitive it might be with AWS, Azure, etc.

> I'll be curious about the pricing to see how competitive it might be with AWS, Azure, etc.

I don't think it is a mistake that Amazon put $4 billion into Anthropic (Claude) and that Microsoft has a huge stake in OpenAI. We are seeing the LLMs tied to the companies with the training compute.

What about inference? That is the next capacity they are all going to have to build out. We will see how it goes.
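On the pricing question quoted above, the comparison is easy to parameterize even though every input is a guess today. A quick sketch, using invented figures for both the fleet rate and the cloud GPU rate (nothing here is a quoted AWS or Tesla price):

```python
# Back-of-envelope: hypothetical fleet inference vs. a cloud GPU.
# Every figure below is an invented assumption, not a quoted price.
FLEET_RATE_PER_HR = 1.80  # assumed all-in hourly rate on one vehicle
FLEET_TOPS = 100          # assumed FSD-computer-class throughput
CLOUD_RATE_PER_HR = 4.00  # placeholder cloud GPU on-demand rate
CLOUD_TOPS = 400          # placeholder cloud GPU throughput

def cost_per_tops_hour(rate_per_hr: float, tops: float) -> float:
    """Dollars per TOPS-hour: lower is more competitive."""
    return rate_per_hr / tops

fleet = cost_per_tops_hour(FLEET_RATE_PER_HR, FLEET_TOPS)
cloud = cost_per_tops_hour(CLOUD_RATE_PER_HR, CLOUD_TOPS)
print(f"fleet: ${fleet:.4f}/TOPS-hr, cloud: ${cloud:.4f}/TOPS-hr")
# Under these made-up numbers the cloud still wins per unit of compute;
# the open question is how much fleet scale and sunk hardware change that.
```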