RE: Tesla Inference Computing And The Problem It Solves

in LeoFinance2 months ago

Ah, thank you.

I'm pretty sure the OTA updates use the owner's home WiFi... which is likely to have better connections than an in-car cellular service and is less likely to have data limits (in the US at least).

Oh... I hadn't realized that the applications themselves could potentially pay the owners for the cloud computing. That's a really interesting concept... I think it could be technically hard to pull off, but I'll be super curious about the pricing to see how competitive it might be with AWS, Azure, etc.


I'll be super curious about the pricing to see how competitive it might be with AWS, Azure, etc.

I don't think it is a mistake that Amazon put $4 billion into Anthropic (Claude) and Microsoft has a huge stake in OpenAI. We are seeing the major LLMs tied to the companies with the training compute.

What about inference? That is the next challenge they are all going to have to tackle. We will see how it goes.