The demand for LLM inference will drop off when people finally realise it is not the road to AGI. However, there are still plenty of things GPU compute can be applied to, and maybe spot prices will come down again.
It’s more likely that the demand for LLM inference will drop off only once AGI exists.
There are billions of active users already having it do everything from coming up with ideas for Christmas presents to helping them write e-mails to clients.
That isn’t going anywhere until there’s a better option on the table. People would have already got bored and moved on if it wasn’t doing anything useful for them.