• 1 Post
  • 202 Comments
Joined 2 years ago
Cake day: June 10th, 2023

  • Covering operating costs doesn’t make sense as the threshold for this discussion though.

    Operating costs would include things like computing costs for training new models and staffing costs for researchers, both of which would completely disappear in a marginal cost calculation for an existing model.

    If we use DeepSeek R1 as an example of a large high-end model, you can run an 8-bit quantized version of the 600B+ parameter model on Vast.ai for about $18 per hour, or even on AWS for something like $50/hour. Those instances produce tokens fast enough that you can have quite a few users on them at the same time, or even automated processes running concurrently with users. Most medium-sized businesses could likely generate more than $50 in benefit per running hour, especially since you can just shut it down at night and not even pay for that time (a rough back-of-envelope version of this math is at the end of this comment).

    You can just look at it from a much smaller perspective too. A small business could buy access to consumer-GPU-based systems and use them profitably with 30B or 120B parameter open-source models for dollars per hour. I know this is possible because I’m actively doing it.
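
    To put rough numbers on that, here’s a back-of-envelope sketch. The hourly rate echoes the figures above; the active-hours and benefit-per-hour values are purely illustrative assumptions, so swap in your own.

    ```python
    # Back-of-envelope: does renting an instance for an existing model pay off?
    # The hourly rate echoes the ~$50/hour figure above; active hours and
    # benefit per hour are illustrative assumptions, not measurements.
    HOURLY_RATE_USD = 50.0        # rented instance cost per running hour
    ACTIVE_HOURS_PER_DAY = 10     # shut it down at night and pay nothing then
    BENEFIT_PER_HOUR_USD = 80.0   # assumed value generated while it runs

    daily_cost = HOURLY_RATE_USD * ACTIVE_HOURS_PER_DAY
    daily_benefit = BENEFIT_PER_HOUR_USD * ACTIVE_HOURS_PER_DAY

    print(f"Daily cost:    ${daily_cost:.2f}")
    print(f"Daily benefit: ${daily_benefit:.2f}")
    print(f"Net per day:   ${daily_benefit - daily_cost:.2f}")
    # Break-even is simply benefit per hour >= the hourly rate; anything
    # above $50/hour of value makes the marginal economics positive.
    ```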




  • The financial argument is pretty difficult to make.

    You’re right in one sense: there is a bubble here, and some investors/companies are going to lose a lot of money when they get beaten by competitors.

    However, you’re also wrong in the sense that the marginal cost to run them is actually quite low, even with the hardware and electricity costs. The benefit doesn’t have to be that high to generate a positive ROI with such low marginal costs (a toy per-request example is at the end of this comment).

    People are clearly using these tools more and more, even for commercial purposes where you’re paying per token and not some subsidized subscription; just check out the graphs on OpenRouter: https://openrouter.ai/rankings
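
    To make “the benefit doesn’t have to be that high” concrete, here’s a toy per-request calculation. The per-token prices and token counts are made-up, illustrative numbers, not any provider’s actual rates.

    ```python
    # Toy marginal cost per request when paying per token.
    # Prices and token counts are illustrative assumptions, not real rates.
    PRICE_PER_MILLION_INPUT_TOKENS = 0.50    # assumed $/1M input tokens
    PRICE_PER_MILLION_OUTPUT_TOKENS = 3.00   # assumed $/1M output tokens
    INPUT_TOKENS_PER_REQUEST = 1_500         # assumed prompt + context size
    OUTPUT_TOKENS_PER_REQUEST = 500          # assumed response length

    cost_per_request = (
        INPUT_TOKENS_PER_REQUEST * PRICE_PER_MILLION_INPUT_TOKENS
        + OUTPUT_TOKENS_PER_REQUEST * PRICE_PER_MILLION_OUTPUT_TOKENS
    ) / 1_000_000

    print(f"Marginal cost per request: ${cost_per_request:.4f}")
    # At fractions of a cent per request, each request only has to be worth
    # a tiny amount for the ROI to be positive.
    ```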


  • You say “pro-AI” like there’s a group of random people needing to convince others to use the tools.

    The general public tried them, and they’re using them pretty frequently now. Nobody is forcing people to use ChatGPT to figure out their Christmas shopping, but something like 40% of people have already used it, or are planning to use it, for that purpose this year. That’s from a recent poll by Leger.

    If they weren’t at the very least perceived as adding value, people wouldn’t be using them.

    I can say with 100% certainty that there are things I have used AI for that have saved me time and money.

    The Anti-AI crowd may as well be the same people that were Anti-Internet 25 years ago.





  • This is one of those situations where nobody wins. They tried to do something nice, you didn’t like it, and both people ended up unhappy. Neither was being unreasonable.

    Buying someone a gift of a new thing isn’t unreasonable. Even if you tend to like older/used things, it’s still not an unreasonable concept to buy someone an upgrade.

    Not liking a gift because the old one is fine is also not unreasonable. Especially if you have established this as a preference before.

    The best option here is for the partner to realize that the goal of the gift is to make the other person happy, and if that didn’t work, figure out the path forward that does make them happy (in this case, returning the gift and finding something else).



    When Gord Downie, the lead singer of the Tragically Hip, got terminal cancer, he went on tour. His final stop was Kingston, Ontario, and it was streamed live on CBC (our national broadcaster) to approximately 12 million viewers (Canada had a population of about 38 million).

    He died about a year later, in October 2017. Our prime minister released a tribute statement and held a press conference over it. Our House of Commons observed a moment of silence. The Toronto Maple Leafs (NHL hockey team) brought down one of the retired numbers from the rafters during their next game, because Gord had sung about that player (Bill Barilko) in “Fifty Mission Cap”. Almost every radio station that plays music switched to Hip songs for the day. The Canadian Press named him Newsmaker of the Year for 2016, then again for 2017 because of how significant the reaction to his death was.

    Dude was a legend, both musically and in terms of the causes he championed, especially the Indigenous reconciliation stuff.




  • So there’s an answer, and then there’s a problem.

    The easy answer is that Home Assistant has Voice Assistants now, and you can use Ollama, Whisper, and Piper to do it all locally (a minimal example of talking to the Ollama piece directly is at the end of this comment).

    The problem is that it really only talks to Home Assistant: there’s no ability to have it search the web, make a phone call, or do much of anything else outside of Home Assistant without significant add-on work.

    It also requires a reasonably significant amount of RAM on your computer to run the Home Assistant VM while also running Whisper, Piper, and Ollama.
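
    For reference, the Ollama piece of that stack is just a local HTTP service, which is roughly what the Home Assistant integration ends up talking to. A minimal sketch of querying it directly, assuming Ollama is running on its default port (11434) and the model named below has already been pulled:

    ```python
    # Minimal check that a local Ollama instance is reachable and can answer.
    # Assumes Ollama is on its default port (11434) and the model named
    # below has already been pulled; swap in whichever model you run.
    import json
    import urllib.request

    payload = {
        "model": "llama3.2",  # assumed model name, replace with your own
        "prompt": "Turn off the living room lights.",
        "stream": False,
    }
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])
    ```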