Congrats on being that guy
You’re aware that there’s the official OpenAI Python library, right? https://github.com/openai/openai-python
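For reference, a minimal sketch of using it with the current (v1.x-style) client; the model name and prompt are just placeholders:

```python
# Minimal sketch using the openai-python library (v1.x-style client).
# Model name and prompt are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize the following text: ..."}],
)
print(response.choices[0].message.content)
```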
It’s really nothing fancy, especially on Lemmy where like 99% of people are software engineers…
Are you drunk?
Yeah, I found some stats now, and indeed you’re gonna wait like an hour for prompt processing if you throw 80-100k tokens into a powerful model. With APIs that works pretty much instantly; not surprising, but just to give a comparison. Bummer.
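For a rough sense of where that hour comes from, here’s a back-of-the-envelope sketch; the speeds are illustrative assumptions, not measured benchmarks:

```python
# Back-of-the-envelope: prompt-processing time = prompt tokens / processing speed.
# The speeds below are illustrative assumptions, not benchmarks.
prompt_tokens = 90_000  # an 80-100k token prompt

for setup, tokens_per_second in [
    ("slow local setup", 30),
    ("fast local GPU", 1_000),
    ("hosted API", 10_000),
]:
    minutes = prompt_tokens / tokens_per_second / 60
    print(f"{setup}: ~{minutes:.1f} min to ingest the prompt")
```

At ~30 tokens/s of prompt processing you land right around the 50-minute mark, which matches the “like an hour” figure.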
Thanks! Hadn’t thought of YouTube at all, but it’s super helpful. I guess that’ll help me decide if the extra RAM is worth it, considering that inference will be much slower if I don’t go NVIDIA.
Yeah, I was thinking about running something like Code Qwen 72B, which apparently requires 145GB of RAM to run the full model. But if it’s super slow, especially with large context, and I can only run small models at acceptable speed anyway, it may be worth going NVIDIA for CUDA alone.
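That 145GB figure lines up with simple parameter math; a quick sketch (the quantization sizes are the usual rule-of-thumb values, and KV cache / runtime overhead comes on top):

```python
# Rough weight-memory estimate for a 72B-parameter model at different precisions.
# Ignores KV cache and runtime overhead, which add on top of these numbers.
params = 72e9

for precision, bytes_per_param in [("FP16/BF16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    gigabytes = params * bytes_per_param / 1e9
    print(f"{precision}: ~{gigabytes:.0f} GB just for the weights")
```

So the full-precision model is ~144 GB of weights alone, while a 4-bit quant drops to roughly 36 GB, which is what makes the smaller or quantized route attractive on consumer GPUs.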
Seems like that extra $150 million in hasbara money is already being put to good use, judging from your genocide denial and post history.
Meh, ofc I don’t.
Thanks, that’s very helpful! Will look into that type of build
I understand what you’re saying, but I’m coming to this community because I like getting more input, hearing about the experiences of others, and potentially learning about things I didn’t know about. I wouldn’t ask specifically in this community if I didn’t want to optimize my setup as much as I can.
Interesting, is there any kind of model you could run at reasonable speed?
I guess over time it could amortize, but if the usability sucks that may make it not worth it. OTOH I really don’t want to send my data to any company.
I’d honestly be open to that, but wouldn’t an AMD setup take up a lot of space and consume lots of power / be loud?
It seems like in terms of price and speed the Macs suck compared to other options, but if you don’t have a lot of space and don’t want to hear an airplane engine constantly, I’m wondering what the alternatives are.
Yeah, the unified memory of the Mac M series is very attractive for running models at full context length, and the memory bandwidth is quite good for token generation compared to the price, power consumption and heat generation of NVIDIA GPUs.
Since I’ll have to put this in my kitchen/living room that’d be a big plus, but idk how well prompt processing would work if I send over like 80k tokens.
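Token generation is roughly memory-bandwidth bound, so a crude upper bound is bandwidth divided by model size; the bandwidth figures below are ballpark spec-sheet numbers and the model size assumes a ~4-bit 70B-class model, so treat it as a sketch:

```python
# Crude upper bound on generation speed: each new token streams the active
# weights from memory once, so tokens/s <= memory bandwidth / model size.
# Bandwidth values are ballpark spec numbers; model size assumes a 4-bit ~70B model.
model_gb = 40

for hardware, bandwidth_gb_per_s in [
    ("Apple M-series Max (~400 GB/s)", 400),
    ("Apple M-series Ultra (~800 GB/s)", 800),
    ("RTX 4090 (~1000 GB/s)", 1000),  # bandwidth comparison only; 40 GB won't fit in 24 GB VRAM
]:
    print(f"{hardware}: up to ~{bandwidth_gb_per_s / model_gb:.0f} tokens/s generation")
```

Prompt processing is the compute-bound part though, and that’s where the Macs fall behind discrete GPUs, so an 80k-token prompt is exactly the painful case.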
Might be true if you’re white and not Muslim. It’s not mutually exclusive to be nationalistic and socialist at the same time if you think about it.
https://jacobin.com/2021/02/denmark-zero-asylum-immigration-refugees
After Denmark’s Social Democrats returned to office in 2019, backed by the left-wing parties, some hoped for an end to the previous right-wing government’s extreme anti-migrant measures. The outgoing administration had introduced an infamous “jewelry law,” forcing immigrants to give up valuables when applying for asylum, and a “ghetto plan” making it possible to force immigrants out of their homes. Yet, such hopes of change were quickly foiled. The incoming government enthusiastically maintained and even bolstered migration policies that were once the preserve of the far right. And in recent weeks, the ruling Social Democratic Party has sunk to new lows. In an interview at the end of January, integration and immigration minister Mattias Tesfaye announced the aim for Denmark to accept “zero” asylum seekers. The following day, prime minister Mette Frederiksen clarified Tesfaye’s statement by confirming this stance: “We cannot make a promise of having zero asylum seekers, but we definitely can put forward such a vision.”
Denmark is at the forefront of normalizing right-wing ideas in general, and racism and Islamophobia in particular.
They are most certainly not the most left-wing country in the world.
My first thought was that it’s very entertaining to see far-right people clashing. Denmark is heavily infested with right-wing politics as well.
My second thought was that last time the whole world decided to go right wing we got 2 world wars.
Just a correction: the people you call “Israeli Muslims” are Palestinians. Very few Palestinians identify as Israeli. The Christians in the plot are also Palestinians.
Lmao great find. Academic publishing companies are absolute parasites btw, Libgen did the world a favor
Those are a lot of words and I don’t fully understand your point. But essentially, the only way to peace is an end to the genocide, the apartheid, and the Zionist colonization as a whole, with equal rights plus compensation for the native Palestinians and acknowledgment of the right of return of the displaced.
Thanks for the reply, still reading here. Yeah, thanks to the comments and reading some benchmarks, I abandoned the idea of getting an Apple; it’s just too slow.
I was hoping to test Qwen 32B or Llama 70B for running longer contexts, which is why the Apple seemed appealing.
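For the long-context part, the thing that grows with context is the KV cache; a rough sketch, assuming a Llama-3-70B-style layout (80 layers, 8 KV heads, head dim 128, FP16 cache), which is my assumption rather than something from this thread:

```python
# Rough KV-cache size versus context length, assuming a Llama-3-70B-style config:
# 80 layers, 8 KV heads (grouped-query attention), head dim 128, FP16 (2-byte) entries.
layers, kv_heads, head_dim, bytes_per_value = 80, 8, 128, 2

bytes_per_token = 2 * layers * kv_heads * head_dim * bytes_per_value  # K and V
for context_tokens in (8_000, 32_000, 80_000):
    gigabytes = context_tokens * bytes_per_token / 1e9
    print(f"{context_tokens:>6} tokens of context: ~{gigabytes:.1f} GB KV cache")
```

That cache sits on top of the weights themselves, which is why the big unified memory pools look appealing on paper for long contexts even when the raw speed doesn’t.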