• 1 Post
  • 409 Comments
Joined 2 years ago
Cake day: June 10th, 2023

  • Machine learning models have much different needs than crypto. Both run well on gaming GPUs and both run even better on much higher-end GPUs, but ultimately machine learning models really, really need fast memory, because the entire set of weights gets loaded into graphics memory for processing. There are some tools that will spill weights into system memory, but these models are latency sensitive, so crossing the PCIe bus to pass tens of gigabytes of data between the GPU and system memory adds too much latency.
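
    To put rough numbers on that, here's a back-of-the-envelope Python sketch; the model sizes and precisions are illustrative assumptions, not benchmarks:

    ```python
    # VRAM needed just to hold a model's weights: parameter count x bytes per parameter.
    def weight_vram_gib(params_billion: float, bytes_per_param: float) -> float:
        return params_billion * 1e9 * bytes_per_param / 1024**3

    for params, precision, bpp in [(7, "fp16", 2), (70, "fp16", 2), (70, "4-bit", 0.5)]:
        print(f"{params}B @ {precision}: ~{weight_vram_gib(params, bpp):.0f} GiB")
    # 7B  @ fp16:  ~13 GiB  -> fits on a 16 GB gaming card
    # 70B @ fp16: ~130 GiB  -> needs multiple datacenter GPUs
    # 70B @ 4-bit: ~33 GiB  -> still beyond most gaming cards
    ```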

    Machine learning also has the split between training and inference. Training takes a long time, takes less time with more/faster compute, and you simply can't do anything with the model while it's training. Inference is still compute heavy, but it doesn't require anywhere near as much as the training phase. So organizations will typically rent as much hardware as possible for the training phase, to get the model running and start making money as quickly as possible.
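
    For a sense of scale, a common rule of thumb from the scaling-law papers is that training a transformer costs roughly 6 × parameters × training tokens in FLOPs, which is why renting more hardware directly shortens the training phase. A minimal sketch, where every number is a made-up assumption:

    ```python
    # Rule-of-thumb training cost: ~6 * N * D FLOPs for N parameters and D tokens.
    n_params = 7e9        # assumed 7B-parameter model
    n_tokens = 1e12       # assumed 1T training tokens
    flops_needed = 6 * n_params * n_tokens

    effective_flops = 300e12   # assumed ~300 TFLOPS per GPU after utilization losses
    n_gpus = 1024
    days = flops_needed / (effective_flops * n_gpus) / 86400
    print(f"~{days:.1f} days")  # ~1.6 days; double the GPUs and it roughly halves
    ```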

    In terms of GPU availability, this means they're going to target high-end GPUs, such as packing AI developer stations full of 4090s and whatever the heck Nvidia replaced the Tesla series with. Some of the new SoCs with shared system/video memory, such as AMD's and Apple's, also fill a niche for AI developers and AI enthusiasts, since they enable large amounts of high-speed video memory at relatively low cost. Realistically, the biggest impact AI is having on the gaming GPU space is that it's changing the calculus AMD, Nvidia and Intel make when planning out their SKUs: they're likely being stingy on memory specs for lower-end GPUs to push anyone with specific AI models to run toward much more expensive GPUs.



  • Side thoughts in the middle of sentences are definitely weird in written form. Heck, they get messy in spoken form too! Some punctuation to help the reader understand what's being communicated can go a long way. In a forum discussion, where folks will quickly tap out a brain fart from a 5" slab of plastic and glass, when I see what appear to be multiple sentences mashed together into one incoherent one, I'll generally assume it's a writing error. Folks don't proofread; they aren't writing literature with multiple drafts. They're just quickly jotting down a thought or two, and sometimes errors compound with that level of quick communication.


  • Yikes, you're literally financing your hobby! The better financial move is to start with a used system (a used gaming PC can usually be had for around $500, and I'm sure there are plenty of people online you can ask for help speccing something out), squirrel away money for a couple of years (I like to keep a dedicated savings account just for big purchases like tech upgrades; $40 biweekly disappearing into another account you don't touch is $2k every 2 years, so a 4-year complete refresh cycle for 2 people), and buy when you feel like it. The good news is it's a small enough amount of cash to easily right the financial ship, but still, yikes!
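
    (For anyone checking the math, a one-line sanity check, assuming 26 biweekly pay periods a year:)

    ```python
    # $40 set aside every two weeks, 26 pay periods per year
    print(40 * 26 * 2)  # 2080 -> roughly $2k every 2 years, one ~$2k build per person on a 4-year cycle
    ```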




  • Also, the scariest part of this datacenter inflation is how many of these new datacenters are going to be abandoned within the next 5 years, when the AI bubble pops and the companies spending like crazy on datacenter growth suddenly need to cut back. There'll be lots of big empty buildings outside of small towns costing taxpayers a ton of money, much like when any big box store closes up shop. You can either spend a ton of money tearing it down, a ton of money rebuilding it into something useful, a ton of money attracting another business which may or may not front the cost of remodeling the space, or a ton of money maintaining the empty property so it doesn't fall into disrepair and become even more of a blight. There's no winning for these small municipalities that just get used and abused by large businesses.


  • Zram on Linux is awesome! I've used it heavily both in memory-constrained systems and in systems with 16+ GB of memory running very poorly optimized code.

    For example, running Cities: Skylines with 40 GB of mods can easily push memory usage to 20-30 GB uncompressed. With zram I can load that same mod loadout on a 16 GB laptop with no swap and it won't crash, where before it would crash from running out of memory.

    Another example is Proxmox with over-provisioned LXC containers. Since it's still the kernel scheduler running all of the processes in those containers, zram can keep them all running nicely even when a heavily modded Minecraft server gets a few players online and starts pushing past memory limits. Before I set it up, I'd have some of the Minecraft server processes get killed to free up memory, without warning and without proper logging from Minecraft.

    Edit to add: my daughter's first laptop has only 4 GB of memory and runs a decade-old Celeron booting from a spinning hard drive, the definition of budget e-waste. Zram makes it CPU limited running Minecraft rather than memory limited!
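
    If you want to see how hard zram is working, a minimal sketch like this can read the stats straight out of sysfs. It assumes a zram0 device is active and relies on the mm_stat column order from the kernel's zram documentation:

    ```python
    # Print zram's effective compression ratio from /sys/block/zram0/mm_stat.
    # First three columns (per kernel docs): orig_data_size, compr_data_size, mem_used_total.
    def zram_ratio(dev: str = "zram0") -> None:
        with open(f"/sys/block/{dev}/mm_stat") as f:
            orig, compr, used = (int(x) for x in f.read().split()[:3])
        print(f"stored uncompressed: {orig / 2**20:.0f} MiB")
        print(f"RAM actually used:   {used / 2**20:.0f} MiB")
        if compr:
            print(f"compression ratio:   {orig / compr:.2f}:1")

    zram_ratio()
    ```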



  • The good thing about new AM4 boards still being available at this point is that you have options to keep older hardware running. Usually the CPU and memory will outlive the motherboard. Much like those new Chinese motherboards supporting 4th and 6th gen Intel CPUs, this is great for longevity and reduces how much production is needed.

    In a sane world, the limitations of a CPU socket would be reached, and then newer SKUs would no longer be released

    I'd argue it would be best if computers were more like cars: a new platform gets released each decade or so, small improvements are made to individual parts, but the parts remain largely interchangeable within the platform and are produced for a decade or two before being retired. More interchangeable parts, a slower release cycle, and more opportunities for repair instead of replacement.


  • Gaming GPUs during normal crypto markets don’t compute fast enough to mine crypto profitably, but if crypto prices get high enough such as during a boom cycle, it can become profitable to mine on gaming GPUs

    Edit to add: for crypto there's basically a set speed that any given GPU mines at, the hash rate. It doesn't change noticeably over time through software updates, nor does the power consumption of the GPU. It's basically a set cost per unit of cryptocurrency mined with any given hardware, so if the value earned by mining exceeds the cost to run the GPU, GPU mining can quickly start making sense again.
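
    As a sketch of that break-even math (every number here is a made-up placeholder, not a real benchmark):

    ```python
    # GPU mining break-even: hash rate and power draw are fixed by the hardware,
    # so profitability swings almost entirely with coin price and electricity cost.
    hashrate_mh = 60            # assumed MH/s for the card
    power_kw = 0.2              # assumed steady draw in kW
    coins_per_mh_day = 0.00001  # assumed coins earned per MH/s per day (set by the network)
    coin_price = 2000.0         # assumed USD per coin
    electricity = 0.15          # assumed USD per kWh

    revenue = hashrate_mh * coins_per_mh_day * coin_price  # USD/day
    cost = power_kw * 24 * electricity                     # USD/day
    print(f"${revenue:.2f}/day mined vs ${cost:.2f}/day in power")
    # $1.20/day vs $0.72/day -> profitable only while the coin price holds up
    ```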


    There was a nice window, from about a year or two ago until about 3 months ago, where no individual components were noticeably inflated. Monitors took the longest to recover from the pandemic shortages, so it was arguably around the beginning of this year that prices seemed to fully normalize.

    It's funny, because at work we've been pushing hard on Windows 11 refreshes all year and warning that there would likely be a rush of folks refreshing at the last possible minute at the end of the year, inflating prices. We ended up being correct about the inflated prices, but it was actually the AI bubble that did it.






  • Yes they do! For example, I have a single rule set up which moves all of the various emails from ecommerce sites into a dedicated folder so they don't clutter my inbox.

    On Outlook, each of the line items within my "online shopping" rule would have to be an individual rule, making my Outlook rules far more cluttered and difficult to maintain. Thunderbird also lets you do partial matches, so places like LinkedIn and Indeed, who send emails from lots of different addresses, can be covered by a single "from" line, whereas Outlook would require a dedicated rule for each address, and you'd have to keep creating rules as the sites keep spinning up new senders.

    Thunderbird also has a surprisingly good junk mail filter built in. It requires some training, marking junk emails as junk and unmarking legitimate ones, but once it's trained it works really well.

    Oh yeah, it also does this awesome conversation threading automatically now, and honestly the overall message list view is super good, with lots of useful info at a glance and far less digging through conversation history to find a specific email. Honestly, I'd hazard to say Thunderbird has a better interface than Outlook now.
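
    For the curious, Thunderbird keeps these filters as plain text in the profile's msgFilterRules.dat, which is why one rule can OR together many partial matches. Roughly like this; the folder URI and sender fragments are placeholders, and the exact field syntax may vary by version:

    ```
    name="Online shopping"
    enabled="yes"
    type="17"
    action="Move to folder"
    actionValue="imap://user@example.com/Shopping"
    condition="OR (from,contains,amazon) OR (from,contains,ebay) OR (from,contains,linkedin)"
    ```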