If software worked well in 2005 on PCs with 2 GB of RAM and CPUs/GPUs vastly weaker than modern ones, then why not write modern software the way that software was written? Why not leverage powerful hardware when needed, but keep resource demands low the rest of the time?

What are the reasons it might not work? What problems are there with this idea/approach? What architectural (and other) downgrades would it entail?

Note: I was not around at that time.

  • Ephera@lemmy.ml
    3 hours ago

    I’ve seen it argued that the best way to create lightweight software is to give devs old hardware to develop on.

    Which, yeah, I can see that. The problem is that as a dev, you might have some generic best practices in your head while coding, but beyond that, you don’t really concern yourself with performance until it becomes an issue. And on new hardware, you won’t notice the slowness until it’s already pretty bad for folks on older hardware.

    But then, as others have said, there’s little incentive to actually give devs old hardware. In particular, it costs a lot of money to have your devs sitting around waiting for compilation on older hardware…