You can write code just fine on 20- or even 30-year-old hardware. Basically, if it runs Linux, chances are it can also run vim and compile code. If you spring for 10-year-old hardware, you can even get an LSP plus coc or Helix, for error highlighting, goto definition, and code actions. And you definitely don’t need a GPU for it (unless you’re doing something GPU-specific, of course).
Editing 720p videos (which, if you encode with a high enough bitrate, still look alright) can be done on 10- to 15-year-old hardware.
Research is where it gets complicated. It does indeed often require a lot of computing power to do modern computational research. But for some simpler stuff - especially outside STEM - you can sometimes get away with a LibreOffice spreadsheet on an old Dell or something.
From the looks of it, we will have to get used to doing more with less when it comes to computers. And TBH I’m all for it. I just hope that either my job won’t require compiling a lot more stuff, or that they’ll provide me with a modern machine at their expense.
SoleInvictus@lemmy.blahaj.zone 22 hours ago
But wait! They can pay for remote computing time for a fraction of the cost! Each month. Forever.
I fully expect personal computers to be phased out in favor of a remote-access, subscription model. The AI bubble popping would leave these big data centers with massive computational power available for use, plus it’s the easiest way to track literally everything you do on your system.
wonderingwanderer@sopuli.xyz 10 hours ago
Hopefully the AI bubble popping means they have to close data centers and liquidate hardware. Dirt-cheap aftermarket servers would be good for the fediverse.
ExLisper@lemmy.curiana.net 16 hours ago
And ban undesired activities. “We see you’re building an app to track ICE agents. That’s illegal. Your account has been banned and all your data removed.”
SoleInvictus@lemmy.blahaj.zone 16 hours ago
“Remain in your cube - The Freedom Force is en route to administer freedom reeducation. Please be sure to provide proof of medical insurance.”
obbeel@lemmy.eco.br 19 hours ago
Remote computing is very expensive. It’s just the gated LLMs (the ones owned by companies) that are cheap for the end consumer. Training even a 2B LLM on remote compute will cost thousands of dollars if you try it.
wonderingwanderer@sopuli.xyz 10 hours ago
2B is nothing, even 7B is tiny. Commercial API-based LLMs are like 130-200 billion parameters.
I mean yeah, training a 7B LLM from scratch on consumer-grade hardware could take weeks or months, and run up an enormous electric bill. With a decent GPU and enough VRAM you could probably shorten that to days or weeks, and you might want to power it on solar panels.
But I haven’t calculated what it would take to do on rented compute.
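For a very rough ballpark, though, here’s a quick back-of-envelope sketch in Python using the common ~6·N·D FLOPs rule of thumb; the GPU throughput, utilisation, and price-per-GPU-hour numbers are assumptions picked for illustration, not real provider pricing:

```python
# Rough back-of-envelope for training a 7B-parameter LLM from scratch on rented GPUs.
# All numbers here are illustrative assumptions, not quotes from any provider.

params = 7e9                          # model size: 7B parameters
tokens = 20 * params                  # Chinchilla-style heuristic: ~20 training tokens per parameter
train_flops = 6 * params * tokens     # standard ~6*N*D estimate of total training FLOPs

gpu_flops = 300e12                    # assumed sustained bf16 throughput of one datacenter GPU
mfu = 0.4                             # assumed model FLOPs utilisation (40%)
price_per_gpu_hour = 2.0              # assumed rental price in USD per GPU-hour

gpu_hours = train_flops / (gpu_flops * mfu) / 3600
print(f"GPU-hours:  {gpu_hours:,.0f}")                        # ~14,000 GPU-hours
print(f"Rough cost: ${gpu_hours * price_per_gpu_hour:,.0f}")  # ~$27,000
```

With those assumptions you land around 14,000 GPU-hours, call it $25-30k for a 7B from scratch, and scaling the same math down to 2B gives a couple thousand dollars, which lines up with the “thousands of dollars” figure above.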
UnderpantsWeevil@lemmy.world 12 hours ago
I wouldn’t hold my breath.
Jrockwar@feddit.uk 19 hours ago
This is true, but at current computer prices it’s nowhere near as bad as it sounds. I spend £100/year or thereabouts for GeForce Now, and if you have a life and can’t play more than 25 hours a week, the value proposition right now is great - there’s no viable alternative that lets you keep playing AAA games for the equivalent of £100/year.
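For what it’s worth, here’s the quick per-hour math (using the £100/year and 25 hours/week figures above, which are my rough numbers, not official pricing):

```python
# Cost per hour of play for the GeForce Now example above.
# Subscription price and play time are the rough figures from this comment, not official pricing.

subscription_per_year = 100.0            # GBP per year, approximately
hours_per_week = 25                      # upper bound on play time
hours_per_year = hours_per_week * 52     # ~1,300 hours of potential play per year

cost_per_hour = subscription_per_year / hours_per_year
print(f"~£{cost_per_hour:.2f} per hour")  # roughly £0.08/hour at that usage
```

Even at a few hours a week instead of 25, you’re still well under £1 per hour before counting the games themselves.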
SoleInvictus@lemmy.blahaj.zone 17 hours ago
Fuck, you almost sold me on GeForce Now. Owning is still a better value proposition for me because I get my games at… steep discounts.