Voroxpete
@Voroxpete@sh.itjust.works
- Comment on Silicon Valley is buzzing about this new idea: AI compute as compensation 20 hours ago:
Gee guys… Did you maybe build a whole bunch of compute capacity for a product no one actually wants, and now you have to find a way to use it for something?
- Comment on Lutris now being built with Claude AI, developer decides to hide it after backlash 21 hours ago:
As I’ve said elsewhere here, I really don’t have a problem with people holding a moral stance against the use of genAI. It’s fine to just say “However useful this might be, I don’t want to see it used because I think it has too many ethical costs/consequences.” But blanket accusing all work that involved genAI in any capacity of being “slop” isn’t holding a moral stance, it’s demanding that reality conform to your beliefs; “I hate this, therefore it must be terrible in every respect.”
If you truly hold a well-founded ethical stance against the use of genAI, that stance shouldn’t be threatened by people doing good and effective work with genAI, because its effectiveness should have nothing to do with your objections.
- Comment on Lutris now being built with Claude AI, developer decides to hide it after backlash 1 day ago:
Frankly, AI-generated code is often easier to review, thanks to a combination of standardized practices (LLMs regress to the mean by design) and a somewhat overly enthusiastic approach to commenting and segmented layouts.
- Comment on Lutris now being built with Claude AI, developer decides to hide it after backlash 1 day ago:
The thing is, you’re conflating ethical and practical concerns here. The commenter you’re responding to is clearly talking about the practical aspects of using AI tools.
If you have a fundamental moral issue with AI that is entirely independent of how efficacious it is, that’s fine. That’s a completely reasonable position to hold. But don’t fall into the trap of wanting every use of genAI to be impractical because it aligns with your morality to feel that way.
If this is an ethical stance that you truly hold, you should be willing to believe that using these tools is bad even when they’re effective. But a lot of people instead have to insist that every use of AI is impractical, in the face of any evidence to the contrary, because they’ve talked themselves into believing that on some fundamental level. Like “If AI is useful, that means I’m wrong about it being immoral.”
- Comment on Lutris now being built with Claude AI, developer decides to hide it after backlash 1 day ago:
But that kind of proves their point, right?
Yes, a lot of projects have had issues with contributors who push unreviewed AI slop that they don’t understand, ultimately creating more work for the project. Or with avalanches of AI code review bug reports that do nothing to help. But that’s not what’s happening here.
In this case, the main developer of the project is choosing to use AI, on their own terms, because they find it helpful, and people are giving them shit for it. It’s their project and they feel this technology is beneficial. Isn’t that their call to make? Why are people treating the former and the latter as completely interchangeable scenarios when they’re clearly not? It kind of does suggest that people are coming at this from a more ideological rather than rational perspective.
- Comment on Lutris now being built with Claude AI, developer decides to hide it after backlash 1 day ago:
Nothing is being hidden from review. The code is open source. They removed the specific attribution that indicates which parts of the code were created using Claude. That changes absolutely nothing about the ability to review the code, because a code review should not distinguish between human written code and machine written code; all of it should be checked thoroughly. In fact, I would argue that specifically designating code as machine written is detrimental to code review, because there will be a subconscious bias among many reviewers to only focus on reviewing the machine code.
- Comment on Hisense TVs force owners to watch intrusive ads when switching inputs, visiting the home screen, or even changing channels — practice infuriates consumers, brand denies wrongdoing 1 day ago:
Never buy Hisense, got it.
- Comment on Is *arr stack a real Netflix replacement? 2 days ago:
So, yes, you’re basically correct.
There are search layers that remove the need to access radarr / sonarr directly when searching for shows (someone mentioned jellyseer, for example), so that part of the process can be streamlined, and once you’re watching a show it’s generally very good at pulling new episodes as soon as they’re available, so you’re typically, at most, a day behind actual airing dates. But if you’re trying to just bounce around and try a bunch of different shows it wouldn’t be the best for that. The biggest constraint is generally the speed of your internet and the popularity of what you’re watching. With a high speed connection and a well seeded torrent it’s often only a couple of minutes to download a pilot episode, and you could have the whole season done by the time you finish watching that.
The other question is one of storage. If you’ve got plenty of hard disk space then you can probably afford to just throw anything that sounds interesting on your pull queue and work your way through it when you actually have time to sit down and watch. Basically you sort of pre-emptively build your “Netflix at home” library and then do your bouncing around channel hopping stuff with the five or so vaguely interesting shows that you added while you were at work.
Is it a replacement for Netflix et al? Not strictly speaking, but if you don’t mind changing up your habits a little it’s probably close enough.
- Comment on YouTube ads are about to get even longer and they’ll be unskippable 3 days ago:
If you run Fcast receiver on your android TV device, you can cast to it from Grayjay without ads.
- Comment on YouTube ads are about to get even longer and they’ll be unskippable - Dexerto 3 days ago:
Oh, excellent, I’ll be checking that out right away. I have an iPad that I’m stuck with from work and it’d be great to get ad free YouTube there.
- Comment on YouTube ads are about to get even longer and they’ll be unskippable - Dexerto 3 days ago:
Are you running uBlock Origin on Chrome or Firefox? It works significantly better on Firefox. I’ve never seen an ad get through it.
- Comment on YouTube ads are about to get even longer and they’ll be unskippable - Dexerto 3 days ago:
Someone else in this thread mentioned TizenTube, that sounds like what you’re looking for.
But personally I just grabbed an Nvidia Shield. It works great and if you swap out the default launcher you’ll never see a single ad on it (with the right apps). Plus the pro is beefy enough to run some decent emulators too.
- Comment on YouTube ads are about to get even longer and they’ll be unskippable - Dexerto 3 days ago:
That genuinely would not surprise me in the slightest.
- Comment on YouTube ads are about to get even longer and they’ll be unskippable - Dexerto 3 days ago:
They keep trying. The adblockers keep winning. I’ve had my fair share of videos sometimes not loading, or regularly needing to update apps to keep up with Google’s latest bullshit, but the minor glitches and headaches are worth it for all the time I don’t spend staring at a greyed out skip button.
- Comment on Yann LeCun Raises $1 Billion to Build AI That Understands the Physical World 3 days ago:
I mean, if you could actually solve that problem, it would be worth a LOT more than 1 billion. I just don’t remotely believe that you can solve it for 1 billion. Training costs alone would eat that and more.
- Comment on YouTube ads are about to get even longer and they’ll be unskippable - Dexerto 3 days ago:
Firefox with uBlock Origin for desktop and mobile. Grayjay for mobile (integrates Nebula too so you can get both your feeds at once). Android TV with FCast and Smart Tube Next for your TV. Never see ads again.
- Comment on YouTube ads are about to get even longer and they’ll be unskippable - Dexerto 3 days ago:
Yeah, Nebula is awesome.
- Comment on Do you stick to the same linux distro across your devices? 4 days ago:
I do, but it’s more out of laziness than anything else. I hate having to remember sixteen different ways of doing things, so I tend to configure all my stuff as identically as reasonably possible. Is this the best way of doing things? Probably not. But it keeps my blood pressure down.
- Comment on NVIDIA could enter the desktop CPU market with performance equal to AMD and Intel 1 week ago:
Yeah, we’ve been through this exact same game with multiple iterations of Intel and AMD chips. When AMD first started doing consumer CPUs they badged them according to their equivalent Intel clock speed because one to one comparisons were misleading.
What’s the L1 and L2 cache? What are the bus speeds? How many cores, and how are they architected? Multi-threading? How many steps is the instruction cycle? There are so many factors beyond just clock speed that play into real world performance.
- Comment on NVIDIA could enter the desktop CPU market with performance equal to AMD and Intel 1 week ago:
I think that’s 100% what this is, and it’s a very smart play if that’s the case. Intel are reeling from some significant setbacks, while Nvidia is swimming in cash. There’s never been a better time for them to make a play for the desktop CPU space.
And they’ve got absolutely no illusions about what’s happening with AI. They’re the ones who are literally paying AI companies to buy their chips. They know the space is collapsing. But as the guys selling the picks and shovels, they can ride out that collapse if they’re smart.
End of the day, if what we get out of this is a new, serious competitor in the CPU space, that’ll at least be some kind of win. With Nvidia’s money and expertise they could really force Intel to get their shit together. AMD chasing their heels is the only thing that’s ever kept them from completely going to shit, but more competition is even better. With all three major companies playing in both the CPU and GPU spaces, that could be really good for consumers.
- Comment on Microsoft previews tech to ease creation of keyboard-accessible websites 1 week ago:
It annoys me that I can guarantee they’re doing this to make it easier for agentic AI to interact with the web, but I guess if we get some important accessibility benefits along the way that’s not a bad thing.
- Comment on US | Hegseth Claims Iran War Isn’t Endless — But He Refuses to Provide a Timeline for Its End 1 week ago:
It’s not endless. They just don’t know when it’s going to end. But don’t worry, Trump has assured everyone that they have enough munitions stockpiled to fight forever. Not that they’re going to. Because it’s not an endless war. It’s just… open-ended.
- Comment on Datacenters in space are a terrible, horrible, no good idea. 1 week ago:
Yes, the bottom of the ocean is a terrible place to put a data centre. And the fact that it is, somehow, still a more practical option than space is a really good indicator of how unbelievably stupid the entire notion of space data centres is.
- Comment on U.S. Supreme Court declines to hear dispute over copyrights for AI-generated material 1 week ago:
Seems reasonable. This case is substantially similar to earlier ones - in particular the finding that a selfie taken by a monkey wasn’t copyrightable - and the lower court decisions are in line with existing precedent. So the court is effectively just saying “Our opinion hasn’t changed.”
- Comment on Datacenters in space are a terrible, horrible, no good idea. 1 week ago:
Yep. Radiation is deadly to computers, and without the atmosphere to protect you there is a LOT of radiation in space.
- Comment on Datacenters in space are a terrible, horrible, no good idea. 1 week ago:
Basically the way you would make a stealth spaceship would be by focusing as much as possible on energy efficiency. At every juncture you would try to use as little power as possible, and use every bit of it as efficiently as possible, so that you’re not emitting waste. That waste, in the form of heat, radio waves, etc, is what gets you spotted.
(For the Elite: Dangerous players, yes, that game got it right.)
- Comment on Datacenters in space are a terrible, horrible, no good idea. 1 week ago:
The entire ISS has about 14 kW of cooling (and a lot of that just goes towards keeping the sun from cooking it). A single server rack can produce around 72 kW of heat.
The ISS cost about $100 billion.
Basically, if you took the entire budget of Sam Altman’s “Stargate” project (money that, to be clear, he does not have and will not get) and put it into space data centres you might, optimistically, put one rack in space.
Most data centres have dozens to hundreds of racks.
- Comment on Datacenters in space are a terrible, horrible, no good idea. 1 week ago:
You need to think about how an infrared laser works. You’re taking electricity, converting it into light and then focusing the light.
So you’d need to take the heat from your GPUs, inefficiently convert it into electricity (a lot of it would remain as heat), then inefficiently convert electricity into light (much of the electricity would turn back into heat in this process) and then focus the light away from the space data centre.
Now, we already have a process for moving heat away from things as infrared light, without going through all those steps (which would just reduce the efficiency of the process). It’s called a radiator, and it’s how we cool things in space. That’s literally where the name comes from; they radiate heat away as infrared light. That’s why hot things glow in thermal cameras.
It is incredibly inefficient. Radiation (ie, infrared light) is, by far, the worst way of cooling things. But in space it’s the only option you have, because there’s no convection or conduction across a vacuum.
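To put a rough number on how bad radiative cooling is, here’s a back-of-envelope sketch using the Stefan-Boltzmann law. The coolant temperature and emissivity are my own illustrative assumptions, not figures from the comment, and this ignores sunlight and the radiator’s view of other warm surfaces:

```python
# Ideal radiator sizing from the Stefan-Boltzmann law: P/A = e * sigma * T^4.
# Assumed numbers (not from the comment): 330 K coolant, emissivity 0.9.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def radiator_area_m2(heat_watts, temp_kelvin, emissivity=0.9):
    """Ideal one-sided radiator area needed to reject heat_watts at
    temp_kelvin, ignoring solar load and incoming radiation."""
    return heat_watts / (emissivity * SIGMA * temp_kelvin ** 4)

# Rejecting one rack's worth of heat (~72 kW) at 330 K:
area = radiator_area_m2(72_000, 330)  # roughly 120 square metres
```

Even under these generous assumptions, a single rack needs a radiator on the order of a tennis court, which is why the cooling problem dominates every space data centre proposal.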
A top end GPU puts out about 1,000 watts of waste heat. The entire International Space Station has enough cooling for 14 of those, if it were doing nothing else whatsoever. An average server rack contains 72. The ISS cost $100 billion. So at a minimum you’re looking at around $500 billion to put one single server rack in space. And that’s before accounting for the heat from the sun, which we can’t avoid because we need solar power to run this thing. So probably closer to a trillion. In other words, twice the already ludicrous price tag of Sam Altman’s “Stargate” project. For a single server rack.
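The arithmetic above can be checked in a few lines. All of these figures are the comment’s own rough estimates (1 kW per GPU, 72 GPUs per rack, ~14 kW of ISS cooling, $100 billion ISS cost), not measurements:

```python
# Back-of-envelope check of the comment's cooling-cost estimate.
gpu_heat_w = 1_000        # waste heat per top-end GPU (rough estimate)
gpus_per_rack = 72        # GPUs in an average server rack
iss_cooling_w = 14_000    # ISS heat-rejection capacity, ~14 kW
iss_cost_usd = 100e9      # rough ISS programme cost

rack_heat_w = gpu_heat_w * gpus_per_rack       # 72 kW per rack
iss_equivalents = rack_heat_w / iss_cooling_w  # ~5.1 ISS-loads of cooling
naive_cost = iss_equivalents * iss_cost_usd    # ~$514 billion
```

Which lands right around the $500 billion figure, before solar heating pushes it higher still.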
- Comment on Datacenters in space are a terrible, horrible, no good idea. 1 week ago:
For anyone who doesn’t know, this is because space is an absolutely terrible place to put computers. Getting power is actually the easiest problem to solve, and is still really hard, because building any kind of infrastructure in space is hard. Then you’ve got all that radiation you have to shield against because you’re no longer protected by the Earth’s atmosphere, and worst of all you’ve got the cooling problem because Jesus fucking Christ, space is not cold!
This is why I get annoyed every time a scifi movie shows people freezing to death in space. It feeds exactly this kind of mass delusion, and then suddenly it matters, and everyone just unquestioningly believes the lie that space is cold. Space is a vacuum. A vacuum is what your Contigo travel mug uses to keep your coffee scalding hot after four hours. If vacuums are that good at keeping something hot when it naturally wants to get colder, think about what they’ll do to something that is actively generating heat. All of your components are going to cook.
There are proposals to put data centres at the bottom of the ocean that are substantially more credible than this idiocy.
- Comment on Anthropic says it ‘cannot in good conscience’ allow Pentagon to remove AI checks 2 weeks ago:
Don’t buy the hype. They’re not acting in good conscience, they’ve just weighed the pros and cons and decided that the PR hit isn’t worth it.