skibidi
@skibidi@lemmy.world
- Comment on The grand prize 3 weeks ago:
Dropping anything while in orbit just means it is still in orbit.
You’d need a lot of fuel to deorbit that cube on a steep trajectory.
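For a rough sense of the scale involved, here's a back-of-the-envelope sketch assuming a ~400 km circular orbit (the altitude and target perigee are just illustrative assumptions):

```python
import math

MU = 3.986e14        # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6    # mean Earth radius, m

def circular_speed(alt_m: float) -> float:
    """Orbital speed for a circular orbit at a given altitude."""
    return math.sqrt(MU / (R_EARTH + alt_m))

def shallow_deorbit_dv(alt_m: float, perigee_alt_m: float = 50e3) -> float:
    """Delta-v to lower perigee into the atmosphere for a shallow re-entry (vis-viva)."""
    r1 = R_EARTH + alt_m
    r2 = R_EARTH + perigee_alt_m
    a = (r1 + r2) / 2                               # transfer ellipse semi-major axis
    v_transfer = math.sqrt(MU * (2 / r1 - 1 / a))   # speed at the top of the transfer
    return circular_speed(alt_m) - v_transfer

alt = 400e3
print(f"orbital speed:        {circular_speed(alt):6.0f} m/s")      # ~7700 m/s
print(f"shallow deorbit burn: {shallow_deorbit_dv(alt):6.0f} m/s")  # ~100 m/s
# A genuinely steep drop means cancelling most of that ~7700 m/s of
# horizontal speed, not just the ~100 m/s shallow burn - hence the fuel bill.
```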
- Comment on California’s new law forces digital stores to admit you’re just licensing content, not buying it 1 month ago:
Well, sort of. HDCP exists, and does make it harder to capture an AV stream.
For interactive content, the current push toward online components hosted on external servers adds a lot of complexity. While a lot of that stuff can be patched around by a very dedicated community, not every piece of content gets enough community interest to attract the wizards who can do such a thing.
And while anyone can digivolve into a wizard given enough commitment and effort, the onramp is not easy these days. Wayyy back, when cracking a game meant opening the file and finding the line that amounted to `if cd_key == 'whru686'`, it was much easier to get casually involved. Nowadays, DRM has gotten so much more sophisticated that a tech background is essentially required to start.
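To make that concrete, here's a toy sketch of the kind of check those old cracks targeted (the key, function names, and patch are all invented for illustration, not taken from any real game):

```python
# Toy illustration of an old-school CD-key gate; the key and names are made up.
def check_key(cd_key: str) -> bool:
    return cd_key == 'whru686'

def launch_game(cd_key: str) -> None:
    if not check_key(cd_key):
        raise SystemExit("Invalid CD key")
    print("game running")

launch_game('whru686')  # only the right key gets through

# The classic crack was just overwriting that one check so anything passes:
def check_key(cd_key: str) -> bool:
    return True

launch_game('not-a-real-key')  # now this runs too
```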
- Comment on 'Global Oligarchy' Reigns as Top 1% Controls More Wealth Than Bottom 95% of Humanity 1 month ago:
Yes, you. And me. And probably most of the people reading this, who live in the US or another Western country.
Not quite. 1% of the global population is ~80 million people. There are about a billion people in the highly developed nations (US, Canada, Western Europe, Japan, South Korea, and some minor others). So that's the top 8% of the golden billion, or, if we assume they're all in the US, the top ~25% of the country.
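The arithmetic, with rounded population figures (the exact numbers are rough assumptions):

```python
world_pop      = 8.0e9     # rough world population
golden_billion = 1.0e9     # rough population of the highly developed nations
us_pop         = 335e6     # rough US population

top_1pct = 0.01 * world_pop
print(f"top 1% of the world: {top_1pct / 1e6:.0f} million people")             # ~80 million
print(f"as a share of the golden billion: {top_1pct / golden_billion:.0%}")    # ~8%
print(f"if they all lived in the US: {top_1pct / us_pop:.0%} of the country")  # ~24%
```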
- Comment on Amazon cloud boss echoes NVIDIA CEO on coding being dead in the water: "If you go forward 24 months from now, it's possible that most developers are not coding" 2 months ago:
An inherent flaw in the transformer architecture (what all LLMs use under the hood) is the quadratic memory cost with respect to context length. The model needs four times as much memory to remember its last 1,000 output tokens as it needed to remember the last 500. When coding anything complex, the amount of code one has to consider quickly grows beyond these limits. At least, if you want it to work.
This is a fundamental flaw with transformer-based LLMs, an inherent limit on the complexity of the tasks they can ‘understand’. It isn’t feasible to just keep throwing memory at the problem; a fundamental change in the underlying model structure is required. This is a subject of intense research, but nothing has emerged yet.
Transformers themselves were old hat and well studied long before these models broke into the mainstream with DALL-E and ChatGPT.
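A toy sketch of where that quadratic cost comes from: naive attention builds a seq_len × seq_len score matrix per head, so doubling the context quadruples that term (the head count and precision below are assumptions for illustration):

```python
def attention_matrix_bytes(seq_len: int, n_heads: int = 32, dtype_bytes: int = 2) -> int:
    """Memory for one layer's seq_len x seq_len attention score matrix (naive attention)."""
    return n_heads * seq_len * seq_len * dtype_bytes

for n_tokens in (500, 1_000, 2_000, 100_000):
    mb = attention_matrix_bytes(n_tokens) / 1e6
    print(f"{n_tokens:>7} tokens -> {mb:>11,.0f} MB per layer")
# 500 -> 1000 tokens quadruples the cost (16 MB -> 64 MB per layer here);
# at 100k tokens the naive score matrix alone is ~640 GB per layer.
```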
- Comment on Telegram CEO Pavel Durov Arrested in France 2 months ago:
There is always a tension between security, privacy, and convenience. With how the Internet works, there isn’t really a way - with current technology - of reliably catching content like that without violating everyone’s privacy.
Of course, there is also a lack of trust here (and there should be, given the leaks about mass surveillance) that the ‘stop child porn’ powers would only be used for that and not for whatever the powers that be wish to do with them.
- Comment on Twitter loses World Bank ads over pro-Nazi content placement 2 months ago:
The World Bank isn’t involved so much in printing money - that’s the job of central banks like the US Federal Reserve or the European Central Bank.
They do love to force developing nations to adopt US-style capitalism by withholding loans for needed development projects. They also focus far too much on increasing GDP at all costs and give hardly any weight to raising living standards or reducing inequality. Basically, think loans to institute Reaganomics and you won’t be too far off.
The loans pay for large capital projects (power plants, large-scale irrigation, etc.) that are built by the state and then mandated to be handed over to private entities, which then charge rents and extract wealth. Not every loan and program is bad, but there’s plenty to give one pause when they are involved in a project.