ArchRecord
@ArchRecord@lemm.ee
- Comment on Meta shareholders overwhelmingly rejected a proposal to explore adding Bitcoin to the company's treasury, with less than 1% voting in favor of the measure 3 weeks ago:
uhhhh the nation in question is the US. not a bad idea to consider other things to put money into
I don’t disagree that the US has been quite destabilized as a financial player on the world stage, but the US still has an insane amount of influence over global trade, and holds a ton of power within its own economy.
To argue that Bitcoin is more strongly backed than an entire long-standing, deeply globally integrated nation is silly, especially considering how relatively few manufacturers of Bitcoin ASIC miners there are; if compelled, that handful of companies could heavily influence the distribution of hashrate over time.
holding BTC long term isn’t that risky
And the original post was comparing short term treasuries to Bitcoin, not long term ones.
And even then, Bitcoin's long-term outlook is bleak: the share of block rewards paid from fees hasn't substantially increased to make up for the halvings. If that trend continues, the total reward per block will crater over time, heavily slashing the overall hashrate protecting the network.
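To put rough numbers on that trend, here's a back-of-the-envelope sketch. The subsidy halves roughly every 210,000 blocks; the flat per-block fee figure below is a made-up placeholder for illustration, not real chain data.

```python
# Back-of-the-envelope sketch of the halving schedule.
INITIAL_SUBSIDY_BTC = 50.0     # subsidy in the first epoch (2009)
ASSUMED_FEES_PER_BLOCK = 0.5   # hypothetical flat fee income per block (placeholder)

for epoch in range(12):        # each epoch is ~210,000 blocks (~4 years)
    subsidy = INITIAL_SUBSIDY_BTC / (2 ** epoch)
    total_reward = subsidy + ASSUMED_FEES_PER_BLOCK
    fee_share = 100 * ASSUMED_FEES_PER_BLOCK / total_reward
    print(f"epoch {epoch:2d} (~{2009 + 4 * epoch}): "
          f"reward {total_reward:8.4f} BTC, fees = {fee_share:5.1f}% of it")
```

If the fee line stays flat while the subsidy keeps halving, the total reward per block collapses toward that fee floor, and with it the revenue that funds hashrate.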
- Comment on Catbox.moe got screwed 😿 3 weeks ago:
I’ll gladly give you a reason. I’m actually happy to articulate my stance on this, considering how much I tend to care about digital rights.
Services that host files should not be held responsible for what users upload, unless:
- The service explicitly caters to illegal content by definition or practice (e.g. if the website is literally titled uploadyourcsamhere[.]com, it's safe to assume they deliberately want to host illegal content)
- The service has an easy mechanism to remove illegal content, either on request or through simple monitoring systems, but chooses not to use it (Catbox does remove such content, and quite quickly too)
Because holding services responsible creates a whole host of negative effects. Here are some examples:
- Someone starts a CDN and some users upload CSAM. The creator of the CDN goes to jail now. Nobody ever wants to create a CDN because of the legal risk, and thus the only providers of CDNs become shady, expensive, anonymously-run services with no compliance mechanisms.
- You run a site that hosts images, and someone decides they want to harm you. They upload CSAM, then report the site to law enforcement. You go to jail. Anybody in the future who wants to run an image sharing site must now self-censor to try and not upset any human being that could be willing to harm them via their site.
- A social media site is hosting the posts and content of users. In order to be compliant and not go to jail, they must engage in extremely strict filtering, because even one mistake could land them in jail. All users of the site are prohibited from posting any NSFW or even suggestive content (including newsworthy media, such as an image of bodies in a warzone), and any violation leads to an instant ban, because any of those things could raise the chance of actually illegal content being attached.
This isn’t just my opinion either. Digital rights organizations such as the Electronic Frontier Foundation have talked at length about similar policies before. To quote them:
“When social media platforms adopt heavy-handed moderation policies, the unintended consequences can be hard to predict. For example, Twitter’s policies on sexual material have resulted in posts on sexual health and condoms being taken down. YouTube’s bans on violent content have resulted in journalism on the Syrian war being pulled from the site. It can be tempting to attempt to “fix” certain attitudes and behaviors online by placing increased restrictions on users’ speech, but in practice, web platforms have had more success at silencing innocent people than at making online communities healthier.”
Now, to address the rest of your comment, since I don’t just want to focus on the beginning:
I think you have to actively moderate what is uploaded
Catbox does, and as previously mentioned, often at a much higher rate than other services, and at a comparable rate to many services that have millions, if not billions of dollars in annual profits that could otherwise be spent on further moderation.
there has to be swifter and stricter punishment for those that do upload things that are against TOS and/or illegal.
The problem isn't necessarily the speed at which people can be reported and punished, but rather that the internet is fundamentally harder to track people on than real life. It's easy for cops to stake out a spot where they know someone will be physically distributing illegal content, but digitally, even if you can see the feed of all the information passing through a service, a VPN or Tor connection will anonymize an IP address in a way that most police departments can't track, and that even three-letter agencies will have a relatively low success rate against.
There’s no good solution to this problem of identifying perpetrators, which is why platforms often focus on moderation over legal enforcement actions against users so frequently. It accomplishes the goal of preventing and removing the content without having to, for example, require every single user of the internet to scan an ID (and also magically prevent people from just stealing other people’s access tokens and impersonating their ID)
I do agree, however, that we should probably provide larger amounts of funding, training, and resources to divisions whose sole goal is to go after the online distribution of various illegal content, primarily that which harms children, because it's certainly still an issue of there being too many reports to go through, even if many of them will still lead to dead ends.
I hope that explains why making file hosting services liable for user-uploaded content probably isn't the best strategy. I hate to see people with good intentions support ideas that sound good on paper but in the end just cause more harm, and I hope you can understand why I believe this to be the case.
- Comment on public services of an entire german state switches from Microsoft to open source (Libreoffice, Linux, Nextcloud, Thunderbird) 3 weeks ago:
My understanding is that for reliable email, you need to host with microsoft or google otherwise you are more likely to get sorted into junk mail.
That's technically accurate, but it depends on the context. For example, if you set up DMARC properly and use a brand new custom domain as a personal email, yeah, you're much more likely to get sent to spam at first, but not necessarily forever: the more frequently you use the domain, and the more you communicate with people on the larger providers like Google or Microsoft, the higher the "reputation" of your domain will get.
If you want the highest possible level of reliability though, then yeah, Google or Microsoft’s options are likely gonna give you the highest chance right off the bat without any fuss.
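For context, "setting up DMARC properly" roughly means publishing SPF, DKIM, and DMARC TXT records in your domain's DNS, along these lines (example.com, the selector, and the key are placeholders; adapt them to your actual mail setup):

```
example.com.                  IN TXT "v=spf1 mx -all"
mail._domainkey.example.com.  IN TXT "v=DKIM1; k=rsa; p=<your public key>"
_dmarc.example.com.           IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
```

Even with all three in place, a brand-new domain still starts with essentially no sending reputation, which is the part that only builds up over time.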
- Comment on Meta shareholders overwhelmingly rejected a proposal to explore adding Bitcoin to the company's treasury, with less than 1% voting in favor of the measure 3 weeks ago:
There is no logical reason not to hold BTC on your balance sheet over short duration US treasuries.
One is stable and is backed by the full faith and credit of a nation; the other's value is determined solely by the current speculative state of the market. A company that requires stability doesn't want to park its treasury in an extremely volatile asset. That is a highly logical reason.
- Comment on This new 40TB hard drive from Seagate is just the beginning—50TB is coming fast! 3 weeks ago:
When running a local node, the most other people could possibly see is that “x IP is running a Monero node”
When connecting to a remote node, the node can see:
- Your IP address
- When you submit a transaction (which could link your IP to your transactions)
- The last block your wallet synced (which could be used to determine when you last used or spent Monero)
It's also possible for a remote node to feed your wallet a manipulated list of decoys, which can reduce the anonymity of the transactions you submit: the node that supplied the fake decoys can simply rule them out and narrow down which output is really yours.
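For anyone who wants to avoid those leaks, pointing your wallet at your own node is fairly painless. A minimal sketch (flags as I understand them from recent monerod/monero-wallet-cli versions; double-check against the docs for yours):

```
# Run a pruned node locally (much smaller on disk than a full archive node)
monerod --prune-blockchain

# Point your wallet at it instead of a public remote node (18081 is the default RPC port)
monero-wallet-cli --daemon-address 127.0.0.1:18081
```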
- Comment on The Daily Wire is now trying to pass off AI slop as actual footage from Palestine. 4 weeks ago:
No idea if that's a common thing in AI video
As far as I’m aware, it’s not. Most AI video is consistently clear, or at most just low resolution. (i.e. it tends to maintain a consistent visual style for each clip, without much changing to composition, lighting, style, depth of field, etc)
I was originally inclined to think that the blurring was added in post, to cover up moments where the AI-generated footage did something that made it obvious it was AI. But even setting that aside, the faces, positioning, and lighting in the background remain pretty consistent across the length of the video, which AI tends to not be very good with. (i.e. in a crowd of people, faces will randomly appear, disappear, move, change height, etc in a way that's unnatural)
- Comment on The Daily Wire is now trying to pass off AI slop as actual footage from Palestine. 4 weeks ago:
I’m not inclined to trust The Daily Wire, but I also can’t exactly see anything that stands out like crazy that makes me think it’s fake.
It definitely has a somewhat surreal feeling look to it, but I’ve been tracking various faces and objects in the background and they don’t seem to move, change, or distort at all in a way that’s unnatural.
Either AI video generation technology got insanely good (or they just had a very lucky break in happening to get a good quality output) and I'm simply not able to fully identify it, or the video is real but the context and quote are what's fake.
- Comment on Mom sues porn sites (Including Chaturbate, Jerkmate, Superporn and Hentaicity) for noncompliance with Kansas age assurance law; Teen can no longer enjoy life after mom caught him visiting Chaturbate 4 weeks ago:
I’m convinced this was written by GPT.
I’m a human being. I know my writing style can often come off weird to some people, but I can assure you I don’t outsource my thinking to a word prediction program to make my points for me.
We disagree on how good or bad porn is for society and the youth, so the rest doesn’t even matter.
I haven’t seen any evidence that light or moderate consumption of porn by legal adults produces significant negative consequences for them or society at large, so long as the porn doesn’t involve non-consenting parties, underage individuals, etc. Thus, I don’t think it’s reasonable to heavily monitor and restrict access to every single individual in our society.
As for kids, research is obviously lacking, since it's somewhat of a touchy subject for researchers to study. But we know that sex ed, conversations between kids & parents, and even the most basic parental controls and monitoring can prevent the vast majority of the negative effects, and often the initial consumption while underage altogether, so that's what I advocate for.
Until I see evidence that general consumption trends cause larger harms than surveilling the online media consumption of every single citizen (on top of the risks of online censorship), and that the other methods we already know work well can't reduce that risk below the harms of a monitoring/access-control system, I'm not going to support such a system.
- Comment on Mom sues porn sites (Including Chaturbate, Jerkmate, Superporn and Hentaicity) for noncompliance with Kansas age assurance law; Teen can no longer enjoy life after mom caught him visiting Chaturbate 4 weeks ago:
You show your ID and a notary enters their credentials to allow you to create an account
The problem then lies in how whoever runs the system (likely the government) can ensure that verified accounts are indeed verified by real people.
If any notary can create these accounts just by claiming they saw a proper ID/biometrics, then even one malicious notary could make as many "verified" accounts as they want. And if notaries can be investigated for that, it means there'd be monitoring in place to see who they met with, which would defeat the privacy benefit of having only the notary look at your ID.
This also doesn’t solve the problem of people reselling stolen accounts, going to multiple notaries and getting each one to individually attest and make multiple accounts to give out or sell, etc.
with your fingerprint or FaceID
Your ID doesn't get saved. Your biometrics are only saved in the way that your iPhone saves them for a password.
If your biometrics are stored, then there’s one of two places they could be stored and processed:
- On your own device (i.e. you just use your existing fingerprint lock on your phone to secure your account, say, one that’s made via a passkey so as to make fingerprint verification possible)
This can just be bypassed by the user once they log in with their biometrics, since the credentials are then decrypted and they can just export them raw, or just have them stolen by anyone who accesses their device or installs malware, etc.
This doesn’t solve the sale, transfer, or multiple creations of accounts.
- A hash of your biometrics is stored on a government server; then, when logging in, your device sends the hash derived from your fingerprint scan to the government server to unlock your account.
The scanner that originally creates the hash for your fingerprint must be trusted to not transmit any other data about your fingerprint itself, and could be bypassed by modifying network requests to send fake hashes to the government server during account creation, thus allowing for infinite “verified” accounts to be created and sold.
This also doesn't prevent the stealing or transfer of accounts, since the hash is essentially just a password derived from your fingerprint instead of a string of text; an attacker would simply steal your hash rather than a typical password (sketched below). This would also mean the government gets a log of every time someone uses their account, and you could be instantly re-identified the moment you go to the airport and scan your fingerprint at a TSA checkpoint, for example, permanently tying your real identity back to any account you verify with your biometrics in the future.
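Here's a toy sketch of why that second setup is really just a password scheme wearing a biometric costume (all names and values are hypothetical, not any real system):

```python
import hashlib

def enroll(template: bytes) -> str:
    """Server stores only a hash of the fingerprint template."""
    return hashlib.sha256(template).hexdigest()

def login(stored_hash: str, submitted_hash: str) -> bool:
    """Server compares whatever hash the client submits against the stored one."""
    return stored_hash == submitted_hash

stored = enroll(b"alice-fingerprint-template")

# Legitimate login: the device re-derives the hash from a fresh scan.
print(login(stored, hashlib.sha256(b"alice-fingerprint-template").hexdigest()))  # True

# An attacker who bought or intercepted the hash never needs a finger at all:
print(login(stored, stored))  # True -- replaying the hash works like any leaked password
```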
The fundamental problem with these systems is that if you have to verify your identity, you must identify yourself somehow. If that requires sending your personal data to someone, it risks your privacy and security going forward. If that doesn’t require sending your personal data, then the system is easily bypassed, and its existence can’t be justified.
What’s a solution that would be acceptable for you?
I’ve said it before, and I’ll continue advocating for it going forward:
- Parental controls and simple parent-controlled monitoring software on young children’s devices
- Actual straightforward conversations between parents and kids about adult content
- Sex ed classes.
We already know these things do the most we can reasonably do to prevent underage viewing of adult content. We don’t need age verification laws, because they either harm privacy or don’t even work, when much simpler, common sense solutions already solve the problem just fine.
- Comment on Mom sues porn sites (Including Chaturbate, Jerkmate, Superporn and Hentaicity) for noncompliance with Kansas age assurance law; Teen can no longer enjoy life after mom caught him visiting Chaturbate 4 weeks ago:
they then authorize you to create an account
Authorize you how?
That would involve someone having the ability to see which accounts were made, when, and how they were authorized, not to mention likely being able to track when they're used in the future.
with biometric credentials
What does this mean? Do you mean you verify your biometric data with the notary to prove it's you? Your ID should be enough. Or do you mean your biometric data becomes your password? That doesn't prove it's you. If processing is on-device, like how phone lock screens work, then a simple piece of software could just extract the raw credentials and let people use/sell/transfer those, bypassing the biometrics. If it requires sending your biometric data to the company to log in, like a traditional password flow, then all my previous issues with biometric verification online apply.
There’s still a key difference between this hybrid approach and, like I mentioned previously, buying alcohol by showing your ID to a clerk at a counter, and it’s that the interaction ends there. If you show ID, buy alcohol, then leave, the store doesn’t do anything after that. There’s no system monitoring when or how much you’re drinking, or if you’ve offered some of that drink to someone underage, for example.
But with something like what you’re proposing, the unfortunate reality is that it has to have some kind of monitoring for it to functionally work, otherwise it becomes trivially bypassed, and thus the interaction can’t end when the person leaves.
Not to mention the fact that not all platforms people find porn on are actually dedicated porn sites. Many people are first exposed via social media, just like how they’re exposed to much of their other information and general knowledge nowadays. If we want to age gate social media porn consumption as well, we then need to age verify everyone regardless of if they intend to view porn or not, because we can’t ensure it won’t end up on their feed.
There’s a reason why I’m so strongly against these verification methods, and it’s because they always cause a whole host of privacy and security issues, and don’t even create a strong enough system to prevent unauthorized porn viewing by minors in the first place.
- Comment on Mom sues porn sites (Including Chaturbate, Jerkmate, Superporn and Hentaicity) for noncompliance with Kansas age assurance law; Teen can no longer enjoy life after mom caught him visiting Chaturbate 4 weeks ago:
Who under the age of 18 will have money to buy these
Anyone with at least $0.25-$1, and access to any method of digital payments. (Gift Cards for most retailers, PayPal, Cash App, Zelle, prepaid or non-prepaid debit cards, any cryptocurrency, etc)
and who would be willing to sell them for the pittance teenagers would be willing to spend?
Primarily bad actors that obtain the credentials any number of ways, then either directly sell them, or sell them indirectly through third-party storefronts that buy from the bad actors in bulk. Believe me, I’ve watched hundreds of kids in Discord servers publicly sharing and using sites on the clearweb where they cashapp in a dollar then buy a stolen set of bank credentials and try withdrawing money back to their Cash App account.
I’ve monitored so many of these sites, and seen how easy it is for anybody, even teens with limited financial payment options, to buy stolen credentials with infinitely more importance and personal security measures taken to keep them safe than something specifically for accessing an NSFW site.
Some of these site owners operate for months before eventually shutting down and re-opening separate storefronts for anonymity, and I know of one who was selling stolen SSNs, IDs, Gift Cards, and assorted accounts, and made, by my estimates, at least a million dollars in revenue every month off items that were almost all within the price range of any child or teenager.
Especially if these get rotated out regularly via a system wide program.
Rotation can help, but doesn’t cut off these services from operating. They just sell stuff in smaller, more quickly refilled batches instead of buying large batches and reselling them over longer time periods. It can make prices slightly higher, but in the end it doesn’t prevent kids from accessing this content.
But what it does end up doing is creating perverse incentives.
It drives people to even less regulated, more harmful porn sites. It leads to the further stealing of credentials and personal information. It creates databases and online footprints that can be used to blackmail people, and it normalizes giving sensitive personal information to random websites online.
The last thing you want when you’re trying to prevent people from getting scammed is to monetarily encourage scamming people out of their credentials and biometric data, while simultaneously making it easier for people to unknowingly hand over credentials and biometrics by normalizing the process.
This is something practically every digital rights organization argues against, and for good reason. It’s a generally unsafe system that creates bad incentives and drives people to even more unsafe options.
The best mechanisms by far to prevent kids from being exposed to harmful material, or at the very least to prevent them from being harmed much by it, are proper parental controls and general internet monitoring by parents, good sex education, and parents actually talking with their kids instead of fostering the us-vs-them mentality that drives many kids to rebel against these restrictions, even when they are meant to benefit the kid.
That's why news like this is always so upsetting to me. It's a mom who is understandably upset, but instead of taking accountability for leaving an unsecured, internet-connected laptop easily accessible to her kid with no monitoring at all, she simply puts the blame on the platforms her child decided to access, even though we know she could have done many things herself to prevent this from happening without risking anybody's privacy or safety, unlike what age-gating regulations do in practice.
- Comment on Mom sues porn sites (Including Chaturbate, Jerkmate, Superporn and Hentaicity) for noncompliance with Kansas age assurance law; Teen can no longer enjoy life after mom caught him visiting Chaturbate 4 weeks ago:
The conflict that this often boils down to is that the digital world does not emulate the real world. If you want to buy porn in the real world, you need ID, but online anything goes. I love my online anonymity just as much as everybody else, but we’ll eventually need to find some hybrid approach.
The problem is that because the internet is fundamentally different from the real world, it has its own challenges that make some of the things we do in the real world unfeasible in the digital one. Showing an ID to a clerk at a store doesn't transmit your sensitive information over the internet to or through an unknown list of companies, who may or may not store it for an undetermined amount of time, but doing so on the internet essentially has to.
While I do think we should try and prevent kids from viewing porn at young ages, a lot of the mechanisms proposed to do so are either not possible, cause many other harms by their existence that could outweigh their benefits, or are trivially bypassed.
We already scan our faces on our phones all the time, or scan our finger on our computer. How about when you want to access a porn site you have to type in a password or do some biometric credential?
Those systems are fundamentally different, even though the interaction is the same, so implementing them in places like porn sites carries entirely different implications.
For example, (and I’m oversimplifying a bit here for time’s sake) a biometric scan on your phone is just comparing the scan it takes each time with the hash (a processed version) of your original biometric scan during setup. If they match, the phone unlocks.
This verification process does nothing to verify if you’re a given age, just that your face/fingerprint is the same as during setup. It also never has to transmit or store your biometrics to another company. It’s always on-device.
Age verification online for something like porn is much more complex. When you’re verifying a user, you have to verify:
- The general location the user lives in (to determine which laws you must comply with, if not for the type of verification, then for the data retention and security, and access)
- The age of the user
- The reality of the user (e.g. a camera held up to a YouTube video shouldn’t verify as if the person is the one in the video)
- The uniqueness of the user (e.g. that this isn’t someone re-licensing the same clip of their face to be replayed directly into the camera feed, allowing any number of people to verify using the same face)
- And depending on the local regulations, the identity of the user (e.g. name, and sometimes other identifiers like address, email, phone number, SSN, etc)
This all carries immense challenges. It’s fundamentally incompatible with user privacy. Any step in this process could involve processing data about someone that could allow for:
- Blackmail/extortion
- Data breaches that allow access to other services the person has an account on
- Being added to spam marketing lists
- Heavily targeted advertising based on sexual preference
- Government registries that could be used to target opponents
This also doesn’t include the fact that most of these can simply be bypassed by anyone willing to put in even a little effort. If you can buy an ID or SSN online for less than a dollar, you’ll definitely be able to buy an age verification scan video, or a photo of an ID.
Plus, for those unwilling to directly bypass the measures on the major sites: if only the sites that actually fear government enforcement implement these measures, people will simply go to the less regulated sites.
In fact, this is a well-documented trend: whenever censorship of any media happens, porn or otherwise, viewership simply moves to noncompliant services. And of course, those services can host much worse content than the larger, relatively regulatory-compliant businesses, such as CSAM, gore, nonconsensual recordings, etc.
- Comment on Mom sues porn sites (Including Chaturbate, Jerkmate, Superporn and Hentaicity) for noncompliance with Kansas age assurance law; Teen can no longer enjoy life after mom caught him visiting Chaturbate 4 weeks ago:
They can prove its signed with the governments root cert, showing that its someone over 18, but not who.
This is generally a pretty decent system in concept, but it has some unique flaws.
A similar system is even being developed by Cloudflare (“Privacy Pass”) to make CAPTCHAs more private by allowing you to anonymously redeem “tokens” proving you’ve solved a CAPTCHA recently, without the CAPTCHA provider having to track any data about you across sites.
They know someone who had solved a captcha recently is redeeming a token, but they don’t know who.
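To make the mechanism a bit more concrete, here's a toy sketch of the blind-signature idea behind anonymous token schemes like this, using textbook RSA with deliberately tiny, insecure parameters. The real Privacy Pass protocol uses different cryptography, so treat this purely as an illustration of the concept:

```python
import hashlib, math, random

# Toy RSA blind signature -- illustrative only, with insecurely small primes.
p, q = 10007, 10009
n, e = p * q, 65537
d = pow(e, -1, (p - 1) * (q - 1))   # issuer's private exponent

# The "token" is just a small number derived from a secret the user picks.
token = int.from_bytes(hashlib.sha256(b"my-random-token-secret").digest()[:3], "big")

# User blinds the token before sending it to the issuer.
while True:
    r = random.randrange(2, n)
    if math.gcd(r, n) == 1:
        break
blinded = (token * pow(r, e, n)) % n

# Issuer signs the blinded value after checking the person's ID offline.
# It never sees `token` itself, so it can't later link the token to the person.
blind_sig = pow(blinded, d, n)

# User unblinds to get a valid signature on the original token.
sig = (blind_sig * pow(r, -1, n)) % n

# Any site can verify with the public key: it learns "this token was issued to
# *someone* verified", but not to whom.
print("token verifies:", pow(sig, e, n) == token)
```

Note that nothing in the (token, signature) pair is tied to the holder, which is exactly what makes it private, and also exactly what makes it trivially transferable.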
This type of system will always have one core problem that really can’t be fixed though, which is the sale and transfer of authenticated tokens/keys/whatever they get called in a given implementation.
Someone could simply take their signed cert, and allow anybody else to use it. If you allow the government to view whoever is using their keys, but not the porn sites, then you give the government a database of every porn user with easily timestamped logs. If you don’t give the government that ability, even one cert being shared defeats the whole system. If you add a rate limit to try and solve the previous problem, you can end up blocking access if a site, browser, or extension, is just slightly misconfigured in how it handles requesting the cert, or could break someone’s ability to use their cert the moment it gets leaked.
And even if someone isn’t voluntarily offering up their cert, it will simply get sold. I’ve investigated sites selling IDs and SSNs for less than a dollar a piece before, and I doubt something even less consequential like an ID just for accessing online adult content would even sell for that much.
I’ve seen other methods before, such as “anonymous” scans of your face where processing is done locally to prove you’re an adult, then the result of the cryptographic challenge is sent back proving you’re over 18, but that would fail anyone who looks younger but is still an adult, can be bypassed by the aforementioned sale of personal data to people wanting to verify, and is often easily fooled by videos and photos of people on YouTube, for example.
- Comment on Mom sues porn sites (Including Chaturbate, Jerkmate, Superporn and Hentaicity) for noncompliance with Kansas age assurance law; Teen can no longer enjoy life after mom caught him visiting Chaturbate 4 weeks ago:
There's absolutely something to be said for trying to ensure that kids don't have access to porn, but that won't come from what these legal battles inevitably want to impose: ID-check requirements that create a massive treasure trove of data for attackers to target, enabling ID theft, blackmail, and privacy violations, while adding costs for porn sites that will inevitably lead to more predatory monetization, such as more predatory ads.
The problem is that parents are offloading their own responsibility, and the educational role of schools, onto the sites that host and distribute pornographic content, an unworkable burden.
We know that when you provide proper sex education and talk to kids about how to consume adult content without risking their health and safety, while setting realistic expectations, you tend to get much better outcomes.
If there’s one thing I think most people are very aware of, it’s that the more you try and hide something from kids, the more they tend to try and resist that, and find it anyways, except without any proper education or safeguards.
It’s why abstinence only education tends to lead to worse outcomes than sex education, even though on the surface, you’re “exposing” kids to sexually related materials.
This doesn’t mean we should deliberately expose kids to porn out of nowhere, remove all restrictions or age checks, etc, but it does mean that we can, for example:
- Implement reasonable sex education in schools. Kids who have sex ed generally engage in healthier masturbation and sex than kids who don’t.
- Have parents talk with their kids about safe and healthy sex & relationships. It’s an awkward conversation, but we know it keeps kids healthier and safer in the long run.
- Implement a CAPTCHA-like system to make it a little more difficult (and primarily, slower and less stimulating) for kids to quickly access porn sites, e.g. requiring somewhat higher-level math problems to be solved; a rough sketch follows below. This doesn't rely on giving up sensitive personal info.
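As a rough sketch of what that kind of friction gate could look like (entirely hypothetical, not an existing system):

```python
import random

# A small algebra challenge as a friction gate: trivially beatable by a motivated
# adult, but slow and boring enough to blunt impulsive access, with no personal
# data involved anywhere.
a = random.randint(2, 12)
b = random.randint(10, 99)
x = random.randint(2, 30)
print(f"Solve for x:  {a}x + {b} = {a * x + b}")
answer = int(input("x = "))
print("access granted" if answer == x else "try again")
```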
Kids won’t simply stop viewing porn if you implement age gates. Kids are smart, they find their way around restrictions all the time. If we can’t reasonably stop them without producing a whole host of other extremely negative consequences, then the best thing we can do is educate them on how to not severely risk their own health.
It’s not perfect, but it’s better than creating massive pools of private data, perverse financial incentives, and pushing people to more fringe sites that do even less to comply with the law.
- Comment on Windows Is Adding AI Agents That Can Change Your Settings 1 month ago:
And the worst part is, I’m not even sure if they believe it, or if they’re just lying to try and pump the value of the coins they’re investing in that claim to be capable of doing that in the future.
And honestly, I don’t know which I dislike more. Deliberate ignorance, or actual stupidity.
- Comment on Windows Is Adding AI Agents That Can Change Your Settings 1 month ago:
Not that long ago. Many still do, although you’ll primarily find them in more niche spaces within the overarching crypto community.
In fact, just a few years back, I used to be one of them. Of course, later on I became disillusioned with the promises of crypto after learning more about socialism, thinking more closely about how the system fundamentally worked, and realizing that it was effectively just a slightly more distributed variant of capitalism that would inevitably fall to the same structural failings, that being capital accumulation.
To clarify the reasoning that was often used, including by myself, the reason people specifically thought blockchains would make microtransactions better is because they thought that it would lead to more user freedom, and open markets. If you can buy a skin now, then sell it later when you’re done with it, then the effective cost of the skin is lower than in a game where you are unable to sell, for instance.
Obviously the concept of selling in-game items isn't novel in any way, but the main selling point was that items could be tradeable on any marketplace (or peer-to-peer with no marketplace at all), meaning low to no fees, and the items could be given native revenue-share splits, where the publisher of a game gets a set % of every sale. That would give publishers a revenue stream that didn't depend on releasing new but low-quality things at a quick pace, and could let them focus on making higher-quality items with a slower release schedule.
Of course, looking back retrospectively:
- Financializing games more just means people play them more for money than for enjoyment
- This increases the incentives for hacking accounts to steal their items/skins
- Game publishers would then lose profits from old accounts being able to empty their skins onto the market when they quit the game instead of those skins being permanently tied to that account
There is a small subset of people who legitimately just don't understand game development fundamentals though, and they actually believe that things would just be fully interchangeable. As in, you buy a skin in Fortnite, and you can then open up Roblox and set it as your player model.
Those ones are especially not the brightest.
- Comment on ‘The Worst Internet-Research Ethics Violation I Have Ever Seen’ | The most persuasive “people” on a popular subreddit turned out to be a front for a secret AI experiment. 1 month ago:
To be fair, I do believe their research was based on how convincing it was compared to other Reddit commenters, rather than say, an actual person you’d normally see doing the work for a government propaganda arm, with the training and skillset to effectively distribute propaganda.
Their assessment of how “convincing” it was seems to also have been based on upvotes, which if I know anything about how people use social media, and especially Reddit, are often given when a comment is only slightly read through, and people are often scrolling past without having read the whole thing. The bots may not have necessarily optimized for convincing people, but rather, just making the first part of the comment feel upvote-able over others, while the latter part of the comment was mostly ignored. I’d want to see more research on this, of course, since this seems like a major flaw in how they assessed outcomes.
This, of course, doesn’t discount the fact that AI models are often much cheaper to run than the salaries of human beings.
- Comment on Perplexity CEO says its browser will track everything users do online to sell 'hyper personalized' ads | TechCrunch 1 month ago:
Chrome is relatively limited in scope compared to, say, a user on an instance of degoogled-chromium using the same Google services along with all their other browsing. The extra data that's gathered is generally going to be things like a little more DNS query information (assuming your device isn't already set to default to Google's DNS server), links you visit that don't already have Google's trackers on them (very few), and some general information like when you're turning on your computer and opening Chrome.
The real difference is that Chrome doesn't protect you the way other browsers do, and it thus makes more of the collection that Google's services do directly possible.
Perplexity is still being pretty vague here, but if I had to guess, it would essentially be taking all the stuff that Google would usually get from tracking pixels and ad cookies, and baking that directly into the browser instead of relying on individual sites to include it.
- Comment on Philosophy moment 1 month ago:
The fact that we ever allowed kids to scroll instead of paying attention in class is absurd.
I’ve never actually seen a classroom where this was the case. (aside from after work was completed, sort of as a reward for finishing their assignments on time) Most teachers will immediately tell students to put the phone away and will confiscate it if they keep trying to use it.
When they’re talking about phone bans, they’re usually meaning things like taking phones away at the front and returning them at the end of the day, or requiring students to leave them in lockers/locked pouches.
- Comment on Put him on the cart. 2 months ago:
- Comment on In heat 2 months ago:
It depends on the person in my experience.
For instance, I’ll often use a question format, but usually because I’m looking for similar results from a forum, in which I’d expect to find a post with a similar question as the title. This sometimes produces better results than just plain old keywords.
Other times though, I'm just throwing keywords out and adding quotation marks ("") around the ones I require be included. But I do know some people who only ever ask in question format no matter the actual query. (e.g. "What is 2+2" instead of just typing "2+2" and getting the calculator dialogue)
- Comment on The Fairphone 5 price has been dropped to €499. The phone is designed to be the most advanced environmentally friendly smartphone. 2 months ago:
Is this phone also more secure?
Probably not.
Apple & Google have spent considerable amounts of time building out hardware security infrastructure for their products, and I find it extremely unlikely that Fairphone would have been able to match it.
For example, the popular alternative Android OS GrapheneOS only supports Google Pixels, because: (Emphasis added by me)
“There are currently no other devices meeting even the most basic security requirements while running an alternate OS. GrapheneOS is very interested in supporting a non-Pixel brand, but the vast majority of Android OEMs do not take security seriously. Samsung takes security almost as seriously as Google, but they deliberately cripple their devices when unlocking them to install another OS and don’t allow an alternate OS to use important security features. If Samsung permitted GrapheneOS to support their devices properly, many of their phones would be the closest to meeting our requirements. They’re currently missing the very important hardware memory tagging feature, but only because it’s such a new feature”
If even Samsung, a multi-billion dollar company and the only other phone brand on the market they consider close to meeting their standards, doesn't support every modern hardware security feature and deliberately cripples security for alternate OSes, I doubt Fairphone has custom-built hardware security mechanisms for their phones to the degree that Google has.
- Comment on Jack Dorsey and Elon Musk would like to ‘delete all IP law’ | TechCrunch 2 months ago:
Not to mention the fact that the stronger IP law is, the more it’s often used to exploit people.
Oh, did you as an artist get given stronger rights for your work? That platform you’re posting on demands that you give them a license for any possible use, in exchange for posting your art there to get eyeballs on your work.
Did your patents just get stronger enforcement? Too bad it’s conveniently very difficult to fund and develop any product at scale under that patent without needing outside investor funding into a new corporate entity that will own the patent, instead of you!
To loosely paraphrase from Cory Doctorow: If someone wants a stronger lock, but won’t give you the key, then it’s not for your benefit.
If corporations get to put locks on everything with keys they own, but also make it hard for you to get or enforce access to the keys to the locks on your stuff, then the simplest way to level the playing field is to simply eliminate the locks.
- Comment on Did ChatGPT come up with Trump’s tariff rate formula? AI chatbots ChatGPT, Gemini, Claude and Grok all return the same formula for reciprocal tariff calculations, several X users claim. 2 months ago:
I tried replicating this myself, and got no similar results. It took enough coaxing just to get the model to not specify existing tariffs, then to make it talk about entire nations instead of tariffs on specific sectors, then after that it mostly just did 10, 12, and 25% for most of the answers.
I have no doubt this is possible, but until I see some actual amount of proof, this is entirely hearsay.
- Comment on Online ‘Pedophile Hunters’ Are Growing More Violent — and Going Viral: With the rise of loosely moderated social media platforms, a fringe vigilante movement is experiencing a dangerous evolution. 2 months ago:
These folks include presenting a false person as being of age, then switching to underage at the time of meetup when the target shows up.
I’ve never seen even a single instance in my own viewership of numerous channels that engage in pedophile hunting where the person is presented as being above the legal age of consent, then only switching to underage at the time of the meeting. They’re presented as underage from the get-go.
Then the group tries to kill the person
Again, this doesn't seem to be a widespread thing compared to the number of groups that simply lure the person to a location and then ask them questions (and directly state that they are free to leave at any time, since the group isn't law enforcement and can't arrest them). The people you're talking about are a small minority of both the actual number of pedo hunters and the overall views received.
And the perpetrators think this is justice.
I doubt the people that are explicitly lying to farm content think it’s justice. I do believe the people actually catching people who voluntarily contacted someone presented as underage from the start do.
- Comment on Online ‘Pedophile Hunters’ Are Growing More Violent — and Going Viral: With the rise of loosely moderated social media platforms, a fringe vigilante movement is experiencing a dangerous evolution. 2 months ago:
Whether it's comparable depends on how these channels go about finding their targets.
Remember, entrapment is based around luring someone to do something they otherwise would not have done had the operation to entrap them not occurred. If they created an account posing as a minor, then directly DM’d a person asking if they wanted to do x/y/z with a minor, that would be entrapment.
But if they made an account claiming to be a minor on social media, and the person contacted them voluntarily, asked their age, was told it was under 18 and still continued messaging, then sent explicit photos, that’s not entrapment.
However, if they were then the people who initiated the conversation about wanting the person to come to their house / visit them somewhere, that could be considered entrapment, and the only evidence against the person that could be eligible for use in court would be the explicit material they sent without being prompted.
It varies case-by-case, but from what I’ve seen, most of the larger operations tend to try and avoid entrapment-like tactics in most cases, where they only allow the other person to initiate unlawful behaviors, rather than prompting anything themselves.
- Comment on Online ‘Pedophile Hunters’ Are Growing More Violent — and Going Viral: With the rise of loosely moderated social media platforms, a fringe vigilante movement is experiencing a dangerous evolution. 2 months ago:
couldn’t they just run down the registered sex offender list
The point of their channels is usually to find new predators that haven’t been caught, so they can then face legal consequences, (or at least be pushed to stop acting on their desires) rather than to punish people for being pedophiles in general, so it wouldn’t really make sense to go after those who were already convicted.
- Comment on [deleted] 2 months ago:
hotel
I think you mean “all-inclusive” resort (that isn’t all inclusive and actually charges a gazillion dollars in random fees) that makes them feel like they’re experiencing local culture while actually just experiencing the effects of the resort chain exploiting the local population for cheap labor while cheaply imitating the culture.
Don’t worry, we Americans are definitely capable of escaping our cultural bubble! /s
- Comment on 50% of parents financially support adult children, report finds. Here's how much it costs them. 2 months ago:
Not only are their wages lower than their parents’ earnings when they were in their 20s and 30s, after adjusting for inflation, but they are also carrying larger student loan balances, many reports show.
True, true. Surely they won’t try to both-sides this and make it seem like they’re overreacti-
But by other measures, young adults are doing well.
Oh no.
Compared with their parents at this age, Gen Zers are more likely to have a college degree
Because more jobs require them even when they’re not necessary. Also, see the crippling debt you just mentioned.
and work full time.
Yet still make less than their parents while working longer hours. How is this “doing well?”
Plus, many millennials have more saved for retirement than they did just a few years ago, after reaping the benefits of positive market conditions.
Fun fact, if you save money for retirement, it tends to go up, shocking.
- Comment on GrapheneOS partial offline backup? 2 months ago:
GrapheneOS has Seedvault integrated in Settings > System > Backup, where you can choose what you want backed up, back it up into a file, then transfer that file anywhere you want.
You can then restore a backup from the three-dot menu in the top right.
I haven't had to do a restore yet myself, so I'm unsure how well it restores, say, text messages, to an existing client on your phone. (I'm not sure if it replaces your existing messages or not)
As for Signal, I believe it now supports restoring up to the last 45 days of message history from your device. (Even after wiping your messages off Signal on your phone, leave the app itself installed, as it has totally innocuous files embedded in it that definitely don't do anything to harm the police hardware used to crack phones.)