Comment on Apparently Palantir can access the content of social media accounts that were deleted a decade ago.
CovfefeKills@lemmy.world 4 days ago
So like internet archive? oooo spooky
AnyOldName3@lemmy.world 4 days ago
The scary dystopian part is the ability to work out that the account belonged to someone who hadn’t used it for a decade rather than just that they could see what had been posted. The Internet Archive doesn’t let you ask it what someone’s Digg username was.
CovfefeKills@lemmy.world 4 days ago
So you acknowledge that the data exists, what you are scared of is being able to search it? Spooky stuffs.
mriormro@lemmy.zip 4 days ago
So you acknowledge that bullets exist, what you are scared of is being able to continuously fire them at an extremely high rpm? Spooky stuff.
You fucking knuckle dragger.
CovfefeKills@lemmy.world 4 days ago
You never considered that bullets could be fired at a high rate until an article you saw on lemmy told you to be scared of it?
korazail@lemmy.myserv.one 4 days ago
I’m going to say that this is actually spooky.
Not that it’s unreasonable, but that the scale of what AI can surveil is so vast that there’s no more personal security-via-obscurity.
It used to be that unless someone had a reason to start looking at you, anything you did online or off was effectively impossible to search. You might be caught on some store's CCTV, or your cell provider might have location pings, but that data wasn't online for anyone to see and needed a warrant for the police to use it to track your activities. Now cities are using Flock and similar tools to track vehicles across the country without any justification, and stores are using cloud-based AI cameras to try to track your mood as you move through the store. These tools can and have been abused.
Now, due to the harvesting of this data for AI, anything that's ever been recorded (video footage, social media posts, etc.) and used as training data can be correlated much more easily, long after it occurred, and without needing to be law enforcement with a warrant.
I’d call that spooky.
CovfefeKills@lemmy.world 4 days ago
So you think private and open-source intelligence spontaneously came into existence in the last 5 years because of AI?
Trainguyrom@reddthat.com 3 days ago
Usually that's the insurmountable mountain. Data collection is easy. Formatting, storing, and querying the data so you can actually get useful information out of it in a time-efficient manner is the extremely hard part.
For a real-world example: the organization I work at does quarterly audits of all of the field offices to make sure they're in compliance, checking required document retention, gear, etc. When an audit finds a requirement that is out of compliance, the office is given a task with a deadline to bring it back into compliance, and these tasks have visibility all the way up the chain of command, to where even the C-levels are reviewing them regularly. I've been working on a project recently to flag repeated failures of the same audit requirement at the same location, and it's highlighting that some field offices are not actually coming back into compliance once these high-visibility assigned tasks are completed. When I presented it to leadership, it was a revelation just how many field offices are continuously out of compliance.
Point is, this data is being actively collected and formatted for easy access, and there are still glaring issues being missed due to the difficulty of finding these trends buried in the hundreds of pages of data generated each quarter per field office.
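The repeat-failure flagging described above can be sketched roughly like this: group audit findings by (office, requirement) and surface any pair that fails in more than one quarter. This is a minimal illustrative sketch, not the commenter's actual system; the field names and data shape are assumptions.

```python
# Illustrative sketch: flag field offices that fail the same audit
# requirement across multiple quarters. Schema is hypothetical.
from collections import defaultdict

findings = [
    {"office": "Springfield", "requirement": "doc-retention", "quarter": "2024Q1"},
    {"office": "Springfield", "requirement": "doc-retention", "quarter": "2024Q3"},
    {"office": "Shelbyville", "requirement": "gear-check", "quarter": "2024Q2"},
]

def repeat_failures(findings):
    """Return {(office, requirement): sorted quarters} for repeat offenders."""
    seen = defaultdict(set)
    for f in findings:
        seen[(f["office"], f["requirement"])].add(f["quarter"])
    # Only keep pairs that failed in more than one distinct quarter.
    return {key: sorted(qs) for key, qs in seen.items() if len(qs) > 1}

print(repeat_failures(findings))
# → {('Springfield', 'doc-retention'): ['2024Q1', '2024Q3']}
```

The point of grouping by (office, requirement) rather than by office alone is exactly the distinction in the comment: an office that closes each task but keeps re-failing the same requirement looks compliant task-by-task but not trend-wise.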
CovfefeKills@lemmy.world 3 days ago
Yea, for sure, more accountable and reliable systems would be better than worse systems. Great point.