Yeah sure, but when do you stop gathering regularly constructed data when your goal is to grab as much as possible?
Markov chains are an amazingly simple way to generate data like this, and with a little bit of stacked logic it’s going to be indistinguishable from real large data sets.
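The idea above can be sketched in a few lines. This is a minimal word-level Markov chain, not anyone's actual tarpit implementation: it learns which word follows which in a seed text, then walks the chain to spit out plausible-looking filler. The seed corpus here is a made-up placeholder.

```python
import random
from collections import defaultdict

def build_chain(text):
    # Map each word to the list of words that follow it in the seed text.
    chain = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def generate(chain, start, length=20):
    # Walk the chain, picking a random successor at each step.
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:
            break
        out.append(random.choice(successors))
    return " ".join(out)

corpus = "the cat sat on the mat and the dog sat on the rug"
chain = build_chain(corpus)
print(generate(chain, "the"))
```

Layer a few of these (sentence-level, paragraph-level) and the output passes a casual glance, which is all a bulk scraper gives it.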
Korhaka@sopuli.xyz 11 months ago
You can compress multiple TB of nothing (with the occasional meme) down to a few MB.
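That ratio is easy to demonstrate. A rough sketch (10 MB of zero bytes standing in for the TBs, so it runs quickly): highly repetitive data compresses at roughly 1000:1 under gzip, so a served payload of a few MB expands into GBs on the scraper's side.

```python
import gzip

# 10 MB of zero bytes stands in for "nothing".
payload = b"\x00" * (10 * 1024 * 1024)

compressed = gzip.compress(payload)
ratio = len(payload) / len(compressed)
print(f"{len(payload)} bytes -> {len(compressed)} bytes (~{ratio:.0f}:1)")
```

gzip (DEFLATE) tops out around 1030:1 on a single stream, so a true multi-TB bomb either streams the compressed data on the fly or nests archives.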
essteeyou@lemmy.world 11 months ago
When I deliver it as a response to a request, I have to serve the gzipped version if nothing else. To get to the point where I’m poisoning an AI, I’m assuming it’s going to require gigabytes of data transfer that I pay for.
At best I’m adding to the power consumption of AI.
MonkeMischief@lemmy.today 11 months ago
…and it’s just bouncing around and around and around in circles before its handler figures out what’s up…
Heehee I like where your head’s at!