Comment on "Teens exploited by fake nudes illustrate threat of unregulated AI"
rar@discuss.online 1 year ago
Yes, I suppose that given equal inputs (model, prompt, seed, etc.) two Stable Diffusion installs should output the same images; what I am curious about is whether the hardware configuration (e.g. GPU manufacturer) could result in traceable variations. As abuse of this tech gains prominence, tracing a piece of synthetic media back to its producer via their specific hardware combination could become a thing.
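To make the reproducibility question concrete, here is a minimal sketch (assuming the Hugging Face diffusers library, a CUDA GPU, and an example checkpoint ID, runwayml/stable-diffusion-v1-5) that generates an image with a fixed seed and hashes the raw pixels; running it on two machines and comparing the digests would show whether the hardware introduces any variation at all:

```python
# Minimal cross-machine reproducibility check: fixed seed, then hash the pixels.
# Assumptions: the Hugging Face "diffusers" library, an example model ID,
# and a CUDA GPU; prompt and seed are arbitrary illustrative values.
import hashlib

import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint; any SD model works
    torch_dtype=torch.float16,
).to("cuda")

# Fixing the seed pins the initial latent noise, so identical inputs should
# give identical outputs -- unless hardware or driver differences change the
# floating-point math along the way.
generator = torch.Generator(device="cuda").manual_seed(12345)
image = pipe("a red bicycle on a beach", generator=generator).images[0]

# Hash the raw pixel bytes; compare this digest across machines.
print(hashlib.sha256(image.tobytes()).hexdigest())
```

In practice, differences in GPU architecture, CUDA/cuDNN versions, and non-deterministic kernels can flip low-order bits, so matching digests are not guaranteed even with identical inputs; whether those differences are consistent enough to fingerprint a specific rig is the open question.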
JohnEdwa@sopuli.xyz 1 year ago
While it could work like bullet forensics, where, given access to the gun, you can fire it and compare the new bullet to the original, there is no way to look at a generated image and figure out exactly what made it. Well, unless the creator is dumb enough to keep the metadata enabled: by default, the AUTOMATIC1111 Stable Diffusion web UI embeds all of the generation parameters in the file itself.
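For instance, here is a minimal sketch (assuming the Pillow library and a PNG saved by the AUTOMATIC1111 web UI with its default settings, under a hypothetical filename output.png) that reads back those embedded parameters; the web UI stores them in a PNG text chunk named "parameters":

```python
# Read the generation parameters that the AUTOMATIC1111 web UI embeds in
# PNG files by default. Assumes Pillow is installed and "output.png" is a
# hypothetical file saved by the web UI with metadata enabled.
from PIL import Image

img = Image.open("output.png")

# PNG text chunks show up in the .info dict; the web UI uses the key "parameters".
params = img.info.get("parameters")
if params:
    print(params)  # prompt, negative prompt, seed, sampler, model hash, etc.
else:
    print("No embedded parameters found -- metadata was stripped or disabled.")
```

Stripping that chunk (or re-encoding the image) removes the trail, which is why metadata alone is not a reliable forensic tool.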