But that’s my point: applying deterministic steps to a truly random input doesn’t strip the output of its randomness. You use real entropy as your starting point, which is literally what you call “true randomness”. The output therefore has the same level of “true randomness” as the “truly random” input, because you mathematically don’t lose entropy along the way.
KairuByte@lemmy.dbzer0.com 1 year ago
The input is not truly random, though. If it were, we could just use that input directly, with no other steps, and have a truly random output. You’re confusing an unknown state with randomness.
FooBarrington@lemmy.world 1 year ago
No, it actually and literally is truly random. To reliably predict the initial entropy of a given modern system, you’d need to know the hardware itself and the environment around it in incredible detail (including the temperature of every small patch of material, the airflow, and the state of the air in and around the case), since tiny changes in e.g. temperature completely change the input.
It’s only a small bit of entropy, but enough to kick-start the RNG in a way that can reliably create high-quality entropy.
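The mechanism being described can be sketched in a few lines: read a small seed from the OS entropy pool (which collects exactly this kind of hardware/environmental noise) and stretch it deterministically with a hash. This is a hypothetical toy sketch, not a production DRBG (real ones follow constructions like NIST SP 800-90A); `make_drbg` and `next_block` are names invented here for illustration.

```python
import hashlib
import os

def make_drbg(seed: bytes):
    """Toy deterministic generator kick-started by a real-entropy seed."""
    counter = 0

    def next_block() -> bytes:
        nonlocal counter
        counter += 1
        # Deterministic step: hashing doesn't reveal the seed, so each
        # output block inherits the seed's unpredictability.
        return hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()

    return next_block

# A small amount of hardware/environmental entropy from the OS pool...
seed = os.urandom(32)
gen = make_drbg(seed)
# ...is stretched into arbitrarily many 32-byte blocks that are
# unpredictable to anyone who doesn't know the seed.
block1, block2 = gen(), gen()
```

The point of contention in the thread maps directly onto this sketch: the expansion step is fully deterministic (the same seed always reproduces the same blocks), so the unpredictability of the output rests entirely on the unpredictability of the seed.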
KairuByte@lemmy.dbzer0.com 1 year ago
So you’re literally arguing that knowable inputs, however unlikely knowing them might be, run through known deterministic calculations, result in a guaranteed unknowable output?
FooBarrington@lemmy.world 1 year ago
No, I’m arguing that in the general case the inputs aren’t knowable to the required degree, which is what defines their entropy, and that this entropy isn’t mathematically lost through deterministic calculations; its quality is improved.