@themurphy @rigatti There is one difference ... LLMs can't be more efficient; there is an inherent limitation to the technology.
https://blog.dshr.org/2021/03/internet-archive-storage.html
In 2021 the Internet Archive used about 200 PB, and they certainly didn't make a copy of the complete internet. Now ask yourself whether all of that information can fit into a 1 TB model without losing anything. (Side note: DeepSeek R1 is 404 GB, so not even 1 TB, and local LLMs are usually < 16 GB.)
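A rough back-of-envelope sketch in Python, using the approximate figures above (200 PB archive, 404 GB for DeepSeek R1, 16 GB for a typical local model; all are ballpark numbers, not exact):

```python
# Back-of-envelope: how much the archived web would have to shrink
# to fit into a single model's weights (approximate sizes from above).
archive_bytes = 200e15       # ~200 PB, Internet Archive storage in 2021
deepseek_r1_bytes = 404e9    # ~404 GB, DeepSeek R1 weights
local_llm_bytes = 16e9       # ~16 GB, a typical local LLM

print(f"vs. DeepSeek R1: {archive_bytes / deepseek_r1_bytes:,.0f}x reduction needed")
print(f"vs. a 16 GB local model: {archive_bytes / local_llm_bytes:,.0f}x reduction needed")
# Roughly 500,000x and 12,500,000x -- far beyond any lossless compression,
# so the weights can only hold a lossy summary of the source data.
```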
This technology has never been, and will never be, able to replicate the original information 100%.
It has its uses (machine learning has been around much longer already), but it isn't what people want it to be (imho).