Comment on Does anyone else feel like "analog" stuff is more "tangible"?

Nibodhika@lemmy.world ⁨15⁩ ⁨hours⁩ ago

There are several ways to counter that sort of thing, but let’s start from the beginning. LLMs (what people call AI) are VERY computationally heavy: you need a powerful GPU to run a model locally, and it uses lots of power and memory. The idea that we’re even remotely close to something like that being embedded into hardware without people realizing it is just absurd.
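To put rough numbers on "computationally heavy", here's a back-of-envelope sketch. The figures are illustrative assumptions (a 7B-parameter model, fp16 weights, the common ~2 FLOPs-per-parameter-per-token rule of thumb), not measurements of any specific model:

```python
# Back-of-envelope cost of running even a "small" LLM locally.
# All numbers are illustrative assumptions, not benchmarks.

params = 7e9             # assumed 7-billion-parameter model
bytes_per_param = 2      # fp16 weights

weights_gb = params * bytes_per_param / 1e9
flops_per_token = 2 * params  # rough rule of thumb: ~2 FLOPs per param per token

print(f"weights alone: ~{weights_gb:.0f} GB of memory")
print(f"compute: ~{flops_per_token / 1e9:.0f} GFLOPs per generated token")
```

That's on the order of 14 GB just to hold the weights, and billions of operations per token of output — not something you hide in a keyboard controller.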

But let’s imagine someone is able to make it, and magically prevents hackers from breaking it open and using it as extra free compute. It would have to live in the CPU, since anywhere else wouldn’t have the authority to “delete files”, and even the CPU would have a hard time doing that. This LLM would also need to distinguish what I’m writing from what I’m reading, otherwise it would delete files whenever I merely read something. It would need to reply in under a millisecond, otherwise the computer would lag absurdly. And it couldn’t update its local model, because it has no network access, so you could just use tokens it has never heard of.
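The sub-millisecond point can also be sketched with arithmetic. Assume the same hypothetical 7B model (~14 GFLOPs per token) and grant the hidden chip a generous 1 TFLOP/s of sustained throughput — both numbers are assumptions for illustration:

```python
# Why a sub-millisecond reply budget is implausible.
# Assumed figures: 7B-param model at ~2 FLOPs/param/token,
# and a hidden accelerator sustaining 1 TFLOP/s.

flops_per_token = 2 * 7e9  # assumed per-token cost
throughput = 1e12          # assumed sustained FLOP/s of the hidden hardware

ms_per_token = flops_per_token / throughput * 1000
print(f"~{ms_per_token:.0f} ms per generated token")
```

Even under those generous assumptions you get roughly 14 ms per token, an order of magnitude over a 1 ms budget, before the model has produced a single token of its "decision".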

In short, if someone managed to add a piece of hardware capable of doing that, it would have to be significantly more powerful than the hardware it’s embedded in, and it would only work until someone broke it open and gave everyone a free hardware upgrade.

You can relax: nothing like that is even remotely close to being feasible.

That being said, Windows doing this or something similar at the software level is a possibility; your best bet is to use an open source system.
