Comment on: Is it possible to run an LLM on a mini-PC like the GMKtec K8 and K9?
al4s@feddit.de 4 months ago
LLMs work by always predicting the next most likely token, and LLM detection works by checking how often the next most likely token was chosen. You can tell the LLM to choose less likely tokens more often (turn up the temperature parameter), but you will only get gibberish out if you do. So no, there isn't.
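A minimal sketch of what the temperature parameter does, using toy logits (the numbers and function names here are illustrative, not from any particular model): dividing the logits by the temperature before the softmax flattens or sharpens the distribution, so a high temperature makes unlikely tokens more probable.

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0):
    """Sample a token index after temperature-scaling the logits.
    Higher temperature flattens the distribution (less likely tokens
    get picked more often); lower temperature sharpens it."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(probs)), weights=probs, k=1)[0]

def prob_of_top_token(logits, temperature):
    """Probability assigned to the model's most-preferred token."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    top = max(range(len(logits)), key=lambda i: logits[i])
    return exps[top] / sum(exps)

# Toy example: token 0 is strongly preferred by the model.
logits = [5.0, 1.0, 0.5]

# At low temperature the top token dominates; at high temperature the
# distribution flattens toward uniform, which is why output degrades
# into gibberish as the temperature is turned up.
print(prob_of_top_token(logits, 0.5))  # close to 1.0
print(prob_of_top_token(logits, 2.0))  # noticeably lower
```

This is why cranking up the temperature to dodge detectors doesn't work: the only way to stop the model from picking the most likely token so often is to push its output toward random noise.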
TheBigBrother@lemmy.world 4 months ago
What if you train the AI on human-generated content, for example e-books?