That output doesn’t really conform to the system prompt, does it?
Comment on $219 Springer Nature book on machine learning was written with a chatbot
tfowinder@lemmy.ml 2 days ago
Hmm, interesting. It seems your model is hallucinating a lot; maybe try improving your system prompt. I have deepseek-ai_DeepSeek-R1-Distill-Qwen-1.5B running locally. Here is my output for the original comment:
The article discusses the perception of a book as trash due to the author’s use of LLM without informing readers. The author highlights the benefits of using LLMs, such as saving time and improving productivity, by suggesting that using locally deployed LLMs can significantly enhance the summarization process. The article also warns against the misuse of LLMs, emphasizing the importance of verifying information. The author concludes that relying on LLMs can surpass human capabilities when combined with personal knowledge. Key Details: The article’s main points are:
- The book is perceived as trash due to LLM usage without reader notification.
- LLMs save time and improve productivity.
- Using locally deployed LLMs for summarization is effective.
- Misuse of LLMs can lead to false information.
- Combining LLMs with personal knowledge enhances quality.
I use the following prompt before the article:
You are a concise summarization AI. Follow these rules:
- NEVER exceed 4 sentences or 150 words.
- Use this format:
"Summary: [2-sentence core idea].
Key Details: [3–4 bullet points]."
- Omit examples, disclaimers, or fluff.
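For anyone wanting to reproduce this locally: the comment doesn't say how the prompt is wired up, but with most local runners (llama.cpp server, Ollama, etc.) the rules above go in as a system message and the article as the user message. A minimal sketch of that message assembly; the `build_messages` helper name is illustrative, not from the comment:

```python
# System prompt quoted from the comment above; sent once, before the article.
SYSTEM_PROMPT = """You are a concise summarization AI. Follow these rules:
- NEVER exceed 4 sentences or 150 words.
- Use this format:
  "Summary: [2-sentence core idea].
  Key Details: [3-4 bullet points]."
- Omit examples, disclaimers, or fluff."""

def build_messages(article_text: str) -> list[dict]:
    """Build an OpenAI-style chat payload: system rules first, then the article."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": article_text},
    ]

messages = build_messages("Paste the article text here.")
```

The resulting `messages` list can be passed to any chat endpoint or chat template that a local runner exposes; the model then sees the length/format rules before it sees the article.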
andallthat@lemmy.world 2 days ago
Interestingly, your original comment is not much longer and I find it much easier to read.
Was it written with the help of an LLM? Not being sarcastic; I’m just trying to understand whether the (perceived) deterioration in quality was due to the input already being LLM-assisted.
tfowinder@lemmy.ml 2 days ago
No, it’s not an LLM; I wrote it from my own brain, though my first language is not English.
andallthat@lemmy.world 1 day ago
Congrats then, you write better than an LLM!