Comment on I built a local AI movie recommender for Radarr using Ollama
eager_eagle@lemmy.world 2 weeks ago
no one is saying everyone has to ask an LLM for movie recommendations
illusionist@lemmy.zip 2 weeks ago
OP wrote a Python script that calls an LLM to ask for a recommendation.
But you are right, OP doesn’t say that everyone has to do it
eager_eagle@lemmy.world 2 weeks ago
No, it doesn’t do that. It gets embeddings from an LLM and uses that to rank candidates.
illusionist@lemmy.zip 2 weeks ago
Are you a trollm?
eager_eagle@lemmy.world 2 weeks ago
It’s not, I read the code. It’s not asking the LLM for recommendations, it’s using embeddings to compute scores based on similarities.
bandwidthcrisis@lemmy.world 2 weeks ago
I had to look up embeddings: so this is comparing the encodings of movies as a similarity test?
Which can work because the encoding method can indicate closeness of meaning.
And that’s why this isn’t prompting an LLM in any way.
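Roughly, yes. A minimal sketch of what embedding-based ranking looks like (this is a hypothetical illustration, not OP's actual script; real embeddings come from a model, e.g. via Ollama's embeddings endpoint, and have hundreds of dimensions, the vectors here are made up):

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means same direction
    # ("same meaning"), values near 0 mean unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 3-d embeddings; a "taste profile" built from movies the
# user already has, and two candidate movies to rank against it.
library_profile = [0.9, 0.1, 0.3]
candidates = {
    "Movie A": [0.8, 0.2, 0.4],
    "Movie B": [0.1, 0.9, 0.2],
}

# Rank candidates by similarity to the profile; no text generation involved.
ranked = sorted(candidates,
                key=lambda t: cosine_similarity(library_profile, candidates[t]),
                reverse=True)
print(ranked)
```

The model only produces the vectors; the "recommendation" is just this kind of nearest-neighbor math over them.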