I think I need this. Finally, a real use for "AI".
The number of how-to videos you have to sit through, when all you want is one little piece of info you should be able to search or scan for, has been a problem since before the internet figured out how to increase clicks by turning web pages into slideshows.
Can you link me a how-to video on how to get started, and send me a summary from your working setup?
UnderpantsWeevil@lemmy.world 2 months ago
Only a matter of time before LLMs start injecting their own ads into these responses.
pennomi@lemmy.world 2 months ago
Nah, local LLMs can easily handle transcription and summarization. I bet you could do that nicely with Llama 8B without even needing a GPU.
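A minimal sketch of what that might look like, assuming you have an Ollama server running locally (its default REST endpoint is `http://localhost:11434/api/generate`); the model tag and the example transcript are illustrative, not prescriptive:

```python
import json
import urllib.request


def build_summary_prompt(transcript: str) -> str:
    """Wrap a transcript in a summarization instruction."""
    return (
        "Summarize the following video transcript in three bullet points, "
        "keeping any concrete steps or commands:\n\n" + transcript
    )


def summarize_local(transcript: str, model: str = "llama3.1:8b") -> str:
    """Send the prompt to a locally running Ollama server (CPU is fine for 8B)."""
    payload = json.dumps({
        "model": model,
        "prompt": build_summary_prompt(transcript),
        "stream": False,  # ask for one complete JSON response instead of a stream
    }).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default local endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Hypothetical transcript snippet; replace with real transcript text.
    print(summarize_local("First, open the settings menu. Then enable dark mode."))
```

You'd pull the model first with `ollama pull llama3.1:8b`; everything stays on your own machine, so no per-request fees.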
cheese_greater@lemmy.world 2 months ago
Can't wait to have these
Gigasser@lemmy.world 2 months ago
You already can, I think? Ollama is something you can install, and then you can set up a web UI like SillyTavern for roleplay, or some other UI better suited to whatever you want. Also, Linux is great for projects like these; on Windows it's a fucking pain to set up, on Linux it's easy.
seaQueue@lemmy.world 2 months ago
By that point I'm pretty sure we'll have an effective compact model that can run locally and transcribe downloaded videos on reasonable hardware. Or you can just sic a paid model like ChatGPT on the task. The corporate internet is entirely focused on subscription models now; unless you run the model yourself on local hardware, you're going to end up paying someone somewhere a service fee.
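A hedged sketch of that fully local transcribe-then-summarize pipeline, assuming `faster-whisper` for the speech-to-text step; the model size, device, and file name are placeholders:

```python
def chunk_text(text: str, max_words: int = 800) -> list[str]:
    """Split a long transcript into model-sized chunks on word boundaries."""
    words = text.split()
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), max_words)]


def transcribe(path: str) -> str:
    """Transcribe an audio/video file locally (CPU works, no API fees)."""
    from faster_whisper import WhisperModel  # assumed dependency: pip install faster-whisper
    model = WhisperModel("small", device="cpu")
    segments, _info = model.transcribe(path)
    return " ".join(seg.text for seg in segments)


if __name__ == "__main__":
    transcript = transcribe("downloaded_video.mp4")  # hypothetical file
    # Each chunk could then be handed to a local LLM for summarization.
    for chunk in chunk_text(transcript):
        print(chunk[:80], "...")
```

Chunking matters because an 8B model's context window is finite; you summarize each chunk, then summarize the summaries if needed.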
Wildly_Utilize@infosec.pub 2 months ago
Local and open source