It seems reasonable given it includes multiple AI models.
Comment on Audacity adds AI audio editing capabilities thanks to free Intel OpenVINO plugins
BlastboomStrice@mander.xyz 1 year ago
2gb plugin??!
bamboo@lemm.ee 1 year ago
Fisch@lemmy.ml 1 year ago
2 GB is pretty normal for an AI model. I have some small LLMs on my PC and they’re about 7-10 GB each. The big ones take up even more space.
Sneptaur@pawb.social 1 year ago
Isn’t Tenacity a joke project made by 4channers?
CaptainBasculin@lemmy.ml 1 year ago
That fork is Sneedacity, which is very dead.
Sneptaur@pawb.social 1 year ago
Gotcha, thank you for the info. Gotta admit their made-up words are pretty funny
RmDebArc_5@lemmy.ml 1 year ago
Tenacity is an Audacity fork without telemetry
mp3@lemmy.ca 1 year ago
Isn’t the telemetry in Audacity opt-in anyway?
Fisch@lemmy.ml 1 year ago
The fork was created when Audacity was bought and one of the first things the new developers were about to do was add opt-out telemetry. People didn’t like that at all. From what I read in this thread, they ended up adding opt-in telemetry instead.
9point6@lemmy.world 1 year ago
AI models are often multiple gigabytes, tbh it’s a good sign that it’s not “AI” marketing bullshit (less of a risk with open source projects anyway). I’m pretty wary of “AI” audio software that’s only a few megabytes.
interdimensionalmeme@lemmy.ml 1 year ago
TensorFlow Lite models are tiny, but they’re potentially as much of an audio revolution as synthesizers were in the 70s. It’s hard to tell if that’s what we’re looking at here.
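If you want to poke at a TensorFlow Lite model yourself, here’s a minimal sketch (the model file name is hypothetical, just a stand-in for whatever .tflite file you have):

```python
import os
import tensorflow as tf

# Hypothetical path to a quantized TFLite audio model.
model_path = "denoiser.tflite"
print(os.path.getsize(model_path) / 1e6, "MB")  # often just a few MB

# Load it and inspect the expected input shape.
interpreter = tf.lite.Interpreter(model_path=model_path)
interpreter.allocate_tensors()
print(interpreter.get_input_details()[0]["shape"])
```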
Neato@ttrpg.network 1 year ago
Why are they that big? Is it more than code? How could you get to gigabytes of code?
General_Effort@lemmy.world 1 year ago
Currently, AI means Artificial Neural Network (ANN). That’s only one specific approach. What an ANN boils down to is one huge system of equations.
The file stores the parameters of these equations. A parameter is simply a number by which something is multiplied, and in math a grid of such numbers is called a matrix. Colloquially, such a file of parameters is called an AI model.
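To make that concrete, here’s a toy sketch (my own illustration, not anything from the plugin) of one neural-network layer as a matrix of parameters acting on an input:

```python
import numpy as np

# A single layer is just y = W @ x + b:
# W and b are the learned parameters, x is the input.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))  # 4x3 matrix: 12 parameters
b = rng.standard_normal(4)       # 4 more parameters
x = np.array([0.5, -1.0, 2.0])   # some input signal

y = W @ x + b  # one "equation" per output value
print(y)
```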
2 GB is probably an AI model with 1 billion parameters at 16-bit precision: 1 billion parameters × 2 bytes each = 2 GB. Precision is how many digits you have; the more digits, the more precisely you can specify a value.
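The arithmetic behind that estimate, as a quick worked sketch:

```python
# Rough file size = parameter count x bytes per parameter.
params = 1_000_000_000      # 1 billion parameters
bytes_per_param = 2         # 16-bit precision = 2 bytes
size_gb = params * bytes_per_param / 1e9
print(f"{size_gb:.0f} GB")  # -> 2 GB
```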
When people talk about training an AI, they mean finding the right parameters, so that the equations compute the right thing. The bigger the model, the smarter it can be.
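As a toy illustration of “finding the right parameters” (my own sketch, nothing like real training code at scale): gradient descent nudging a single parameter w until w * x matches the targets:

```python
import numpy as np

# Toy training loop: find w so that w * x approximates y.
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])  # true relationship: y = 2x

w = 0.0    # start with a wrong parameter
lr = 0.01  # learning rate
for _ in range(500):
    error = w * x - y
    grad = 2 * (error * x).mean()  # gradient of mean squared error
    w -= lr * grad                 # nudge w toward a better value

print(w)  # converges to ~2.0
```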
Does that answer the question? It’s probably missing a lot.
Aatube@kbin.social 1 year ago
It's data
acockworkorange@mander.xyz 1 year ago
It’s really nothing of the sort.
9point6@lemmy.world 1 year ago
The current wave of AI is around Large Language Models, or LLMs. These are basically the result of a metric fuckton of calculations run over a load of input data in different ways. Given the input is often things like text, pictures or audio that have been distilled down into numbers, you can imagine we’re talking a lot of data.
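As a toy example of “distilled down into numbers” (nothing like a production tokenizer, just the idea):

```python
# Toy tokenizer: map words to integer IDs, the way real models
# turn text into numbers before any math happens.
vocab = {"audio": 0, "editing": 1, "with": 2, "ai": 3}
text = "audio editing with ai"
token_ids = [vocab[word] for word in text.split()]
print(token_ids)  # -> [0, 1, 2, 3]
```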
Amir@lemmy.ml 1 year ago
They’re composed of many big matrices, which scale quadratically in size. A 32x32 matrix is 4x the size of a 16x16 matrix.
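You can check that quadratic scaling directly; a quick sketch:

```python
import numpy as np

# Same 16-bit element type, double the side length -> 4x the bytes.
small = np.zeros((16, 16), dtype=np.float16)
big = np.zeros((32, 32), dtype=np.float16)
print(small.nbytes, big.nbytes)    # 512 bytes vs 2048 bytes
print(big.nbytes // small.nbytes)  # -> 4
```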