Apple AI Released a 7B Open-Source Language Model Trained on 2.5T Tokens on Open Datasets.
Submitted 3 months ago by ModerateImprovement@sh.itjust.works to technology@lemmy.world
Comments
JohnDClay@sh.itjust.works 3 months ago
Is this the one for ‘research only’ that is trained on YouTube transcripts, including MKBHD's?
SuckMyWang@lemmy.world 3 months ago
As someone who knows nothing about this stuff, yes.
cheese_greater@lemmy.world 3 months ago
Happy cake day, Wang suck dude
reddwarf@feddit.nl 3 months ago
As someone who knows f-all about AI, I support your endorsement, if only by being impressed that the training took 2.5 tokes, so you know that AI smokes like a banger!
iridium@piefed.social 3 months ago
Hopefully they'll be able to put together something that can run locally, so they can finally stop using this "Open"AI bullshit.
A_A@lemmy.world 3 months ago
They managed a substantial incremental improvement over previous models by first creating a better set of data as their starting point.