Context: Falcon is a popular family of free LLMs. This is their biggest model yet, and they claim it's now the best open model on the market.
It had better be really good; it needs 400GB!
Submitted 1 year ago by simple@lemm.ee to technology@lemmy.world
https://huggingface.co/blog/falcon-180b
LLM? Lunar Landing Module?
Large Language Model, like ChatGPT.
Double_A@discuss.tchncs.de 1 year ago
Am I the only one that really hates huggingface? It's such a confusing website to use, and most of the time things just spit out errors. And I'm never sure whether something is free, a demo, or something I have to pay for.
Like what is the proper way to use that thing?!
0x0001@sh.itjust.works 1 year ago
Huggingface takes a bit of getting used to, but it's the place to find models and datasets. Imo it may be one of the most important websites on the internet today.
Double_A@discuss.tchncs.de 1 year ago
But what exactly is a model? Can I download and run it? Do I need to access it through their API? Do I need to pay for some server that has all the needed software already running on it? It seems open and not open at the same time.
GenderNeutralBro@lemmy.sdf.org 1 year ago
It’s made for researchers and engineers. Nothing is packaged in a form to simply download and run on a stock PC. It assumes a high level of comfort configuring Python environments, GPU drivers, and GPU compute backends like CUDA.
If you don’t know what all of that means, you would be better off looking upstream to projects like GPT4All that package some of this stuff into a simple installer that anyone can run.
As for Falcon in particular, you will not be able to run this on any consumer hardware. It requires at least 160GB of memory, and that’s GPU memory ideally. The largest consumer GPU on the market only has 24GB.
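The memory figures above can be sanity-checked with some back-of-envelope arithmetic: the dominant cost is simply parameter count times bytes per parameter. A minimal sketch (the 180B parameter count is from the model name; the per-parameter byte sizes are the standard ones for fp16 and 4-bit quantization, and real inference needs extra headroom for activations and the KV cache on top of this):

```python
def weight_memory_gib(n_params: float, bytes_per_param: float) -> float:
    """Rough GiB needed just to hold the model weights in memory."""
    return n_params * bytes_per_param / 1024**3

FALCON_PARAMS = 180e9  # Falcon-180B

fp16_gib = weight_memory_gib(FALCON_PARAMS, 2.0)   # 16-bit floats: 2 bytes each
int4_gib = weight_memory_gib(FALCON_PARAMS, 0.5)   # 4-bit quantized: half a byte each

print(f"fp16 weights: ~{fp16_gib:.0f} GiB")  # roughly 335 GiB
print(f"4-bit weights: ~{int4_gib:.0f} GiB")  # roughly 84 GiB
```

This lines up with the numbers in the thread: full-precision weights alone are in the 300-400GB range, and even aggressive quantization leaves you far beyond the 24GB of the largest consumer GPU.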