Comment on Local AI is one step closer through Mistral-NeMo 12B
mememuseum@lemmy.world 5 months ago
The laptop 3080 Tis have 16 gigs? The desktop ones only got 12.
j4k3@lemmy.world 5 months ago
Yeah, it is different for laptops. The 3080 market nomenclature is all over the place, so look up the exact model number to confirm the manufacturer’s specs; a Linux hardware probe should list the VRAM too. It has been a year since I did all of my research, but IIRC the mobile version only came as a “3080” with 8 GB and a “3080 Ti” with 16 GB. I have the 16 GB one and can confirm it is a thing. That is the largest GPU in a laptop for the last-generation RTX 3xxx stuff. The largest AMD sold at the same time was a 12 GB 6850 IIRC, but the AMD 6000 series didn’t (at least at the time) have the same HIP/ROCm support as the newer AMD 7000 series (not sure if that has changed).
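If you want to confirm what a given laptop actually reports rather than trusting the marketing name, here is a minimal sketch, assuming an NVIDIA GPU with the proprietary driver installed so nvidia-smi is on the PATH; the printed values are only examples, not guaranteed specs.

```python
#!/usr/bin/env python3
# Sketch: query the GPU name and total VRAM that the driver actually reports.
# Assumes nvidia-smi is available (it ships with the proprietary NVIDIA driver).
import subprocess


def gpu_vram():
    """Return (gpu_name, total_vram_mib) pairs as reported by nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    gpus = []
    for line in out.strip().splitlines():
        name, mem = [field.strip() for field in line.split(",")]
        gpus.append((name, int(mem)))
    return gpus


if __name__ == "__main__":
    for name, mem in gpu_vram():
        # A 16 GB mobile 3080 Ti should report roughly 16384 MiB here.
        print(f"{name}: {mem} MiB VRAM")
```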
The following is an old pic from a year ago. The left side is tiled into three terminals: the top is the running model inference, the middle is htop, and the bottom is my GPU monitoring script, which also shows the total memory available. This was also a 70B Llama 2 model (GGML, 4-bit quantization).
[Image: screenshot of the tiled terminal layout described above]
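For anyone who wants something similar in the bottom pane, a minimal sketch of a GPU monitoring loop is below. It is not the actual script from the screenshot; it assumes an NVIDIA card and nvidia-smi on the PATH, and simply polls memory use, load, and temperature every couple of seconds.

```python
#!/usr/bin/env python3
# Sketch: poll nvidia-smi in a loop and print VRAM usage, GPU load, and temperature.
# Not the author's original script; assumes an NVIDIA GPU and nvidia-smi on the PATH.
import subprocess
import time

QUERY = "memory.used,memory.total,utilization.gpu,temperature.gpu"


def sample(gpu_index=0):
    """Poll nvidia-smi once; return (used_mib, total_mib, util_pct, temp_c) for one GPU."""
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    line = out.strip().splitlines()[gpu_index]
    used, total, util, temp = [int(v.strip()) for v in line.split(",")]
    return used, total, util, temp


if __name__ == "__main__":
    while True:
        used, total, util, temp = sample()
        print(f"VRAM {used}/{total} MiB | load {util}% | {temp} C", flush=True)
        time.sleep(2)  # refresh interval; Ctrl-C to stop
```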