rs137

joined 1 year ago
[–] [email protected] 1 points 9 months ago

Llama 2 70B with 8-bit quantization takes around 80 GB of VRAM, if I remember correctly. I tested it a while ago.
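
That figure lines up with a back-of-the-envelope estimate: 70 billion parameters at 8 bits per weight is roughly 70 GB of weights, and the remainder is KV cache and runtime buffers. A minimal sketch; the 15% overhead factor is an assumption for illustration, not a measured number:

```python
# Rough VRAM estimate for a quantized LLM (illustrative, not a benchmark).
def estimate_vram_gb(n_params_billion: float,
                     bits_per_weight: int,
                     overhead_factor: float = 1.15) -> float:
    """Weights only, times a rough multiplier for KV cache and buffers."""
    weight_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead_factor / 1e9  # decimal GB

# Llama 2 70B at 8-bit: ~70 GB of weights, ~80 GB with overhead.
print(f"{estimate_vram_gb(70, 8):.0f} GB")
```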

[–] [email protected] 2 points 9 months ago

I think it was around 7.5k EUR. But because I bought and sold Apple shares with this in mind, I effectively paid around 500 EUR for that thing.

[–] [email protected] 2 points 9 months ago (3 children)

And what exactly is wrong with that?

My MacBook Pro with 128 GB of memory is so eager to fill it up that it gives applications huge amounts of memory. The applications only took around 30 GB, so the Mac also cached the file system in memory, which takes around 80 GB (see the sketch below). The whole system is super fast because it doesn’t have to read files from a slow SSD, but from fast memory.

It’s just a matter of how you look at it: empty memory is wasted memory.
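
You can check that breakdown yourself. A minimal sketch, assuming the third-party psutil package is installed (I only use its cross-platform fields here; on macOS it additionally reports an active/inactive/wired split):

```python
# Prints how much RAM is truly idle vs. merely "available": reclaimable
# caches count as available, which is why a busy-looking Mac still has
# plenty of headroom for applications.
import psutil

vm = psutil.virtual_memory()
gib = 1024 ** 3
print(f"total:     {vm.total / gib:6.1f} GiB")
print(f"available: {vm.available / gib:6.1f} GiB")  # what apps could still claim
print(f"used:      {vm.used / gib:6.1f} GiB")
print(f"free:      {vm.free / gib:6.1f} GiB")        # completely idle pages
```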

[–] [email protected] 4 points 10 months ago (1 children)

Get fucked for what? That I’m from the EU? That I send donations to the country at war? That I have a gf?

[–] [email protected] 11 points 10 months ago (3 children)

Not everyone! I’m one of those. From time to time some American reminds me, but my brain filters it out as completely useless information and I forget it.