MrLLM
Not OP, but I was curious about it, so I watched a video tutorial (in Spanish).
Basically, you prepare a mixture of evaporated milk, condensed milk and regular milk with some ice and vanilla extract, and in another bowl just orange juice with ice.
The secret is that both liquids need to be cold so they mix properly. She didn’t mention what temperature was necessary, just that 20 minutes in the cold did the trick.
- Windows update on boot
I think Satan should save this one for Hell.
And if you don’t wash your hands after using the toilet, your hands will be covered in Doritos dust for a day no matter what.
😳🏳️‍🌈?
And who decided that? /s
Could you name 12 types of milk?
I don’t mean to say that you’re completely wrong in your reasoning, but grammatically speaking, we use have + past participle, which we call the present perfect, no matter which verb is used.
In this case, you’re talking about something you’ve experienced, so the correct form would be “I’ve run” (since the past participle of run is run).
If you’d like to take a detailed look at it, here are two references: Present Perfect - British Council and Using "have ran" or "have run".
Btw, it's completely normal to make mistakes! We're all human, and part of being human is learning and growing from our errors.
Wanna join?
The actually useful shit LLMs can do
Which is?
Waste energy and pollute the environment? I can relate… not useful, tho
We’ve been mishearing him; it’s fee speech absolutist.
Name...absolutely does not check out.
Uhh, oh, fair enough (゚∀゚)
On that note, have you managed to try DeepSeek, or even get it set up locally?
Yeah, I’ve successfully run the cut-down version of deepseek-r1 through Ollama. The model itself is the 7B one (I’m VRAM-limited to 8GB). I ran it on an M1 Mac Mini; performance-wise it’s fast, and the quality of the generated content is okay.
Depending on your hardware and OS, you may or may not be able to run an LLM locally at a reasonable speed. You might want to check the GPU support for Ollama. You don’t need a GPU, since it can also run on the CPU, but that’ll certainly be slower.
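If it helps anyone reproduce this, here’s a minimal Python sketch for querying that local setup, assuming Ollama is already running on its default port (11434) and you’ve pulled the model with `ollama pull deepseek-r1:7b`. It only uses the standard library and Ollama’s documented REST endpoint; the prompt is just a placeholder.

```python
# Minimal sketch: query a locally running Ollama server over its REST API.
# Assumes `ollama pull deepseek-r1:7b` has been done and the server is
# listening on the default port 11434.
import json
import urllib.request

payload = {
    "model": "deepseek-r1:7b",   # the cut-down 7B model mentioned above
    "prompt": "Explain the present perfect tense in one sentence.",
    "stream": False,             # return one JSON object instead of a stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["response"])  # the generated text
```

The same request works whether Ollama ends up on the GPU or falls back to the CPU; you’ll just notice the difference in how long the call takes.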