It’s actually hilarious when you zoom in on their test images and try to see how many more clothes the emperor has with them newfangled AI clothes.
Run it using ollama in a terminal (like ollama run model_name) and ask it a question.
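If you’d rather poke at it from Python instead of the terminal, the official ollama package does roughly the same thing. Minimal sketch; the model tag here is just an example, swap in whatever you actually pulled:

```python
# pip install ollama -- talks to the locally running ollama server
import ollama

response = ollama.chat(
    model="deepseek-r1:14b",  # example tag, use whatever model you pulled
    messages=[{"role": "user", "content": "Explain conservation of momentum in one paragraph."}],
)
print(response["message"]["content"])
```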
Jeeez, just copy ollama’s directory (something like .ollama) from the user’s home dir to wherever you want. You can poke around and find the files inside. I find the published 14b really useful; it’s ten GB that thinks and reasons in English.
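Something like this is all “just copy the directory” means. A rough sketch; the paths are assumptions (default install location on Linux/macOS and an example destination), so check where your install actually keeps the files first:

```python
# Copy the whole .ollama directory (models, manifests, keys) to another disk.
import shutil
from pathlib import Path

src = Path.home() / ".ollama"        # assumed default location
dst = Path("/mnt/bigdisk/ollama")    # example destination, change to taste

shutil.copytree(src, dst, dirs_exist_ok=True)
size_gb = sum(f.stat().st_size for f in dst.rglob("*") if f.is_file()) / 2**30
print(f"copied ~{size_gb:.1f} GB")
```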
Two was horrible; the end boss skeleton is the stupidest shit. I liked the first, endured the second to the end, and never touched the third or Andromeda.
Now do Vučić
I mainly use lemmy.world through Summit on mobile and can’t reproduce that slowness there, nor in the other way I use it, old.lemmy.world in Firefox.
I use the 14b and it’s certainly great for my modest high school physics and Python needs (helping the kids), but for party games and such it’s a drag that its pop culture knowledge stops at mid-2023.