If you want to make kefir at home you need raw milk. Obviously the US is heading into Soviet times, so they'll need it. That, and potatoes in every home garden.
I don't think they make some topics taboo, but I'm sure the model has a bias toward what people say on the internet. Which might not be correct according to people who challenge some views on historical facts.
Of course Chinese censorship is super obvious and there by design. The American kind is more a side effect of certain cultural facts or beliefs.
What I wanted to say is that all models are shit when it comes to fact-checking or seeking truth. They're good at generating words that look like truth and in most cases represent the overall consensus in that cultural area.
I asked the smallest DeepSeek model about the Tiananmen events, and at first it refused to talk about them (while thinking out loud that it shouldn't give me any details because it's political). Later, when I tried to get it to compare those events to the Solidarity era, when the former Polish government used violence against the people, it started talking about how sometimes a government has to use violence when the leadership thinks it's required to bring peace or order.
Fair enough, Mister Model made by an autocratic country!
However: compared to GPT and some others I tried, it did correctly count the Rs in the word "tomato". Which is zero. All the others told me it has two Rs.
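For reference, this check is trivially deterministic outside an LLM; a minimal Python sketch (the `count_letter` helper name is mine, not from any library):

```python
# Count occurrences of a letter in a word, case-insensitively.
# LLMs that see words as sub-word tokens often get this wrong;
# a plain string count never does.
def count_letter(word: str, letter: str) -> int:
    return word.lower().count(letter.lower())

print(count_letter("tomato", "r"))      # 0
print(count_letter("strawberry", "r"))  # 3
```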
If you use the model, it literally tells you when it won't say something to the user. Same as the guardrails on any other LLM on the market; just different topics are censored.
I don't have an opinion on this because I'm from Europe and I stopped understanding American politics, but I just wanted to point out that the 80s were 40 fucking years ago. 40.
OK, then this thing. My father used to make it. I'm not a fan.
https://en.wikipedia.org/wiki/Soured_milk?wprov=sfti1