Hacker News: trissi1996's comments

Why wouldn't they? It's FOSS after all...


You're just wrong.

Vibes are not a meaningful comparison.

You can't compare two different quantizations (Q4_K_M vs. Q4_K_XL), sample at a temperature of 0.7, and call that a meaningful comparison.

Any difference you notice could be due to sampling randomness at that temperature or to the quantization itself, even if the original weights are exactly the same.

Also, these models were likely sourced differently, so the quantizers could have used different importance matrices or processes.

The weights are the same, and the weights are all that matter.
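To illustrate the temperature point with a toy sketch (plain Python, not llama.cpp itself): at temperature 0, sampling reduces to a deterministic argmax, while at 0.7 repeated runs can pick different tokens, so output differences at that setting can't be pinned on the quantization alone.

```python
import math
import random

def sample(logits, temperature, rng):
    """Pick a token index: greedy argmax at temp 0, softmax sampling otherwise."""
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    weights = [math.exp(s - m) for s in scaled]
    return rng.choices(range(len(logits)), weights=weights)[0]

logits = [2.0, 1.8, 0.5]  # made-up logits for three candidate tokens
greedy = {sample(logits, 0, random.Random(s)) for s in range(100)}
sampled = {sample(logits, 0.7, random.Random(s)) for s in range(100)}
print(greedy)            # {0}: temp 0 always picks the same token
print(len(sampled) > 1)  # True: temp 0.7 runs diverge across seeds
```

So for an apples-to-apples comparison of two quantizations you'd at minimum want greedy decoding (temp 0) on identical prompts.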


Wtf are "EU DNS defenses"?

Never heard of them.


What? Why? It doesn't seem clear to me.


There’s more, but it’s clear to me from the first word: “Look — …“. No one starts a comment like that, except ChatGPT, which does it all the time.

I frequent subreddits where AI use is super common and the vibe is always the same.


I assure you that my comment was not AI generated, but thank you for the future shock of the accusation.


Not really; llama.cpp has been able to download models for quite some time. Not as elegant as ollama, but:

    llama-server --model-url "https://huggingface.co/bartowski/DeepSeek-R1-Distill-Qwen-32B-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-IQ4_XS.gguf"
will get you up and running in a single command.


And now you need a server per model? Ollama loads models on-demand, and terminates them after idle, all accessible over the same HTTP API.


Which is not the same model: it's not R1, it's R1-Distill-Qwen-1.5B...


A distinction they make clear and write extensively about on the model page, yes?


Where's that made clear in "ollama run deepseek-r1", the command to download/run the model?


Which you have to go to the model page to find.


Why would they want to sell?


And what are they going to sell? The weights and the model architecture are already open source, and I doubt DeepSeek's datasets are better than OpenAI's.


Plus, if the US were to ban DeepSeek (the company), wouldn't non-Chinese companies be able to pick up the models and run them at relatively low expense?


I just set up systemd-timers to nix-gc/docker-prune daily.

Still a bit of an annoyance, but one I don't notice once it's set up.
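A minimal sketch of that setup for the Docker side, assuming a docker-prune.service/docker-prune.timer pair (the unit names, prune flags, and daily schedule are my own choices, not a standard):

```ini
# /etc/systemd/system/docker-prune.service
[Unit]
Description=Prune unused Docker data

[Service]
Type=oneshot
ExecStart=/usr/bin/docker system prune -af

# /etc/systemd/system/docker-prune.timer
[Unit]
Description=Run docker-prune daily

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
```

Then `systemctl enable --now docker-prune.timer`; `Persistent=true` catches up on runs missed while the machine was off.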


Great idea. I don't know why I didn't think of that.


I have my seedbox behind hideme VPN. They don't seem that good privacy-wise, but for torrenting that's not the main concern, IMO.

Port forwarding works and you can get a raw wireguard config to dockerize it.
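The dockerized setup can look roughly like this (a sketch: image names, the mount path, and the client choice are assumptions, and newer linuxserver/wireguard releases expect the config under a different path, so check the image docs):

```yaml
# docker-compose.yml: drop the provider's raw wg0.conf into ./wg/
services:
  wireguard:
    image: linuxserver/wireguard   # any container that loads a raw wg0.conf works
    cap_add:
      - NET_ADMIN
    volumes:
      - ./wg:/config
  torrent:
    image: linuxserver/qbittorrent # hypothetical client choice
    network_mode: "service:wireguard"  # route all client traffic through the VPN container
```

The key trick is `network_mode: "service:wireguard"`, which puts the torrent client in the VPN container's network namespace, so if the tunnel drops the client has no route out.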

For general use, e.g. on open hotspots, I still use Mullvad/Mozilla VPN, as I trust them more. (And I can pay cash for Mullvad.)


They dropped two nuclear bombs, mostly on innocent civilians; seems like pretty good revenge to me.


And even before that, burned more than 500 square miles of Japanese urban areas to ash.

Granted, by the last weeks of the war, the USAAF was dropping leaflets warning they'd be back with incendiaries in a few days, a masterstroke in psychological warfare.


Hot take, but Japanese civilians (all civilians, not just men) were expected to fight American soldiers if they invaded Japan. We'll never know what would have happened if America had invaded, but Japan certainly was not making civilian evacuation plans. So maybe innocent-ish civilians?

https://en.wikipedia.org/wiki/File:Kokumin_Giyutai.jpg

