While `uv` works amazingly well, I think a lot of people don't realize that installing packages through conda (or, let's say, the conda-forge ecosystem) has technical advantages over wheels/PyPI.
When you install the numpy wheel through `uv` you are likely installing a pre-compiled binary that bundles openblas inside of it. When you install numpy through conda-forge, it dynamically links against a dummy blas package that can be substituted for mkl, openblas, accelerate, whatever you prefer on your system. It's a much better solution to be able to rely on a separate package rather than having to bundle every dependency.
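For instance, conda-forge documents a mechanism for swapping the BLAS implementation just by repinning the `libblas` metapackage to a particular build string; it looks something along these lines:

```
# Swap the BLAS behind numpy/scipy by repinning the libblas metapackage
# (build strings per conda-forge's documentation):
conda install "libblas=*=*mkl"         # Intel MKL
conda install "libblas=*=*accelerate"  # Apple Accelerate on macOS
```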
Then let's say you install scipy. Scipy also has to bundle openblas in its wheel, and now you have two copies of openblas sitting around. They don't conflict, but it quickly becomes an odd thing to have to do.
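You can see the duplication directly on Linux, where auditwheel-repaired wheels vendor their shared libraries under `<package>.libs/` (the output below is illustrative; exact filenames carry version/hash suffixes):

```
find .venv -name 'libopenblas*'
# .../site-packages/numpy.libs/libopenblas...so   <- numpy's copy
# .../site-packages/scipy.libs/libopenblas...so   <- scipy's copy
```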
This is why I personally prefer pixi: its interface is uv-like, but it resolves packages from conda channels the way conda does, and, much as conda environments can mix in pip packages, it supports PyPI packages via uv.
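A minimal sketch of what that workflow looks like (the project name and packages here are just examples):

```
pixi init demo && cd demo
pixi add numpy            # resolved from conda-forge; shares one libblas
pixi add --pypi requests  # resolved from PyPI via uv
```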
Coming from a background in scientific computing, where many of the dependencies I manage are compiled, conda packages give me much more control.
P.S. I'd like to point out to others the difference between package indexes and package managers. PyPI is an index (it hosts packages in a predefined format), while pip, poetry, and uv are package managers that resolve and build your environments using that index.
Similarly, but a bit more confusingly, conda can also be understood as an index: channels hosted by Anaconda (or elsewhere), where a "channel" is kinda like a GitHub organization, and conda-forge is a popular community-built one. conda is also the reference implementation of a package manager that resolves against those channels. Mamba is an independent, performant, drop-in replacement for conda. And pixi is a different one, with a different interface, by the author of mamba.
Even more confusingly, there are distributions. A distribution comes with a set of predefined packages together with the package manager, such that you can just start running things immediately (sort of like a TeX Live distribution in relation to its package manager, tlmgr). There is the Anaconda distribution (if you installed Anaconda instead of just conda, that's what you got), but also Intel's Distribution for Python, Miniforge, Mambaforge, etc.
My guess is that the difference is more that PyPI intends to be a Python package repository, and thus I don't think you can just upload, say, a binary copy of MKL without accompanying Python code. It's originally a source-based repository, with binary wheels being an afterthought. (I still remember the pre-wheel nightmare `pip install numpy` used to be, when it required compiling the C/C++/Fortran pieces, which often failed and was hard to debug…)
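If you want to relive that experience, you can still tell pip to ignore wheels entirely and build from the sdist (assuming you have a full compiler toolchain installed; this is the step that used to fail so often):

```
# Force a from-source build instead of downloading a prebuilt wheel:
pip install --no-binary :all: numpy
```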
But Anaconda and conda-forge are general package repositories: they are not Python-specific and are happy to be used for R, Julia, C/C++/Fortran binaries, etc. It's primarily a binary-based repository. For example, you can `conda install python`, but you can't `pip install python`.
I don't know if there is a technical barrier or just a philosophical one. Clearly, pip handles binary blobs inside Python packages fine, so I would guess the latter, but I'm happy to be corrected :).
Well, uv itself is just a binary wheel, with a source distribution available of course. uv uses basically no Python; it's pure Rust, and it's still distributed through PyPI.
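You can check this yourself by downloading the wheel and listing its contents; the payload is essentially a single executable (commands assume a Linux/macOS shell, and `/tmp/uv-wheel` is just an example directory):

```
pip download uv --no-deps -d /tmp/uv-wheel
unzip -l /tmp/uv-wheel/uv-*.whl   # little more than the `uv` binary plus metadata
```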
Fundamentally, conda is like a Linux distro (or Homebrew): it is a cross-language package manager designed to work with a coherent set of packages (via either the anaconda channel or conda-forge). uv is currently a different installer for PyPI, which means it inherits all of PyPI's positives and negatives. One of the negatives is that the packages are not coherent, so everything needs to be vendored in such a way as not to interfere with other packages. Unless Astral wants to pay packagers to create a parallel ecosystem, uv cannot do the same.
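One illustrative way to see the difference on Linux (exact paths and library names vary by platform and numpy version): the conda-forge build of numpy dynamically links against the environment's shared BLAS, whereas the wheel loads its own vendored copy from `numpy.libs/`.

```
ldd "$CONDA_PREFIX"/lib/python3*/site-packages/numpy/_core/_multiarray_umath.*.so | grep -i blas
# libcblas.so.3 => $CONDA_PREFIX/lib/libcblas.so.3
```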