Issue description
Ollama vendors a copy of `llama.cpp` as a git submodule: https://github.com/ollama/ollama/tree/main/llm. The vendored `llama-cpp` is built as part of the `buildGoModule` call, apparently during `go generate`. This significantly increases build times when iterating on the ollama derivation, because the submodule must be rebuilt every time the surrounding ollama package changes. It also makes it much harder to configure (patch, tweak) the llama.cpp used by ollama. Overall, I think we want less vendoring in Nixpkgs, and fewer "build inside a build" situations.

It is desirable to refactor the ollama expression so as to reuse the prebuilt `llama-cpp` from Nixpkgs.

EDIT: A detailed overview of the current situation is provided by @abysssol at #323249 (comment)
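One possible shape for such a refactor (a rough sketch only, not a working derivation: the real ollama build drives llama.cpp through `go generate`, so this would need upstream-specific patches, and everything below except `buildGoModule` and `llama-cpp` is an illustrative assumption):

```nix
# Hypothetical sketch: take Nixpkgs' prebuilt llama-cpp as an input instead of
# rebuilding the vendored submodule inside the ollama build.
{ buildGoModule, fetchFromGitHub, llama-cpp }:

buildGoModule {
  pname = "ollama";
  version = "...";  # elided

  src = fetchFromGitHub {
    owner = "ollama";
    repo = "ollama";
    rev = "...";   # elided
    hash = "...";  # elided
    # Note: no fetchSubmodules = true — the vendored llm/llama.cpp
    # checkout would no longer be needed.
  };

  vendorHash = "...";  # elided

  # Link against the prebuilt library rather than the submodule build.
  buildInputs = [ llama-cpp ];

  # A patch (not shown) would replace the `go generate` step that compiles
  # the submodule with references to the prebuilt ${llama-cpp} outputs.
}
```

This would let users swap in a customized llama-cpp via an ordinary override (e.g. `ollama.override { llama-cpp = myLlamaCpp; }`) instead of patching inside ollama's build.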
CC @abysssol @dit7ya @elohmeier @RoyDubnium
Historical context
This is a follow-up to #323249 and https://github.com/NixOS/nixpkgs/pull/323056/files#diff-99616cb74ed708e718f689a9fc8f62999fdff6164590233bbb1a81a0f8f8c08b, aiming to make Nixpkgs' ollama nicer and easier to maintain