
Llama link

Set up a llama.cpp server

Manual
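
A minimal sketch of the manual route, assuming a standard CMake build of upstream llama.cpp; the model path is a placeholder for your own GGUF file:

# Fetch and build llama.cpp (CPU build; see the upstream docs for CUDA flags)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build --config Release

# Start the HTTP server that llama-link talks to
./build/bin/llama-server --model ./models/your-model.gguf --port 8080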

NixOS

Options:

  1. Use the llama-cpp package from nixpkgs: https://search.nixos.org/options?channel=unstable&from=0&size=50&sort=relevance&type=packages&query=llama-cpp

  2. Use the flake shipped with the llama.cpp repository at ./llama.cpp/flake.nix, e.g.:

{
  description = "My CUDA-enabled llama.cpp development environment";

  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    flake-parts.url = "github:hercules-ci/flake-parts";
    llama-cpp.url = "github:ggerganov/llama.cpp";
  };

  outputs = { self, nixpkgs, flake-parts, llama-cpp }@inputs:
    flake-parts.lib.mkFlake { inherit inputs; } {
      systems = [ "x86_64-linux" "aarch64-linux" ];

      perSystem = { config, self', inputs', pkgs, system, ... }: {
        devShells.default = pkgs.mkShell {
          buildInputs = [
            # CUDA-enabled llama.cpp build from the upstream flake
            llama-cpp.packages.${system}.cuda

            pkgs.cudatoolkit
            pkgs.gcc
            pkgs.cmake
          ];

          # Point CUDA tooling at the toolkit from nixpkgs
          shellHook = ''
            export CUDA_PATH=${pkgs.cudatoolkit}
            export LD_LIBRARY_PATH=${pkgs.cudatoolkit}/lib:$LD_LIBRARY_PATH
          '';
        };
      };
    };
}
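
With this flake in place, nix develop drops you into a shell where the CUDA build of llama.cpp (including the llama-server binary in recent releases) should be on your PATH. A short session might look like this; the model path is again a placeholder:

# Enter the dev shell defined by the flake above
nix develop

# llama-server comes from llama-cpp.packages.${system}.cuda
llama-server --model ./models/your-model.gguf --host 127.0.0.1 --port 8080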
