| File | Last commit | Commit message | Last change |
|------|-------------|----------------|-------------|
| nix | 68ff663a04 | repo : update links to new url (#11886) | 11 months ago |
| cloud-v-pipeline | 1c641e6aac | `build`: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 1 year ago |
| cpu.Dockerfile | 6e264a905b | docker : add GGML_CPU_ARM_ARCH arg to select ARM architecture to build for (#11419) | 11 months ago |
| cuda.Dockerfile | dbc2ec59b5 | docker : drop to CUDA 12.4 (#11869) | 11 months ago |
| intel.Dockerfile | 7c0e285858 | devops : add docker-multi-stage builds (#10832) | 1 year ago |
| llama-cli-cann.Dockerfile | 75207b3a88 | docker: use GGML_NATIVE=OFF (#10368) | 1 year ago |
| llama-cpp-cuda.srpm.spec | 68ff663a04 | repo : update links to new url (#11886) | 11 months ago |
| llama-cpp.srpm.spec | 68ff663a04 | repo : update links to new url (#11886) | 11 months ago |
| musa.Dockerfile | bd6e55bfd3 | musa: bump MUSA SDK version to rc3.1.1 (#11822) | 11 months ago |
| rocm.Dockerfile | 68ff663a04 | repo : update links to new url (#11886) | 11 months ago |
| tools.sh | f643120bad | docker: add perplexity and bench commands to full image (#11438) | 11 months ago |
| vulkan.Dockerfile | d0c08040b6 | ci : fix build CPU arm64 (#11472) | 11 months ago |