| File | Commit | Last commit message | Last modified |
| --- | --- | --- | --- |
| nix | e52aba537a | nix: allow to override rocm gpu targets (#10794) | 1 year ago |
| cloud-v-pipeline | 1c641e6aac | `build`: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 1 year ago |
| cpu.Dockerfile | 7c0e285858 | devops : add docker-multi-stage builds (#10832) | 1 year ago |
| cuda.Dockerfile | 7c0e285858 | devops : add docker-multi-stage builds (#10832) | 1 year ago |
| intel.Dockerfile | 7c0e285858 | devops : add docker-multi-stage builds (#10832) | 1 year ago |
| llama-cli-cann.Dockerfile | 75207b3a88 | docker: use GGML_NATIVE=OFF (#10368) | 1 year ago |
| llama-cpp-cuda.srpm.spec | 0e814dfc42 | devops : remove clblast + LLAMA_CUDA -> GGML_CUDA (#8139) | 1 year ago |
| llama-cpp.srpm.spec | 1c641e6aac | `build`: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 1 year ago |
| musa.Dockerfile | 7c0e285858 | devops : add docker-multi-stage builds (#10832) | 1 year ago |
| rocm.Dockerfile | 7c0e285858 | devops : add docker-multi-stage builds (#10832) | 1 year ago |
| tools.sh | 11e07fd63b | fix: graceful shutdown for Docker images (#10815) | 1 year ago |
| vulkan.Dockerfile | 7c0e285858 | devops : add docker-multi-stage builds (#10832) | 1 year ago |