| Name | Commit | Message | Age |
|---|---|---|---|
| nix | 68ff663a04 | repo : update links to new url (#11886) | 11 months ago |
| cloud-v-pipeline | 1c641e6aac | `build`: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 1 year ago |
| cpu.Dockerfile | bd3f59f812 | cmake : enable curl by default (#12761) | 9 months ago |
| cuda.Dockerfile | b0091ecc1e | docker : added all CPU to GPU images (#12749) | 9 months ago |
| intel.Dockerfile | b0091ecc1e | docker : added all CPU to GPU images (#12749) | 9 months ago |
| llama-cli-cann.Dockerfile | 6e1c4cebdb | CANN: Support Opt CONV_TRANSPOSE_1D and ELU (#12786) | 9 months ago |
| llama-cpp-cuda.srpm.spec | 68ff663a04 | repo : update links to new url (#11886) | 11 months ago |
| llama-cpp.srpm.spec | 68ff663a04 | repo : update links to new url (#11886) | 11 months ago |
| musa.Dockerfile | b0091ecc1e | docker : added all CPU to GPU images (#12749) | 9 months ago |
| rocm.Dockerfile | b0091ecc1e | docker : added all CPU to GPU images (#12749) | 9 months ago |
| tools.sh | f643120bad | docker: add perplexity and bench commands to full image (#11438) | 11 months ago |
| vulkan.Dockerfile | b0091ecc1e | docker : added all CPU to GPU images (#12749) | 9 months ago |