| Name | Last commit | Commit message | Last updated |
|------|-------------|----------------|--------------|
| baby-llama | 1d656d6360 | ggml : change ggml_graph_compute() API to not require context (#1999) | 2 years ago |
| benchmark | 1d656d6360 | ggml : change ggml_graph_compute() API to not require context (#1999) | 2 years ago |
| embd-input | 5656d10599 | mpi : add support for distributed inference via MPI (#2099) | 2 years ago |
| embedding | 5656d10599 | mpi : add support for distributed inference via MPI (#2099) | 2 years ago |
| jeopardy | 5ddf7ea1fb | hooks : setting up flake8 and pre-commit hooks (#1681) | 2 years ago |
| main | 32c5411631 | Revert "Support using mmap when applying LoRA (#2095)" (#2206) | 2 years ago |
| metal | 1d656d6360 | ggml : change ggml_graph_compute() API to not require context (#1999) | 2 years ago |
| perplexity | 5656d10599 | mpi : add support for distributed inference via MPI (#2099) | 2 years ago |
| quantize | 5656d10599 | mpi : add support for distributed inference via MPI (#2099) | 2 years ago |
| quantize-stats | 1b107b8550 | ggml : generalize `quantize_fns` for simpler FP16 handling (#1237) | 2 years ago |
| save-load-state | 527b6fba1d | llama : make model stateless and context stateful (llama_state) (#1797) | 2 years ago |
| server | 32c5411631 | Revert "Support using mmap when applying LoRA (#2095)" (#2206) | 2 years ago |
| simple | 5656d10599 | mpi : add support for distributed inference via MPI (#2099) | 2 years ago |
| train-text-from-scratch | 5bf2a27718 | ggml : remove src0 and src1 from ggml_tensor and rename opt to src (#2178) | 2 years ago |
| CMakeLists.txt | cfa0750bc9 | llama : support input embeddings directly (#1910) | 2 years ago |
| Miku.sh | a8a2efdc81 | examples : various prompt and example fixes (#1298) | 2 years ago |
| alpaca.sh | a17a2683d8 | alpaca.sh : update model file name (#2074) | 2 years ago |
| chat-13B.bat | d9ad104440 | Create chat-13B.bat (#592) | 2 years ago |
| chat-13B.sh | 6daa09d879 | examples : read chat prompts from a template file (#1196) | 2 years ago |
| chat-persistent.sh | 1359b6aba5 | chat-persistent.sh : use bracket expressions in grep (#1564) | 2 years ago |
| chat-vicuna.sh | c36e81da62 | examples : add chat-vicuna.sh (#1854) | 2 years ago |
| chat.sh | 79b2b266db | If n_predict == -1, generate forever | 2 years ago |
| common.cpp | 32c5411631 | Revert "Support using mmap when applying LoRA (#2095)" (#2206) | 2 years ago |
| common.h | c9c74b4e3f | llama : add classifier-free guidance (#2135) | 2 years ago |
| gpt4all.sh | 107980d970 | examples : add -n to alpaca and gpt4all scripts (#706) | 2 years ago |
| reason-act.sh | a6956b25a1 | add example of re-act pattern (#583) | 2 years ago |