| Name | Last commit | Last commit message | Age |
|---|---|---|---|
| baby-llama | 1d656d6360 | ggml : change ggml_graph_compute() API to not require context (#1999) | 2 years ago |
| benchmark | 1d656d6360 | ggml : change ggml_graph_compute() API to not require context (#1999) | 2 years ago |
| embd-input | 5656d10599 | mpi : add support for distributed inference via MPI (#2099) | 2 years ago |
| embedding | 5656d10599 | mpi : add support for distributed inference via MPI (#2099) | 2 years ago |
| jeopardy | 5ddf7ea1fb | hooks : setting up flake8 and pre-commit hooks (#1681) | 2 years ago |
| main | 5656d10599 | mpi : add support for distributed inference via MPI (#2099) | 2 years ago |
| metal | 1d656d6360 | ggml : change ggml_graph_compute() API to not require context (#1999) | 2 years ago |
| perplexity | 5656d10599 | mpi : add support for distributed inference via MPI (#2099) | 2 years ago |
| quantize | 5656d10599 | mpi : add support for distributed inference via MPI (#2099) | 2 years ago |
| quantize-stats | 1b107b8550 | ggml : generalize `quantize_fns` for simpler FP16 handling (#1237) | 2 years ago |
| save-load-state | 527b6fba1d | llama : make model stateless and context stateful (llama_state) (#1797) | 2 years ago |
| server | 5656d10599 | mpi : add support for distributed inference via MPI (#2099) | 2 years ago |
| simple | 5656d10599 | mpi : add support for distributed inference via MPI (#2099) | 2 years ago |
| train-text-from-scratch | 1d656d6360 | ggml : change ggml_graph_compute() API to not require context (#1999) | 2 years ago |
| CMakeLists.txt | cfa0750bc9 | llama : support input embeddings directly (#1910) | 2 years ago |
| Miku.sh | a8a2efdc81 | examples : various prompt and example fixes (#1298) | 2 years ago |
| alpaca.sh | a17a2683d8 | alpaca.sh : update model file name (#2074) | 2 years ago |
| chat-13B.bat | d9ad104440 | Create chat-13B.bat (#592) | 2 years ago |
| chat-13B.sh | 6daa09d879 | examples : read chat prompts from a template file (#1196) | 2 years ago |
| chat-persistent.sh | 1359b6aba5 | chat-persistent.sh : use bracket expressions in grep (#1564) | 2 years ago |
| chat-vicuna.sh | c36e81da62 | examples : add chat-vicuna.sh (#1854) | 2 years ago |
| chat.sh | 79b2b266db | If n_predict == -1, generate forever | 2 years ago |
| common.cpp | db4047ad5c | main : escape prompt prefix/suffix (#2151) | 2 years ago |
| common.h | d7d2e6a0f0 | server : add option to output probabilities for completion (#1962) | 2 years ago |
| gpt4all.sh | 107980d970 | examples : add -n to alpaca and gpt4all scripts (#706) | 2 years ago |
| reason-act.sh | a6956b25a1 | add example of re-act pattern (#583) | 2 years ago |