Georgi Gerganov e1675d133c llama : avoid fprintf in favor of LLAMA_LOG (#3538) 2 years ago
baby-llama bc39553c90 build : enable more non-default compiler warnings (#3200) 2 years ago
batched 8c70a5ff25 batched : add bench tool (#3545) 2 years ago
batched-bench 8c70a5ff25 batched : add bench tool (#3545) 2 years ago
batched.swift 1a159553f9 tokenizer : special token handling (#3538) 2 years ago
beam-search 16bc66d947 llama.cpp : split llama_context_params into model and context params (#3301) 2 years ago
benchmark 65c2c1c5ab benchmark-matmult : do not use integer abs() on a float (#3277) 2 years ago
convert-llama2c-to-ggml 3aefaab9e5 check C++ code with -Wmissing-declarations (#3184) 2 years ago
embd-input 70c29da118 common : fix mirostat state when using multiple sequences (#3543) 2 years ago
embedding 16bc66d947 llama.cpp : split llama_context_params into model and context params (#3301) 2 years ago
export-lora 0e76a8992c train : finetune LORA (#2632) 2 years ago
finetune 424b6381c4 ggml : add context enumeration functions (#3605) 2 years ago
gguf 3aefaab9e5 check C++ code with -Wmissing-declarations (#3184) 2 years ago
gptneox-wip 20c7e1e804 gguf : fix a few general keys (#3341) 2 years ago
infill 70c29da118 common : fix mirostat state when using multiple sequences (#3543) 2 years ago
jeopardy a8777ad84e parallel : add option to load external prompt file (#3416) 2 years ago
llama-bench bc39553c90 build : enable more non-default compiler warnings (#3200) 2 years ago
llava 940efa95fe llava : fix tokenization to not add bos between image embeddings and user prompt (#3645) 2 years ago
main e1675d133c llama : avoid fprintf in favor of LLAMA_LOG (#3538) 2 years ago
main-cmake-pkg 095231dfd3 cmake : fix transient definitions in find pkg (#3411) 2 years ago
metal 6381d4e110 gguf : new file format with flexible meta data (beta) (#2398) 2 years ago
parallel 70c29da118 common : fix mirostat state when using multiple sequences (#3543) 2 years ago
perplexity 16bc66d947 llama.cpp : split llama_context_params into model and context params (#3301) 2 years ago
quantize bc39553c90 build : enable more non-default compiler warnings (#3200) 2 years ago
quantize-stats 16bc66d947 llama.cpp : split llama_context_params into model and context params (#3301) 2 years ago
save-load-state 1142013da4 save-load-state : fix example + add ci test (#3655) 2 years ago
server e74c705e15 editorconfig : remove trailing spaces 2 years ago
simple 16bc66d947 llama.cpp : split llama_context_params into model and context params (#3301) 2 years ago
speculative 70c29da118 common : fix mirostat state when using multiple sequences (#3543) 2 years ago
train-text-from-scratch a5e8c1d8c7 train-text-from-scratch : fix assert failure in ggml-alloc (#3618) 2 years ago
CMakeLists.txt 370359e5ba examples: support LLaVA v1.5 (multimodal model) (#3436) 2 years ago
Miku.sh 019fe257bb MIKU MAYHEM: Upgrading the Default Model for Maximum Fun 🎉 (#2287) 2 years ago
alpaca.sh a17a2683d8 alpaca.sh : update model file name (#2074) 2 years ago
chat-13B.bat d9ad104440 Create chat-13B.bat (#592) 2 years ago
chat-13B.sh 6daa09d879 examples : read chat prompts from a template file (#1196) 2 years ago
chat-persistent.sh ac2219fef3 llama : fix session saving/loading (#3400) 2 years ago
chat-vicuna.sh c36e81da62 examples : add chat-vicuna.sh (#1854) 2 years ago
chat.sh 8341a25957 main : log file (#2748) 2 years ago
gpt4all.sh 107980d970 examples : add -n to alpaca and gpt4all scripts (#706) 2 years ago
json-schema-to-grammar.py 7c2227a197 chmod : make scripts executable (#2675) 2 years ago
llama.vim 2d7baaf50f vim : streaming and more (#2495) 2 years ago
llama2-13b.sh 73643f5fb1 gitignore : changes for Poetry users + chat examples (#2284) 2 years ago
llama2.sh 73643f5fb1 gitignore : changes for Poetry users + chat examples (#2284) 2 years ago
llm.vim ad9ddcff6e llm.vim : stop generation at multiple linebreaks, bind to <F2> (#2879) 2 years ago
make-ggml.py ac43576124 make-ggml.py : compatibility with more models and GGUF (#3290) 2 years ago
reason-act.sh 7c2227a197 chmod : make scripts executable (#2675) 2 years ago
server-llama2-13B.sh 7c2227a197 chmod : make scripts executable (#2675) 2 years ago