cturan / llama.cpp (mirror of https://github.com/cturan/llama.cpp)
Tree: c56b715269
Branches / Tags: k2v2, master, minimax, qwen3_next, qwen3_next_optimized, toolinjection, test, b6814
llama.cpp / prompts
Latest commit: f4d277ae17 "main : alternative instruct mode (Vicuna support, etc.) (#863)" by Tomáš Pazdiora, 2 years ago
dan.txt         ab77d76312  Add longer DAN prompt for testing big batch numbers           2 years ago
reason-act.txt  82d146df9b  do not force the prompt file to end with a new line (#908)    2 years ago