| Name | Commit | Last commit message | Last updated |
|---|---|---|---|
| cmake | f3f65429c4 | llama : reorganize source code + improve CMake (#8006) | 1 year ago |
| minja | 7a84777f42 | sync: minja (#12739) | 9 months ago |
| CMakeLists.txt | d7a14c42a1 | build : fix build info on windows (#13239) | 8 months ago |
| arg.cpp | 4773d7a02f | examples : remove infill (#13283) | 8 months ago |
| arg.h | 2d451c8059 | common : add common_remote_get_content (#13123) | 8 months ago |
| base64.hpp | 381efbf480 | llava : expose as a shared library for downstream projects (#3613) | 2 years ago |
| build-info.cpp.in | b12fa0d1c1 | build : link against build info instead of compiling against it (#3879) | 2 years ago |
| chat.cpp | b6930ebc42 | `tool-call`: fix non-tool-calling grammar crashes w/ Qwen / Hermes 2 templates (#12900) | 9 months ago |
| chat.h | 4e39a3c332 | `server`: extract `<think>` tags from qwq outputs (#12297) | 10 months ago |
| common.cpp | bc091a4dc5 | common : Define cache directory on AIX (#12915) | 9 months ago |
| common.h | 4773d7a02f | examples : remove infill (#13283) | 8 months ago |
| console.cpp | 8277a817f1 | console : utf-8 fix for windows stdin (#9690) | 1 year ago |
| console.h | 6381d4e110 | gguf : new file format with flexible meta data (beta) (#2398) | 2 years ago |
| json-schema-to-grammar.cpp | d5fe4e81bd | grammar : handle maxItems == 0 in JSON schema (#13117) | 8 months ago |
| json-schema-to-grammar.h | 669912d9a5 | `tool-call`: fix Qwen 2.5 Coder support, add micro benchmarks, support trigger patterns for lazy grammars (#12034) | 10 months ago |
| json.hpp | 5b7b0ac8df | json-schema-to-grammar improvements (+ added to server) (#5978) | 1 year ago |
| llguidance.cpp | 2447ad8a98 | upgrade to llguidance 0.7.10 (#12576) | 9 months ago |
| log.cpp | bfd11a2344 | Fix: Compile failure due to Microsoft STL breaking change (#11836) | 11 months ago |
| log.h | fef0cbeadf | cleanup: fix compile warnings associated with gnu_printf (#11811) | 11 months ago |
| ngram-cache.cpp | 5bbe6a9fe9 | ggml : portability fixes for VS 2017 (#12150) | 10 months ago |
| ngram-cache.h | 727368c60f | llama : use LLAMA_TOKEN_NULL (#11062) | 1 year ago |
| sampling.cpp | 233461f812 | sampling : Integrate Top-nσ into main sampling chain (and add it to the server) (#13264) | 8 months ago |
| sampling.h | ff227703d6 | sampling : support for llguidance grammars (#10224) | 11 months ago |
| speculative.cpp | e0dbec0bc6 | llama : refactor llama_context, llama_kv_cache, llm_build_context (#12181) | 10 months ago |
| speculative.h | abd4d0bc4f | speculative : update default params (#11954) | 11 months ago |
| stb_image.h | ad76569f8e | common : Update stb_image.h to latest version (#9161) | 1 year ago |