llama.cpp/examples/idle

https://github.com/ggml-org/llama.cpp/pull/17766
