| File | Last commit | Commit message | Last updated |
| --- | --- | --- | --- |
| android | 1c641e6aac | `build`: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 1 year ago |
| CMakeLists.txt | 7841fc723e | llama : Add Gemma 3 support (+ experimental vision capability) (#12343) | 10 months ago |
| MobileVLM-README.md | e665744317 | llava : fix the script error in MobileVLM README (#9054) | 1 year ago |
| README-gemma3.md | 7841fc723e | llama : Add Gemma 3 support (+ experimental vision capability) (#12343) | 10 months ago |
| README-glmedge.md | 0cec062a63 | llama : add support for GLM-Edge and GLM-Edge-V series models (#10573) | 11 months ago |
| README-granitevision.md | 84d5f4bc19 | Update granite vision docs for 3.2 model (#12105) | 10 months ago |
| README-minicpmo2.6.md | 8352cdc87b | llava : fix bug in minicpm-v code (#11513) | 10 months ago |
| README-minicpmv2.5.md | 8352cdc87b | llava : fix bug in minicpm-v code (#11513) | 10 months ago |
| README-minicpmv2.6.md | 8352cdc87b | llava : fix bug in minicpm-v code (#11513) | 10 months ago |
| README-quantize.md | 1ec208083c | llava: add quantization for the visual projector LLAVA, Qwen2VL (#11644) | 11 months ago |
| README.md | 7a2c913e66 | llava : Add Granite Vision Support (#11794) | 10 months ago |
| clip-quantize-cli.cpp | 1ec208083c | llava: add quantization for the visual projector LLAVA, Qwen2VL (#11644) | 11 months ago |
| clip.cpp | 7841fc723e | llama : Add Gemma 3 support (+ experimental vision capability) (#12343) | 10 months ago |
| clip.h | 96e1280839 | clip : bring back GPU support (#12322) | 10 months ago |
| convert_image_encoder_to_gguf.py | e9b2f84f14 | llava: add big-endian conversion for image encoder (#12218) | 10 months ago |
| gemma3-cli.cpp | e0dbec0bc6 | llama : refactor llama_context, llama_kv_cache, llm_build_context (#12181) | 10 months ago |
| gemma3_convert_encoder_to_gguf.py | 7841fc723e | llama : Add Gemma 3 support (+ experimental vision capability) (#12343) | 10 months ago |
| glmedge-convert-image-encoder-to-gguf.py | 0cec062a63 | llama : add support for GLM-Edge and GLM-Edge-V series models (#10573) | 11 months ago |
| glmedge-surgery.py | 0cec062a63 | llama : add support for GLM-Edge and GLM-Edge-V series models (#10573) | 11 months ago |
| llava-cli.cpp | afa8a9ec9b | llama : add `llama_vocab`, functions -> methods, naming (#11110) | 1 year ago |
| llava.cpp | 7a2c913e66 | llava : Add Granite Vision Support (#11794) | 10 months ago |
| llava.h | 3071c0a5f2 | llava : support MiniCPM-V-2.5 (#7599) | 1 year ago |
| llava_surgery.py | e235b267a2 | py : switch to snake_case (#8305) | 1 year ago |
| llava_surgery_v2.py | 7a2c913e66 | llava : Add Granite Vision Support (#11794) | 10 months ago |
| minicpmv-cli.cpp | 96e1280839 | clip : bring back GPU support (#12322) | 10 months ago |
| minicpmv-convert-image-encoder-to-gguf.py | 8352cdc87b | llava : fix bug in minicpm-v code (#11513) | 10 months ago |
| minicpmv-surgery.py | 3e3357fd77 | llava : support Minicpm-omni (#11289) | 1 year ago |
| qwen2_vl_surgery.py | 4ddd199f6f | llava : Allow locally downloaded models for QwenVL (#10833) | 1 year ago |
| qwen2vl-cli.cpp | afa8a9ec9b | llama : add `llama_vocab`, functions -> methods, naming (#11110) | 1 year ago |
| requirements.txt | d3ae0ee8d7 | py : fix requirements check '==' -> '~=' (#8982) | 1 year ago |