Author | Commit | Message | Date
piDack | 0cec062a63 | llama : add support for GLM-Edge and GLM-Edge-V series models (#10573) | 11 months ago
HimariO | ba1cb19cdd | llama : add Qwen2VL support + multimodal RoPE (#10361) | 1 year ago
tc-mb | d565bb2fd5 | llava : support MiniCPM-V-2.6 (#8967) | 1 year ago
tc-mb | 3071c0a5f2 | llava : support MiniCPM-V-2.5 (#7599) | 1 year ago
Ikko Eltociear Ashimine | 74b239b3d5 | llava : update clip.h (#7580) | 1 year ago
Ting Lou | 4e9a7f7f7f | llava : change API to pure C style for Rust FFI bindgen (#6079) | 1 year ago
Elbios | 0d4177126b | llava : fix memory management bug (#5491) | 1 year ago
John | aa23412989 | llava : support v1.6 (#5267) | 1 year ago
Georgi Gerganov | 9fbda719de | clip : refactor + bug fixes (#4696) | 2 years ago
Damian Stewart | 381efbf480 | llava : expose as a shared library for downstream projects (#3613) | 2 years ago
M. Yusuf Sarıgöz | 370359e5ba | examples: support LLaVA v1.5 (multimodal model) (#3436) | 2 years ago