slaren 16bc66d947 llama.cpp : split llama_context_params into model and context params (#3301) 2 years ago
CMakeLists.txt ec893798b7 llama : custom attention mask + parallel decoding + no context swaps (#3228) 2 years ago
README.md ec893798b7 llama : custom attention mask + parallel decoding + no context swaps (#3228) 2 years ago
parallel.cpp 16bc66d947 llama.cpp : split llama_context_params into model and context params (#3301) 2 years ago

README.md

llama.cpp/example/parallel

Simplified simulation of serving incoming requests in parallel
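
The real example drives a llama.cpp context with a multi-sequence batch; the snippet below is only a minimal, self-contained sketch of the serving pattern it simulates, with hypothetical names (`Request`, `Slot`) and a fixed number of parallel slots standing in for the example's parallel sequences. Waiting requests take over free slots while every active slot advances one token per decode step (continuous batching).

```cpp
// Conceptual sketch only (not the actual parallel.cpp): a toy continuous-batching
// loop. Incoming requests are assigned to free slots, and each "decode step"
// advances every active slot by one token; finished slots are freed for the
// next waiting request.
#include <algorithm>
#include <cstdio>
#include <queue>
#include <vector>

struct Request {
    int id;
    int n_tokens_wanted; // tokens this request still needs
};

struct Slot {
    bool    active = false;
    Request req{};
};

int main() {
    // simulated incoming requests (the real example generates these over time)
    std::queue<Request> incoming;
    for (int i = 0; i < 8; ++i) {
        incoming.push({i, 4 + i % 3});
    }

    std::vector<Slot> slots(4); // 4 parallel sequences

    int step = 0;
    while (!incoming.empty() ||
           std::any_of(slots.begin(), slots.end(), [](const Slot & s) { return s.active; })) {
        // assign waiting requests to free slots (continuous batching)
        for (auto & s : slots) {
            if (!s.active && !incoming.empty()) {
                s.req = incoming.front();
                incoming.pop();
                s.active = true;
                std::printf("step %3d: request %d -> slot\n", step, s.req.id);
            }
        }

        // one decode step: every active slot produces one token
        for (auto & s : slots) {
            if (!s.active) continue;
            if (--s.req.n_tokens_wanted == 0) {
                std::printf("step %3d: request %d finished\n", step, s.req.id);
                s.active = false; // slot becomes free for the next request
            }
        }
        ++step;
    }
    return 0;
}
```

In the actual example the inner loop is replaced by batched decoding of all active sequences in a single call, which is what the "parallel decoding" commit referenced in the listing above enables.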