readme : update hot topics

Georgi Gerganov, 2 years ago
commit 7eb41179ed
1 changed file with 2 additions and 0 deletions

README.md (+2, -0)

@@ -11,6 +11,8 @@ Inference of [LLaMA](https://arxiv.org/abs/2302.13971) model in pure C/C++
 
 ### Hot topics
 
+- Parallel decoding + continuous batching support incoming: [#3228](https://github.com/ggerganov/llama.cpp/pull/3228) \
+  **Devs should become familiar with the new API**
 - Local Falcon 180B inference on Mac Studio
 
   https://github.com/ggerganov/llama.cpp/assets/1991296/98abd4e8-7077-464c-ae89-aebabca7757e
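For context, the added hot-topic line points at the parallel decoding and continuous batching work in [#3228](https://github.com/ggerganov/llama.cpp/pull/3228). The sketch below is not taken from that PR; it is a minimal illustration, under assumptions, of how several sequences can share one `llama_batch` and a single `llama_decode` call. The field names (`token`, `pos`, `n_seq_id`, `seq_id`, `logits`) and the `llama_batch_init`/`llama_batch_free` signatures follow later public versions of `llama.h` and may not match the API exactly as it stood at this commit.

```cpp
// Illustrative sketch only: feed the prompts of two independent sequences
// through one llama_decode() call by tagging each token with a sequence id.
// Assumes a llama_context * has already been created from a model, and that
// llama.h exposes llama_batch_init / llama_batch_free / llama_decode with
// roughly the layout used in later releases (an assumption, not this commit).
#include "llama.h"

#include <vector>

static void decode_two_prompts(llama_context * ctx,
                               const std::vector<llama_token> & prompt_a,
                               const std::vector<llama_token> & prompt_b) {
    const int32_t n_tokens_total = (int32_t) (prompt_a.size() + prompt_b.size());

    // one batch large enough for both prompts; each token belongs to a single sequence
    llama_batch batch = llama_batch_init(n_tokens_total, /*embd*/ 0, /*n_seq_max*/ 1);

    auto push = [&](const std::vector<llama_token> & prompt, llama_seq_id seq) {
        for (size_t i = 0; i < prompt.size(); ++i) {
            const int32_t idx = batch.n_tokens++;
            batch.token   [idx]    = prompt[i];
            batch.pos     [idx]    = (llama_pos) i;  // position within its own sequence
            batch.n_seq_id[idx]    = 1;
            batch.seq_id  [idx][0] = seq;            // which sequence this token belongs to
            batch.logits  [idx]    = (i == prompt.size() - 1); // logits only for the last token
        }
    };

    push(prompt_a, 0);
    push(prompt_b, 1);

    // both sequences are evaluated in a single call; the KV cache keeps them apart by seq_id
    if (llama_decode(ctx, batch) != 0) {
        // handle failure (e.g. the batch exceeds the context's n_batch)
    }

    llama_batch_free(batch);
}
```

The point of the design, as advertised in the README line above, is that tokens from different sequences are interleaved in one batch, so the backend can saturate the hardware with a single forward pass instead of looping over sequences one at a time.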