
server : free llama_batch on exit (#7212)

* [server] Clean up a memory leak on exit

There are a couple of memory leaks on exit of the server, and this one hides the others.
After cleaning this up, you can see leaks on the slots, but that is another patch to be
sent after this one.

* make tab into spaces
Steve Grubb, 1 year ago
commit 988631335a

1 changed file with 2 additions and 0 deletions:
    examples/server/server.cpp

examples/server/server.cpp  +2 -0

@@ -673,6 +673,8 @@ struct server_context {
             llama_free_model(model);
             model = nullptr;
         }
+
+        llama_batch_free(batch);
     }
 
     bool load_model(const gpt_params & params_) {
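
For context, below is a minimal sketch of the allocation/free pairing this commit completes. It assumes the server_context layout in examples/server/server.cpp; the struct name, the helper method, and the n_batch parameter are illustrative, while llama_batch_init, llama_batch_free, llama_free, and llama_free_model are the llama.cpp API calls used by this file.

    #include "llama.h"

    // Illustrative sketch, not the actual server_context definition.
    struct server_context_sketch {
        llama_model   * model = nullptr;
        llama_context * ctx   = nullptr;
        llama_batch     batch = {};

        void init_batch_sketch(int32_t n_batch) {
            // llama_batch_init() heap-allocates the token/pos/seq_id arrays,
            // so the returned batch must eventually be released with llama_batch_free().
            batch = llama_batch_init(n_batch, 0, 1);
        }

        ~server_context_sketch() {
            if (ctx) {
                llama_free(ctx);
                ctx = nullptr;
            }

            if (model) {
                llama_free_model(model);
                model = nullptr;
            }

            // the lines added by this commit: free the batch buffers on shutdown
            llama_batch_free(batch);
        }
    };

Without the llama_batch_free() call, the arrays allocated by llama_batch_init() are still reachable only through the server_context being destroyed, which is the leak reported on exit.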