Commit History

Author SHA1 Message Date
Didzis Gosko 527b6fba1d llama : make model stateless and context stateful (llama_state) (#1797) 2 years ago
Georgi Gerganov 4f9c43e3bd minor : warning fixes 2 years ago
FrankHB 5b9ccaf104 Fixed possible macro redefinition (#1892) 2 years ago
Borislav Stanimirov 9cbf50c041 build : fix and ignore MSVC warnings (#1889) 2 years ago
Georgi Gerganov 2347e45e7b llama : do a warm-up eval at start for better timings (#1824) 2 years ago
Kerfuffle fa84c4b3e8 Fix issue where interactive mode crashes when input exceeds ctx size (#1789) 2 years ago
Willy Tarreau 35a84916fb main: add the possibility to open the prompt cache read-only (#1640) 2 years ago
Georgi Gerganov ecb217db4f llama : Metal inference (#1642) 2 years ago
Evan Jones 136476e898 Fix prompt cache saving and chat-persistent rollover (#1678) 2 years ago
DannyDaemonic 248367605e Work around for recalculating logits in cached prompts (Fixes #1585) (#1609) 2 years ago
Kerfuffle 66874d4fbc Some improvements to loading the session with --prompt-cache (#1550) 2 years ago
Georgi Gerganov ec2e10c444 llama : add llama_init_backend() API (close #1527) 2 years ago
Jason McCartney 7694b52b9a main : make reverse prompt option act as a stop token in non-interactive mode (#1032) 2 years ago
DannyDaemonic ee9654138a Fixes #1511 lambda issue for w64devkit (mingw) (#1513) 2 years ago
András Salamon 9560655409 define default model path once, sync path with readme (#1366) 2 years ago
Georgi Gerganov fb62f92433 llama : fix --mtest option (close #1414) 2 years ago
Evan Jones cf348a60e0 main : add option to save full output to session (#1338) 2 years ago
DannyDaemonic 41654efea8 Interface improvements and `--multiline-input` (previously `--author-mode`) (#1040) 2 years ago
Georgi Gerganov f9a6364912 llama : require first token to be BOS (#1303) 2 years ago
Jed Fox 3924088512 Remove default arguments from sampling functions (#1343) 2 years ago
44670 2edbdb0f99 main : add --in-suffix option (#1318) 2 years ago
Tomas f647ce040f fix #1224 reverse prompt and multi line (#1297) 2 years ago
DannyDaemonic 13b0c68ed7 Handle signals properly on Windows (#1123) 2 years ago
Ron Evans 67c77799e0 examples : add llama_init_from_gpt_params() common function (#1290) 2 years ago
Ron Evans 8c9be35ff9 examples : improve vertical alignment of a few variables (#1286) 2 years ago
Robert Brisita 2bb992f034 llama : allow 0 as a seed number. (#1275) 2 years ago
Ron Evans e2cd506999 main : switch input_noecho to input_echo to remove negation (#979) 2 years ago
DannyDaemonic f4cef87edf Add git-based build information for better issue tracking (#1232) 2 years ago
Georgi Gerganov 70269cae37 llama : fix session load / save (#1263) 2 years ago
Georgi Gerganov 334637e43e common : change default parameters to pre-#1126 (#1223) 2 years ago