Revision history

| Author | SHA1 | Message | Date |
|---|---|---|---|
| DannyDaemonic | 3498588e0f | Add --simple-io option for subprocesses and break out console.h and cpp (#1558) | 2 years ago |
| Xiao-Yong Jin | 0c06204fb3 | main : add `--in-prefix-bos` to prefix BOS to user inputs; keep EOS (#2304) | 2 years ago |
| Evan Jones | 84e09a7d8b | llama : add grammar-based sampling (#1773) | 2 years ago |
| Georgi Gerganov | e76d630df1 | llama : grouped-query attention + LLaMAv2 70B support (#2276) | 2 years ago |
| Georgi Gerganov | b47b8a9cfe | llama : optimize memory buffers (#2325) | 2 years ago |
| Guillaume "Vermeille" Sanchez | ab0e26bdfb | llama : remove cfg smooth factor as it is only a reparameterization of the guidance scale (#2280) | 2 years ago |
| Xiao-Yong Jin | 6e7cca4047 | llama : add custom RoPE (#2054) | 2 years ago |
| Bach Le | c9c74b4e3f | llama : add classifier-free guidance (#2135) | 2 years ago |
| Evan Miller | 5656d10599 | mpi : add support for distributed inference via MPI (#2099) | 2 years ago |
| Judd | 36680f6e40 | convert : update for baichuan (#2081) | 2 years ago |
| Howard Su | b8c8dda75f | Use unsigned for random seed (#2006) | 2 years ago |
| zrm | b853d45601 | ggml : add NUMA support (#1556) | 2 years ago |
| Didzis Gosko | 527b6fba1d | llama : make model stateless and context stateful (llama_state) (#1797) | 2 years ago |
| Georgi Gerganov | 4f9c43e3bd | minor : warning fixes | 2 years ago |
| FrankHB | 5b9ccaf104 | Fixed possible macro redefinition (#1892) | 2 years ago |
| Borislav Stanimirov | 9cbf50c041 | build : fix and ignore MSVC warnings (#1889) | 2 years ago |
| Georgi Gerganov | 2347e45e7b | llama : do a warm-up eval at start for better timings (#1824) | 2 years ago |
| Kerfuffle | fa84c4b3e8 | Fix issue where interactive mode crashes when input exceeds ctx size (#1789) | 2 years ago |
| Willy Tarreau | 35a84916fb | main: add the possibility to open the prompt cache read-only (#1640) | 2 years ago |
| Georgi Gerganov | ecb217db4f | llama : Metal inference (#1642) | 2 years ago |
| Evan Jones | 136476e898 | Fix prompt cache saving and chat-persistent rollover (#1678) | 2 years ago |
| DannyDaemonic | 248367605e | Work around for recalculating logits in cached prompts (Fixes #1585) (#1609) | 2 years ago |
| Kerfuffle | 66874d4fbc | Some improvements to loading the session with --prompt-cache (#1550) | 2 years ago |
| Georgi Gerganov | ec2e10c444 | llama : add llama_init_backend() API (close #1527) | 2 years ago |
| Jason McCartney | 7694b52b9a | main : make reverse prompt option act as a stop token in non-interactive mode (#1032) | 2 years ago |
| DannyDaemonic | ee9654138a | Fixes #1511 lambda issue for w64devkit (mingw) (#1513) | 2 years ago |
| András Salamon | 9560655409 | define default model path once, sync path with readme (#1366) | 2 years ago |
| Georgi Gerganov | fb62f92433 | llama : fix --mtest option (close #1414) | 2 years ago |
| Evan Jones | cf348a60e0 | main : add option to save full output to session (#1338) | 2 years ago |
| DannyDaemonic | 41654efea8 | Interface improvements and `--multiline-input` (previously `--author-mode`) (#1040) | 2 years ago |