Commit history

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Faisal Zaghloul | 42c76d1358 | Threadpool: take 2 (#8672) | 1 year ago |
| Daniel Bevenius | 01aae2b497 | baby-llama : remove duplicate vector include | 1 year ago |
| Georgi Gerganov | 2b3389677a | ggml : refactor rope norm/neox (#7634) | 1 year ago |
| Georgi Gerganov | ab336a9d5e | code : normalize enum names (#5697) | 1 year ago |
| NawafAlansari | 4480542b22 | baby-llama : allocate graphs in ggml_context (#5573) | 2 years ago |
| Georgi Gerganov | afefa319f1 | ggml : change ggml_scale to take a float instead of tensor (#4573) | 2 years ago |
| slaren | cafcd4f895 | ggml : remove n_dims from ggml_tensor (#4469) | 2 years ago |
| Cebtenzzre | bc39553c90 | build : enable more non-default compiler warnings (#3200) | 2 years ago |
| xaedes | 0e76a8992c | train : finetune LORA (#2632) | 2 years ago |
| Georgi Gerganov | ec893798b7 | llama : custom attention mask + parallel decoding + no context swaps (#3228) | 2 years ago |
| Cebtenzzre | 3aefaab9e5 | check C++ code with -Wmissing-declarations (#3184) | 2 years ago |
| Cebtenzzre | ef15649972 | build : fix most gcc and clang warnings (#2861) | 2 years ago |
| Kawrakow | eb542d3932 | Add LLAMA_DEFAULT_RMS_EPS so we can change the default (#2384) | 2 years ago |
| slaren | 41c674161f | make rms_norm_eps a parameter (#2374) | 2 years ago |
| Qingyou Meng | 1d656d6360 | ggml : change ggml_graph_compute() API to not require context (#1999) | 2 years ago |
| Howard Su | 0be54f75a6 | baby-llama : fix build after ggml_rope change (#2016) | 2 years ago |
| Borislav Stanimirov | 9cbf50c041 | build : fix and ignore MSVC warnings (#1889) | 2 years ago |
| 0xspringtime | 9254920265 | baby-llama : fix operator!= (#1821) | 2 years ago |
| xaedes | e32089b2c2 | train : improved training-from-scratch example (#1652) | 2 years ago |
| xaedes | f954edda93 | ggml : implement backward pass for llama + small training-llama-from-scratch example (#1360) | 2 years ago |