# 019-bug-misc.yml
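# GitHub issue form for llama.cpp. As a note on placement (an assumption about this
# repository's layout, since the path is not shown above): GitHub only picks up issue
# forms from the .github/ISSUE_TEMPLATE/ directory, so this file presumably lives there.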
name: Bug (misc.)
description: Something is not working the way it should (and it's not covered by any of the above cases).
title: "Misc. bug: "
labels: ["bug-unconfirmed"]
body:
  - type: markdown
    attributes:
      value: >
        Thanks for taking the time to fill out this bug report!
        This issue template is intended for miscellaneous bugs that don't fit into any other category.
        If you encountered the issue while using an external UI (e.g. ollama),
        please reproduce your issue using one of the examples/binaries in this repository.
  - type: textarea
    id: version
    attributes:
      label: Name and Version
      description: Which version of our software is affected? (You can use `--version` to get a version string.)
      placeholder: |
        $./llama-cli --version
        version: 2999 (42b4109e)
        built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for x86_64-linux-gnu
    validations:
      required: true
  - type: dropdown
    id: operating-system
    attributes:
      label: Operating systems
      description: Which operating systems do you know to be affected?
      multiple: true
      options:
        - Linux
        - Mac
        - Windows
        - BSD
        - Other? (Please let us know in description)
    validations:
      required: false
  - type: dropdown
    id: module
    attributes:
      label: Which llama.cpp modules do you know to be affected?
      multiple: true
      options:
        - Documentation/Github
        - libllama (core library)
        - llama-cli
        - llama-server
        - llama-bench
        - llama-quantize
        - Python/Bash scripts
        - Test code
        - Other (Please specify in the next section)
    validations:
      required: false
  - type: textarea
    id: command
    attributes:
      label: Command line
      description: >
        Please provide the exact commands you entered, if applicable. For example: `llama-server -m ... -c ...`, `llama-cli -m ...`, etc.
        This will be automatically formatted into code, so no need for backticks.
      render: shell
    validations:
      required: false
  - type: textarea
    id: info
    attributes:
      label: Problem description & steps to reproduce
      description: >
        Please give us a summary of the problem and tell us how to reproduce it (if applicable).
    validations:
      required: true
  - type: textarea
    id: first_bad_commit
    attributes:
      label: First Bad Commit
      description: >
        If the bug was not present on an earlier version and it's not trivial to track down: when did it start appearing?
        If possible, please do a git bisect and identify the exact commit that introduced the bug.
    validations:
      required: false
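  # A sketch of the git bisect workflow the description above refers to (the commit
  # reference and the build/test step are placeholders, not part of this template):
  #   git bisect start
  #   git bisect bad                   # the current checkout exhibits the bug
  #   git bisect good <known-good-ref> # a commit or tag known to work
  #   # rebuild and retest at each commit git checks out, then mark it:
  #   git bisect good    # or: git bisect bad
  #   git bisect reset   # return to the original branch when finished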
  - type: textarea
    id: logs
    attributes:
      label: Relevant log output
      description: >
        If applicable, please copy and paste any relevant log output, including any generated text.
        This will be automatically formatted into code, so no need for backticks.
        If you are encountering problems specifically with the `llama_params_fit` module, always upload `--verbose` logs as well.
      render: shell
    validations:
      required: false
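  # One hedged way a reporter might capture the verbose logs mentioned above
  # (model path and prompt are placeholders; exact flags depend on the binary used):
  #   ./llama-cli -m model.gguf -p "test" --verbose 2>&1 | tee llama-verbose.log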