# security.feature

@llama.cpp
@security
Feature: Security

  Background: Server startup with an api key defined
    Given a server listening on localhost:8080
    And a model file tinyllamas/stories260K.gguf from HF repo ggml-org/models
    And a server api key llama.cpp
    Then the server is starting
    Then the server is healthy
  Scenario Outline: Completion with some user api key
    Given a prompt test
    And a user api key <api_key>
    And 4 max tokens to predict
    And a completion request with <api_error> api error

    Examples: Prompts
      | api_key   | api_error |
      | llama.cpp | no        |
      | llama.cpp | no        |
      | hackeme   | raised    |
      |           | raised    |
  Scenario Outline: OAI Compatibility
    Given a system prompt test
    And a user prompt test
    And a model test
    And 2 max tokens to predict
    And streaming is disabled
    And a user api key <api_key>
    Given an OAI compatible chat completions request with <api_error> api error

    Examples: Prompts
      | api_key   | api_error |
      | llama.cpp | no        |
      | llama.cpp | no        |
      | hackme    | raised    |
  Scenario Outline: CORS Options
    Given a user api key llama.cpp
    When an OPTIONS request is sent from <origin>
    Then CORS header <cors_header> is set to <cors_header_value>

    Examples: Headers
      | origin          | cors_header                      | cors_header_value |
      | localhost       | Access-Control-Allow-Origin      | localhost         |
      | web.mydomain.fr | Access-Control-Allow-Origin      | web.mydomain.fr   |
      | origin          | Access-Control-Allow-Credentials | true              |
      | web.mydomain.fr | Access-Control-Allow-Methods     | POST              |
      | web.mydomain.fr | Access-Control-Allow-Headers     | *                 |
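The scenarios above can be sketched as plain Python, showing how the steps presumably translate to raw HTTP checks: a user api key becomes a Bearer token on the request, and an OPTIONS preflight from an allowed origin is expected to echo that origin back in the CORS headers. This is a minimal sketch assuming Bearer-token auth; the helper names `auth_headers` and `expected_cors` are illustrative and not part of any test harness.

```python
def auth_headers(api_key):
    """Build request headers; attach a Bearer token only when a key is given."""
    headers = {"Content-Type": "application/json"}
    if api_key:  # the empty api_key row in the Examples table sends no header
        headers["Authorization"] = f"Bearer {api_key}"
    return headers


def expected_cors(origin):
    """CORS response headers the CORS Options scenario expects for an origin."""
    return {
        "Access-Control-Allow-Origin": origin,
        "Access-Control-Allow-Credentials": "true",
        "Access-Control-Allow-Methods": "POST",
        "Access-Control-Allow-Headers": "*",
    }


# A key matching the server's `llama.cpp` api key should produce no api
# error; a wrong or missing key is expected to raise one.
print(auth_headers("llama.cpp")["Authorization"])
print("Authorization" in auth_headers(""))
print(expected_cors("web.mydomain.fr")["Access-Control-Allow-Origin"])
```

Keeping the expectation builders separate from the request code mirrors the Examples tables: each table row is just one `(input, expected)` pair fed through the same step.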