Vendure Dev Server

This package is not published to npm. It is used in the development of the Vendure server and plugins.

Running

Ensure you have a database running. From the root directory, run:

docker-compose up -d mariadb

To run the server, use the dev script. The database type can be specified with the DB=<type> environment variable:

cd packages/dev-server

[DB=mysql|postgres|sqlite] npm run dev

If no DB is specified, the default is mysql.
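
For example, to run the server against a Postgres database:

DB=postgres npm run dev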

Populating data

Test data can be populated by running the populate script. This uses the same sample data as the Vendure CLI uses when running init, with the additional step of also populating some sample customer & address data.

Specify the database as above to populate that database:

[DB=mysql|postgres|sqlite] npm run populate
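
For example, to populate a Postgres database and then start the server against it:

DB=postgres npm run populate
DB=postgres npm run dev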

Testing custom UI extension compilation

To compile UI extensions within this monorepo, you need to add the following entry to the temporary admin-ui tsconfig.json file:

  "paths": {
      "@vendure/admin-ui/*": ["../../admin-ui/package/*"]
  }
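
For orientation, this entry sits under compilerOptions; a minimal sketch of the relevant part of that generated tsconfig.json (the surrounding options are placeholders, not the file's actual contents):

{
  "compilerOptions": {
    // ...options generated for the temporary admin-ui app...
    "paths": {
      "@vendure/admin-ui/*": ["../../admin-ui/package/*"]
    }
  }
}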

Load testing

This package also contains scripts for load testing the Vendure server. The load testing infrastructure and scripts are located in the ./load-testing directory.

Load testing is done with k6. To run the tests you will need k6 installed and (on Windows) available in your PATH environment variable so that it can be invoked with the k6 command.

The load tests assume the existence of the following databases:

  • vendure-load-testing-1000
  • vendure-load-testing-10000
  • vendure-load-testing-100000

The npm scripts load-test:1k, load-test:10k and load-test:100k will populate their respective databases with test data and then run the k6 scripts against them.

Running individual scripts

An individual test script may be run by specifying the script name as an argument:

npm run load-test:1k deep-query.js
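
For orientation, a k6 test script generally has the following shape; the endpoint, port, and GraphQL query below are illustrative assumptions rather than the contents of deep-query.js:

import http from 'k6/http';
import { check, sleep } from 'k6';

// Illustrative load profile: 10 virtual users for 30 seconds.
export const options = { vus: 10, duration: '30s' };

export default function () {
    // Assumed dev-server Shop API endpoint and a simple GraphQL query.
    const res = http.post(
        'http://localhost:3000/shop-api',
        JSON.stringify({ query: '{ products { totalItems } }' }),
        { headers: { 'Content-Type': 'application/json' } },
    );
    check(res, { 'status is 200': (r) => r.status === 200 });
    sleep(1);
}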

pg_stat_statements

When running load tests against Postgres (with the pg_stat_statements extension enabled), the following queries can be used to analyze query performance; the commented-out statement resets the collected statistics between runs:

SELECT
  dbid,
  (total_time / 1000 / 60) as total_minutes,
  (total_time / calls) as avg_ms,
  calls,
  query
FROM pg_stat_statements
WHERE dbid = <db_id>
ORDER BY total_minutes DESC
LIMIT 100;

-- SELECT pg_stat_statements_reset();
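
Note that on PostgreSQL 13 and later the timing columns of pg_stat_statements were renamed, so the equivalent query would look roughly like this (a sketch; adjust to your server version):

SELECT
  dbid,
  (total_exec_time / 1000 / 60) as total_minutes,
  (total_exec_time / calls) as avg_ms,
  calls,
  query
FROM pg_stat_statements
WHERE dbid = <db_id>
ORDER BY total_minutes DESC
LIMIT 100;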

Results

The results of the load tests are saved to the ./load-testing/results directory. Each test run creates two files:

  • load-test-<date>-<product-count>.json: a summary of all load tests run
  • load-test-<date>-<product-count>-<script-name>.csv: time-series data which can be used to create charts

Historical benchmark results with charts can be found in this Google Sheet