Vendure Dev Server

This package is not published to npm. It is used in development of the Vendure server and plugins.

Running

To run the server, run the start script. The database configuration can be specified by the DB=<type> environment variable:

DB=mysql npm run start
DB=postgres npm run start
DB=sqlite npm run start

If no DB is specified, mysql is used by default.
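
The mechanism behind the DB variable is not shown here, but as a rough sketch (the actual dev-config.ts may differ, and the hosts, ports and credentials below are placeholders) the config can branch on process.env.DB to build the TypeORM connection options:

import { DataSourceOptions } from 'typeorm';

// Hypothetical sketch: choose TypeORM connection options based on the DB env var.
// Host, port and credentials are placeholder values, not the real dev defaults.
function getDbConfig(): DataSourceOptions {
    switch (process.env.DB || 'mysql') {
        case 'postgres':
            return { type: 'postgres', host: 'localhost', port: 5432, username: 'postgres', password: 'password', database: 'vendure-dev' };
        case 'sqlite':
            return { type: 'sqlite', database: 'vendure-dev.sqlite' };
        case 'mysql':
        default:
            return { type: 'mysql', host: 'localhost', port: 3306, username: 'root', password: 'password', database: 'vendure-dev' };
    }
}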

Populating data

Test data can be populated by running the populate script. This uses the same sample data as the Vendure CLI uses when running init, with the additional step of also populating some sample customer & address data.

Specify the database as above to populate that database:

DB=sqlite npm run populate
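
For reference, a populate script of this kind typically delegates to the populate helper exported from @vendure/core/cli. The following is a minimal sketch; the file paths and the devConfig import are assumptions for illustration, not the exact contents of populate-dev-server.ts:

import path from 'path';
import { bootstrap } from '@vendure/core';
import { populate } from '@vendure/core/cli';
import { devConfig } from './dev-config'; // assumed export name

// Assumed locations of the sample data files used for population.
const initialData = path.join(__dirname, './path/to/initial-data.json');
const productsCsv = path.join(__dirname, './path/to/products.csv');

populate(() => bootstrap(devConfig), initialData, productsCsv)
    .then(app => app.close())
    .then(
        () => process.exit(0),
        err => {
            console.log(err);
            process.exit(1);
        },
    );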

Testing custom ui extension compilation

In order to compile ui extensions within this monorepo, you need to add the following entry to the temporary admin ui tsconfig.json file:

  "paths": {
      "@vendure/admin-ui/*": ["../../admin-ui/package/*"]
  }
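
With that path mapping in place, UI extensions can be compiled via the compileUiExtensions helper from @vendure/ui-devkit/compiler, passed as the app option of the AdminUiPlugin in dev-config.ts. The snippet below is a sketch only; the route, port, output path and extension list are placeholders:

import path from 'path';
import { AdminUiPlugin } from '@vendure/admin-ui-plugin';
import { compileUiExtensions } from '@vendure/ui-devkit/compiler';

// Sketch of wiring the compiler into the AdminUiPlugin (values are placeholders).
AdminUiPlugin.init({
    route: 'admin',
    port: 5001,
    app: compileUiExtensions({
        outputPath: path.join(__dirname, 'admin-ui'),
        extensions: [
            // ...UI extension definitions go here
        ],
        devMode: true,
    }),
});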

Load testing

This package also contains scripts for load testing the Vendure server. The load testing infrastructure and scripts are located in the ./load-testing directory.

Load testing is done with k6. To run the load tests you will need k6 installed and (on Windows) available in your PATH environment variable, so that it can be invoked with the k6 command.

The load tests assume the existence of the following databases:

  • vendure-load-testing-1000
  • vendure-load-testing-10000
  • vendure-load-testing-100000

The npm scripts load-test:1k, load-test:10k and load-test:100k will populate their respective databases with test data and then run the k6 scripts against them.

Running individual scripts

An individual test script may be run by specifying the script name as an argument:

npm run load-test:1k deep-query.js
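
Each script in the ./load-testing directory is a standard k6 test module. The sketch below is illustrative only (the endpoint, query and options are assumptions rather than the contents of deep-query.js), but it shows the general shape: issue GraphQL requests against the Shop API and check the responses:

import http from 'k6/http';
import { check } from 'k6';

export const options = {
    vus: 10,          // number of concurrent virtual users (placeholder)
    duration: '30s',  // test duration (placeholder)
};

export default function () {
    // Placeholder query; the real scripts exercise deeper, more expensive queries.
    const query = `{ products(options: { take: 10 }) { items { id name } } }`;
    const res = http.post(
        'http://localhost:3000/shop-api',
        JSON.stringify({ query }),
        { headers: { 'Content-Type': 'application/json' } },
    );
    check(res, { 'status is 200': (r) => r.status === 200 });
}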

pg_stat_statements

When running load tests against Postgres, the following queries can be used with the pg_stat_statements extension to analyze query performance:

SELECT 
  dbid,
  (total_time / 1000 / 60) as total, 
  (total_time/calls) as avg, 
  calls,
  query 
FROM pg_stat_statements 
WHERE dbid = <db_id>
ORDER BY total DESC 
LIMIT 100;

-- SELECT pg_stat_statements_reset();

Results

The results of the tests are saved to the ./load-testing/results directory. Each test run creates two files:

  • load-test-<date>-<product-count>.json Contains a summary of all load tests run
  • load-test-<date>-<product-count>-<script-name>.csv Contains time-series data which can be used to create charts

Historical benchmark results with charts can be found in this Google Sheet.