The tokenization service is designed as a hosted service that thousands of clients can access concurrently. If installed in-house, it is expected to run in a high-throughput, low-latency enterprise environment, so the performance of the service is critical.
The following section describes how to run the performance test locally to determine whether the service is suitable for your use case.
A remote instance and a tokenization instance are defined in a docker-compose file so that they can be started together. The test case is designed to hit the local cache, as we use only one token and repeatedly retrieve its value. This mimics the real-world scenario in which a production environment has a cache large enough to hold its small set of key-value pairs.
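To illustrate why a single repeated token exercises the local cache, here is a minimal in-process sketch. The `detokenize` function and its one-millisecond backing-store delay are hypothetical stand-ins, not part of the service's API; the point is that only the first lookup pays the backing-store cost, and every subsequent call for the same token is served from memory.

```python
import time
from functools import lru_cache

# Hypothetical stand-in for the service's local cache: repeated
# detokenization of the same token is served from memory, mimicking
# the single-token test case described above.
@lru_cache(maxsize=1024)
def detokenize(token: str) -> str:
    # Simulated backing-store lookup (e.g. a database round trip).
    time.sleep(0.001)
    return f"value-for-{token}"

if __name__ == "__main__":
    start = time.perf_counter()
    detokenize("tok-1")  # first call pays the backing-store cost
    cold = time.perf_counter() - start

    start = time.perf_counter()
    for _ in range(10_000):
        detokenize("tok-1")  # subsequent calls hit the cache
    warm_total = time.perf_counter() - start

    print(f"cold: {cold * 1000:.2f} ms, 10k warm calls: {warm_total * 1000:.2f} ms")
```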
To start the service locally with Docker:
git clone https://github.com/networknt/light-docker.git
docker-compose -f docker-compose-tokenization.yml up -d
The performance test focuses only on the detokenizer, as it is the main concern in real-world scenarios; the tokenizer is essentially a database throughput test.
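The shape of such a detokenizer test can be sketched as a small concurrent benchmark that records per-request latency and reports summary statistics. This is only an illustration under assumptions: the in-memory `CACHE` dictionary stands in for the warmed local cache, and a real test would issue HTTP requests to the running service instead of calling a local function.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

# In-process stand-in for a detokenize call; a real benchmark would
# send HTTP requests to the running tokenization service. The map
# plays the role of the warmed local cache holding one token.
CACHE = {"tok-1": "value-1"}

def detokenize(token: str) -> str:
    return CACHE[token]

def run_benchmark(concurrency: int, requests_per_worker: int) -> dict:
    """Issue requests from `concurrency` workers and collect latencies."""
    def worker(_: int) -> list[float]:
        local = []
        for _ in range(requests_per_worker):
            start = time.perf_counter()
            detokenize("tok-1")
            local.append(time.perf_counter() - start)
        return local

    latencies: list[float] = []
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        for result in pool.map(worker, range(concurrency)):
            latencies.extend(result)

    latencies.sort()
    return {
        "count": len(latencies),
        "mean_us": statistics.mean(latencies) * 1e6,
        "p99_us": latencies[int(len(latencies) * 0.99)] * 1e6,
    }

if __name__ == "__main__":
    print(run_benchmark(concurrency=8, requests_per_worker=1000))
```

For production-grade numbers you would point a dedicated load-testing tool at the service's detokenize endpoint rather than rely on an in-process harness, but the structure (concurrent workers, per-request timing, percentile reporting) is the same.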