Distributed and Cloud Execution with k6

For large-scale tests, a single k6 instance may not generate enough load or may hit client-side limits first. Distributed and cloud execution options let you scale k6 horizontally and run tests closer to your users or infrastructure.

Scaling k6 with Distributed or Cloud Setups

You can run k6 on multiple machines and coordinate them manually or via orchestration tools like Kubernetes, or use hosted solutions such as k6 Cloud that handle distribution for you. Container images and infrastructure-as-code make it easier to provision consistent load-generating environments.
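For the hosted route, the k6 CLI can hand a test off to k6 Cloud directly. A minimal sketch, assuming you already have a k6 Cloud account and an API token (the token can also be supplied via the `K6_CLOUD_TOKEN` environment variable):

```shell
# Authenticate once with your k6 Cloud API token (placeholder value shown)
k6 login cloud --token YOUR_TOKEN

# Upload the script and run it on k6 Cloud's managed load generators
k6 cloud scripts/checkout.js
```

The cloud service then provisions the load generators, distributes the traffic, and aggregates the results for you.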

# Example: running k6 in Docker (the official image is now grafana/k6;
# the older loadimpact/k6 image is deprecated)
docker run --rm -i grafana/k6 run - < scripts/checkout.js

# Example: Kubernetes Job (conceptual; assumes the script is mounted at /scripts)
# apiVersion: batch/v1
# kind: Job
# metadata:
#   name: k6-checkout
# spec:
#   template:
#     spec:
#       restartPolicy: Never
#       containers:
#         - name: k6
#           image: grafana/k6
#           args: ["run", "/scripts/checkout.js"]
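When coordinating instances by hand rather than through an orchestrator, k6's execution segments let each machine generate a disjoint slice of the total load. A sketch for splitting one test evenly across two hosts:

```shell
# Machine 1: runs the first half of the planned load
k6 run --execution-segment "0:1/2" --execution-segment-sequence "0,1/2,1" scripts/checkout.js

# Machine 2: runs the second half
k6 run --execution-segment "1/2:1" --execution-segment-sequence "0,1/2,1" scripts/checkout.js
```

Each instance still reports its metrics separately, so aggregating results (for example, by streaming both instances' output to a shared backend) is your responsibility.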
Note: When scaling out, ensure the system under test and its monitoring can handle the additional load and telemetry.
Tip: Start by validating scripts and thresholds at smaller scales before running large distributed tests.
Warning: Large tests against shared or production environments must be carefully planned and communicated to avoid accidental outages.
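Validating at small scale can be as simple as a short, low-VU run of the same script with the same thresholds you plan to enforce at full scale. The VU count and duration here are illustrative:

```shell
# Smoke run: a handful of virtual users for 30 seconds, same script and thresholds
k6 run --vus 5 --duration 30s scripts/checkout.js
```

If checks or thresholds fail here, fix the script before adding any more load generators.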

Distributed execution unlocks realistic, high-volume scenarios such as regional traffic patterns or global campaigns.

Common Mistakes

Mistake 1: Scaling k6 clients before fixing script or environment issues

Adding more load generators multiplies every existing defect: a flaky check, a bad assertion, or a misconfigured endpoint that fails once per iteration now fails thousands of times per second.

❌ Wrong: Running huge tests with unvalidated scripts.

✅ Correct: Fix correctness and stability at small scale first.

Mistake 2: Ignoring network proximity and latency

Latency between the load generators and the system under test inflates measured response times and can throttle achievable throughput, so the results no longer reflect what real users experience.

❌ Wrong: Running all load from a distant region when your users are local.

✅ Correct: Place load generators in regions that reflect real user locations when possible.

🧠 Test Yourself

What is a key consideration when scaling k6 tests horizontally?