Performance Testing Fundamentals

Performance testing evaluates how systems behave under load, not just whether they function correctly. Users notice slow, unstable applications as much as they notice outright failures. Understanding performance testing fundamentals helps QA engineers design tests that reveal how systems scale, where they struggle, and what users actually experience.

Goals and Types of Performance Testing

Common goals include measuring response times, throughput, resource usage, and stability under different loads. Types of tests include load tests (expected traffic), stress tests (beyond normal limits), soak or endurance tests (sustained load), spike tests (sudden changes), and capacity tests (finding safe limits). Each type answers different questions about the system.
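These test types can be pictured as different target-load shapes over time. The sketch below is purely illustrative; the rates and durations are invented, and real load tools express such profiles in their own configuration.

```python
# Illustrative load shapes for common performance test types.
# t is elapsed time in seconds; return value is the target requests/second.
# All numbers here are hypothetical.

def load_profile(t):
    """Load test: steady, expected traffic."""
    return 100

def stress_profile(t):
    """Stress test: ramp past normal limits until the system degrades."""
    return 100 + 20 * t

def spike_profile(t):
    """Spike test: a sudden burst between t=60s and t=120s, then back to normal."""
    return 200 if 60 <= t < 120 else 100

def soak_profile(t):
    """Soak test: the same steady rate, but held for hours rather than minutes."""
    return 100
```

The shape, not the tool, is what distinguishes the test types: the same system under the same peak rate can behave very differently depending on how that rate is reached and how long it is held.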

Example performance questions

- How many requests per second can we handle with acceptable latency?
- What happens when traffic briefly doubles?
- Does the system remain stable over several hours of continuous use?
- Where do errors start appearing as load increases?

Note: Performance tests are not only about maximum numbers; they also help validate consistency and predictability under normal conditions.

Tip: Start with small, controlled tests to understand baseline behaviour before running large or aggressive scenarios.

Warning: Running heavy tests against production or shared environments without coordination can degrade service for real users.
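In line with the tip above, a baseline can be established by timing a small number of requests before any aggressive run. This sketch times a stubbed `handle_request` function, which is a stand-in (an assumption) for a real call to the system under test:

```python
import random
import time

def handle_request():
    """Stand-in for a real request to the system under test (simulated work)."""
    time.sleep(random.uniform(0.001, 0.003))

def measure_baseline(n=50):
    """Time n sequential requests and return (fastest, mean, slowest) in seconds."""
    samples = []
    for _ in range(n):
        start = time.perf_counter()
        handle_request()
        samples.append(time.perf_counter() - start)
    return min(samples), sum(samples) / len(samples), max(samples)

fastest, mean, slowest = measure_baseline()
```

Even a crude baseline like this makes later results interpretable: without it, there is no way to tell whether a number from a large run is normal or a regression.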

Performance testing terminology includes concepts such as response time percentiles, throughput, error rate, concurrency, and resource utilisation. Becoming familiar with these terms makes it easier to read dashboards, communicate with engineers, and interpret results.
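A minimal sketch of how those metrics fall out of raw results, using invented per-request data (latency in seconds plus a success flag):

```python
import math

# Hypothetical per-request results: (latency_seconds, succeeded).
results = [(0.120, True), (0.095, True), (0.110, True), (0.480, True),
           (0.105, True), (0.100, False), (0.115, True), (0.090, True),
           (0.125, True), (0.950, False)]

latencies = sorted(lat for lat, _ in results)

def percentile(sorted_vals, p):
    """Nearest-rank percentile: the value below which p% of samples fall."""
    k = max(0, math.ceil(p / 100 * len(sorted_vals)) - 1)
    return sorted_vals[k]

p50 = percentile(latencies, 50)          # typical request
p95 = percentile(latencies, 95)          # tail latency
error_rate = sum(1 for _, ok in results if not ok) / len(results)
throughput = len(results) / sum(latencies)   # requests/second, if run serially
```

Note how the median (0.110 s) and the 95th percentile (0.950 s) tell very different stories about the same run; this is why dashboards report percentiles rather than averages.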

When to Perform Performance Testing

Effective teams run performance tests regularly, not just before major releases. Even small code or configuration changes can affect performance, especially in complex systems. Integrating lightweight performance checks into pipelines and running deeper tests on schedules helps catch regressions early.
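A lightweight pipeline check can be as simple as comparing a measured percentile against a budget. The sketch below is hypothetical; the budget value and sample latencies are invented:

```python
# Hypothetical CI performance gate: pass or fail a build based on p95 latency.
LATENCY_BUDGET_P95 = 0.300  # seconds (invented budget)

def p95(samples):
    ordered = sorted(samples)
    return ordered[int(0.95 * (len(ordered) - 1))]

def performance_gate(samples, budget=LATENCY_BUDGET_P95):
    """Return True if the run is within budget (the build passes)."""
    return p95(samples) <= budget

run = [0.12, 0.15, 0.11, 0.14, 0.13, 0.16, 0.12, 0.18, 0.14, 0.15]
print("pass" if performance_gate(run) else "fail")
```

In a real pipeline the samples would come from a short load run against a test environment, and the gate would fail the build instead of printing.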

Common Mistakes

Mistake 1: Treating performance testing as a one-time activity

This misses regressions over time.

โŒ Wrong: Running a single big test before launch and never again.

✅ Correct: Include performance checks as part of ongoing quality practice.

Mistake 2: Focusing only on peak throughput numbers

User experience depends on latency, errors, and stability too.

โŒ Wrong: Ignoring response time distributions and error patterns.

✅ Correct: Look at multiple signals, including percentiles and error rates.
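To see why a single headline number can mislead, consider two invented latency distributions with identical means but very different tails:

```python
# Two hypothetical services: same mean latency, very different user experience.
steady = [0.100] * 100                  # every request takes ~100 ms
spiky  = [0.050] * 90 + [0.550] * 10    # mostly fast, but a slow 10% tail

def mean(xs):
    return sum(xs) / len(xs)

def p99(xs):
    return sorted(xs)[int(0.99 * (len(xs) - 1))]
```

Both services report a 100 ms average, yet one in ten requests to the "spiky" service takes over half a second. Only the percentile view exposes the difference.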

🧠 Test Yourself

What is the main purpose of performance testing?