
I. What is Performance Testing?
Performance testing is the process of testing part or all of a system to determine its stability, speed, scalability, and responsiveness under a given workload.
- According to the ISTQB Glossary, “performance testing” is the process of testing to determine the performance of a component or system, where “performance” is the degree to which a component or system fulfills its functions with regard to resource utilization, time, and capability.
The goals of performance testing often include evaluating the application’s processing results, processing speed, data transfer rate, network bandwidth usage, maximum number of concurrent users it can handle, memory usage, workload efficiency, and response time to requests.
The main focus areas of performance testing typically include:
- Speed: Determining if the system responds to requests fast enough.
- Stability: Determining if the system remains stable under various load levels.
- Scalability: Determining whether the system can handle growth in load, and the maximum number of users it can support.
II. Why is Performance Testing Necessary?
When evaluating a software system, attention is paid not only to the features and functions it supports but also to how well it performs: response time, reliability, resource utilization, and scalability.
Depending on the system being developed, the allocated time, or customer requirements, the reasons for conducting performance testing will vary. Below are some common reasons:
- To verify if the application system meets performance requirements (e.g., can the system handle a maximum of 3,000 concurrent users?).
- To identify bottlenecks within the entire application system.
- To verify that the performance figures published by the software vendor are accurate.
- To compare two or more systems to determine which one performs better.
- To measure stability during peak events (e.g., during the sale of concert tickets for a famous singer).
The objective of performance testing is not to find functional bugs but to find and eliminate the bottlenecks that degrade the system’s performance.
By performing performance testing, we give stakeholders (customers, PMs, QA managers) information about the system’s speed, stability, and scalability. This lets them decide on next steps: release the software as is, or first resolve performance issues, such as fixing a bottleneck module identified during testing, before deployment.
III. Steps to Implement Performance Testing
The specific steps differ depending on the type of performance test you are running. Here are the common steps:
- Identify the Testing Environment and Required Tools
- Define the actual environment where the system will be deployed, the testing environment, and suitable testing tools, based on requirements from the customer and/or the PM.
- Prepare a table comparing the technical specifications (hardware, software, infrastructure, and configuration) of the testing environment and the production environment, to keep the two as consistent as possible.
- Determine Acceptable Performance Criteria
- After meetings with customers and the PM, you must clearly define (and ideally have sign-off from stakeholders) the goals, thresholds, and other constraints that indicate testing is complete.
- These criteria are usually stated in the Software Requirements Specification (SRS); beyond those, the performance tester should propose further test cases or scenarios and compare the system’s performance against external benchmarks.
- Plan and Design Test Cases
- Once you have enough information, plan the testing: consider different levels of usage, devise appropriate scenarios for the feasible use cases, design suitable test cases, and outline the metrics to check. For example, estimate the number of concurrent requests to simulate in a load test.
- Prepare the Environment and Tools (Including Script Writing)
- Once you have a preliminary or detailed plan, you need to prepare the environment as early as possible to address any issues quickly and avoid impacting the overall project schedule.
- You need to build (prepare) the testing environment, which often requires assistance from or coordination with developers and DevOps. In addition, you must write the scripts for your test cases; a combined sketch covering the criteria, load profile, and script from the previous steps follows this list.
- Execute Performance Testing (Run the Scripts)
- Once everything is in place (the testing environment, the scripts loaded into the performance testing tool, and the system or modules under test developed and deployed), execute the test cases and record the results.
- Resolve Issues and Retest
- Based on the post-test reports, the team analyzes the results together to identify the slow modules or function clusters that need fixing. Developers typically handle software-related fixes (e.g., optimizing processing algorithms), while DevOps handles performance issues related to infrastructure.
- After all of the issues, or at least the most critical ones, have been resolved, re-run the scenario (following the same steps as the previous test) to confirm that the performance issues have been fully fixed.
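To make the middle steps concrete, here is a minimal load-test sketch in the style of k6 (one of the tools covered in Section VII). The endpoint, thresholds, and load targets are illustrative assumptions rather than values from any real requirement; k6 scripts are JavaScript, and recent k6 versions can also run TypeScript directly.

```typescript
// script.ts -- a minimal k6 load-test sketch.
// The endpoint, thresholds, and targets below are assumptions for illustration.
import http from 'k6/http';
import { sleep } from 'k6';

export const options = {
  // "Plan and Design": the load profile -- ramp up, hold, ramp down.
  stages: [
    { duration: '2m', target: 200 },  // ramp up to 200 virtual users
    { duration: '10m', target: 200 }, // hold the load
    { duration: '2m', target: 0 },    // ramp back down
  ],
  // "Determine Acceptable Performance Criteria": the run fails if these are breached.
  thresholds: {
    http_req_duration: ['p(95)<500'], // 95% of requests must finish under 500 ms
    http_req_failed: ['rate<0.01'],   // error rate must stay below 1%
  },
};

// "Script Writing": each virtual user repeatedly executes this function.
export default function () {
  http.get('https://test.example.com/'); // hypothetical endpoint
  sleep(1); // think time between requests
}
```

Starting the run with `k6 run` on this file executes the scenario and prints a summary report; because the acceptance criteria are encoded as thresholds, the run fails automatically whenever they are not met.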
IV. Performance Testing Tips
Years of software testing experience teach one valuable lesson: the testing environment, especially for performance testing, must be as close to the real-world (production) environment as possible. If the two environments differ, the tests may not accurately predict the system’s performance once it is actually deployed.
Here are some tips for better performance testing:
- Isolate the performance testing environment from other environments.
- Choose tools that suit your performance testing requirements.
- Run test scenarios multiple times to obtain more accurate data.
- To keep results comparable and reliable, do not change or adjust the testing environment between test runs.
V. Common Types of Performance Testing
- Load Testing: Checks the application’s ability to operate under different loads (load being the number of users or requests sent to the system). The main goal is to determine the maximum load the system can handle. (Illustrative load profiles for several of these types are sketched after this list.)
- Stress Testing: Evaluates the system under extremely heavy workloads to observe how it handles high traffic or large data volumes. The main goal is to find the application’s “breaking point”: the point at which the system goes from merely slow to inoperable.
- Endurance Testing (soak testing): Performed to ensure the software can handle the expected load continuously over a prolonged period.
- Spike Testing: Checks the software system’s reaction to sudden, massive spikes in user-generated load.
- Volume Testing: Tests to evaluate the system’s response time and behavior when there is a large amount of data in the database.
- Scalability Testing: Determines the scalability of a system or software product; the main goal is to measure how effectively it can “scale up” to support a growing user base.
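For several of these types, the main difference can be captured in the load profile alone. The sketch below expresses typical profile shapes as k6-style stages; all durations and user counts are assumptions chosen only to illustrate each shape.

```typescript
// Illustrative load profiles for several test types, as k6-style "stages".
// All durations and targets are assumptions for illustration.

// Load testing: ramp to the expected load and hold it there.
export const loadProfile = [
  { duration: '5m', target: 100 },  // ramp up to 100 virtual users
  { duration: '30m', target: 100 }, // hold the expected load
  { duration: '5m', target: 0 },    // ramp down
];

// Stress testing: keep pushing past the expected load to find the breaking point.
export const stressProfile = [
  { duration: '5m', target: 100 },
  { duration: '5m', target: 200 },
  { duration: '5m', target: 400 }, // well beyond the expected load
  { duration: '5m', target: 0 },
];

// Spike testing: jump to a very high load almost instantly, then drop back.
export const spikeProfile = [
  { duration: '1m', target: 10 },
  { duration: '30s', target: 1000 }, // sudden spike
  { duration: '2m', target: 10 },
];

// Endurance (soak) testing: moderate load sustained for a long period.
export const soakProfile = [
  { duration: '5m', target: 100 },
  { duration: '8h', target: 100 }, // prolonged, continuous load
  { duration: '5m', target: 0 },
];
```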
VI. Common Performance Metrics
Depending on the scope and type of system (e.g., web or mobile application), the metrics measured and the information collected will differ. Below are the metrics most commonly collected and monitored during performance testing (a small computation sketch follows the list):
- Processor Usage (CPU): The share of time the processor spends executing non-idle threads.
- Memory Usage (RAM): The amount of temporary memory (RAM) used to fulfill requests.
- Disk I/O: The time the disk is busy executing read or write requests.
- Response Time: The time from when the user initiates a request until the first response is received from the server.
- Throughput: Measures the number of requests per unit of time, often per second.
- Hits per Second: The number of requests/hits sent to the server per second.
- Thread Count: The number of threads currently running and active, a rough indicator of the system’s “health.”
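Two of these metrics are easy to compute from raw data. The sketch below shows a nearest-rank percentile (commonly used for p90/p95 response time) and a basic throughput calculation; the sample values are made up for illustration.

```typescript
// Computing a response-time percentile and throughput from raw samples.
// Sample values are invented for illustration.

// Nearest-rank percentile: sort ascending, take the value at rank ceil(p/100 * n).
function percentile(samples: number[], p: number): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(rank - 1, 0)];
}

// Hypothetical response times (ms) recorded during a 5-second window.
const responseTimesMs = [120, 95, 310, 150, 480, 210, 102, 175, 260, 890];
const windowSeconds = 5;

// p95 response time: 95% of requests completed at or below this value.
console.log(`p95 response time: ${percentile(responseTimesMs, 95)} ms`);

// Throughput: completed requests divided by the measurement window.
console.log(`throughput: ${responseTimesMs.length / windowSeconds} req/s`);
```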
VII. Popular Performance Testing Tools
There are many performance testing tools for web and mobile applications. Choosing the right tool depends on factors such as the application type, supported protocols, cost, hardware requirements, etc. Here are some popular performance testing tools:
- Apache JMeter
- One of the leading performance testing tools, widely used for load testing servers. JMeter can load test, analyze, and measure the performance of many service types, with a focus on web applications.
- BlazeMeter
- Designed and built by open-source enthusiasts, BlazeMeter lets you conduct large-scale performance and load testing directly from your IDE. It offers high performance, full functionality, no-code test creation, API testing and monitoring, test data generation, and service virtualization.
- LoadRunner Professional
- Currently one of the most popular, though expensive, performance testing tools on the market. It can simulate hundreds of thousands of concurrent users to reveal how an application behaves under specific load levels.
- K6 Open Source (Grafana k6)
- A developer-centric, extensible, open-source load testing tool that makes it easy and efficient to evaluate the performance of web systems. k6 helps you test the reliability and performance of systems under development and catch performance-related issues and bugs earlier.
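As a small taste of that developer-centric style, a k6 script can embed pass/fail checks directly in the load scenario. The endpoint and the 500 ms limit below are assumptions for illustration.

```typescript
// Embedding pass/fail "checks" in a k6 script (endpoint and limit are assumptions).
import http from 'k6/http';
import { check } from 'k6';

export default function () {
  const res = http.get('https://test.example.com/api/health'); // hypothetical endpoint
  // Checks record pass/fail rates in the end-of-test summary without aborting the run.
  check(res, {
    'status is 200': (r) => r.status === 200,
    'responded in under 500 ms': (r) => r.timings.duration < 500,
  });
}
```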

