Does your project have performance requirements? Even if the requirements are not written down, I think there are always expectations. What would you say about a web page that responds only after 20 seconds?
Performance is often a topic in software projects. There are many articles about how to write good performance requirements. But who should gather the requirements and ensure that they get implemented?
At Zühlke we recently had an interesting workshop about how to incorporate performance into agile projects. I would like to share the outcome with you.
Establishing the requirements
In some projects, performance issues are only addressed when the user says that the system is slow. The reason for this approach is that optimizing performance too early might lead to unnecessarily complex structures. I think that addressing the performance issues at the end of the project is usually too late.
So, when is the best time to consider the performance?
As early as possible.
Of course at the beginning of a project it is usually not yet possible to define the final performance requirements. But, similar to when writing functional requirements and stories, it is possible to have an idea about the performance requirements and refine these during the project.
It is important that performance requirements are related to stories or scenarios. This way it is easier to discuss and prioritize them with the team and the stakeholders. Performance criteria are usually different for different stories.
How do you identify the performance requirements?
They can be identified by:
- Asking questions
- Using common values
Asking questions
During the requirements workshops or story definition, it is a good idea to find out which stories are performance-critical. The team can define the acceptance criteria together with the product owner. The exact values for the performance criteria can be refined later in the process.
If similar systems already exist, benchmarking or general policies can help define performance targets. For example, there can be a policy stating that a server may use at most 75 % of its CPU, or a benchmark which allows for an average of 10 concurrent editing users.
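A policy like the CPU limit above can also be checked automatically. A minimal sketch, assuming the utilization samples have already been collected by some existing monitoring tool:

```python
# Sketch: checking a "server may use at most 75 % CPU" policy against
# sampled utilization figures. How the samples are collected is assumed
# to be handled elsewhere (e.g. by a monitoring tool).
CPU_POLICY_LIMIT = 75.0  # percent

def violations(samples):
    """Return the utilization samples that exceed the policy limit."""
    return [s for s in samples if s > CPU_POLICY_LIMIT]

samples = [42.0, 61.5, 78.2, 55.0]  # hypothetical measurements
over_limit = violations(samples)
```

A check like this can run after every load test and flag the samples that break the policy.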
Using common values
Some domains are so common that everybody has expectations about how they should work, even when requirements are not written. For example, if you buy a table, would you expect it to hold a book? How many books should it hold? For a common domain such as the web, it is possible to define a general goal for the response time, one that users are accustomed to. The stories can have their own requirements, which vary from this general goal. At Zühlke, we are thinking of creating guidelines for this purpose.
Defining the requirements
It is a good idea to define the requirements – using SMART criteria, for example. It is especially important that performance requirements are measurable. Defining the following factors facilitates the measurement:
- response time
- concurrency
- test environment
When defining a response time, it is good practice to define a percentage threshold for meeting the criteria and to decide what should happen with requests that don't meet them. For example, 95 % should finish in 5 seconds, while the others can take longer or even time out.
It is also important to be precise about whether the values represent an optimistic goal, an average or the value to be attained at peak times.
It is also good practice to establish whether the response time covers a single server request or a client loading the full page, including third-party libraries.
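Such a percentile criterion is straightforward to evaluate once response times have been measured. A minimal sketch, using hypothetical durations and the nearest-rank percentile method:

```python
# Sketch: evaluating a 95th-percentile response-time criterion against
# measured request durations (the values below are hypothetical, in seconds).
import math

def percentile(samples, pct):
    """Return the pct-th percentile using the nearest-rank method."""
    ordered = sorted(samples)
    rank = math.ceil(pct * len(ordered) / 100)
    return ordered[rank - 1]

durations = [0.5, 0.8, 1.1, 1.2, 1.3, 1.7, 1.9, 2.1, 2.2, 2.4,
             2.6, 2.9, 3.1, 3.3, 3.6, 3.9, 4.2, 4.5, 4.9, 9.8]

p95 = percentile(durations, 95)
# "95 % should finish in 5 seconds, while the others can take longer"
meets_criterion = p95 <= 5.0
```

Note how the single slow request (9.8 s) does not break the criterion: the percentile threshold explicitly allows a small share of outliers.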
When defining criteria for concurrency, it is important to define not only the number of users but also the type of operation and the think time involved. For example, reading is different from streaming, and a new user has a longer think time than a polling client.
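To illustrate, a toy simulation of concurrent users with different think times might look like this; the `request()` stub is a placeholder for real server round trips:

```python
# Sketch: simulating concurrent users with different think times. The
# request() stub below is a placeholder for real server round trips.
import threading
import time

def request():
    time.sleep(0.01)  # stand-in for one server round trip

def user(think_time, iterations, results):
    """Issue requests, pausing for the given think time between them."""
    for _ in range(iterations):
        start = time.perf_counter()
        request()
        results.append(time.perf_counter() - start)  # list.append is thread-safe in CPython
        time.sleep(think_time)

results = []
threads = (
    # five "human" users with a long think time between operations
    [threading.Thread(target=user, args=(0.05, 3, results)) for _ in range(5)]
    # two polling clients that fire requests with no think time
    + [threading.Thread(target=user, args=(0.0, 3, results)) for _ in range(2)]
)
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"{len(results)} requests measured")
```

In a real test, a load-testing tool would take the place of this hand-rolled loop, but the parameters are the same: number of users, type of operation, and think time.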
When defining the environment for a performance test, it is ideal to have a system similar to the production one. It is also ideal to use a copy of the production data, in which the confidential information is masked out, or to generate the same amount of test data as the production system contains.
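Masking confidential fields in a copy of the production data can be scripted. A minimal sketch, with hypothetical field names and a hash-based token as one possible masking strategy:

```python
# Sketch: masking confidential fields in a copy of production data before
# using it as performance-test data. Field names here are hypothetical.
import copy
import hashlib

def mask(value):
    """Replace a confidential value with a stable, anonymous token."""
    return hashlib.sha256(value.encode()).hexdigest()[:8]

def masked_copy(rows, confidential_fields):
    """Deep-copy the rows and mask the listed fields, leaving the rest intact."""
    result = copy.deepcopy(rows)
    for row in result:
        for field in confidential_fields:
            if field in row:
                row[field] = mask(row[field])
    return result

production = [{"name": "Alice Example", "email": "alice@example.com", "orders": 12}]
test_data = masked_copy(production, confidential_fields=["name", "email"])
```

Because the masking is deterministic, relationships between rows (for example, the same customer appearing twice) survive, which keeps the test data realistic for performance purposes.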
Implementing the requirements
OK, so when you have a bunch of stories and requirements related to performance, then what? How can you make sure they get implemented?
We think that performance requirements should be part of a story or a task, or even have a story of their own. Having performance criteria only as part of the Definition of Done can be too general. When there are a few hundred stories, one option is to categorize the stories and create performance criteria for each category.
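One way to sketch the category approach is a simple mapping from story categories to criteria; the category names and values below are purely illustrative:

```python
# Sketch: attaching performance criteria to story categories instead of
# individual stories. Category names and values are purely illustrative.
criteria = {
    "interactive": {"p95_seconds": 1.0},   # e.g. editing a record in the UI
    "reporting":   {"p95_seconds": 10.0},  # e.g. generating a monthly report
    "batch":       {"p95_seconds": 60.0},  # e.g. a nightly processing run
}

stories = {
    "edit customer record": "interactive",
    "monthly sales report": "reporting",
    "nightly invoice run":  "batch",
}

def criterion_for(story):
    """Look up a story's performance criterion via its category."""
    return criteria[stories[story]]
```

A new story then only needs a category, and a story with special needs can still override the category default with its own criterion.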
In the workshop, we came to the conclusion that incorporating performance involves several different tasks. Usually, a software architect is responsible for carrying out or delegating the following:
- gathering and updating the performance requirements
- designing how to measure the performance
- making sure the measurements are done
The development team can help to gather and refine the requirements by asking explicitly about the performance issues.
There are many good tools for automating the performance measurements.
Running the performance analysis just before release can be too late. The best practice is to run performance tests as part of the continuous integration – as part of a nightly deployment for example.
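A performance test in a nightly build can be as simple as an assertion over measured durations. A sketch, assuming a hypothetical `measure_scenario()` that replays a story's requests against the deployed system:

```python
# Sketch: a performance check for a nightly CI run. measure_scenario() is a
# hypothetical stand-in for code that replays a story's requests against the
# deployed system and returns the observed durations in seconds.
import statistics

def measure_scenario(name):
    # Placeholder: a real implementation would drive the deployed system.
    return [1.1, 0.9, 1.4, 1.0, 1.2]

def check_scenario(name, max_average):
    """Fail the build when the story's average response time exceeds the limit."""
    durations = measure_scenario(name)
    average = statistics.mean(durations)
    assert average <= max_average, (
        f"{name}: average {average:.2f}s exceeds {max_average}s"
    )

check_scenario("search products", max_average=2.0)
```

Because the check raises on failure, a CI server treats a violated criterion exactly like a failing unit test, which is what makes the nightly feedback loop work.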
It is important to start defining the performance requirements as early in the project as possible and to automate the measurement as much as necessary.
If performance requirements exist and someone is responsible for gathering them, they become as natural a part of the implementation as are the quality and the features.
What kind of experiences do you have with defining or implementing performance requirements? How do you incorporate performance into your projects? Feel free to share a tip or two about a good measurement tool or a way to gather and formulate performance requirements.