A facilitated discussion was started by listing areas where it is difficult to apply performance testing techniques to iterative development. Some discussion followed each point, but they are listed here as a collection of some of the difficulties we face in trying to provide performance feedback early and often. Any attendee (or website visitor!) is […]
WOPR22 Brainstorm: Optimizations
During WOPR22, we spent a few minutes before lunch collecting systems-based (as opposed to code) optimizations that the group had used in the past. This was time- and hunger-bound. This is not an exhaustive or ranked list, and it does not include enough information to use these as heuristics. It might be a source of ideas. This […]
WOPR22 Practitioner Tool Survey
During WOPR22, we took an informal survey of tools that practitioners have used regularly over the last year. It should be remembered that tools are frequently chosen for us by non-practitioners, for reasons besides fitness for purpose. This survey does not meet any standard of statistical significance, and it does not include many well-known tools – just the […]
WOPR22 Is Underway
WOPR22 started off with John Meza discussing the performance metrics that are generated per build for software his company produces. John’s team publishes charts tracking rendering performance of certain GIS data test cases across builds. This alerts Development promptly when a performance degradation has been introduced, allowing them to address problems early. The discussion that followed unearthed […]
WOPR22: Malmö, Sweden
WOPR22 will be held May 21-23 in Malmö, Sweden. Maria Kedemo of Verisure will be our host. I’m the content owner, and my theme is Early Performance Testing. I want to explore testing performance and reliability when a system is not complete, and/or when a system will be deployed to multiple environments. How are we testing […]