WOPR2 was held in Palo Alto, California on April 15-17, 2004, and was hosted by Sun Microsystems.
Attendees
Eric Ang, James Bach, Marini Ballard, Scott Barber, Michael Butler, Ross Collard, Neil Gunther, Linda Hamm, Julian Harty, Doug Hoffman, Paul Holland, Pam Holt, Philip Joung, Dave Liebreich, Jude McQuaid, Alex Podelko, Gail Rutherford, Andrew Sliwkowski, Roland Stens
Theme: Where and how performance testing made a difference (or failed to make an anticipated difference)
To explore this theme, we expect to hear presentations of, and discuss, several experience reports. We are interested in your data analysis and interpretation: how you decided what data to collect, how your understanding of the situation evolved as you accumulated data, and how you interpreted the data to form conclusions. We are particularly interested in innovative or uncommon methods of data analysis, since project teams often need to invent (or re-invent) analysis techniques to fit their situations; we would like to know which methods you used and whether you developed new techniques along the way.
We anticipate that each report will share some or all of the following information with the group:
– What were the project’s original objectives and focus (i.e., what problem was performance testing intended to solve)?
– Did the objectives and focus evolve during the project based on your interim findings and conclusions? If so, how?
– What specific difference did performance testing make, or what difference was the performance testing expected to make but didn’t? (E.g., it prevented a potential disaster, extended the schedule, led to the project being abandoned, etc.)
– What data was collected that led to the ‘difference’?
– What methods/techniques did you use for data analysis, reduction and interpretation that led to the ‘difference’? (I.e., how were the conclusions derived from the major findings, which were in turn derived from the raw data?)
– What results or conclusions from the testing led to the ‘difference’?
– Did you validate your findings and conclusions, and if so, how?
– Did your perception of the problem to be solved change as your project proceeded? If so, how?
Focusing Questions
If you give a presentation, please consider the following questions, which reinforce the overall theme, as you prepare it. The focusing questions are:
– What specific difference did performance testing make, or what difference was the performance testing expected to make but didn’t?
– What specific data was collected that led to the ‘difference’?
– What specific methods/techniques did you use for data analysis, reduction and interpretation that led to the ‘difference’?
– Did you validate your data, analysis and conclusions, and if so, how?