The Workshop on Performance and Reliability (WOPR) announces its 25th Workshop, to be held in Wellington, NZ, February 15-17, 2017.

The Content Owners are Richard Leeke and Eric Proegler. With the WOPR Organizers, they invite you to submit your proposal for WOPR25.

WOPR25 is sponsored by Wild Strait.


Theme: Performance Tools for 2017 (and beyond)

Performance Engineering as a distinct software development and deployment practice has existed for some time, but it started to really grow in the late 1990s, as companies moved from paper to electronic processes for business-critical operations. Mercury and Rational brought commercial load tools to the market, and modern Performance Testing was born.

Twenty years later, Client-Server applications have become Web Applications, browsers are the new operating systems, system resources have become much easier to scale in the Cloud, and communications protocols for software have standardized on HTTP. HPE is looking to sell Mercury, second and third generation load tools have moved the scale of load testing from hundreds of concurrent users in a test lab to millions of cloud-based virtual users, and Open Source tools have significantly changed how load tests are created.

The changes in Software Development processes over the same time have pushed performance testing even further: from Waterfall projects with one or two iterations of dress-rehearsal Performance Tests, to frequent or continuous releases with elaborate automation of deployments and testing as part of continuous integration. Some performance testing is still used as a gate to production, but many of the new performance tests being created today are designed to be run (and interpreted) automatically, hundreds of thousands of times a year. This is generally called Shifting Left.

Performance Monitoring has changed just as much. Real-time APM and RUM tools have been added to production environments, using information from every transaction to provide performance feedback post-deployment – Shifting Right. Once the expectation changes from deploying completely finished software to iterating and deploying fixes frequently, the approach to testing changes as well.

One formulation of the classic phases of Performance Testing Projects is:

  1. Discovery (Modeling activities and workload)
  2. Building (Test Scripts and other assets)
  3. Execution
  4. Remediation/Retesting
  5. Reporting

Each one of these steps has been changed, and some of them have been disrupted. For this WOPR, we’d like to discuss how all of these changes in the ways we conduct our work have changed our tools requirements. Some of the questions we’d like to discuss at WOPR25 include:

  1. What are the attributes of tools that are useful for a modern Performance Engineer to gather information, simulate workloads, and identify capacity limits?
  2. What do we need our tools to do that we still don’t have?
  3. What features have we always wanted our testing and monitoring tools to have?
  4. What experiences are there with Open Source tools? Are there reasons besides price that people are choosing them more frequently?
  5. What Production conditions are we successfully testing for that we didn’t use to be able to account for? What are we still not including in our models?
  6. How do we measure, evaluate, and simulate new technologies like Single Page Applications, Web Sockets, and HTTP/2?
  7. What is needed to successfully fit performance testing into CI and CD?

Conference Location and Dates

WOPR25 will be hosted by Wild Strait in Wellington, New Zealand, on 15-17 February, 2017. The traditional Pre-WOPR Dinner will take place on Tuesday, 14 February.

If you would like to attend WOPR25, please submit your application soon. We will begin sending invitations in October.

About WOPR

WOPR is a peer workshop for practitioners to share experiences in system performance and reliability, to network with their peers, and to help build a community of professionals with common interests. WOPR is not vendor-centric, consultant-centric, or end user-centric, but strives to accommodate a mix of viewpoints and experiences. We are looking for people who are interested in system performance, reliability, testing, and quality assurance.

WOPR has been running since 2003, and over the years has included many of the world’s most skillful and well-known performance testers and engineers. To learn more about WOPR, visit our About page, connect on LinkedIn and Facebook, or follow @WOPR_Workshop.


WOPR is not-for-profit. We do ask WOPR participants to help us offset expenses, as employers greatly benefit from the learning their employees can get from WOPR. The expense-sharing amount for WOPR25 is $300 USD. If you are invited to the workshop, you will be asked to pay the expense-sharing fee to indicate acceptance of your invitation.

Applying for WOPR

WOPR conferences are invitation-only and sometimes over-subscribed. For WOPR25, we plan to limit attendance to about 20 people. We usually have more applications and presentations than can fit into the workshop; not everyone who submits a presentation will be invited to WOPR, and not everyone invited to WOPR will be asked to present.

Our selection criteria are weighted heavily towards practitioners and interesting ideas expressed in WOPR applications. We welcome anyone with relevant experiences or interests. We reserve seats to identify and support promising up-and-comers. Please apply, and see what happens.

The WOPR organizers will select presentations, and invitees will be notified by email beginning in October. You can apply for WOPR25 here.
