Additionally, we plan to host a testing-related event with a track on performance in Montevideo on Friday, December 9th. More on this soon!
Many thanks to Abstracta for supporting WOPR and for championing knowledge sharing on the practice of performance measurement and testing. For more background on these subjects, see these links:
Iterative Performance Testing
Performance Testing examines the experience of users of a system; Load Testing generates simulated traffic to examine a system's performance characteristics. Many current practices in both disciplines were developed in an era of slower, less frequent releases, when products and systems shipped large chunks of functionality all at once.
Since that time, deployment frequencies have accelerated to as often as multiple times per day in some contexts. Software projects in these contexts build and deploy through pipelines, moving code from Commit to Production with minimal human analysis, assessment, or intervention. There are also contexts beyond Silicon Valley-style Agile/CI/CD, where highly skilled people are creating and releasing important software less often, with great care in examining and validating each release. These contexts (and others not mentioned here) are subject to significant recent changes in how software is shipped, provisioned, deployed, supported, monitored, and maintained.
Great progress has been made in Observability, and we can automate rollback/roll-forward decisions based on what it tells us. Specific versions of code and systems are shorter-lived, and the frequency of code change has increased significantly. The complexity of service architectures and cloud container scaling has changed the very model of what a “complete” system is. These changes, separately and collectively, have greatly reduced both the opportunity for and the value of multi-week (and multi-month) load testing projects.
But there are still aspects of system performance that are best explored with load. While observing Production performance can provide similar information, investigating scaling risks in a controlled fashion by injecting synthetic usage can still provide predictive benefits that are hard to get any other way. What do Performance and Load Testing look like for you these days?
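To make "injecting synthetic usage" concrete, here is a minimal closed-loop load generator sketch. The `target_request()` stub simulates a service call so the example is self-contained; in a real investigation you would replace it with a request to your own endpoint. The function names and the nearest-rank percentile choice are illustrative assumptions, not a WOPR-prescribed tool or method.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

def target_request() -> float:
    """Stand-in for a real service call; returns observed latency in seconds."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.005))  # simulated service time
    return time.perf_counter() - start

def percentile(samples, pct):
    """Nearest-rank percentile of a list of latency samples."""
    ordered = sorted(samples)
    k = max(0, round(pct / 100 * len(ordered)) - 1)
    return ordered[k]

def run_load(workers: int = 5, requests: int = 50):
    """Drive `requests` calls across `workers` concurrent threads."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda _: target_request(), range(requests)))

if __name__ == "__main__":
    latencies = run_load()
    print(f"p50={percentile(latencies, 50) * 1000:.1f}ms "
          f"p95={percentile(latencies, 95) * 1000:.1f}ms")
```

Varying `workers` and `requests` between runs is the simplest way to probe how latency percentiles shift as concurrency grows, which is the kind of controlled scaling question that is hard to answer from Production observation alone.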
WOPR29’s Content Owner is Andy Hohenner.
At WOPR29, we want to hear about your experiences with Iterative Performance Testing.
We are looking for your real, recent experiences in using performance and load testing techniques in the context of modern software projects. Here are some prompts that may help you consider how your experiences apply to WOPR29’s Theme:
- How has your load testing changed recently?
- Has your definition of what Performance Testing is changed?
- What new techniques do you use to do your performance testing?
- Do you use iterative/incremental performance tests? What do they look like?
- Are you performance testing inside sprints?
- Are you conducting performance testing as part of CI and/or CD?
- How is Observability impacting your performance testing?
- Are you measuring performance through Synthetic Monitoring?
- How have DevOps approaches altered how you approach performance testing?
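For prompts like performance testing as part of CI/CD, one common shape is a pipeline gate: run a short load burst against the build, compare a latency percentile to a budget, and fail the job when the budget is exceeded. The sketch below assumes a hypothetical 200 ms p95 budget and nearest-rank percentiles; it is one possible wiring, not a recommended WOPR practice.

```python
import sys

LATENCY_BUDGET_MS = 200.0  # hypothetical service-level budget for p95

def gate(latencies_ms, budget_ms=LATENCY_BUDGET_MS):
    """Return True if the p95 of the measured latencies is within budget."""
    ordered = sorted(latencies_ms)
    k = max(0, round(95 / 100 * len(ordered)) - 1)  # nearest-rank p95
    p95 = ordered[k]
    print(f"p95={p95:.1f}ms (budget {budget_ms:.0f}ms)")
    return p95 <= budget_ms

if __name__ == "__main__":
    # In CI, samples would come from a short burst against the new build;
    # here they can be passed on the command line for demonstration.
    samples = [float(a) for a in sys.argv[1:]] or [120.0, 150.0, 180.0]
    sys.exit(0 if gate(samples) else 1)
```

The nonzero exit code is what lets the pipeline treat a performance regression like any other failed check.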
Conference Location and Dates
If you would like to attend WOPR29, please submit your application soon. We expect to start sending invitations by early October.
WOPR is a peer workshop for practitioners to share experiences in system performance and reliability, to allow people interested in these topics to network with their peers, and to help build a community of professionals with common interests. Participants are asked to share first-person experience reports, which are then discussed by the group. More information about Experience Reports is available at http://www.performance-workshop.org/experience-reports/.
WOPR is not vendor-centric, consultant-centric, or end user-centric, but strives to accommodate a mix of viewpoints and experiences. We are looking for people who are interested in system performance, reliability, testing, and quality assurance.
WOPR has been running since 2003, and over the years has included many of the world’s most skillful and well-known performance testers and engineers. To learn more about WOPR, visit our About page, connect on LinkedIn and Facebook, or follow @WOPR_Workshop.
- Call for Proposals Opened: August 4
- Invitations Sent By: October 15
- Pre-WOPR Dinner: December 5
- WOPR29: December 6-8
WOPR is not-for-profit. In the past, we have asked WOPR participants with access to training/PDP budgets to help offset expenses, since their employers benefit greatly from what their employees learn at WOPR.
We are not asking for an expense-sharing fee for WOPR29, and will use sponsorship to cover expenses. With our additional community day, we have an opportunity to gather more sponsorship to help bring together a WOPR that works in this new geography.
We are even hoping to subsidize some travel expenses for a few first-time attendees and those without employer sponsorship, particularly from LatAm. More information on this soon. If you are interested in attending WOPR, please do not let travel costs prevent you from applying – we hope to assist!
Applying for WOPR
WOPR conferences are invitation-only and sometimes over-subscribed. For WOPR29, we plan to limit attendance to about 20 people. We usually have more applications and presentations than can fit into the workshop; not everyone who submits a presentation will be invited to WOPR, and not everyone invited to WOPR will be asked to present.
Our selection criteria are weighted heavily toward practitioners and toward interesting ideas expressed in WOPR applications. We welcome anyone with relevant experiences or interests. We reserve seats to identify and support promising up-and-comers. Please apply, and see what happens.
The WOPR organizers will select presentations, and invitees will be notified by email according to the above dates. You can apply for WOPR29 here.