Reinventing Performance Testing with Alexander Podelko
Load testing is an important part of the performance engineering process. However, the industry is changing, and load testing should adjust to these changes. A stereotypical, last-moment performance check is no longer enough. In this episode Alexander Podelko shares his views on how to align performance testing with these new industry trends. Listen up and discover how performance testing can still add value in an Agile world.
About Alexander Podelko
For the last sixteen years Alex Podelko has supported major performance initiatives for Oracle, Hyperion, Aetna, and Intel in different roles, including performance tester, performance analyst, performance architect, and performance engineer. Currently he is a Consulting Member of Technical Staff at Oracle, responsible for performance testing and optimization of Hyperion (a.k.a. Enterprise Performance Management and Business Intelligence) products. Before specializing in performance, Alex led software development for Rodnik Software. With more than twenty years of overall experience in the software industry, he holds a PhD in Computer Science from Gubkin University and an MBA from Bellevue University.
Alex speaks and writes about performance engineering and load testing. See his collection of performance-related links and documents at www.alexanderpodelko.com.
Quotes & Insights from this Test Talk
- I guess the stereotypes were correct to some degree. With the waterfall development process, most corporations developed some kind of process where load testing happened just before moving the product to production. When everything is ready, you set up the product, use some kind of usually expensive load testing tool like LoadRunner, run some tests, and report results. That was the typical corporate process, and it doesn't exactly fit what we have now, with all these agile, iterative approaches to development. I'd say everybody is struggling with it to some extent. There is no ideal solution here, so if somebody is struggling, I don't think it is a problem with that person; it is an overall problem, and it is not easy to solve.
- I have had quite a lot of discussions about how Netflix handles performance testing. I would say in some Internet companies there is the idea that you don't need to do performance tests, except in some special cases: you just put the change in production with a limited number of users and see how it behaves. If it behaves badly, you just back out the change. If it behaves well, you roll the change out to other machines, not everything at once, but slowly increasing the number of systems using the new software. I guess Netflix was the ideal case for that. It was difficult to argue with them. It worked for them, and it is difficult to argue why they shouldn't use it in their situation, but they have quite a lot of factors that matter here. First, they have a quite homogeneous workload, so when they put up a server and redirect some users to it, they know exactly how many users there are and what workload those users will create, basically.
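The gradual rollout described above can be sketched as a simple control loop. This is only an illustration of the idea, not how Netflix actually implements it; the `is_healthy` callback and all parameter names are invented for the example and would stand in for real production metrics (error rates, latency) on the servers running the new build.

```python
def canary_rollout(is_healthy, total_servers=100, start_fraction=0.01, grow_factor=2.0):
    """Sketch of a canary-style rollout: deploy a change to a small slice of
    servers, check how it behaves, and either back out or expand.
    `is_healthy` is a hypothetical callback inspecting production metrics
    for the servers currently running the new version."""
    deployed = max(1, int(total_servers * start_fraction))
    while deployed < total_servers:
        if not is_healthy(deployed):
            return ("rolled back", deployed)  # behaves badly: back out the change
        # behaves well: slowly increase the number of systems on the new software
        deployed = min(total_servers, int(deployed * grow_factor))
    # final check once every server runs the new version
    return ("fully deployed", deployed) if is_healthy(deployed) else ("rolled back", deployed)

# Example with a stand-in health check that always passes
status, servers = canary_rollout(lambda n: True)
```

The key design point in the quote is that this only works when the workload is homogeneous enough that a small slice of servers is representative of the whole fleet.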
- So it provides that opportunity, but you need to have the time, resources, and skills to do it. When you get to an early stage, you may need a somewhat different set of skills, and if you test not only at the end but during development, you just need more time, because every test requires some time and effort. That is probably the main challenge: how you integrate it. Performance departments and operations are somewhat limited in size, so you can't extend them forever. So there is probably some kind of trade-off: some tests will be run by the development team, and some will be done by people who specialize in performance. Finding that trade-off is probably what happens at each company when people try to find the optimal way to do it.
- So I believe that in the beginning, when the system is new, you should do a lot of that kind of investigation to understand how the system behaves and what you could do with it, and maybe some parts can be automated later. As for application performance monitoring systems, they are already here and have been around for a while, but now they have come to some level of maturity, so I believe it has become useful to incorporate them into performance testing, because they provide insight into the system. You don't only run a test; you may figure out what the problem inside the system is, not just report that performance is bad.
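The point about tests that diagnose rather than merely flag can be illustrated with a toy harness. This is a minimal sketch, not a real APM integration; the component names and workloads are invented stand-ins for the per-component breakdown a monitoring tool would provide.

```python
import time
from collections import defaultdict

def run_load_test(operations, iterations=50):
    """Collect per-component timings during a load test, so the result points
    at the slow part of the system instead of only saying 'performance is bad'.
    `operations` maps a hypothetical component name to a callable under test."""
    totals = defaultdict(float)
    for _ in range(iterations):
        for name, op in operations.items():
            start = time.perf_counter()
            op()
            totals[name] += time.perf_counter() - start
    averages = {name: total / iterations for name, total in totals.items()}
    slowest = max(averages, key=averages.get)  # the component to investigate first
    return averages, slowest

# Example with stand-in operations of different cost
averages, slowest = run_load_test({
    "db_query": lambda: sum(range(20000)),  # deliberately heavier workload
    "render": lambda: sum(range(1000)),
})
```

A real setup would pull this breakdown from an APM tool rather than hand-rolled timers, but the shape of the output is the same: a ranked view of where time goes, not a single pass/fail number.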
- The main point is that performance testing needs to adjust to the new environment. We cannot do the same things we did all along and expect they will work with a completely new way of development, but there is no reason why we couldn't adjust and still bring a lot to the table: finding performance problems earlier in the development cycle and preventing all these production failures. Of course, performance testing is not a complete guarantee, but if you cut, say, 80% of the issues, that definitely covers the cost of performance testing and saves a lot of the money you would otherwise lose in those failures.
Connect with Alexander Podelko
May I Ask You For a Favor?
Thanks again for listening to the show. If it has helped you in any way, shape or form, please share it using the social media buttons you see on the page.
Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.
Test Talks is sponsored by the fantastic folks at Sauce Labs. Try it for free today!