QA as a Service: How we found product-market fit by creating a new category

Jon Perl
October 12, 2022

The idea

In late 2018, Laura and I were developers and engineering leaders at healthcare startups in New York. We often talked about starting our own company, and one afternoon Laura asked me to name the three biggest problems I was facing at work. I immediately said “QA,” which was top of her list as well.

In fact, QA was something I had struggled with at all my previous jobs. One incident will always stick with me — both because of the severity of the bug, and because I got paged in the monkey room at the Boston Museum of Science.

I was leading engineering for a logistics startup called Dispatch. My team (read: I) shipped a critical bug that prevented thousands of home service contractors from doing their jobs, and thousands more homeowners from getting the services they needed.

My boss was furious, and I needed to come up with a solution fast. No one at Dispatch had the time or expertise to do QA properly, or to set up automated tests to catch these issues before they affected customers. I hired a group of “experts” to write end-to-end tests, but after several months we still had low test coverage. Worse, our tests were slow to run and constantly broke, so we eventually gave up and turned them off.

Laura had similar experiences. One day she came home distraught after shipping a critical bug that prevented her company from signing up and caring for new patients. At the time, her company didn’t have any end-to-end tests that would have flagged the bug before it went out.

Later that year we decided to leave our jobs and focus on solving QA once and for all. We both understood the depth of the problem and knew we weren’t the only ones facing it. And in April 2019, QA Wolf was officially born.

Off we go… somewhere

Physically, that somewhere was Montana. We could stretch our savings a lot further outside of New York. But as far as our new QA solution was concerned… well, we got a little turned around for a while there.

Instead of letting customer problems guide us to the right technology, we let our dream technology paint us into a corner. 

You see, as young technologists c. 2019 we were captivated by the potential for Artificial Intelligence to solve every problem imaginable. (We still are, if we’re honest.) We fantasized about an AI that could explore an application, anticipate expected behaviors, and report the bugs that it found. And after six months of researching and prototyping, we were very close to maybe, one day, having something to show for our work.

To her incalculable credit, Laura finally threw down the gauntlet. She said she didn’t want to be an AI researcher: either we shipped a usable product that solved a tangible problem for someone real, or she was going to take another job.

It was the perspective shift we needed. And it would guide QA Wolf from there on out. We would listen to our users and solve the problems they needed solved.

Lesson 1: Whatever it is, ship something. 

The minimum viable command line interface 

The first problem-solving version of QA Wolf was an open-source CLI that generated basic Playwright test code by clicking around on whatever URL you entered. It wasn’t much, but it was our first opportunity to hear from users about the features and improvements that would solve their QA woes.
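
For a flavor of what that looked like, here’s a minimal sketch of a generated Playwright test. The page, selectors, and assertion are hypothetical, and the exact code our early CLI emitted differed, so treat it as illustrative only:

```ts
// Hypothetical example of generated end-to-end test code (illustrative only).
import { test, expect } from "@playwright/test";

test("user can sign in", async ({ page }) => {
  // The URL you entered when running the CLI
  await page.goto("https://example.com/login");

  // Actions recorded from your clicks and keystrokes
  await page.fill("#email", "user@example.com");
  await page.fill("#password", "secret-password");
  await page.click("text=Sign in");

  // An assertion confirming the expected behavior
  await expect(page).toHaveURL(/dashboard/);
});
```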

We learned a lot at this stage. Not just about the kind of features that would make QA easier for developers, but also about prioritization. We learned that we needed to focus on features that would scale the business instead of doing bespoke work for one or two users.

As we got more feedback, we realized that the most common issues people were having — installing the Node module, CI integration, and just generally troubleshooting test cases — could all be addressed if we moved to a hosted application.

There wouldn’t be anything to install, no CI integration required, and we could troubleshoot test cases just by getting the URL.

It’s what the users needed, so that’s what we did. 

Lesson 2: Ship something people will pay for.

QA Wolf moves to the cloud

This was a fantastically complicated technical challenge. We somehow had to spin up and stream a browser you could interact with from within your browser (very meta). We worked on several approaches until we found one that made the browser-in-browser experience as fast as visiting the site directly. 
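
To make the idea concrete, here’s a heavily simplified sketch of one way to stream a hosted browser’s screen to a client, using Chromium’s DevTools screencast through Playwright and a WebSocket server. It’s an assumption-laden illustration rather than our production implementation, and it omits relaying the user’s mouse and keyboard input back to the hosted browser:

```ts
// Sketch only: stream screencast frames from a hosted Chromium browser.
// Assumes the "playwright" and "ws" packages; input relay and error handling omitted.
import { chromium } from "playwright";
import { WebSocketServer } from "ws";

async function main() {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto("https://example.com");

  // Chrome DevTools Protocol session for screencasting (Chromium only)
  const cdp = await page.context().newCDPSession(page);

  const wss = new WebSocketServer({ port: 8080 });
  wss.on("connection", async (socket) => {
    cdp.on("Page.screencastFrame", async ({ data, sessionId }) => {
      socket.send(data); // base64-encoded JPEG frame for the client to render
      await cdp.send("Page.screencastFrameAck", { sessionId });
    });
    await cdp.send("Page.startScreencast", { format: "jpeg", quality: 60 });
  });
}

main();
```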

It was enough to attract some early customers, but adoption was slow: our few paying customers were only running a handful of test cases each when they needed hundreds to get the full value of automated testing.

We were forced to ask ourselves: Did we really understand the root problem?

As technical founders, Laura and I were leaning on our own experience with QA and automated testing to guide our product strategy. We had first-hand experience with the frameworks and tools available on the market and thought we had a pretty good handle on what teams needed to maintain high test coverage. 

But when we asked customers what was holding them back, the answer was simple: they didn’t have the time or expertise, and their internal priorities weren’t aligned with creating a complete test suite. They needed someone to do it for them.

The real problem with automated testing came down to resourcing. No matter how much bugs cost a business, no matter how much pain they cause customers, no matter how much teams want to deliver a superior customer experience, it is all but impossible for them to maintain high test coverage and ship new features quickly. And new features always come first.

Lesson 3: Look for the problem behind the problem.

QA Wolf as a Service

QA as a Service solves the problem we set out to solve: make automated testing easier and faster, so developers can ship with more confidence.

The fact that we built our platform first turns out to be a huge competitive advantage. The code-based tool lets us support complex products that no-code alternatives struggle with, and build comprehensive test suites that are beyond the reach of most teams.

We also prioritized infrastructure to enable full parallelization so hundreds of test cases can run in a few minutes without slowing down continuous deployment. 
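
As a rough illustration of what full parallelization can look like at the test-runner level (the details of our infrastructure don’t fit in a config file, so treat this as a generic sketch), a Playwright project can opt every test into concurrent execution:

```ts
// playwright.config.ts — illustrative configuration, not our production setup
import { defineConfig } from "@playwright/test";

export default defineConfig({
  fullyParallel: true, // run every test in every file concurrently
  workers: 8,          // parallel workers per machine; scale to taste
});
```

Suites can then be split across machines with Playwright’s built-in sharding, e.g. running `npx playwright test --shard=1/20` on each of twenty runners.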

But the real value of QA as a Service is rapidly developing new test cases as our customers release new features. Our world-class QA engineers provide 24-hour test triage, bug reporting, and advice so that teams can stay focused on their customers. 

Our combination of technology and people gives teams the rapid feedback they need to release quickly, knowing their code is fully tested. It’s allowed us to transform QA, the most neglected (but arguably most important) part of the software development cycle, into something that can be bought off the shelf. 
