A discussion on crowdsourcing: Interview with Matt Johnston Part I

by Kate Brodock on 14 October 2009

Posted in: Interviews

We’ve interviewed Matt Johnston, VP of Marketing and Community at uTest, to discuss some of the concepts around the use of crowdsourcing.  This is the first installment of that discussion.

uTest provides software testing services for web, desktop and mobile apps.  The company offers this testing to companies of all sizes – from five-person startups to Fortune 500 enterprises – by leveraging its community of 20,000+ QA professionals from 157 countries around the globe.

You have a pretty well-defined explanation of what crowdsourcing is to you - and what it isn’t.  In fact, I think you said that you had two big criteria that have to be met for a process to be considered crowdsourcing.  Can you talk a bit about what this definition includes?  What doesn’t it include?

That’s a good question – and one that’s hotly debated in crowdsourcing circles.  From my perspective, the first criterion is that something of substance needs to be sourced. I’ve heard people say that they’re going to “crowdsource a question” and then simply publish a one-time poll with a single question – that’s not crowdsourcing to me; it’s glomming onto a media-friendly term.

Whether it’s for professional services like copywriting, photography, design, development or software testing, I think there’s some minimum level of contribution and value-add that’s required before you can say that something has been “sourced” (whether it be in-sourced, outsourced or crowdsourced).  I’m not trying to define what this minimum criterion is, but I know that there’s a line.

The second criterion is that the output comes from a “crowd”, not just a single individual.  There are many outstanding companies that use “onesourcing” models to great success (e.g. Elance, Guru, OnForce).  And in many categories, what the customer wants may be to find that one person who precisely matches their needs, as opposed to a team of people or collaborators.  I simply don’t consider that crowdsourcing.  In other categories and companies, the deliverable requires a team of people making a contribution (e.g. uTest, Trada) and the overall deliverable is the sum of the contributions.

When it comes to implementing the crowdsourcing model, there are a lot of things to think about: control issues, organization, data gathering etc.  What were some of the things uTest felt were the most important to address and how did you address them?

That’s the right question, because a common misconception is that crowdsourcing is easy… just build a crowd, add water, and watch the revenues roll in!   A few hard-earned, hard-learned lessons that I’ve picked up:

  1. This will sound odd coming from a marketing guy, but the first key to building a successful crowdsourcing model is structured data.  The truth is that it’s not too tough to build a loosely assembled, unstructured crowd – what we call a “mob”.  uTest’s plan, however, was to advance far beyond that and to create a rated, ranked community of professional testers.  We use structured data to ensure that we can precisely match each customer’s project with the right testers, based upon location, languages spoken, hardware, software, expertise and, above all else, past performance.  If you’re thinking about a crowdsourced model, structured data is the foundation upon which you’ll build it (a rough sketch of this matching idea follows the list).
  2. Recognize that you have two sets of customers – in our case, we have software companies and we have our community of testers. Most businesses are hardwired to think of their customer base as their only stakeholders.  The other constituencies (like vendors, employees, etc.) are there to serve the customer.  In a crowdsourcing model, you must realize that you have two distinct sets of customers to whom you need to cater.  Serving both audiences well on an ongoing basis is a difficult, but wholly necessary, balancing act.
  3. There are two types of currency in an online community: money and reputation. Of course, money serves as a primary motivator, but the transparency and accountability of crowdsourcing motivates community members to build strong reputations. The net effect is that top performers get promoted and poor performers get demoted.  In both the near- and long-term, this means that you’re simultaneously rewarding your best community members and better serving your clients.
  4. The final lesson is that crowdsourcing is a model – an approach to solving a problem.  It’s not a solution in and of itself.  There are some verticals where it’s a great fit, and others where it’s not.  If you’re considering incorporating crowdsourcing into your existing business (or launching a new business based upon this model), make sure that the space and the problem you’re solving are a good fit.
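
To make the structured-data point in item 1 a bit more concrete, here is a minimal sketch (in Python) of how rated, structured tester profiles might be matched against a project’s requirements.  The field names, matching rules and ranking are illustrative assumptions only – not uTest’s actual schema or algorithm.

    # Illustrative only: hypothetical tester profiles and a simple matcher.
    from dataclasses import dataclass

    @dataclass
    class TesterProfile:
        name: str
        country: str
        languages: set        # e.g. {"English", "Spanish"}
        devices: set          # e.g. {"iOS", "Android", "Windows"}
        expertise: set        # e.g. {"functional", "security", "load"}
        past_rating: float    # 0.0-5.0, built up from prior projects

    @dataclass
    class Project:
        countries: set
        languages: set
        devices: set
        expertise: set

    def match_testers(project, testers, limit=10):
        """Keep testers who meet the project's hard requirements,
        then rank the survivors by past performance."""
        eligible = [
            t for t in testers
            if t.country in project.countries
            and t.languages & project.languages      # at least one shared language
            and t.devices & project.devices          # at least one required device
            and t.expertise & project.expertise      # at least one relevant skill
        ]
        return sorted(eligible, key=lambda t: t.past_rating, reverse=True)[:limit]

The point of the sketch is simply that filtering and ranking only become possible once the profile data is structured and comparable – a loosely assembled “mob” gives you nothing to match on.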

Management of the crowdsourcing model can seem - well - unwieldy and chaotic to people who are just thinking about getting started.  How have you handled this aspect of the process?  What are some key things to do or not to do?

This question made me laugh, because I was just debating this with a friend who runs a crowdsourcing company on the West Coast.  Ultimately, your clients aren’t coming to you because they want crowdsourcing (really, they’re not).  They’re coming to you because they have a problem to be solved, the traditional, status-quo solutions aren’t cutting it, and they’re exploring alternatives.

When it comes to managing crowdsourced projects, the process can seem unfamiliar to clients.  In those cases, it’s critical that the crowdsourcing company offers services and tools to help clients climb the learning curve.  You’re really competing with the client’s muscle memory (“I’m used to doing things in a certain way and vendors need to adapt to my needs …”).

Here are the steps that I’d follow if I were thinking about creating a crowdsourcing company or offering:

  1. Start by looking carefully at the problem you’re trying to solve and the audience for whom you’re trying to solve it.
  2. Understand how they’ve historically solved this problem.
  3. Discover what works and doesn’t work for them with these traditional solutions.
  4. At this point, you have a decision to make:
    1. Reduce the management burden on your customers to the point that using crowdsourcing is no more time-consuming than managing the traditional solutions, or
    2. Convince the world that your crowdsourcing solution is vastly more effective (aka: better) or efficient (aka: cheaper) to the point that it’s worthwhile for them to learn a new way of working.

More specifically, uTest has identified and leveraged the “reputation” system in its model.  How have you done that, and how has it worked to your advantage in building an effective community?

The reputation system behind a crowdsourcing model – or any community/marketplace model – is critically important.  As stated previously, there are two types of currency in a healthy crowdsourcing model:  money and reputation.

Of course, money serves as a primary motivator, but a well-designed reputation system creates effective incentives (and disincentives) for community behavior.  Among a community of peers, how one is perceived is a powerful driver.  For example, if everyone can see that I’ve achieved silver status, then I’m proud of my accomplishments AND I’m driven to achieve gold status.

As for the particulars of a reputation system (invisible vs. visible, stars vs. points, etc.), there is no single right answer.  Ultimately, it depends upon the space that you occupy and the make-up of your community.  For example, eBay reveals the reputations of buyers and sellers in order to promote good behavior.  Conversely, Google’s search algorithm (which is really just a reputation system for websites that tries to match the right sites with each query) is kept secret in order to prevent people from gaming the system.

The real key is to align your incentives (both monetary and reputation) with the desired behaviors.  Before we ever build a new feature or make a change in our policies or pricing, we talk a great deal about which behaviors we’re incenting or disincenting.
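
To illustrate what aligning incentives with desired behaviors might look like mechanically, here is a deliberately simplified sketch (in Python) of a visible, tiered reputation system.  The point values, thresholds and tier names are invented for illustration only; they are not uTest’s actual formula.

    # Illustrative only: points earned from accepted work, weighted by the
    # client's rating, then mapped to a publicly visible tier.
    TIERS = [(0, "unrated"), (100, "bronze"), (500, "silver"), (2000, "gold")]

    def project_points(bugs_accepted: int, client_rating: float) -> float:
        """Reward accepted work, scaled by the client's 1-5 rating."""
        return bugs_accepted * 10 * (client_rating / 5.0)

    def tier_for(points: float) -> str:
        """Return the highest tier whose threshold has been reached."""
        current = TIERS[0][1]
        for threshold, name in TIERS:
            if points >= threshold:
                current = name
        return current

    # Example: 40 accepted bugs at an average client rating of 4.2
    print(tier_for(project_points(40, 4.2)))   # -> "bronze" (336 points)

Whether the resulting tier is shown publicly (as eBay does) or kept internal (as Google does with its ranking signals) is a separate design decision; the scoring logic works either way.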

We’ll be posting the second half of our interview in the coming days, so be sure to check back!  Do you have any experience with crowdsourcing?

[Update: The second half of our interview is up]

About uTest

uTest is the world’s largest marketplace for software testing services. The company provides real-world testing services through its community of 20,000+ professional testers from 158 countries around the world. Hundreds of companies - from web startups to global software leaders - have joined the uTest marketplace to get their web, desktop and mobile applications tested. More information can be found at http://www.utest.com or the company’s Software Testing Blog at http://blog.utest.com.
