Interviews

This is the second half of our two-part discussion on crowdsourcing with uTest’s VP of Marketing and Community, Matt Johnston.  The first installment of our interview is here.

So, you’ve gathered a bunch of information and data from your users.  How do you address issues of IP and who owns the content, both from the perspective of your client companies and from your users?  How do you protect the rights of both of those groups?  Does either group give up anything by using your model that it wouldn’t have to otherwise?

That’s a legitimate issue for crowdsourcing companies that are creating something new – for example, in categories like copywriting, design, development and animation.  In such instances, companies like Guru, crowdSPRING or GeniusRocket have to think carefully about fairness and address these issues head on and up front.  Clear communication (and consistent enforcement) of the rules is vital to maintaining a level playing field in a B2B crowdsourcing model.

In the case of uTest, we don’t have to worry about issues of IP ownership because the customer created the apps and simply wants to test them with our community.  The IP issues that are sometimes a concern for new customers are around IP protection.

Again, in this case our job is to establish the ground rules that protect everyone, and then to communicate them clearly and consistently.  In the case of uTest, the client created the app and they own it.  Testers have no rights to the app or to any of the bugs that they discovered within that app.  Beyond that, companies can require testers to sign an NDA and testers are forbidden from sharing any information about customers, their apps, or any bugs outside the uTest platform (including blogs, message boards, Twitter, Facebook, etc.).

You’ve spoken about the 90-9-1 rule, which says that 90% of your community will be passive users, 9% will be active and engaged, and 1% will be the “stars.”  Can you shed some light on these numbers and how each group can add value to the process?

We had a great discussion about this at the recent TiE event!  The 90-9-1 rule states that 90% of community members will be “lurkers,” 9% will be part-time contributors and 1% will fully adopt and become “super users.”  While the math changes from crowd to crowd, the underlying premise does not:  the vast majority of the contribution or participation will come from your top users.  This isn’t just true in crowdsourcing; it’s true on message boards, social networks, blogging services, and most online user bases.
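
To make that lopsidedness concrete, here’s a minimal sketch of the arithmetic.  The community size echoes uTest’s, but the per-member contribution rates are invented purely for illustration:

```python
# Hypothetical illustration of the 90-9-1 rule: modest activity from the
# top 1% still dominates total output. Rates are invented for the example.
community_size = 20_000

segments = {
    # segment name: (share of community, assumed contributions per member/month)
    "lurkers (90%)":     (0.90, 0),
    "contributors (9%)": (0.09, 2),
    "super users (1%)":  (0.01, 40),
}

total = 0
for name, (share, rate) in segments.items():
    members = int(community_size * share)
    contributions = members * rate
    total += contributions
    print(f"{name:20s} {members:6d} members -> {contributions:5d} contributions")

print(f"total contributions/month: {total}")
```

With these assumed rates, the 1% of “super users” (200 people) produces 8,000 contributions per month – more than double the 3,600 from the other 99% combined.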

If you’re building an ad-based revenue model, you may not care, because even “lurkers” generate page views.   However, if you’re building a transaction-oriented revenue model like uTest is, you need participation in order for a community member to make a direct contribution.

Thus, one of uTest’s keys to success is to identify those future “stars” early on so we can nurture them along the path to success.  We recognize that some testers join our community because they want to use their testing skills to earn money, while others join because they want to network with their peers and consume the educational content that we provide.  We welcome all testers into our community, but we make a special effort to engage those who want to participate in projects.

We talk a lot about “inside-the-transaction” vs. “outside-the-transaction” types of participation.   The former refers to testers who are participating in projects, reporting bugs and getting paid (as well as building their reputation).  The latter refers to those who don’t participate in projects, but could still make meaningful contributions – by subscribing to our newsletter, commenting on our blog, participating in our forums, or contributing content, ideas and expertise to other members of our community.  Both types of participation are valuable to uTest’s long-term future, but the wants and needs of these two groups are unique, so we cater to each accordingly.

Your company didn’t just come to the decision to implement crowdsourcing one day, you actually built the entire company on that model specifically.  It’s in the name - uTest.  Do you feel you could have gotten similar results had you been offering the same end-products through more traditional methods of production/testing?  Would they have been worse?  Could things be better using a different model?

If we tried to do this with a traditional model, the results wouldn’t even be close to what we’ve achieved to date.  It would be impossible from a logistics or financial perspective for us to have built the level of testing coverage that we have today.  We’re doing testing for some of the top software companies in the world in our first year of operations – no small feat.  And we couldn’t do it without our community and our investment in our community.

The testing needs in the world of software have become exceedingly complex. Companies now have to test their apps across locations, languages, operating systems and browsers, as well as handset makers and models, and wireless carriers.  This is a prohibitively expensive task for even the most mature and sophisticated companies.  And crowdsourcing is uniquely suited to meet the challenges of software testing.  That underlying belief is what prompted our co-founders to build uTest based on a crowdsourcing model – and what led me to join the company.
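
The combinatorial explosion behind that claim is easy to quantify.  Here’s a minimal sketch – the counts per dimension are invented purely to illustrate the scale:

```python
from math import prod

# Hypothetical counts for each dimension of a test matrix; real projects
# vary, but the multiplication is the point.
dimensions = {
    "locations": 10,
    "languages": 8,
    "operating systems": 5,
    "browsers": 6,
    "handset models": 20,
    "wireless carriers": 4,
}

combinations = prod(dimensions.values())
print(f"full matrix: {combinations:,} configurations")   # 192,000

# Even sampling 1% of the matrix at one tester-hour per configuration
# is ~1,920 hours -- far beyond a typical in-house QA team's capacity.
print(f"1% sample: {combinations // 100:,} configurations")
```

A distributed crowd that already owns those devices, speaks those languages and lives in those locations sidesteps most of that cost, which is the fit being described here.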

About uTest

uTest is the world’s largest marketplace for software testing services. The company provides real-world testing services through its community of 20,000+ professional testers from 158 countries around the world. Hundreds of companies - from web startups to global software leaders - have joined the uTest marketplace to get their web, desktop and mobile applications tested. More information can be found at http://www.utest.com or the company’s Software Testing Blog at http://blog.utest.com.

We’ve interviewed Matt Johnston, VP of Marketing and Community at uTest, to discuss some of the concepts around the use of crowdsourcing.  This is the first installment of that discussion.

uTest provides software testing services for web, desktop and mobile apps.  The company offers this testing to companies of all sizes – from five-person startups to Fortune 500 enterprises – by leveraging its community of 20,000+ QA professionals from 158 countries around the globe.

You have a pretty well-defined explanation of what crowdsourcing is to you - and what it isn’t.  In fact, I think you said that you had two big criteria that have to be met for a process to be considered crowdsourcing.  Can you talk a bit about what this definition includes?  What doesn’t it include?

That’s a good question – and one that’s hotly debated in crowdsourcing circles.  From my perspective, the first criterion is that something of substance needs to be sourced. I’ve heard people say that they’re going to “crowdsource a question” and then simply publish a one-time poll with a single question – that’s not crowdsourcing to me, it’s glomming onto a media-friendly term.

Whether it’s for professional services like copywriting, photography, design, development or software testing, I think there’s some minimum level of contribution and value-add that’s required before you can say that something has been “sourced” (whether it be in-sourced, outsourced or crowdsourced).  I’m not trying to define exactly where that minimum lies, but I know that there’s a line.

The second criterion is that the output comes from a “crowd,” not just a single individual.  There are many outstanding companies that use “onesourcing” models to great success (e.g., Elance, Guru, OnForce).  And in many categories, what the customer wants may be to find that one person who precisely matches their needs, as opposed to a team of people or collaborators.  I simply don’t consider that crowdsourcing.  In other categories and companies, the deliverable requires a team of people making a contribution (e.g., uTest, Trada) and the overall deliverable is the sum of the contributions.

When it comes to implementing the crowdsourcing model, there are a lot of things to think about: control issues, organization, data gathering, etc.  What were some of the things uTest felt were the most important to address, and how did you address them?

That’s the right question, because a common misconception is that crowdsourcing is easy… just build a crowd, add water, and watch the revenues roll in!   A few hard-earned, hard-learned lessons that I’ve picked up:

  1. This will sound odd coming from a marketing guy, but the first key to building a successful crowdsourcing model is structured data.  The truth is that it’s not too tough to build a loosely assembled, unstructured crowd – what we call a “mob”.  uTest’s plan, however, was to advance far beyond that and to create a rated, ranked community of professional testers.   We use structured data to ensure that we can precisely match each customer’s project with the right testers, based upon location, languages spoken, hardware, software, expertise and, above all else, past performance (a sketch of what such matching might look like follows this list).  If you’re thinking about a crowdsourced model, structured data is the foundation upon which you’ll build it.
  2. Recognize that you have two sets of customers – in our case, we have software companies and we have our community of testers. Most businesses are hardwired to think of their customer base as their sole stakeholder.  The other constituencies (like vendors, employees, etc.) are there to serve the customer.  In a crowdsourcing model, you must realize that you have two distinct sets of customers to whom you need to cater.  Serving both audiences well on an ongoing basis is a difficult, but wholly necessary, balancing act.
  3. There are two types of currency in an online community: money and reputation. Of course, money serves as a primary motivator, but the transparency and accountability of crowdsourcing motivates community members to build strong reputations. The net effect is that top performers get promoted and poor performers get demoted.  In both the near- and long-term, this means that you’re simultaneously rewarding your best community members and better serving your clients.
  4. The final lesson is that crowdsourcing is a model – an approach to solving a problem.  It’s not a solution in and of itself.  There are some verticals where it’s a great fit, and others where it’s not.  If you’re considering incorporating crowdsourcing into your existing business (or launching a new business based upon this model), make sure that the space and the problem you’re solving are a good fit.
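
Here is the matching sketch promised in point 1 above.  It’s a toy, not uTest’s actual system: the profile fields, hard requirements and scoring weights are all assumptions, chosen only to show how structured data turns a “mob” into a rankable pool:

```python
# Toy matcher: rank testers for a project using structured profile data.
# Field names, requirements and weights are hypothetical, for illustration.
from dataclasses import dataclass

@dataclass
class Tester:
    name: str
    location: str
    languages: set
    devices: set
    past_rating: float  # 0.0-5.0, built from performance on prior projects

def match_score(tester: Tester, project: dict) -> float:
    """Return a suitability score; -1.0 means the tester is ineligible."""
    # Hard requirements: location and at least one shared language.
    if project["location"] not in (tester.location, "any"):
        return -1.0
    if not tester.languages & project["languages"]:
        return -1.0
    # Soft scoring: device overlap, with past performance weighted heaviest
    # ("above all else, past performance").
    device_overlap = len(tester.devices & project["devices"])
    return device_overlap + 2.0 * tester.past_rating

testers = [
    Tester("A", "DE", {"de", "en"}, {"android", "ios"}, 4.8),
    Tester("B", "US", {"en"}, {"android"}, 3.9),
]
project = {"location": "any", "languages": {"en"}, "devices": {"android"}}

ranked = sorted((t for t in testers if match_score(t, project) >= 0),
                key=lambda t: match_score(t, project), reverse=True)
print([t.name for t in ranked])  # best match first: ['A', 'B']
```

The design choice the sketch illustrates is separating hard eligibility filters from soft ranking signals, so past performance can dominate among everyone who qualifies.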

Management of the crowdsourcing model can seem - well - unwieldy and chaotic to people thinking about starting out.  How have you handled this aspect of the process?  What are some key things to do or not to do?

This question made me laugh, because I was just debating this with a friend who runs a crowdsourcing company on the West Coast.  Ultimately, your clients aren’t coming to you because they want crowdsourcing (really, they’re not).  They’re coming to you because they have a problem to be solved, the traditional, status quo solutions aren’t cutting it, and they’re exploring alternatives.

Managing a crowdsourcing solution can seem unfamiliar to clients.  In those cases, it’s critical that the crowdsourcing company offer services and tools to help clients climb the learning curve.  You’re really competing with the client’s muscle memory (“I’m used to doing things in a certain way and vendors need to adapt to my needs …”).

Here are the steps that I’d follow if I were thinking about creating a crowdsourcing company or offering:

  1. Start by looking carefully at the problem you’re trying to solve and the audience for whom you’re trying to solve it.
  2. Understand how they’ve historically solved this problem.
  3. Discover what works and doesn’t work for them with these traditional solutions.
  4. At this point, you have a decision to make:
    1. Reduce the management burden on your customers to the point that using crowdsourcing is no more time-consuming than managing the traditional solutions, or
    2. Convince the world that your crowdsourcing solution is vastly more effective (aka: better) or efficient (aka: cheaper) to the point that it’s worthwhile for them to learn a new motion.

More specifically, uTest has identified and leveraged the “reputation” system in its model.  How have you done that, and how has it worked to your advantage in having an effective community?

The reputation system behind a crowdsourcing model – or any community/marketplace model – is critically important.  As stated previously, there are two types of currency in a healthy crowdsourcing model:  money and reputation.

Of course, money serves as a primary motivator, but a well-designed reputation system creates effective incentives (and disincentives) for community behavior.  Among a community of peers, how one is perceived is a powerful driver.  For example, if everyone can see that I’ve achieved silver status, then I’m proud of my accomplishments AND I’m driven to achieve gold status.

As for the particulars of a reputation system (invisible vs. visible, stars vs. points, etc.), there is no single right answer.  Ultimately, it depends upon the space that you occupy and the make-up of your community.  For example, eBay reveals the reputations of buyers and sellers in order to promote good behavior.  Conversely, Google’s search algorithm (which is really just a reputation system for websites that tries to match the right sites with each query) is kept secret in order to prevent people from gaming the system.
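
As a toy illustration of the visible-tier approach described above – the silver-chasing-gold dynamic – here’s a sketch in which the tier names, thresholds and point awards are entirely invented:

```python
# Toy visible reputation system: points accumulate from rated work, and
# public tiers give members a status to display and a next rung to chase.
# Tier thresholds and point awards are invented for illustration.
TIERS = [("gold", 1000), ("silver", 400), ("bronze", 100)]

def tier_for(points: int) -> str:
    """Return the highest tier a member qualifies for."""
    for name, threshold in TIERS:  # checked from highest tier down
        if points >= threshold:
            return name
    return "unrated"

def award(points: int, bug_severity: str) -> int:
    """Add points for an approved bug report; higher severity earns more."""
    awards = {"low": 5, "medium": 15, "high": 40}
    return points + awards[bug_severity]

points = 380                      # just below the silver threshold
points = award(points, "high")    # 420: one approved high-severity bug
print(tier_for(points))           # "silver" -- and gold (1000) is in sight
```

Because the tiers are public, the incentive works exactly as described: a member who can see their silver badge also sees the distance to gold.  In a system that must resist gaming – the Google example – you’d hide the thresholds and scoring instead.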

The real key is to align your incentives (both monetary and reputation) with the desired behaviors.  Before we ever build a new feature or make a change in our policies or pricing, we talk a great deal about which behaviors we’re incenting or disincenting.

We’ll be posting the second half of our interview in the coming days, so be sure to check back!  Do you have any experiences with crowdsourcing?

[Update: The second half of our interview is up]
