Macroblog


Crowdsourcing: A Definition

  • I like to use two definitions for crowdsourcing:

    The White Paper Version: Crowdsourcing is the act of taking a job traditionally performed by a designated agent (usually an employee) and outsourcing it to an undefined, generally large group of people in the form of an open call.

    The Soundbyte Version: The application of Open Source principles to fields outside of software.

The Rise of Crowdsourcing

  • Read the original article about crowdsourcing, published in the June 2006 issue of Wired Magazine.


June 28, 2007

Comments

Alan

Hi Jeff, how did Jay arrive at the 28% success rate and what did he consider to be the obvious failure points?

The interviews “were” refreshing insofar as they were devoid of the editorial molding or spin that, as you pointed out, so often permeates mainstream content! I gave up eating or reading anything that was overly processed years ago.

I am surprised, or maybe just naïve, that there is any question about the advantages and qualitative differences between face-to-face interviews and the other options mentioned in the Washington Post piece. Right off the bat, the lead sentence “The humble interview, the linchpin of journalism for centuries, is under assault” rests on an upside-down premise!

Where is the assault? Unshackled, perhaps, or at the very least free from the impurities that come to mind when considering the debate around Mr. Murdoch’s most recent bid for the WSJ.

“Is old media dead, or is the blogosphere just a flash in the pan?” This is obviously a rhetorical question. If the blogosphere is a flash in the pan, then the traditional media kitchens are already smoldering and one awaits the resulting inferno!

The AZ attempt at the CS project brings some much-needed balance to the fore! On that score alone, the percentage points should be much, much higher.

Cheers, Alan.

David Cohn

The 28 percent success rate was a bit of a random number. Obviously there was no real quantitative analysis done.

When we first started the project, Jay had envisioned finishing around 100 or more full articles. That's a tall order. If you look at the final product, we have 80 full Q&As (not full stories) -- which, as Jeff pointed out, are pretty good -- and we have 7 full-length feature stories. We also have a sprinkling of stories (I'd say maybe another 4-5) that could have become feature stories with a bit more time.

All that said, AZ was not an outright success. But as you pointed out, Jeff -- its failure was beautiful. Not only because it was not an outright failure, but also because the parts of the project that didn't work are very clear. Both in its application of crowdsourced journalism and in the content of the Q&As themselves, AZ is a great resource on how to organize, manage, and take to task community/collaborative projects.

Lars

You can now read Wired's blogs on cell phones by entering 'clfy.net/wired' in the phone's web browser.

Daren C. Brabham

I'm glad the AZ project was as successful as it was. I wholeheartedly agree that there was mission creep with the project, Jeff. In my research, I've made a point of trying to distinguish between crowdsourcing, open source, contests, collaborations that just happen to be online, and other types of "Web 2.0-ness." So, I was admittedly a bit turned off by how quickly AZ seemed to lump all of the previously mentioned concepts/models/theories under the label "crowdsourcing." Crowdsourcing is a production and problem-solving model that, while similar in some ways to some of the basic underpinnings of open source, is not open source. Different motives, different format, different ethics, different power base.

I do think there are some gems among the interviews, from what I read. I look forward to the interviews you post here, Jeff.

mhh5

So... you speak of both failure and success... but give examples of neither. Where did this experiment fail? Just in the quantity of work it collected? Or also in the quality of the content it created?

Where did this project succeed? In that it created any content at all from volunteers? The unexpected quality of work? The level of cooperation between strangers?

Daren C. Brabham

Jeff has much more insider knowledge about the AZ project and its failures and successes, but I can answer your questions, mhh5, to some degree:

I read most of the interviews and edited a few of them. Several of the Q&As are insightful and provide compelling anecdotes, advice, experiences, and opinions on the greater Web 2.0 phenomenon. There were a noticeable number of Q&As, however, that really didn't say anything interesting. This was, as far as I could tell, a mixture of poorly conceived questions from the interviewers and uninteresting answers from some of the interviewees.

I think Jeff and others see the project as a failure to some degree because the objective--to produce a lot of feature stories--wound up yielding mostly Q&As. Conducting a good interview is certainly a skill some people have and some people don't, but at the core of journalism is the ability to write a fresh, interesting, newsworthy story. And so, AZ's attempt to be a crowdsourced journalism project was more like sending out the yearbook staff to collect interviews in Q&A format than something that could rival the NY Times or Newsweek. The Q&As weren't on the level of Playboy or Esquire, either, so...I guess the project is a success at face value, in that it did uncover a few nuggets of wisdom about this whole new Web phenomenon.

My beef with the project, as a scholar of crowdsourcing, is that AZ too quickly lumped crowdsourcing, open source, collaborative art projects, and so on under the umbrella term of "crowdsourcing." As I've stated many times, and as a forthcoming paper I've written ("Crowdsourcing as a Model for Problem Solving: An Introduction and Cases," Convergence, 14.1, Feb. 2008) makes clear, crowdsourcing is a new kind of thing. It can be lumped together with all those other phenomena under the umbrella term of "Web 2.0," but those other phenomena ain't crowdsourcing. (By the way, I see Web 2.0 as a term to describe the increasingly user-productive nature of the Web in recent years, now that the Web has become a ubiquitous thing in industrialized nations.)

Based on what we know about how some of the successful ventures in crowdsourcing work, I think it is safe to say AZ's failures come from the giant scope of its problem and its failure to clarify the problem to guarantee a predictable range of inputs--in terms of quantity, quality, and form--from the crowd. It was just too big and undefined.

Still, though, there's some value in an undefined problem. It draws out an unrestrained spread of creative solutions. But that collective of solutions is not like crowdsourcing, which is a genius model that seeks out top-notch, quality solutions.

Alan

Great comments, Daren; the third sentence of your last paragraph had me somewhat stumped, though!

Why are the collective solutions not like crowdsourcing? The solutions that in this case arose from “the giant scope of its problem and its failure to clarify the problem to guarantee a predictable range of inputs” are the natural/organic result of the above described shortcomings.

Why a “genius model that seeks out top-notch, quality solutions,” Daren?

I see the emphasis on the word unrestrained. Originality appears to be the one element that should come to the fore through a CS process. When originality rises to the top it does not presuppose any particular standard. Are you applying the term genius to the collective end result as in “wisdom of the crowds?”

I have no intention to quibble, but top-notch, quality solutions might indeed be one part of the resulting outcome of any particular venture. The outcomes in this case, with the AZ project, appear to prove the point that the end result did indeed fall short of expectations despite the intention to seek out top-notch, quality solutions.

Jeff’s definition, “Crowdsourcing is the act of taking a job traditionally performed by a designated agent (usually an employee) and outsourcing it to an undefined, generally large group of people in the form of an open call,” appears to fit the bill, so why is it not like CS?

The architecture and results of participation, in this case, look to me like a beautiful sandcastle rather than the mansions that Flickr or Wikipedia have built.

Cheers, Alan.

Alan

AZ breathtaking!
Dan Gillmor offers a year-on-year progress report on the state of citizen journalism.
http://citmedia.org/blog/2007/07/15/citizen-media-a-progress-report-2/

mhh5

Thanks for the recap, Daren.

I guess it's not surprising that content that requires a singular "voice" may not be the best target for a crowdsourcing project. There's likely a category for projects that are suited for "creation by committee" -- and it may not include journalism, novels, etc.

Perhaps CS only works for content creation like Wikipedia, recipe books, and collections like YouTube...? Where the audience expects a variety of voices and doesn't mind a bit of quality variance.



The comments to this entry are closed.