NASA, the National Aeronautics and Space Administration, is in the final days of a crowdsourced project to modify and improve the email server of the International Space Station. With the help of the firm Topcoder, it developed a contest called the “Astronaut Email Challenge”. Arguably, this project is unusual for NASA, even though the space agency was involved in crowdsourcing long before we coined the word “crowdsourcing” or knew what it meant. When viewed in light of NASA’s history with crowdsourcing, the Astronaut Email Challenge illustrates how organizations have to structure and discipline crowdsourced processes. It also suggests how all forms of crowdsourcing are ultimately searches for expertise.
We can trace NASA’s interest in crowdsourcing back to its predecessor, the National Advisory Committee for Aeronautics (NACA). In its earliest days, NACA held contests to encourage amateurs to tackle challenging problems in flight and to promote the aerospace industry. In the 1920s, it sponsored miniature aircraft contests for students. In the 1930s, it helped sponsor contests for women pilots, contests that included races and bombing competitions.
All of these contests were lightly managed activities designed to promote the agency and aeronautics in general. NASA has continued this kind of work in more recent efforts, such as the recently completed challenge to design a Mars Ascent Vehicle. Fundamentally, it was a student contest that attempted to identify and strengthen aerospace skills among the next generation of engineers. The contest didn’t require the participants to build an actual Mars Ascent Vehicle. Instead, they were asked to create a small rocket that would perform a specific set of tasks and meet a detailed set of specifications. The contest publicized NASA’s interest in Mars and identified a group of potential aerospace engineers at colleges both big and small.
While the Astronaut Email Challenge is also a contest, it is a contest intended to produce functioning modifications to the Microsoft Outlook/Exchange Server software operating on the International Space Station. As a result, it is more tightly managed than the standard student contests. NASA has contracted with the crowdsourcing firm Topcoder, which is running the contest against a detailed set of specifications, timelines, and tests. At the end of the challenge, which will occur in mid-May, Topcoder should be able to present NASA with a functional set of modifications to its operational software.
While these contests are often presented as ways of creating innovation, they are really addressing the issue of managing expertise. They don’t do much to expand national capacity or establish new companies in the field. Instead, such a contest assembles a group of workers with a broad set of skills that begins with a knowledge of the commercial Microsoft software and ends with an understanding of the NASA communication system. It guides them through the project, tests their work, and rewards their accomplishments.
While team members are not hard to find, they are not always found in obvious places. Many a software firm has an employee with underused skills. Such employees once mastered the details of one software system only to discover that they rarely have a chance to use those skills. To earn a living, they turn to more common work. Yet, they retain their specialized expertise and would like to use it, if only given a chance.
For projects such as this NASA email challenge, crowdsourcing is a replacement for conventional management. It is a management structure that operates outside of the world of established firms and traditional procurement contracts. If NASA had managed this project in a conventional way, it would have had to identify a firm with the proper expertise, verify that the firm could do what it claimed, and create detailed specifications and contracts for the modifications to the email system.
The final deadline for the Astronaut Email Challenge is still a few weeks away, and we probably won’t know the outcome until several more weeks have passed. The reports on the intermediate goals suggest that the project is finding the expertise it needs. While the result may not radically change the Space Station, it will at least show how crowdsourcing can fit into a project that is large, complicated, and managed by a strict hierarchy. Even under these circumstances, crowdsourcing can be used to identify expertise and manage a team with specialized skills.
For more information, visit NASA’s Astronaut Email Challenge page.
About David Alan Grier
David Alan Grier is the author of Crowdsourcing for Dummies and When Computers Were Human. He is an associate professor at the Center for International Science & Technology Policy at George Washington University. He can be reached at firstname.lastname@example.org.
Read more from David Alan Grier on the Owler Blog.