I can still remember teachers telling my classes not to rely on Wikipedia as a source, because anyone can contribute information. That openness makes it possible for incorrect information to be added to the online encyclopedia, but Wikipedia does have policies and editors in place to prevent incorrect information from going up, and to remove it when necessary. Not all crowdsourced collections of information have such measures, and some cannot even support them, but researchers at the University of Southampton have put together a plan to bring this kind of verification to crowdsourcing.
The challenge is to make sure the volunteers contributing information are doing so in good faith and that those directly overseeing them are trustworthy. The researchers propose offering incentives to both the volunteers and those responsible for their content. If the incentives are awarded only when correct information is contributed, the volunteers will want to make sure their contributions are correct. Giving the overseers incentives as well encourages them to verify the information coming in from volunteers and to recruit trustworthy volunteers in the first place.
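The incentive logic described above can be sketched in a few lines of code. This is only a toy illustration of the general idea, not the researchers' actual mechanism: the reward amounts, the `settle` function, and the single-overseer setup are all hypothetical, chosen just to show how payouts tied to verified-correct contributions align everyone's interests.

```python
# Toy model of the incentive scheme: a volunteer is paid only when a
# contribution is both verified and correct, and the overseer is paid
# only for contributions they actually verified. Reward values and the
# structure here are hypothetical illustrations.

VOLUNTEER_REWARD = 10  # per verified-correct contribution (hypothetical)
OVERSEER_REWARD = 2    # per verified-correct contribution (hypothetical)

def settle(contributions):
    """contributions: list of (volunteer, is_correct, was_verified) tuples.
    Returns a dict mapping each participant to their total payout."""
    payouts = {}
    for volunteer, is_correct, was_verified in contributions:
        # Pay out only if the overseer checked the item AND it is correct,
        # so volunteers gain nothing from sloppy submissions and the
        # overseer gains nothing from rubber-stamping bad ones.
        if was_verified and is_correct:
            payouts[volunteer] = payouts.get(volunteer, 0) + VOLUNTEER_REWARD
            payouts["overseer"] = payouts.get("overseer", 0) + OVERSEER_REWARD
    return payouts

payouts = settle([
    ("alice", True, True),    # correct and verified: both are paid
    ("bob", False, True),     # verified but wrong: no one is paid
    ("carol", True, False),   # correct but unverified: no one is paid
])
print(payouts)  # {'alice': 10, 'overseer': 2}
```

Because the overseer earns nothing from incorrect or unchecked contributions, their best strategy under this toy scheme is to verify diligently and recruit volunteers who submit accurate information.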
As crowdsourcing becomes more popular for projects such as disaster relief, verification methods like this are increasingly important. Plus, having an incentive might get more people involved and accelerate a crowdsourced project.