Last month, SciCast joined ACS for a webinar, Forecasting Chemistry: Predicting Tomorrow’s Cutting Edge Science, Today.
SciCast has been featured in a Wall Street Journal article about crowdsourced forecasting in the U.S. intelligence community. We’re excited to share that SciCast now has nearly 10,000 participants, a 50% increase in the last two months – an important achievement for a crowdsourced prediction site.
Have you ever wondered what the next ‘big thing’ in technology will be? What if you could tap the collective wisdom of your peers – people around the world who are interested in the same topics as you?
Don’t miss two unique opportunities to learn more about how you can do this on SciCast (www.scicast.org), the largest known science and technology-focused crowdsourced forecasting site.
SciCast will be the featured topic in a Reddit Science AMA and an American Chemical Society webinar this week! Don’t miss these opportunities to share your SciCast expertise and weigh in on the discussion. We also encourage you to share the information with your friends and colleagues.
Lynda Baldwin – 708-703-8804
Candice Warltier – 312-587-3105
FOR IMMEDIATE RELEASE
SciCast Calls for Science, Technology Experts to Make Predictions
Largest sci-tech crowdsourcing forecast site in search of professionals and enthusiasts to predict future events
FAIRFAX, Va. (June 19, 2014) – SciCast, a research project run by George Mason University, is the largest known science and technology-focused crowdsourced forecasting site. So what makes a crowdsourced prediction market more powerful? An even bigger crowd. SciCast is launching its first worldwide call for participants to join the existing 2,300 professionals and enthusiasts ranging from engineers to chemists, from agriculturists to IT specialists.
On SciCast, we’ve posted three questions about the missing plane. Can crowdsourcing help to locate it?
Dr. Charles Twardy, Project Principal, explains the different ways to crowdsource a search. “When a community turns out to help look for a lost child, that’s crowdsourcing,” he says. “The community volunteers typically aren’t as well-prepared as the search teams, but when directed by experienced Field Team Leaders, they can greatly extend the search effort. Similarly, experimental micro-tasking sites like TomNod.com let volunteers help search piles of digital images. Call it the effort of the crowd. SciCast is about the wisdom of the crowd: weighing the vast amounts of uncertain and conflicting evidence to arrive at a group judgment of, say, the relative chances of several regions or scenarios. This could be as simple as an average – a robust method with much to recommend it when judgments are independent. Or it could be something more advanced, like SciCast’s combinatorial prediction market. A market reduces double-counting, and may be better suited to the case where most of us are just mulling over the same information, but a few have real insight. The trick is to find a large and diverse crowd, and persuade them to participate.”
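To make the contrast concrete, here is a minimal sketch of the two aggregation styles Dr. Twardy describes: a plain equal-weight average of probability judgments, and a scoring-rule market maker. The logarithmic market scoring rule (LMSR) below is a common mechanism for combinatorial prediction markets, but this toy version is only an illustration, not SciCast’s actual engine; the liquidity parameter b and the fixed trade size are arbitrary assumptions.

```python
import math

# Minimal sketch (assumed, not SciCast's actual engine) of two aggregation styles:
# an equal-weight average of probability judgments, and a logarithmic market
# scoring rule (LMSR) market maker for a single yes/no event.

def lmsr_price(q, b=100.0):
    """Current market probability of outcome 0, given outstanding shares q."""
    exps = [math.exp(qi / b) for qi in q]
    return exps[0] / sum(exps)

def lmsr_cost(q, b=100.0):
    """LMSR cost function C(q) = b * log(sum_i exp(q_i / b))."""
    return b * math.log(sum(math.exp(qi / b) for qi in q))

def buy(q, outcome, shares, b=100.0):
    """Buy `shares` of `outcome`; return the new state and the price paid."""
    new_q = list(q)
    new_q[outcome] += shares
    return new_q, lmsr_cost(new_q, b) - lmsr_cost(q, b)

# Three independent judgments of the same event:
judgments = [0.20, 0.35, 0.80]
print("equal-weight average:", sum(judgments) / len(judgments))

# The same forecasters nudging an LMSR market toward their beliefs
# with crude fixed-size trades:
q = [0.0, 0.0]                                # shares for [event, not event]
for p in judgments:
    outcome = 0 if p > lmsr_price(q) else 1   # trade in the direction of belief
    q, paid = buy(q, outcome, 50.0)           # paid = cost of this trade
print("market probability after trading:", round(lmsr_price(q), 3))
```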
Following are the questions. Click any of them to make your forecast (register or log in first). Also, see the discussion and background tabs of each question for more details and links to news sources.
The extended search region uses this map.
See this blog post for info on how to explore conditional probabilities.
Click here to read more about approaches to crowdsourcing Search & Rescue.
We’re excited to announce that the journal Decision Analysis has published “Probabilistic Coherence Weighting for Optimizing Expert Forecasts,” a paper about some of last year’s work related to DAGGRE.
It’s natural to want to help forecasters stay coherent as we ask related questions. For example, what is your confidence that:
1. “Jefferson was the third president of the United States.”
2. “Adams was the third president of the United States.”
People are known to be more coherent when these are immediate neighbors than when on separate pages with many unrelated questions in between. So it’s natural to think it’s better to present related questions close together.
We found that’s not necessarily a good idea. On a large set of general knowledge questions like these, we got more benefit by allowing people to be incoherent, and then giving more weight to coherent people. At least on general knowledge questions, coherence signals knowledge. We have yet to extend this to forecasting questions.
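To make “coherent” concrete: the two statements above are mutually exclusive, so a respondent’s probabilities for them should sum to at most 1. Here is a toy sketch (not the paper’s actual scoring) of turning the overshoot into an incoherence signal that could feed a weight:

```python
# Toy illustration (not the paper's exact scoring): the two statements are
# mutually exclusive, so coherent probabilities for them sum to at most 1.

def incoherence(p_jefferson, p_adams):
    """How far a pair of judgments on mutually exclusive statements
    overshoots the coherent region (sum <= 1)."""
    return max(0.0, p_jefferson + p_adams - 1.0)

print(incoherence(0.8, 0.1))   # 0.0 -> coherent, would get full weight
print(incoherence(0.7, 0.6))   # 0.3 -> incoherent, would get less weight
```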
We found other things, too – cool and interesting things. Here’s the abstract, but be warned, it gets technical:
Methods for eliciting and aggregating expert judgment are necessary when decision-relevant data are scarce. Such methods have been used for aggregating the judgments of a large, heterogeneous group of forecasters, as well as the multiple judgments produced from an individual forecaster. This paper addresses how multiple related individual forecasts can be used to improve aggregation of probabilities for a binary event across a set of forecasters. We extend previous efforts that use probabilistic incoherence of an individual forecaster’s subjective probability judgments to weight and aggregate the judgments of multiple forecasters for the goal of increasing the accuracy of forecasts. With data from two studies, we describe an approach for eliciting extra probability judgments to (i) adjust the judgments of each individual forecaster, and (ii) assign weights to the judgments to aggregate over the entire set of forecasters. We show improvement of up to 30% over the established benchmark of a simple equal-weighted averaging of forecasts. We also describe how this method can be used to remedy the “fifty–fifty blip” that occurs when forecasters use the probability value of 0.5 to represent epistemic uncertainty.
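To give a rough feel for steps (i) and (ii), here is an illustrative sketch for a single binary event A, assuming each forecaster is asked for both P(A) and P(not A). The least-squares projection and the particular weighting function are stand-ins chosen for simplicity, not necessarily those used in the paper:

```python
# Illustrative sketch of the abstract's two steps (the paper's actual
# distance measure and weighting scheme may differ). Each forecaster
# reports P(A) and P(not A); coherence requires the pair to sum to 1.

def adjust(p_a, p_not_a):
    """Least-squares projection of (P(A), P(not A)) onto the coherent
    line P(A) + P(not A) = 1; returns the adjusted P(A)."""
    excess = (p_a + p_not_a - 1.0) / 2.0
    return p_a - excess

def coherence_weight(p_a, p_not_a):
    """More weight for forecasters whose raw judgments were nearly coherent."""
    return 1.0 / (1.0 + abs(p_a + p_not_a - 1.0))

forecasters = [(0.70, 0.30),   # perfectly coherent
               (0.60, 0.60),   # sums to 1.2: incoherent
               (0.20, 0.75)]   # sums to 0.95: mildly incoherent

adjusted = [adjust(pa, pn) for pa, pn in forecasters]
weights = [coherence_weight(pa, pn) for pa, pn in forecasters]

equal_weight = sum(adjusted) / len(adjusted)
coherence_weighted = sum(w * p for w, p in zip(weights, adjusted)) / sum(weights)

print("equal-weight benchmark:", round(equal_weight, 3))
print("coherence-weighted:", round(coherence_weighted, 3))
```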
Read the article!
Christopher W. Karvetski, Kenneth C. Olson, David R. Mandel, and Charles R. Twardy (2013). Probabilistic coherence weighting for optimizing expert forecasts. Decision Analysis, 10(4), 305–326.