Access and Feeds

Machine Learning: Can Algorithms Become a Crystal Ball for Future Events?

By Dick Weisinger

In Isaac Asimov’s classic science fiction series Foundation, mathematics, statistics, history, and sociology have all progressed and consolidated into a field called psychohistory. Psychohistory is able to predict long-term future trends and changes in society. In Asimov’s book the statistical predictions of the future made using psychohistory were highly accurate to an astounding level of detail.

Could Machine Learning ever rival the fictional field of psychohistory in predicting the future? Researchers are trying. They are applying Machine Learning models to historical data to make predictions about future events, such as acts of global terrorism. The hope is that ML algorithms might make rough predictions about the frequency of attacks and give some indication of when and where serious terrorist events might occur. The project's data comes from the Global Terrorism Database.

The ML global terrorism project divided the world into millions of small grid points and mapped events from the terrorism database to those locations. Each location point was tagged with metadata describing factors like GDP and other regional characteristics. The study used data collected between 2002 and 2016.
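The gridding step described above can be sketched roughly as follows. This is a minimal illustration, not the study's actual code: the 0.5-degree cell size, the function names, and the sample coordinates are all assumptions made for the example.

```python
# Illustrative sketch: snap event coordinates onto a regular lat/lon
# grid and group events by cell. The cell size and all names here are
# hypothetical, not details taken from the actual study.
from collections import defaultdict

CELL_DEG = 0.5  # assumed grid resolution in degrees

def cell_for(lat, lon, size=CELL_DEG):
    """Snap a coordinate to the index of its grid cell."""
    return (int(lat // size), int(lon // size))

def grid_events(events):
    """Group events (dicts with 'lat' and 'lon') by grid cell."""
    grid = defaultdict(list)
    for ev in events:
        grid[cell_for(ev["lat"], ev["lon"])].append(ev)
    return grid

# Hypothetical sample events
events = [
    {"lat": 33.3, "lon": 44.4, "year": 2006},
    {"lat": 33.4, "lon": 44.3, "year": 2010},
    {"lat": 34.5, "lon": 69.2, "year": 2015},
]
grid = grid_events(events)
# The first two events fall in the same 0.5-degree cell; the third
# lands in a different one, so the grid holds two occupied cells.
```

Once events are binned this way, per-cell metadata (GDP, population, and other regional characteristics) can be joined on the cell index to form the feature vectors a model would train on.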

Did it work? Basically not, or at least the results were not much different from what you’d see with standard statistical modeling. The researchers found that “it is virtually impossible to predict ‘black swan events’—those events that occur only once over a very large period of time.”

But maybe we shouldn’t try to predict the future. Predictions of future crises and problems may only be self-fulfilling.

“You don’t need to predict the future. Just choose a future — a good future, a useful future — and make the kind of prediction that will alter human emotions and reactions in such a way that the future you predicted will be brought about. Better to make a good future than predict a bad one.” — Isaac Asimov

