You may recall that polls dominated the discussion of the 2012 election: who people said they would vote for, which particular issues were important to them. Often these polls contradicted one another, and several of them focused on issues that proved unimportant to the outcome.
But one analyst, Nate Silver, used a statistical model that filtered out the noise and focused on the signal. By doing so, he correctly predicted the presidential result in all fifty states.
So how did he do it?
He looked at the data, developed a predictive algorithm, and ran that algorithm against many simulated scenarios to produce a distribution of the most likely results given the available data.
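To make the idea concrete, here is a minimal sketch of that scenario-simulation approach in Python. This is not Silver's actual model; the states and win probabilities are purely hypothetical, and the point is only to show how repeated simulation turns per-state probabilities into a distribution of outcomes.

```python
import random

# Hypothetical per-state win probabilities (illustrative only,
# not actual model inputs from 538.com).
state_probs = {"Ohio": 0.55, "Florida": 0.50, "Virginia": 0.60}

def simulate_election(probs, trials=10_000, seed=42):
    """Run many simulated elections and estimate how often each state is won."""
    rng = random.Random(seed)
    wins = {state: 0 for state in probs}
    for _ in range(trials):
        for state, p in probs.items():
            if rng.random() < p:
                wins[state] += 1
    # Convert raw win counts into an estimated win frequency per state.
    return {state: wins[state] / trials for state in probs}

print(simulate_election(state_probs))
```

With enough trials, the simulated frequencies converge toward the input probabilities; the value of the exercise is that it yields a probability for each outcome rather than a single hard prediction.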
That’s super complicated, so let’s simplify. Based on information from 538.com, the model is built on four principles:
- A good model should be based in probability, not final predictions – You’ll often hear a stat that says, if you drink a glass of red wine before bed, you’ll reduce your chance of a heart attack by 10%. You won’t hear someone say, “You won’t die from a heart attack if you drink a glass of wine before bed.” While we tend to want to make hard and fast predictions, it’s important to understand that there are no absolutes.
- A good model should be empirical – It should be simple, and it should fit the theory and experience of the industry. It should pass the sniff test.
- A good model should respond sensibly to changes in inputs – As new data comes in and new things are learned about the industry, the model shouldn’t swing wildly. A model that’s subject to volatility should be viewed skeptically.
- A good model should have rules that don’t change midstream – The rules and assumptions that govern a model shouldn’t change just because the results are unfavorable. While the algorithm may need a tweak from year to year, much as Google’s does, consistency is key for a model to be valuable.
Based on the data you collect and the model you develop, you can make real, informed business decisions grounded in facts rather than preferences or traditions.
So what does that mean for an event organizer?
Imagine a future where you can examine your organization not only through your event’s fill rates, but also by baselining your results against events similar to yours in size or geography. How is your event doing compared to others? How do you figure that out now?
What if you could determine the optimal price to charge for your event to maximize income without sacrificing participation? What will the market in your area bear for an event like yours? What’s the impact on registration of a $1 change in your event price?
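The pricing question can be sketched with a toy model. Everything here is an assumption for illustration: demand is treated as linear (sign-ups fall as price rises), and the `base` and `sensitivity` coefficients are invented. In practice those would have to be estimated from your own registration data.

```python
# Toy revenue model under an assumed linear demand curve.
# Coefficients are hypothetical, not derived from real event data.
def expected_registrations(price, base=500, sensitivity=8.0):
    """Expected sign-ups at a given price (linear demand, illustrative)."""
    return max(0.0, base - sensitivity * price)

def best_price(candidate_prices):
    """Pick the candidate price that maximizes expected revenue."""
    return max(candidate_prices, key=lambda p: p * expected_registrations(p))

candidates = range(10, 61, 5)   # try prices from $10 to $60 in $5 steps
optimal = best_price(candidates)
print(optimal, optimal * expected_registrations(optimal))
```

Even this crude sketch shows the trade-off the paragraph describes: below the optimum, raising the price gains more revenue than it loses in participation; above it, the lost registrations outweigh the higher price.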
What would you do if you could determine the registration rates for events during a certain month, by a certain gender of a certain age group? When do women register for similar events? How effective is email marketing to 18-24 year old males? What’s the best way to reach 55+ audiences?
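Segment-level questions like these boil down to counting registrations by attributes. A minimal sketch, using entirely made-up records of `(month, gender, age_group)`:

```python
from collections import Counter

# Hypothetical registration records: (month, gender, age_group).
records = [
    ("May", "F", "25-34"), ("May", "F", "25-34"), ("June", "M", "18-24"),
    ("May", "M", "18-24"), ("June", "F", "35-44"), ("May", "F", "35-44"),
]

# Count sign-ups per (month, gender, age_group) segment.
segments = Counter(records)
print(segments[("May", "F", "25-34")])

# Roll the same records up by month alone.
by_month = Counter(month for month, _, _ in records)
print(by_month["May"])
```

The interesting part isn’t the counting, which any analytics tool can do, but having enough comparable data across events to make the segments meaningful.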
Could you improve your marketing with knowledge like this? Would you change your event workflow to take advantage of these insights? What investments would you make?
The next frontier for the event management industry lies within the realm of data. It will be an advancement that will revolutionize the industry and forever change how organizers like you run your business.
With ACTIVE Network Activity Cloud™, launching later this year, you will have the tools to analyze events, uncover insights and recommendations, and act immediately on those insights to optimize your revenue potential.
To learn more, visit http://www.activitycloud.com/contact-us to talk to an ACTIVE Network Activity Cloud™ expert.