Traffic Data Will Be a Battlezone in 2010


Strategy Analytics’ Roger Lanctot talks about traffic data and its evolution in 2010. In particular, he highlights the need for better incident data and his view of the three technologies that will lead that charge.

The solutions will come in 2010 from three key sources: mobile-phone-based crowd-sourced info, traffic cameras and, perhaps, vehicle-mounted cameras. The challenges to delivery include the creation of traffic reporting “crowds,” something TrafficTalk, Waze and Aha Mobile are working on; and camera input interpretation and delivery platforms.

Vehicle-mounted cameras are a viable option but require further adoption and development. Traffic camera volume is growing rapidly as cities roll out public and privatized camera networks.

The most interesting opportunity to me is “mobile-phone-based crowd-sourced info.” For decades the primary source of traffic incident data has been crowd-sourced information (if calling in to a radio station isn’t the original crowd-sourced traffic, then I’m not a big nerd… trust me, I’m a big nerd). Now Twitter is the de facto real-time data source (followed closely by Facebook). Don’t believe me? Next time there is a blackout in your area, check Twitter and Facebook versus any other media source. You’ll most likely be able to figure out just how widespread the blackout is within five minutes through Twitter, and you won’t find a single mention in your local newspaper or other sources. In fact, next time you’re in traffic, search Twitter to see if you can find the source! You might just have some luck.

So, Twitter already has the technology platform to deliver real-time information. It also has a significant user base (ranked the 14th most-visited site on the web), millions of Twitter-capable phones (via SMS and Twitter-specific apps), and a strong API that has already led numerous other companies to build entire business models around Twitter’s service.

Take all of this, add in Twitter’s announcement of geolocation features and its purchase of Mixer Labs and their GeoAPI, and Twitter is particularly well positioned to deliver effective, organized, real-time crowd-sourced data on an unparalleled scale. The addition of position information to the Twitter stream will allow users to search not only by keyword but also by proximity. The combination of these features will produce the largest volume of crowd-sourced data since people started calling radio stations (including the apps that will inevitably be augmented to include, or built on top of, Twitter as a platform).
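As a rough sketch of what keyword-plus-proximity search could look like against Twitter’s public search API: the endpoint and the `geocode` parameter below reflect the search API as documented around this time, and the keyword, coordinates, and radius are placeholders I chose for illustration.

```python
import urllib.parse

def build_traffic_search_url(keyword, lat, lon, radius_miles):
    """Build a query URL for Twitter's public search API.

    Combines a keyword filter with a geocode (lat, lon, radius)
    filter, so results are both topical and local -- the core of
    crowd-sourced traffic reporting.
    """
    params = {
        "q": keyword,                                 # keyword filter
        "geocode": f"{lat},{lon},{radius_miles}mi",   # proximity filter
        "rpp": 50,                                    # results per page
    }
    return "http://search.twitter.com/search.json?" + urllib.parse.urlencode(params)

# Example: tweets mentioning "accident" within 5 miles of midtown Manhattan
url = build_traffic_search_url("accident", 40.7484, -73.9857, 5)
print(url)
```

A traffic app would poll a URL like this on a timer and cluster the returned tweets by position to flag likely incidents.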

Roger also touches on Google’s entry into the navigation market. Google’s data has been questioned previously, but there is no doubt that with every download of Google Maps and every sale of a Google phone, that data is improving. It should also be noted that, according to Alexa, the #1 upstream and downstream site for Twitter is Google; that means roughly 10% of all traffic into and out of Twitter comes from or goes to Google. Whatever geolocation features Twitter develops, Google will be in the best position to use the data. Combine all of this with Google’s development superpower, almost unlimited funding, and an army of some of the best mathematicians in the world, and there is little doubt that Google will have a huge impact on navigation in 2010 and beyond.

In the end, we know the technology is coming. The infrastructure has been built and the companies are in place to make their moves. As always with new technology, the question that remains is who will pay for it, and how? More on that one later :-)
