Data Velocity: The Race to Zero
The seeds have been sown… Augmented Reality and beyond
Craig Bachmann & Natasha Léger are partners in ITF Advisors, LLC, an independent consulting firm focused on next-generation strategy and on translating the impact of the increasingly complex new media business environment on business models, markets, and users. Natasha is also editor of the new spin-off publication, LBx Journal.
The year 2009 was a remarkable year by any measure.
Now that the world is sharing information in real time, the velocity of millions of bits of data is creating an environment that is challenging even the best thinkers on how to make sense of it all.
Clearly, instant communications and the ability to access any data, any time, have also become expectations. Tweets, texts, IMs, posts – they are all about instant reporting, whether stream of consciousness or an important piece of news, from tornado or earthquake damage to political demonstrations to corporate activity (whether good corporate citizenship or corporate malfeasance).
As storage costs continue to drop and software development time (or, more appropriately these days, app development time) decreases (100,000 iPhone apps and counting!), the world appears to find itself in a “race to zero.” What is this race to zero? It is a frictionless world in which not only the time but also the money once needed to accomplish seemingly complex tasks is grinding down to zero.
Have you noticed that the digital generation is churning out new apps on almost a daily basis? The founder of Digg said at the Enterprise 2.0 conference this past fall that “…if you can’t turn out a prototype without outside investment, there’s a problem.” Companies like Nike that invested billions of dollars in expensive system-integration projects are now turning to cheap consumer web tools for more effective global collaboration. Tom Friedman’s book, The World Is Flat, described the global IT infrastructure that ushered in the first level of “cloud” applications and massive cost-savings potential. Friedman’s work, Enterprise 2.0, and social networking applications such as Jive and Twitter, along with geospatial mashup platforms such as Google Earth, JackBe and others, have sown the seeds for any business, government, NGO, or individual to aggregate almost any piece of data needed to analyze almost anything – except possibly what makes sense.
Why is technology grinding to zero?
The most influential of many factors have to do with Moore’s law and with the advent of user-generated content and new web-based business models that are driven by advertising. With user-generated content, companies from Google to Waze to OpenStreetMap are benefiting from free labor, an advantage that gives the illusion of providing a “free” or “cheaper” product or service.
This race to zero has its pros and cons. One pro is the dissemination of real-time information that can save and improve lives. The cons include the destruction of jobs that keep the economy going and a lack of critical thinking on issues that require more time than a PowerPoint bullet, a 140-character tweet, or a 20-minute decision allows before one moves on to the next task to hit that all-important KPI.
What comes after the “race to zero,” when commodity-based tools are readily available through a simple drag-and-drop exercise? More importantly, how do today’s providers of dumb pipes (think telephone company), dumb data (raw imagery), and dumb storage (a hard drive) add value and remain relevant in a world of new players?
For example, the iPhone has decoupled the phone company from the network service provider in the mind of the consumer, Google Earth has made imagery a backdrop, and cloud computing is turning hard drives and flash drives into clutter. Becoming a trusted platform is critical to creating value. With the iPhone, Apple monetized interactivity and an integrated environment; Google monetized search and tied data to advertising; and Amazon has turned commodity storage into a cloud service.
Imagery offers a significant opportunity to provide a context for “real-time” trusted analysis. The distribution of imagery (the data), the processing of imagery (the software), the integration of imagery (the implementation and workflow), and the visualization of imagery have all become relatively frictionless in the last few years, thanks to Internet distribution and to more cost-effective, easier-to-use software for processing and feature extraction. Now that imagery is more widely available, has an unprecedented reach, and is increasingly becoming a de facto visualization platform for other data, can it provide a “trusted analysis” platform?
Augmented reality (AR) may offer such an opportunity, because it merges a real-world environment with other data and imagery to create an enhanced, augmented, and more “intelligent” view of the original environment. This concept was highlighted in the James Bond film Quantum of Solace.
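At its simplest, this kind of AR overlay amounts to answering one question: given the user’s location and compass heading, where on the camera’s screen should a geotagged point of interest appear? The sketch below illustrates the idea with a basic compass-and-field-of-view model, the approach used by the first generation of mobile AR browsers; the function names, parameters, and screen dimensions are illustrative assumptions, not any vendor’s API.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees (0-360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def overlay_x(user_lat, user_lon, heading_deg, poi_lat, poi_lon,
              fov_deg=60.0, screen_w=320):
    """Horizontal screen position (in pixels) at which to draw a geotagged
    point of interest, or None if it lies outside the camera's field of view.
    Assumes an illustrative 60-degree horizontal FOV and 320-pixel-wide screen."""
    # Angle of the POI relative to where the camera is pointing, in (-180, 180]
    rel = (bearing_deg(user_lat, user_lon, poi_lat, poi_lon) - heading_deg + 180) % 360 - 180
    if abs(rel) > fov_deg / 2:
        return None  # POI is behind or beside the user at this heading
    # Map the relative angle linearly across the screen width
    return int((rel / fov_deg + 0.5) * screen_w)
```

A user standing at the origin and facing due east (heading 90°) would see a point of interest directly to the east drawn at the center of the screen; turn to face north, and the same point drops out of view. A real implementation would add distance filtering, vertical placement from pitch, and smoothing of noisy compass readings, but the core of “painting data onto the world” is this small.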
The ability to access multiple databases of information that connect the dots in real time between seemingly disparate data points, and thus to paint a virtual canvas of insights, is the future of data velocity. We hope to see in 2010 the further integration of satellite imagery into such frontiers as AR. The notion of “connecting the dots in real time” is a reality in itself.