People inevitably employ new technologies in creative and unanticipated ways. When the Advanced Research Projects Agency began working with packet switching and the TCP/IP protocols, for example, it was clear that the technology would improve communications.

But no one predicted that eBay would soon be using a thing called the Internet to help you clean out your attic, or that Amazon would be there to help you fill it back up again. Unexpected developments often give new technologies their greatest value.

The Internet Of Things

Much of the discussion of the coming “Internet of Things” treats it as a straight-line extension of activities that already exist. More devices will become network-aware, and they’ll be smarter. More data will become available.

Your refrigerator will be able to tell you when it is running low on milk, and your car keys will be able to report that they’re hiding under the couch. You’ll be able to communicate with everything from industrial machinery to personal possessions, and won’t this be useful?

Expanding the network edge to include more objects and more locations will indeed be useful. But it’s not the whole story. It’s probably not even the most important part.

Connecting a wider range of devices will create many new data streams. That in itself won’t be transformative. The true value of the Internet of Things will become apparent, and the big changes will begin, when we develop software that mashes together data from an ever-increasing number of streams and uses it to make highly informed decisions.

Mashing Up Data

Consider your last trip to the airport. Did the shuttle bus for remote parking show up in three minutes or 30? How long did it take to check in? How long did it take to get through security? Did you arrive at the boarding gate with an hour to spare, or did all of the unpredictability make you start to wonder if you were going to make your flight at all? A data mashup would have made that experience less stressful.

Imagine an app that knows how many airline tickets have been sold, the scheduled departure times, and how many vehicles and passengers should be entering the terminal at any given moment. It would also use real-time feeds to track the actual figures, monitor activity at check-in lines and security stations, plug in real-time feeds from traffic agencies, and keep an eye on the weather to see if things might suddenly slow down. It would take a lot of the guesswork out of airline travel. If you needed to leave 45 minutes earlier than planned, the app would say so.
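At its core, a mashup like this combines several live feeds into a single decision. The sketch below is a minimal, hypothetical version in Python; the feed names, fields, and figures are invented for illustration and are not drawn from any real airport or traffic API.

```python
from dataclasses import dataclass

@dataclass
class FeedSnapshot:
    """One moment's readings from several real-time feeds (all hypothetical)."""
    checkin_wait_min: float    # observed check-in queue wait
    security_wait_min: float   # observed security queue wait
    drive_time_min: float      # traffic agency's drive-time estimate to the terminal
    weather_delay_min: float   # extra slowdown attributed to weather

def extra_lead_time(snapshot: FeedSnapshot, planned_buffer_min: float = 60.0) -> float:
    """Return how many minutes earlier than planned the traveler should leave.

    A naive mashup: sum the delays reported by each feed and compare the
    total against the buffer the traveler already planned for.
    """
    total_delay = (snapshot.checkin_wait_min
                   + snapshot.security_wait_min
                   + snapshot.drive_time_min
                   + snapshot.weather_delay_min)
    return max(0.0, total_delay - planned_buffer_min)

# A busy morning: queues, traffic, and weather eat 105 minutes against a
# 60-minute planned buffer, so the app advises leaving 45 minutes early.
busy = FeedSnapshot(checkin_wait_min=20, security_wait_min=35,
                    drive_time_min=40, weather_delay_min=10)
print(extra_lead_time(busy))  # -> 45.0
```

A real app would weight each feed and learn from history rather than simply summing delays, but the principle is the same: none of these feeds is decisive on its own; the value comes from combining them.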

More data mashing would already be taking place if we weren’t hampered by the various closed standards currently vying for market share. There is a movement afoot to displace them, and open standards are beginning to take their place in every area, from information models and protocols to device management and security. Much of the push will come from the service companies that provide the bandwidth for the data payloads, since their growth models depend on the steady expansion of data plans and data networking.

For example, oneM2M is a new organization of global standards bodies working to rally manufacturers, carriers, and standards groups around common standards. It is developing technical specs for a common machine-to-machine (M2M) communications service layer that can be embedded in hardware and software, so that M2M devices worldwide can speak the same language, connect, and interoperate. The organization anticipates that its first release will be available before year-end.

The evolution of the Internet of Things is not so much about new, breakthrough technologies as it is about new standards and software. Much of the supporting cast of technologies is already in place (see “How Does The Internet Work? Very Well, Thanks To Standards”). The IP protocols are already standardized, and the sheer number of connected devices, far more than IPv4’s address space can accommodate, will ensure that IPv6 comes into play.

Moore’s Law, through its effect on microprocessors, dictates that low-power wireless sensor networks will soon be the source of many new data streams (see “Moore’s Law Makes Household Robots Possible”). Networks that fail to use truly open standards will be far less useful and will attract far less investment.

The nebulous “cloud” will host tremendous amounts of data, serving as the storage layer for those data streams. But the exponential growth won’t occur until information models and protocols have been standardized. Security standards, and their implementation, will be another key ingredient; until reliable systems are firmly in place, they will delay large-scale adoption.

Data mashing will lie at the very heart of the Internet of Things, and it will be used to organize unprecedented quantities of data in useful ways. We will see the effects in every industry and in every human activity. But what we can’t predict, any more than ARPA could have predicted the rise of eBay and Amazon, are the entirely new industries and activities that data mashups and human creativity will soon bring into existence.

Mike Fahrion, director of product management at B&B Electronics, has more than 20 years of design and application experience overseeing M2M connectivity solutions for wireless and wired networks. He also is author of the eConnections newsletter.