The development of increasingly affordable technologies has been a major driving factor in making the Internet of Things (IoT) possible. For example, the cost of sensors has halved over the past decade, and, even more significantly, the costs of connectivity and computing have fallen by factors of 40 and 60, respectively.
Essentially, the Internet of Things is the next evolution of computing: a move to the 'cloud plus edge' paradigm. The ability to connect end-to-end – from the device to the cloud, often via a gateway – and to extract information in real time through cloud-based data analytics will transform businesses, enabling greater productivity and efficiency, lower operating expenses, and top-line growth from new services and products.
According to a report from the McKinsey Global Institute, published in 2013, the number of connected machines increased by about 300 percent over the preceding five years. New devices are being added every day, with approximately half a billion 'non-personal' devices added in 2013 alone, and 50 billion or more devices are currently predicted to be connected by 2020.
So, while the opportunity is vast, a major obstacle to IoT development is the high level of fragmentation. Having to develop hardware and software technologies suitable for such a diverse range of industries and markets can significantly slow the pace of adoption.
The challenges and technical issues in connecting devices at the edge up into the cloud are also many. In industrial applications, for example, considerable support will be required to integrate legacy infrastructure and protocols, and perhaps previously unconnected devices and equipment.
In addition, the myriad of different devices will produce different data types, so normalization of data will be important, and this multitude of devices will also need to be provisioned and managed. And in terms of connectivity
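To make the normalization point concrete, here is a minimal sketch of mapping vendor-specific device payloads into one common record shape before cloud ingestion. The field names, payload formats, and schema are illustrative assumptions, not taken from any real IoT platform:

```python
# Hypothetical sketch: normalizing heterogeneous sensor payloads into a
# common schema before sending them to the cloud. Payload field names
# ("temp_f", "temperature_c", etc.) are invented for illustration.

def normalize_reading(raw: dict) -> dict:
    """Map a vendor-specific payload to a common record:
    {"device_id": str, "metric": str, "value": float, "unit": str}.
    """
    if "temp_f" in raw:  # e.g. a legacy sensor reporting Fahrenheit
        return {
            "device_id": raw["id"],
            "metric": "temperature",
            "value": round((raw["temp_f"] - 32) * 5 / 9, 2),  # convert to Celsius
            "unit": "C",
        }
    if "temperature_c" in raw:  # a newer device already reporting Celsius
        return {
            "device_id": raw["device"],
            "metric": "temperature",
            "value": float(raw["temperature_c"]),
            "unit": "C",
        }
    raise ValueError(f"unrecognized payload: {raw!r}")

readings = [
    {"id": "legacy-7", "temp_f": 212.0},
    {"device": "edge-3", "temperature_c": 21.5},
]
normalized = [normalize_reading(r) for r in readings]
print(normalized)
```

The key design choice is that all downstream analytics see one schema and one unit system, so every new device type only requires adding a mapping rule at the edge or gateway, not changes in the cloud.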