Saturday, April 13, 2013

The Internet of Things is coming. Now what?

First came the internet, a vast, chaotic conglomeration of servers connected to one another and to dedicated devices, mainly PCs. Now the world is starting its migration to a new type of connectivity topology, the so-called ‘Internet of Things’.



Today, computing power is still largely contained in dedicated devices (including the PC), on tap through defined and often fragmented software interfaces largely inherited from the pre-internet era. Where computing power and intelligence are distributed or ‘embedded’ in devices, they tend to sit on islands dedicated to specific tasks.

Increasingly, though, computing power is being distributed to intelligent devices – ‘things’ – that not long ago would have operated in a completely ‘dumb’ state.
Examples of ‘smart’ devices today include mobile devices, embedded systems, industrial control and in-car systems and, in a few cases, even household appliances. RFID and GPS tags are early examples of how even inanimate objects can become ‘active’ on the Internet of Things, able to store and transmit data about themselves without people even being aware of it.
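To make the idea of a self-describing ‘thing’ concrete, here is a minimal sketch of the kind of telemetry message a tag-like device might emit. The field names and payload shape are illustrative assumptions, not part of any standard discussed here.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class TagReading:
    """One self-reported observation from a tag-like 'thing' (illustrative)."""
    device_id: str        # assumed unique identifier burned into the tag
    latitude: float
    longitude: float
    timestamp: float      # seconds since the Unix epoch

def to_message(reading: TagReading) -> str:
    """Serialise a reading as JSON, ready to publish over any M2M transport."""
    return json.dumps(asdict(reading))

if __name__ == "__main__":
    reading = TagReading("tag-0042", 51.5072, -0.1276, time.time())
    # In a real deployment this string would be published to a broker;
    # here we simply print it to show the shape of the payload.
    print(to_message(reading))
```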
But with four billion people and more than 31 billion devices expected to be using what is still called ‘the internet’ by 2020, the Internet of Things has far deeper implications than making the digital world more crowded and noisier with information.
As devices and objects of almost every sort acquire processing power and the ability to perform tasks in an automated way, the systems they are part of don’t just expand; their nature changes quite profoundly. In whatever form this Internet of Things unfolds, it is clear that it changes not only computing in a general sense, but the expectations and horizons of its users, and therefore the way services – including security – must be provisioned.

Shift in power 

The first major consequence is a shift of power away from established companies (both vendors and their customers) towards those who can define the standards through which objects will process data and interact, without human intervention, over machine-to-machine (M2M) connections – and also towards those enterprises that can make good use of this infrastructure.

The Internet of Things has the potential to be a dramatic leveller, partly because access to advanced technologies will no longer be restricted to large organisations, and partly because of the way it will undermine established organisations that struggle to evolve. In some ways larger enterprises will face the biggest challenges of all.
Commercially, we already see the effect of the Internet of Things in the struggles of the Japanese electronics vendors that rose to dominate the electronics era from the 1960s onwards as makers of ‘things’. These things, on their own, are no longer profitable enough; the next generation of successful vendors will be those that can embed intelligence in their products and connect it in marketable ways.
This is the outline of a world that will turn up in some form in the next decade, so how might businesses prepare themselves? Beyond the flux of change itself, what specific problems will it bring in its wake?

Big Data and the cloud

The first challenge will be what analysts and vendors alike have come to call the problem of ‘Big Data’, that is, the potentially vast and exponentially growing volumes of data that will have to be processed, stored and moved to and from the multitude of ‘things’.
This presents an analytics problem – what are the important patterns within the huge volumes of data being generated, by M2M devices in particular – but also a storage problem: where can it all possibly be kept? Big Data is a vast, apparently amorphous mass that by its very nature grows in real time, straining current technologies beyond breaking point.
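As a rough illustration of the analytics side of that problem, the sketch below applies a simple rolling-window check to a stream of M2M readings and flags values that stray far from recent behaviour. The window size and threshold factor are arbitrary assumptions; real systems would work at vastly larger scale.

```python
from collections import deque

def flag_anomalies(readings, window=50, factor=3.0):
    """Yield (index, value) pairs whose value strays far from the rolling mean.

    A deliberately simple stand-in for the kind of real-time pattern
    detection that M2M data streams will require.
    """
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == recent.maxlen:
            mean = sum(recent) / len(recent)
            spread = (max(recent) - min(recent)) or 1.0
            if abs(value - mean) > factor * spread:
                yield i, value
        recent.append(value)

if __name__ == "__main__":
    stream = [20.0 + (i % 5) * 0.1 for i in range(200)]
    stream[150] = 95.0  # an injected spike a monitoring system should notice
    print(list(flag_anomalies(stream)))  # -> [(150, 95.0)]
```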
Standalone storage systems inherited from the pre-internet era are not up to the job of coping with this sort of demand on a physical or logical level; it would overwhelm them very quickly. Consequently, cloud storage has been put forward as the answer, but in reality this simply shifts the problem to a set of service providers in a way that generates new problems.
What technological standards do these providers meet, both for the physical and logical storage of data and for its possible migration at a later date? Can these same providers meet regulatory and privacy standards that often differ between countries, trading blocs and even industries?
As with every other aspect of cloud services, Big Data brings with it a set of security concerns: how access is authenticated and logged, and how possible failures are insured against.
Without viable answers to these basic Big Data issues, the Internet of Things starts to look more like an internet of anxiety, where even small security breakdowns could have huge consequences. Working out exactly how M2M systems can be guarded against dangerous chains of dependency will slow provision for the next decade.

Intel’s Intelligent Systems Framework

Companies such as Intel have argued that the only survival strategy is to adopt an interlocking series of technologies as a foundation, rather than building them gradually in a piecemeal way.
The Intelligent Systems Framework (ISF) covers a range of approaches to this: building on the company’s commodity processor hardware, designing manageability into devices from the start, and making sure the infrastructure works across heterogeneous networks (fixed, wireless, near-field radio and the like). Perhaps the most intriguing element, though, is the idea of embedding security itself.
The need to embed security is no scaremongering, as the extraordinary 2010 Stuxnet attack on industrial control technology has already demonstrated. Those systems had never been seen as a security risk simply because nobody had bothered to attack them before. But if industrial control systems now need to be protected, what about the other autonomous systems of the sort that will fill the Internet of Things?
The answer was not to place protection on the chip in a static way, but to embed the necessary circuitry for software running above that layer to call on. This can take the form of Trusted Platform Modules – secure cryptographic spaces able to store data such as authentication tokens – or embedded routines that make it harder for malware to subvert a system directly. Security embedded beneath a suite of software services offers the important possibility of evolution.
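As a purely conceptual illustration of that idea – not a real TPM interface – the sketch below mimics a sealed credential store in software: calling code never sees the stored token, it can only ask whether a presented value matches. The class and method names are invented for illustration, and a random in-memory key stands in for hardware-bound key material.

```python
import hmac
import hashlib
import secrets

class SealedTokenStore:
    """Software stand-in for a TPM-like sealed store (illustrative only)."""

    def __init__(self):
        self._device_key = secrets.token_bytes(32)   # stands in for a hardware-bound key
        self._sealed = {}                            # name -> keyed digest of the token

    def seal(self, name: str, token: bytes) -> None:
        """Store only a keyed digest of the token, never the token itself."""
        self._sealed[name] = hmac.new(self._device_key, token, hashlib.sha256).digest()

    def verify(self, name: str, presented: bytes) -> bool:
        """Check a presented credential without revealing the stored one."""
        expected = self._sealed.get(name)
        if expected is None:
            return False
        candidate = hmac.new(self._device_key, presented, hashlib.sha256).digest()
        return hmac.compare_digest(expected, candidate)

if __name__ == "__main__":
    store = SealedTokenStore()
    store.seal("gateway-auth", b"correct-horse-battery-staple")
    print(store.verify("gateway-auth", b"correct-horse-battery-staple"))  # True
    print(store.verify("gateway-auth", b"guess"))                         # False
```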
In parallel, Intel has also been a prime mover in initiatives such as the Open Data Centre Alliance, a collaboration between a range of large enterprises and a handful of technology companies to develop standards that tie the technologies contained in the ISF together.

Regulation and compliance

Data protection is slowly growing into one of the biggest functions of national and supra-national governments and institutions, and there are plenty of signs that ironing out the wrinkles into something like a global system will take the rest of this decade and beyond.
The problem for Big Data, and for the organisations coping with it, is that as more data flows from the objects or ‘things’ around individual users, the issue of privacy becomes more pressing. Up to now, the sort of data collected about people has been static, including names, addresses, social security numbers; increasingly this is becoming transactional and contextual, tied to systems that can identify who did what, when and with whom.
For some, privacy is not yet a pressing issue because much of this data is fragmented across databases or quickly discarded. The economics of Big Data suggest that these fragments will eventually coalesce, and the regulation of private data will become a political issue.
It is often assumed that the Internet of Things will be built in a competitive but benevolent way by the free market; it seems just as likely that its shape will be fashioned by governments, agreements and protocols. Governments too will also stand to benefit from Big Data, indeed the most controversial aspect of the trend is already the way states across the globe are looking to mine data from the habits and associations of citizens. Big Data’s fate is to be controversial.
The best example of how ‘big’ legislation might start to influence businesses is the EU’s Data Protection Directive, for now mostly focussed on important sub-issues such as improved data breach notification. Such trans-national rules will impose similar restrictions on the data being collected – or breached – from the Internet of Things, and will become de facto standards.
Beyond this, some organisations will learn where the limits lie in the courts as special interest groups or individuals test the boundaries of legal acceptability. Businesses must prepare for complexities such as the need to allow individuals to ‘opt out’ in ways that could present major challenges in an age of extensive data.
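One small, concrete piece of such preparation might look like the sketch below: a filter that drops readings tied to individuals who have opted out, before any further storage or analysis takes place. The record layout and the opt-out registry are assumptions made for illustration, not taken from any specific regulation or product.

```python
# Illustrative only: record fields and the registry contents are assumptions.
OPT_OUT_REGISTRY = {"user-17", "user-203"}   # subjects who have withdrawn consent

def respect_opt_outs(records, registry=OPT_OUT_REGISTRY):
    """Drop records whose subject has opted out, before storage or analytics."""
    return [r for r in records if r.get("subject_id") not in registry]

if __name__ == "__main__":
    incoming = [
        {"subject_id": "user-17", "reading": 20.4},
        {"subject_id": "user-88", "reading": 19.9},
    ]
    print(respect_opt_outs(incoming))   # only user-88's record survives
```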

Conclusion

There is no simple security solution to plaster over these issues. The objects making up the Internet of Things will have security embedded into them. Real-time analytics will be used to handle the data these objects generate and to manage them in an automated way. Management will be hands-off unless certain thresholds are breached. Governments will seek both to access data and to introduce ‘kill switches’ that reduce the potential for devices to be attacked for economic or political gain.
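A toy version of that hands-off pattern is sketched below: readings are handled automatically, and a human is only alerted when a value crosses a configured threshold. The threshold value and the alert mechanism are illustrative assumptions.

```python
ALERT_THRESHOLD = 75.0   # assumed limit above which a human should be notified

def manage(readings, threshold=ALERT_THRESHOLD):
    """Handle readings automatically; escalate only when a threshold is breached."""
    for device_id, value in readings:
        if value > threshold:
            # Escalation path: in practice this might page an operator.
            print(f"ALERT: {device_id} reported {value}, needs human attention")
        # Otherwise the routine path stays entirely hands-off.

if __name__ == "__main__":
    manage([("sensor-1", 20.3), ("sensor-2", 81.2), ("sensor-1", 20.5)])
```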
This era will arrive in one form or another, whether enterprises engage with that fact now or not. It would be a huge mistake to ignore the way it will change organisations and the customers and citizens they serve. It would be an equal folly to pretend that it will roll out in the way the internet once unfolded. In the new world, governments, customers and citizens will all have an active influence.
