The Internet of Things is still in its infancy, but the first challenges are already making themselves felt: the data volumes produced by networked devices worldwide are clogging the data pipelines to the cloud. That's because nearly all of the information collected by sensors, chips and readers is sent directly to remote data centers for processing or storage. And there's no end in sight: by 2020, as many as 50 billion devices are predicted to be connected via the Internet of Things.
The volumes of data being churned out are staggering. It takes the sensors in the turbine of a jet airplane just 30 minutes to generate performance and status data on the order of 10 terabytes, points out network hardware maker Cisco.
Bandwidth Capacity Costs Money
Telecommunication service providers are already charging more for data uploads and eyeing weaknesses in the new EU regulations on net neutrality. Critics warn that the law waters down net neutrality through vague, loophole-ridden formulations. In their view, the threat of a "two-class internet" is real: under such a system, anyone hoping to push their data into the Cloud more quickly will have to pay extra for the privilege.
A number of companies, Cisco first and foremost, are tackling the problem by establishing an intermediate tier for data processing, known as Fog Computing. In this model, information is not sent immediately in raw, unprocessed form to the Cloud or a remote data center, but is instead pre-processed by server systems, storage and network components at the edges of the IT infrastructure.
Separating the Wheat from the Chaff
By tasking these so-called edge devices with services and processes previously handled within the Cloud, businesses can significantly reduce the amounts of data they need to transfer. To take an analogy from agriculture, the grains are no longer brought to the mill together with the straw, but rather are threshed first to separate the wheat from the chaff.
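The threshing analogy can be made concrete with a minimal sketch: an edge device keeps the raw sensor samples local and uploads only a compact summary. The function name, reading values and summary fields below are all hypothetical, purely for illustration of the data-reduction idea.

```python
import statistics

def summarize_readings(readings):
    """Reduce a batch of raw sensor readings to a compact summary.

    Instead of forwarding every sample to the cloud, the edge device
    keeps the raw data local and sends upstream only these aggregates.
    """
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": statistics.mean(readings),
    }

# Hypothetical batch: one temperature reading per second for six seconds
raw = [20.1, 20.3, 20.2, 20.5, 19.9, 20.0]
summary = summarize_readings(raw)
print(summary)
```

However many samples arrive, only four numbers cross the network; the longer the local batch, the greater the savings in bandwidth.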
It's also possible to take advantage of the untapped computing capacity of nearby servers. A similar system is already in place in the energy industry: excess power from solar cells and power plants flows into community networks, which helps optimize distribution and consumption and ultimately lowers prices. The future seems likely to bring fewer centralized data centers and more local units capable of reacting more quickly to users' individual needs. This approach sidesteps the much-noted weaknesses of data centers: high energy consumption, performance bottlenecks and heavy bandwidth use.
Green Lights for Emergency Vehicles
"Fog computing positions analysis, processing and storage functions at the periphery of the network," explains Kay Wintrich, Technical Director at Cisco Germany. "On the Internet of Everything, in a completely networked world, this is the only option for handling the huge volume of data being generated," added Wintrich.
Cisco sees intelligent traffic control systems using video cameras as one potential practical application. These could, for example, recognize emergency vehicles with their flashing lights on and clear a corridor of green lights for them. Both analysis and reaction are handled locally, eliminating the current need to forward the video data to a computing center. The local computing power comes courtesy of a second, Linux-based operating system that Cisco installs on its switches and routers.
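The traffic scenario illustrates the fog pattern well: detection and reaction both happen at the intersection, and no video ever leaves it. The sketch below is a hypothetical decision loop, not Cisco's actual software; the detection format, light objects and return values are all invented for illustration.

```python
def handle_frame(detections, corridor_lights):
    """Local decision step for an edge traffic controller.

    `detections` is assumed to be the output of an on-device video
    analysis stage (hypothetical format). The reaction is applied
    locally; at most a small event record would be sent upstream.
    """
    emergency = any(
        d["type"] == "emergency_vehicle" and d["lights_on"]
        for d in detections
    )
    if emergency:
        # Clear a corridor: switch every light along the route to green.
        for light in corridor_lights:
            light["state"] = "green"
        return "corridor_cleared"
    return "normal_operation"

lights = [{"id": i, "state": "red"} for i in range(3)]
event = handle_frame(
    [{"type": "emergency_vehicle", "lights_on": True}], lights
)
```

The key design point is that only the short event label, not the camera stream, would need to travel to a central system.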
Special Hardware Needed
While the Cloud tends to be pictured as something nebulous and distant, the "Fog" stays close to the ground. After all, that's where the work is actually performed. The "Fog" is not composed of powerful computers, but of the weaker, distributed computers found in devices, factories, cars, street lamps and all the other products of our material culture. Not that one approach fits all: the startup Nebbiolo Technologies, for example, is proposing a different course, with industrial hardware designed specifically for Fog Computing.
"Our small central computers are the heart of Industry 4.0," says Flavio Bonomi, CEO and Founder of Nebbiolo Technologies. "They offer local storage space, conduct real-time analyses, merge processes and also serve as a firewall against external attacks."
Yet it should be noted that Fog Computing still has unresolved weaknesses and risks of its own. One is the availability of the computational units: Fog Computing is only clearly beneficial when it can be established that the data truly does require processing, yet when its computers are located in devices such as smart meters, they are susceptible to breakdowns and misuse and become difficult to monitor, warns Rick Stevenson, CEO of Opengear, a solutions provider for IT administration.
The haze around the future of Fog Computing has yet to lift. But Cloud Computing itself was once considered by many experts to be no more than a buzzword with few real prospects for future success.
The truth looks very different today, as will be on full display at CeBIT. Halls 2, 4 and 5 are completely focused on the topics of Big Data & Cloud Computing and will be presenting potential applications and visions of the future. The data center of tomorrow has its own home at CeBIT 2016: Hall 12, which hosts both DatacenterDynamic@CeBIT and the related DatacenterDynamics Converged conference.