The Internet of Things requires both centralized and distributed compute, processing and storage: Cloud and Fog.


The real business value enabled by the Internet of Things is derived not from the data itself, but from insights that enable real-time actions to increase asset efficiency, reliability and utilization. That value takes many forms, with IoT use cases ranging from supply chain management and manufacturing automation to smart lighting, parking and waste management solutions that streamline municipal operations. But to make the transition from connecting devices to saving time and money, those insights have to come from somewhere: typically centralized, scalable cloud computing platforms tailored to the device, connectivity and data management needs of the IoT.

Managing IoT from the cloud

At a basic level, cloud computing is a way for businesses to use the internet to connect to off-premises storage and compute infrastructure. In the context of the Internet of Things, the cloud provides a scalable way for companies to manage all aspects of an IoT deployment, including device location and management, billing, security protocols, data analysis and more. Cloud services also let developers leverage powerful tools to create IoT applications and deliver services quickly. On-demand scalability is key here, given the grand vision of the IoT as a world filled with smart, connected objects.

Many major technology players have brought cloud-as-a-service offerings to market to serve the IoT. Microsoft has its Azure suite; Amazon Web Services, a giant in cloud services, has an IoT-specific play; IBM offers access to the Watson platform via its Bluemix cloud; and the list goes on. Regardless of the specific product, the commonality is the ability to access flexible IT resources without making major investments in hardware and software and the management that comes with them.

But, particularly for services and applications that require very low latency or have a limited "pipe" through which to move data, there are downsides to the cloud that are better addressed at the edge. Monica Paolini, president of Senza Fili Consulting, wrote on LinkedIn, “In recent years, there has been a strong push to move everything to a centralized cloud, enabled by virtualization and driven by the need to cut costs, reduce the time to market for new services, and increase flexibility. In the process, we lost sight of how important the location of functionality is to performance, efficient use of network resources and subscriber experience. Physical distance inevitably increases latency.”
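Paolini's point about distance is easy to quantify with a back-of-the-envelope sketch. Assuming signals travel through optical fiber at roughly two-thirds the speed of light, and ignoring routing, queuing and processing delays entirely, the best-case round-trip time grows linearly with the distance to the server:

```python
# Back-of-the-envelope estimate of best-case network round-trip time.
# Assumption: signals travel through optical fiber at ~200,000 km/s
# (about two-thirds of c). Real-world latency is higher once routing,
# queuing and processing delays are added.

FIBER_SPEED_KM_PER_S = 200_000

def min_round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip time (ms) for a server distance_km away."""
    return 2 * distance_km * 1000 / FIBER_SPEED_KM_PER_S  # there and back

# A cloud data center 1,000 km away costs at least 10 ms per round trip,
# before any real-world overhead; a fog node 1 km away costs ~0.01 ms.
print(min_round_trip_ms(1000))  # 10.0
print(min_round_trip_ms(1))     # 0.01
```

No amount of cloud-side optimization can beat this floor, which is why latency-sensitive workloads end up at the edge.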

The Fog Rolls In

The OpenFog Consortium was organized to develop a cross-industry approach to enabling end-to-end IoT deployments by creating a reference architecture to drive interoperability in connecting the edge and the cloud. The group has identified numerous IoT use cases that require edge computing, including smart buildings, drone-based delivery services, real-time subsurface imaging, traffic congestion management and video surveillance. The group released its fog computing reference architecture in February.

Helder Antunes, OpenFog Consortium chairman and senior director of the corporate strategic innovation group at Cisco, said the release will drive IoT adoption by providing a “universal framework. While fog computing is starting to be rolled out in smart cities, connected cars, drones and more, it needs a common, interoperable platform to turbocharge the tremendous opportunity in digital transformation.”

Another group formed to drive edge interoperability is the EdgeX Foundry, an open source consortium approach managed by The Linux Foundation and seeded with some 125,000 lines of code developed internally by Dell Technologies. To learn more about how open source initiatives like the EdgeX Foundry are impacting the Internet of Things, read our primer, “Open Source and the IoT: Innovation through Collaboration.”

A Cloud/Fog Hybrid Approach

Let’s consider autonomous driving. Cellular networks will connect vehicles, equipped with advanced LIDAR, image processing and other self-driving technologies, to other vehicles, pedestrians, smart infrastructure and an array of cloud-based services to support in-car entertainment, predictive maintenance, remote diagnostics and the like. It’s fine for your car to access your cloud-based Netflix account or maintain operational and maintenance logs, but the cloud is not necessarily the best place for mission-critical decision making that could help a vehicle avoid a collision on the highway. Given the time (latency) demands, this type of processing is best handled at the network edge.
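One way to picture this hybrid split is as a simple dispatch rule: workloads with hard real-time deadlines run on the vehicle or a nearby fog node, while everything else goes to the cloud. The sketch below is purely illustrative; the task names, deadlines and the 50 ms cutoff are assumptions for this example, not part of any vendor platform:

```python
# Illustrative dispatch rule for a cloud/fog hybrid: route each workload
# to the edge or the cloud based on how quickly a response is needed.
# The tasks, deadlines and 50 ms threshold are hypothetical examples.

EDGE_LATENCY_BUDGET_MS = 50  # assumed cutoff: below this, a cloud round trip is too slow

def route(deadline_ms: float) -> str:
    """Return 'edge' for latency-critical work, 'cloud' for everything else."""
    return "edge" if deadline_ms < EDGE_LATENCY_BUDGET_MS else "cloud"

tasks = {
    "collision_avoidance": 10,         # must react in ~10 ms -> edge
    "predictive_maintenance": 60_000,  # minutes-scale analysis -> cloud
    "infotainment_streaming": 5_000,   # seconds of buffering is fine -> cloud
}

for name, deadline in tasks.items():
    print(f"{name}: {route(deadline)}")
```

In practice the decision also weighs bandwidth cost and data privacy, but latency alone already forces the split described above.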

To facilitate this type of hybrid approach, Cisco and Microsoft have integrated the former’s Fog Data Services with the latter’s Azure IoT cloud platform. The combination joins edge analytics, security, control and data management with centralized connectivity, policy, security, analytics, app development and more. In a recent blog post, Cisco Head of IoT Strategy Macario Namie noted, “One of the beautiful outputs of connecting ‘things’ is unlocking access to real-time data. Next is turning that data into information and, more importantly, actions that drive business value. In trying to do so, companies are finding themselves deluged with data. So much so that demand for vast compute and storage needs have arisen, very nicely handled by public cloud providers. But the cost of transport and speed of processing has also increased, which is challenging for many use cases such as mission-critical services...As a result, many IoT initiatives are now distributing this computing power across the edge network, data centers, and public cloud.”
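Namie's “cost of transport” point is why edge analytics usually means filtering or aggregating data before it crosses the network. As a minimal, hypothetical sketch (the sensor readings and the one-summary-per-batch design are assumptions for illustration), a fog node might forward a compact summary instead of every raw sample:

```python
# Minimal sketch of edge-side aggregation: instead of shipping every raw
# sensor sample to the cloud, a fog node forwards one compact summary per
# batch. The readings below are made-up example data.

def summarize(readings: list[float]) -> dict:
    """Collapse a batch of raw samples into a single summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(sum(readings) / len(readings), 2),
    }

# e.g. temperature samples from one device; 35.5 is an anomalous spike
raw = [21.0, 21.2, 20.9, 35.5, 21.1]
summary = summarize(raw)

# One small record crosses the network instead of the whole sample stream,
# while the cloud still sees the spike via the max field.
print(summary)
```

The cloud side keeps enough information for trend analysis and anomaly alerts, while the per-sample traffic that drives up transport cost stays at the edge.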


Written by Jennifer Halstead

Jennifer Halstead, MBA, CPA brings more than 20 years financial industry experience to Link Labs. She began her career in finance within the pharmaceutical industry and has continued in both public accounting and private companies. She passed the CPA exam with the 3rd highest score in the state and completed her MBA with an accounting concentration (summa cum laude). Jennifer has worked with several software companies and has led multiple venture financing, merger and acquisitions deals. She has helped companies expand internationally and has managed the finance department of a startup to 33 consecutive quarters of growth prior to acquisition. After the acquisition, she served as the Controller of Dell Software Group’s Data Protection Division where she managed a portfolio of multiple hardware and software products to scale and achieve over triple-digit growth worldwide in 18 months. Jennifer brings a depth of finance experience to the Link Labs team.

