
Cisco: Linux is the “Single and Best” Tech for IoT

July 7, 2014 (updated August 22, 2017) | Blog

Michael Enescu

Cisco earlier this year unveiled its plans to build smarter routers and switches to help manage the massive flows of data expected between Internet-connected devices and the data center. This re-architecting of the Internet to bring computing capabilities to the edge of the network is what the company calls “fog computing,” and it could help alleviate the data center strain that Gartner analysts predict will come from 26 billion installed Internet of Things units by 2020.

“Tens of billions of ever-smarter edge devices create data (we call it big data now) that is impossible to move fast enough through the network,” said Michael Enescu, CTO of Open Source Initiatives at Cisco. “We have to deal with this; otherwise it will present huge scale and security issues in the core.”

Enescu, the former vice president of product at XenSource and a founding member of the Java Content and J2ME projects at Sun, will further address this connection between cloud, fog computing, and the Internet of Things in his keynote at LinuxCon and CloudOpen in Chicago, Aug. 20-22. In this Q&A he discusses the differences between fog and cloud computing, the role of Linux in the Internet of Things, and what his keynote will cover.

Linux.com: How do you define “Fog computing?”

Michael Enescu: Fog computing is a virtualized environment that provides compute, storage, and networking services between end devices, typically located at the edge of the network, and traditional cloud computing data centers. While some may conflate it with the Internet of Things, fog computing is a non-trivial extension of cloud computing.

How is that different from cloud computing?

Several characteristics make fog computing non-trivial: edge location, location awareness, and low latency under highly constrained connectivity, bandwidth, energy availability, and scale. These are some of the challenges that make fog computing different from the cloud.
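By way of illustration (a minimal sketch of ours, not a Cisco implementation): a fog node can run the first pass of analytics right where the data is produced, forwarding a compact summary over the constrained uplink instead of every raw reading.

    import statistics

    def fog_node(readings, threshold=1.0):
        """Aggregate raw sensor readings at the edge and decide what goes upstream.
        A hypothetical sketch; real fog platforms are far more involved."""
        summary = {
            "count": len(readings),
            "mean": statistics.mean(readings),
            "max": max(readings),
        }
        # Only escalate to the cloud when something looks anomalous;
        # otherwise the raw data never crosses the constrained uplink.
        summary["alert"] = summary["max"] - summary["mean"] > threshold
        return summary

    # Simulate a window of temperature readings from an edge device.
    window = [21.2, 21.3, 21.1, 24.9, 21.2]
    print(fog_node(window))  # a few bytes upstream instead of the whole window

The design choice is the one Enescu describes: keep the compute next to the data, and let only results travel.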

Why is that distinction important and what does it mean for the cloud?

The network scales very differently than compute or storage, particularly bandwidth at the edge. Another way to say that is that the mismatch between Moore’s law (though Gordon and I like to call it Carver Mead’s law) and Nielsen’s law is creating a tidal wave at the edges. Storage capacity grows exponentially faster than compute power, which in turn grows exponentially faster than bandwidth at the edge. Thus, tens of billions of ever-smarter edge devices create data (we call it big data now) that is impossible to move fast enough through the network.

So we discover that data has “gravity”: applications will come to data faster than data can come to applications in the cloud. This is a gigantic change in the compute model as we knew it in the data center or cloud, and it is why IoT, (big data) analytics, virtualization, and visualization are becoming very important. We have to deal with this; otherwise it will present huge scale and security issues in the core.
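To put rough numbers on that mismatch (our back-of-the-envelope figures, not Enescu’s): Moore’s law is commonly stated as compute doubling every ~18 months, about 60 percent a year, while Nielsen’s law puts high-end user bandwidth growth at about 50 percent a year. Compounded, even that modest gap widens dramatically:

    # Back-of-the-envelope: compound annual growth of compute vs. bandwidth.
    # Assumed rates (illustrative): Moore's law ~60%/yr (doubling every ~18 months),
    # Nielsen's law ~50%/yr for high-end user bandwidth.
    COMPUTE_GROWTH = 1.60
    BANDWIDTH_GROWTH = 1.50

    for years in (5, 10, 20):
        compute = COMPUTE_GROWTH ** years
        bandwidth = BANDWIDTH_GROWTH ** years
        print(f"{years:2d} yrs: compute x{compute:,.0f}, "
              f"bandwidth x{bandwidth:,.0f}, gap x{compute / bandwidth:.1f}")

Under these assumptions, after 20 years compute has outpaced the uplink by nearly 4x; that compounding gap is what gives data its “gravity.”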

What is the role of Linux in this new form of computing?

Linux is the single thing that is, or could be, common across all compute platforms, in devices large and small, whether in the core or at the edge of the network. It is therefore the single and best opportunity we have to get the development, collaboration, and, fundamentally, the deployment model right.

What else will you cover in your CloudOpen keynote?

A more colorful aperture on what IoT, data gravity, and fog computing consist of, the architectural changes involved, and the relevant opportunities in IoT, fog computing, and, most importantly, Linux, open source, and innovation.

Register now to attend CloudOpen North America, co-located with LinuxCon in Chicago, Aug. 20-22, 2014.
