Computing beyond the cloud

March 2, 2022

Assistant Professor Zheng Song explains how edge computing will help realize the dream of an "Internet of Things."

A whimsical collage graphic featuring Assistant Professor Zheng Song, floating above the clouds, encircled by an orbit of IoT devices.
Graphic by Violet Dashi

In a world where once-niche phrases like “the cloud” and “Internet of Things” (or IoT) have become everyday terms, edge computing remains something of an enigma to the general public. Part of the reason is that it’s a fairly new field. UM-Dearborn Assistant Professor Zheng Song, who’s made edge computing his primary research area, says the paper that sparked the discipline dates back a mere six years. Edge computing is so novel, in fact, there’s not even a dominant paradigm that defines it yet, though computer scientists seem to agree that as edge computing matures, it will be transformative. 

To see why, it helps to understand some of the limitations of today’s connected environment. Cloud computing is a crucial part of that landscape and it’s absolutely transformed how we think about data. Before everything was connected to the internet, our digital content was often stored locally. Think back a few years, and you’ll remember that Netflix was once a DVD-rental-by-mail company, and you kept your documents, emails and vacation photos on your hard drive. To back them up, you needed an external hard drive or a CD/DVD burner. That seems old-school now. Today, movies and emails all exist in “the cloud,” meaning your data largely lives outside your devices on enormous central servers. This makes it possible to stream a movie or edit a Google Doc anytime, from any device.

But storing things on the cloud for instant, anywhere access has its limitations. The first challenge is the sheer amount of data that our devices are constantly downloading and uploading. Transmitting all that information requires robust networks that, just like highways, get bogged down when there’s too much traffic. Another challenge is that the seemingly instant access provided by the cloud isn’t really instantaneous. In particular, Song says the physical distance between an end user and a cloud server matters quite a lot. If you live several hundred miles away from a cloud server, which is common since servers are highly centralized, you’re going to bump up against both latency and network congestion issues. This might not matter much if you’re just waiting an extra second for an app to deliver all the pictures of cats in your cloud-stored photo archive. But in high-stakes, time-sensitive applications, like the split-second decisions your autonomous vehicle will one day make, delays aren’t something we can really accommodate.

Much of edge computing’s potential lies in solving challenges like this, and in general, it does so by shrinking the distance between computing resources and the end user requesting them. By pushing more storage and processing power away from central hubs to the “edges” of networks, we reduce latency and congestion and get the results we want more swiftly. That’s the core idea, anyway. But Song says there are still different ideas about how to organize this decentralized edge computing environment, and at least three major paradigms are contending for market space. The most straightforward is one in which cloud service providers, like Amazon and Akamai, simply build a network of mini data and computing centers that are more geographically dispersed, putting storage and computing services closer to end users. The wireless network operator Verizon is pushing a variation on this theme by adding data and service centers right onto its cell towers and other network infrastructure.

But a third approach, the one Song is working on, looks much different from the other two. Rather than add a bunch of new storage and computing resources, Song is looking into ways that connected devices already in close proximity could work together to form self-organized edge computing platforms that are hyperlocal and thus very, very fast. Right now, for example, when your smart doorbell sends you an alert that someone is at your door, that notification is the end result of a series of executed tasks that still relies on the cloud. First, the camera on the doorbell records some video. It then uses your WiFi connection to send the imagery to the cloud for processing. Then a cloud server analyzes the video and makes a judgment about what’s happening. And if the AI determines there’s a person at your door, it’ll send a notification to your phone. But all this takes time. And if you live far away from a cloud server, or all this happens during a period of network congestion, there might be a substantial delay between the time the video is recorded and when you get a notification. This is a common complaint, in fact, about WiFi-powered security systems. If it happens to be a burglar at your door, the intruder might already be in your house by the time you hear about it.
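
For readers who like to see the moving parts, here is a minimal sketch of that cloud-dependent flow, written in Python. Every name and delay in it is a hypothetical illustration rather than a real smart-doorbell or cloud API; the point is simply that the round trip to a distant server dominates how long you wait for the alert.

import time

CLOUD_ROUND_TRIP_S = 0.4  # assumed one-way delay to a distant, congested cloud server

def record_clip() -> bytes:
    """Stand-in for the doorbell camera capturing a short video clip."""
    return b"fake-video-bytes"

def cloud_detect_person(clip: bytes) -> bool:
    """Simulate uploading the clip and waiting for the cloud AI's verdict."""
    time.sleep(CLOUD_ROUND_TRIP_S)   # clip travels over WiFi to the data center
    person_detected = len(clip) > 0  # the actual analysis happens far away
    time.sleep(CLOUD_ROUND_TRIP_S)   # the verdict travels back
    return person_detected

def notify_phone(message: str) -> None:
    print(f"Push notification: {message}")

start = time.time()
if cloud_detect_person(record_clip()):
    notify_phone("Someone is at your door")
print(f"End-to-end delay: {time.time() - start:.2f}s, growing with distance and congestion")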

Now, let’s rerun the same scenario, but with a hyperlocal edge computing environment. As before, your doorbell records the video, but instead of sending it to a cloud server hundreds of miles away for processing, it uses local resources that can execute the tasks instead. For example, Samsung’s new SmartThings Hub is being billed as just that: a cloudless data processing center for your various smart home IoT devices. And some newer devices contain that kind of processing power right in the device itself. But Song says there’s also a lot of potential for building edge networks out of things we already have readily available. Cell phones, for example, also have image processing capabilities and could be leveraged as super fast, on-demand edge nodes for a smart doorbell during the many hours someone is home. And Song is particularly excited by the possibilities for phone-car collaborations. For example, once autonomous vehicles are doing the driving for us, cars will feature more entertainment options for passengers. Given the car’s finite computing power, however, it’ll be important not to strain the vehicle’s resources as it’s making real-time navigation decisions. To protect those resources, the car might form an edge network with your phone, asking the phone to use its cell connection to give you information about the sights you’re seeing along the way, or to stream and mirror a movie to the car’s LED screen. That keeps the vehicle’s own internet connection and processing power free for the important stuff. To the end user, of course, the experience is seamless: it looks like the car is doing it all, even though some of the tasks are performed by the car’s computer and some are done by your phone.
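
As a rough illustration of that “use whatever is nearby” idea, the sketch below picks the fastest capable local device and only falls back to the cloud when nothing nearby can do the job. The device list, capability flags and latency numbers are invented for the example; they are not drawn from any real SmartThings or phone API.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    can_run_vision: bool   # can this device analyze the doorbell video?
    round_trip_ms: float   # assumed time to reach the device and get an answer back

NEARBY = [
    Device("smart_hub", can_run_vision=True, round_trip_ms=5),
    Device("owner_phone", can_run_vision=True, round_trip_ms=12),
    Device("smart_bulb", can_run_vision=False, round_trip_ms=3),
]
DISTANT_CLOUD = Device("distant_cloud", can_run_vision=True, round_trip_ms=800)

def pick_executor(devices):
    """Prefer the fastest capable device in the room; fall back to the cloud."""
    capable = [d for d in devices if d.can_run_vision]
    return min(capable, key=lambda d: d.round_trip_ms) if capable else DISTANT_CLOUD

print(pick_executor(NEARBY).name)  # -> smart_hub: milliseconds away, not hundreds of miles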

To make such invisible “collaborations” between different devices happen requires special software called “middleware,” which Song recently received a National Science Foundation grant to develop. Song describes middleware as something like a traffic cop who, when a stoplight malfunctions, can signal to different drivers to yield or go, and thus keep traffic moving in a harmonious way. The middleware would be able to analyze a user’s request, divvy up which parts of the task can be handled by various local devices, send out orders, collect the results, and then send a unified response back to the user. It sounds like a lot of steps, but because they all happen hyperlocally and cut out the cloud, the process can happen really, really fast. Moreover, Song says edge computing like this has some distinct security and privacy advantages. Because the process doesn’t involve sending so much data out to the cloud, and can even rely on devices that are all owned by a single user, there’s less opportunity for data to get into the wrong hands.
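
To make the traffic-cop metaphor concrete, here is a toy sketch of the kind of coordination such middleware would handle: split a request into subtasks, hand each piece to a nearby device, gather the results and return one combined answer. The subtasks and “devices” below are purely hypothetical stand-ins; the actual NSF-funded middleware is still under development.

from concurrent.futures import ThreadPoolExecutor

def analyze_video(clip):
    """Subtask that might run on a smart hub with spare image-processing power."""
    return "person at the door"

def look_up_visitor(face_id):
    """Subtask that might run on the owner's phone, which holds the contact list."""
    return f"possible match: {face_id}"

def orchestrate(clip):
    """The 'traffic cop': divide the work, dispatch it, merge the answers."""
    with ThreadPoolExecutor() as pool:   # threads stand in for separate edge devices
        verdict = pool.submit(analyze_video, clip)
        identity = pool.submit(look_up_visitor, "neighbor_42")
        return {"alert": verdict.result(), "who": identity.result()}

print(orchestrate(b"fake-video-bytes"))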

The potential applications for edge computing are multiplying fast. In addition to studying vehicle-phone collaborations, Song, for example, is interested in an even edgier edge application: vehicle-drone collaborations. And here in Southeast Michigan, there’s a huge appetite for edge computing in the data-driven advanced manufacturing space. For sure, there’s a lot riding on researchers like Song to perfect middleware applications, which could give edge computing the kind of ubiquity the cloud has today. Fortunately, he won’t be tackling his research questions alone. Some key local allies include students in his new edge computing course. Over the next few semesters, they’ll be building “technology collaboration” applications between everyday IoT devices, allowing Song to put the middleware platform through its paces. A few successful projects could signal a bright future for his middleware, and an exciting future for edge computing.

###

Story by Lou Blouin