IoT at the Edge: How AI Will Transform IoT Architecture

By Lou Lutostanski, vice president, Internet of Things, Avnet

Futurists say artificial intelligence (AI) and the Internet of Things (IoT) will transform business and society more profoundly than the industrial and digital revolutions combined, and we’re now starting to see how that world might shape up. Yet even as the future unfolds before our eyes, what few are talking about is how AI-driven IoT actually gets implemented in an effective and profitable way. One critical factor – if not the critical factor – is where the intelligence actually resides and how that influences IoT architecture.

Many organizations believe the rightful place for AI is in the cloud, since that’s where they are moving their data and IT computing power. But a key requirement for functional IoT is interoperable connections from the various sensors at the edge to a gateway, and bi-directionally between the gateway and the cloud – which introduces the problem of latency.

Many of the AI and machine learning applications that are about to truly change industries and shape our world require real-time responsiveness. For example, while we might not mind the slight delay Amazon Echo’s Alexa takes to answer our questions about today’s weather, the responsiveness required of autonomous vehicles on the road or industrial machinery in a factory is a whole different matter.

Many AI applications require significant computational muscle to process algorithms and device data. When real-time response and low latency are critical, you need edge computing architectures, but that isn’t always the case: AI can still run in the cloud, in a data warehouse, at the edge or on an IoT device itself, or in some combination of these. To create the most efficient and sustainable IoT architecture, you need to know what type of computing power goes where. That lets you balance the economies of scale offered by the cloud against the performance requirements of AI processing at the edge. Some refer to this as “fluid computing,” with different levels of computing intelligence and processing distributed throughout the network architecture, but it’s really an all-encompassing term for the shift from IT computing power in the cloud to operational technology (OT) computing power at the edge.
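
To make the trade-off concrete, here is a minimal sketch, in Python, of how a placement rule might route AI workloads to the edge or the cloud based on a latency budget and model size. The workloads, thresholds and field names are illustrative assumptions, not a prescribed Avnet design.

```python
# Minimal sketch: routing AI workloads to edge or cloud based on latency needs.
# The workloads, latency budgets and thresholds below are illustrative only.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    latency_budget_ms: float   # how quickly a decision must come back
    model_size_mb: float       # rough proxy for the compute/memory it needs

EDGE_LATENCY_CUTOFF_MS = 100   # assumed: anything tighter than this stays local
EDGE_MODEL_LIMIT_MB = 250      # assumed: what a typical gateway can host

def place(workload: Workload) -> str:
    """Return 'edge' or 'cloud' for a given workload (illustrative rule only)."""
    if workload.latency_budget_ms < EDGE_LATENCY_CUTOFF_MS:
        if workload.model_size_mb <= EDGE_MODEL_LIMIT_MB:
            return "edge"          # real-time and small enough to run locally
        return "edge (quantized)"  # real-time but needs a slimmed-down model
    return "cloud"                 # latency-tolerant work rides on cloud economics

if __name__ == "__main__":
    for w in [
        Workload("voice assistant query", latency_budget_ms=1500, model_size_mb=900),
        Workload("factory machine anomaly stop", latency_budget_ms=20, model_size_mb=40),
        Workload("vehicle obstacle detection", latency_budget_ms=30, model_size_mb=600),
    ]:
        print(f"{w.name}: {place(w)}")
```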

Securing the IoT Architecture

Naturally, security is another concern. IoT opens up a lot of holes for bad actors, since encryption and other security protections are difficult to pack into endpoint devices. Here, architectures employing secure gateways between the IoT devices and the cloud can mitigate security risks while still providing low latency. Data must be trustworthy along the entire path from device to cloud: without sufficient security throughout the architecture, organizations and the IoT and AI systems they implement are vulnerable, and their AI decisions may end up based on compromised or bad data.
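
As one illustration of what trusted device-to-cloud data can look like at the gateway hop, below is a minimal Python sketch of a gateway endpoint that only accepts devices presenting certificates signed by a known certificate authority (mutual TLS). The certificate paths and port are placeholders, and this is a sketch of the general technique rather than a specific product configuration.

```python
# Minimal sketch: a gateway that only trusts devices presenting certificates
# signed by a known CA (mutual TLS). File paths and port are placeholders.

import socket
import ssl

DEVICE_CA = "certs/device-ca.pem"        # CA that signs legitimate device certs
GATEWAY_CERT = "certs/gateway-cert.pem"  # gateway's own certificate
GATEWAY_KEY = "certs/gateway-key.pem"

# Server-side TLS context: the gateway proves its identity to devices and
# requires each device to prove its identity back with a client certificate.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile=GATEWAY_CERT, keyfile=GATEWAY_KEY)
context.load_verify_locations(cafile=DEVICE_CA)
context.verify_mode = ssl.CERT_REQUIRED   # reject devices without a valid cert

with socket.create_server(("0.0.0.0", 8883)) as sock:
    with context.wrap_socket(sock, server_side=True) as tls_sock:
        conn, addr = tls_sock.accept()    # handshake fails for untrusted devices
        with conn:
            payload = conn.recv(4096)     # data arriving here came over an
            print(addr, payload)          # authenticated, encrypted channel
```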

Redundancy is a consideration as well. Organizations need to determine whether they have designed sufficient redundancy into their architectures so that when something goes down – and it eventually will – the network can recover quickly.
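
As a simple illustration of designed-in redundancy, here is a minimal Python sketch of a device-side failover rule that retries a primary ingestion endpoint, falls back to a backup, and signals the caller to buffer data locally if both are down. The endpoint URLs, retry counts and timeouts are placeholder assumptions.

```python
# Minimal sketch: failing over from a primary ingestion endpoint to a backup.
# Endpoint URLs, retry counts and timeouts are illustrative placeholders.

import time
import urllib.error
import urllib.request

ENDPOINTS = [
    "https://primary.example.com/telemetry",   # preferred path
    "https://backup.example.com/telemetry",    # redundant path if primary is down
]

def send_reading(payload: bytes, retries_per_endpoint: int = 2) -> bool:
    """Try each endpoint in order; return True once any of them accepts the data."""
    for url in ENDPOINTS:
        for attempt in range(retries_per_endpoint):
            try:
                req = urllib.request.Request(url, data=payload, method="POST")
                with urllib.request.urlopen(req, timeout=5) as resp:
                    if resp.status < 300:
                        return True
            except (urllib.error.URLError, OSError):
                time.sleep(2 ** attempt)   # brief backoff before retrying
    return False   # caller can buffer locally and retry later

if __name__ == "__main__":
    ok = send_reading(b'{"sensor": "temp-01", "value": 22.4}')
    print("delivered" if ok else "buffered for later")
```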

What all this means is that AI-driven IoT at the edge will be a highly complex ecosystem, with many moving parts and expertise required across multiple disciplines, and it will evolve over time as we learn more about this new world we’re shaping. Get it wrong, and security risks, unexpected downtime, low efficiency and information latency will hinder an organization’s ability to deliver on the promise of IoT. The next generation of innovators will need to draw on multiple disciplines to take their vision from idea to design, from prototype to production, from operation to maintenance.

A final point is the development of new hardware and software. As AI moves to the edge, we’ll see more manufacturers designing AI-specific chips for IoT deployment. Venture capitalists are backing startups in this area, and large technology powerhouses such as Intel, Microsoft, Google and Apple are getting in on custom chips as well. The big cloud players like Microsoft and Amazon will introduce new edge-to-cloud hybrid computing services. And we’re seeing an influx of development kits entering the market designed specifically to accelerate prototyping of AI-at-the-edge solutions; the compute power of these solutions will need to evolve as our collective needs do.

Bringing all these moving pieces together means building a great deal of flexibility into the solutions. It will require a technology partner with the advisory, supply chain and ecosystem resources necessary to navigate a rapidly changing world. At Avnet, we believe that AI-driven IoT at the edge will be key to driving the disruptive transformation needed for long-term business growth.

Lou Lutostanski

Lou Lutostanski is vice president of Internet of Things (IoT) for Avnet.

Mr. Lutostanski works across the company to identify and create opportunities to leverage Avnet’s capabilities and end-to-end ecosystem to expand the company’s reach and expertise to entrepreneurs, startups, leading technology OEMs and other IoT innovators.
