
I first heard about using Kubernetes for Internet of Things (IoT) and edge computing less than a year ago. I was pretty new to Kubernetes, and didn't know a lot about IoT. So it seemed like a big pile of buzzwords. And even if it seems like a lot of buzzwords to you, you should still consider Kubernetes when deciding how to manage your IoT and edge applications. In this article I will discuss some of the features and frameworks that are especially important when running Kubernetes on the edge.

First of all, let's clear up some misconceptions, or at least the misconceptions I had before I got into IoT. I thought edge computing and IoT were the same thing. It turns out they're not.

IoT is all about connecting physical devices that sense and interact with their environment to the cloud. Edge computing is about bringing computing resources closer to the end users (and IoT devices) that connect to them. Edge computing doesn't have to involve physical devices; it can simply be a server located closer to the end users. For instance, Content Delivery Networks bring content storage closer to end users to save bandwidth and reduce latency.

Another misconception I had was that Kubernetes needed too many resources to run on smaller devices. But I can actually run Kubernetes on the Raspberry Pi that sits on my shelf collecting dust. k3s makes it possible to run Kubernetes clusters on everything from small devices to powerful VMs with your favorite cloud provider. How? It's opinionated, strips out a lot of cruft you don't really need, and is packaged as a single binary of less than 40MB.
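
If you want to try it yourself, here's a minimal sketch of how a k3s server on a Raspberry Pi could be slimmed down through its configuration file. The file path and keys follow k3s's documented configuration file format, but the disabled components and the node label are just assumptions for illustration; check them against the k3s version you run.

# /etc/rancher/k3s/config.yaml, read by the k3s server at startup
# Skip packaged components you don't need on a constrained device
disable:
  - traefik
  - metrics-server
# Make the kubeconfig readable without sudo (convenient on a lab device)
write-kubeconfig-mode: "0644"
# Label the node so workloads can be scheduled onto it later
node-label:
  - "node-type=edge-gateway"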

We'll walk through three concrete use cases and the Kubernetes features that are useful in each of them.

Industrial IoT

Industrial IoT deployments can have millions of IoT devices. The devices usually don't have a lot of computing resources and can't act as nodes in a Kubernetes cluster. However, they send the collected data to IoT gateways (also known as hubs or bridges), which often can.

IoT gateways can be nodes in a cluster, running k3s or a lightweight KubeEdge agent. Either in your main cluster, or in a local cluster on the edge. The latter is especially useful if network connectivity at the edge location is poor.

Kubernetes provides pod priority and quality of service classes that should be used to ensure critical workloads can't be preempted by another workload consuming too many resources. Using these features ensures that workloads aren't killed willy-nilly, but according to a priority you set. And yes, workloads will be killed when you run out of memory in your constrained edge environment.
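
Here's a minimal sketch of what that can look like. The names, image, and resource numbers are made up for illustration: a PriorityClass for critical edge workloads, and a pod that requests exactly as much as it limits, which places it in the Guaranteed QoS class and makes it one of the last candidates for eviction under memory pressure.

apiVersion: scheduling.k8s.io/v1
kind: PriorityClass
metadata:
  name: edge-critical              # hypothetical name
value: 1000000                     # higher value means higher priority
globalDefault: false
description: "Critical workloads on edge gateways"
---
apiVersion: v1
kind: Pod
metadata:
  name: sensor-ingest              # hypothetical gateway workload
spec:
  priorityClassName: edge-critical
  containers:
  - name: ingest
    image: example.com/sensor-ingest:1.0   # placeholder image
    resources:
      # Requests equal to limits gives the pod the Guaranteed QoS class
      requests:
        cpu: "250m"
        memory: "128Mi"
      limits:
        cpu: "250m"
        memory: "128Mi"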

Edge applications

Applications requiring low latency or a lot of bandwidth are typical candidates for edge deployments. Online streaming services, communication applications, and gaming servers are some examples that benefit, with reduced latency and network contention.

Priority and quality of service features help schedule important workloads in this use case too. For instance, video calls should have higher priority than chat messages. In addition, NetworkPolicy and the more experimental traffic shaping plugin together control both which other pods a pod is allowed to communicate with and how much bandwidth it's allowed to consume.
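
As a rough sketch (the labels, port, and bandwidth values are invented for illustration), a NetworkPolicy can restrict which pods are allowed to reach the video-call pods, while the kubernetes.io/ingress-bandwidth and kubernetes.io/egress-bandwidth annotations are picked up by the experimental bandwidth plugin, if your CNI supports it:

apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: video-call-ingress         # hypothetical policy
spec:
  podSelector:
    matchLabels:
      app: video-call
  policyTypes:
  - Ingress
  ingress:
  - from:
    - podSelector:
        matchLabels:
          app: signalling          # only the signalling service may connect
    ports:
    - protocol: UDP
      port: 5004
---
apiVersion: v1
kind: Pod
metadata:
  name: video-call-0               # hypothetical pod
  labels:
    app: video-call
  annotations:
    # Traffic shaping hints, honored only if the CNI's bandwidth plugin is enabled
    kubernetes.io/ingress-bandwidth: "10M"
    kubernetes.io/egress-bandwidth: "10M"
spec:
  containers:
  - name: call
    image: example.com/video-call:1.0   # placeholder image
    ports:
    - containerPort: 5004
      protocol: UDP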

Mixed and hybrid clusters

Sometimes parts of your applications can't run in a public cloud, due to compliance or special hardware requirements. Or maybe you have some nodes in your public cloud with powerful GPUs? Kubernetes can be configured to run applications on specific nodes, or on nodes that have, e.g., GPU resources available.
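
As an example (the node label is an assumption about how you've labeled your nodes, and nvidia.com/gpu assumes the NVIDIA device plugin is installed), a pod can be pinned to specific nodes with a nodeSelector and ask the scheduler for a GPU through resource limits:

apiVersion: v1
kind: Pod
metadata:
  name: gpu-inference              # hypothetical workload
spec:
  # Only schedule onto nodes carrying this (assumed) label
  nodeSelector:
    hardware: gpu
  containers:
  - name: inference
    image: example.com/inference:1.0   # placeholder image
    resources:
      limits:
        # Advertised by the NVIDIA device plugin; the scheduler only places
        # the pod on a node with a free GPU
        nvidia.com/gpu: 1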

There's no silver bullet

Using Kubernetes on the edge gives you almost all the features you're used to from your centralized cloud. You can manage deployments with GitOps, run your favorite observability stack, and apply the same policies. However, Kubernetes can't solve scaling issues if the underlying platform on the edge doesn't support it. Furthermore, if (when!) you have to manage the hardware and operating systems your edge applications run on, you face a lot of security threats that Kubernetes can't handle alone.

If you're interested in using Kubernetes for edge deployments, check out this talk from the Kubernetes IoT Edge Working Group. The working group has also published a whitepaper that discusses the edge security challenges.
