
Session

Look who's talking!


Programmable Embedded Vision Sensors

Dan Mihai Dumitriu - Midokura

WebAssembly is a great fit for embedded systems that need to be programmed over the air, and AI-powered sensors are one such example. We use Wasm both to isolate third-party code and to enable polyglot development on embedded vision sensors.

Embedded software development has traditionally relied on a monolithic approach: firmware written by a single vendor and updated infrequently. Many IoT devices lack a full Linux OS and hardware-based memory isolation, so safety is a concern. As IoT devices become increasingly connected to the cloud, the need for customization and frequent updates grows.

We believe that WebAssembly (Wasm) has the potential to change this paradigm by enabling truly customizable devices. Our runtime agent and cloud management service act like Kubernetes for embedded devices. Using ahead-of-time (AoT) compilation to native code, we achieve near-native performance.
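To make the "Kubernetes for embedded devices" idea concrete, a deployment of a Wasm module to a fleet could be described declaratively. The spec below is a minimal sketch with invented field names and module names; it is not Midokura's actual schema.

```python
import json

# Hypothetical declarative spec for deploying a Wasm module to a device
# fleet; every field name here is an illustrative assumption.
deployment = {
    "apiVersion": "v1",
    "kind": "WasmModuleDeployment",
    "spec": {
        "module": "people-counter.wasm",
        "compile": "aot",            # AoT-compile to native before download
        "targets": {"capability": "camera"},
        "updatePolicy": "rolling",   # stage the update across the fleet
    },
}

manifest = json.dumps(deployment, indent=2)
print(manifest)
```

As with Kubernetes, the agent on each device would reconcile its running modules against such a spec, so an over-the-air update is just a change to the declared state.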

We have also developed a model for vision AI called Vision Sensing Application on top of the Wasm layer. A cloud-based service automatically specializes the application for the deployment targets, removing the need for developers to be concerned with device architecture or capabilities.
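The specialization step can be pictured as a lookup from a reported device profile to a concrete native compile target, so developers never name an architecture themselves. The profiles and target names below are made up for illustration; they are not the real service logic.

```python
# Map a device-reported profile to the native target the cloud service
# should AoT-compile the Wasm module for. Illustrative assumptions only.
DEVICE_TARGETS = {
    "camera-mcu": "thumbv7em",   # Cortex-M-class vision sensor
    "camera-mpu": "aarch64",     # Linux-class smart camera
}

def select_target(device_profile: str) -> str:
    """Pick the AoT compile target for a given device profile."""
    try:
        return DEVICE_TARGETS[device_profile]
    except KeyError:
        raise ValueError(f"unknown device profile: {device_profile}") from None

print(select_target("camera-mcu"))  # thumbv7em
```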

To further streamline the development process, we have added a REST API and a visual programming interface inspired by Node-RED.
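With a REST API in front of the platform, deploying a module could reduce to a single POST. The endpoint path and payload below are hypothetical, shown only to convey the shape of the interface; the request is built but deliberately not sent, so the sketch runs without a live service.

```python
import json
import urllib.request

# Hypothetical endpoint; the real API paths are not part of this abstract.
url = "https://console.example.com/api/v1/deployments"

payload = json.dumps({
    "module": "people-counter.wasm",
    "targets": {"capability": "camera"},
}).encode("utf-8")

req = urllib.request.Request(
    url,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would submit it; omitted here so the
# example stays self-contained.
print(req.get_method(), req.full_url)
```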

A brief demo will be shown.
