Jitters in Operating Systems for the Internet of Things
The Internet of Things (IoT) is an extension of the internet into the physical world through sensing, actuation, control, and interaction with embedded devices. A large number of IoT devices are being deployed in the world, and the emerging applications involving IoT require reliable network connectivity. Latency is one of the most critical network performance metrics determining the user experience with IoT applications. There are two aspects of the latency metric: the overall delay and the jitter. Most of the focus is on low delay, but many applications, especially those with real-time-like requirements, also need low jitter in latency so that protocols and system-level interactions remain predictable. This thesis presents a study of jitter in IoT operating systems, observed through various networking-related operations and systems. The characteristics of the operating system (OS) in an IoT system can greatly affect the execution and performance of applications. This thesis presents a study of network stack performance, layer-wise packet traces, and their analysis. The key focus of the analysis is identifying the presence of jitter in IoT OSs and the contributing factors behind it. The approach taken in this thesis is to perform a series of measurement studies of basic applications on IoT hardware and OS platforms. We evaluate this study with two OSs, RIOT and Contiki OS, and two IoT hardware platforms, the IoTLAB-M3 open node and TelosB. This thesis provides guidance on the achievable network performance and characteristics for different system requirements of IoT applications.
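The distinction between the two aspects of latency can be sketched numerically. The latency samples and estimator choices below are illustrative assumptions, not measurements from this thesis; the successive-difference estimator is the smoothing-free core of the RFC 3550 interarrival jitter computation:

```python
from statistics import mean, pstdev

# Hypothetical round-trip latency samples (ms) from a sensor node.
latencies_ms = [12.1, 12.3, 15.8, 12.2, 12.4, 19.0, 12.2]

# Aspect 1 -- overall delay: the average latency.
overall_delay = mean(latencies_ms)

# Aspect 2 -- jitter: mean absolute difference between successive samples
# (cf. the RFC 3550 interarrival jitter estimate, without smoothing).
jitter = mean(abs(b - a) for a, b in zip(latencies_ms, latencies_ms[1:]))

# Standard deviation of the samples is another common jitter measure.
jitter_stddev = pstdev(latencies_ms)

print(f"delay={overall_delay:.2f} ms, "
      f"jitter={jitter:.2f} ms, stddev={jitter_stddev:.2f} ms")
```

A trace can have a low average delay yet high jitter, as above, which is exactly the case that matters for applications with real-time-like requirements.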