What possibilities would open up if we could train an ultra-robust IoT platform to recognize and infer a wide range of environmental occurrences, such as human activities or the mechanical health of objects?

SCS Faculty and Researchers

Yuvraj Agarwal

Chris Harrison

It’s 2050. The first generation raised in an interconnected world, the “Millennials”, are starting to experience the ailments of old age.

Emily, a member of this generation, goes to her doctor and says she’s starting to experience memory loss. She loses track of her keys, her wallet and her schedule. Emily lives alone. Her doctor gives her a set of Mites, small computing devices, and tells her to place them around her house. They contain a combination of physical sensors that detect what is happening nearby. The Mites connect to Emily’s smart speakers and tell her when her laundry is done and when she’s left the oven on. They are helpful to Emily, but even more helpful to her doctor.

With Emily’s permission, her doctor downloads information from the Mites’ accompanying software. At Emily’s next appointment, her doctor asks if she took any phone calls the previous day. Emily says no. Her doctor checks the information provided by the Mites. Though the sensors don’t capture the contents of the conversation, they registered the distinct sound of Emily’s ringtone and 20 minutes of chatter. He asks Emily how she’s been sleeping. Emily can’t remember. The Mites reveal the sound of ruffling sheets, Emily coughing, and a few trips to the bathroom throughout the night. Emily’s doctor is concerned.

One day, the Mites identify the sound and vibration of a body hitting the ground and detect Emily’s heat signature on the bedroom floor. Acting on information from the Mites, her home emergency response system alerts medics.

This is one potential application for Mites, actual “super-sensing” devices that we have been developing in collaboration with the Future Interfaces Group in the Human-Computer Interaction Institute. The square, flat devices, about 2 inches on each side, combine nine different sensors (including ones for vibration, sound, light, humidity, temperature and magnetism) into one all-purpose “Mite” that can be plugged into a small USB wall adapter. The sensors then connect over Wi-Fi to classification software that uses machine learning to match a set of environmental measurements to the name of an everyday event.
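To give a flavor of that last step, the sketch below shows one simple way such a matcher could work: compare an incoming feature vector against labeled event prototypes and pick the nearest. The prototypes, event labels and three-dimensional features here are invented for illustration; the actual Mites classifier is considerably more sophisticated.

```python
import math

# Toy event "prototypes": one feature vector per known event.
# These labels and values are illustrative, not real Mites data.
PROTOTYPES = {
    "microwave running": [0.9, 0.1, 0.7],
    "faucet on":         [0.2, 0.8, 0.3],
    "door knock":        [0.6, 0.2, 0.9],
}

def classify(features):
    """Return the event label whose prototype is closest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(PROTOTYPES, key=lambda label: dist(PROTOTYPES[label], features))

label = classify([0.85, 0.15, 0.65])  # closest prototype: "microwave running"
```

In practice the feature vectors would be the featurized sensor frames described below, and a trained model would replace the hand-written prototypes.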

  • Hardware:

    We set out to design a novel sensor board, which we call Mites, that integrates a myriad of sensing capabilities, minus a camera. Not only does this serve as an interesting vehicle for investigation, it is also an exemplar of board design using many low-level sensors, one that we hoped could approach the versatility of camera-based approaches without the stigma and privacy implications. We incorporated nine physical sensors capturing twelve distinct sensor dimensions.

    The heart of our sensor tag design is a Particle Photon STM32F205 microcontroller with a 120MHz CPU. We strategically placed sensors on the PCB to ensure optimal performance (e.g., the ambient light sensor faces outwards), and we spatially separated analog and digital components to prevent unintended electrical noise from degrading the performance of neighboring components. For connectivity, we considered industry standards such as Ethernet, ZigBee, and Bluetooth, but ultimately chose WiFi for its ubiquity, ease of setup, range and high bandwidth.

  • Firmware:

    Our firmware featurizes data on-board. Not only does this reduce network overhead, but it also denatures the data, better protecting privacy while still preserving the essence of the signal. In particular, we selected features that do not permit reconstruction of the original signal.

    Data from our high-sample-rate sensors are transformed into a spectral representation via a 256-sample sliding-window FFT (10% overlap), computed ten times per second. We also discard phase information. Our raw 8x8 GridEye matrix is flattened into row and column means (16 features). For our other low-sample-rate sensors, we compute seven statistical features (min, max, range, mean, sum, standard deviation and centroid) on a rolling one-second buffer (at 10Hz). The featurized data for every sensor are concatenated and sent to a server as a single data frame, encrypted with 128-bit AES.

    We tune our raw sensor sampling rates over the course of deployment, collecting data at the speed needed to capture environmental events, but with no unnecessary fidelity. Specifically, we sample our temperature, humidity, pressure, light color, light intensity, magnetometer, WiFi RSSI, GridEye and PIR motion sensors at 10Hz. All three axes of the accelerometer are sampled at 4 kHz, our microphone at 17 kHz, and our EMI sensor at 500 kHz. Note that when accelerometers are sampled at high speed, they can detect minute oscillatory vibrations propagating through structural elements in an environment (e.g., drywall, studs, joists), very much like a geophone.
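As a concrete illustration of the low-sample-rate featurization described above, the sketch below computes the seven statistical features on a one-second buffer of 10Hz readings. It is written in Python for readability (the actual featurization runs in C on the microcontroller), and the function name and sample values are our own:

```python
import math
from collections import deque

def statistical_features(buffer):
    """Compute the seven summary features used for low-sample-rate
    sensors: min, max, range, mean, sum, standard deviation, centroid."""
    n = len(buffer)
    total = sum(buffer)
    mean = total / n
    variance = sum((x - mean) ** 2 for x in buffer) / n
    # Centroid: index-weighted average, i.e., where the "mass" of the
    # signal sits within the one-second window.
    centroid = sum(i * x for i, x in enumerate(buffer)) / total if total else 0.0
    return {
        "min": min(buffer),
        "max": max(buffer),
        "range": max(buffer) - min(buffer),
        "mean": mean,
        "sum": total,
        "std": math.sqrt(variance),
        "centroid": centroid,
    }

# A one-second rolling buffer of 10Hz temperature readings (10 samples).
window = deque([21.0, 21.1, 21.0, 21.2, 21.3, 21.1, 21.0, 21.2, 21.4, 21.3],
               maxlen=10)
features = statistical_features(window)
```

On the device, one such feature dictionary per sensor would be flattened and concatenated into the encrypted data frame sent to the server.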

  • Backend:

    Our backend is written in Node.js, a popular JavaScript runtime. We have built a number of features and capabilities to make the backend work with the Mites. In particular, we have developed a load balancer that enables multiple Node.js instances to run in parallel and share the load of many Mites posting data. We also provide end-to-end encryption for all data sent to and from the Mites for security. And we have implemented functionality to change the configuration of each Mite, such as the rate at which it posts data and which sensors are enabled.

    We also support fine-grained time synchronization between all Mites in our deployment and our backend, such that the clock of each Mite stays within 1ms of the backend clock (which is itself disciplined via NTP). Our over-the-air (OTA) upgrade functionality enables multiple nodes to be upgraded, either individually or all at once. Finally, we have implemented an extensive logging framework that records statistics from the Mites posting data to the backend, which helps with management tasks and performance tuning.

    All of this functionality is exposed through well-defined REST APIs over an HTTPS interface, which can be used to retrieve status information for all of the Mites (uptime, which are active, at what rate they are posting, etc.) as well as to perform management functions (e.g., OTA upgrades, reboots, changing sensor frequencies or server endpoints).
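For intuition, millisecond-level clock synchronization of this kind is typically built on a four-timestamp request/response exchange, as in NTP. The following is a simplified sketch of that standard offset calculation, not the Mites' actual protocol:

```python
def clock_offset(t1, t2, t3, t4):
    """Estimate a device's clock offset from the server using four
    timestamps: t1 = request sent (device clock), t2 = request received
    (server clock), t3 = reply sent (server clock), t4 = reply received
    (device clock). Assumes roughly symmetric network delay."""
    return ((t2 - t1) + (t3 - t4)) / 2.0

def round_trip_delay(t1, t2, t3, t4):
    """Network round-trip time, excluding server processing time."""
    return (t4 - t1) - (t3 - t2)

# Example (invented numbers): device clock runs 5ms ahead of the
# server, with a 2ms one-way network delay.
t1, t2, t3, t4 = 100.000, 99.997, 99.998, 100.005
offset = clock_offset(t1, t2, t3, t4)      # about -0.005 s: device is 5ms fast
delay = round_trip_delay(t1, t2, t3, t4)   # about 0.004 s round trip
```

A device applying the negated offset to its own clock would land within the error of the delay estimate, which is how sub-millisecond agreement with the backend can be maintained.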

  • Middleware:

    The data from our Node.js backend is decrypted, deserialized and separated into individual sensor streams from each Mite. The backend is stateless and does not itself store any sensor data. Instead, we organize, tag and store sensor data streams in our open-source IoT software stack, GioTTO (www.iotexpedition.org), which includes a metadata-management layer as well as several options for storing time-series data.

    To store sensor data streams from the various Mites in our middleware, we create sensor “points” and add appropriate distinguishing metadata, such as the sensor type (e.g. temperature), the statistical feature it represents (e.g. min, max), the Mite it belongs to (isPartOf=Mites101), where that Mite is located, and the frequency at which the data is sampled (e.g. FREQ=10Hz). Representing Mites and their sensors in a uniform schema makes applications built on top of our stack portable. Our middleware also provides authentication, confidentiality and access control for accessing data through a number of well-defined REST APIs. We implement OAuth 2.0, an industry standard for user authentication, as well as SSL/TLS for transport-layer end-to-end encryption. In addition, our middleware implements a robust and scalable Access Control Layer (ACL) that enables flexible specification of who (which user or app) has access to what resource (which sensors).
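To make the point schema concrete, the sketch below builds a metadata record for one sensor stream. The tag names mirror the examples in the text (isPartOf, FREQ), but the exact schema, the helper function and the location value are our own illustrative assumptions:

```python
def make_point(sensor_type, feature, mite_id, location, freq_hz):
    """Build an illustrative metadata record for one sensor stream
    ("point"); the real GioTTO schema may differ."""
    return {
        "name": f"{mite_id}/{sensor_type}/{feature}",
        "metadata": {
            "sensorType": sensor_type,  # e.g. temperature
            "feature": feature,         # e.g. min, max
            "isPartOf": mite_id,        # which Mite the stream belongs to
            "location": location,       # where that Mite is installed
            "FREQ": f"{freq_hz}Hz",     # sampling frequency
        },
    }

# One point per (sensor, feature) pair; location is a made-up example.
point = make_point("temperature", "min", "Mites101", "Room 123", 10)
```

Because every stream carries the same tags, an application can query, say, all temperature minima in one room without knowing anything about individual Mites.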

Learn More About This Project