Back when we started this semester, our plan was to create a system that connects multiple Arduino-based clients to a central unit in the room, which continuously sends their data to AWS and receives the processed results. The system's structure was detailed in this picture:
To build our first prototype, we wanted to fully implement one direction of communication in our system's infrastructure: a fully operable Arduino -> Raspberry Pi -> AWS IoT Core data channel. For this initial prototype, we ordered a wide variety of sensors, many of which arrived malfunctioning.
We hooked up our initial prototype as follows: an Arduino Uno controller is positioned in the Milab class, with an IR distance sensor attached to the edge of the table. The controller is connected to an HC-06 Bluetooth pass-through module.
A Raspberry Pi 3 B in a nearby room periodically tries to ping the HC-06 module. Once a connection is established, the Arduino starts sending data to the Raspberry Pi periodically (currently every 0.5 seconds), each packet consisting of a sensor reading and the sensor's ID.
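As a rough illustration of the Pi side of this channel, here is a minimal Python parser for the incoming serial lines. The `"<sensor_id>,<reading>"` wire format is an assumption made for this sketch, not necessarily the exact protocol we use:

```python
# Pi-side parser for lines arriving over the HC-06 serial link.
# Assumes the Arduino sends lines in the form "<sensor_id>,<reading>\n";
# this wire format is an assumption for illustration.

def parse_reading(line: str):
    """Parse one serial line into (sensor_id, reading); None for a malformed line."""
    parts = line.strip().split(",")
    if len(parts) != 2:
        return None
    sensor_id, raw_value = parts
    try:
        return sensor_id, float(raw_value)
    except ValueError:
        # Garbled transmissions are dropped rather than crashing the script.
        return None
```

On the Pi, this would be fed line by line from pyserial's `Serial.readline()` over the bound Bluetooth serial device; rejecting malformed lines here is one of the fail-safes discussed below.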
Once the Raspberry Pi receives a packet of data, it sends a message to AWS IoT Core using a Thing Shadow that represents the Raspberry Pi, treated as a "room". The message includes a field identifying the Arduino controller, so data from all of the room's tables is stored in the same database table for further processing, with the option to filter out a specific table.
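The per-reading message could be assembled along these lines; the field names (`room_id`, `table_id`, `reading`, `ts`) are illustrative rather than our exact schema:

```python
import json
import time

def build_message(room_id: str, table_id: str, reading: float) -> str:
    """Build the JSON payload published for one reading.

    Field names here are illustrative, not the prototype's exact schema.
    """
    doc = {
        "room_id": room_id,     # the Raspberry Pi acting as the "room"
        "table_id": table_id,   # ID of the reporting Arduino controller
        "reading": reading,
        "ts": int(time.time()), # epoch seconds, so data can be segmented over time
    }
    return json.dumps(doc)
```

With the AWS IoT Python SDK, the returned string would then be handed to `AWSIoTMQTTClient.publish(topic, payload, qos)` on the Pi.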
In IoT Core, an Action Rule has been defined to store the received message's data in a designated table corresponding to the current room, hosted in the AWS DynamoDB service. Using this table, one can process and segment the received data based on the keys and fields sent over time.
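To show the kind of segmentation the table enables, here is a small in-memory sketch; the item fields mirror what the rule might write to DynamoDB and are assumed for illustration:

```python
# Illustrative in-memory stand-in for the per-room DynamoDB table: each item
# mirrors the fields an Action Rule might store. Field names are assumptions.

def readings_for_table(items, table_id):
    """Return one table's readings in time order, like a keyed DynamoDB query."""
    return sorted(
        (item for item in items if item["table_id"] == table_id),
        key=lambda item: item["ts"],
    )
```

Against the real table, the same filtering would be a keyed `boto3` query (e.g. `table.query(KeyConditionExpression=...)`) rather than an in-memory scan.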
With this database continuously filling up, our initial prototype is complete. That said, we did run into a few problems with the current system:
- Sending a message every 0.5 seconds nearly exhausted the free monthly quota within a couple of hours. Since our system requires us to send tips and info in real time, we now send data to AWS every 5-10 seconds, keeping the system responsive while vastly reducing the message load as we plan to add more sensors and tables. A similar option under consideration is gathering up data and sending it in sparser, larger messages, such as 60 bundled readings every half a minute.
- The initial run of the system has taught us that it is nearly impossible to keep a script running for extended periods of time without proper fail-safes. From exceptions caused by miscommunication, to power outages, to the Raspberry Pi randomly shutting off after a few hours, there are many failure modes that need to be addressed. We plan to start the script as the controller boots up and reinitialize it whenever we detect that it has stopped.
- The data is currently transferred to AWS as a single "Payload" JSON field, making it hard to process and filter specific fields inside it. We need a better understanding of how this data can be processed so that it is stored in a useful manner on DynamoDB.
- Many of our current sensors are malfunctioning, which requires re-evaluating which data we can safely use in the coming weeks, until we find replacements.
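To make the bundling option from the first point concrete, here is a Python sketch of a reading buffer that flushes every N readings; the class name and default batch size are illustrative:

```python
class ReadingBuffer:
    """Accumulate readings and hand back a bundle once `batch_size` is reached.

    A sketch of the "60 bundled readings every half a minute" idea: the
    returned bundle would be JSON-encoded and published as one MQTT message
    instead of 60 individual ones.
    """

    def __init__(self, batch_size: int = 60):
        self.batch_size = batch_size
        self._buffer = []

    def add(self, reading):
        """Buffer one reading; return the full batch when it is time to flush."""
        self._buffer.append(reading)
        if len(self._buffer) >= self.batch_size:
            batch, self._buffer = self._buffer, []
            return batch
        return None
```

Since readings arrive every 0.5 seconds, a batch size of 60 corresponds to one message roughly every 30 seconds, cutting the message count by ~60x.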
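For the fail-safe point, a minimal retry loop might look like the following; in practice we would also start the script at boot (e.g. via systemd or a cron `@reboot` entry), so this only sketches the restart logic itself, with names chosen for illustration:

```python
import time

def supervise(run, max_restarts=5, delay=1.0, sleep=time.sleep):
    """Keep re-running `run()` until it exits cleanly or the budget is spent.

    A minimal stand-in for a proper service manager: crashes (exceptions,
    dropped Bluetooth links, etc.) trigger a delayed restart instead of
    leaving the room silent. Returns the number of restarts on clean exit.
    """
    restarts = 0
    while True:
        try:
            run()
            return restarts   # clean exit
        except Exception:
            restarts += 1
            if restarts > max_restarts:
                raise         # give up once the restart budget is exhausted
            sleep(delay)      # back off before the next attempt
```

A restart budget with backoff avoids hammering AWS (and the quota) when the failure is persistent, such as a power outage on the Arduino side.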
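For the "Payload" problem, one direction is to flatten the nested JSON on the Pi before publishing, so the Action Rule can map each field to its own DynamoDB attribute; a sketch (the separator and key names are illustrative):

```python
def flatten(doc, parent_key="", sep="_"):
    """Flatten a nested payload dict into top-level keys.

    e.g. {"payload": {"reading": 1}} becomes {"payload_reading": 1}, so each
    field can land in its own DynamoDB attribute instead of one opaque blob.
    """
    flat = {}
    for key, value in doc.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten(value, new_key, sep))
        else:
            flat[new_key] = value
    return flat
```

Alternatively, the IoT rule's SQL can select nested fields directly, but flattening before publishing keeps the rule itself trivial.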
For next week, we hope to work on solutions for most of these issues, as well as pair up more sensors and possibly an additional Arduino controller to further analyze the behavior patterns in a room.