I suggest that you take a look at this solution: https://docs.aws.amazon.com/greengrass/v2/developerguide/manage-data-streams.html
Yes, it’s possible to store sensor data locally in a CSV file every 10 seconds with one process and then read it back and send it to the cloud with another process. This approach can help smooth out latency spikes.
It can also make transmission more reliable, since buffered data survives a dropped connection. Just make sure you have proper error-handling mechanisms in place.
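As a rough illustration of that two-process idea, here is a minimal single-file sketch using Python's standard csv module: one function plays the "writer" role (appending readings to a local buffer file) and another plays the "uploader" role (draining the buffer in one batch). The file path and field layout are assumptions for the demo; in a real deployment the two functions would run in separate processes, and the buffer would only be truncated after a confirmed cloud upload.

```python
import csv
import os
import tempfile
import time

# Hypothetical buffer location; pick a path on persistent storage in practice.
BUFFER_FILE = os.path.join(tempfile.gettempdir(), "sensor_buffer.csv")

if os.path.exists(BUFFER_FILE):
    os.remove(BUFFER_FILE)  # start clean for this demo only

def write_sample(timestamp, temperature, humidity):
    """Writer process: append one reading to the local CSV buffer."""
    with open(BUFFER_FILE, "a", newline="") as f:
        csv.writer(f).writerow([timestamp, temperature, humidity])

def drain_buffer():
    """Uploader process: read all buffered rows and clear the file.
    In production, truncate only after the cloud upload succeeds."""
    if not os.path.exists(BUFFER_FILE):
        return []
    with open(BUFFER_FILE, newline="") as f:
        rows = list(csv.reader(f))
    open(BUFFER_FILE, "w").close()
    return rows

# Buffer three readings, then drain them as one batch.
for i in range(3):
    write_sample(time.time(), 21.5 + i, 40.0)
batch = drain_buffer()
```

Locking (e.g. a lock file or an SQLite buffer instead of raw CSV) is worth considering once two real processes touch the same file.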
Can you tell us more? What type of data is this? How are you sending it, and at what frequency? How large is each packet, and how are you receiving, parsing, and packaging the data?
More information will make it easier to comment on your problem.
The data are sensor readings for temperature and humidity. I’m planning to send them to the IoT Hub using the MQTT protocol. The sending frequency is around 10 samples per second, but due to latency issues I lose about 5 samples per second. Each sample is small, about 100 bytes. I receive the data on the Raspberry Pi, parse it, and then package it into CSV format for storage.
Instead of collecting and sending data 10 times per second, store the values in an array and send that array as one packet per second.
You can also collect more and send larger batches, e.g. an array of 20 samples on a 2-second cycle. Alongside this, you should also store the data locally in case of connection loss or dropped data.
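The batching advice above can be sketched as follows. This is an assumed Python implementation: samples accumulate in a list, and once ten have been collected they are serialized into one JSON payload (the real publish call to the MQTT client is left as a comment, since broker details aren't part of this thread).

```python
import json
import time

BATCH_SIZE = 10  # 10 samples/s -> one packet per second

buffer = []

def on_sample(temperature, humidity):
    """Collect readings; return a serialized packet once the batch is full."""
    buffer.append({"t": round(temperature, 1), "h": round(humidity, 1)})
    if len(buffer) >= BATCH_SIZE:
        packet = json.dumps({"ts": int(time.time()), "samples": buffer[:]})
        buffer.clear()
        return packet  # real flow: client.publish(topic, packet, qos=1)
    return None

# Simulate two seconds of readings (20 samples) -> expect 2 packets.
packets = [p for i in range(20)
           if (p := on_sample(22.0 + i * 0.1, 45.0)) is not None]
```

One publish per second instead of ten also cuts MQTT header and TCP/TLS overhead per sample, which is usually where most of the latency pressure comes from.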
Also, 100 bytes per sample is on the larger side. If repetitive data such as the MAC address is sitting inside every sample, strip it out or send it once per packet; that will make your flow lighter to execute.
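To make that payload-size point concrete, here is a small sketch (hypothetical field names, JSON chosen only for readability) comparing a packet that repeats the MAC address in every sample against one that hoists it into a single packet header:

```python
import json

# Ten readings with assumed field names.
samples = [{"temp": 21.5 + i, "hum": 40.0} for i in range(10)]
mac = "AA:BB:CC:DD:EE:FF"  # hypothetical device identifier

# Naive layout: the MAC is repeated inside every sample.
naive = json.dumps([dict(s, mac=mac) for s in samples])

# Leaner layout: the MAC appears once, at the packet level.
lean = json.dumps({"mac": mac, "samples": samples})

saving = len(naive) - len(lean)  # bytes saved per 10-sample packet
```

A binary encoding (e.g. fixed-width structs or CBOR) would shrink this further, but even this simple restructuring removes the repeated identifier from every sample.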
Thanks for the advice! I’ll implement storing data in an array and sending it as one packet per second.