Measuring Strategy - Measure every 1-2 seconds and upload aggregated data every 1-3 minutes

Further to my comment from 2nd May in the sensors category (as I can't reopen the 'answered' question), I would like to discuss the following:

Would it make sense to aggregate the data? And will we still get good NO and CO2 values if we measure every 2 seconds and build aggregates?

If we agree on this, we should first get a general agreement from the system architects that it makes sense. If so, we need agreement/direction on how to change the code. I can make some changes, but general rules on calling subroutines, on how to aggregate, and on the data model should be settled beforehand.

We could measure every 1-2 seconds (if the kit is able) and upload the aggregated data every 1-3 minutes: three records per interval (a min, a max and an avg record), each with an added field holding the number of raw values that went into the aggregate.
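To make the proposed data model concrete, here is a minimal sketch in C of such an aggregate record. The struct and function names are my own invention, not part of the existing firmware; they only illustrate folding raw readings into min/max/avg plus a count:

```c
#include <float.h>

/* Hypothetical aggregate record for one sensor over one upload
   interval; holds min, max, a running sum (for the average) and
   the number of raw readings folded in. */
typedef struct {
    float min;
    float max;
    float sum;        /* running sum; avg = sum / count */
    unsigned count;   /* amount of raw values in this aggregate */
} Aggregate;

/* Reset the record at the start of each upload interval. */
void aggregate_init(Aggregate *a) {
    a->min = FLT_MAX;
    a->max = -FLT_MAX;
    a->sum = 0.0f;
    a->count = 0;
}

/* Fold one raw reading (taken every 1-2 seconds) into the record. */
void aggregate_add(Aggregate *a, float reading) {
    if (reading < a->min) a->min = reading;
    if (reading > a->max) a->max = reading;
    a->sum += reading;
    a->count++;
}

/* Average of all readings in the interval (0 if none yet). */
float aggregate_avg(const Aggregate *a) {
    return a->count ? a->sum / a->count : 0.0f;
}
```

At upload time the kit would post three values (min, max, avg) plus the count, then call `aggregate_init` again for the next interval.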

In constants.h, change the values as indicated below:

#define DEFAULT_TIME_UPDATE "1"  //Time between update and upgrade
#define DEFAULT_MIN_UPDATES "60" //Minimum number of updates before posting

This takes one reading every second and posts to the site every minute.
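The arithmetic behind those two constants can be checked with a small simulation. The loop below is an assumption about how the firmware schedules work (read every `time_update` seconds, post after `min_updates` readings), not the kit's actual code:

```c
/* Simulate 'seconds' of operation: take a reading every
   'time_update' seconds and post after every 'min_updates'
   readings. Returns how many posts happen. */
unsigned simulate_posts(unsigned seconds,
                        unsigned time_update,
                        unsigned min_updates) {
    unsigned readings = 0, posts = 0;
    for (unsigned t = 0; t < seconds; t += time_update) {
        readings++;                       /* one reading per tick */
        if (readings >= min_updates) {    /* batch full: post it */
            posts++;
            readings = 0;
        }
    }
    return posts;
}
```

With `time_update = 1` and `min_updates = 60`, three minutes of operation (180 seconds) yields three posts, matching "one reading per second, one post per minute".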

I cannot determine what impact more frequent readings have on power consumption. I would assume that more frequent activity also requires more power (especially the complex process of reading the NO and CO2 values, which involves heating up the sensor(?)).

Most sensor values should be fairly constant (temperature, light, ...); noise is the one measurement that would really benefit from more frequent sampling. For noise in particular it would be very interesting to capture the peak values.