Calculating Max, Min, and Average Sensor Readings with GridDB Narrow Tables in an IoT Project

Hello community,

I’m currently working on an IoT project and have chosen GridDB as the database for storing
time-series data from 5 different sensors. I’m using narrow tables for the sensor readings, with
each row containing the sensor ID, the reading value, and a timestamp.
I’m trying to determine the maximum, minimum, and average reading for each of the 5 sensors
within a specific time period. The desired result should have three columns (max, min, average)
and five rows (one per sensor).
I’m trying to determine the Maximum, Minimum, and Average sensor readings across all 5
sensors within a specific time period. The desired result should include three columns (max,
min, average) and five rows (one for each sensor).
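For illustration, the shape I’m after looks like this (the sensor IDs are examples and the
values are just placeholders):

sensor_id    max      min      average
---------    -----    -----    -------
sensor_1     <max>    <min>    <avg>
...
sensor_5     <max>    <min>    <avg>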

To provide better context, here’s the command I used to create the table:

CREATE TABLE sensor_reading (
    sensor_reading_id INTEGER PRIMARY KEY,
    sensor_reading_sensor_id STRING NOT NULL,
    sensor_reading_read_value DOUBLE NOT NULL,
    sensor_reading_ts TIMESTAMP NOT NULL
);
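
For reference, here’s roughly the query I would try in standard SQL. The date range is just a
placeholder, and I’m not sure whether the TIMESTAMP() literal syntax or this style of GROUP BY
aggregation carries over to GridDB’s SQL interface:

SELECT
    sensor_reading_sensor_id,
    MAX(sensor_reading_read_value) AS max_value,
    MIN(sensor_reading_read_value) AS min_value,
    AVG(sensor_reading_read_value) AS avg_value
FROM sensor_reading
WHERE sensor_reading_ts BETWEEN TIMESTAMP('2024-06-01T00:00:00Z')
                            AND TIMESTAMP('2024-06-02T00:00:00Z')
GROUP BY sensor_reading_sensor_id;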

I would greatly appreciate any guidance or example queries on how to achieve this in GridDB.

Thank you!