Query SQL data sources

The Flux sql package provides functions for working with SQL data sources. sql.from() lets you query SQL data sources like PostgreSQL, MySQL, and SQLite, and use the results with InfluxDB dashboards, tasks, and other operations.

Query a SQL data source

To query a SQL data source:

  1. Import the sql package in your Flux query.
  2. Use the sql.from() function to specify the driver, data source name (DSN), and the query to run against your SQL data source:

PostgreSQL

    import "sql"

    sql.from(
        driverName: "postgres",
        dataSourceName: "postgresql://user:password@localhost",
        query: "SELECT * FROM example_table"
    )

MySQL

    import "sql"

    sql.from(
        driverName: "mysql",
        dataSourceName: "user:password@tcp(localhost:3306)/db",
        query: "SELECT * FROM example_table"
    )

SQLite

    // NOTE: InfluxDB OSS and InfluxDB Cloud do not have access to
    // the local filesystem and cannot query SQLite data sources.
    // Use the Flux REPL to query an SQLite data source.
    import "sql"

    sql.from(
        driverName: "sqlite3",
        dataSourceName: "file:/path/to/test.db?cache=shared&mode=ro",
        query: "SELECT * FROM example_table"
    )

See the sql.from() documentation for information about required function parameters.

Join SQL data with data in InfluxDB

One of the primary benefits of querying SQL data sources from InfluxDB is the ability to enrich query results with data stored outside of InfluxDB.

Using the air sensor sample data below, the following query joins air sensor metrics stored in InfluxDB with sensor information stored in PostgreSQL. The joined data lets you query and filter results based on sensor information that isn’t stored in InfluxDB.

    // Import the "sql" package
    import "sql"

    // Query data from PostgreSQL
    sensorInfo = sql.from(
        driverName: "postgres",
        dataSourceName: "postgresql://localhost?sslmode=disable",
        query: "SELECT * FROM sensors"
    )

    // Query data from InfluxDB
    sensorMetrics = from(bucket: "telegraf/autogen")
        |> range(start: -1h)
        |> filter(fn: (r) => r._measurement == "airSensors")

    // Join InfluxDB query results with PostgreSQL query results
    join(tables: {metric: sensorMetrics, info: sensorInfo}, on: ["sensor_id"])
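
Because the joined output contains columns from both sources, you can then filter the time series data by sensor information that only exists in PostgreSQL. For example, here is a minimal sketch that keeps only metrics from sensors in a specific location (the location column comes from the PostgreSQL sensors table; the "main_lobby" value is a hypothetical placeholder):

    // Join the two data sources, then filter on the PostgreSQL-sourced
    // "location" column ("main_lobby" is a placeholder value)
    join(tables: {metric: sensorMetrics, info: sensorInfo}, on: ["sensor_id"])
        |> filter(fn: (r) => r.location == "main_lobby")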

Sample sensor data

The sample data generator and sample sensor information simulate a group of sensors that measure temperature, humidity, and carbon monoxide in rooms throughout a building. Each collected data point is stored in InfluxDB with a sensor_id tag that identifies the specific sensor it came from. Sample sensor information is stored in PostgreSQL.

Sample data includes:

  • Simulated data collected from each sensor and stored in the airSensors measurement in InfluxDB:

    • temperature
    • humidity
    • co
  • Information about each sensor stored in the sensors table in PostgreSQL:

    • sensor_id
    • location
    • model_number
    • last_inspected
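
For reference, each point the generator writes uses the airSensors measurement, a sensor_id tag, and the three fields listed above. In InfluxDB line protocol, a single point looks roughly like the following (the tag and field values are illustrative placeholders, not actual sample data):

    airSensors,sensor_id=TLM0101 temperature=71.2,humidity=35.1,co=0.5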

Import and generate sample sensor data

Download and run the sample data generator

air-sensor-data.rb is a script that generates air sensor data and stores the data in InfluxDB. To use air-sensor-data.rb:

  1. Create a database to store the data (a sample influx CLI command is sketched after these steps).
  2. Download the sample data generator. This tool requires Ruby.

    Download Air Sensor Generator

  3. Give air-sensor-data.rb executable permissions:

    chmod +x air-sensor-data.rb
  4. Start the generator. Specify your database.

    ./air-sensor-data.rb -d database-name

    The generator begins to write data to InfluxDB and will continue until stopped. Use ctrl-c to stop the generator.

    *Note: Use the --help flag to view other configuration options.*

  5. Query your target database to ensure the generated data is being written successfully. The generator doesn’t catch errors from write requests, so it will continue running even if data isn’t reaching InfluxDB.

    from(bucket: "database-name/autogen")
        |> range(start: -1m)
        |> filter(fn: (r) => r._measurement == "airSensors")
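
As referenced in step 1, one way to create the target database on InfluxDB 1.x is with the influx CLI. This is a minimal sketch; replace database-name with the name you pass to the generator's -d flag:

    influx -execute 'CREATE DATABASE "database-name"'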

Import the sample sensor information

  1. Download and install PostgreSQL.
  2. Download the sample sensor information CSV.

    Download Sample Data

  3. Use a PostgreSQL client (psql or a GUI) to create the sensors table:

    CREATE TABLE sensors (
        sensor_id character varying(50),
        location character varying(50),
        model_number character varying(50),
        last_inspected date
    );
  4. Import the downloaded CSV sample data. Update the FROM file path to the location of the downloaded CSV file (if COPY can't read the file on your system, see the psql \copy alternative sketched after these steps).

    COPY sensors(sensor_id,location,model_number,last_inspected)
    FROM '/path/to/sample-sensor-info.csv' DELIMITER ',' CSV HEADER;
  5. Query the table to ensure the data was imported correctly:

    SELECT * FROM sensors;
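
As noted in step 4, the server-side COPY command reads the CSV with the PostgreSQL server's filesystem permissions and may fail if the server can't access your download location. A client-side alternative (a sketch assuming the same file path and column layout) is psql's \copy meta-command:

    \copy sensors(sensor_id,location,model_number,last_inspected) FROM '/path/to/sample-sensor-info.csv' DELIMITER ',' CSV HEADER;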