sql.to() function

The sql.to() function writes data to a SQL database.

*Function type: Output*

    import "sql"

    sql.to(
        driverName: "mysql",
        dataSourceName: "username:password@tcp(localhost:3306)/dbname?param=value",
        table: "example_table",
        batchSize: 10000
    )

Parameters

driverName

The driver used to connect to the SQL database.

*Data type: String*

The following drivers are available:

  • bigquery
  • mysql
  • postgres
  • snowflake
  • sqlite3 – Does not work with InfluxDB OSS or InfluxDB Cloud. More information below.
  • sqlserver, mssql

sql.to does not support Amazon Athena

The sql.to function does not support writing data to Amazon Athena.

dataSourceName

The data source name (DSN) or connection string used to connect to the SQL database. The string’s form and structure depend on the driver used.

*Data type: String*

Driver dataSourceName examples

    # Postgres Driver DSN
    postgres://pqgotest:password@localhost/pqgotest?sslmode=verify-full

    # MySQL Driver DSN
    username:password@tcp(localhost:3306)/dbname?param=value

    # Snowflake Driver DSNs
    username[:password]@accountname/dbname/schemaname?param1=value1&paramN=valueN
    username[:password]@accountname/dbname?param1=value1&paramN=valueN
    username[:password]@hostname:port/dbname/schemaname?account=<your_account>&param1=value1&paramN=valueN

    # SQLite Driver DSN
    file:/path/to/test.db?cache=shared&mode=rw

    # Microsoft SQL Server Driver DSNs
    sqlserver://username:password@localhost:1234?database=examplebdb
    server=localhost;user id=username;database=examplebdb;
    server=localhost;user id=username;database=examplebdb;azure auth=ENV
    server=localhost;user id=username;database=examplebdb;azure tenant id=77e7d537;azure client id=58879ce8;azure client secret=0123456789

    # Google BigQuery DSNs
    bigquery://projectid/?param1=value&param2=value
    bigquery://projectid/location?param1=value&param2=value

table

The destination table.

*Data type: String*

batchSize

The number of parameters or columns that can be queued within each call to Exec. Defaults to 10000.

*Data type: Integer*

If writing to a SQLite database, set batchSize to 999 or less.
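
For example, a minimal sketch that respects this limit when writing to a local SQLite file (the file path and table name are placeholders):

    import "sql"

    // Keep batchSize at or below SQLite's 999-parameter limit.
    sql.to(
        driverName: "sqlite3",
        dataSourceName: "file:/path/to/test.db?cache=shared&mode=rw",
        table: "example_table",
        batchSize: 999
    )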

Examples

The examples below use InfluxDB secrets to populate sensitive connection credentials.

Write data to a MySQL database

    import "sql"
    import "influxdata/influxdb/secrets"

    username = secrets.get(key: "MYSQL_USER")
    password = secrets.get(key: "MYSQL_PASS")

    sql.to(
        driverName: "mysql",
        dataSourceName: "${username}:${password}@tcp(localhost:3306)/db",
        table: "example_table"
    )

Write data to a Postgres database

    import "sql"
    import "influxdata/influxdb/secrets"

    username = secrets.get(key: "POSTGRES_USER")
    password = secrets.get(key: "POSTGRES_PASS")

    sql.to(
        driverName: "postgres",
        dataSourceName: "postgresql://${username}:${password}@localhost",
        table: "example_table"
    )

Write data to a Snowflake database

    import "sql"
    import "influxdata/influxdb/secrets"

    username = secrets.get(key: "SNOWFLAKE_USER")
    password = secrets.get(key: "SNOWFLAKE_PASS")
    account = secrets.get(key: "SNOWFLAKE_ACCT")

    sql.to(
        driverName: "snowflake",
        dataSourceName: "${username}:${password}@${account}/db/exampleschema?warehouse=wh",
        table: "example_table"
    )

Write data to an SQLite database

InfluxDB OSS and InfluxDB Cloud do not have direct access to the local filesystem and cannot write to SQLite data sources. Use the Flux REPL to write to an SQLite data source on your local filesystem.

    import "sql"

    sql.to(
        driverName: "sqlite3",
        dataSourceName: "file:/path/to/test.db?cache=shared&mode=rw",
        table: "example_table"
    )

Write data to a SQL Server database

    import "sql"
    import "influxdata/influxdb/secrets"

    username = secrets.get(key: "SQLSERVER_USER")
    password = secrets.get(key: "SQLSERVER_PASS")

    sql.to(
        driverName: "sqlserver",
        dataSourceName: "sqlserver://${username}:${password}@localhost:1234?database=examplebdb",
        table: "Example.Table"
    )

SQL Server ADO authentication

Use one of the following methods to provide SQL Server authentication credentials as ActiveX Data Objects (ADO) connection string parameters:

Retrieve authentication credentials from environment variables

    azure auth=ENV

Retrieve authentication credentials from a file

InfluxDB OSS and InfluxDB Cloud user interfaces do not provide access to the underlying file system and do not support reading credentials from a file. To retrieve SQL Server credentials from a file, execute the query in the Flux REPL on your local machine.

    azure auth=C:\secure\azure.auth

Specify authentication credentials in the connection string

    # Provide tenant ID, client ID, and client secret token
    azure tenant id=77...;azure client id=58...;azure client secret=0cf123..

    # Provide tenant ID, client ID, certificate path, and certificate password
    azure tenant id=77...;azure client id=58...;azure certificate path=C:\secure\...;azure certificate password=xY...

    # Provide tenant ID, client ID, and Azure username and password
    azure tenant id=77...;azure client id=58...;azure username=some@myorg;azure password=a1...
Use a managed identity in an Azure VM

For information about managed identities, see Microsoft managed identities.

    azure auth=MSI
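
The ADO parameters above are passed to sql.to as part of the dataSourceName string. For example, a minimal sketch (the server, database, user ID, and table are placeholders) that authenticates using credentials from environment variables:

    import "sql"

    // ADO-style connection string; azure auth=ENV tells the driver to
    // read Azure credentials from environment variables.
    sql.to(
        driverName: "sqlserver",
        dataSourceName: "server=localhost;user id=username;database=examplebdb;azure auth=ENV",
        table: "Example.Table"
    )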

Write data to a BigQuery database

    import "sql"
    import "influxdata/influxdb/secrets"

    projectID = secrets.get(key: "BIGQUERY_PROJECT_ID")
    apiKey = secrets.get(key: "BIGQUERY_APIKEY")

    sql.to(
        driverName: "bigquery",
        dataSourceName: "bigquery://${projectID}/?apiKey=${apiKey}",
        table: "exampleTable"
    )

Common BigQuery URL parameters

  • dataset - BigQuery dataset ID. When set, you can use unqualified table names in queries.
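
    Example dataset URL parameter (the project and dataset IDs are placeholders)

      bigquery://projectid/?dataset=exampledataset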

BigQuery authentication parameters

The Flux BigQuery implementation uses the Google Cloud Go SDK. Provide your authentication credentials using one of the following methods:

  • Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the location of your credential JSON file.
  • Provide your BigQuery API key using the apiKey URL parameter in your BigQuery DSN.

    Example apiKey URL parameter

      bigquery://projectid/?apiKey=AIzaSyB6XK8IO5AzKZXoioQOVNTFYzbDBjY5hy4
  • Provide your base-64 encoded service account, refresh token, or JSON credentials using the credentials URL parameter in your BigQuery DSN.

    Example credentials URL parameter

      bigquery://projectid/?credentials=eyJ0eXBlIjoiYXV0...