influx write

  • influx CLI 2.0.0+
  • InfluxDB 2.0.0+
  • Updated in CLI v2.0.5

The influx write command writes data to InfluxDB via stdin or from a specified file. Write data using line protocol, annotated CSV, or extended annotated CSV. If you write CSV data, CSV annotations determine how the data translates into line protocol.

Usage

influx write [flags]
influx write [command]

Required data

To write data to InfluxDB, you must provide the following for each row:

  • measurement
  • field
  • value

Line protocol

In line protocol, the structure of the line data determines the measurement, field, and value.
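For example, in the following line (the names and values here are illustrative), sensorData is the measurement, host=host1 is an optional tag, temp=72.7 is the field key and value, and the trailing number is an optional nanosecond timestamp:

sensorData,host=host1 temp=72.7 1608315371000000000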

Annotated CSV

In annotated CSV, measurements, fields, and values are represented by the _measurement, _field, and _value columns. Their types are determined by CSV annotations. To successfully write annotated CSV to InfluxDB, include all annotation rows.
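As a minimal sketch (the column values here are illustrative), the following annotated CSV includes all three annotation rows and writes a single point:

#group,false,false,true,true
#datatype,dateTime:RFC3339,double,string,string
#default,,,,
,_time,_value,_field,_measurement
,2020-12-18T18:16:11Z,72.7,temp,sensorData

This row should translate to the line protocol point sensorData temp=72.7 1608315371000000000.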

Extended annotated CSV

In extended annotated CSV, measurements, fields, values, and their types are determined by CSV annotations.
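For instance, in the following minimal sketch (the annotations and values are illustrative), the #constant annotation supplies the measurement and the temperature column becomes the field:

#constant measurement,sensorData
#datatype,dateTime:RFC3339,double
time,temperature
2020-12-18T18:16:11Z,72.7

With the default nanosecond precision, the data row should translate to the line protocol point sensorData temperature=72.7 1608315371000000000.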

Subcommands

Subcommand | Description
dryrun | Write to stdout instead of InfluxDB
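
For example, to preview how input would be parsed without writing anything to InfluxDB (dryrun accepts the same input flags as influx write; the file path below is illustrative):

influx write dryrun \
  --bucket example-bucket \
  --file path/to/data.csv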

Flags

Flag | Description | Input type | Maps to environment variable
-c, --active-config | CLI configuration to use for command | string |
-b, --bucket | Bucket name (mutually exclusive with --bucket-id) | string | INFLUX_BUCKET_NAME
--bucket-id | Bucket ID (mutually exclusive with --bucket) | string | INFLUX_BUCKET_ID
--configs-path | Path to influx CLI configurations (default ~/.influxdbv2/configs) | string | INFLUX_CONFIGS_PATH
--compression | Input compression (none or gzip; default is none unless the input file ends in .gz) | string |
--debug | Output errors to stderr | |
--encoding | Character encoding of input (default UTF-8) | string |
--error-file | Path to a file used for recording rejected row errors | string |
-f, --file | File to import | stringArray |
--format | Input format (lp or csv; default lp) | string |
--header | Prepend header line to CSV input data | string |
-h, --help | Help for the write command | |
--host | HTTP address of InfluxDB (default http://localhost:8086) | string | INFLUX_HOST
--max-line-length | Maximum number of bytes that can be read for a single line (default 16000000) | integer |
-o, --org | Organization name (mutually exclusive with --org-id) | string | INFLUX_ORG
--org-id | Organization ID (mutually exclusive with --org) | string | INFLUX_ORG_ID
-p, --precision | Precision of the timestamps (default ns) | string | INFLUX_PRECISION
--rate-limit | Throttle write rate (examples: "5 MB / 5 min" or "1MB/s") | string |
--skip-verify | Skip TLS certificate verification | | INFLUX_SKIP_VERIFY
--skipHeader | Skip the first n rows of input data | integer |
--skipRowOnError | Output CSV errors to stderr, but continue processing | |
-t, --token | API token | string | INFLUX_TOKEN
-u, --url | URL to import data from | stringArray |
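
Because several flags map to environment variables, you can export those variables once instead of repeating the flags on every command (a minimal sketch assuming a POSIX shell; the values below are placeholders):

export INFLUX_HOST=http://localhost:8086
export INFLUX_ORG=example-org
export INFLUX_TOKEN=mY5uP3rS3cR3tT0k3n
export INFLUX_BUCKET_NAME=example-bucket

influx write --file path/to/data.lp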

Examples

Authentication credentials

The examples below assume your InfluxDB host, organization, and token are provided by the active influx CLI configuration. If you do not have a CLI configuration set up, use the appropriate flags to provide these required credentials.
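
For example, to provide the credentials explicitly (the host, organization, token, and file path below are placeholders):

influx write \
  --host http://localhost:8086 \
  --org example-org \
  --token mY5uP3rS3cR3tT0k3n \
  --bucket example-bucket \
  --file path/to/data.lp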


Line protocol

Write line protocol via stdin
influx write --bucket example-bucket "
m,host=host1 field1=1.2
m,host=host2 field1=2.4
m,host=host1 field2=5i
m,host=host2 field2=3i
"
Write line protocol from a file
influx write \
  --bucket example-bucket \
  --file path/to/line-protocol.txt
Write line protocol from multiple files
influx write \
  --bucket example-bucket \
  --file path/to/line-protocol-1.txt \
  --file path/to/line-protocol-2.txt
Write line protocol from a URL
influx write \
  --bucket example-bucket \
  --url https://example.com/line-protocol.txt
Write line protocol from multiple URLs
influx write \
  --bucket example-bucket \
  --url https://example.com/line-protocol-1.txt \
  --url https://example.com/line-protocol-2.txt
Write line protocol from multiple sources
influx write \
  --bucket example-bucket \
  --file path/to/line-protocol-1.txt \
  --url https://example.com/line-protocol-2.txt
Write line protocol from a compressed file
# The influx CLI assumes files with the .gz extension use gzip compression
influx write \
  --bucket example-bucket \
  --file path/to/line-protocol.txt.gz

# Specify gzip compression for gzipped files without the .gz extension
influx write \
  --bucket example-bucket \
  --file path/to/line-protocol.txt.comp \
  --compression gzip

CSV

Write annotated CSV data via stdin
influx write \
  --bucket example-bucket \
  --format csv \
  "#group,false,false,false,false,true,true
#datatype,string,long,dateTime:RFC3339,double,string,string
#default,_result,,,,,
,result,table,_time,_value,_field,_measurement
,,0,2020-12-18T18:16:11Z,72.7,temp,sensorData
,,0,2020-12-18T18:16:21Z,73.8,temp,sensorData
,,0,2020-12-18T18:16:31Z,72.7,temp,sensorData
,,0,2020-12-18T18:16:41Z,72.8,temp,sensorData
,,0,2020-12-18T18:16:51Z,73.1,temp,sensorData
"
Write extended annotated CSV data via stdin
influx write \
  --bucket example-bucket \
  --format csv \
  "#constant measurement,sensorData
#datatype,dateTime:RFC3339,double
time,temperature
2020-12-18T18:16:11Z,72.7
2020-12-18T18:16:21Z,73.8
2020-12-18T18:16:31Z,72.7
2020-12-18T18:16:41Z,72.8
2020-12-18T18:16:51Z,73.1
"
Write annotated CSV data from a file
influx write \
  --bucket example-bucket \
  --file path/to/data.csv
Write annotated CSV data from multiple files
influx write \
  --bucket example-bucket \
  --file path/to/data-1.csv \
  --file path/to/data-2.csv
Write annotated CSV data from a URL
influx write \
  --bucket example-bucket \
  --url https://example.com/data.csv
Write annotated CSV data from multiple URLs
influx write \
  --bucket example-bucket \
  --url https://example.com/data-1.csv \
  --url https://example.com/data-2.csv
Write annotated CSV data from multiple sources
influx write \
  --bucket example-bucket \
  --file path/to/data-1.csv \
  --url https://example.com/data-2.csv
Prepend CSV data with annotation headers
influx write \
  --bucket example-bucket \
  --header "#constant measurement,birds" \
  --header "#datatype dateTime:2006-01-02,long,tag" \
  --file path/to/data.csv
Write annotated CSV data from a compressed file
# The influx CLI assumes files with the .gz extension use gzip compression
influx write \
  --bucket example-bucket \
  --file path/to/data.csv.gz

# Specify gzip compression for gzipped files without the .gz extension
influx write \
  --bucket example-bucket \
  --file path/to/data.csv.comp \
  --compression gzip
Write annotated CSV data using rate limiting
influx write \
  --bucket example-bucket \
  --file path/to/data.csv \
  --rate-limit "5 MB / 5 min"
