ANALYZE Statements

ANALYZE statements collect statistics for existing tables and store the results in the catalog. Currently only ANALYZE TABLE statements are supported, and they must be triggered manually rather than automatically.

Attention Currently, ANALYZE TABLE is only supported in batch mode. Only existing tables are supported; an exception will be thrown if the table is a view or does not exist.
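
Because ANALYZE TABLE is batch-only, the TableEnvironment used in the examples below has to be created in batch mode. The following is a minimal setup sketch (the connector options hidden behind with (...) in the examples are omitted here as well):

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

// Create a TableEnvironment in batch mode; issuing ANALYZE TABLE from a
// streaming environment fails because the statement is only supported in batch mode.
EnvironmentSettings settings = EnvironmentSettings.newInstance()
        .inBatchMode()
        .build();
TableEnvironment tableEnv = TableEnvironment.create(settings);

In the SQL CLI, the runtime mode can typically be switched with SET 'execution.runtime-mode' = 'batch'; before issuing the ANALYZE TABLE statements.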

Run an ANALYZE TABLE statement

Java

ANALYZE TABLE statements can be executed with the executeSql() method of the TableEnvironment.

The following examples show how to run an ANALYZE TABLE statement in TableEnvironment.

Scala

ANALYZE TABLE statements can be executed with the executeSql() method of the TableEnvironment.

The following examples show how to run an ANALYZE TABLE statement in TableEnvironment.

Python

ANALYZE TABLE statements can be executed with the execute_sql() method of the TableEnvironment.

The following examples show how to run an ANALYZE TABLE statement in TableEnvironment.

SQL CLI

ANALYZE TABLE statements can be executed in SQL CLI.

The following examples show how to run an ANALYZE TABLE statement in SQL CLI.

Java

TableEnvironment tableEnv = TableEnvironment.create(...);

// register a non-partitioned table named "Store"
tableEnv.executeSql(
        "CREATE TABLE Store (" +
        " `id` BIGINT NOT NULL," +
        " `location` VARCHAR(32)," +
        " `owner` VARCHAR(32)" +
        ") with (...)");

// register a partitioned table named "Orders"
tableEnv.executeSql(
        "CREATE TABLE Orders (" +
        " `id` BIGINT NOT NULL," +
        " `product` VARCHAR(32)," +
        " `amount` INT," +
        " `sold_year` BIGINT," +
        " `sold_month` BIGINT," +
        " `sold_day` BIGINT" +
        ") PARTITIONED BY (`sold_year`, `sold_month`, `sold_day`)" +
        " with (...)");

// Non-partitioned table, collect row count.
tableEnv.executeSql("ANALYZE TABLE Store COMPUTE STATISTICS");
// Non-partitioned table, collect row count and statistics for all columns.
tableEnv.executeSql("ANALYZE TABLE Store COMPUTE STATISTICS FOR ALL COLUMNS");
// Non-partitioned table, collect row count and statistics for column `location`.
tableEnv.executeSql("ANALYZE TABLE Store COMPUTE STATISTICS FOR COLUMNS location");

// Suppose table "Orders" has 4 partitions with specs:
// Partition1 : (sold_year='2022', sold_month='1', sold_day='10')
// Partition2 : (sold_year='2022', sold_month='1', sold_day='11')
// Partition3 : (sold_year='2022', sold_month='2', sold_day='10')
// Partition4 : (sold_year='2022', sold_month='2', sold_day='11')

// Partitioned table, collect row count for Partition1.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day='10') COMPUTE STATISTICS");
// Partitioned table, collect row count for Partition1 and Partition2.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day) COMPUTE STATISTICS");
// Partitioned table, collect row count for all partitions.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year, sold_month, sold_day) COMPUTE STATISTICS");
// Partitioned table, collect row count and statistics for all columns on Partition1.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day='10') COMPUTE STATISTICS FOR ALL COLUMNS");
// Partitioned table, collect row count and statistics for all columns on Partition1 and Partition2.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day) COMPUTE STATISTICS FOR ALL COLUMNS");
// Partitioned table, collect row count and statistics for all columns on all partitions.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year, sold_month, sold_day) COMPUTE STATISTICS FOR ALL COLUMNS");
// Partitioned table, collect row count and statistics for column `amount` on Partition1.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day='10') COMPUTE STATISTICS FOR COLUMNS amount");
// Partitioned table, collect row count and statistics for columns `amount` and `product` on Partition1 and Partition2.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day) COMPUTE STATISTICS FOR COLUMNS amount, product");
// Partitioned table, collect row count and statistics for columns `amount` and `product` on all partitions.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year, sold_month, sold_day) COMPUTE STATISTICS FOR COLUMNS amount, product");

Scala

val tableEnv = TableEnvironment.create(...)

// register a non-partitioned table named "Store"
tableEnv.executeSql(
  "CREATE TABLE Store (" +
  " `id` BIGINT NOT NULL," +
  " `location` VARCHAR(32)," +
  " `owner` VARCHAR(32)" +
  ") with (...)")

// register a partitioned table named "Orders"
tableEnv.executeSql(
  "CREATE TABLE Orders (" +
  " `id` BIGINT NOT NULL," +
  " `product` VARCHAR(32)," +
  " `amount` INT," +
  " `sold_year` BIGINT," +
  " `sold_month` BIGINT," +
  " `sold_day` BIGINT" +
  ") PARTITIONED BY (`sold_year`, `sold_month`, `sold_day`)" +
  " with (...)")

// Non-partitioned table, collect row count.
tableEnv.executeSql("ANALYZE TABLE Store COMPUTE STATISTICS")
// Non-partitioned table, collect row count and statistics for all columns.
tableEnv.executeSql("ANALYZE TABLE Store COMPUTE STATISTICS FOR ALL COLUMNS")
// Non-partitioned table, collect row count and statistics for column `location`.
tableEnv.executeSql("ANALYZE TABLE Store COMPUTE STATISTICS FOR COLUMNS location")

// Suppose table "Orders" has 4 partitions with specs:
// Partition1 : (sold_year='2022', sold_month='1', sold_day='10')
// Partition2 : (sold_year='2022', sold_month='1', sold_day='11')
// Partition3 : (sold_year='2022', sold_month='2', sold_day='10')
// Partition4 : (sold_year='2022', sold_month='2', sold_day='11')

// Partitioned table, collect row count for Partition1.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day='10') COMPUTE STATISTICS")
// Partitioned table, collect row count for Partition1 and Partition2.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day) COMPUTE STATISTICS")
// Partitioned table, collect row count for all partitions.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year, sold_month, sold_day) COMPUTE STATISTICS")
// Partitioned table, collect row count and statistics for all columns on Partition1.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day='10') COMPUTE STATISTICS FOR ALL COLUMNS")
// Partitioned table, collect row count and statistics for all columns on Partition1 and Partition2.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day) COMPUTE STATISTICS FOR ALL COLUMNS")
// Partitioned table, collect row count and statistics for all columns on all partitions.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year, sold_month, sold_day) COMPUTE STATISTICS FOR ALL COLUMNS")
// Partitioned table, collect row count and statistics for column `amount` on Partition1.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day='10') COMPUTE STATISTICS FOR COLUMNS amount")
// Partitioned table, collect row count and statistics for columns `amount` and `product` on Partition1 and Partition2.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day) COMPUTE STATISTICS FOR COLUMNS amount, product")
// Partitioned table, collect row count and statistics for columns `amount` and `product` on all partitions.
tableEnv.executeSql("ANALYZE TABLE Orders PARTITION(sold_year, sold_month, sold_day) COMPUTE STATISTICS FOR COLUMNS amount, product")

Python

table_env = TableEnvironment.create(...)

# register a non-partitioned table named "Store"
table_env.execute_sql(
    "CREATE TABLE Store ("
    " `id` BIGINT NOT NULL,"
    " `location` VARCHAR(32),"
    " `owner` VARCHAR(32)"
    ") with (...)")

# register a partitioned table named "Orders"
table_env.execute_sql(
    "CREATE TABLE Orders ("
    " `id` BIGINT NOT NULL,"
    " `product` VARCHAR(32),"
    " `amount` INT,"
    " `sold_year` BIGINT,"
    " `sold_month` BIGINT,"
    " `sold_day` BIGINT"
    ") PARTITIONED BY (`sold_year`, `sold_month`, `sold_day`)"
    " with (...)")

# Non-partitioned table, collect row count.
table_env.execute_sql("ANALYZE TABLE Store COMPUTE STATISTICS")
# Non-partitioned table, collect row count and statistics for all columns.
table_env.execute_sql("ANALYZE TABLE Store COMPUTE STATISTICS FOR ALL COLUMNS")
# Non-partitioned table, collect row count and statistics for column `location`.
table_env.execute_sql("ANALYZE TABLE Store COMPUTE STATISTICS FOR COLUMNS location")

# Suppose table "Orders" has 4 partitions with specs:
# Partition1 : (sold_year='2022', sold_month='1', sold_day='10')
# Partition2 : (sold_year='2022', sold_month='1', sold_day='11')
# Partition3 : (sold_year='2022', sold_month='2', sold_day='10')
# Partition4 : (sold_year='2022', sold_month='2', sold_day='11')

# Partitioned table, collect row count for Partition1.
table_env.execute_sql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day='10') COMPUTE STATISTICS")
# Partitioned table, collect row count for Partition1 and Partition2.
table_env.execute_sql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day) COMPUTE STATISTICS")
# Partitioned table, collect row count for all partitions.
table_env.execute_sql("ANALYZE TABLE Orders PARTITION(sold_year, sold_month, sold_day) COMPUTE STATISTICS")
# Partitioned table, collect row count and statistics for all columns on Partition1.
table_env.execute_sql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day='10') COMPUTE STATISTICS FOR ALL COLUMNS")
# Partitioned table, collect row count and statistics for all columns on Partition1 and Partition2.
table_env.execute_sql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day) COMPUTE STATISTICS FOR ALL COLUMNS")
# Partitioned table, collect row count and statistics for all columns on all partitions.
table_env.execute_sql("ANALYZE TABLE Orders PARTITION(sold_year, sold_month, sold_day) COMPUTE STATISTICS FOR ALL COLUMNS")
# Partitioned table, collect row count and statistics for column `amount` on Partition1.
table_env.execute_sql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day='10') COMPUTE STATISTICS FOR COLUMNS amount")
# Partitioned table, collect row count and statistics for columns `amount` and `product` on Partition1 and Partition2.
table_env.execute_sql("ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day) COMPUTE STATISTICS FOR COLUMNS amount, product")
# Partitioned table, collect row count and statistics for columns `amount` and `product` on all partitions.
table_env.execute_sql("ANALYZE TABLE Orders PARTITION(sold_year, sold_month, sold_day) COMPUTE STATISTICS FOR COLUMNS amount, product")

SQL CLI

Flink SQL> CREATE TABLE Store (
> `id` BIGINT NOT NULL,
> `location` VARCHAR(32),
> `owner` VARCHAR(32)
> ) with (
> ...
> );
[INFO] Table has been created.

Flink SQL> CREATE TABLE Orders (
> `id` BIGINT NOT NULL,
> `product` VARCHAR(32),
> `amount` INT,
> `sold_year` BIGINT,
> `sold_month` BIGINT,
> `sold_day` BIGINT
> ) PARTITIONED BY (`sold_year`, `sold_month`, `sold_day`)
> with (
> ...
> );
[INFO] Table has been created.

Flink SQL> ANALYZE TABLE Store COMPUTE STATISTICS;
[INFO] Execute statement succeed.

Flink SQL> ANALYZE TABLE Store COMPUTE STATISTICS FOR ALL COLUMNS;
[INFO] Execute statement succeed.

Flink SQL> ANALYZE TABLE Store COMPUTE STATISTICS FOR COLUMNS location;
[INFO] Execute statement succeed.

Flink SQL> ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day='10') COMPUTE STATISTICS;
[INFO] Execute statement succeed.

Flink SQL> ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day) COMPUTE STATISTICS;
[INFO] Execute statement succeed.

Flink SQL> ANALYZE TABLE Orders PARTITION(sold_year, sold_month, sold_day) COMPUTE STATISTICS;
[INFO] Execute statement succeed.

Flink SQL> ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day='10') COMPUTE STATISTICS FOR ALL COLUMNS;
[INFO] Execute statement succeed.

Flink SQL> ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day) COMPUTE STATISTICS FOR ALL COLUMNS;
[INFO] Execute statement succeed.

Flink SQL> ANALYZE TABLE Orders PARTITION(sold_year, sold_month, sold_day) COMPUTE STATISTICS FOR ALL COLUMNS;
[INFO] Execute statement succeed.

Flink SQL> ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day='10') COMPUTE STATISTICS FOR COLUMNS amount;
[INFO] Execute statement succeed.

Flink SQL> ANALYZE TABLE Orders PARTITION(sold_year='2022', sold_month='1', sold_day) COMPUTE STATISTICS FOR COLUMNS amount, product;
[INFO] Execute statement succeed.

Flink SQL> ANALYZE TABLE Orders PARTITION(sold_year, sold_month, sold_day) COMPUTE STATISTICS FOR COLUMNS amount, product;
[INFO] Execute statement succeed.
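
The statistics collected by the statements above are written to the catalog and can be read back through the Catalog API. The following Java sketch is an illustration only; it assumes the Orders table and Partition1 from the examples, and a catalog implementation that persists statistics (for example the Hive catalog):

import java.util.HashMap;
import java.util.Map;
import org.apache.flink.table.catalog.Catalog;
import org.apache.flink.table.catalog.CatalogPartitionSpec;
import org.apache.flink.table.catalog.ObjectPath;
import org.apache.flink.table.catalog.stats.CatalogTableStatistics;

// Resolve the current catalog and the path of the "Orders" table.
Catalog catalog = tableEnv.getCatalog(tableEnv.getCurrentCatalog()).get();
ObjectPath ordersPath = new ObjectPath(tableEnv.getCurrentDatabase(), "Orders");

// Partition1 from the examples above.
Map<String, String> partitionSpec = new HashMap<>();
partitionSpec.put("sold_year", "2022");
partitionSpec.put("sold_month", "1");
partitionSpec.put("sold_day", "10");

// Row count written by the partition-level ANALYZE TABLE ... COMPUTE STATISTICS statement.
CatalogTableStatistics partitionStats =
        catalog.getPartitionStatistics(ordersPath, new CatalogPartitionSpec(partitionSpec));
System.out.println("row count of Partition1: " + partitionStats.getRowCount());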

Syntax

ANALYZE TABLE [catalog_name.][db_name.]table_name PARTITION(partcol1[=val1] [, partcol2[=val2], ...]) COMPUTE STATISTICS [FOR COLUMNS col1 [, col2, ...] | FOR ALL COLUMNS]
  • PARTITION(partcol1[=val1] [, partcol2[=val2], …]) is required for partitioned tables

    • If no partition value is specified, the statistics will be gathered for all partitions
    • If a certain partition is specified, the statistics will be gathered only for that partition
    • If the table is a non-partitioned table but a partition is specified, an exception will be thrown
    • If a certain partition is specified but that partition does not exist, an exception will be thrown
  • FOR COLUMNS col1 [, col2, …] or FOR ALL COLUMNS is optional

    • If no column is specified, only table-level statistics will be gathered
    • If a column does not exist, or the column is not a physical column, an exception will be thrown
    • If specific columns or all columns are specified, column-level statistics will be gathered (a sketch that reads these statistics back from the catalog follows the note after the table below)
      • The column-level statistics include:
        • ndv: the number of distinct values
        • nullCount: the number of nulls
        • avgLen: the average length of column values
        • maxLen: the maximum length of column values
        • minValue: the minimum of the column values
        • maxValue: the maximum of the column values
        • valueCount: the value count (only for BOOLEAN type)
      • The supported types and their corresponding column-level statistics are listed in the following table (“Y” means supported, “N” means unsupported):
Types                           | ndv | nullCount | avgLen | maxLen | maxValue | minValue | valueCount
--------------------------------|-----|-----------|--------|--------|----------|----------|-----------
BOOLEAN                         | N   | Y         | N      | N      | N        | N        | Y
TINYINT                         | Y   | Y         | N      | N      | Y        | Y        | N
SMALLINT                        | Y   | Y         | N      | N      | Y        | Y        | N
INTEGER                         | Y   | Y         | N      | N      | Y        | Y        | N
FLOAT                           | Y   | Y         | N      | N      | Y        | Y        | N
DATE                            | Y   | Y         | N      | N      | Y        | Y        | N
TIME_WITHOUT_TIME_ZONE          | Y   | Y         | N      | N      | Y        | Y        | N
BIGINT                          | Y   | Y         | N      | N      | Y        | Y        | N
DOUBLE                          | Y   | Y         | N      | N      | Y        | Y        | N
DECIMAL                         | Y   | Y         | N      | N      | Y        | Y        | N
TIMESTAMP_WITH_LOCAL_TIME_ZONE  | Y   | Y         | N      | N      | Y        | Y        | N
TIMESTAMP_WITHOUT_TIME_ZONE     | Y   | Y         | N      | N      | Y        | Y        | N
CHAR                            | Y   | Y         | Y      | Y      | N        | N        | N
VARCHAR                         | Y   | Y         | Y      | Y      | N        | N        | N
other types                     | N   | Y         | N      | N      | N        | N        | N

NOTE: For fixed-length types (such as BOOLEAN, INTEGER, DOUBLE, etc.), avgLen and maxLen do not need to be collected from the original records.
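
The column-level statistics listed above are likewise stored in the catalog and can be inspected programmatically. A minimal Java sketch, assuming the Store table from the earlier examples has already been analyzed with FOR ALL COLUMNS and that the current catalog persists column statistics:

import java.util.Map;
import org.apache.flink.table.catalog.Catalog;
import org.apache.flink.table.catalog.ObjectPath;
import org.apache.flink.table.catalog.stats.CatalogColumnStatistics;
import org.apache.flink.table.catalog.stats.CatalogColumnStatisticsDataBase;

// After "ANALYZE TABLE Store COMPUTE STATISTICS FOR ALL COLUMNS",
// column-level statistics for "Store" are available in the catalog.
Catalog catalog = tableEnv.getCatalog(tableEnv.getCurrentCatalog()).get();
ObjectPath storePath = new ObjectPath(tableEnv.getCurrentDatabase(), "Store");

CatalogColumnStatistics columnStats = catalog.getTableColumnStatistics(storePath);
for (Map.Entry<String, CatalogColumnStatisticsDataBase> entry :
        columnStats.getColumnStatisticsData().entrySet()) {
    // Each value is a type-specific statistics object carrying the fields listed
    // above (ndv, nullCount, avgLen/maxLen or min/max values), depending on the column type.
    System.out.println(entry.getKey() + " -> " + entry.getValue());
}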