Hive Connector GCS Tutorial

Preliminary Steps

Ensure Access to GCS

Access to Cloud Storage data is possible through the Hadoop Cloud Storage connector.

If your data is publicly available, you do not need to do anything here. However, in most cases data is not publicly available, and the openLooKeng cluster needs to have access to it. This is typically achieved by creating a service account which has permissions to access your data. You can do this on the service accounts page in GCP. Once you create a service account, create a key for it and download the key in JSON format.
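
If you prefer the command line, the same steps can be sketched with the gcloud and gsutil tools. This is only an illustrative sketch: the service account name openlookeng-gcs, the project my-project, and the granted role are placeholders you should adapt.

    # create a service account (name and display name are illustrative)
    gcloud iam service-accounts create openlookeng-gcs --display-name="openLooKeng GCS access"

    # grant the account read access to your bucket (choose the role your workload needs)
    gsutil iam ch serviceAccount:openlookeng-gcs@my-project.iam.gserviceaccount.com:roles/storage.objectViewer gs://my_bucket

    # create and download a key for the account in JSON format
    gcloud iam service-accounts keys create gcs_keyfile.json --iam-account=openlookeng-gcs@my-project.iam.gserviceaccount.com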

Hive Connector configuration

Another requirement is that you have enabled and configured the Hive connector in openLooKeng. The connector uses the Hive metastore for data discovery and is not limited to data residing on HDFS.

Configuring Hive Connector

  • URL to Hive metastore:

    • New Hive metastore on GCP:

      If your openLooKeng nodes are provisioned by GCP, your Hive metastore should also be on GCP to minimize latency and costs. The simplest way to create a new Hive metastore on GCP is to create a small Cloud Dataproc cluster (1 master, 0 workers) that is accessible from your openLooKeng cluster. Once it is running, follow the steps for an existing Hive metastore below.

    • Existing Hive metastore:

      To use an existing Hive metastore with an openLooKeng cluster, set the hive.metastore.uri property in your Hive catalog properties file to thrift://${METASTORE_ADDRESS}:${METASTORE_THRIFT_PORT} (a complete sample catalog file is sketched after this list). If the metastore uses authentication, please refer to hive-security.

  • GCS access:

    Here are example values for all GCS configuration properties that can be set in the Hive catalog properties file:

    # JSON key file used to access Google Cloud Storage
    hive.gcs.json-key-file-path=/path/to/gcs_keyfile.json
    # Use client-provided OAuth token to access Google Cloud Storage
    hive.gcs.use-access-token=false
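
As referenced above, here is a minimal sketch of a complete Hive catalog properties file (for example etc/catalog/hive.properties), assuming the hive-hadoop2 connector name used by openLooKeng's Hive connector. The metastore address example.net:9083 and the key file path are placeholders:

    connector.name=hive-hadoop2
    hive.metastore.uri=thrift://example.net:9083
    hive.gcs.json-key-file-path=/path/to/gcs_keyfile.json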

Hive Metastore configuration

If your Hive metastore uses StorageBasedAuthorization, it will also need to access GCS to perform POSIX permission checks. Configuring GCS access for Hive itself is outside the scope of this tutorial, but there are some excellent guides online.

GCS access is typically configured in core-site.xml, which is used by all components based on Apache Hadoop.
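
As a rough sketch, service account authentication for the Hadoop GCS connector is configured in core-site.xml along these lines (exact property names vary between connector versions, and the key file path is a placeholder):

    <property>
        <name>google.cloud.auth.service.account.enable</name>
        <value>true</value>
    </property>
    <property>
        <name>google.cloud.auth.service.account.json.keyfile</name>
        <value>/path/to/gcs_keyfile.json</value>
    </property>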

The GCS connector for Hadoop provides an implementation of a Hadoop FileSystem. Unfortunately, GCS IAM permissions do not map to the POSIX permissions required by Hadoop FileSystem, so the GCS connector presents fake POSIX file permissions.

When the Hive metastore accesses GCS, it will by default see fake POSIX permissions equal to 0700. If openLooKeng and the Hive metastore are running as different user accounts, this will cause the Hive metastore to deny openLooKeng data access. There are two possible solutions to this problem:

  • Run the openLooKeng service and the Hive service as the same user.
  • Make sure the Hive GCS configuration includes the fs.gs.reported.permissions property with a value of 777 (see the sketch after this list).
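
For the second option, the corresponding core-site.xml entry would look like this:

    <property>
        <name>fs.gs.reported.permissions</name>
        <value>777</value>
    </property>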

Accessing GCS Data From openLooKeng for the First Time

Accessing Data Already Mapped in the Hive metastore

If you migrate to openLooKeng from Hive, chances are that your GCS data is already mapped to SQL tables in the metastore. In that case, you should be able to query it.
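
For example, assuming your catalog is named hive and the existing tables live in the default schema (a hypothetical orders table is used below for illustration):

    SHOW TABLES FROM hive.default;
    SELECT * FROM hive.default.orders LIMIT 10;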

Accessing Data Not Yet Mapped in the Hive metastore

To access GCS data that is not yet mapped in the Hive metastore, you need to provide the schema of the data, the file format, and the data location. For example, if you have ORC or Parquet files in a GCS bucket named my_bucket, you will need to execute a query like the following:

  -- select schema in which the table will be defined, must already exist
  USE hive.default;

  -- create table
  CREATE TABLE orders (
      orderkey bigint,
      custkey bigint,
      orderstatus varchar(1),
      totalprice double,
      orderdate date,
      orderpriority varchar(15),
      clerk varchar(15),
      shippriority integer,
      comment varchar(79)
  ) WITH (
      external_location = 'gs://my_bucket/path/to/folder',
      format = 'ORC' -- or 'PARQUET'
  );

Now you should be able to query the newly mapped table:

  SELECT * FROM orders;

Writing GCS Data with openLooKeng

Prerequisites

Before you attempt to write data to GCS, make sure you have configured everything necessary to read data from GCS.

Create Export Schema

If the Hive metastore contains schemas mapped to GCS locations, you can use them to export data to GCS. If you don’t want to use existing schemas (or there are no appropriate schemas in the Hive metastore), you need to create a new one:

  CREATE SCHEMA hive.gcs_export WITH (location = 'gs://my_bucket/some/path');

Export Data to GCS

Once you have a schema pointing to a location where you want to export the data, you can issue the export using a CREATE TABLE AS statement and select your desired file format. The data will be written to one or more files within the gs://my_bucket/some/path/my_table namespace.

Example:

  CREATE TABLE hive.gcs_export.orders_export
  WITH (format = 'ORC')
  AS SELECT * FROM tpch.sf1.orders;
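
To sanity-check the export, you can compare the row count of the new table against the source:

    SELECT count(*) FROM hive.gcs_export.orders_export;
    SELECT count(*) FROM tpch.sf1.orders;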