This section contains an overview of the Greenplum Database PL/Python Language.

About Greenplum PL/Python

PL/Python is a loadable procedural language. With the Greenplum Database PL/Python extensions, you can write Greenplum Database user-defined functions in Python that take advantage of Python features and modules to quickly build robust database applications.

You can also run PL/Python code as an anonymous block with the DO command. See the DO command in the Greenplum Database Reference Guide.
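For example, this anonymous block is a minimal sketch (the query and message text are illustrative) that uses plpy.execute and plpy.notice to report the current database:

    DO $$
    dbname = plpy.execute("SELECT current_database() AS db")[0]["db"]
    plpy.notice("connected to database %s" % dbname)
    $$ LANGUAGE plpythonu;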

The Greenplum Database PL/Python extensions are installed by default with Greenplum Database. Two extensions are provided:

  • plpythonu supports developing functions using Python 2.7. Greenplum Database installs a version of Python 2.7 for plpythonu at $GPHOME/ext/python.
  • plpython3u, introduced with Greenplum 6.22, supports developing functions using Python 3.9. Greenplum Database installs a compatible Python at $GPHOME/ext/python3.9.

Greenplum Database PL/Python Limitations

  • Greenplum Database does not support PL/Python triggers.
  • PL/Python is available only as a Greenplum Database untrusted language.
  • Updatable cursors (UPDATE...WHERE CURRENT OF and DELETE...WHERE CURRENT OF) are not supported.
  • Within a single Greenplum Database session, all PL/Python functions must be called using the same PL/Python version, either plpythonu or plpython3u. You must start a new session before you can call a function created with the other PL/Python version (for example, to call a plpythonu function after calling a plpython3u function, or vice versa).

Enabling and Removing PL/Python Support

The PL/Python language is installed with Greenplum Database. To create and run a PL/Python user-defined function (UDF) in a database, you must register the PL/Python language with the database.

Enabling PL/Python Support

Greenplum installs compatible versions of Python 2.7 and 3.9 in $GPHOME/ext.

For each database that requires its use, register the PL/Python language with the SQL command CREATE EXTENSION. Separate extensions are provided for Python 2.7 and Python 3.9 support, and you can install either or both extensions to a database.

Because PL/Python is an untrusted language, only superusers can register PL/Python with a database.

For example, run this command as the gpadmin user to register PL/Python with Python 2.7 support in the database named testdb:

    $ psql -d testdb -c 'CREATE EXTENSION plpythonu;'

Run this command as the gpadmin user to register PL/Python with Python 3.9 support:

    $ psql -d testdb -c 'CREATE EXTENSION plpython3u;'

PL/Python is registered as an untrusted language.

Removing PL/Python Support

For a database that no longer requires the PL/Python language, remove support for PL/Python with the SQL command DROP EXTENSION. Because PL/Python is an untrusted language, only superusers can remove support for the PL/Python language from a database. For example, running this command as the gpadmin user removes support for PL/Python for Python 2.7 from the database named testdb:

    $ psql -d testdb -c 'DROP EXTENSION plpythonu;'

Run this command as the gpadmin user to remove support for PL/Python for Python 3.9:

    $ psql -d testdb -c 'DROP EXTENSION plpython3u;'

By default, the DROP EXTENSION command fails if any existing objects (such as functions) depend on the language. Specify the CASCADE option to also drop all dependent objects, including functions that you created with PL/Python.
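For example, assuming the testdb database used above, this command (run as a superuser) removes the Python 2.7 extension together with any dependent objects:

    $ psql -d testdb -c 'DROP EXTENSION plpythonu CASCADE;'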

Developing Functions with PL/Python

The body of a PL/Python user-defined function is a Python script. When the function is called, its arguments are passed as elements of the list args; named arguments are also passed as ordinary variables to the Python script. The result is returned from the PL/Python function with a return statement, or with a yield statement in the case of a set-returning function.

PL/Python translates Python’s None into the SQL null value.
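The following minimal sketch (the function name is illustrative) shows a named argument used directly in the function body, and None returned as an SQL null value:

    CREATE OR REPLACE FUNCTION trimmed_or_null(s text) RETURNS text AS $$
    # the named argument "s" is available directly; it is also available as args[0]
    if s is None or s.strip() == '':
        return None       # returned to SQL as a null value
    return s.strip()
    $$ LANGUAGE plpythonu;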

Data Type Mapping

The Greenplum to Python data type mapping follows.

    Greenplum Primitive Type    Python Data Type
    ------------------------    ----------------
    boolean [1]                 bool
    bytea                       bytes
    smallint, bigint, oid       int
    real, double                float
    numeric                     decimal
    other primitive types       string
    SQL null value              None

[1] When the UDF return type is boolean, Greenplum Database evaluates the return value for truth according to Python rules. That is, 0 and empty string are false, but notably 'f' is true.

Example:

    CREATE OR REPLACE FUNCTION pybool_func(a int) RETURNS boolean AS $$
    if a > 0:
        return True
    else:
        return False
    $$ LANGUAGE plpythonu;

    SELECT pybool_func(-1);
     pybool_func
    -------------
     f
    (1 row)

Arrays and Lists

SQL array values are passed into PL/Python functions as Python lists, and a PL/Python function returns an SQL array value by returning a Python list. In the typical PL/Python usage pattern, you specify an array with [].

The following example creates a PL/Python function that returns an array of integers:

    CREATE FUNCTION return_py_int_array()
    RETURNS int[]
    AS $$
    return [1, 11, 21, 31]
    $$ LANGUAGE plpythonu;

    SELECT return_py_int_array();
     return_py_int_array
    ---------------------
     {1,11,21,31}
    (1 row)

PL/Python treats multi-dimensional arrays as lists of lists. You pass a multi-dimensional array to a PL/Python function using nested Python lists. When a PL/Python function returns a multi-dimensional array, the inner lists at each level must all be of the same size.

The following example creates a PL/Python function that takes a multi-dimensional array of integers as input. The function displays the type of the provided argument, and returns the multi-dimensional array:

    CREATE FUNCTION return_multidim_py_array(x int4[])
    RETURNS int4[]
    AS $$
    plpy.info(x, type(x))
    return x
    $$ LANGUAGE plpythonu;

    SELECT * FROM return_multidim_py_array(ARRAY[[1,2,3], [4,5,6]]);
    INFO:  ([[1, 2, 3], [4, 5, 6]], <type 'list'>)
    CONTEXT:  PL/Python function "return_multidim_py_array"
     return_multidim_py_array
    --------------------------
     {{1,2,3},{4,5,6}}
    (1 row)

For backwards compatibility with Greenplum versions in which multi-dimensional arrays were not supported, PL/Python also accepts other Python sequences, such as tuples, when returning an array value. In such cases, the Python sequences are always treated as one-dimensional arrays because they are ambiguous with composite types.

Composite Types

You pass composite-type arguments to a PL/Python function using Python mappings. The element names of the mapping are the attribute names of the composite type. If an attribute has the null value, its mapping value is None.
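For example, this minimal sketch (the type and function names are illustrative) reads the attributes of a composite-type argument through its mapping keys:

    CREATE TYPE named_value AS (
      name text,
      value int4
    );

    CREATE FUNCTION format_named_value(nv named_value)
    RETURNS text
    AS $$
    # nv is a Python mapping; its keys are the composite type attribute names
    if nv["value"] is None:
        return nv["name"] + "=<null>"
    return "%s=%d" % (nv["name"], nv["value"])
    $$ LANGUAGE plpythonu;

Calling SELECT format_named_value(ROW('pi', 3)::named_value); returns pi=3.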

You can return a composite type result as a sequence type (tuple or list). You must specify a composite type as a tuple, rather than a list, when it is used in a multi-dimensional array. You cannot return an array of composite types as a list because it would be ambiguous to determine whether the list represents a composite type or another array dimension. In the typical usage pattern, you will specify composite type tuples with ().

In the following example, you create a composite type and a PL/Python function that returns an array of the composite type:

    CREATE TYPE type_record AS (
      first text,
      second int4
    );

    CREATE FUNCTION composite_type_as_list()
    RETURNS type_record[]
    AS $$
    return [[('first', 1), ('second', 1)], [('first', 2), ('second', 2)], [('first', 3), ('second', 3)]];
    $$ LANGUAGE plpythonu;

    SELECT * FROM composite_type_as_list();
                                  composite_type_as_list
    ------------------------------------------------------------------------------------
     {{"(first,1)","(second,1)"},{"(first,2)","(second,2)"},{"(first,3)","(second,3)"}}
    (1 row)

Refer to the PostgreSQL Arrays, Lists documentation for additional information on PL/Python handling of arrays and composite types.

Set-Returning Functions

A Python function can return a set of scalar or composite types from any sequence type (for example: tuple, list, set).

In the following example, you create a composite type and a Python function that returns a SETOF of the composite type:

    CREATE TYPE greeting AS (
      how text,
      who text
    );

    CREATE FUNCTION greet (how text)
    RETURNS SETOF greeting
    AS $$
    # return a tuple containing mappings (dictionaries) as composite types
    # all other combinations work also
    return ( {"how": how, "who": "World"}, {"how": how, "who": "Greenplum"} )
    $$ LANGUAGE plpythonu;

    SELECT greet('hello');
          greet
    -------------------
     (hello,World)
     (hello,Greenplum)
    (2 rows)

Running and Preparing SQL Queries

The PL/Python plpy module provides two Python functions to run an SQL query and prepare an execution plan for a query, plpy.execute and plpy.prepare. Preparing the execution plan for a query is useful if you run the query from multiple Python functions.

PL/Python also supports the plpy.subtransaction() function to help manage plpy.execute calls in an explicit subtransaction. See Explicit Subtransactions in the PostgreSQL documentation for additional information about plpy.subtransaction().
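The following minimal sketch, which assumes a hypothetical accounts table, wraps two plpy.execute calls in an explicit subtransaction so that both updates succeed or fail together:

    CREATE FUNCTION transfer_100(src int, dst int) RETURNS text AS $$
    try:
        with plpy.subtransaction():
            plpy.execute("UPDATE accounts SET balance = balance - 100 WHERE id = %d" % src)
            plpy.execute("UPDATE accounts SET balance = balance + 100 WHERE id = %d" % dst)
    except plpy.SPIError as e:
        # both UPDATEs are rolled back; the enclosing transaction continues
        return "transfer failed: %s" % e
    return "transfer complete"
    $$ LANGUAGE plpythonu;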

plpy.execute

Calling plpy.execute with a query string and an optional limit argument causes the query to be run and the result to be returned in a Python result object. The result object emulates a list or dictionary object. The rows returned in the result object can be accessed by row number and column name. The result set row numbering starts with 0 (zero). The result object can be modified. The result object has these additional methods:

  • nrows(), which returns the number of rows returned by the query.
  • status(), which returns the SPI_execute() return value.

For example, this Python statement in a PL/Python user-defined function runs a query.

    rv = plpy.execute("SELECT * FROM my_table", 5)

The plpy.execute function returns up to 5 rows from my_table. The result set is stored in the rv object. If my_table has a column my_column, it would be accessed as:

    my_col_data = rv[i]["my_column"]

Since the function returns a maximum of 5 rows, the index i can be an integer between 0 and 4.
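You can also call the additional result object methods described above; a minimal sketch continuing this example:

    nrows = rv.nrows()     # number of rows returned by the query
    status = rv.status()   # SPI_execute() return value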

plpy.prepare

The function plpy.prepare prepares the execution plan for a query. It is called with a query string and a list of parameter types, if you have parameter references in the query. For example, this statement can be in a PL/Python user-defined function:

    plan = plpy.prepare("SELECT last_name FROM my_users WHERE first_name = $1", [ "text" ])

The string text is the data type of the value that is passed for the parameter $1. After preparing a statement, you use the function plpy.execute to run it:

    rv = plpy.execute(plan, [ "Fred" ], 5)

The third argument is the limit for the number of rows returned and is optional.

When you prepare an execution plan using the PL/Python module, the plan is automatically saved. See the PostgreSQL Server Programming Interface (SPI) documentation for information about execution plans: https://www.postgresql.org/docs/9.4/spi.html.

To make effective use of saved plans across function calls, use one of the Python persistent storage dictionaries, SD or GD.

The global dictionary SD is available to store data between function calls. This variable is private static data. The global dictionary GD is public data, available to all Python functions within a session. Use GD with care.

Each function gets its own execution environment in the Python interpreter, so global data and function arguments from one function (for example, myfunc) are not available to another function (for example, myfunc2). The exception is the data in the GD dictionary, as mentioned previously.
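A minimal sketch of sharing data between two functions through GD (the function names and dictionary key are illustrative):

    CREATE FUNCTION set_shared(val text) RETURNS void AS $$
    GD["shared_val"] = val    # visible to all PL/Python functions in this session
    $$ LANGUAGE plpythonu;

    CREATE FUNCTION get_shared() RETURNS text AS $$
    return GD.get("shared_val")    # None (SQL null) if set_shared() has not been called
    $$ LANGUAGE plpythonu;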

This example uses the SD dictionary:

    CREATE FUNCTION usesavedplan() RETURNS text AS $$
    if "plan" in SD:
        plan = SD["plan"]
    else:
        plan = plpy.prepare("SELECT 1")
        SD["plan"] = plan
    # rest of function
    $$ LANGUAGE plpythonu;

Handling Python Errors and Messages

The Python module plpy implements these functions to manage errors and messages:

  • plpy.debug
  • plpy.log
  • plpy.info
  • plpy.notice
  • plpy.warning
  • plpy.error
  • plpy.fatal

The message functions plpy.error and plpy.fatal raise a Python exception which, if uncaught, propagates out to the calling query, causing the current transaction or subtransaction to be cancelled. The statements raise plpy.Error(msg) and raise plpy.Fatal(msg) are equivalent to calling plpy.error(msg) and plpy.fatal(msg), respectively. The other message functions only generate messages of different priority levels.

Whether messages of a particular priority are reported to the client, written to the server log, or both is controlled by the Greenplum Database server configuration parameters log_min_messages and client_min_messages. For information about the parameters see the Greenplum Database Reference Guide.
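A minimal sketch (the function name and message text are illustrative) of a function that reports a notice for valid input and aborts the calling query otherwise:

    CREATE FUNCTION check_qty(qty int) RETURNS int AS $$
    if qty is None or qty < 0:
        plpy.error("qty must be a non-negative integer")   # raises an exception and aborts the query
    plpy.notice("qty %d accepted" % qty)
    return qty
    $$ LANGUAGE plpythonu;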

Using the Dictionary GD to Improve PL/Python Performance

Importing a Python module is an expensive operation that can affect performance. If you import the same module frequently, you can use Python global variables to load the module on the first invocation and avoid importing it on subsequent calls. The following PL/Python function uses the GD persistent storage dictionary to avoid importing a module if it has already been imported and stored in GD.

    CREATE FUNCTION pytest() RETURNS text AS $$
    if 'mymodule' not in GD:
        import mymodule
        GD['mymodule'] = mymodule
    return GD['mymodule'].sumd([1,2,3])
    $$ LANGUAGE plpythonu;

Installing Python Modules

When you install a Python module for development with PL/Python, the Greenplum Database Python environment must have the module added to it across all segment hosts and mirror hosts in the cluster. When expanding Greenplum Database, you must add the Python modules to the new segment hosts.

Greenplum Database provides a collection of data science-related Python modules that you can use to easily develop PL/Python functions in Greenplum. The modules are provided as two .gppkg format files that can be installed into a Greenplum cluster using the gppkg utility, with one package supporting development with Python 2.7 and the other supporting development with Python 3.9. See Python Data Science Module Packages for installation instructions and descriptions of the provided modules.

To develop with modules that are not part of the Python Data Science Module packages, you can use Greenplum utilities such as gpssh and gpscp to run commands or copy files to all hosts in the Greenplum cluster. The following sections describe how to use those utilities to install and use additional Python modules.

Verifying the Python Environment

As part of the Greenplum Database installation, the gpadmin user environment is configured to use the Python interpreter that is installed with Greenplum Database. To check the Python environment, you can use the which command:

    which python

The command returns the location of the Python installation. All Greenplum installations include Python 2.7 installed at $GPHOME/ext/python and Python 3.9 installed at $GPHOME/ext/python3.9. To check the Python 3.9 installation:

    which python3.9

When running shell commands on remote hosts with gpssh, specify the -s option to source the greenplum_path.sh file before running commands on the remote hosts. For example, this command should display the Python installed with Greenplum Database on each host specified in the gpdb_hosts file.

    gpssh -s -f gpdb_hosts which python

To display the list of currently installed Python 2.7 modules, run this command.

    python -c "help('modules')"

You can optionally run gpssh in interactive mode to display Python modules on remote hosts. This example starts gpssh in interactive mode and lists the Python modules on the Greenplum Database host sdw1.

    $ gpssh -s -h sdw1
    => python -c "help('modules')"
    . . .
    => exit
    $

Installing Python pip

The Python utility pip installs Python packages that contain Python modules and other resource files from versioned archive files.

Run this command to install pip for Python 2.7:

    python -m ensurepip --default-pip

For Python 3.9, use:

    python3.9 -m ensurepip --default-pip

The command runs the ensurepip module to bootstrap (install and configure) the pip utility from the local Python installation.

You can run this command to ensure that the pip, setuptools, and wheel projects are current. Keeping these projects current ensures that you can install Python packages from source distributions or from pre-built distributions (wheels).

    python -m pip install --upgrade pip setuptools wheel

You can use gpssh to run the commands on the Greenplum Database hosts. This example runs gpssh in interactive mode to install pip on the hosts listed in the file gpdb_hosts.

    $ gpssh -s -f gpdb_hosts
    => python -m ensurepip --default-pip
    [centos6-mdw1] Ignoring indexes: https://pypi.python.org/simple
    [centos6-mdw1] Collecting setuptools
    [centos6-mdw1] Collecting pip
    [centos6-mdw1] Installing collected packages: setuptools, pip
    [centos6-mdw1] Successfully installed pip-8.1.1 setuptools-20.10.1
    [centos6-sdw1] Ignoring indexes: https://pypi.python.org/simple
    [centos6-sdw1] Collecting setuptools
    [centos6-sdw1] Collecting pip
    [centos6-sdw1] Installing collected packages: setuptools, pip
    [centos6-sdw1] Successfully installed pip-8.1.1 setuptools-20.10.1
    => exit
    $

The => is the interactive prompt for gpssh. The utility displays the output from each host. The exit command exits from gpssh interactive mode.

This gpssh command runs a single command on all hosts listed in the file gpdb_hosts.

    gpssh -s -f gpdb_hosts python -m pip install --upgrade pip setuptools wheel

The utility displays the output from each host.

For more information about installing Python packages, see https://packaging.python.org/tutorials/installing-packages/.

Installing Python Packages for Python 2.7

After installing pip, you can install Python packages. This command installs the numpy and scipy packages for Python 2.7:

    python -m pip install --user numpy scipy

For Python 3.9, use the python3.9 command instead:

    python3.9 -m pip install --user numpy scipy

The --user option attempts to avoid conflicts when installing Python packages.

You can use gpssh to run the command on the Greenplum Database hosts.
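For example, assuming the gpdb_hosts file used earlier, a single gpssh invocation installs the packages on every listed host:

    gpssh -s -f gpdb_hosts python -m pip install --user numpy scipy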

For information about these and other Python packages, see References.

Installing Python Packages for Python 3.9

By default, greenplum_path.sh changes the PYTHONPATH and PYTHONHOME environment variables for use with the Python 2.7 environment installed with Greenplum. To install modules for Python 3.9 with pip, you must first unset those environment variables. For example, to install numpy and scipy for Python 3.9:

    gpssh -s -f gpdb_hosts
    => unset PYTHONHOME
    => unset PYTHONPATH
    => $GPHOME/ext/python3.9/bin/python3.9 -m pip install numpy scipy

You can optionally install Python 3.9 modules to a non-standard location by using the --prefix option with pip. For example:

    gpssh -s -f gpdb_hosts
    => unset PYTHONHOME
    => unset PYTHONPATH
    => $GPHOME/ext/python3.9/bin/python3.9 -m pip install --prefix=/home/gpadmin/my_python numpy scipy

If you use this option, keep in mind that the PYTHONPATH environment variable setting is cleared before initializing or executing functions using plpython3u. If you want to use modules installed to a custom location, you must configure the paths to those modules using the Greenplum configuration parameter plpython3.python_path instead of PYTHONPATH. For example:

    $ psql -d testdb
    testdb=# load 'plpython3';
    testdb=# SET plpython3.python_path='/home/gpadmin/my_python';

Greenplum uses the value of plpython3.python_path to set PLPYTHONPATH in the environment used to create or call plpython3u functions.

Note

plpython3.python_path is provided as part of the plpython3u extension, so you must load the extension (with load 'plpython3';) before you can set this configuration parameter in a session.

Ensure that you configure plpython3.python_path before you create or call plpython3u functions in a session. If you set or change the parameter after plpython3u is initialized, you receive the error:

    ERROR: SET PYTHONPATH failed, the GUC value can only be changed before initializing the python interpreter.

To set a default value for the configuration parameter, use gpconfig instead:

    gpconfig -c plpython3.python_path \
      -v "'/home/gpadmin/my_python'" \
      --skipvalidation
    gpstop -u

Building and Installing Python Modules Locally

If you are building a Python module, you must ensure that the build creates the correct executable. For example on a Linux system, the build should create a 64-bit executable.

Before building a Python module to be installed, ensure that the appropriate software to build the module is installed and properly configured. The build environment is required only on the host where you build the module.

You can use the Greenplum Database utilities gpssh and gpscp to run commands on Greenplum Database hosts and to copy files to the hosts.
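For example, after building a module into a wheel on one host, this minimal sketch (the file name and destination path are illustrative) copies the file to every host and installs it:

    # copy the built distribution to each host listed in gpdb_hosts
    gpscp -f gpdb_hosts my_module-1.0-py2-none-any.whl =:/home/gpadmin/
    # install the module into the Greenplum Python environment on every host
    gpssh -s -f gpdb_hosts python -m pip install --user /home/gpadmin/my_module-1.0-py2-none-any.whl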

Testing Installed Python Modules

You can create a simple PL/Python user-defined function (UDF) to validate that a Python module is available in Greenplum Database. This example tests the NumPy module.

This PL/Python UDF imports the NumPy module. The function returns SUCCESS if the module is imported, and FAILURE if an import error occurs.

    CREATE OR REPLACE FUNCTION plpy_test(x int)
    returns text
    as $$
    try:
        from numpy import *
        return 'SUCCESS'
    except ImportError:
        return 'FAILURE'
    $$ language plpythonu;

(If you are using Python 3.9, replace plpythonu with plpython3u in the above command.)

Create a table that contains data on each Greenplum Database segment instance. Depending on the size of your Greenplum Database installation, you might need to generate more data to ensure data is distributed to all segment instances.

    CREATE TABLE DIST AS (SELECT x FROM generate_series(1,50) x ) DISTRIBUTED RANDOMLY ;

This SELECT command runs the UDF on the segment hosts where data is stored in the primary segment instances.

    SELECT gp_segment_id, plpy_test(x) AS status
      FROM dist
      GROUP BY gp_segment_id, status
      ORDER BY gp_segment_id, status;

The SELECT command returns SUCCESS if the UDF imported the Python module on the Greenplum Database segment instance. If the SELECT command returns FAILURE, you can find the host of the failing segment instance. The Greenplum Database system table gp_segment_configuration contains information about mirroring and segment configuration. This command returns the host name for a segment ID.

    SELECT hostname, content AS seg_id FROM gp_segment_configuration
      WHERE content = <seg_id> ;

If FAILURE is returned, these are some possible causes:

  • A problem accessing required libraries. For the NumPy example, Greenplum Database might have a problem accessing the OpenBLAS libraries or the Python libraries on a segment host.

    Make sure that you get no errors when running the command on the segment host as the gpadmin user. This gpssh command tests importing the numpy module on the segment host mdw1.

    gpssh -s -h mdw1 python -c "import numpy"
  • If the Python import command does not return an error, environment variables might not be configured in the Greenplum Database environment. For example, Greenplum Database might not have been restarted after installing the Python package on the host system.

Examples

This PL/Python function example uses Python 3.9 and returns the value of pi using the numpy module:

    CREATE OR REPLACE FUNCTION testpi()
    RETURNS float
    AS $$
    import numpy
    return numpy.pi
    $$ LANGUAGE plpython3u;

Use SELECT to call the function:

    SELECT testpi();
          testpi
    ------------------
     3.14159265358979
    (1 row)

This PL/Python UDF returns the maximum of two integers:

    CREATE FUNCTION pymax (a integer, b integer)
    RETURNS integer
    AS $$
    if (a is None) or (b is None):
        return None
    if a > b:
        return a
    return b
    $$ LANGUAGE plpythonu;

You can use the STRICT property to perform the null handling instead of checking for None in the function body.

    CREATE FUNCTION pymax (a integer, b integer)
    RETURNS integer AS $$
    return max(a,b)
    $$ LANGUAGE plpythonu STRICT ;

You can run the user-defined function pymax with the SELECT command. This example runs the UDF and shows the output.

    SELECT ( pymax(123, 43));
     column1
    ---------
         123
    (1 row)

This example returns data from an SQL query that is run against a table. These two commands create a simple table and add data to the table.

    CREATE TABLE sales (id int, year int, qtr int, day int, region text)
      DISTRIBUTED BY (id) ;

    INSERT INTO sales VALUES
      (1, 2014, 1,1, 'usa'),
      (2, 2002, 2,2, 'europe'),
      (3, 2014, 3,3, 'asia'),
      (4, 2014, 4,4, 'usa'),
      (5, 2014, 1,5, 'europe'),
      (6, 2014, 2,6, 'asia'),
      (7, 2002, 3,7, 'usa') ;

This PL/Python UDF runs a SELECT command that returns 5 rows from the table. The Python function returns the REGION value from the row specified by the input value. In the Python result object, row numbering starts from 0. Valid input for the function is an integer between 0 and 4.

    CREATE OR REPLACE FUNCTION mypytest(a integer)
    RETURNS SETOF text
    AS $$
    rv = plpy.execute("SELECT * FROM sales ORDER BY id", 5)
    region = []
    region.append(rv[a]["region"])
    return region
    $$ LANGUAGE plpythonu EXECUTE ON MASTER;

Running this SELECT statement returns the REGION column value from the third row of the result set.

    SELECT mypytest(2) ;

This command deletes the UDF from the database.

    DROP FUNCTION mypytest(integer) ;

This example runs the PL/Python function in the previous example as an anonymous block with the DO command. In the example, the anonymous block retrieves the input value from a temporary table.

    CREATE TEMP TABLE mytemp AS VALUES (2) DISTRIBUTED RANDOMLY;

    DO $$
    temprow = plpy.execute("SELECT * FROM mytemp", 1)
    myval = temprow[0]["column1"]
    rv = plpy.execute("SELECT * FROM sales ORDER BY id", 5)
    region = rv[myval]["region"]
    plpy.notice("region is %s" % region)
    $$ LANGUAGE plpythonu;

References

Technical References

For information about the Python language, see https://www.python.org/.

For information about PL/Python see the PostgreSQL documentation at https://www.postgresql.org/docs/9.4/plpython.html.

For information about Python Package Index (PyPI), see https://pypi.python.org/pypi.

These are some Python modules that can be installed:

  • The SciPy library provides user-friendly and efficient numerical routines, such as routines for numerical integration and optimization. The SciPy site (http://www.scipy.org/index.html) also lists other related Python libraries.
  • The Natural Language Toolkit (nltk) is a platform for building Python programs that work with human language data (http://www.nltk.org/). For information about installing the toolkit, see http://www.nltk.org/install.html.