Custom Types

A variety of methods exist to redefine the behavior of existing types as well as to provide new ones.

Overriding Type Compilation

A frequent need is to force the “string” version of a type, that is the one rendered in a CREATE TABLE statement or other SQL function like CAST, to be changed. For example, an application may want to force the rendering of BINARY for all platforms except for one, in which it wants BLOB to be rendered. Usage of an existing generic type, in this case LargeBinary, is preferred for most use cases. But to control types more accurately, a compilation directive that is per-dialect can be associated with any type:

    from sqlalchemy.ext.compiler import compiles
    from sqlalchemy.types import BINARY

    @compiles(BINARY, "sqlite")
    def compile_binary_sqlite(type_, compiler, **kw):
        return "BLOB"

The above code allows the usage of types.BINARY, which will produce the string BINARY against all backends except SQLite, in which case it will produce BLOB.

See the section Changing Compilation of Types, a subsection of Custom SQL Constructs and Compilation Extension, for additional examples.

Augmenting Existing Types

The TypeDecorator allows the creation of custom types which add bind-parameter and result-processing behavior to an existing type object. It is used when additional in-Python marshalling of data to and/or from the database is required.

Note

The bind- and result-processing of TypeDecorator is in addition to the processing already performed by the hosted type, which is customized by SQLAlchemy on a per-DBAPI basis to perform processing specific to that DBAPI. While it is possible to replace this handling for a given type through direct subclassing, it is never needed in practice and SQLAlchemy no longer supports this as a public use case.

ORM Tip

The TypeDecorator can be used to provide a consistent means of converting some type of value as it is passed into and out of the database. When using the ORM, a similar technique exists for converting user data from arbitrary formats, which is to use the validates() decorator. This technique may be more appropriate when data coming into an ORM model needs to be normalized in some way that is specific to the business case and isn’t as generic as a datatype.

Allows the creation of types which add additional functionality to an existing type.

This method is preferred to direct subclassing of SQLAlchemy’s built-in types as it ensures that all required functionality of the underlying type is kept in place.

Typical usage:

    import sqlalchemy.types as types

    class MyType(types.TypeDecorator):
        '''Prefixes Unicode values with "PREFIX:" on the way in and
        strips it off on the way out.
        '''

        impl = types.Unicode

        def process_bind_param(self, value, dialect):
            return "PREFIX:" + value

        def process_result_value(self, value, dialect):
            return value[7:]

        def copy(self, **kw):
            return MyType(self.impl.length)
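The bind/result symmetry that this example relies on can be checked with plain functions mirroring the two hooks; the helper names below are illustrative stand-ins, not part of SQLAlchemy:

```python
PREFIX = "PREFIX:"

def bind(value):
    # mirrors MyType.process_bind_param: prepend the marker on the way in
    return PREFIX + value

def result(value):
    # mirrors MyType.process_result_value: len("PREFIX:") == 7
    return value[7:]

assert result(bind("hello")) == "hello"
```

Whatever process_bind_param adds, process_result_value must be able to strip off again, since the ORM will round-trip values through both hooks.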

The class-level “impl” attribute is required, and can reference any TypeEngine class. Alternatively, the load_dialect_impl() method can be used to provide different type classes based on the dialect given; in this case, the “impl” variable can reference TypeEngine as a placeholder.

Types that receive a Python type that isn’t similar to the ultimate type used may want to define the TypeDecorator.coerce_compared_value() method. This is used to give the expression system a hint when coercing Python objects into bind parameters within expressions. Consider this expression:

    mytable.c.somecol + datetime.date(2009, 5, 15)

Above, if “somecol” is an Integer variant, it makes sense that we’re doing date arithmetic, where above is usually interpreted by databases as adding a number of days to the given date. The expression system does the right thing by not attempting to coerce the “date()” value into an integer-oriented bind parameter.

However, in the case of TypeDecorator, we are usually changing an incoming Python type to something new - TypeDecorator by default will “coerce” the non-typed side to be the same type as itself. Such as below, we define an “epoch” type that stores a date value as an integer:

    import datetime
    from datetime import timedelta

    class MyEpochType(types.TypeDecorator):
        impl = types.Integer

        epoch = datetime.date(1970, 1, 1)

        def process_bind_param(self, value, dialect):
            return (value - self.epoch).days

        def process_result_value(self, value, dialect):
            return self.epoch + timedelta(days=value)

Our expression of somecol + date with the above type will coerce the “date” on the right side to also be treated as MyEpochType.
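The date-to-integer round trip that MyEpochType performs can be verified with plain datetime arithmetic, independent of SQLAlchemy:

```python
import datetime
from datetime import timedelta

epoch = datetime.date(1970, 1, 1)
value = datetime.date(2009, 5, 15)

# process_bind_param direction: date -> integer days since the epoch
days = (value - epoch).days

# process_result_value direction: integer days -> date
restored = epoch + timedelta(days=days)
assert restored == value
```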

This behavior can be overridden via the coerce_compared_value() method, which returns a type that should be used for the value of the expression. Below we set it such that an integer value will be treated as an Integer, and any other value is assumed to be a date and will be treated as a MyEpochType:

    def coerce_compared_value(self, op, value):
        if isinstance(value, int):
            return Integer()
        else:
            return self
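The isinstance dispatch above can be exercised standalone; the string return values below are illustrative stand-ins for the Integer() and self returns of the real method:

```python
import datetime

def coerce_compared_value(value):
    # stand-in for the method above: integers keep integer typing,
    # everything else falls back to the epoch type ("self")
    if isinstance(value, int):
        return "Integer"
    return "self"

assert coerce_compared_value(35) == "Integer"
assert coerce_compared_value(datetime.date(2009, 5, 15)) == "self"
```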

Warning

Note that the behavior of coerce_compared_value is not inherited by default from that of the base type. If the TypeDecorator is augmenting a type that requires special logic for certain types of operators, this method must be overridden. A key example is when decorating the postgresql.JSON and postgresql.JSONB types; the default rules of TypeEngine.coerce_compared_value() should be used in order to deal with operators like index operations:

    class MyJsonType(TypeDecorator):
        impl = postgresql.JSON

        def coerce_compared_value(self, op, value):
            return self.impl.coerce_compared_value(op, value)

Without the above step, index operations such as mycol['foo'] will cause the index value 'foo' to be JSON encoded.

  • class Comparator(expr)
  • Bases: sqlalchemy.types.Comparator

A TypeEngine.Comparator that is specific to TypeDecorator.

User-defined TypeDecorator classes should not typically need to modify this.

  • operate(op, *other, **kwargs)
  • Operate on an argument.

This is the lowest level of operation, raises NotImplementedError by default.

Overriding this on a subclass can allow common behavior to be applied to all operations. For example, overriding ColumnOperators to apply func.lower() to the left and right side:

    class MyComparator(ColumnOperators):
        def operate(self, op, other):
            return op(func.lower(self), func.lower(other))

Parameters:

  • op – Operator callable.

  • *other – the ‘other’ side of the operation. Will be a single scalar for most operations.

  • **kwargs – modifiers. These may be passed by special operators such as ColumnOperators.contains().

  • reverse_operate(op, other, **kwargs)
  • Reverse operate on an argument.

Usage is the same as operate().

  • __init__(*args, **kwargs)
  • Construct a TypeDecorator.

Arguments sent here are passed to the constructor of the class assigned to the impl class level attribute, assuming the impl is a callable, and the resulting object is assigned to the self.impl instance attribute (thus overriding the class attribute of the same name).

If the class level impl is not a callable (the unusual case), it will be assigned to the same instance attribute ‘as-is’, ignoring those arguments passed to the constructor.

Subclasses can override this to customize the generation of self.impl entirely.

  • bind_expression(bindparam)
  • Given a bind value (i.e. a BindParameter instance), return a SQL expression in its place.

This is typically a SQL function that wraps the existing bound parameter within the statement. It is used for special data types that require literals being wrapped in some special database function in order to coerce an application-level value into a database-specific format. It is the SQL analogue of the TypeEngine.bind_processor() method.

The method is evaluated at statement compile time, as opposed to statement construction time.

Note that this method, when implemented, should always return the exact same structure, without any conditional logic, as it may be used in an executemany() call against an arbitrary number of bound parameter sets.

See also

Applying SQL-level Bind/Result Processing

  • bind_processor(dialect)
  • Provide a bound value processing function for the given Dialect.

This is the method that fulfills the TypeEngine contract for bound value conversion. TypeDecorator will wrap a user-defined implementation of process_bind_param() here.

User-defined code can override this method directly, though it’s likely best to use process_bind_param() so that the processing provided by self.impl is maintained.

Parameters:

  • dialect – Dialect instance in use.

This method is the reverse counterpart to the result_processor() method of this class.

  • coerce_compared_value(op, value)
  • Suggest a type for a ‘coerced’ Python value in an expression.

By default, returns self. This method is called by the expression system when an object using this type is on the left or right side of an expression against a plain Python object which does not yet have a SQLAlchemy type assigned:

    expr = table.c.somecolumn + 35

Where above, if somecolumn uses this type, this method will be called with the value operator.add and 35. The return value is whatever SQLAlchemy type should be used for 35 for this particular operation.

  • coerce_to_is_types = (NoneType,)
  • Specify those Python types which should be coerced at the expression level to “IS <constant>” when compared using == (and same for IS NOT in conjunction with !=).

For most SQLAlchemy types, this includes NoneType, as well as bool.

TypeDecorator modifies this list to only include NoneType, as TypeDecorator implementations that deal with boolean types are common.

Custom TypeDecorator classes can override this attribute to return an empty tuple, in which case no values will be coerced to constants.

  • column_expression(column)
  • Given a SELECT column expression, return a wrapping SQL expression.

This is typically a SQL function that wraps a column expression as rendered in the columns clause of a SELECT statement. It is used for special data types that require columns to be wrapped in some special database function in order to coerce the value before being sent back to the application. It is the SQL analogue of the TypeEngine.result_processor() method.

The method is evaluated at statement compile time, as opposed to statement construction time.

See also

Applying SQL-level Bind/Result Processing

  • property comparator_factory
  • Base class for custom comparison operations defined at the type level. See TypeEngine.comparator_factory.

  • compare_values(x, y)
  • Given two values, compare them for equality.

By default this calls upon TypeEngine.compare_values() of the underlying “impl”, which in turn usually uses the Python equals operator ==.

This function is used by the ORM to compare an original-loaded value with an intercepted “changed” value, to determine if a net change has occurred.

  • copy(**kw)
  • Produce a copy of this TypeDecorator instance.

This is a shallow copy and is provided to fulfill part of the TypeEngine contract. It usually does not need to be overridden unless the user-defined TypeDecorator has local state that should be deep-copied.

  • get_dbapi_type(dbapi)
  • Return the DBAPI type object represented by this TypeDecorator.

By default this calls upon TypeEngine.get_dbapi_type() of the underlying “impl”.

  • literal_processor(dialect)
  • Provide a literal processing function for the given Dialect.

Subclasses here will typically override TypeDecorator.process_literal_param() instead of this method directly.

By default, this method makes use of TypeDecorator.process_bind_param() if that method is implemented, where TypeDecorator.process_literal_param() is not. The rationale here is that TypeDecorator typically deals with Python conversions of data that are above the layer of database presentation. With the value converted by TypeDecorator.process_bind_param(), the underlying type will then handle whether it needs to be presented to the DBAPI as a bound parameter or to the database as an inline SQL value.

New in version 0.9.0.

  • load_dialect_impl(dialect)
  • Return a TypeEngine object corresponding to a dialect.

This is an end-user override hook that can be used to provide differing types depending on the given dialect. It is used by the TypeDecorator implementation of type_engine() to help determine what type should ultimately be returned for a given TypeDecorator.

By default returns self.impl.

  • process_bind_param(value, dialect)
  • Receive a bound parameter value to be converted.

Subclasses override this method to return the value that should be passed along to the underlying TypeEngine object, and from there to the DBAPI execute() method.

The operation could be anything desired to perform custom behavior, such as transforming or serializing data. This could also be used as a hook for validating logic.

This operation should be designed with the reverse operation in mind, which would be the process_result_value method of this class.

Parameters:

  • value – Data to operate upon, of any type expected by this method in the subclass. Can be None.

  • dialect – the Dialect in use.

  • process_literal_param(value, dialect)
  • Receive a literal parameter value to be rendered inline within a statement.

This method is used when the compiler renders a literal value without using binds, typically within DDL such as in the “server default” of a column or an expression within a CHECK constraint.

The returned string will be rendered into the output string.

New in version 0.9.0.

  • process_result_value(value, dialect)
  • Receive a result-row column value to be converted.

Subclasses should implement this method to operate on data fetched from the database.

Subclasses override this method to return the value that should be passed back to the application, given a value that is already processed by the underlying TypeEngine object, originally from the DBAPI cursor method fetchone() or similar.

The operation could be anything desired to perform custom behavior, such as transforming or serializing data. This could also be used as a hook for validating logic.

Parameters:

  • value – Data to operate upon, of any type expected by this method in the subclass. Can be None.

  • dialect – the Dialect in use.

This operation should be designed to be reversible by the “process_bind_param” method of this class.

  • result_processor(dialect, coltype)
  • Provide a result value processing function for the given Dialect.

This is the method that fulfills the TypeEngine contract for result value conversion. TypeDecorator will wrap a user-defined implementation of process_result_value() here.

User-defined code can override this method directly, though it’s likely best to use process_result_value() so that the processing provided by self.impl is maintained.

Parameters:

  • dialect – Dialect instance in use.

  • coltype – A SQLAlchemy data type

This method is the reverse counterpart to the bind_processor() method of this class.

  • type_engine(dialect)
  • Return a dialect-specific TypeEngine instance for this TypeDecorator.

In most cases this returns a dialect-adapted form of the TypeEngine type represented by self.impl. Makes usage of dialect_impl() but also traverses into wrapped TypeDecorator instances. Behavior can be customized here by overriding load_dialect_impl().

TypeDecorator Recipes

A few key TypeDecorator recipes follow.

Coercing Encoded Strings to Unicode

A common source of confusion regarding the Unicode type is that it is intended to deal only with Python unicode objects on the Python side, meaning values passed to it as bind parameters must be of the form u'some string' if using Python 2 and not 3. The encoding/decoding functions it performs are only to suit what the DBAPI in use requires, and are primarily a private implementation detail.

The use case of a type that can safely receive Python bytestrings, that is strings that contain non-ASCII characters and are not u'' objects in Python 2, can be achieved using a TypeDecorator which coerces as needed:

    from sqlalchemy.types import TypeDecorator, Unicode

    class CoerceUTF8(TypeDecorator):
        """Safely coerce Python bytestrings to Unicode
        before passing off to the database."""

        impl = Unicode

        def process_bind_param(self, value, dialect):
            if isinstance(value, str):
                value = value.decode('utf-8')
            return value
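The recipe above is written for Python 2, where str is a bytestring. Under Python 3 the equivalent coercion goes from bytes to str; a minimal standalone sketch (assuming UTF-8 payloads, helper name hypothetical) of what process_bind_param would do:

```python
def coerce_utf8(value):
    # Python 3 analogue of the recipe above: bytes in, text out
    if isinstance(value, bytes):
        value = value.decode("utf-8")
    return value

assert coerce_utf8(b"caf\xc3\xa9") == "caf\u00e9"
assert coerce_utf8("already text") == "already text"
```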

Rounding Numerics

Some database connectors like those of SQL Server choke if a Decimal is passed with too many decimal places. Here’s a recipe that rounds them down:

    from sqlalchemy.types import TypeDecorator, Numeric
    from decimal import Decimal

    class SafeNumeric(TypeDecorator):
        """Adds quantization to Numeric."""

        impl = Numeric

        def __init__(self, *arg, **kw):
            TypeDecorator.__init__(self, *arg, **kw)
            self.quantize_int = -self.impl.scale
            self.quantize = Decimal(10) ** self.quantize_int

        def process_bind_param(self, value, dialect):
            if isinstance(value, Decimal) and \
                    value.as_tuple()[2] < self.quantize_int:
                value = value.quantize(self.quantize)
            return value
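The quantization step can be exercised with the stdlib decimal module alone; here scale = 2 stands in for a column declared as, say, Numeric(10, 2):

```python
from decimal import Decimal

scale = 2                                    # stand-in for self.impl.scale
quantize_int = -scale
quantize_exp = Decimal(10) ** quantize_int   # Decimal('0.01')

value = Decimal("3.14159")
# the exponent (third element of as_tuple()) is -5, below -2, so round
if value.as_tuple()[2] < quantize_int:
    value = value.quantize(quantize_exp)

assert value == Decimal("3.14")
```

Note that Decimal.quantize() uses banker's rounding (ROUND_HALF_EVEN) by default; a rounding mode can be passed explicitly if the backend expects different behavior.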

Backend-agnostic GUID Type

Receives and returns Python uuid() objects. Uses the PG UUID type when using PostgreSQL, CHAR(32) on other backends, storing them in stringified hex format. Can be modified to store binary in CHAR(16) if desired:

    from sqlalchemy.types import TypeDecorator, CHAR
    from sqlalchemy.dialects.postgresql import UUID
    import uuid

    class GUID(TypeDecorator):
        """Platform-independent GUID type.

        Uses PostgreSQL's UUID type, otherwise uses
        CHAR(32), storing as stringified hex values.

        """
        impl = CHAR

        def load_dialect_impl(self, dialect):
            if dialect.name == 'postgresql':
                return dialect.type_descriptor(UUID())
            else:
                return dialect.type_descriptor(CHAR(32))

        def process_bind_param(self, value, dialect):
            if value is None:
                return value
            elif dialect.name == 'postgresql':
                return str(value)
            else:
                if not isinstance(value, uuid.UUID):
                    return "%.32x" % uuid.UUID(value).int
                else:
                    # hexstring
                    return "%.32x" % value.int

        def process_result_value(self, value, dialect):
            if value is None:
                return value
            else:
                if not isinstance(value, uuid.UUID):
                    value = uuid.UUID(value)
                return value
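The CHAR(32) path stores a bare 32-character hex string; the formatting and parsing used above round-trip cleanly with the stdlib uuid module:

```python
import uuid

u = uuid.UUID("12345678-1234-5678-1234-567812345678")

stored = "%.32x" % u.int        # what process_bind_param writes to CHAR(32)
assert len(stored) == 32 and "-" not in stored

# process_result_value direction: uuid.UUID accepts the bare hex form
assert uuid.UUID(stored) == u
```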

Marshal JSON Strings

This type uses simplejson to marshal Python data structures to/from JSON. Can be modified to use Python’s builtin json encoder:

    from sqlalchemy.types import TypeDecorator, VARCHAR
    import json

    class JSONEncodedDict(TypeDecorator):
        """Represents an immutable structure as a json-encoded string.

        Usage::

            JSONEncodedDict(255)

        """

        impl = VARCHAR

        def process_bind_param(self, value, dialect):
            if value is not None:
                value = json.dumps(value)

            return value

        def process_result_value(self, value, dialect):
            if value is not None:
                value = json.loads(value)
            return value
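The two hooks above are exact inverses for plain dict/list/str/number structures, which the stdlib json module guarantees:

```python
import json

value = {"key": "value", "count": 3}

encoded = json.dumps(value)     # process_bind_param: dict -> VARCHAR payload
decoded = json.loads(encoded)   # process_result_value: payload -> dict

assert decoded == value
assert isinstance(encoded, str)
```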

Adding Mutability

The ORM by default will not detect “mutability” on such a type as above - meaning, in-place changes to values will not be detected and will not be flushed. Without further steps, you instead would need to replace the existing value with a new one on each parent object to detect changes:

    obj.json_value["key"] = "value"  # will *not* be detected by the ORM

    obj.json_value = {"key": "value"}  # *will* be detected by the ORM

The above limitation may be fine, as many applications may not require that the values are ever mutated once created. For those which do have this requirement, support for mutability is best applied using the sqlalchemy.ext.mutable extension. For a dictionary-oriented JSON structure, we can apply this as:

    json_type = MutableDict.as_mutable(JSONEncodedDict)

    class MyClass(Base):
        # ...

        json_data = Column(json_type)

See also

Mutation Tracking

Dealing with Comparison Operations

The default behavior of TypeDecorator is to coerce the “right hand side” of any expression into the same type. For a type like JSON, this means that any operator used must make sense in terms of JSON. For some cases, users may wish for the type to behave like JSON in some circumstances, and as plain text in others. One example is if one wanted to handle the LIKE operator for the JSON type. LIKE makes no sense against a JSON structure, but it does make sense against the underlying textual representation. To get at this with a type like JSONEncodedDict, we need to coerce the column to a textual form using cast() or type_coerce() before attempting to use this operator:

    from sqlalchemy import select, type_coerce, String

    stmt = select([my_table]).where(
        type_coerce(my_table.c.json_data, String).like('%foo%'))

TypeDecorator provides a built-in system for working up type translations like these based on operators. If we wanted to frequently use the LIKE operator with our JSON object interpreted as a string, we can build it into the type by overriding the TypeDecorator.coerce_compared_value() method:

    import json

    from sqlalchemy.sql import operators
    from sqlalchemy import String
    from sqlalchemy.types import TypeDecorator, VARCHAR

    class JSONEncodedDict(TypeDecorator):

        impl = VARCHAR

        def coerce_compared_value(self, op, value):
            if op in (operators.like_op, operators.notlike_op):
                return String()
            else:
                return self

        def process_bind_param(self, value, dialect):
            if value is not None:
                value = json.dumps(value)

            return value

        def process_result_value(self, value, dialect):
            if value is not None:
                value = json.loads(value)
            return value

Above is just one approach to handling an operator like “LIKE”. Other applications may wish to raise NotImplementedError for operators that have no meaning with a JSON object such as “LIKE”, rather than automatically coercing to text.

Applying SQL-level Bind/Result Processing

As seen in the section Augmenting Existing Types, SQLAlchemy allows Python functions to be invoked both when parameters are sent to a statement, as well as when result rows are loaded from the database, to apply transformations to the values as they are sent to or from the database. It is also possible to define SQL-level transformations as well. The rationale here is when only the relational database contains a particular series of functions that are necessary to coerce incoming and outgoing data between an application and persistence format. Examples include using database-defined encryption/decryption functions, as well as stored procedures that handle geographic data. The PostGIS extension to PostgreSQL includes an extensive array of SQL functions that are necessary for coercing data into particular formats.

Any TypeEngine, UserDefinedType or TypeDecorator subclass can include implementations of TypeEngine.bind_expression() and/or TypeEngine.column_expression(), which when defined to return a non-None value should return a ColumnElement expression to be injected into the SQL statement, either surrounding bound parameters or a column expression. For example, to build a Geometry type which will apply the PostGIS function ST_GeomFromText to all outgoing values and the function ST_AsText to all incoming data, we can create our own subclass of UserDefinedType which provides these methods in conjunction with func:

    from sqlalchemy import func
    from sqlalchemy.types import UserDefinedType

    class Geometry(UserDefinedType):
        def get_col_spec(self):
            return "GEOMETRY"

        def bind_expression(self, bindvalue):
            return func.ST_GeomFromText(bindvalue, type_=self)

        def column_expression(self, col):
            return func.ST_AsText(col, type_=self)

We can apply the Geometry type into Table metadata and use it in a select() construct:

    geometry = Table('geometry', metadata,
        Column('geom_id', Integer, primary_key=True),
        Column('geom_data', Geometry)
    )

    print(select([geometry]).where(
        geometry.c.geom_data == 'LINESTRING(189412 252431,189631 259122)'))

The resulting SQL embeds both functions as appropriate. ST_AsText is applied to the columns clause so that the return value is run through the function before passing into a result set, and ST_GeomFromText is run on the bound parameter so that the passed-in value is converted:

    SELECT geometry.geom_id, ST_AsText(geometry.geom_data) AS geom_data_1
    FROM geometry
    WHERE geometry.geom_data = ST_GeomFromText(:geom_data_2)

The TypeEngine.column_expression() method interacts with the mechanics of the compiler such that the SQL expression does not interfere with the labeling of the wrapped expression. Such as, if we rendered a select() against a label() of our expression, the string label is moved to the outside of the wrapped expression:

    print(select([geometry.c.geom_data.label('my_data')]))

Output:

    SELECT ST_AsText(geometry.geom_data) AS my_data
    FROM geometry

Another example is we decorate postgresql.BYTEA to provide a PGPString, which will make use of the PostgreSQL pgcrypto extension to encrypt/decrypt values transparently:

    from sqlalchemy import create_engine, String, select, func, \
        MetaData, Table, Column, type_coerce, TypeDecorator

    from sqlalchemy.dialects.postgresql import BYTEA

    class PGPString(TypeDecorator):
        impl = BYTEA

        def __init__(self, passphrase):
            super(PGPString, self).__init__()
            self.passphrase = passphrase

        def bind_expression(self, bindvalue):
            # convert the bind's type from PGPString to
            # String, so that it's passed to psycopg2 as is without
            # a dbapi.Binary wrapper
            bindvalue = type_coerce(bindvalue, String)
            return func.pgp_sym_encrypt(bindvalue, self.passphrase)

        def column_expression(self, col):
            return func.pgp_sym_decrypt(col, self.passphrase)

    metadata = MetaData()
    message = Table('message', metadata,
        Column('username', String(50)),
        Column('message',
            PGPString("this is my passphrase")),
    )

    engine = create_engine("postgresql://scott:tiger@localhost/test", echo=True)
    with engine.begin() as conn:
        metadata.create_all(conn)

        conn.execute(message.insert(), username="some user",
            message="this is my message")

        print(conn.scalar(
            select([message.c.message]).\
            where(message.c.username == "some user")
        ))

The pgp_sym_encrypt and pgp_sym_decrypt functions are applied to the INSERT and SELECT statements:

    INSERT INTO message (username, message)
      VALUES (%(username)s, pgp_sym_encrypt(%(message)s, %(pgp_sym_encrypt_1)s))
      {'username': 'some user', 'message': 'this is my message',
        'pgp_sym_encrypt_1': 'this is my passphrase'}

    SELECT pgp_sym_decrypt(message.message, %(pgp_sym_decrypt_1)s) AS message_1
      FROM message
      WHERE message.username = %(username_1)s
      {'pgp_sym_decrypt_1': 'this is my passphrase', 'username_1': 'some user'}

See also

PostGIS Integration

Redefining and Creating New Operators

SQLAlchemy Core defines a fixed set of expression operators available to all column expressions. Some of these operations have the effect of overloading Python’s built in operators; examples of such operators include ColumnOperators.__eq__() (table.c.somecolumn == 'foo'), ColumnOperators.__invert__() (~table.c.flag), and ColumnOperators.__add__() (table.c.x + table.c.y). Other operators are exposed as explicit methods on column expressions, such as ColumnOperators.in_() (table.c.value.in_(['x', 'y'])) and ColumnOperators.like() (table.c.value.like('%ed%')).

The Core expression constructs in all cases consult the type of the expression in order to determine the behavior of existing operators, as well as to locate additional operators that aren’t part of the built in set. The TypeEngine base class defines a root “comparison” implementation TypeEngine.Comparator, and many specific types provide their own sub-implementations of this class. User-defined TypeEngine.Comparator implementations can be built directly into a simple subclass of a particular type in order to override or define new operations. Below, we create an Integer subclass which overrides the ColumnOperators.__add__() operator:

    from sqlalchemy import Integer

    class MyInt(Integer):
        class comparator_factory(Integer.Comparator):
            def __add__(self, other):
                return self.op("goofy")(other)

The above configuration creates a new class MyInt, which establishes the TypeEngine.comparator_factory attribute as referring to a new class, subclassing the TypeEngine.Comparator class associated with the Integer type.

Usage:

    >>> sometable = Table("sometable", metadata, Column("data", MyInt))
    >>> print(sometable.c.data + 5)
    sometable.data goofy :data_1

The implementation for ColumnOperators.__add__() is consulted by an owning SQL expression, by instantiating the TypeEngine.Comparator with itself as the expr attribute. The mechanics of the expression system are such that operations continue recursively until an expression object produces a new SQL expression construct. Above, we could just as well have said self.expr.op("goofy")(other) instead of self.op("goofy")(other).

When using Operators.op() for comparison operations that return a boolean result, the Operators.op.is_comparison flag should be set to True:

    class MyInt(Integer):
        class comparator_factory(Integer.Comparator):
            def is_frobnozzled(self, other):
                return self.op("--is_frobnozzled->", is_comparison=True)(other)

New methods added to a TypeEngine.Comparator are exposed on an owning SQL expression using a __getattr__ scheme, which exposes methods added to TypeEngine.Comparator onto the owning ColumnElement. For example, to add a log() function to integers:

    from sqlalchemy import Integer, func

    class MyInt(Integer):
        class comparator_factory(Integer.Comparator):
            def log(self, other):
                return func.log(self.expr, other)

Using the above type:

    >>> print(sometable.c.data.log(5))
    log(:log_1, :log_2)

Unary operations are also possible. For example, to add an implementation of the PostgreSQL factorial operator, we combine the UnaryExpression construct along with a custom_op to produce the factorial expression:

    from sqlalchemy import Integer
    from sqlalchemy.sql.expression import UnaryExpression
    from sqlalchemy.sql import operators

    class MyInteger(Integer):
        class comparator_factory(Integer.Comparator):
            def factorial(self):
                return UnaryExpression(self.expr,
                    modifier=operators.custom_op("!"),
                    type_=MyInteger)

Using the above type:

    >>> from sqlalchemy.sql import column
    >>> print(column('x', MyInteger).factorial())
    x !

See also

Operators.op()

TypeEngine.comparator_factory

Creating New Types

The UserDefinedType class is provided as a simple base class for defining entirely new database types. Use this to represent native database types not known by SQLAlchemy. If only Python translation behavior is needed, use TypeDecorator instead.

Base for user defined types.

This should be the base of new types. Note that for most cases, TypeDecorator is probably more appropriate:

    import sqlalchemy.types as types

    class MyType(types.UserDefinedType):
        def __init__(self, precision=8):
            self.precision = precision

        def get_col_spec(self, **kw):
            return "MYTYPE(%s)" % self.precision

        def bind_processor(self, dialect):
            def process(value):
                return value
            return process

        def result_processor(self, dialect, coltype):
            def process(value):
                return value
            return process

Once the type is made, it’s immediately usable:

table = Table('foo', meta,
    Column('id', Integer, primary_key=True),
    Column('data', MyType(16))
)

The get_col_spec() method will in most cases receive a keyword argument type_expression which refers to the owning expression of the type as being compiled, such as a Column or cast() construct. This keyword is only sent if the method accepts keyword arguments (e.g. **kw) in its argument signature; introspection is used to check for this in order to support legacy forms of this function.

New in version 1.0.0: the owning expression is passed to the get_col_spec() method via the keyword argument type_expression, if it receives **kw in its signature.
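As a minimal sketch of how that keyword argument can be used (assuming SQLAlchemy 1.0 or later; the MYTYPE strings and the per-column rule are invented for illustration), a get_col_spec() that accepts **kw can vary its DDL based on the owning Column:

```python
import sqlalchemy.types as types
from sqlalchemy import Column, MetaData, Table
from sqlalchemy.schema import CreateTable

class MyType(types.UserDefinedType):
    cache_ok = True  # silences a caching warning on SQLAlchemy 1.4+

    def get_col_spec(self, **kw):
        # type_expression is only passed because **kw is accepted;
        # here it refers to the Column being compiled
        owner = kw.get("type_expression")
        if owner is not None and getattr(owner, "name", None) == "data":
            return "MYTYPE_WIDE"  # hypothetical variant for this column
        return "MYTYPE"

t = Table("t", MetaData(), Column("id", MyType()), Column("data", MyType()))
print(CreateTable(t))
```

The emitted CREATE TABLE renders "MYTYPE" for the id column and "MYTYPE_WIDE" for the data column, showing that the compiler handed the owning Column to get_col_spec().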

  • class Comparator(expr)
  • Bases: sqlalchemy.types.Comparator

  • coerce_compared_value(op, value)

  • Suggest a type for a ‘coerced’ Python value in an expression.

Default behavior for UserDefinedType is the same as that of TypeDecorator; by default it returns self, assuming the compared value should be coerced into the same type as this one. See TypeDecorator.coerce_compared_value() for more detail.
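As an illustrative sketch of that hook using TypeDecorator (the FlexibleInt name and the string-comparison rule are made up), overriding coerce_compared_value() changes the type applied to the bound right-hand side of a comparison:

```python
from sqlalchemy import Integer, String
from sqlalchemy.sql import column
from sqlalchemy.types import TypeDecorator

class FlexibleInt(TypeDecorator):
    """Hypothetical decorator around Integer: when compared against a
    Python string, the bound parameter uses String instead."""

    impl = Integer
    cache_ok = True

    def coerce_compared_value(self, op, value):
        if isinstance(value, str):
            return String()
        # default behavior: the compared value takes on this same type
        return self

str_expr = column("x", FlexibleInt()) == "1999"
int_expr = column("x", FlexibleInt()) == 5
print(type(str_expr.right.type).__name__)  # bound param typed as String
print(type(int_expr.right.type).__name__)  # bound param keeps FlexibleInt
```

Inspecting the right side of each expression shows the string comparison bound with String, while the integer comparison retains the custom type.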

Working with Custom Types and Reflection

It is important to note that database types which are modified to have additional in-Python behaviors, including types based on TypeDecorator as well as other user-defined subclasses of datatypes, do not have any representation within a database schema. When using the database introspection features described at Reflecting Database Objects, SQLAlchemy makes use of a fixed mapping which links the datatype information reported by a database server to a SQLAlchemy datatype object. For example, if we look inside of a PostgreSQL schema at the definition for a particular database column, we might receive back the string "VARCHAR". SQLAlchemy’s PostgreSQL dialect has a hardcoded mapping which links the string name "VARCHAR" to the SQLAlchemy VARCHAR class, and that’s how when we emit a statement like Table('my_table', m, autoload_with=engine), the Column object within it would have an instance of VARCHAR present inside of it.

The implication of this is that if a Table object makes use of type objects that don’t correspond directly to the database-native type name, then when we create a new Table object against a new MetaData collection for this database table elsewhere using reflection, it will not have this datatype. For example:

>>> from sqlalchemy import Table, Column, MetaData, create_engine, PickleType, Integer
>>> metadata = MetaData()
>>> my_table = Table("my_table", metadata, Column('id', Integer), Column("data", PickleType))
>>> engine = create_engine("sqlite://", echo='debug')
>>> my_table.create(engine)
INFO sqlalchemy.engine.base.Engine
CREATE TABLE my_table (
    id INTEGER,
    data BLOB
)

Above, we made use of PickleType, which is a TypeDecorator that works on top of the LargeBinary datatype, which on SQLite corresponds to the database type BLOB. In the CREATE TABLE, we see that the BLOB datatype is used. The SQLite database knows nothing about the PickleType we’ve used.

If we look at the datatype of my_table.c.data.type, as this is a Python object that was created by us directly, it is PickleType:

>>> my_table.c.data.type
PickleType()

However, if we create another instance of Table using reflection, the use of PickleType is not represented in the SQLite database we’ve created; we instead get back BLOB:

>>> metadata_two = MetaData()
>>> my_reflected_table = Table("my_table", metadata_two, autoload_with=engine)
INFO sqlalchemy.engine.base.Engine PRAGMA main.table_info("my_table")
INFO sqlalchemy.engine.base.Engine ()
DEBUG sqlalchemy.engine.base.Engine Col ('cid', 'name', 'type', 'notnull', 'dflt_value', 'pk')
DEBUG sqlalchemy.engine.base.Engine Row (0, 'id', 'INTEGER', 0, None, 0)
DEBUG sqlalchemy.engine.base.Engine Row (1, 'data', 'BLOB', 0, None, 0)

>>> my_reflected_table.c.data.type
BLOB()

Typically, when an application defines explicit Table metadata with custom types, there is no need to use table reflection, because the necessary Table metadata is already present. However, an application, or a combination of applications, may need to use both explicit Table metadata which includes custom, Python-level datatypes, and Table objects whose Column objects are set up as reflected from the database yet still need to exhibit the additional Python behaviors of the custom datatypes. In that case, additional steps must be taken.

The most straightforward approach is to override specific columns as described at Overriding Reflected Columns. In this technique, we simply use reflection in combination with explicit Column objects for those columns for which we want to use a custom or decorated datatype:

>>> metadata_three = MetaData()
>>> my_reflected_table = Table("my_table", metadata_three, Column("data", PickleType), autoload_with=engine)

The my_reflected_table object above is reflected, and will load the definition of the “id” column from the SQLite database. But for the “data” column, we’ve overridden the reflected object with an explicit Column definition that includes our desired in-Python datatype, the PickleType. The reflection process will leave this Column object intact:

>>> my_reflected_table.c.data.type
PickleType()

A more elaborate way to convert from database-native type objects to custom datatypes is to use the DDLEvents.column_reflect() event handler. If, for example, we knew that we wanted all BLOB datatypes to in fact be PickleType, we could set up a rule across the board:

from sqlalchemy import BLOB
from sqlalchemy import event
from sqlalchemy import PickleType
from sqlalchemy import Table

@event.listens_for(Table, "column_reflect")
def _setup_pickletype(inspector, table, column_info):
    if isinstance(column_info["type"], BLOB):
        column_info["type"] = PickleType()

When the above code is invoked before any table reflection occurs (note also that it should be invoked only once in the application, as it is a global rule), then upon reflecting any Table that includes a column with a BLOB datatype, the resulting datatype will be stored in the Column object as PickleType.

In practice, the above event-based approach would likely have additional rules in order to affect only those columns where the datatype is important, such as a lookup table of table names and possibly column names, or other heuristics, in order to accurately determine which columns should be established with an in-Python datatype.
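One possible shape for such a rule, sketched here with a hypothetical lookup table of (table name, column name) pairs (the PICKLED_COLUMNS name and its contents are invented for illustration), demonstrated end-to-end against an in-memory SQLite database:

```python
from sqlalchemy import (
    BLOB,
    Column,
    Integer,
    MetaData,
    PickleType,
    Table,
    create_engine,
    event,
)

# hypothetical allowlist of columns that should become PickleType
PICKLED_COLUMNS = {("my_table", "data")}

@event.listens_for(Table, "column_reflect")
def _setup_pickletype(inspector, table, column_info):
    key = (table.name, column_info["name"])
    if key in PICKLED_COLUMNS and isinstance(column_info["type"], BLOB):
        column_info["type"] = PickleType()

# create a table whose "data" column is stored as BLOB on SQLite
engine = create_engine("sqlite://")
metadata = MetaData()
Table("my_table", metadata, Column("id", Integer), Column("data", PickleType))
metadata.create_all(engine)

reflected = Table("my_table", MetaData(), autoload_with=engine)
print(reflected.c.data.type)  # converted to PickleType by the rule
print(reflected.c.id.type)    # left as the reflected integer type
```

Only the column named in the lookup is converted; all other reflected columns keep the datatype reported by the fixed dialect mapping.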