Engine Configuration

The Engine is the starting point for any SQLAlchemy application. It’s “home base” for the actual database and its DBAPI, delivered to the SQLAlchemy application through a connection pool and a Dialect, which describes how to talk to a specific kind of database/DBAPI combination.

The general structure can be illustrated as follows (diagram: sqla_engine_arch.png): an Engine references both a Dialect and a Pool, which together interpret the DBAPI’s module functions as well as the behavior of the database.

Creating an engine is just a matter of issuing a single call, create_engine():

  from sqlalchemy import create_engine
  engine = create_engine('postgresql://scott:tiger@localhost:5432/mydatabase')

The above engine creates a Dialect object tailored towards PostgreSQL, as well as a Pool object which will establish a DBAPI connection at localhost:5432 when a connection request is first received. Note that the Engine and its underlying Pool do not establish the first actual DBAPI connection until the Engine.connect() method is called, or an operation which is dependent on this method such as Engine.execute() is invoked. In this way, Engine and Pool can be said to have a lazy initialization behavior.

The Engine, once created, can either be used directly to interact with the database, or can be passed to a Session object to work with the ORM. This section covers the details of configuring an Engine. The next section, Working with Engines and Connections, will detail the usage API of the Engine and similar, typically for non-ORM applications.
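As a quick illustration of the lazy-initialization behavior described above, here is a minimal sketch using an in-memory SQLite URL (chosen only so the snippet is self-contained); no DBAPI connection exists until connect() is called:

```python
from sqlalchemy import create_engine
from sqlalchemy.sql import text

# create_engine() builds the Engine, Dialect and Pool, but opens
# no DBAPI connection yet ("lazy initialization")
engine = create_engine('sqlite://')

# the first actual DBAPI connection is established here
with engine.connect() as conn:
    result = conn.execute(text("SELECT 1 + 1"))
    print(result.scalar())  # 2
```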

Supported Databases

SQLAlchemy includes many Dialect implementations for various backends. Dialects for the most common databases are included with SQLAlchemy; a handful of others require an additional install of a separate dialect.

See the section Dialects for information on the various backends available.

Database Urls

The create_engine() function produces an Engine object based on a URL. These URLs follow RFC-1738, and usually can include username, password, hostname, and database name, as well as optional keyword arguments for additional configuration. In some cases a file path is accepted, and in others a “data source name” replaces the “host” and “database” portions. The typical form of a database URL is:

  dialect+driver://username:password@host:port/database

Dialect names include the identifying name of the SQLAlchemy dialect, a name such as sqlite, mysql, postgresql, oracle, or mssql. The drivername is the name of the DBAPI to be used to connect to the database, using all lowercase letters. If not specified, a “default” DBAPI will be imported if available - this default is typically the most widely known driver available for that backend.

As the URL is like any other URL, special characters such as those that may be used in the password need to be URL encoded. Below is an example of a URL that includes the password "kx%jj5/g":

  postgresql+pg8000://dbuser:kx%25jj5%2Fg@pghost10/appdb

The encoding for the above password can be generated using urllib:

  >>> import urllib.parse
  >>> urllib.parse.quote_plus("kx%jj5/g")
  'kx%25jj5%2Fg'

Examples for common connection styles follow below. For a full index of detailed information on all included dialects as well as links to third-party dialects, see Dialects.

PostgreSQL

The PostgreSQL dialect uses psycopg2 as the default DBAPI. pg8000 is also available as a pure-Python substitute:

  # default
  engine = create_engine('postgresql://scott:tiger@localhost/mydatabase')

  # psycopg2
  engine = create_engine('postgresql+psycopg2://scott:tiger@localhost/mydatabase')

  # pg8000
  engine = create_engine('postgresql+pg8000://scott:tiger@localhost/mydatabase')

More notes on connecting to PostgreSQL at PostgreSQL.

MySQL

The MySQL dialect uses mysql-python as the default DBAPI. There are many MySQL DBAPIs available, including MySQL-connector-python and OurSQL:

  # default
  engine = create_engine('mysql://scott:tiger@localhost/foo')

  # mysqlclient (a maintained fork of MySQL-Python)
  engine = create_engine('mysql+mysqldb://scott:tiger@localhost/foo')

  # PyMySQL
  engine = create_engine('mysql+pymysql://scott:tiger@localhost/foo')

More notes on connecting to MySQL at MySQL.

Oracle

The Oracle dialect uses cx_oracle as the default DBAPI:

  engine = create_engine('oracle://scott:tiger@127.0.0.1:1521/sidname')

  engine = create_engine('oracle+cx_oracle://scott:tiger@tnsname')

More notes on connecting to Oracle at Oracle.

Microsoft SQL Server

The SQL Server dialect uses pyodbc as the default DBAPI. pymssql is also available:

  # pyodbc
  engine = create_engine('mssql+pyodbc://scott:tiger@mydsn')

  # pymssql
  engine = create_engine('mssql+pymssql://scott:tiger@hostname:port/dbname')

More notes on connecting to SQL Server at Microsoft SQL Server.

SQLite

SQLite connects to file-based databases, using the Python built-in module sqlite3 by default.

As SQLite connects to local files, the URL format is slightly different. The “file” portion of the URL is the filename of the database. For a relative file path, this requires three slashes:

  # sqlite://<nohostname>/<path>
  # where <path> is relative:
  engine = create_engine('sqlite:///foo.db')

And for an absolute file path, the three slashes are followed by the absolute path:

  # Unix/Mac - 4 initial slashes in total
  engine = create_engine('sqlite:////absolute/path/to/foo.db')

  # Windows
  engine = create_engine('sqlite:///C:\\path\\to\\foo.db')

  # Windows alternative using raw string
  engine = create_engine(r'sqlite:///C:\path\to\foo.db')
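To double-check that the slash count is right, a SQLite URL can be parsed back with make_url() and its database attribute inspected; a small sketch:

```python
from sqlalchemy.engine.url import make_url

# relative path: three slashes
print(make_url('sqlite:///foo.db').database)

# absolute path: four slashes in total
print(make_url('sqlite:////absolute/path/to/foo.db').database)
```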

To use a SQLite :memory: database, specify an empty URL:

  engine = create_engine('sqlite://')

More notes on connecting to SQLite at SQLite.

Others

See Dialects, the top-level page for all additional dialect documentation.

Engine Creation API

  • sqlalchemy.create_engine(*args, **kwargs)
  • Create a new Engine instance.

The standard calling form is to send the URL as the first positional argument, usually a string that indicates database dialect and connection arguments:

  engine = create_engine("postgresql://scott:tiger@localhost/test")

Additional keyword arguments may then follow it which establish various options on the resulting Engine and its underlying Dialect and Pool constructs:

  engine = create_engine("mysql://scott:tiger@hostname/dbname",
                         encoding='latin1', echo=True)

The string form of the URL is dialect[+driver]://user:password@host/dbname[?key=value..], where dialect is a database name such as mysql, oracle, postgresql, etc., and driver the name of a DBAPI, such as psycopg2, pyodbc, cx_oracle, etc. Alternatively, the URL can be an instance of URL.

**kwargs takes a wide variety of options which are routed towards their appropriate components. Arguments may be specific to the Engine, the underlying Dialect, as well as the Pool. Specific dialects also accept keyword arguments that are unique to that dialect. Here, we describe the parameters that are common to most create_engine() usage.

Once established, the newly resulting Engine will request a connection from the underlying Pool once Engine.connect() is called, or a method which depends on it such as Engine.execute() is invoked. The Pool in turn will establish the first actual DBAPI connection when this request is received. The create_engine() call itself does not establish any actual DBAPI connections directly.

See also

Engine Configuration

Dialects

Working with Engines and Connections

  • Parameters
    • case_sensitive=True – if False, result column names will match in a case-insensitive fashion, that is, row['SomeColumn'].

    • connect_args – a dictionary of options which will be passed directly to the DBAPI’s connect() method as additional keyword arguments. See the example at Custom DBAPI connect() arguments.

    • convert_unicode=False – if set to True, causes all String datatypes to act as though the String.convert_unicode flag has been set to True, regardless of a setting of False on an individual String type. This has the effect of causing all String-based columns to accommodate Python Unicode objects directly as though the datatype were the Unicode type.

Deprecated since version 1.3: The create_engine.convert_unicode parameter is deprecated and will be removed in a future release. All modern DBAPIs now support Python Unicode directly and this parameter is unnecessary.

    • creator – a callable which returns a DBAPI connection. This creation function will be passed to the underlying connection pool and will be used to create all new database connections. Usage of this function causes connection parameters specified in the URL argument to be bypassed.

    • echo=False – if True, the Engine will log all statements as well as a repr() of their parameter lists to the default log handler, which defaults to sys.stdout for output. If set to the string "debug", result rows will be printed to the standard output as well. The echo attribute of Engine can be modified at any time to turn logging on and off; direct control of logging is also available using the standard Python logging module.

See also

Configuring Logging - further detail on how to configure logging.

    • echo_pool=False – if True, the connection pool will log informational output such as when connections are invalidated as well as when connections are recycled to the default log handler, which defaults to sys.stdout for output. If set to the string "debug", the logging will include pool checkouts and checkins. Direct control of logging is also available using the standard Python logging module.

See also

Configuring Logging - further detail on how to configure logging.

    • empty_in_strategy – the SQL compilation strategy to use when rendering an IN or NOT IN expression for ColumnOperators.in_() where the right-hand side is an empty set. This is a string value that may be one of static, dynamic, or dynamic_warn. The static strategy is the default, and an IN comparison to an empty set will generate a simple false expression “1 != 1”. The dynamic strategy behaves like that of SQLAlchemy 1.1 and earlier, emitting a false expression of the form “expr != expr”, which has the effect of evaluating to NULL in the case of a null expression. dynamic_warn is the same as dynamic, however it also emits a warning when an empty set is encountered; this is because the “dynamic” comparison is typically poorly performing on most databases.

New in version 1.2: Added the empty_in_strategy setting and additionally defaulted the behavior for empty-set IN comparisons to a static boolean expression.

    • encoding – defaults to utf-8. This is the string encoding used by SQLAlchemy for string encode/decode operations which occur within SQLAlchemy, outside of the DBAPI. Most modern DBAPIs feature some degree of direct support for Python unicode objects, what you see in Python 2 as a string of the form u'some string'. For those scenarios where the DBAPI is detected as not supporting a Python unicode object, this encoding is used to determine the source/destination encoding. It is not used for those cases where the DBAPI handles unicode directly.

To properly configure a system to accommodate Python unicode objects, the DBAPI should be configured to handle unicode to the greatest degree as is appropriate - see the notes on unicode pertaining to the specific target database in use at Dialects.

Areas where string encoding may need to be accommodated outside of the DBAPI include zero or more of:

      • the values passed to bound parameters, corresponding to the Unicode type or the String type when convert_unicode is True;

      • the values returned in result set columns corresponding to the Unicode type or the String type when convert_unicode is True;

      • the string SQL statement passed to the DBAPI’s cursor.execute() method;

      • the string names of the keys in the bound parameter dictionary passed to the DBAPI’s cursor.execute() as well as cursor.setinputsizes() methods;

      • the string column names retrieved from the DBAPI’s cursor.description attribute.

When using Python 3, the DBAPI is required to support all of the above values as Python unicode objects, which in Python 3 are just known as str. In Python 2, the DBAPI does not specify unicode behavior at all, so SQLAlchemy must make decisions for each of the above values on a per-DBAPI basis - implementations are completely inconsistent in their behavior.

    • execution_options – dictionary of execution options which will be applied to all connections. See execution_options().

    • hide_parameters – boolean, when set to True, SQL statement parameters will not be displayed in INFO logging nor will they be formatted into the string representation of StatementError objects.

New in version 1.3.8.

    • implicit_returning=True – when True, a RETURNING-compatible construct, if available, will be used to fetch newly generated primary key values when a single row INSERT statement is emitted with no existing returning() clause. This applies to those backends which support RETURNING or a compatible construct, including PostgreSQL, Firebird, Oracle, Microsoft SQL Server. Set this to False to disable the automatic usage of RETURNING.

    • isolation_level – this string parameter is interpreted by various dialects in order to affect the transaction isolation level of the database connection. The parameter essentially accepts some subset of these string arguments: "SERIALIZABLE", "REPEATABLE_READ", "READ_COMMITTED", "READ_UNCOMMITTED" and "AUTOCOMMIT". Behavior here varies per backend, and individual dialects should be consulted directly.

Note that the isolation level can also be set on a per-Connection basis as well, using the Connection.execution_options.isolation_level feature.

See also

Connection.default_isolation_level - view default level

Connection.execution_options.isolation_level - set per Connection isolation level

SQLite Transaction Isolation

PostgreSQL Transaction Isolation

MySQL Transaction Isolation

Setting Transaction Isolation Levels - for the ORM

    • json_deserializer – for dialects that support the JSON datatype, this is a Python callable that will convert a JSON string to a Python object. By default, the Python json.loads function is used.

Changed in version 1.3.7: The SQLite dialect renamed this from _json_deserializer.

    • json_serializer – for dialects that support the JSON datatype, this is a Python callable that will render a given object as JSON. By default, the Python json.dumps function is used.

Changed in version 1.3.7: The SQLite dialect renamed this from _json_serializer.

    • label_length=None – optional integer value which limits the size of dynamically generated column labels to that many characters. If less than 6, labels are generated as “_(counter)”. If None, the value of dialect.max_identifier_length, which may be affected via the create_engine.max_identifier_length parameter, is used instead. The value of create_engine.label_length may not be larger than that of create_engine.max_identifier_length.

See also

create_engine.max_identifier_length

    • listeners – a list of one or more PoolListener objects which will receive connection pool events.

    • logging_name – string identifier which will be used within the “name” field of logging records generated within the “sqlalchemy.engine” logger. Defaults to a hexstring of the object’s id.

    • max_identifier_length – integer; override the max_identifier_length determined by the dialect. If None or zero, has no effect. This is the database’s configured maximum number of characters that may be used in a SQL identifier such as a table name, column name, or label name. All dialects determine this value automatically, however in the case of a new database version for which this value has changed but SQLAlchemy’s dialect has not been adjusted, the value may be passed here.

New in version 1.3.9.

See also

create_engine.label_length

    • max_overflow=10 – the number of connections to allow in connection pool “overflow”, that is, connections that can be opened above and beyond the pool_size setting, which defaults to five. This is only used with QueuePool.

    • module=None – reference to a Python module object (the module itself, not its string name). Specifies an alternate DBAPI module to be used by the engine’s dialect. Each sub-dialect references a specific DBAPI which will be imported before first connect. This parameter causes the import to be bypassed, and the given module to be used instead. Can be used for testing of DBAPIs as well as to inject “mock” DBAPI implementations into the Engine.

    • paramstyle=None – the paramstyle to use when rendering bound parameters. This style defaults to the one recommended by the DBAPI itself, which is retrieved from the .paramstyle attribute of the DBAPI. However, most DBAPIs accept more than one paramstyle, and in particular it may be desirable to change a “named” paramstyle into a “positional” one, or vice versa. When this attribute is passed, it should be one of the values "qmark", "numeric", "named", "format" or "pyformat", and should correspond to a parameter style known to be supported by the DBAPI in use.

    • pool=None – an already-constructed instance of Pool, such as a QueuePool instance. If non-None, this pool will be used directly as the underlying connection pool for the engine, bypassing whatever connection parameters are present in the URL argument. For information on constructing connection pools manually, see Connection Pooling.

    • poolclass=None – a Pool subclass, which will be used to create a connection pool instance using the connection parameters given in the URL. Note this differs from pool in that you don’t actually instantiate the pool in this case, you just indicate what type of pool to be used.

    • pool_logging_name – string identifier which will be used within the “name” field of logging records generated within the “sqlalchemy.pool” logger. Defaults to a hexstring of the object’s id.

    • pool_pre_ping – boolean, if True will enable the connection pool “pre-ping” feature that tests connections for liveness upon each checkout.

New in version 1.2.

See also

Disconnect Handling - Pessimistic

    • pool_size=5 – the number of connections to keep open inside the connection pool. This is used with QueuePool as well as SingletonThreadPool. With QueuePool, a pool_size setting of 0 indicates no limit; to disable pooling, set poolclass to NullPool instead.

    • pool_recycle=-1 – this setting causes the pool to recycle connections after the given number of seconds has passed. It defaults to -1, or no timeout. For example, setting to 3600 means connections will be recycled after one hour. Note that MySQL in particular will disconnect automatically if no activity is detected on a connection for eight hours (although this is configurable with the MySQLDB connection itself and the server configuration as well).

See also

Setting Pool Recycle

    • pool_reset_on_return='rollback' – set the Pool.reset_on_return parameter of the underlying Pool object, which can be set to the values "rollback", "commit", or None.

See also

Pool.reset_on_return

    • pool_timeout=30 – number of seconds to wait before giving up on getting a connection from the pool. This is only used with QueuePool.

    • pool_use_lifo=False – use LIFO (last-in-first-out) when retrieving connections from QueuePool instead of FIFO (first-in-first-out). Using LIFO, a server-side timeout scheme can reduce the number of connections used during non-peak periods of use. When planning for server-side timeouts, ensure that a recycle or pre-ping strategy is in use to gracefully handle stale connections.

    • plugins – string list of plugin names to load. See CreateEnginePlugin for background.

New in version 1.2.3.

    • strategy='plain' – selects alternate engine implementations. Currently available are:

      • the threadlocal strategy, which is described in Using the Threadlocal Execution Strategy;

      • the mock strategy, which dispatches all statement execution to a function passed as the argument executor. See example in the FAQ.

    • executor=None – a function taking arguments (sql, *multiparams, **params), to which the mock strategy will dispatch all statement execution. Used only by strategy='mock'.
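To tie a few of the parameters above together, here is a small sketch that combines echo, pool sizing and an explicit poolclass on an in-memory SQLite engine. Note the assumptions: SQLite would not use QueuePool by default, and the values shown are arbitrary; QueuePool is forced here purely so the pool parameters have an effect:

```python
from sqlalchemy import create_engine
from sqlalchemy.pool import QueuePool

engine = create_engine(
    'sqlite://',
    echo=False,            # set to True to log all SQL to the default handler
    poolclass=QueuePool,   # SQLite normally uses a different pool; forced for the demo
    pool_size=2,           # connections kept open inside the pool
    max_overflow=5,        # additional connections allowed beyond pool_size
    pool_timeout=10,       # seconds to wait for a pooled connection
)

print(engine.pool.size())  # 2
```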

  • sqlalchemy.engine_from_config(configuration, prefix='sqlalchemy.', **kwargs)
  • Create a new Engine instance using a configuration dictionary.

The dictionary is typically produced from a config file.

The keys of interest to engine_from_config() should be prefixed, e.g. sqlalchemy.url, sqlalchemy.echo, etc. The ‘prefix’ argument indicates the prefix to be searched for. Each matching key (after the prefix is stripped) is treated as though it were the corresponding keyword argument to a create_engine() call.

The only required key is (assuming the default prefix) sqlalchemy.url, which provides the database URL.

A select set of keyword arguments will be “coerced” to their expected type based on string values. The set of arguments is extensible per-dialect using the engine_config_types accessor.

  • Parameters
    • configuration – A dictionary (typically produced from a config file, but this is not a requirement). Items whose keys start with the value of ‘prefix’ will have that prefix stripped, and will then be passed to create_engine.

    • prefix – Prefix to match and then strip from keys in ‘configuration’.

    • kwargs – Each keyword argument to engine_from_config() itself overrides the corresponding item taken from the ‘configuration’ dictionary. Keyword arguments should not be prefixed.
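A small sketch of engine_from_config() in use, with a plain dictionary standing in for a parsed config file (the sqlite URL is just a placeholder):

```python
from sqlalchemy import engine_from_config

# typically the result of reading an .ini or similar config file
configuration = {
    'sqlalchemy.url': 'sqlite://',
    'sqlalchemy.echo': 'false',          # string values are coerced to their expected types
    'sqlalchemy.pool_recycle': '3600',   # coerced to an integer
}

engine = engine_from_config(configuration)
print(engine.url.drivername)  # sqlite
```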

  • sqlalchemy.engine.url.make_url(name_or_url)
  • Given a string or unicode instance, produce a new URL instance.

The given string is parsed according to the RFC 1738 spec. If an existing URL object is passed, just returns the object.
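For instance, parsing a URL string yields an object whose components are available as attributes; a minimal sketch:

```python
from sqlalchemy.engine.url import make_url

url = make_url("postgresql://scott:tiger@localhost:5432/mydatabase")
print(url.drivername)  # postgresql
print(url.host)        # localhost
print(url.port)        # 5432
print(url.database)    # mydatabase
```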

  • class sqlalchemy.engine.url.URL(drivername, username=None, password=None, host=None, port=None, database=None, query=None)
  • Represent the components of a URL used to connect to a database.

This object is suitable to be passed directly to a create_engine() call. The fields of the URL are parsed from a string by the make_url() function. The string format of the URL is an RFC-1738-style string.

All initialization parameters are available as public attributes.

  • Parameters
    • drivername – the name of the database backend. This name will correspond to a module in sqlalchemy/databases or a third party plug-in.

    • username – The user name.

    • password – database password.

    • host – The name of the host.

    • port – The port number.

    • database – The database name.

    • query – A dictionary of options to be passed to the dialect and/or the DBAPI upon connect.

  • get_dialect()

  • Return the SQLAlchemy database dialect class corresponding to this URL’s driver name.

  • translate_connect_args(names=[], **kw)

  • Translate url attributes into a dictionary of connection arguments.

Returns attributes of this url (host, database, username, password, port) as a plain dictionary. The attribute names are used as the keys by default. Unset or false attributes are omitted from the final dictionary.

  • Parameters
    • **kw – Optional, alternate key names for url attributes.

    • names – Deprecated. Same purpose as the keyword-based alternate names, but correlates the name to the original positionally.
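A brief sketch of translate_connect_args() on a parsed URL; by default each set attribute appears under its own name, and alternate key names may be supplied by keyword:

```python
from sqlalchemy.engine.url import make_url

url = make_url("postgresql://scott:tiger@localhost:5432/test")

# a dict with host, database, username, password and port keys
print(url.translate_connect_args())

# map the username attribute to a 'user' key instead
print(url.translate_connect_args(username='user'))
```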

Pooling

The Engine will ask the connection pool for a connection when the connect() or execute() methods are called. The default connection pool, QueuePool, will open connections to the database on an as-needed basis. As concurrent statements are executed, QueuePool will grow its pool of connections to a default size of five, and will allow a default “overflow” of ten. Since the Engine is essentially “home base” for the connection pool, it follows that you should keep a single Engine per database established within an application, rather than creating a new one for each connection.

Note

QueuePool is not used by default for SQLite engines. See SQLite for details on SQLite connection pool usage.

For more information on connection pooling, see Connection Pooling.

Custom DBAPI connect() arguments

Custom arguments used when issuing the connect() call to the underlying DBAPI may be issued in three distinct ways. String-based arguments can be passed directly from the URL string as query arguments:

  db = create_engine('postgresql://scott:tiger@localhost/test?argument1=foo&argument2=bar')

If SQLAlchemy’s database connector is aware of a particular query argument, itmay convert its type from string to its proper type.

create_engine() also takes an argument connect_args which is an additional dictionary that will be passed to connect(). This can be used when arguments of a type other than string are required, and SQLAlchemy’s database connector has no type conversion logic present for that parameter:

  db = create_engine('postgresql://scott:tiger@localhost/test', connect_args={'argument1': 17, 'argument2': 'bar'})

The most customizable connection method of all is to pass a creator argument, which specifies a callable that returns a DBAPI connection:

  def connect():
      return psycopg2.connect(user='scott', host='localhost')

  db = create_engine('postgresql://', creator=connect)
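The same pattern works with any DBAPI; as a self-contained sketch, here it is using the standard library's sqlite3 module in place of psycopg2:

```python
import sqlite3

from sqlalchemy import create_engine
from sqlalchemy.sql import text

def connect():
    # full control over how the DBAPI connection is made;
    # connection parameters in the URL are bypassed
    return sqlite3.connect(':memory:')

db = create_engine('sqlite://', creator=connect)

with db.connect() as conn:
    print(conn.execute(text("SELECT 42")).scalar())  # 42
```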

Configuring Logging

Python’s standard logging module is used to implement informational and debug log output with SQLAlchemy. This allows SQLAlchemy’s logging to integrate in a standard way with other applications and libraries. There are also two parameters, create_engine.echo and create_engine.echo_pool, present on create_engine() which allow immediate logging to sys.stdout for the purposes of local development; these parameters ultimately interact with the regular Python loggers described below.

This section assumes familiarity with the above linked logging module. All logging performed by SQLAlchemy exists underneath the sqlalchemy namespace, as used by logging.getLogger('sqlalchemy'). When logging has been configured (i.e. such as via logging.basicConfig()), the general namespace of SA loggers that can be turned on is as follows:

  • sqlalchemy.engine - controls SQL echoing. Set to logging.INFO for SQL query output, logging.DEBUG for query + result set output. These settings are equivalent to echo=True and echo="debug" on create_engine.echo, respectively.

  • sqlalchemy.pool - controls connection pool logging. Set to logging.INFO to log connection invalidation and recycle events; set to logging.DEBUG to additionally log all pool checkins and checkouts. These settings are equivalent to echo_pool=True and echo_pool="debug" on create_engine.echo_pool, respectively.

  • sqlalchemy.dialects - controls custom logging for SQL dialects, to the extent that logging is used within specific dialects, which is generally minimal.

  • sqlalchemy.orm - controls logging of various ORM functions to the extent that logging is used within the ORM, which is generally minimal. Set to logging.INFO to log some top-level information on mapper configurations.

For example, to log SQL queries using Python logging instead of the echo=True flag:

  import logging

  logging.basicConfig()
  logging.getLogger('sqlalchemy.engine').setLevel(logging.INFO)

By default, the log level is set to logging.WARN within the entire sqlalchemy namespace so that no log operations occur, even within an application that has logging enabled otherwise.

The echo flags present as keyword arguments to create_engine() and others, as well as the echo property on Engine, when set to True, will first attempt to ensure that logging is enabled. Unfortunately, the logging module provides no way of determining if output has already been configured (note we are referring to if a logging configuration has been set up, not just that the logging level is set). For this reason, any echo=True flags will result in a call to logging.basicConfig() using sys.stdout as the destination. It also sets up a default format using the level name, timestamp, and logger name. Note that this configuration has the effect of being configured in addition to any existing logger configurations. Therefore, when using Python logging, ensure all echo flags are set to False at all times, to avoid getting duplicate log lines.

The logger name of an instance such as an Engine or Pool defaults to using a truncated hex identifier string. To set this to a specific name, use the “logging_name” and “pool_logging_name” keyword arguments with sqlalchemy.create_engine().
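For example (a sketch; the exact logger naming is an implementation detail, but the configured names appear in the “name” field of log records emitted for this engine and its pool):

```python
from sqlalchemy import create_engine

engine = create_engine(
    'sqlite://',
    logging_name='myengine',     # used in "sqlalchemy.engine" log records
    pool_logging_name='mypool',  # used in "sqlalchemy.pool" log records
)
```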

Note

The SQLAlchemy Engine conserves Python function call overhead by only emitting log statements when the current logging level is detected as logging.INFO or logging.DEBUG. It only checks this level when a new connection is procured from the connection pool. Therefore when changing the logging configuration for an already-running application, any Connection that’s currently active, or more commonly a Session object that’s active in a transaction, won’t log any SQL according to the new configuration until a new Connection is procured (in the case of Session, this is after the current transaction ends and a new one begins).