Subpackages¶
pydal.adapters package¶
Submodules¶
pydal.adapters.base module¶
- class pydal.adapters.base.AdapterMeta[source]¶
Bases: type
Metaclass to support manipulation of adapter classes.
At the moment it is used to intercept the entity_quoting argument passed to DAL.
- class pydal.adapters.base.BaseAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.connection.ConnectionPool
- FALSE = 'F'¶
- FALSE_exp = '0'¶
- QUOTE_TEMPLATE = '"%s"'¶
- TRUE = 'T'¶
- TRUE_exp = '1'¶
- T_SEP = ' '¶
- can_select_for_update = True¶
- commit_on_alter_table = False¶
- connection = None¶
- connector(*args, **kwargs)¶
- dbpath = None¶
- driver = None¶
- driver_auto_json = []¶
- driver_name = None¶
- drivers = ()¶
- folder = None¶
- iterparse(sql, fields, colnames, blob_decode=True, cacheable=False)[source]¶
Iterator to parse one row at a time. It doesn't support the old-style virtual fields.
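As a rough sketch (not pyDAL's actual implementation), a row iterator of this shape parses one row at a time instead of materializing the whole result set; the cursor and parse helper below are hypothetical stand-ins for the adapter's internals:

```python
def iterparse_sketch(cursor, parse_value):
    """Yield parsed rows one at a time from a DB-API-style cursor.

    Minimal sketch: `cursor` is any iterable of raw row tuples and
    `parse_value` converts one raw column value; both are stand-ins
    for the adapter's real cursor and type-parsing machinery.
    """
    for raw_row in cursor:
        yield tuple(parse_value(v) for v in raw_row)

# Usage with plain Python objects standing in for a cursor:
rows = iterparse_sketch([(1, "a"), (2, "b")], parse_value=str)
first = next(rows)  # only the first row has been parsed so far
```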
- log(message, table=None)[source]¶
Logs migrations.
It will not log changes if a logfile is not specified. Defaults to sql.log.
- migrate_table(table, sql_fields, sql_fields_old, sql_fields_aux, logfile, fake_migrate=False)[source]¶
- rowslice(rows, minimum=0, maximum=None)[source]¶
By default this function does nothing; overload when db does not do slicing.
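The overload pattern can be sketched as follows; the subclass below is a hypothetical example of what an adapter might do for a backend without native slicing (class names are illustrative, not pyDAL's):

```python
class SlicingSketch:
    """Minimal sketch of the rowslice hook, not pyDAL's real class."""

    def rowslice(self, rows, minimum=0, maximum=None):
        # Default: the database already applied LIMIT/OFFSET,
        # so return the rows unchanged.
        return rows

class NoNativeSlicingSketch(SlicingSketch):
    def rowslice(self, rows, minimum=0, maximum=None):
        # Overload: slice in Python when the backend cannot.
        return rows[minimum:] if maximum is None else rows[minimum:maximum]

adapter = NoNativeSlicingSketch()
page = adapter.rowslice(list(range(10)), 2, 5)
```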
- support_distributed_transaction = False¶
- test_query = 'SELECT 1;'¶
- types = {'reference': 'INTEGER REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s %(null)s %(unique)s', 'text': 'TEXT', 'float': 'DOUBLE', 'datetime': 'TIMESTAMP', 'bigint': 'INTEGER', 'id': 'INTEGER PRIMARY KEY AUTOINCREMENT', 'reference FK': ', CONSTRAINT "FK_%(constraint_name)s" FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'json': 'TEXT', 'big-id': 'INTEGER PRIMARY KEY AUTOINCREMENT', 'blob': 'BLOB', 'big-reference': 'INTEGER REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s %(null)s %(unique)s', 'string': 'CHAR(%(length)s)', 'list:string': 'TEXT', 'date': 'DATE', 'integer': 'INTEGER', 'password': 'CHAR(%(length)s)', 'list:integer': 'TEXT', 'double': 'DOUBLE', 'decimal': 'DOUBLE', 'upload': 'CHAR(%(length)s)', 'list:reference': 'TEXT', 'boolean': 'CHAR(1)', 'time': 'TIME'}¶
- uploads_in_blob = False¶
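The values in types are %-format templates that the adapter fills with field attributes when generating DDL. A sketch of the substitution, using a subset of the SQLite-style templates above (the field attribute values are illustrative):

```python
# A subset of the adapter's type templates, as shown above.
types = {
    "string": "CHAR(%(length)s)",
    "reference": ("INTEGER REFERENCES %(foreign_key)s ON DELETE "
                  "%(on_delete_action)s %(null)s %(unique)s"),
}
QUOTE_TEMPLATE = '"%s"'  # how identifiers are quoted

# Filling the templates with hypothetical field attributes:
name_sql = types["string"] % dict(length=128)
ref_sql = types["reference"] % dict(
    foreign_key='"person" (id)', on_delete_action="CASCADE",
    null="NOT NULL", unique="")
quoted = QUOTE_TEMPLATE % "person"
```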
- class pydal.adapters.base.NoSQLAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.base.BaseAdapter
- QUOTE_TEMPLATE = '%s'¶
- can_select_for_update = False¶
pydal.adapters.couchdb module¶
- class pydal.adapters.couchdb.CouchDBAdapter(db, uri='couchdb://127.0.0.1:5984', pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.base.NoSQLAdapter
- drivers = ('couchdb',)¶
- types = {'string': <type 'str'>, 'reference': <type 'long'>, 'text': <type 'str'>, 'id': <type 'long'>, 'float': <type 'float'>, 'bigint': <type 'long'>, 'upload': <type 'str'>, 'datetime': <type 'datetime.datetime'>, 'json': <type 'str'>, 'boolean': <type 'bool'>, 'blob': <type 'str'>, 'list:string': <type 'list'>, 'double': <type 'float'>, 'date': <type 'datetime.date'>, 'integer': <type 'long'>, 'password': <type 'str'>, 'list:integer': <type 'list'>, 'time': <type 'datetime.time'>, 'list:reference': <type 'list'>}¶
- uploads_in_blob = True¶
pydal.adapters.cubrid module¶
- class pydal.adapters.cubrid.CubridAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.mysql.MySQLAdapter
- REGEX_URI = <_sre.SRE_Pattern object at 0x1c69890>¶
- drivers = ('cubriddb',)¶
pydal.adapters.db2 module¶
- class pydal.adapters.db2.DB2Adapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.base.BaseAdapter
- drivers = ('ibm_db_dbi', 'pyodbc')¶
- types = {'reference': 'INT, FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'CLOB', 'float': 'REAL', 'datetime': 'TIMESTAMP', 'bigint': 'BIGINT', 'reference TFK': ' CONSTRAINT FK_%(foreign_table)s_PK FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'INT GENERATED ALWAYS AS IDENTITY PRIMARY KEY NOT NULL', 'reference FK': ', CONSTRAINT FK_%(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'json': 'CLOB', 'big-id': 'BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY NOT NULL', 'blob': 'BLOB', 'big-reference': 'BIGINT, FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'VARCHAR(%(length)s)', 'list:string': 'CLOB', 'date': 'DATE', 'integer': 'INT', 'password': 'VARCHAR(%(length)s)', 'list:integer': 'CLOB', 'double': 'DOUBLE', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'VARCHAR(%(length)s)', 'list:reference': 'CLOB', 'boolean': 'CHAR(1)', 'time': 'TIME'}¶
pydal.adapters.firebird module¶
- class pydal.adapters.firebird.FireBirdAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.base.BaseAdapter
- REGEX_URI = <_sre.SRE_Pattern object at 0x1c7c900>¶
- commit_on_alter_table = True¶
- drivers = ('kinterbasdb', 'firebirdsql', 'fdb', 'pyodbc')¶
- support_distributed_transaction = True¶
- types = {'reference': 'INTEGER REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'BLOB SUB_TYPE 1', 'float': 'FLOAT', 'datetime': 'TIMESTAMP', 'bigint': 'BIGINT', 'id': 'INTEGER PRIMARY KEY', 'json': 'BLOB SUB_TYPE 1', 'big-id': 'BIGINT PRIMARY KEY', 'blob': 'BLOB SUB_TYPE 0', 'big-reference': 'BIGINT REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'VARCHAR(%(length)s)', 'list:string': 'BLOB SUB_TYPE 1', 'date': 'DATE', 'integer': 'INTEGER', 'password': 'VARCHAR(%(length)s)', 'list:integer': 'BLOB SUB_TYPE 1', 'double': 'DOUBLE PRECISION', 'decimal': 'DECIMAL(%(precision)s,%(scale)s)', 'upload': 'VARCHAR(%(length)s)', 'list:reference': 'BLOB SUB_TYPE 1', 'boolean': 'CHAR(1)', 'time': 'TIME'}¶
- class pydal.adapters.firebird.FireBirdEmbeddedAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.firebird.FireBirdAdapter
- REGEX_URI = <_sre.SRE_Pattern object at 0x1c33350>¶
- drivers = ('kinterbasdb', 'firebirdsql', 'fdb', 'pyodbc')¶
pydal.adapters.google_adapters module¶
Adapter for GAE
pydal.adapters.imap module¶
- class pydal.adapters.imap.IMAPAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.base.NoSQLAdapter
IMAP server adapter
This class is intended as an interface to email IMAP servers, allowing simple queries in the web2py DAL query syntax, so that email reading, searching and other related IMAP mail services (such as those implemented by providers like Google(r) and Yahoo!(r)) can be managed from web2py applications.
The code uses examples by Yuji Tomita from this post: http://yuji.wordpress.com/2011/06/22/python-imaplib-imap-example-with-gmail/#comment-1137 and is based on the docs for Python imaplib, Python email and the email IETF RFCs (i.e. RFC2060 and RFC3501).
This adapter was tested with a small set of operations with Gmail(r). Requests to other services could raise command syntax and response data issues.
It creates its table and field names “statically”, meaning that the developer should leave the table and field definitions to the DAL instance by calling the adapter’s .define_tables() method. The tables are defined with the IMAP server mailbox list information.
.define_tables() returns a dictionary mapping dal tablenames to the server mailbox names with the following structure:
{<tablename>: str <server mailbox name>}
Here is a list of supported fields:
Field        Type         Description
uid          string
answered     boolean      Flag
created      date
content      list:string  A list of dict text or html parts to string
cc           string
bcc          string
size         integer      the amount of octets of the message (*)
deleted      boolean      Flag
draft        boolean      Flag
flagged      boolean      Flag
sender       string
recent       boolean      Flag
seen         boolean      Flag
subject      string
mime         string       The mime header declaration
email        string       The complete RFC822 message (*)
attachments  list         Each non text part as dict
encoding     string       The main detected encoding

(*) At the application side it is measured as the length of the RFC822 message string
WARNING: As row id’s are mapped to email sequence numbers, make sure your imap client web2py app does not delete messages during select or update actions, to prevent updating or deleting different messages. Sequence numbers change whenever the mailbox is updated. To avoid this sequence numbers issues, it is recommended the use of uid fields in query references (although the update and delete in separate actions rule still applies).
# This is the code recommended to start imap support
# at the app's model:

imapdb = DAL("imap://user:password@server:port", pool_size=1)  # port 993 for ssl
imapdb.define_tables()
Here is an (incomplete) list of possible imap commands:
# Count today's unseen messages
# smaller than 6000 octets from the
# inbox mailbox
q = imapdb.INBOX.seen == False
q &= imapdb.INBOX.created == datetime.date.today()
q &= imapdb.INBOX.size < 6000
unread = imapdb(q).count()

# Fetch last query messages
rows = imapdb(q).select()

# It is also possible to filter query select results with limitby and
# sequences of mailbox fields
set.select(<fields sequence>, limitby=(<int>, <int>))

# Mark last query messages as seen
messages = [row.uid for row in rows]
seen = imapdb(imapdb.INBOX.uid.belongs(messages)).update(seen=True)

# Delete messages in the imap database that have mails from mr. Gumby
deleted = 0
for mailbox in imapdb.tables:
    deleted += imapdb(imapdb[mailbox].sender.contains("gumby")).delete()

# It is also possible to mark messages for deletion instead of erasing them
# directly with set.update(deleted=True)

# This object gives access
# to the adapter auto mailbox
# mapped names (which native
# mailbox has what table name)
imapdb.mailboxes  # <dict> tablename, server native name pairs

# To retrieve a table native mailbox name use:
imapdb.<table>.mailbox

### New features v2.4.1:

# Declare mailboxes statically with tablename, name pairs
# This avoids the extra server names retrieval
imapdb.define_tables({"inbox": "INBOX"})

# Selects without content/attachments/email columns will only
# fetch header and flags
imapdb(q).select(imapdb.INBOX.sender, imapdb.INBOX.subject)
- REGEX_URI = <_sre.SRE_Pattern object at 0x7fef10367c38>¶
- dbengine = 'imap'¶
- define_tables(mailbox_names=None)[source]¶
Auto create common IMAP fields
This function creates field definitions "statically", meaning that custom fields (as in other adapters) are not supported and definitions are handled on a service/mode basis (local syntax for Gmail(r), Ymail(r)).
Returns a dictionary with tablename, server native mailbox name pairs.
- drivers = ('imaplib',)¶
- reconnect(f=None)[source]¶
IMAP4 pool connection method
The imap connection lacks a self cursor command. A custom command should be provided as a replacement for connection pooling, to prevent uncaught remote session closing.
- types = {'boolean': <type 'bool'>, 'string': <type 'str'>, 'list:string': <type 'str'>, 'integer': <type 'int'>, 'date': <type 'datetime.date'>, 'text': <type 'str'>, 'blob': <type 'str'>, 'bigint': <type 'long'>, 'id': <type 'long'>, 'datetime': <type 'datetime.datetime'>}¶
- uri = None¶
MESSAGE is an identifier for sequence number
pydal.adapters.informix module¶
- class pydal.adapters.informix.InformixAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.base.BaseAdapter
- REGEX_URI = <_sre.SRE_Pattern object at 0x7fef1036a430>¶
- drivers = ('informixdb',)¶
- types = {'reference': 'INTEGER REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'BLOB SUB_TYPE 1', 'float': 'FLOAT', 'datetime': 'DATETIME', 'bigint': 'BIGINT', 'reference TFK': 'FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s CONSTRAINT TFK_%(table_name)s_%(field_name)s', 'id': 'SERIAL', 'reference FK': 'REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s CONSTRAINT FK_%(table_name)s_%(field_name)s', 'json': 'BLOB SUB_TYPE 1', 'big-id': 'BIGSERIAL', 'blob': 'BLOB SUB_TYPE 0', 'big-reference': 'BIGINT REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'VARCHAR(%(length)s)', 'list:string': 'BLOB SUB_TYPE 1', 'date': 'DATE', 'integer': 'INTEGER', 'password': 'VARCHAR(%(length)s)', 'list:integer': 'BLOB SUB_TYPE 1', 'double': 'DOUBLE PRECISION', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'VARCHAR(%(length)s)', 'list:reference': 'BLOB SUB_TYPE 1', 'boolean': 'CHAR(1)', 'time': 'CHAR(8)'}¶
- class pydal.adapters.informix.InformixSEAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.informix.InformixAdapter
work in progress
pydal.adapters.ingres module¶
- class pydal.adapters.ingres.IngresAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.base.BaseAdapter
- drivers = ('pyodbc',)¶
- types = {'reference': 'INT, FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'CLOB', 'float': 'FLOAT', 'datetime': 'TIMESTAMP WITHOUT TIME ZONE', 'bigint': 'BIGINT', 'reference TFK': ' CONSTRAINT FK_%(foreign_table)s_PK FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'int not null unique with default next value for ii***lineitemsequence', 'reference FK': ', CONSTRAINT FK_%(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'json': 'CLOB', 'big-id': 'bigint not null unique with default next value for ii***lineitemsequence', 'blob': 'BLOB', 'big-reference': 'BIGINT, FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'VARCHAR(%(length)s)', 'list:string': 'CLOB', 'date': 'ANSIDATE', 'integer': 'INTEGER4', 'password': 'VARCHAR(%(length)s)', 'list:integer': 'CLOB', 'double': 'FLOAT8', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'VARCHAR(%(length)s)', 'list:reference': 'CLOB', 'boolean': 'CHAR(1)', 'time': 'TIME WITHOUT TIME ZONE'}¶
- class pydal.adapters.ingres.IngresUnicodeAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.ingres.IngresAdapter
- drivers = ('pyodbc',)¶
- types = {'reference': 'INTEGER4, FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'NCLOB', 'float': 'FLOAT', 'datetime': 'TIMESTAMP WITHOUT TIME ZONE', 'bigint': 'BIGINT', 'reference TFK': ' CONSTRAINT FK_%(foreign_table)s_PK FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'INTEGER4 not null unique with default next value for ii***lineitemsequence', 'reference FK': ', CONSTRAINT FK_%(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'json': 'NCLOB', 'big-id': 'BIGINT not null unique with default next value for ii***lineitemsequence', 'blob': 'BLOB', 'big-reference': 'BIGINT, FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'NVARCHAR(%(length)s)', 'list:string': 'NCLOB', 'date': 'ANSIDATE', 'integer': 'INTEGER4', 'password': 'NVARCHAR(%(length)s)', 'list:integer': 'NCLOB', 'double': 'FLOAT8', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'VARCHAR(%(length)s)', 'list:reference': 'NCLOB', 'boolean': 'CHAR(1)', 'time': 'TIME WITHOUT TIME ZONE'}¶
pydal.adapters.mongo module¶
- class pydal.adapters.mongo.MongoBlob[source]¶
Bases: pydal.adapters.mongo.Binary
- MONGO_BLOB_BYTES = 0¶
- MONGO_BLOB_NON_UTF8_STR = 1¶
- class pydal.adapters.mongo.MongoDBAdapter(db, uri='mongodb://127.0.0.1:5984/db', pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.base.NoSQLAdapter
- AS_MARK = '__#AS#__'¶
- class Expanded(adapter, crud, query, fields=(), tablename=None, groupby=None)[source]¶
Bases: object
Class to encapsulate a pydal expression and track the parse expansion and its results.
- MongoDBAdapter.GROUP_MARK = '__#GROUP#__'¶
- MongoDBAdapter.LENGTH(first)[source]¶
https://jira.mongodb.org/browse/SERVER-5319 https://github.com/afchin/mongo/commit/f52105977e4d0ccb53bdddfb9c4528a3f3c40bdf
- exception MongoDBAdapter.NotOnNoSqlError(message=None)[source]¶
Bases: exceptions.NotImplementedError
- MongoDBAdapter.RANDOM()[source]¶
ORDER BY RANDOM()
https://github.com/mongodb/cookbook/blob/master/content/patterns/random-attribute.txt https://jira.mongodb.org/browse/SERVER-533 http://stackoverflow.com/questions/19412/how-to-request-a-random-row-in-sql
- MongoDBAdapter.REGEXP(first, second, case_sensitive=True)[source]¶
MongoDB provides regular expression capabilities for pattern matching strings in queries. MongoDB uses Perl compatible regular expressions (i.e. ‘PCRE’) version 8.36 with UTF-8 support.
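A sketch of the kind of filter document such a translation produces; the field name, pattern and helper below are illustrative, and actual pyDAL output may differ:

```python
import re

def regexp_filter(field, pattern, case_sensitive=True):
    """Build a MongoDB-style $regex filter document (sketch only)."""
    query = {field: {"$regex": pattern}}
    if not case_sensitive:
        # "$options": "i" is MongoDB's PCRE-style case-insensitive flag
        query[field]["$options"] = "i"
    return query

flt = regexp_filter("sender", "^gum", case_sensitive=False)

# The same matching semantics, checked locally with Python's re module:
matches = bool(re.search("^gum", "Gumby", re.IGNORECASE))
```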
- MongoDBAdapter.REGEXP_MARK1 = '__#REGEXP_1#__'¶
- MongoDBAdapter.REGEXP_MARK2 = '__#REGEXP_2#__'¶
- MongoDBAdapter.REGEX_SELECT_AS_PARSER = <_sre.SRE_Pattern object at 0x7fef110a8df8>¶
- MongoDBAdapter.driver_auto_json = ['loads', 'dumps']¶
- MongoDBAdapter.drivers = ('pymongo',)¶
- MongoDBAdapter.insert(table, fields, safe=None)[source]¶
safe determines whether an asynchronous request or a synchronous action is performed. For safety, synchronous requests are used by default.
- MongoDBAdapter.object_id(arg=None)[source]¶
Convert input to a valid MongoDB ObjectId instance
self.object_id(“<random>”) -> ObjectId (not unique) instance
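A sketch of integer-to-ObjectId conversion (an ObjectId is 12 bytes, usually written as 24 hex digits). This is an illustration of the idea only; the real adapter goes through the driver's ObjectId type:

```python
def int_to_objectid_hex(value):
    """Render an integer as a 24-hex-digit MongoDB ObjectId string.

    Sketch only: real adapters use bson.ObjectId; here we just
    zero-pad the integer to 12 bytes (24 hex characters).
    """
    if not 0 <= value < 1 << 96:
        raise ValueError("value does not fit in 12 bytes")
    return "%024x" % value

oid = int_to_objectid_hex(51)  # 51 == 0x33
```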
- MongoDBAdapter.types = {'string': <type 'str'>, 'reference': <type 'long'>, 'text': <type 'str'>, 'id': <type 'long'>, 'float': <type 'float'>, 'bigint': <type 'long'>, 'upload': <type 'str'>, 'datetime': <type 'datetime.datetime'>, 'json': <type 'str'>, 'boolean': <type 'bool'>, 'blob': <type 'str'>, 'list:string': <type 'list'>, 'double': <type 'float'>, 'date': <type 'datetime.date'>, 'integer': <type 'long'>, 'password': <type 'str'>, 'list:integer': <type 'list'>, 'time': <type 'datetime.time'>, 'list:reference': <type 'list'>}¶
- MongoDBAdapter.uploads_in_blob = False¶
pydal.adapters.mssql module¶
- class pydal.adapters.mssql.MSSQL2Adapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, srid=4326, after_connection=None)[source]¶
Bases: pydal.adapters.mssql.MSSQLAdapter
- drivers = ('pyodbc',)¶
- types = {'reference': 'INT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'NTEXT', 'float': 'FLOAT', 'datetime': 'DATETIME', 'bigint': 'BIGINT', 'reference TFK': ' CONSTRAINT FK_%(foreign_table)s_PK FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'INT IDENTITY PRIMARY KEY', 'geography': 'geography', 'reference FK': ', CONSTRAINT FK_%(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'big-id': 'BIGINT IDENTITY PRIMARY KEY', 'json': 'NTEXT', 'blob': 'IMAGE', 'big-reference': 'BIGINT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'NVARCHAR(%(length)s)', 'list:string': 'NTEXT', 'date': 'DATETIME', 'integer': 'INT', 'password': 'NVARCHAR(%(length)s)', 'list:integer': 'NTEXT', 'geometry': 'geometry', 'double': 'FLOAT', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'NVARCHAR(%(length)s)', 'list:reference': 'NTEXT', 'boolean': 'BIT', 'time': 'CHAR(8)'}¶
- class pydal.adapters.mssql.MSSQL3Adapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, srid=4326, after_connection=None)[source]¶
Bases: pydal.adapters.mssql.MSSQLAdapter
Experimental support for pagination in MSSQL
Requires MSSQL >= 2005, uses ROW_NUMBER()
- types = {'reference': 'INT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'VARCHAR(MAX)', 'float': 'FLOAT', 'datetime': 'DATETIME', 'bigint': 'BIGINT', 'reference TFK': ' CONSTRAINT FK_%(foreign_table)s_PK FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'INT IDENTITY PRIMARY KEY', 'geography': 'geography', 'reference FK': ', CONSTRAINT FK_%(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'big-id': 'BIGINT IDENTITY PRIMARY KEY', 'json': 'VARCHAR(MAX)', 'blob': 'IMAGE', 'big-reference': 'BIGINT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'VARCHAR(%(length)s)', 'list:string': 'VARCHAR(MAX)', 'date': 'DATETIME', 'integer': 'INT', 'password': 'VARCHAR(%(length)s)', 'list:integer': 'VARCHAR(MAX)', 'geometry': 'geometry', 'double': 'FLOAT', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'VARCHAR(%(length)s)', 'list:reference': 'VARCHAR(MAX)', 'boolean': 'BIT', 'time': 'TIME(7)'}¶
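Since this dialect predates native OFFSET support, limitby is emulated by wrapping the select in ROW_NUMBER(). A sketch of that rewriting (the helper, table and column names are illustrative, not pyDAL's actual SQL generation):

```python
def row_number_paginate(inner_select, order_by, offset, limit):
    """Wrap a select in ROW_NUMBER() to emulate OFFSET/LIMIT (sketch)."""
    return (
        "SELECT * FROM (SELECT ROW_NUMBER() OVER (ORDER BY %s) AS _rn, _t.* "
        "FROM (%s) _t) _w WHERE _w._rn BETWEEN %d AND %d"
        % (order_by, inner_select, offset + 1, offset + limit)
    )

sql = row_number_paginate("SELECT id, name FROM person", "id", 10, 5)
```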
- class pydal.adapters.mssql.MSSQL3NAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, srid=4326, after_connection=None)[source]¶
Bases: pydal.adapters.mssql.MSSQLNAdapter
- drivers = ('pyodbc',)¶
Experimental support for pagination in MSSQL. Experimental: see the MSSQLNAdapter docstring for warnings.
Requires MSSQL >= 2005, uses ROW_NUMBER()
- types = {'reference': 'INT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'NVARCHAR(MAX)', 'float': 'FLOAT', 'datetime': 'DATETIME', 'bigint': 'BIGINT', 'reference TFK': ' CONSTRAINT FK_%(foreign_table)s_PK FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'INT IDENTITY PRIMARY KEY', 'geography': 'geography', 'reference FK': ', CONSTRAINT FK_%(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'big-id': 'BIGINT IDENTITY PRIMARY KEY', 'json': 'NVARCHAR(MAX)', 'blob': 'IMAGE', 'big-reference': 'BIGINT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'NVARCHAR(%(length)s)', 'list:string': 'NVARCHAR(MAX)', 'date': 'DATETIME', 'integer': 'INT', 'password': 'NVARCHAR(%(length)s)', 'list:integer': 'NVARCHAR(MAX)', 'geometry': 'geometry', 'double': 'FLOAT', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'NVARCHAR(%(length)s)', 'list:reference': 'NVARCHAR(MAX)', 'boolean': 'BIT', 'time': 'TIME(7)'}¶
- class pydal.adapters.mssql.MSSQL4Adapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, srid=4326, after_connection=None)[source]¶
Bases: pydal.adapters.mssql.MSSQLAdapter
Support for “native” pagination
Requires MSSQL >= 2012, uses OFFSET ... ROWS ... FETCH NEXT ... ROWS ONLY
- types = {'reference': 'INT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'VARCHAR(MAX)', 'float': 'FLOAT', 'datetime': 'DATETIME', 'bigint': 'BIGINT', 'reference TFK': ' CONSTRAINT FK_%(foreign_table)s_PK FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'INT IDENTITY PRIMARY KEY', 'geography': 'geography', 'reference FK': ', CONSTRAINT FK_%(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'big-id': 'BIGINT IDENTITY PRIMARY KEY', 'json': 'VARCHAR(MAX)', 'blob': 'IMAGE', 'big-reference': 'BIGINT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'VARCHAR(%(length)s)', 'list:string': 'VARCHAR(MAX)', 'date': 'DATETIME', 'integer': 'INT', 'password': 'VARCHAR(%(length)s)', 'list:integer': 'VARCHAR(MAX)', 'geometry': 'geometry', 'double': 'FLOAT', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'VARCHAR(%(length)s)', 'list:reference': 'VARCHAR(MAX)', 'boolean': 'BIT', 'time': 'TIME(7)'}¶
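MSSQL >= 2012 supports pagination natively, so no ROW_NUMBER() wrapping is needed; a sketch of the clause form (helper and query text are illustrative, not pyDAL's actual output):

```python
def offset_fetch_clause(offset, limit):
    """Build the SQL Server 2012+ native pagination clause (sketch)."""
    return "OFFSET %d ROWS FETCH NEXT %d ROWS ONLY" % (offset, limit)

clause = offset_fetch_clause(10, 5)
# OFFSET/FETCH requires an ORDER BY on the enclosing select:
sql = "SELECT id, name FROM person ORDER BY id " + clause
```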
- class pydal.adapters.mssql.MSSQL4NAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, srid=4326, after_connection=None)[source]¶
Bases: pydal.adapters.mssql.MSSQLNAdapter
Experimental: see the MSSQLNAdapter docstring for warnings.
Support for "native" pagination; Unicode-compatible version.
Requires MSSQL >= 2012, uses OFFSET ... ROWS ... FETCH NEXT ... ROWS ONLY
After careful testing, this should be the de facto adapter for recent MSSQL backends.
- types = {'reference': 'INT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'NVARCHAR(MAX)', 'float': 'FLOAT', 'datetime': 'DATETIME', 'bigint': 'BIGINT', 'reference TFK': ' CONSTRAINT FK_%(foreign_table)s_PK FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'INT IDENTITY PRIMARY KEY', 'geography': 'geography', 'reference FK': ', CONSTRAINT FK_%(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'big-id': 'BIGINT IDENTITY PRIMARY KEY', 'json': 'NVARCHAR(MAX)', 'blob': 'IMAGE', 'big-reference': 'BIGINT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'NVARCHAR(%(length)s)', 'list:string': 'NVARCHAR(MAX)', 'date': 'DATE', 'integer': 'INT', 'password': 'NVARCHAR(%(length)s)', 'list:integer': 'NVARCHAR(MAX)', 'geometry': 'geometry', 'double': 'FLOAT', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'NVARCHAR(%(length)s)', 'list:reference': 'NVARCHAR(MAX)', 'boolean': 'BIT', 'time': 'TIME(7)'}¶
- class pydal.adapters.mssql.MSSQLAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, srid=4326, after_connection=None)[source]¶
Bases: pydal.adapters.base.BaseAdapter
- FALSE = 0¶
- QUOTE_TEMPLATE = '"%s"'¶
- REGEX_ARGPATTERN = <_sre.SRE_Pattern object at 0x7fef0fc01100>¶
- REGEX_DSN = <_sre.SRE_Pattern object at 0x7fef0fc064c8>¶
- REGEX_URI = <_sre.SRE_Pattern object at 0x1c68cb0>¶
- TRUE = 1¶
- T_SEP = 'T'¶
- drivers = ('pyodbc',)¶
- types = {'reference': 'INT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'TEXT', 'float': 'FLOAT', 'datetime': 'DATETIME', 'bigint': 'BIGINT', 'reference TFK': ' CONSTRAINT FK_%(foreign_table)s_PK FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'INT IDENTITY PRIMARY KEY', 'geography': 'geography', 'reference FK': ', CONSTRAINT FK_%(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'big-id': 'BIGINT IDENTITY PRIMARY KEY', 'json': 'TEXT', 'blob': 'IMAGE', 'big-reference': 'BIGINT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'VARCHAR(%(length)s)', 'list:string': 'TEXT', 'date': 'DATETIME', 'integer': 'INT', 'password': 'VARCHAR(%(length)s)', 'list:integer': 'TEXT', 'geometry': 'geometry', 'double': 'FLOAT', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'VARCHAR(%(length)s)', 'list:reference': 'TEXT', 'boolean': 'BIT', 'time': 'CHAR(8)'}¶
- class pydal.adapters.mssql.MSSQLNAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, srid=4326, after_connection=None)[source]¶
Bases: pydal.adapters.mssql.MSSQLAdapter
- drivers = ('pyodbc',)¶
Experimental: base class for handling unicode in MSSQL by default. Needs lots of testing. Try this on a fresh (or on a legacy) database. Using this on a database previously handled with a non-unicode-aware adapter is NOT supported.
- types = {'reference': 'INT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'NTEXT', 'float': 'FLOAT', 'datetime': 'DATETIME', 'bigint': 'BIGINT', 'reference TFK': ' CONSTRAINT FK_%(foreign_table)s_PK FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'INT IDENTITY PRIMARY KEY', 'geography': 'geography', 'reference FK': ', CONSTRAINT FK_%(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'big-id': 'BIGINT IDENTITY PRIMARY KEY', 'json': 'NTEXT', 'blob': 'IMAGE', 'big-reference': 'BIGINT %(null)s %(unique)s, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'NVARCHAR(%(length)s)', 'list:string': 'NTEXT', 'date': 'DATETIME', 'integer': 'INT', 'password': 'NVARCHAR(%(length)s)', 'list:integer': 'NTEXT', 'geometry': 'geometry', 'double': 'FLOAT', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'NVARCHAR(%(length)s)', 'list:reference': 'NTEXT', 'boolean': 'BIT', 'time': 'CHAR(8)'}¶
- class pydal.adapters.mssql.SybaseAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, srid=4326, after_connection=None)[source]¶
Bases: pydal.adapters.mssql.MSSQLAdapter
- drivers = 'Sybase'¶
- types = {'reference': 'INT NULL, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'TEXT', 'float': 'FLOAT', 'datetime': 'DATETIME', 'bigint': 'BIGINT', 'reference TFK': ' CONSTRAINT FK_%(foreign_table)s_PK FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'INT IDENTITY PRIMARY KEY', 'geography': 'geography', 'reference FK': ', CONSTRAINT FK_%(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'big-id': 'BIGINT IDENTITY PRIMARY KEY', 'json': 'TEXT', 'blob': 'IMAGE', 'big-reference': 'BIGINT NULL, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'CHAR VARYING(%(length)s)', 'list:string': 'TEXT', 'date': 'DATETIME', 'integer': 'INT', 'password': 'CHAR VARYING(%(length)s)', 'list:integer': 'TEXT', 'geometry': 'geometry', 'double': 'FLOAT', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'CHAR VARYING(%(length)s)', 'list:reference': 'TEXT', 'boolean': 'BIT', 'time': 'CHAR(8)'}¶
- class pydal.adapters.mssql.VerticaAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, srid=4326, after_connection=None)[source]¶
Bases: pydal.adapters.mssql.MSSQLAdapter
- T_SEP = ' '¶
- drivers = ('pyodbc',)¶
- types = {'big-reference': 'BIGINT REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'VARCHAR(%(length)s)', 'reference': 'INT REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'BYTEA', 'decimal': 'DECIMAL(%(precision)s,%(scale)s)', 'float': 'FLOAT', 'bigint': 'BIGINT', 'upload': 'VARCHAR(%(length)s)', 'datetime': 'DATETIME', 'json': 'VARCHAR(%(length)s)', 'boolean': 'BOOLEAN', 'id': 'IDENTITY', 'blob': 'BYTEA', 'list:string': 'BYTEA', 'double': 'DOUBLE PRECISION', 'date': 'DATE', 'integer': 'INT', 'password': 'VARCHAR(%(length)s)', 'list:integer': 'BYTEA', 'time': 'TIME', 'list:reference': 'BYTEA'}¶
pydal.adapters.mysql module¶
- class pydal.adapters.mysql.MySQLAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.base.BaseAdapter
- QUOTE_TEMPLATE = '`%s`'¶
- REGEX_URI = <_sre.SRE_Pattern object at 0x1c69890>¶
- commit_on_alter_table = True¶
- drivers = ('MySQLdb', 'pymysql', 'mysqlconnector')¶
- support_distributed_transaction = True¶
- types = {'reference': 'INT %(null)s %(unique)s, INDEX %(index_name)s (%(field_name)s), FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'LONGTEXT', 'float': 'FLOAT', 'datetime': 'DATETIME', 'bigint': 'BIGINT', 'id': 'INT AUTO_INCREMENT NOT NULL', 'reference FK': ', CONSTRAINT `FK_%(constraint_name)s` FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'json': 'LONGTEXT', 'big-id': 'BIGINT AUTO_INCREMENT NOT NULL', 'blob': 'LONGBLOB', 'big-reference': 'BIGINT %(null)s %(unique)s, INDEX %(index_name)s (%(field_name)s), FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'VARCHAR(%(length)s)', 'list:string': 'LONGTEXT', 'date': 'DATE', 'integer': 'INT', 'password': 'VARCHAR(%(length)s)', 'list:integer': 'LONGTEXT', 'double': 'DOUBLE', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'VARCHAR(%(length)s)', 'list:reference': 'LONGTEXT', 'boolean': 'CHAR(1)', 'time': 'TIME'}¶
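Each adapter's `types` mapping is a dictionary of %-style SQL templates keyed by pyDAL field type; at table-creation time the adapter fills the placeholders with the field's attributes. A minimal sketch of that expansion, using two templates copied from the MySQL mapping above (the `expand` helper is illustrative, not part of the pyDAL API):

```python
# %-style SQL type templates, as in MySQLAdapter.types above
types = {
    'string': 'VARCHAR(%(length)s)',
    'decimal': 'NUMERIC(%(precision)s,%(scale)s)',
}

def expand(field_type, **attrs):
    # Substitute the field's attributes into the adapter's SQL template
    return types[field_type] % attrs

print(expand('string', length=255))              # VARCHAR(255)
print(expand('decimal', precision=10, scale=2))  # NUMERIC(10,2)
```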
pydal.adapters.oracle module¶
- class pydal.adapters.oracle.OracleAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.base.BaseAdapter
- commit_on_alter_table = False¶
- drivers = ('cx_Oracle',)¶
- oracle_fix = <_sre.SRE_Pattern object at 0x7fef10429e30>¶
- types = {'reference': 'NUMBER, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'CLOB', 'float': 'FLOAT', 'datetime': 'DATE', 'bigint': 'NUMBER', 'reference TFK': ' CONSTRAINT FK_%(foreign_table)s_PK FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'NUMBER PRIMARY KEY', 'reference FK': ', CONSTRAINT FK_%(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'json': 'CLOB', 'big-id': 'NUMBER PRIMARY KEY', 'blob': 'CLOB', 'big-reference': 'NUMBER, CONSTRAINT %(constraint_name)s FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'VARCHAR2(%(length)s)', 'list:string': 'CLOB', 'date': 'DATE', 'integer': 'INT', 'password': 'VARCHAR2(%(length)s)', 'list:integer': 'CLOB', 'double': 'BINARY_DOUBLE', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'VARCHAR2(%(length)s)', 'list:reference': 'CLOB', 'boolean': 'CHAR(1)', 'time': 'CHAR(8)'}¶
pydal.adapters.postgres module¶
- class pydal.adapters.postgres.JDBCPostgreSQLAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.postgres.PostgreSQLAdapter
- REGEX_URI = <_sre.SRE_Pattern object at 0x7fef1036a430>¶
- drivers = ('zxJDBC',)¶
- class pydal.adapters.postgres.NewPostgreSQLAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, srid=4326, after_connection=None)[source]¶
Bases: pydal.adapters.postgres.PostgreSQLAdapter
- drivers = ('psycopg2', 'pg8000')¶
- types = {'reference': 'INTEGER REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s %(null)s %(unique)s', 'text': 'TEXT', 'float': 'FLOAT', 'datetime': 'TIMESTAMP', 'bigint': 'BIGINT', 'reference TFK': ' CONSTRAINT "FK_%(foreign_table)s_PK" FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'SERIAL PRIMARY KEY', 'geography': 'GEOGRAPHY', 'reference FK': ', CONSTRAINT "FK_%(constraint_name)s" FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'big-id': 'BIGSERIAL PRIMARY KEY', 'json': 'TEXT', 'blob': 'BYTEA', 'big-reference': 'BIGINT REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s %(null)s %(unique)s', 'string': 'VARCHAR(%(length)s)', 'list:string': 'TEXT[]', 'date': 'DATE', 'integer': 'INTEGER', 'password': 'VARCHAR(%(length)s)', 'list:integer': 'BIGINT[]', 'geometry': 'GEOMETRY', 'double': 'FLOAT8', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'VARCHAR(%(length)s)', 'list:reference': 'BIGINT[]', 'boolean': 'CHAR(1)', 'time': 'TIME'}¶
- class pydal.adapters.postgres.PostgreSQLAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, srid=4326, after_connection=None)[source]¶
Bases: pydal.adapters.base.BaseAdapter
- QUOTE_TEMPLATE = '"%s"'¶
- REGEX_URI = <_sre.SRE_Pattern object at 0x1c2c200>¶
- drivers = ('psycopg2', 'pg8000')¶
- support_distributed_transaction = True¶
- types = {'reference': 'INTEGER REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s %(null)s %(unique)s', 'text': 'TEXT', 'float': 'FLOAT', 'datetime': 'TIMESTAMP', 'bigint': 'BIGINT', 'reference TFK': ' CONSTRAINT "FK_%(foreign_table)s_PK" FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s) ON DELETE %(on_delete_action)s', 'id': 'SERIAL PRIMARY KEY', 'geography': 'GEOGRAPHY', 'reference FK': ', CONSTRAINT "FK_%(constraint_name)s" FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'big-id': 'BIGSERIAL PRIMARY KEY', 'json': 'TEXT', 'blob': 'BYTEA', 'big-reference': 'BIGINT REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s %(null)s %(unique)s', 'string': 'VARCHAR(%(length)s)', 'list:string': 'TEXT', 'date': 'DATE', 'integer': 'INTEGER', 'password': 'VARCHAR(%(length)s)', 'list:integer': 'TEXT', 'geometry': 'GEOMETRY', 'double': 'FLOAT8', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'VARCHAR(%(length)s)', 'list:reference': 'TEXT', 'boolean': 'CHAR(1)', 'time': 'TIME'}¶
pydal.adapters.sapdb module¶
- class pydal.adapters.sapdb.SAPDBAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.base.BaseAdapter
- REGEX_URI = <_sre.SRE_Pattern object at 0x1c2c200>¶
- drivers = ('sapdb',)¶
- support_distributed_transaction = False¶
- types = {'reference': 'INT, FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'text': 'LONG', 'float': 'FLOAT', 'datetime': 'TIMESTAMP', 'bigint': 'BIGINT', 'id': 'INT PRIMARY KEY', 'json': 'LONG', 'big-id': 'BIGINT PRIMARY KEY', 'blob': 'LONG', 'big-reference': 'BIGINT, FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'string': 'VARCHAR(%(length)s)', 'list:string': 'LONG', 'date': 'DATE', 'integer': 'INT', 'password': 'VARCHAR(%(length)s)', 'list:integer': 'LONG', 'double': 'DOUBLE PRECISION', 'decimal': 'FIXED(%(precision)s,%(scale)s)', 'upload': 'VARCHAR(%(length)s)', 'list:reference': 'LONG', 'boolean': 'CHAR(1)', 'time': 'TIME'}¶
pydal.adapters.sqlite module¶
- class pydal.adapters.sqlite.JDBCSQLiteAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.sqlite.SQLiteAdapter
- drivers = ('zxJDBC_sqlite',)¶
- class pydal.adapters.sqlite.SQLiteAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.base.BaseAdapter
- can_select_for_update = None¶
- drivers = ('sqlite2', 'sqlite3')¶
- class pydal.adapters.sqlite.SpatiaLiteAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, srid=4326, after_connection=None)[source]¶
Bases: pydal.adapters.sqlite.SQLiteAdapter
- drivers = ('sqlite3', 'sqlite2')¶
- types = {'string': 'CHAR(%(length)s)', 'reference': 'INTEGER REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s %(null)s %(unique)s', 'text': 'TEXT', 'float': 'DOUBLE', 'datetime': 'TIMESTAMP', 'bigint': 'INTEGER', 'list:string': 'TEXT', 'date': 'DATE', 'integer': 'INTEGER', 'password': 'CHAR(%(length)s)', 'list:integer': 'TEXT', 'id': 'INTEGER PRIMARY KEY AUTOINCREMENT', 'reference FK': ', CONSTRAINT "FK_%(constraint_name)s" FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s', 'geometry': 'GEOMETRY', 'double': 'DOUBLE', 'decimal': 'DOUBLE', 'big-id': 'INTEGER PRIMARY KEY AUTOINCREMENT', 'list:reference': 'TEXT', 'json': 'TEXT', 'boolean': 'CHAR(1)', 'upload': 'CHAR(%(length)s)', 'blob': 'BLOB', 'time': 'TIME', 'big-reference': 'INTEGER REFERENCES %(foreign_key)s ON DELETE %(on_delete_action)s %(null)s %(unique)s'}¶
pydal.adapters.teradata module¶
- class pydal.adapters.teradata.TeradataAdapter(db, uri, pool_size=0, folder=None, db_codec='UTF-8', credential_decoder=<function IDENTITY at 0x7fef1043a9b0>, driver_args={}, adapter_args={}, do_connect=True, after_connection=None)[source]¶
Bases: pydal.adapters.base.BaseAdapter
- drivers = ('pyodbc',)¶
- types = {'reference': 'INT', 'text': 'VARCHAR(2000)', 'float': 'REAL', 'datetime': 'TIMESTAMP', 'bigint': 'BIGINT', 'reference TFK': ' FOREIGN KEY (%(field_name)s) REFERENCES %(foreign_table)s (%(foreign_key)s)', 'id': 'INT GENERATED ALWAYS AS IDENTITY', 'reference FK': ' REFERENCES %(foreign_key)s', 'json': 'VARCHAR(4000)', 'big-id': 'BIGINT GENERATED ALWAYS AS IDENTITY', 'blob': 'BLOB', 'big-reference': 'BIGINT', 'string': 'VARCHAR(%(length)s)', 'list:string': 'VARCHAR(4000)', 'date': 'DATE', 'integer': 'INT', 'password': 'VARCHAR(%(length)s)', 'list:integer': 'VARCHAR(4000)', 'geometry': 'ST_GEOMETRY', 'double': 'DOUBLE', 'decimal': 'NUMERIC(%(precision)s,%(scale)s)', 'upload': 'VARCHAR(%(length)s)', 'list:reference': 'VARCHAR(4000)', 'boolean': 'CHAR(1)', 'time': 'TIME'}¶
pydal.helpers package¶
Submodules¶
pydal.helpers.classes module¶
- class pydal.helpers.classes.BasicStorage(*args, **kwargs)[source]¶
Bases: object
- clear(*args, **kwargs)¶
- copy(*args, **kwargs)¶
- has_key(item)¶
- pop(*args, **kwargs)¶
- class pydal.helpers.classes.DatabaseStoredFile(db, filename, mode)[source]¶
- web2py_filesystems = set([])¶
- class pydal.helpers.classes.FakeCursor[source]¶
Bases: object
The Python Database API Specification defines a cursor() method, which NoSQL drivers generally don't support. If the exception in this class is raised, it likely means that some piece of functionality has not yet been implemented in the driver, and something is trying to use the cursor.
- class pydal.helpers.classes.NullCursor[source]¶
Bases: pydal.helpers.classes.FakeCursor
- lastrowid = 1¶
- class pydal.helpers.classes.SQLALL(table)[source]¶
Bases: object
Helper class providing a comma-separated string having all the field names (prefixed by table name and ‘.’)
normally only called from within gluon.dal
- class pydal.helpers.classes.SQLCustomType(type='string', native=None, encoder=None, decoder=None, validator=None, _class=None, widget=None, represent=None)[source]¶
Bases: object
Allows defining of custom SQL types
Parameters: - type – the web2py type (default = ‘string’)
- native – the backend type
- encoder – how to encode the value to store it in the backend
- decoder – how to decode the value retrieved from the backend
- validator – what validators to use ( default = None, will use the default validator for type)
- Example::
Define as:
    decimal = SQLCustomType(
        type='double',
        native='integer',
        encoder=(lambda x: int(float(x) * 100)),
        decoder=(lambda x: Decimal("0.00") + Decimal(str(float(x) / 100)))
    )

    db.define_table('example', Field('value', type=decimal))
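The encoder/decoder pair from the example can be exercised on its own; this sketch shows the round trip (storing a decimal value as an integer number of hundredths), independent of any database backend:

```python
from decimal import Decimal

# encoder: store the value as an integer number of hundredths
encoder = lambda x: int(float(x) * 100)
# decoder: rebuild a Decimal from the stored integer
decoder = lambda x: Decimal("0.00") + Decimal(str(float(x) / 100))

stored = encoder('12.34')   # 1234
value = decoder(stored)     # Decimal('12.34')
```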
pydal.helpers.methods module¶
- pydal.helpers.methods.cleanup(text)[source]¶
Validates that the given text is clean: only contains [0-9a-zA-Z_]
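A sketch of that validation with the stdlib re module; the real implementation lives in pydal.helpers.methods, and the body below is an assumption that only mirrors the documented contract:

```python
import re

_VALID = re.compile(r'^[0-9a-zA-Z_]+$')

def cleanup(text):
    # Raise if text contains anything outside [0-9a-zA-Z_]
    if not _VALID.match(text):
        raise SyntaxError("invalid table or field name: %r" % text)
    return text
```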
- pydal.helpers.methods.pluralize(singular, rules=[...])[source]¶
The default rules are a list of triples of compiled regular expressions paired with a replacement suffix, tried in order; the replacement suffixes include 'children', 'eet', 'eeth', 'l\\1aves', 'ses', 'men', 'ives', 'eaux', 'lves', 'es', 'ies' and 's'.
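The mechanism can be sketched with a much shorter rule table; this is a simplified stand-in for the real list in pydal.helpers.methods, not a copy of it:

```python
import re

# (pattern, replacement) pairs tried in order; first match wins.
RULES = [
    (re.compile(r'child$'), 'children'),
    (re.compile(r'(?<=[^aeiou])y$'), 'ies'),
    (re.compile(r'$'), 's'),
]

def pluralize(singular):
    for pattern, repl in RULES:
        if pattern.search(singular):
            return pattern.sub(repl, singular)
    return singular
```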
pydal.helpers.regex module¶
Module contents¶
Submodules¶
pydal.base module¶
This file contains the DAL support for many relational databases, including:
- SQLite & SpatiaLite
- MySQL
- Postgres
- Firebird
- Oracle
- MS SQL
- DB2
- Interbase
- Ingres
- Informix (9+ and SE)
- SapDB (experimental)
- Cubrid (experimental)
- CouchDB (experimental)
- MongoDB (in progress)
- Google:nosql
- Google:sql
- Teradata
- IMAP (experimental)
Example of usage:
>>> # from dal import DAL, Field
### create DAL connection (and create DB if it doesn't exist)
>>> db = DAL(('sqlite://storage.sqlite','mysql://a:b@localhost/x'),
... folder=None)
### define a table 'person' (create/alter as necessary)
>>> person = db.define_table('person',Field('name','string'))
### insert a record
>>> id = person.insert(name='James')
### retrieve it by id
>>> james = person(id)
### retrieve it by name
>>> james = person(name='James')
### retrieve it by arbitrary query
>>> query = (person.name=='James') & (person.name.startswith('J'))
>>> james = db(query).select(person.ALL)[0]
### update one record
>>> james.update_record(name='Jim')
<Row {'id': 1, 'name': 'Jim'}>
### update multiple records by query
>>> db(person.name.like('J%')).update(name='James')
1
### delete records by query
>>> db(person.name.lower() == 'jim').delete()
0
### retrieve multiple records (rows)
>>> people = db(person).select(orderby=person.name,
... groupby=person.name, limitby=(0,100))
### further filter them
>>> james = people.find(lambda row: row.name == 'James').first()
>>> print james.id, james.name
1 James
### check aggregates
>>> counter = person.id.count()
>>> print db(person).select(counter).first()(counter)
1
### delete one record
>>> james.delete_record()
1
### delete (drop) entire database table
>>> person.drop()
Supported DAL URI strings:
'sqlite://test.db'
'spatialite://test.db'
'sqlite:memory'
'spatialite:memory'
'jdbc:sqlite://test.db'
'mysql://root:none@localhost/test'
'postgres://mdipierro:password@localhost/test'
'postgres:psycopg2://mdipierro:password@localhost/test'
'postgres:pg8000://mdipierro:password@localhost/test'
'jdbc:postgres://mdipierro:none@localhost/test'
'mssql://web2py:none@A64X2/web2py_test'
'mssql2://web2py:none@A64X2/web2py_test' # alternate mappings
'mssql3://web2py:none@A64X2/web2py_test' # better pagination (requires >= 2005)
'mssql4://web2py:none@A64X2/web2py_test' # best pagination (requires >= 2012)
'oracle://username:password@database'
'firebird://user:password@server:3050/database'
'db2:ibm_db_dbi://DSN=dsn;UID=user;PWD=pass'
'db2:pyodbc://driver=DB2;hostname=host;database=database;uid=user;pwd=password;port=port'
'firebird://username:password@hostname/database'
'firebird_embedded://username:password@c://path'
'informix://user:password@server:3050/database'
'informixu://user:password@server:3050/database' # unicode informix
'ingres://database' # or use an ODBC connection string, e.g. 'ingres://dsn=dsn_name'
'google:datastore' # for google app engine datastore (uses ndb by default)
'google:sql' # for google app engine with sql (mysql compatible)
'teradata://DSN=dsn;UID=user;PWD=pass; DATABASE=database' # experimental
'imap://user:password@server:port' # experimental
'mongodb://user:password@server:port/database' # experimental
For more info:
help(DAL)
help(Field)
- class pydal.base.DAL(uri='sqlite://dummy.db', pool_size=0, folder=None, db_codec='UTF-8', check_reserved=None, migrate=True, fake_migrate=False, migrate_enabled=True, fake_migrate_all=False, decode_credentials=False, driver_args=None, adapter_args=None, attempts=5, auto_import=False, bigint_id=False, debug=False, lazy_tables=False, db_uid=None, do_connect=True, after_connection=None, tables=None, ignore_field_case=True, entity_quoting=False, table_hash=None)[source]¶
Bases: pydal.helpers.classes.Serializable, pydal.helpers.classes.BasicStorage
An instance of this class represents a database connection
Parameters: - uri (str) –
contains information for connecting to a database. Defaults to ‘sqlite://dummy.db’
Note
experimental: you can specify a dictionary as uri parameter i.e. with:
db = DAL({"uri": "sqlite://storage.sqlite", "tables": {...}, ...})
for an example of dict input you can check the output of the scaffolding db model with db.as_dict().
Note that for compatibility with Python versions older than 2.6.5 you should cast your dict input keys to str due to a syntax limitation on kwarg names. For proper DAL dictionary input you can use one of:
obj = serializers.cast_keys(dict, [encoding="utf-8"])
# or else (for parsing json input)
obj = serializers.loads_json(data, unicode_keys=False)
- pool_size – How many open connections to make to the database object.
- folder – where .table files will be created. Automatically set within web2py. Use an explicit path when using DAL outside web2py
- db_codec – string encoding of the database (default: ‘UTF-8’)
- table_hash – database identifier with .tables. If your connection hash changes you can still use the old .tables if they have db_hash as prefix
- check_reserved –
list of adapters to check tablenames and column names against sql/nosql reserved keywords. Defaults to None
- ‘common’ List of sql keywords that are common to all database types such as “SELECT, INSERT”. (recommended)
- ‘all’ Checks against all known SQL keywords
- '<adaptername>' Checks against the specific adapter's list of keywords
- '<adaptername>_nonreserved' Checks against the specific adapter's list of nonreserved keywords (if available)
- migrate – sets default migrate behavior for all tables
- fake_migrate – sets default fake_migrate behavior for all tables
- migrate_enabled – If set to False disables ALL migrations
- fake_migrate_all – If set to True fake migrates ALL tables
- attempts – Number of times to attempt connecting
- auto_import – If set to True, tries to automatically import table definitions from the databases folder (works only for simple models)
- bigint_id – If set, uses bigint instead of int for id and reference fields
- lazy_tables – delays table definition until table access
- after_connection – can be a callable that will be executed after the connection is established
Example
Use as:
db = DAL('sqlite://test.db')
or:
db = DAL(**{"uri": ..., "tables": [...]...}) # experimental db.define_table('tablename', Field('fieldname1'), Field('fieldname2'))
- class Table(db, tablename, *fields, **args)¶
Bases: pydal.helpers.classes.Serializable, pydal.helpers.classes.BasicStorage
Represents a database table
- Example::
You can create a table as:
    db = DAL(...)
    db.define_table('users', Field('name'))
And then:
    db.users.insert(name='me')  # print db.users._insert(...) to see the SQL
    db.users.drop()
- as_dict(flat=False, sanitize=True)¶
- bulk_insert(items)¶
here items is a list of dictionaries
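Under a SQL backend, inserting a list of dictionaries in one call amounts to an executemany; a minimal sketch of that behaviour with stdlib sqlite3 (the table, field names, and this `bulk_insert` helper are illustrative, not pyDAL's implementation):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute(
    'CREATE TABLE person (id INTEGER PRIMARY KEY AUTOINCREMENT, name CHAR(512))')

def bulk_insert(items):
    # items: a list of dictionaries, as in Table.bulk_insert
    conn.executemany('INSERT INTO person (name) VALUES (:name)', items)
    conn.commit()
    return [row[0] for row in conn.execute('SELECT id FROM person')]

ids = bulk_insert([{'name': 'Alice'}, {'name': 'Bob'}])
```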
- drop(mode='')¶
- fields¶
- import_from_csv_file(csvfile, id_map=None, null='<NULL>', unique='uuid', id_offset=None, *args, **kwargs)¶
Import records from csv file. Column headers must have same names as table fields. Field ‘id’ is ignored. If column names read ‘table.file’ the ‘table.’ prefix is ignored.
- ‘unique’ argument is a field which must be unique (typically a uuid field)
- ‘restore’ argument is default False; if set True will remove old values in table first.
- ‘id_map’ if set to None will not map ids
The import will keep the id numbers in the restored table. This assumes that there is a field of type id that is an integer in incrementing order.
- insert(**fields)¶
- on(query)¶
- sqlsafe¶
- sqlsafe_alias¶
- truncate(mode=None)¶
- update(*args, **kwargs)¶
- update_or_insert(_key=<function <lambda> at 0x7fef1043a938>, **values)¶
- validate_and_insert(**fields)¶
- validate_and_update(_key=<function <lambda> at 0x7fef1043a938>, **fields)¶
- validate_and_update_or_insert(_key=<function <lambda> at 0x7fef1043a938>, **fields)¶
- with_alias(alias)¶
- DAL.check_reserved_keyword(name)[source]¶
Validates name against SQL keywords. Uses self.check_reserved, which is a list of adapters to check against.
- DAL.executesql(query, placeholders=None, as_dict=False, fields=None, colnames=None, as_ordered_dict=False)[source]¶
Executes an arbitrary query
Parameters: - query (str) – the query to submit to the backend
- placeholders – is optional and will always be None. If using raw SQL with placeholders, placeholders may be a sequence of values to be substituted in or, (if supported by the DB driver), a dictionary with keys matching named placeholders in your SQL.
- as_dict – will always be None when using DAL. If using raw SQL can be set to True and the results cursor returned by the DB driver will be converted to a sequence of dictionaries keyed with the db field names. Results returned with as_dict=True are the same as those returned when applying .to_list() to a DAL query. If “as_ordered_dict”=True the behaviour is the same as when “as_dict”=True with the keys (field names) guaranteed to be in the same order as returned by the select name executed on the database.
- fields –
list of DAL Fields that match the fields returned from the DB. The Field objects should be part of one or more Table objects defined on the DAL object. The “fields” list can include one or more DAL Table objects in addition to or instead of including Field objects, or it can be just a single table (not in a list). In that case, the Field objects will be extracted from the table(s).
Note
if either fields or colnames is provided, the results will be converted to a DAL Rows object using the db._adapter.parse() method
- colnames – list of field names in tablename.fieldname format
Note
It is also possible to specify both “fields” and the associated “colnames”. In that case, “fields” can also include DAL Expression objects in addition to Field objects. For Field objects in “fields”, the associated “colnames” must still be in tablename.fieldname format. For Expression objects in “fields”, the associated “colnames” can be any arbitrary labels.
DAL Table objects referred to by “fields” or “colnames” can be dummy tables and do not have to represent any real tables in the database. Also, note that the “fields” and “colnames” must be in the same order as the fields in the results cursor returned from the DB.
- static DAL.get_instances()[source]¶
Returns a dictionary with uri as key with timings and defined tables:
{'sqlite://storage.sqlite': {
    'dbstats': [(select auth_user.email from auth_user, 0.02009)],
    'dbtables': {
        'defined': ['auth_cas', 'auth_event', 'auth_group',
                    'auth_membership', 'auth_permission', 'auth_user'],
        'lazy': '[]'
    }
}}
- DAL.import_from_csv_file(ifile, id_map=None, null='<NULL>', unique='uuid', map_tablenames=None, ignore_missing_tables=False, *args, **kwargs)[source]¶
- DAL.logger = <logging.Logger object at 0x7fef1035edd0>¶
- DAL.parse_as_rest(patterns, args, vars, queries=None, nested_select=True)[source]¶
Example
Use as:
db.define_table('person', Field('name'), Field('info'))
db.define_table('pet',
    Field('ownedby', db.person),
    Field('name'), Field('info'))

@request.restful()
def index():
    def GET(*args, **vars):
        patterns = [
            "/friends[person]",
            "/{person.name}/:field",
            "/{person.name}/pets[pet.ownedby]",
            "/{person.name}/pets[pet.ownedby]/{pet.name}",
            "/{person.name}/pets[pet.ownedby]/{pet.name}/:field",
            ("/dogs[pet]", db.pet.info == 'dog'),
            ("/dogs[pet]/{pet.name.startswith}", db.pet.info == 'dog'),
        ]
        parser = db.parse_as_rest(patterns, args, vars)
        if parser.status == 200:
            return dict(content=parser.response)
        else:
            raise HTTP(parser.status, parser.error)
    def POST(table_name, **vars):
        if table_name == 'person':
            return db.person.validate_and_insert(**vars)
        elif table_name == 'pet':
            return db.pet.validate_and_insert(**vars)
        else:
            raise HTTP(400)
    return locals()
- DAL.representers = {}¶
- DAL.serializers = None¶
- DAL.uuid(x)¶
- DAL.validators = None¶
- DAL.validators_method = None¶
- uri (str) –
pydal.connection module¶
- class pydal.connection.ConnectionPool[source]¶
Bases: object
- POOLS = {}¶
- check_active_connection = True¶
- static close_all_instances(action)[source]¶
to close cleanly databases in a multithreaded environment
pydal.objects module¶
- class pydal.objects.BasicRows[source]¶
Bases: object
Abstract class for Rows and IterRows
- as_csv()¶
Serializes the table into a csv file
- as_dict(key='id', compact=True, storage_to_dict=True, datetime_to_str=False, custom_types=None)[source]¶
Returns the data as a dictionary of dictionaries (storage_to_dict=True) or records (False)
Parameters: - key – the name of the field to be used as dict key, normally the id
- compact – ? (default True)
- storage_to_dict – when True returns a dict, otherwise a list (default True)
- datetime_to_str – convert datetime fields as strings (default False)
- as_json(mode='object', default=None)[source]¶
Serializes the rows to a JSON list or object. mode='object' is not implemented (it should return a nested object structure)
- as_list(compact=True, storage_to_dict=True, datetime_to_str=False, custom_types=None)[source]¶
Returns the data as a list or dictionary.
Parameters: - storage_to_dict – when True returns a dict, otherwise a list
- datetime_to_str – convert datetime fields as strings
- as_trees(parent_name='parent_id', children_name='children', render=False)[source]¶
returns the data as list of trees.
Parameters: - parent_name – the name of the field holding the reference to the parent (default parent_id).
- children_name – the name where the children of each row will be stored as a list (default children).
- render – whether we will render the fields using their represent (default False) can be a list of fields to render or True to render all.
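The tree construction can be sketched with plain dictionaries: index rows by id, then attach each row under its parent's children list. This is an illustrative re-implementation over plain dicts, not the Rows method itself:

```python
def as_trees(rows, parent_name='parent_id', children_name='children'):
    # Index rows by id and give each a fresh, empty children list
    by_id = {row['id']: dict(row, **{children_name: []}) for row in rows}
    roots = []
    for row in by_id.values():
        parent = row.get(parent_name)
        if parent in by_id:
            by_id[parent][children_name].append(row)
        else:
            roots.append(row)  # no known parent: a top-level tree
    return roots

rows = [
    {'id': 1, 'parent_id': None, 'name': 'root'},
    {'id': 2, 'parent_id': 1, 'name': 'leaf'},
]
trees = as_trees(rows)
```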
- export_to_csv_file(ofile, null='<NULL>', *args, **kwargs)[source]¶
Exports data to csv, the first line contains the column names
Parameters: - ofile – where the csv must be exported to
- null – how null values must be represented (default ‘<NULL>’)
- delimiter – delimiter to separate values (default ‘,’)
- quotechar – character to use to quote string values (default ‘”’)
- quoting – quote system, use csv.QUOTE_*** (default csv.QUOTE_MINIMAL)
- represent – use the fields .represent value (default False)
- colnames – list of column names to use (default self.colnames)
This will only work when exporting Rows objects! Do NOT use it with db.export_to_csv()
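The documented export shape (a first line of column names, a null placeholder, csv quoting) can be sketched with the stdlib csv module; the helper below is an assumption that mirrors the parameters listed above, not pyDAL's code:

```python
import csv
import io

def export_to_csv_file(ofile, rows, colnames, null='<NULL>'):
    # First line: the column names; then one line per row,
    # rendering None as the null placeholder.
    writer = csv.writer(ofile, quoting=csv.QUOTE_MINIMAL)
    writer.writerow(colnames)
    for row in rows:
        writer.writerow([null if v is None else v for v in row])

buf = io.StringIO()
export_to_csv_file(buf, [(1, 'Alice'), (2, None)],
                   ['person.id', 'person.name'])
```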
- json(mode='object', default=None)¶
Serializes the rows to a JSON list or object. mode='object' is not implemented (it should return a nested object structure)
- class pydal.objects.Expression(db, op, first=None, second=None, type=None, **optional_args)[source]¶
Bases: object
- belongs(*value, **kwattr)[source]¶
Accepts the following inputs:
field.belongs(1, 2)
field.belongs((1, 2))
field.belongs(query)
Does NOT accept:
field.belongs(1)
If the set you want back includes None values, you can do:
field.belongs((1, None), null=True)
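In SQL terms, belongs translates to an IN clause. A sketch of that expansion (the real adapters build the SQL through the expression machinery, often with proper quoting and parameters; the helper and names here are illustrative):

```python
def belongs_sql(fieldname, values):
    # field.belongs((1, 2)) roughly corresponds to "person.id IN (1,2)"
    rendered = ','.join(str(v) for v in values)
    return '%s IN (%s)' % (fieldname, rendered)

print(belongs_sql('person.id', (1, 2)))  # person.id IN (1,2)
```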
- class pydal.objects.Field(fieldname, type='string', length=None, default=<function <lambda> at 0x7fef1043a938>, required=False, requires=<function <lambda> at 0x7fef1043a938>, ondelete='CASCADE', notnull=False, unique=False, uploadfield=True, widget=None, label=None, comment=None, writable=True, readable=True, update=None, authorize=None, autodelete=False, represent=None, uploadfolder=None, uploadseparate=False, uploadfs=None, compute=None, custom_store=None, custom_retrieve=None, custom_retrieve_file_properties=None, custom_delete=None, filter_in=None, filter_out=None, custom_qualifier=None, map_none=None, rname=None)[source]¶
Bases: pydal.objects.Expression, pydal.helpers.classes.Serializable
Represents a database field
Example
Usage:
    a = Field(name, 'string', length=32, default=None,
              required=False, requires=IS_NOT_EMPTY(),
              ondelete='CASCADE', notnull=False, unique=False,
              uploadfield=True,  # True means store on disk;
                                 # 'a_field_name' means store in this field in db;
                                 # False means the file content will be discarded
              widget=None, label=None, comment=None,
              writable=True, readable=True, update=None,
              authorize=None, autodelete=False, represent=None,
              uploadfolder=None,
              uploadseparate=False,  # upload to separate directories by the
                                     # uuid_key's first 2 characters and
                                     # tablename.fieldname;
                                     # False - old behavior;
                                     # True - put uploaded file in
                                     # <uploaddir>/<tablename>.<fieldname>/uuid_key[:2]
                                     # directory
              uploadfs=None)         # a pyfilesystem where to store uploads
to be used as argument of DAL.define_table
- Lazy¶
alias of FieldMethod
- Method¶
alias of FieldMethod
- Virtual¶
alias of FieldVirtual
- class pydal.objects.FieldVirtual(name, f=None, ftype='string', label=None, table_name=None)[source]¶
Bases: object
- class pydal.objects.IterRows(db, sql, fields, colnames, blob_decode, cacheable)[source]¶
Bases: pydal.objects.BasicRows
- next()¶
- class pydal.objects.Query(db, op, first=None, second=None, ignore_common_filters=False, **optional_args)[source]¶
Bases: pydal.helpers.classes.Serializable
Necessary to define a set. It can be stored or can be passed to DAL.__call__() to obtain a Set
Example
Use as:
query = db.users.name == 'Max'
set = db(query)
records = set.select()
- as_dict(flat=False, sanitize=True)[source]¶
Experimental stuff
This allows to return a plain dictionary with the basic query representation. Can be used with json/xml services for client-side db I/O
Example
Usage:
q = db.auth_user.id != 0
q.as_dict(flat=True)
{
    "op": "NE",
    "first": {
        "tablename": "auth_user",
        "fieldname": "id"
    },
    "second": 0
}
- class pydal.objects.Row(*args, **kwargs)[source]¶
Bases: pydal.helpers.classes.BasicStorage
A dictionary that lets you do d['a'] as well as d.a; this is only used to store a Row
- as_json(mode='object', default=None, colnames=None, serialize=True, **kwargs)[source]¶
Serializes the row to a JSON object. kwargs are passed to the .as_dict method; only "object" mode is supported
serialize=False is used by Rows.as_json
TODO: return array mode with query column order
mode and colnames are not implemented
- class pydal.objects.Rows(db=None, records=[], colnames=[], compact=True, rawrows=None)[source]¶
Bases: pydal.objects.BasicRows
A wrapper for the return value of a select. It basically represents a table. It has an iterator and each row is represented as a Row dictionary.
- exclude(f)[source]¶
Removes elements from the calling Rows object, filtered by the function f, and returns a new Rows object containing the removed elements
- find(f, limitby=None)[source]¶
Returns a new Rows object, a subset of the original object, filtered by the function f
- render(i=None, fields=None)[source]¶
Takes an index and returns a copy of the indexed row with values transformed via the “represent” attributes of the associated fields.
Parameters: - i – index. If not specified, a generator is returned for iteration over all the rows.
- fields – a list of fields to transform (if None, all fields with “represent” attributes will be transformed)
- setvirtualfields(**keyed_virtualfields)[source]¶
For reference:
db.define_table('x', Field('number', 'integer'))
if db(db.x).isempty():
    [db.x.insert(number=i) for i in range(10)]

from gluon.dal import lazy_virtualfield

class MyVirtualFields(object):
    # normal virtual field (backward compatible, discouraged)
    def normal_shift(self):
        return self.x.number + 1
    # lazy virtual field (because of @staticmethod)
    @lazy_virtualfield
    def lazy_shift(instance, row, delta=4):
        return row.x.number + delta

db.x.virtualfields.append(MyVirtualFields())

for row in db(db.x).select():
    print row.number, row.normal_shift, row.lazy_shift(delta=7)
- class pydal.objects.Set(db, query, ignore_common_filters=None)[source]¶
Bases: pydal.helpers.classes.Serializable
Represents a set of records in the database. Records are identified by the query=Query(...) object. Normally the Set is generated by DAL.__call__(Query(...))
Given a set, for example:
myset = db(db.users.name=='Max')
you can:
myset.update(name='Massimo')
myset.delete()  # all elements in the set
myset.select(orderby=db.users.id, groupby=db.users.name, limitby=(0, 10))
and take subsets:
subset = myset(db.users.id<5)
- class pydal.objects.Table(db, tablename, *fields, **args)[source]¶
Bases: pydal.helpers.classes.Serializable, pydal.helpers.classes.BasicStorage
Represents a database table
Example
You can create a table as:
    db = DAL(...)
    db.define_table('users', Field('name'))
And then:
db.users.insert(name='me')  # print db.users._insert(...) to see SQL
db.users.drop()
- import_from_csv_file(csvfile, id_map=None, null='<NULL>', unique='uuid', id_offset=None, *args, **kwargs)[source]¶
Import records from a csv file. Column headers must have the same names as the table's fields. The 'id' field is ignored. If column names read 'table.field', the 'table.' prefix is ignored.
- 'unique' argument is a field which must be unique (typically a uuid field)
- 'restore' argument defaults to False; if set to True it will remove old values in the table first
- 'id_map', if set to None, will not map ids
The import will keep the id numbers in the restored table. This assumes that there is a field of type id that is an integer in incrementing order.