Sawmill

An alternative Python logging library.

Introduction

Sawmill is an alternative to the standard Python logging library.

Its goals are to be simple and consistent in approach, and to make it easier to customise behaviour even at a low level. However, it draws several concepts from the standard library, so hopefully it will all appear vaguely familiar.

Here are some examples of how Sawmill differs:

  • No explicit hierarchy of loggers. All handlers have the opportunity to handle every message by default. However, using the Distribute handler you can construct your own handler hierarchy if you want (see the sketch after this list).
  • Log levels are not treated specially. Instead you can filter based on log level by adding a Level filterer to your handlers.
  • There is a clearer contract between formatters and handlers. This simplifies having a formatter that produces an email with a handler that actually sends the mail.
  • No passing of format arguments to log calls. Format messages using either standard Python string formatting or a dedicated Formatter that acts on the whole log.
  • Filterers and formatters are only defined for handlers so it is clearer where to use them (though potentially more restrictive).
  • The system handles batches of logs by default making it simple to buffer logs and generate summary messages.
  • Easy to change the root handler (sawmill.root).
  • Requires Python 2.6 or higher.
  • Follows PEP8.
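
For example, here is a rough sketch of nesting one Distribute handler under the root handler to form a small hierarchy. The handler names and stream target are illustrative, and it assumes a freshly created Distribute exposes the same handlers mapping as sawmill.root:

>>> import sys
>>> import sawmill
>>> from sawmill.handler.distribute import Distribute
>>> from sawmill.handler.stream import Stream
>>> group = Distribute()
>>> group.handlers['console'] = Stream(sys.stdout)
>>> sawmill.root.handlers['my_group'] = group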

Installing

Note

Using Virtualenv is recommended when evaluating or running locally.

Installation is simple with pip:

pip install sawmill

Building from source

You can also build manually from the source for more control. First obtain a copy of the source by either downloading or cloning the public repository:

git clone git@gitlab.com:4degrees/sawmill.git

Then you can build and install the package into your current Python site-packages folder:

pip install .

When actively developing, you can install an editable version along with additional dependencies for building and running tests:

pip install -e .[dev]

Alternatively, just build locally and manage yourself:

python setup.py build

Building documentation from source

To build the documentation from source:

python setup.py build_sphinx

Then view in your browser:

file:///path/to/sawmill/build/doc/html/index.html

Running tests against the source

With a copy of the source it is also possible to run the unit tests:

python setup.py -q test

With a coverage report:

python setup.py -q test --addopts --cov=sawmill

Tutorial

Using Sawmill is straightforward:

>>> import sawmill
>>> from sawmill.log import Log
>>> sawmill.configure()
>>> sawmill.root.handle(Log(message='Hello World!'))
Hello World!

Voila! You asked Sawmill to configure() itself in a classic fashion and then called the main handle method with a Log instance. The resulting message appeared on sys.stderr.

Note

sawmill.configure() is a helper function that attempts to configure Sawmill in a way that is useful to most applications. However, there is no requirement to use this helper or the default configurators (see Configuration).
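
Arguments after the configurator name are passed through to the chosen configurator. As a small sketch, the classic configurator accepts a level option, so the following should raise the minimum level shown on stderr to ‘warning’:

>>> sawmill.configure('classic', level='warning')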

Loggers

If you find that you are regularly inputting the same information for each log instance then you can create a Logger() to hold common information. Any values you set on the logger will automatically propagate into the log messages you generate using the log() method:

>>> from sawmill.logger.base import Logger
>>> logger = Logger(name='my.logger')
>>> logger.log(message='Hi there')
my.logger:Hi there
>>> logger['level'] = 'info'
>>> logger.log(message='Some information')
my.logger:info:Some information

Sawmill comes with several logger implementations to handle common scenarios, but you can also define your own. Here is the Classic logger in action, which mimics the standard Python logger behaviour:

>>> from sawmill.logger.classic import Classic as Logger
>>> logger = Logger('my.logger')
>>> logger.info('An informational message')
my.logger:info:An informational message
>>> logger.error('An error message')
my.logger:error:An error message

Handlers

So, how are those messages ending up on sys.stderr? This is because the configure function adds a Stream handler configured to output all messages to standard error. It does this by registering the handler with the root handler which, by default, is a Distribute handler. The distribute handler simply relays all the logs it receives to other handlers registered with it.

Let’s add another stream handler to the root handler, but this time outputting to a StringIO instance:

>>> from StringIO import StringIO
>>> from sawmill.handler.stream import Stream
>>> my_stream = StringIO()
>>> my_handler = Stream(stream=my_stream)

To register a handler with a distribute handler, set it under a unique key in the distribute handler's handlers dictionary:

>>> sawmill.root.handlers['my_handler'] = my_handler

Now we can log as normal using our logger from before:

>>> logger.info('Some more information.')
my.logger:info:Some more information.

Same as before, but take a look at my_stream:

>>> print my_stream.getvalue()
{'name': 'my.logger', 'level': 'info', 'message': 'Some more information.'}

It contains just a string representation of the log (a dictionary) because no formatter has been set on our custom handler.

Formatters

A formatter takes a list of Log instances and returns a corresponding list of formatted data that a handler can output. Typically the returned data will be a string, but it is important to note that it does not have to be. The only condition is that the returned data works with the handler’s output method.
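
As a rough sketch of that contract, a custom formatter simply subclasses Formatter and implements format(). The class below is made up for illustration and returns one string per log, suitable for a Stream handler:

>>> from sawmill.formatter.base import Formatter
>>> class FieldCount(Formatter):
...     '''Illustrative formatter reporting how many fields each log has.'''
...     def format(self, logs):
...         return ['{0} field(s)\n'.format(len(log)) for log in logs]
...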

Note

Due to the tight contract between a formatter and handler you cannot use every formatter with every handler. Instead check the documentation for which ones work well together.

Add a Template formatter to the handler created above:

>>> from sawmill.formatter.template import Template
>>> my_formatter = Template('{level}:{message}\n')
>>> my_handler.formatter = my_formatter

Now logging a message will result in the formatter being called for the handler my_handler:

>>> my_stream.truncate(0)
>>> logger.info('Yet more information.')
>>> print my_stream.getvalue()
info:Yet more information.

Filterers

A filterer controls whether a log should be handled by a particular handler. A typical usage of a filterer is to restrict a particular handler to only handle serious errors. Add a Level filterer to my_handler so that it only handles error messages (or greater):

>>> from sawmill.filterer.level import Level
>>> my_handler.filterer = Level(min='error', max=None)

Note

The level values available and their respective order are set, by default, according to the sawmill.levels array.

Now try logging an info level message:

>>> my_stream.truncate(0)
>>> logger.info('I will not appear in the stringio instance.')
my.logger:info:I will not appear in the stringio instance.

Whilst the log was still handled by the default stream handler (which does not filter info level messages), it was not handled by my_handler:

>>> print my_stream.getvalue()

If you wanted a group of handlers to have the same filterer you could set them up under a distribute handler and then set the filterer on that handler. For example, here is how to limit all the handlers using a filterer on the root handler:

>>> sawmill.root.filterer = Level(min='error', max=None)
>>> logger.info('I will not appear anywhere.')

You can also quickly combine different filterers for more complex effects:

>>> from sawmill.filterer.pattern import Pattern
>>> sawmill.root.filterer &= Pattern('my\..*', mode=Pattern.EXCLUDE)

The above would filter any log that had too low a level or had a name value that started with ‘my.’.
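
You can also write your own filterer by subclassing Filterer and returning only the logs that should pass. The class below is made up for illustration and could, for example, replace the filterer on my_handler:

>>> from sawmill.filterer.base import Filterer
>>> class HasTraceback(Filterer):
...     '''Illustrative filterer passing only logs that carry a traceback.'''
...     def filter(self, logs):
...         return [log for log in logs if 'traceback' in log]
...
>>> my_handler.filterer = HasTraceback()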

Configuration

Sawmill comes with some default configurators that are simple helper functions for configuring the system. Each one is registered in the sawmill.configurators dictionary and can be called through the sawmill.configure() function.
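
A configurator is just a callable registered under a name. As a hypothetical sketch (not a bundled configurator), you could register and use your own like this:

    import sys

    import sawmill
    import sawmill.handler.stream

    def configure_minimal(*args, **kw):
        '''Hypothetical configurator sending everything, unformatted, to stderr.'''
        sawmill.root.handlers['stderr'] = sawmill.handler.stream.Stream(sys.stderr)

    sawmill.configurators['minimal'] = configure_minimal
    sawmill.configure('minimal')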

The default configurator is the classic one, which you used in the Tutorial.

Note

You can find reference for each configurator that comes with Sawmill in the sawmill.configurator section.

You don’t have to use a configurator though (or you might want to create your own) so let’s take a look at manually configuring the system.

First, decide what you want to handle, where it should go and how it should be formatted. For our purposes, we will configure Sawmill to:

  • Send all logs to sys.stderr that have a level of ‘warning’ or higher, or that have been tagged with a ‘user’ key. The format will be just the level and message.
  • Send all logs to a file displaying all available fields verbosely starting with date, level, name, message.
  • Send a formatted email for any log with a level of ‘error’ or higher to our support team including the traceback and a buffer of recent logs.
  • Redirect standard library logging to Sawmill.

Before you start make sure you import all the necessary modules:

import sys
import tempfile
from datetime import datetime

import sawmill
import sawmill.handler.stream
import sawmill.handler.email
import sawmill.handler.buffer
import sawmill.formatter.template
import sawmill.formatter.field
import sawmill.formatter.email
import sawmill.filterer.level
import sawmill.filterer.item
import sawmill.compatibility

User Visible Or High Level To Standard Error

First up we need a Stream handler to direct output to sys.stderr:

    stderr_handler = sawmill.handler.stream.Stream(sys.stderr)

We want to use a simple Template formatter to display each log as a string of level and message:

    stderr_formatter = sawmill.formatter.template.Template(
        '{level}:{message}\n'
    )

And attach that as the formatter for the stderr_handler:

    stderr_handler.formatter = stderr_formatter

Now we need to filter all logs that don’t meet the level requirement unless they are tagged with a ‘user’ key:

    stderr_filterer = sawmill.filterer.level.Level(min='warning', max=None)
    stderr_filterer |= sawmill.filterer.item.Item('user', True)

And attach as the filterer for the stderr_handler:

    stderr_handler.filterer = stderr_filterer

Next we just need to register this handler under a sensible name like stderr:

    sawmill.root.handlers['stderr'] = stderr_handler

All To File

Logging everything to a file means we need another Stream handler, but pointing at a file this time:

    # Output to a log file. Set filepath to use a specific file, otherwise a
    # temporary file named after the current date and time will be used.
    filepath = None
    if filepath is None:
        prefix = datetime.now().strftime('sawmill-%Y_%m_%d-%H_%M_%S-')
        _, filepath = tempfile.mkstemp(prefix=prefix, suffix='.log')

    file_stream = open(filepath, 'a')
    file_handler = sawmill.handler.stream.Stream(file_stream)

We don’t need any filterer as all logs should go to the file, but we do want a specific formatter to try to capture as much information as the logs can provide:

    file_formatter = sawmill.formatter.field.Field([
        'timestamp', '*', 'name', 'level', 'message', 'traceback'
    ])
    file_handler.formatter = file_formatter

Note

The ‘*’ is special and means capture all other fields present ordered alphabetically by key.

And register that one as well:

    sawmill.root.handlers['file'] = file_handler

Errors To Email

Next, create an email handler that will send any errors to a predefined email address, including a buffer of recent messages. First, set up the email handler with the default template:

    # Send email on errors. The sender and recipient addresses here are
    # placeholders; substitute your own.
    email_sender = 'sawmill@example.com'
    email_recipients = 'support@example.com'

    email_handler = sawmill.handler.email.Email(
        'smtp.example.com', 587,
        secure=(),
        formatter=sawmill.formatter.email.Email(
            'Error Report',
            email_sender,
            email_recipients
        )
    )

Next, we need a buffer so that errors will have context. This buffer will wrap the email handler and only pass on messages when the set trigger is activated:

    def check_for_error(logs, buffer):
        '''Return True if any of the recent logs was an error message.'''
        for log in logs:
            if log.get('level') == 'error':
                return True

        return False

    email_buffer_handler = sawmill.handler.buffer.Buffer(
        email_handler,
        check_for_error,
        limit=30
    )

And register the buffer handler under the ‘email’ key:

    sawmill.root.handlers['email'] = email_buffer_handler

Integrate standard library logging

Finally, redirect all standard library logging to Sawmill (and this configuration) using the supplied helper function:

    sawmill.compatibility.redirect_standard_library_logging()

See also

The alpha configurator used for this example.

API Reference

sawmill

sawmill.root = <sawmill.handler.distribute.Distribute object>

Top level handler responsible for relaying all logs to other handlers.

sawmill.levels = ['debug', 'info', 'warning', 'error']

Log levels ordered by severity. Do not rely on the index of the level name.

sawmill.configurators = {'classic': <function configure at 0x7f712d241668>}

Configurators registered for use with the sawmill.configure() function.

sawmill.configure(configurator='classic', *args, **kw)[source]

Configure Sawmill using configurator.

Will call the registered configuration function matching the configurator name, passing args and kw.

sawmill.configurator

sawmill.configurator.alpha
sawmill.configurator.alpha.configure(email_sender, email_recipients, level='info', filepath=None, redirect_standard_logging=True, *args, **kw)[source]

Alpha configurator.

level will determine the minimum level to display on stderr. filepath can be used to set where the log file should be stored. It defaults to a temporary file named after the current date and time.

If redirect_standard_logging is True then also redirect all standard library logging to Sawmill using sawmill.compatibility.redirect_standard_library_logging().

sawmill.configurator.classic
sawmill.configurator.classic.configure(level='info', filepath=None, redirect_standard_logging=True, *args, **kw)[source]

Configure the logging system in a classic manner.

level will determine the minimum level to display on stderr. filepath can be used to set where the log file should be stored. It defaults to a temporary file named after the current date and time.

If redirect_standard_logging is True then also redirect all standard library logging to Sawmill using sawmill.compatibility.redirect_standard_library_logging().

sawmill.filterer

sawmill.filterer.base
class sawmill.filterer.base.Filterer[source]

Bases: object

Determine if logs should be filtered.

filter(logs)[source]

Return logs that pass filter.

class sawmill.filterer.base.All(filterers=None)[source]

Bases: sawmill.filterer.base.Filterer

Combine filterers and filter logs that don’t pass all filterers.

__init__(filterers=None)[source]

Initialise filterer with initial filterers to combine.

filter(logs)[source]

Return logs that pass all filterers.

Note

If no filterers have been set then all logs are returned.

class sawmill.filterer.base.Any(filterers=None)[source]

Bases: sawmill.filterer.base.Filterer

Combine filterers and filter logs that don’t pass any filterers.

__init__(filterers=None)[source]

Initialise filterer with initial filterers to combine.

filter(logs)[source]

Return logs that pass any of the filterers.

Note

If no filterers have been set then all logs are returned.

sawmill.filterer.item
class sawmill.filterer.item.Item(key, value, mode='include')[source]

Bases: sawmill.filterer.base.Filterer

Filter logs based on key, value item matching.

__init__(key, value, mode='include')[source]

Initialise filterer with key and value to test against.

mode can be either EXCLUDE or INCLUDE. If set to EXCLUDE then any log that has the specific key, value pair will be filtered. Conversely, if set to INCLUDE then any log not matching the key, value pair exactly will be filtered.

filter(logs)[source]

Filter logs based on key, value item matching.

If a log does not have the key to test against and mode is set to INCLUDE it will be filtered. Conversely, if mode is set to EXCLUDE it will not be filtered.

EXCLUDE = 'exclude'
INCLUDE = 'include'
sawmill.filterer.level
class sawmill.filterer.level.Level(min='info', max=None, levels=['debug', 'info', 'warning', 'error'])[source]

Bases: sawmill.filterer.base.Filterer

Filter logs according to defined log level limits.

__init__(min='info', max=None, levels=['debug', 'info', 'warning', 'error'])[source]

Initialise filterer with min and max levels.

The values must be taken from the passed levels array. If the value given is None then it will be as if there is no limit.

A log's level value must fall between the min and max values (inclusive).

filter(logs)[source]

Return logs whose level is between the defined range.

Note

If a log has no level information (or a level not present in the current levels array) it will pass the filter successfully.

sawmill.filterer.pattern
class sawmill.filterer.pattern.Pattern(pattern, key='name', mode='include')[source]

Bases: sawmill.filterer.base.Filterer

Filter logs using pattern matching.

__init__(pattern, key='name', mode='include')[source]

Initialise filterer with pattern and key to test.

If pattern is a string it will be converted to a compiled regular expression instance.

mode can be either ‘exclude’ or ‘include’. If set to ‘exclude’ then any log matching the pattern will be filtered. Conversely, if set to ‘include’ then any log not matching the pattern will be filtered.

EXCLUDE = 'exclude'
INCLUDE = 'include'
filter(logs)[source]

Filter logs based on pattern matching.

If a log does not have the key to test against it will pass the filter successfully. If the key is present, but not a string then the log will be filtered.

sawmill.formatter

sawmill.formatter.base
class sawmill.formatter.base.Formatter[source]

Bases: object

Format logs into data for output.

The format of data returned should conform to a contract with supported handlers. In this way a formatter is tightly bound to a Handler and is really only separated out to increase the possibility of reuse.

format(logs)[source]

Return formatted data representing logs.

The data should be returned as a list, typically one entry per log. Some formatters may choose to combine the passed logs into one formatted datum and should return a list of a single item.

Warning

logs may be shared and should not be altered.

sawmill.formatter.email
sawmill.formatter.email.DEFAULT_TEMPLATE = "\n<html>\n <body>\n <h1>Logs</h1>\n {{#logs}}\n <span class='{{level}}'>\n {{timestamp}}:{{name}}:{{message}}{{#traceback}}<br/>{{.}}{{/traceback}}\n </span><br/>\n {{/logs}}\n </body>\n</html>\n"

Default html email template.

class sawmill.formatter.email.Email(subject, sender, recipients, template="\n<html>\n <body>\n <h1>Logs</h1>\n {{#logs}}\n <span class='{{level}}'>\n {{timestamp}}:{{name}}:{{message}}{{#traceback}}<br/>{{.}}{{/traceback}}\n </span><br/>\n {{/logs}}\n </body>\n</html>\n", batch=True, **kw)[source]

Bases: sawmill.formatter.mustache.Mustache

Format logs to email messages.

__init__(subject, sender, recipients, template="\n<html>\n <body>\n <h1>Logs</h1>\n {{#logs}}\n <span class='{{level}}'>\n {{timestamp}}:{{name}}:{{message}}{{#traceback}}<br/>{{.}}{{/traceback}}\n </span><br/>\n {{/logs}}\n </body>\n</html>\n", batch=True, **kw)[source]

Initialise formatter with subject, sender and recipients.

Each of subject, sender and recipients can be either a static string or a callable which will be passed the log records being handled and should return an appropriate value.

recipients should be (or return) a string of comma separated email addresses.
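
As an illustrative sketch (the addresses are placeholders), recipients could be a callable that adds an extra address whenever any of the handled logs is an error:

    def recipients(logs):
        '''Illustrative: escalate to an extra address when an error is present.'''
        if any(log.get('level') == 'error' for log in logs):
            return 'support@example.com, oncall@example.com'
        return 'support@example.com'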

format(logs)[source]

Return email messages representing logs.

Each message will be an instance of Message. It will be set up as a multipart message containing both html and plain text versions of the logs formatted according to the set template.

sawmill.formatter.field
class sawmill.formatter.field.Field(keys, missing_key='__SKIP__', template='{key}={value}', item_separator=':')[source]

Bases: sawmill.formatter.base.Formatter

Format logs according to item list.

REMAINING = '*'
__init__(keys, missing_key='__SKIP__', template='{key}={value}', item_separator=':')[source]

Initialise formatter with keys to look for in specific order.

keys may contain REMAINING at any point to include all remaining unspecified keys in alphabetical order.

missing_key determines how to handle a missing key when formatting a log. The default SKIP will skip the key and not include it in the resulting output. ERROR will cause a KeyError to be raised. Any other value will be used as the substitute string for the missing value.

template is used to format the key and value of each field and item_separator will separate each item.
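
As an illustrative sketch, the following lists the common fields first, then any remaining keys alphabetically, substituting ‘n/a’ for missing values:

    from sawmill.formatter.field import Field

    formatter = Field(
        ['timestamp', 'name', 'level', 'message', Field.REMAINING],
        missing_key='n/a'
    )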

format(logs)[source]

Return formatted data representing logs.

ERROR = '__ERROR__'
SKIP = '__SKIP__'
sawmill.formatter.mustache
class sawmill.formatter.mustache.Mustache(template, batch=False)[source]

Bases: sawmill.formatter.base.Formatter

Format logs using Mustache template.

__init__(template, batch=False)[source]

Initialise formatter with template.

If batch is False then the template will be applied for each log received. Otherwise all received logs will be passed to template at once under the key ‘logs’.
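
As a brief sketch (the template text is illustrative), a batch template iterates over the ‘logs’ key:

    from sawmill.formatter.mustache import Mustache

    formatter = Mustache(
        '{{#logs}}{{level}}:{{message}}\n{{/logs}}',
        batch=True
    )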

format(logs)[source]

Return formatted data representing logs.

If self.batch is False then each log will be passed to the template separately as the context. Otherwise, all the logs will be passed as the context.

sawmill.formatter.template
class sawmill.formatter.template.Template(template, missing_key='')[source]

Bases: sawmill.formatter.base.Formatter, string.Formatter

Format logs according to a template.

ERROR = '__ERROR__'
__init__(template, missing_key='')[source]

Initialise formatter with template.

missing_key determines how to handle a missing key when formatting a log. If set to ERROR then an error will be raised for any key referenced in the template that is missing from the log. Any other value will be used as the substitute for the missing value. The default is an empty string.

Note

The template is applied once per processed log.

format(logs)[source]

Return formatted data representing logs.

get_field(field_name, args, kwargs)[source]

Convert and return field_name to an object to be formatted.

Note

Based on Python’s string.Formatter.get_field source.

sawmill.handler

sawmill.handler.base
class sawmill.handler.base.Handler(filterer=None, formatter=None)[source]

Bases: object

Handle a Log, outputting it in a relevant way.

__init__(filterer=None, formatter=None)[source]

Initialise handler with filterer and formatter.

If specified, filterer should be an instance of Filterer and will be used to determine whether to handle a log or not.

formatter should be an instance of Formatter and will be called to format the log record into a format suitable for output by this handler. As such a contractual relationship exists between a handler and its formatter.

setup()[source]

Setup handler.

teardown()[source]

Teardown handler.

handle(*logs)[source]

Output formatted logs that pass defined filters.

output(data)[source]

Output formatted data.

data should be a list of entries (typically one per log) to output.

Warning

data may be shared and should not be altered.

sawmill.handler.buffer
class sawmill.handler.buffer.Buffer(handler, trigger, limit=50, *args, **kw)[source]

Bases: sawmill.handler.base.Handler

Buffer log records and distribute to another handler on trigger.

Any log record handled by this handler will first be filtered and formatted by this handler before being added to the buffer.

The wrapped handler will still have opportunity to further filter and format the log records as required.

Note

The data returned by any registered formatter on this handler will be passed directly to the wrapped handler’s handle method. Therefore, it must be a valid Log record.

__init__(handler, trigger, limit=50, *args, **kw)[source]

Initialise handler with wrapped handler and trigger.

trigger should be a callable that accepts the new data being handled and the buffer contents (including the new data). It should return True if the wrapped handler should be called with the buffer else False.

limit will set the maximum size of the buffer. When the buffer size exceeds the limit and new logs are added, logs will be removed from the opposite end.

Note

By default the buffer is cleared after each successful trigger.

output(data)[source]

Output formatted data.

As data is eventually passed directly to other handlers’ handle methods, it should be a list of valid Log records.

reset()[source]

Reset buffer.

sawmill.handler.distribute
class sawmill.handler.distribute.Distribute(handlers=None, *args, **kw)[source]

Bases: sawmill.handler.base.Handler

Distribute log records to other named handlers.

Any log record handled by this handler will first be filtered and formatted by this handler before being passed to the other registered handlers.

The other handlers will still have opportunity to further filter and format the log record as required.

The registered handlers are called in no particular order.

Note

The data returned by any registered formatter on this handler will be passed directly to the registered handlers’ handle methods. Therefore, it must be a valid Log record.

__init__(handlers=None, *args, **kw)[source]

Initialise handler with mapping of handlers to distribute to.

output(data)[source]

Output formatted data.

As data is passed directly to other handlers’ handle methods, it should be a list of valid Log records.

sawmill.handler.email
class sawmill.handler.email.Email(host='localhost', port=25, credentials=None, *args, **kw)[source]

Bases: sawmill.handler.base.Handler

Output log records to email.

__init__(host='localhost', port=25, credentials=None, *args, **kw)[source]

Initialise handler with host and port to connect to SMTP server.

credentials may be supplied as a tuple of (username, password) if the server requires a login.

Note

Messages are expected to be constructed by the formatter, which is where attributes such as sender, recipients, subject and content should be set.

output(data)[source]

Output formatted data.

data should be a list of prepared Message instances that will be sent using the configured SMTP server. In addition each message must specify the keys ‘To’ and ‘From’ which will be used when sending the email.

sawmill.handler.stream
class sawmill.handler.stream.Stream(stream, *args, **kw)[source]

Bases: sawmill.handler.base.Handler

Output log records to stream.

__init__(stream, *args, **kw)[source]

Initialise handler with target stream.

Note

The given stream is not managed by this handler. It is not opened or closed as it may also be in use elsewhere (such as sys.stderr).

teardown()[source]

Teardown handler.

flush()[source]

Explicitly flush the stream if supported.

output(data)[source]

Output formatted data.

data should be a list of objects able to be written to a stream.

sawmill.logger

sawmill.logger.audit
class sawmill.logger.audit.Audit(_handler=<sawmill.handler.distribute.Distribute object>, **kw)[source]

Bases: sawmill.logger.base.Logger

Add timestamp and username information automatically.

prepare(*args, **kw)[source]

Prepare and return a log for emission.

sawmill.logger.base
class sawmill.logger.base.Logger(_handler=<sawmill.handler.distribute.Distribute object>, **kw)[source]

Bases: sawmill.log.Log

Helper for emitting logs.

A logger can be used to preset common information (such as a name) and then emit Log records with that information already present.

__init__(_handler=<sawmill.handler.distribute.Distribute object>, **kw)[source]

Initialise logger.

If you need to override the default handler then pass in a custom _handler.

prepare(*args, **kw)[source]

Prepare and return a log for emission.

kw arguments are automatically mixed into a Log record made by copying this logger.

log(*args, **kw)[source]

Emit a Log record.

sawmill.logger.classic
class sawmill.logger.classic.Classic(name, **kw)[source]

Bases: sawmill.logger.dynamic.Dynamic, sawmill.logger.traceback.Traceback, sawmill.logger.audit.Audit

Classic logger compatible with standard Python logger.

__init__(name, **kw)[source]

Initialise logger with identifying name.

prepare(message, **kw)[source]

Prepare and return a log for emission.

log(message, **kw)[source]

Log a message with additional kw arguments.

debug(message, **kw)[source]

Log a debug level message.

info(message, **kw)[source]

Log an info level message.

warning(message, **kw)[source]

Log a warning level message.

error(message, **kw)[source]

Log an error level message.

sawmill.logger.dynamic
class sawmill.logger.dynamic.Dynamic(_handler=<sawmill.handler.distribute.Distribute object>, **kw)[source]

Bases: sawmill.logger.base.Logger

Dynamic logger allowing delayed computation of values.

sawmill.logger.traceback
class sawmill.logger.traceback.Traceback(_handler=<sawmill.handler.distribute.Distribute object>, **kw)[source]

Bases: sawmill.logger.base.Logger

Support extracting traceback information automatically.

prepare(*args, **kw)[source]

Prepare and return a log for emission.

If ‘traceback’ is present in the kw arguments and set to the value True then attempt to extract the current traceback information and set it as the real value.

Warning

If no traceback could be extracted the traceback value will be the string ‘None’.
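
As an illustrative sketch, traceback capture can be requested through the Classic logger (which mixes in Traceback):

>>> from sawmill.logger.classic import Classic
>>> logger = Classic('my.logger')
>>> try:
...     1 / 0
... except ZeroDivisionError:
...     logger.error('Something went wrong.', traceback=True)
...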

sawmill.compatibility

sawmill.compatibility.redirect_standard_library_logging()[source]

Redirect standard library logging to Sawmill.

Replace any existing handlers on the standard library root logger with a single RedirectToSawmillHandler. Handlers on other loggers are left untouched.

In addition, unset any level filter on the standard library root logger to ensure logs are passed through.
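
As a minimal sketch, assuming Sawmill handlers have already been configured:

>>> import logging
>>> import sawmill.compatibility
>>> sawmill.compatibility.redirect_standard_library_logging()
>>> logging.getLogger('thirdparty').warning('Now handled by Sawmill.')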

class sawmill.compatibility.RedirectToSawmillHandler(target=<sawmill.handler.distribute.Distribute object>, *args, **kwargs)[source]

Bases: logging.Handler

Redirect to Sawmill.

__init__(target=<sawmill.handler.distribute.Distribute object>, *args, **kwargs)[source]

Initialise handler.

target should be the Sawmill handler to redirect to.

emit(record)[source]

Emit record to Sawmill.

Note

It is the responsibility of the attached formatter to convert the record to a Sawmill Log instance.

class sawmill.compatibility.SawmillFormatter[source]

Bases: logging.Formatter

Format records for use by Sawmill.

__init__()[source]

Initialise formatter.

format(record)[source]

Return Sawmill Log instance from record.

sawmill.log

class sawmill.log.Log(*args, **kw)[source]

Bases: _abcoll.MutableMapping, dict

Hold individual log data.

__init__(*args, **kw)[source]

Initialise log.

clone()[source]

Return a clone of this log.

This is a mixture of shallow and deep copying: the log instance and its attributes are shallow copied, but the actual mapping (the items) is deep copied.

Release and migration notes

Find out what has changed between versions and see important migration notes to be aware of when switching to a new version.

Release Notes

0.2.1

8 November 2016

0.2.0

11 July 2016
  • new

    Added compatibility helpers for redirecting standard logging to Sawmill.

  • changed

    Included configurators now redirect standard library logging to Sawmill. This can be turned off by passing redirect_standard_logging=False to the configurator.

0.1.1

8 June 2016
  • fixed

    Exceptions raised on teardown() of Stream handler if underlying stream was already closed as part of cleanup process.

  • changed

    documentation: Simplified documentation structure.

  • fixed

    documentation: Added missing Installing section.

  • fixed

    documentation: Fixed broken documentation references.

  • changed

    Log.__repr__ updated to return useful and accurate representation of Log instances.

  • fixed

    Refactored test_stream:test_auto_flush_on_exit that caused incorrect code coverage results to be reported.

0.1.0

25 May 2016
  • new

    Initial release for evaluation.

Migration notes

This section will show more detailed information when relevant for switching to a new version, such as when upgrading involves backwards incompatibilities.

Glossary

Virtualenv

A tool to create isolated Python environments.
