"""
log2bq

    Storing logs in BigQuery
    ========================

    log2bq can be used to store your logs in BigQuery so that you an easily analyze
    them. To achieve this, an hourly cron handler is set up in your project to hit
    a log2bq handler. log2bq then pushes the past hour's worth of logs into
    Cloud Storage and from there into BigQuery, finally cleaning up the Cloud
    Storage file. log2bq creates a table named log2bq_20130323 where the logs
    in that table are for that day. Note that you can query multiple tables
    in BigQuery by specifying a comma-separated list of tables in your FROM
    clause.

    To add the cron, add something like the following to your cron.yaml:

        cron:
            ...
            - description: Upload logs to BigQuery
              url: /log2bq/cron/hourly/
              schedule: every 1 hours synchronized

    You may use any route you wish, because you also have to set up your own route,
    probably in your ROUTES (e.g., route.py, or url.py):

        ROUTES = [
            ...
            Route('/log2bq/cron/hourly/', handler='log2bq.bq.HourlyCron'),
            ...
        ]

    log2bq will drive the logs into files on the Cloud Storage bucket specified
    by the GS_BUCKET_NAME parameter. These files will be all prefixed by the
    GS_FILE_PREFIX parameter. When the files are ingested into BigQuery the
    resulting tables will start with the BQ_TABLE_PREFIX parameter and be
    placed in the BQ_DATASET dataset. The entire process uses the
    BQ_PROJECT_ID parameter to identify the Google API Project to use (i.e.,
    this is what controls access to Cloud Storage and BigQuery) and runs
    in the queue identified by the QUEUE parameter.

    For this process to work, you must have the correct permissions
    configured. First, in Google API Project, you must have Cloud Storage,
    and BigQuery enabled. Your application's service account (see Application
    Settings tab) must have "Can edit" access on the Team tab. The
    Google API console may require an @your-domain.com account (which the service
    account is not), so you can set up an @your-domain.com group in Google
    Apps and add the service account to that group - yes, this works!
    Finally, your application's service account needs read-write access
    to the configured Cloud Storage bucket; see gutil for details on how to
    set this up:

        https://developers.google.com/storage/docs/gsutil

    To set configurations, you simply need to add the configuration parameter,
    prefixed by "log2bq_" to your appengine_config.py file. For example,
    to set an alternate queue and an alternate BigQuery table prefix,
    you would add the following lines to your appengine_config.py file:

        log2bq_QUEUE = 'my-queue'
        log2bq_BQ_TABLE_PREFIX = 'my_prefix_' # underscores for BigQuery names

    log2bq will also drop old tables from BigQuery so that you don't have to worry
    about large numbers of tables hanging around. It does this by comparing all
    tables in your dataset that start with "log2bq_". The comparison is a simple
    lexographic string comparison, so be careful to never manually name tables
    starting with "log2bq_" (this default value can be changed using the
    BQ_TABLE_PREFIX parameter). The number of days worth of logs to keep in
    BigQuery is set by the SCRUB_AGE parameter.

    You can also add a list of tags to your log message. To do so, provide a
    function that takes a google.appengine.api.logservice.RequestLog as a parameter.
    This function must return a list of strings (can be an empty list). For
    example, you might have the following in your appengine_config.py:

        from google.appengine.api import logservice

        def generate_tags(request_log):
            tags = set([])
            for app_log in request_log.app_logs:
                if app_log.level = logservice.LOG_LEVEL_CRITICAL:
                    tags.add('CRITICAL')
            return list(tags)

        log2bq_generate_tags = generate_tags

        # NOTE: if you want to look at request_log.app_logs, make sure you set
        # INCLUDE_APP_LOGS to True in your configuration:

        log2bq_INCLUDE_APP_LOGS = True


Release Notes
=============
v1.0.0
- initial release
"""

__version__ = '1.0.0'
