Sending logs from AWS S3

About the script

This is a Python script used to send any kind of log file to Devo from AWS Lambda.

The script comes bundled with a config.ini file for its configuration. The key variables in the config.ini file are: 

  • There are three certificate variables used to authenticate the destination account.
    • client_key - The Private Key.
    • client_crt - The X.509 Certificate.
    • chain - The Chain CA.
  • relay_add - The name of the Event Load Balancer to which the log files will be sent.
  • relay_port - The port of the Event Load Balancer to which the log files will be sent.
  • tag - The tag to be applied to the events sent to Devo.
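A config.ini populated with these variables might look like the following. This is an illustrative sketch only: the section name, certificate file names, relay address, port, and tag are assumptions, not values from the bundled file — use the values from your own Devo account and downloaded credentials.

```ini
; config.ini — illustrative values only (section name, file names,
; relay address, port, and tag are assumptions)
[config]
client_key = client.key
client_crt = client.crt
chain = chain.crt
relay_add = your-relay.example.com
relay_port = 443
tag = web.aws.elb.eu-west-1.pro-frontend-elb
```
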

Obtain certificates

In Devo, go to Administration → Credentials → X.509 Certificates and download the X.509 Certificate, Private Key, and Chain CA.

Define the tags

To classify the data sent to Devo, unique tags are added to the syslog events. Tags have a hierarchical structure consisting of a sequence of named elements separated by periods. See Supported technologies for more information.
The following table describes the structure recommended for AWS web technologies.

Technology               Tag
Cloud Front              <region>.<instance>
Elastic Load Balancing   <region>.<instance>

For example, in an Elastic Load Balancing tag, the <region> element identifies the region where the server is located, and an <instance> element such as pro-frontend-elb identifies the server.
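The period-separated hierarchy described above can be sketched as a simple join of elements. The base elements ("web", "aws", "elb") and the region value used here are assumptions for illustration; only the <region>.<instance> suffix comes from the table above.

```python
def build_tag(*elements):
    # Devo tags are period-separated hierarchies; each element names
    # one level of the hierarchy (technology, region, instance, ...).
    return ".".join(elements)

# Hypothetical Elastic Load Balancing tag: "web.aws.elb" is an assumed
# base; "eu-west-1" and "pro-frontend-elb" fill the <region>.<instance>
# placeholders from the table above.
tag = build_tag("web", "aws", "elb", "eu-west-1", "pro-frontend-elb")
```
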

Deploy the script on AWS Lambda

  1. Download the lt_aws_s3_lambda package.

  2. Create a new AWS Lambda function in the same zone in which the S3 bucket resides.

  3. Double-click to select the s3-get-object-python blueprint. The Configure triggers page opens.

  4. Select the bucket to which you will save the log information and select Object Created (all) as the Event type. Select Enable trigger to start sending logs immediately once the Lambda function has been created. Click the Next button. The Configure function page opens.

  5. Edit the script and create a .ZIP file:
    • Copy config.ini.example to config.ini and edit its contents accordingly.
    • Create a .ZIP file containing six files: the script lt_s3_lambda.py, the configuration file config.ini, lt_engine.py, and the three certificate files you downloaded (certificate, key, and chain).

      Make sure all the files are in the root of the ZIP archive, not in a folder.

  6. Enter a Name and a Description for your new function. Select Python 2.7 for Runtime. Select Upload a .ZIP file for Code entry type and upload the .ZIP file you created in the previous step.
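The ZIP-packaging step can be sketched in Python as follows. File names other than config.ini and lt_engine.py are assumptions; the key point is that arcname keeps every file at the archive root, as required above.

```python
# Sketch: bundle the six files at the root of the ZIP archive.
import zipfile

FILES = [
    "lt_s3_lambda.py",  # Lambda entry point (see handler lt_s3_lambda.lambda_handler)
    "lt_engine.py",     # sending engine bundled with the package
    "config.ini",       # edited copy of config.ini.example
    "client.crt",       # X.509 certificate (file name is an assumption)
    "client.key",       # private key (file name is an assumption)
    "chain.crt",        # chain CA (file name is an assumption)
]

def build_zip(out_path="lt_aws_s3_lambda.zip", files=FILES):
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name in files:
            # arcname=name places each file at the root of the archive,
            # not inside a folder, which is what Lambda expects when it
            # looks up the handler module.
            zf.write(name, arcname=name)
    return out_path
```
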
  7. Lambda function handler and role:
    • Enter lt_s3_lambda.lambda_handler as the Handler.
    • Enter a Role name for a new S3 execution role. Note that your IAM user must have permissions to create and assign new roles. Select S3 object read-only permissions for Policy templates.
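For orientation, a handler like the one named above receives S3 object-created notifications in the standard event format. The sketch below is not the bundled script — it only shows the event shape — and it is written for Python 3 for illustration, although the package itself targets the Python 2.7 runtime.

```python
# Sketch of a Lambda handler for S3 object-created events
# (illustrative only; the bundled lt_s3_lambda.py differs).
import urllib.parse

def lambda_handler(event, context):
    # Each record describes one object-created event in the bucket.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded (e.g. spaces as '+').
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        print("New log object: s3://%s/%s" % (bucket, key))
        # The bundled script would download this object and forward its
        # contents to the relay defined in config.ini under the given tag.
    return {"processed": len(event["Records"])}
```
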

  8. In the Advanced settings section, select the maximum value (1536 MB) for Memory. Set the Timeout to a value close to, but less than, the log file creation frequency. For example, if log files are created every 5 minutes, set the Timeout to 4 minutes and 30 seconds. Select No VPC for the VPC value.

  9. Review the Lambda function configuration. If it is correct, click the Create function button.

The logs become visible in your Devo account as new files are created in the S3 bucket.

  • Note that, due to the nature of services logging to S3, delivery is not in real time. Logs are only ingested when a file is written to the bucket.
  • Take this timestamp difference into consideration when searching logs and when setting the write frequency.
