In a previous article, we went through the steps to set up your AWS environment to export EC2 logs to CloudWatch. In this article, we will build on that and automate exporting CloudWatch Logs to S3. To accomplish this, we will use the following AWS services:
- S3 (Bucket & Bucket Policy)
- IAM (Lambda Role)
- CloudWatch (Events Rules)
- Lambda (Functions)
S3 (Bucket & Bucket Policy)
First, you will need to create the S3 bucket to export the logs to. This AWS Documentation will walk you through creating an S3 bucket. Once the bucket is created, you will need to update its Bucket Policy with the following:
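The policy below is a minimal sketch based on the documented pattern for exporting CloudWatch Logs to S3; `my-log-bucket` and `us-east-1` are placeholders, so replace them with your bucket name and Region:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": { "Service": "logs.us-east-1.amazonaws.com" },
            "Action": "s3:GetBucketAcl",
            "Resource": "arn:aws:s3:::my-log-bucket"
        },
        {
            "Effect": "Allow",
            "Principal": { "Service": "logs.us-east-1.amazonaws.com" },
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::my-log-bucket/*",
            "Condition": {
                "StringEquals": { "s3:x-amz-acl": "bucket-owner-full-control" }
            }
        }
    ]
}
```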
This policy will allow CloudWatch Logs to PUT (WRITE access) objects to the bucket; the GetBucketAcl statement is also needed so the export task can verify access to the bucket.
IAM Role
The AWS Lambda service will require permissions to log events and write to the S3 bucket we created. We will need to create the IAM Role ‘Export-EC2-CloudWatch-Logs-To-S3-Lambda-Role’ with the following policies attached to it:
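- AmazonS3FullAccess
- CloudWatchLogsFullAccess
- CloudWatchEventsFullAccess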
Note: It is best practice to grant Lambda the least privilege required to perform its routine tasks. Since this is just a walkthrough on how to automate the task, I chose the AWS managed policies above, granting the Lambda IAM Role full access to S3, CloudWatch Logs, and CloudWatch Events.
Configure the Lambda Function
With Lambda you don’t have to provision or manage servers. You can simply paste your code and it will run when triggered. In this section of the article, we will set up the Lambda Function.
Navigate to Lambda >> Functions >> Create Function. From there, you will need to perform the following:
- Choose ‘Author from Scratch’.
- Function Name: Export-EC2-CloudWatch-Logs-To-S3
- Runtime: Python 3.x
- Under Permissions, you can either create a new IAM role or use an existing one. Select ‘Use an existing role’ and choose the IAM Role we created earlier. Now go ahead and click on Create Function.
Under the Function Code section, you will need to paste the export script.
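Here is a minimal sketch of what that script can look like, using boto3’s create_export_task to export a one-day window of the log group to the bucket. The exact windowing logic around NDAYS is my assumption; adjust it to fit your retention needs:

```python
import datetime
import os

import boto3

logs = boto3.client('logs')


def lambda_handler(event, context):
    group_name = os.environ['GROUP_NAME']
    destination_bucket = os.environ['DESTINATION_BUCKET']
    prefix = os.environ['PREFIX']
    # NDAYS controls how far back the one-day export window starts.
    # NDAYS=1 exports yesterday's logs (midnight to midnight, UTC).
    ndays = int(os.environ['NDAYS'])

    today = datetime.datetime.now(datetime.timezone.utc).replace(
        hour=0, minute=0, second=0, microsecond=0
    )
    start = today - datetime.timedelta(days=ndays)
    end = start + datetime.timedelta(days=1)

    # CloudWatch Logs expects timestamps in milliseconds since the epoch.
    response = logs.create_export_task(
        logGroupName=group_name,
        fromTime=int(start.timestamp() * 1000),
        to=int(end.timestamp() * 1000),
        destination=destination_bucket,
        destinationPrefix=prefix,
    )
    return response['taskId']
```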
Under the Environment Variables section, you will need to add the Keys (case-sensitive) and their values.
- DESTINATION_BUCKET
- GROUP_NAME
- PREFIX
- NDAYS (review the documentation I left in the Python script)
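As a hypothetical example, the values could look like this (placeholders matching the sketch above; GROUP_NAME should be the log group your EC2 logs are exported to):

```
DESTINATION_BUCKET = my-log-bucket
GROUP_NAME         = /var/log/messages
PREFIX             = exported-logs
NDAYS              = 1
```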
You may need to adjust the ‘Timeout’ duration under the ‘Basic Settings’ section to more than the default 3 seconds. Keep in mind that changing the timeout may impact your function’s cost. [Source]
CloudWatch Events Rules
In order to schedule the Lambda Function to run or trigger at a specific time, you will need to create and configure a CloudWatch Events Rule. To create a new rule, you will need to navigate to CloudWatch >> Rules >> Create Rule.
To create an event rule, you will need to customize the event pattern or set a schedule and specify the target(s). If the Lambda function is already created, you can select the function as the target while creating the Event Rule. This will automatically add the trigger to the function.
Keep in mind that all scheduled events use the UTC time zone. In this example, I will trigger the Lambda Function at 00:00 Central Time (CDT) every day, which translates to 5 AM UTC. You can customize an event pattern or set a schedule to invoke a target. [Source]
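For that daily 5 AM UTC trigger, the schedule expression would look like the one below (AWS cron expressions have six fields: minutes, hours, day-of-month, month, day-of-week, year):

```
cron(0 5 * * ? *)
```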
In order to add the trigger manually (if you have not done so while creating the Event Rule), you need to navigate to the Lambda Function you created and click on Add trigger.
You can now configure the trigger as can be seen in the screenshot below:
That’s it! You have now set up your environment to export EC2 logs to CloudWatch and then to S3.
If this article has helped you solve a problem, please consider sharing it and following me on Medium as I will be posting more articles in the future. Also, feel free to connect with me on LinkedIn.
Stay safe and thanks for coming by.