Day 7: AWS Cloud Automation with Python & Boto3


Learning points:

🔹 Cloud Automation with Boto3 – Managing AWS services (EC2, S3, IAM, Lambda, RDS) using Python scripts.
🔹 Automating Cloud Monitoring & Logging – Writing Python scripts for log parsing, system monitoring, and alerting.
🔹 Security & Compliance – Automating security audits, IAM role management, and compliance checks.

LEARN:

Python with Boto3 Tutorial by Sandip Das

Initial Tasks:

✅ Task 1: Install boto3 (pip install boto3) and configure AWS credentials (aws configure); a quick verification sketch follows.
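
A simple way to confirm that boto3 picks up the configured credentials is to call STS and print the identity in use. A minimal sketch:

import boto3

# Ask STS which identity the configured credentials resolve to
sts = boto3.client("sts")
identity = sts.get_caller_identity()
print(f"Authenticated as {identity['Arn']} in account {identity['Account']}")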


Challenge 1: Write a Python script that provisions an EC2 instance, a new security group, and a key pair. After creation, the same script should connect to the instance over SSH to verify that everything is working. (The key pair should be generated by the script and used for the SSH connection.)

Goal: Automate EC2 instance provisioning with security group and key pair creation. Verify connectivity via SSH.

Prerequisites:

Step 1: Install Required Dependencies

pip install boto3 paramiko
  • boto3: AWS SDK for provisioning resources.

  • paramiko: SSH library for connecting to EC2.

Step 2: Create the Python Script (provision_ec2.py)

import boto3
import paramiko
import time
import os

# Initialize Boto3 client
ec2 = boto3.client("ec2")

# Create key pair
key_name = "ec2-key-pair"
key_pair = ec2.create_key_pair(KeyName=key_name)
private_key = key_pair['KeyMaterial']

# Save the key pair
with open(f"{key_name}.pem", "w") as key_file:
    key_file.write(private_key)
os.chmod(f"{key_name}.pem", 0o400)

# Create security group
security_group = ec2.create_security_group(
    GroupName="ec2-security-group", Description="Allow SSH access"
)
security_group_id = security_group['GroupId']

ec2.authorize_security_group_ingress(
    GroupId=security_group_id,
    IpPermissions=[
        {
            'IpProtocol': 'tcp',
            'FromPort': 22,
            'ToPort': 22,
            'IpRanges': [{'CidrIp': '0.0.0.0/0'}]
        }
    ]
)

# Launch EC2 instance
instance = ec2.run_instances(
    ImageId="ami-0c55b159cbfafe1f0",  # Update with a valid AMI
    InstanceType="t2.micro",
    KeyName=key_name,
    SecurityGroupIds=[security_group_id],
    MinCount=1,
    MaxCount=1
)
instance_id = instance["Instances"][0]["InstanceId"]

# Wait for instance to be running
print("Waiting for instance to launch...")
boto3.resource("ec2").Instance(instance_id).wait_until_running()
instance_info = ec2.describe_instances(InstanceIds=[instance_id])
public_ip = instance_info["Reservations"][0]["Instances"][0]["PublicIpAddress"]

print(f"EC2 Instance launched successfully with Public IP: {public_ip}")

# Connect via SSH using Paramiko
time.sleep(60)  # Wait for SSH service to be available
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(public_ip, username="ec2-user", key_filename=f"{key_name}.pem")  # "ec2-user" is the default user on Amazon Linux AMIs

stdin, stdout, stderr = ssh.exec_command("echo Connection Successful")
print(stdout.read().decode())

ssh.close()

Explanation:

  • Key Pair Creation: Securely creates and saves an SSH key pair.

  • Security Group Configuration: Allows SSH access to the instance.

  • EC2 Instance Launch: Uses an AMI and instance type to provision an EC2 instance.

  • SSH Connection Verification: Uses Paramiko to ensure successful connectivity.

Best Practices:

  • Use IAM roles instead of hardcoded AWS credentials.

  • Restrict SSH access to specific IPs instead of 0.0.0.0/0 (see the sketch after this list).

  • Store key pairs securely (e.g., AWS Secrets Manager).
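
For the second best practice, the ingress rule from the provisioning script can be narrowed to a single CIDR. A minimal sketch; the security group ID and the 203.0.113.10/32 address are placeholders to replace with your own values:

import boto3

ec2 = boto3.client("ec2")
security_group_id = "sg-0123456789abcdef0"  # placeholder: your security group ID

# Allow SSH only from a single admin IP instead of the whole internet
ec2.authorize_security_group_ingress(
    GroupId=security_group_id,
    IpPermissions=[
        {
            "IpProtocol": "tcp",
            "FromPort": 22,
            "ToPort": 22,
            "IpRanges": [{"CidrIp": "203.0.113.10/32", "Description": "Admin SSH only"}]
        }
    ]
)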


Challenge 2: Automate S3 lifecycle policies using boto3 (e.g., move files to Glacier after 30 days).

Goal: Create an S3 lifecycle rule that moves objects to Glacier after 30 days.

Prerequisites:

pip install boto3

Python Script (s3_lifecycle.py)

import boto3

s3 = boto3.client("s3")
bucket_name = "my-bucket"

# Define lifecycle rule
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "MoveToGlacier",
            "Prefix": "",
            "Status": "Enabled",
            "Transitions": [
                {
                    "Days": 30,
                    "StorageClass": "GLACIER"
                }
            ]
        }
    ]
}

# Apply policy
s3.put_bucket_lifecycle_configuration(
    Bucket=bucket_name,
    LifecycleConfiguration=lifecycle_configuration
)
print("Lifecycle policy applied successfully.")

Explanation:

  • Configures a lifecycle policy to transition objects after 30 days.

  • Uses put_bucket_lifecycle_configuration to apply the rule.

Best Practices:

  • Set up expiration policies for objects in Glacier to avoid long-term costs (see the sketch after this list).

  • Use versioning in case objects need to be restored.
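
A minimal sketch covering both points: it enables versioning and extends the lifecycle rule with an Expiration action so Glacier objects are eventually deleted. The bucket name and the 365-day expiry are placeholder values:

import boto3

s3 = boto3.client("s3")
bucket_name = "my-bucket"

# Enable versioning so deleted or overwritten objects can still be restored
s3.put_bucket_versioning(
    Bucket=bucket_name,
    VersioningConfiguration={"Status": "Enabled"}
)

# Transition to Glacier after 30 days, then expire objects after 365 days
s3.put_bucket_lifecycle_configuration(
    Bucket=bucket_name,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "MoveToGlacierThenExpire",
                "Filter": {"Prefix": ""},
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 365}
            }
        ]
    }
)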


Challenge 3: Create a script that starts or stops all EC2 instances in a specific AWS region.

Goal: Automate starting or stopping all EC2 instances in a specific region.

Python Script (ec2_start_stop.py)

import boto3

region = "us-east-1"
ec2 = boto3.client("ec2", region_name=region)

# Get all instances
instances = ec2.describe_instances()
instance_ids = [i["InstanceId"] for r in instances["Reservations"] for i in r["Instances"]]

def control_instances(action):
    if action == "start":
        ec2.start_instances(InstanceIds=instance_ids)
        print("Starting instances...")
    elif action == "stop":
        ec2.stop_instances(InstanceIds=instance_ids)
        print("Stopping instances...")

# Call function
control_instances("stop")  # Change to "start" to start instances

Explanation:

  • describe_instances() fetches all EC2 instances in the region.

  • start_instances() / stop_instances() controls instances in bulk.

Best Practices:

  • Implement scheduled automation using AWS Lambda and CloudWatch Events.

  • Use tags to filter and manage instances efficiently (see the sketch below).
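
A minimal sketch of tag-based filtering, assuming instances carry a hypothetical Environment=dev tag; it stops only running instances that match:

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Only act on running instances carrying the (hypothetical) Environment=dev tag
response = ec2.describe_instances(
    Filters=[
        {"Name": "tag:Environment", "Values": ["dev"]},
        {"Name": "instance-state-name", "Values": ["running"]}
    ]
)
instance_ids = [
    i["InstanceId"]
    for r in response["Reservations"]
    for i in r["Instances"]
]

if instance_ids:
    ec2.stop_instances(InstanceIds=instance_ids)
    print(f"Stopping tagged instances: {instance_ids}")
else:
    print("No matching instances found.")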


Challenge 4: Write a Python program that checks for unused IAM users and disables them.

Goal: Identify IAM users who haven't logged in for a specified period and disable them.

Prerequisites:

pip install boto3

Python Script (disable_unused_iam_users.py):

import boto3
from datetime import datetime, timedelta, timezone
from botocore.exceptions import ClientError

# Initialize IAM client
iam = boto3.client("iam")

# Define inactivity period (e.g., 90 days)
inactive_days = 90
time_threshold = datetime.now(timezone.utc) - timedelta(days=inactive_days)

# Get all IAM users
users = iam.list_users()["Users"]

for user in users:
    username = user["UserName"]
    # PasswordLastUsed is only present if the user has signed in to the console;
    # fall back to the account creation date for users who never logged in.
    last_activity = user.get("PasswordLastUsed", user["CreateDate"])

    if last_activity < time_threshold:
        print(f"Disabling user: {username}")
        try:
            # Remove console access
            iam.delete_login_profile(UserName=username)
        except ClientError as e:
            # Users without a console password have no login profile
            if e.response["Error"]["Code"] != "NoSuchEntity":
                raise
        # Mark the user as disabled by moving it to a /disabled/ path
        iam.update_user(UserName=username, NewPath="/disabled/")

Explanation:

  • Fetches IAM users.

  • Checks last activity and disables unused accounts.

  • Moves inactive users to a /disabled/ path.

Best Practices:

  • Set up monitoring for IAM activity (a credential-report sketch follows this list).

  • Use AWS IAM policies to enforce least privilege.
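
One way to monitor IAM activity is the IAM credential report, which lists password and access-key usage per user. A minimal sketch (the columns shown are part of the standard credential report CSV):

import boto3
import csv
import io
import time

iam = boto3.client("iam")

# Request a fresh credential report and wait until AWS finishes generating it
while iam.generate_credential_report()["State"] != "COMPLETE":
    time.sleep(2)

report_csv = iam.get_credential_report()["Content"].decode("utf-8")

# One row per user, including password and access-key last-used timestamps
for row in csv.DictReader(io.StringIO(report_csv)):
    print(row["user"], row["password_last_used"], row["access_key_1_last_used_date"])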


Challenge 5: Implement a log monitoring system that scans EC2 instances' /var/log for error messages and sends alerts via email (AWS SES) and Slack.

Goal: Monitor /var/log for errors and send alerts via AWS SES and Slack.

Prerequisites:

pip install boto3 slack_sdk

Python Script (ec2_log_monitor.py):

import boto3
import os
from slack_sdk import WebClient

# This script is meant to run on the EC2 instance itself (e.g., via cron),
# scanning the local /var/log directory for errors.

# AWS and Slack configuration
region = "us-east-1"
ses = boto3.client("ses", region_name=region)

# Define email and Slack details
slack_token = "xoxb-your-slack-token"
slack_channel = "#alerts"
ses_sender = "alert@example.com"      # Must be a verified SES identity
ses_recipient = "oncall@example.com"  # Must also be verified while SES is in sandbox mode

# Function to scan logs and send alerts
def scan_logs():
    for file in os.listdir("/var/log"):
        if file.endswith(".log"):
            with open(f"/var/log/{file}") as log_file:
                for line in log_file:
                    if "ERROR" in line:
                        send_alert(f"{file}: {line.strip()}")

# Send alerts via Slack and SES
def send_alert(message):
    client = WebClient(token=slack_token)
    client.chat_postMessage(channel=slack_channel, text=message)

    ses.send_email(
        Source=ses_sender,
        Destination={"ToAddresses": [ses_recipient]},
        Message={
            "Subject": {"Data": "EC2 log alert"},
            "Body": {"Text": {"Data": message}}
        }
    )

scan_logs()

Explanation:

  • Scans logs for errors.

  • Sends alerts to Slack and email.

Best Practices:

  • Use AWS CloudWatch for log aggregation (see the sketch after this list).

  • Implement IAM roles for secure API access.
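
With logs shipped to CloudWatch Logs (for example via the CloudWatch agent), the same error search can run centrally instead of on each instance. A minimal sketch; the log group name /ec2/var/log/syslog is a placeholder:

import boto3
import time

logs = boto3.client("logs", region_name="us-east-1")

# Search the last hour of a log group for events containing "ERROR"
end = int(time.time() * 1000)
start = end - 3600 * 1000

response = logs.filter_log_events(
    logGroupName="/ec2/var/log/syslog",  # placeholder log group name
    startTime=start,
    endTime=end,
    filterPattern="ERROR"
)

for event in response["events"]:
    print(event["message"])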


Challenge 6: Automate DNS record updates in AWS Route 53 using Python.

Goal: Automatically update DNS records in Route 53.

Prerequisites:

pip install boto3

Python Script (route53_update.py):

import boto3

domain = "example.com"
hosted_zone_id = "ZXXXXXXXXXXXX"
ip_address = "192.168.1.1"

route53 = boto3.client("route53")

# Update DNS record
response = route53.change_resource_record_sets(
    HostedZoneId=hosted_zone_id,
    ChangeBatch={
        "Changes": [
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": f"subdomain.{domain}",
                    "Type": "A",
                    "TTL": 300,
                    "ResourceRecords": [{"Value": ip_address}]
                }
            }
        ]
    }
)

print("DNS record updated successfully.")

Explanation:

  • Uses the Route 53 API to UPSERT the A record (a propagation-check sketch follows the best practices below).

Best Practices:

  • Use IAM roles for security.

  • Implement TTL values based on traffic needs.
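
change_resource_record_sets returns as soon as Route 53 accepts the change, not when it has propagated. If the script needs to wait for the record to become INSYNC, boto3 provides a waiter for this; a minimal sketch, meant to be appended to route53_update.py (it reuses the response variable from that script):

import boto3

route53 = boto3.client("route53")

# `response` comes from the change_resource_record_sets call in route53_update.py
change_id = response["ChangeInfo"]["Id"]

# Poll GetChange until the status flips from PENDING to INSYNC
waiter = route53.get_waiter("resource_record_sets_changed")
waiter.wait(Id=change_id)
print("DNS change has propagated (INSYNC).")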


Challenge 7: Write a script that triggers an AWS Lambda function using boto3.

Goal: Invoke an AWS Lambda function programmatically.

Prerequisites:

pip install boto3

Python Script (lambda_trigger.py):

import boto3
import json

# AWS Configuration
AWS_REGION = "us-east-1"  # Change as needed
LAMBDA_FUNCTION_NAME = "my_lambda_function"  # Replace with your Lambda function name

# Initialize the AWS Lambda client
lambda_client = boto3.client("lambda", region_name=AWS_REGION)

def invoke_lambda(payload=None):
    """Invokes the AWS Lambda function with an optional JSON payload."""
    print(f"🚀 Triggering Lambda function: {LAMBDA_FUNCTION_NAME}")

    try:
        response = lambda_client.invoke(
            FunctionName=LAMBDA_FUNCTION_NAME,
            InvocationType="RequestResponse",  # Use "Event" for async invocation
            Payload=json.dumps(payload or {})
        )

        # Read response
        response_payload = json.loads(response["Payload"].read().decode())
        print(f"✅ Lambda response: {response_payload}")

    except Exception as e:
        print(f"❌ Failed to invoke Lambda: {e}")

if __name__ == "__main__":
    # Example payload
    payload = {"message": "Hello from Python!"}

    invoke_lambda(payload)

Explanation:

  • Invokes the Lambda function synchronously (InvocationType="RequestResponse"); use "Event" for asynchronous invocation.

Best Practices:

  • Use AWS EventBridge for scheduled triggers (see the sketch after this list).

  • Implement error handling.
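
A minimal sketch of the EventBridge best practice: it creates an hourly schedule rule, grants EventBridge permission to invoke the function, and wires the rule to the Lambda target. The function ARN, rule name, and statement ID are placeholders:

import boto3

region = "us-east-1"
events = boto3.client("events", region_name=region)
lambda_client = boto3.client("lambda", region_name=region)

# Placeholders - replace with your function's ARN and preferred names
function_arn = "arn:aws:lambda:us-east-1:123456789012:function:my_lambda_function"
rule_name = "invoke-my-lambda-hourly"

# Create (or update) a scheduled rule that fires every hour
rule_arn = events.put_rule(
    Name=rule_name,
    ScheduleExpression="rate(1 hour)",
    State="ENABLED"
)["RuleArn"]

# Allow EventBridge to invoke the function (fails if the statement ID already exists)
lambda_client.add_permission(
    FunctionName="my_lambda_function",
    StatementId="eventbridge-hourly-invoke",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule_arn
)

# Point the rule at the Lambda function
events.put_targets(
    Rule=rule_name,
    Targets=[{"Id": "my-lambda-target", "Arn": function_arn}]
)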


Challenge 8: Fetch AWS Billing Data and Generate Cost Report

Goal: Fetch AWS cost data and generate a PDF report.

Prerequisites:

pip install boto3 reportlab

Python Script (billing_report.py):

import boto3
from reportlab.pdfgen import canvas

cost_client = boto3.client("ce")
response = cost_client.get_cost_and_usage(
    TimePeriod={"Start": "2024-03-01", "End": "2024-04-01"},  # End date is exclusive
    Granularity="MONTHLY",
    Metrics=["BlendedCost"]
)

total_cost = response["ResultsByTime"][0]["Total"]["BlendedCost"]["Amount"]

def generate_pdf(cost):
    pdf = canvas.Canvas("billing_report.pdf")
    pdf.drawString(100, 750, f"AWS Billing Report: ${cost}")
    pdf.save()

generate_pdf(total_cost)
print("Billing report generated successfully.")

Explanation:

  • Uses AWS Cost Explorer API to get billing data.

  • Generates a PDF report with ReportLab.

Best Practices:

  • Automate report generation with AWS Lambda.

  • Use cost allocation tags for granular reporting (see the sketch below).
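
A minimal sketch of a more granular query: the same Cost Explorer call grouped by service (the GroupBy block also accepts TAG keys, e.g. a hypothetical "Project" tag, once cost allocation tags are activated):

import boto3

cost_client = boto3.client("ce")

# Same query as billing_report.py, but broken down by service;
# GroupBy also accepts {"Type": "TAG", "Key": "Project"} for cost allocation tags
response = cost_client.get_cost_and_usage(
    TimePeriod={"Start": "2024-03-01", "End": "2024-04-01"},  # End date is exclusive
    Granularity="MONTHLY",
    Metrics=["BlendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}]
)

for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["BlendedCost"]["Amount"])
    print(f"{service}: ${amount:.2f}")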

