Friday, December 30, 2016

AWS Inventory - Audit the Cloud Infrastructure

Update 20180103: I just created a new PR that adds support for IAM: it lists users and the policies assigned to them. To use this version you need to add a new policy to the role used to run the Lambda:
    "Version": "2012-10-17",
    "Statement": [
            "Effect": "Allow",
            "Action": "iam:ListUsers",
            "Resource": "*"
            "Effect": "Allow",
            "Action": "iam:ListUserPolicies",
            "Resource": "*"
            "Effect": "Allow",
            "Action": "iam:ListAttachedUserPolicies",
            "Resource": "*"
Update 20171217: I just created a new PR with a couple of fixes:
* Lambda functions must finish within a short time limit, so the number of snapshots to list should come from an external environment variable.
* Reserved IP addresses might not be in use, in which case the instance id should be shown as empty; otherwise we get an exception.
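Both fixes can be sketched in a few lines; the `MAX_SNAPSHOTS` variable name is my own example, not necessarily what the PR uses:

```python
import os

# Hypothetical env var: cap how many snapshots are listed so the
# Lambda finishes within its timeout (defaults to 100 if unset).
MAX_SNAPSHOTS = int(os.environ.get("MAX_SNAPSHOTS", "100"))

def address_row(addr):
    """Build a report row for one Elastic IP.

    An address that is not associated with an instance has no
    'InstanceId' key, so emit an empty string instead of letting
    a KeyError bubble up.
    """
    return [addr["PublicIp"], addr.get("InstanceId", "")]
```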
How difficult is it to audit your AWS Cloud Infrastructure?

Instances, volumes, snapshots, RDS, security groups, Elastic IPs and beyond. A single report with all the invaluable information that keeps you informed and able to make quick, critical decisions.

The guys from powerupcloud shared an initial script on their blog and published it on GitHub. I forked it and, after some tweaks, found it so useful that I decided to open a pull request against the author's repository. The new Lambda function:
* Adds support for environment variables
* Adds security groups listing
* Removes hardcoded and non-generic names
* Corrects some comments
* Retrieves the ownerId instead of hardcoding it
* Adds the description for volumes for clearer identification
* Lists the Elastic IPs with the instanceId they are assigned to for clearer identification
* Has a TODO ;-) : naming conventions should be established
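As an example of the new listings, the security groups report might be sketched like this (a rough outline under my own naming, not the fork's exact code):

```python
def sg_rows(security_groups):
    """Turn describe_security_groups() items into CSV-style rows."""
    rows = [["GroupId", "GroupName", "Description"]]
    for sg in security_groups:
        rows.append([sg["GroupId"], sg["GroupName"],
                     sg.get("Description", "")])
    return rows

def fetch_security_groups():
    """Fetch all security groups; needs EC2 read-only permissions."""
    import boto3
    ec2 = boto3.client("ec2")
    return ec2.describe_security_groups()["SecurityGroups"]
```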
Here is a quick start:
  1. Create IAM role | Name: Inventory; Role Type: AWS Lambda; Policies: AmazonEC2ReadOnlyAccess, AmazonS3FullAccess, AmazonRDSReadOnlyAccess
  2. Create S3 bucket | Name: YourInventoryS3Name
  3. Create Lambda | Name: YourInventoryLambda; Description: Extracts the AWS Inventory; Runtime: Python; Role: Inventory; Timeout: 3 min; environment variables: SES_SMTP_USER, SES_SMTP_PASSWORD, S3_INVENTORY_BUCKET, MAIL_FROM, MAIL_TO
  4. Schedule the Lambda: Select Lambda | Trigger | Add Trigger | CloudWatch Events - Schedule | Rule Name: InventorySchedulerRule; Rule Description: Inventory Scheduler Rule; Schedule Expression: cron(0 14 ? * FRI *) if you want it to run every Friday at 9AM ET | Enable Trigger | Submit
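Inside the Lambda, the environment variables from step 3 can be validated up front so a misconfiguration fails fast. This is only a skeleton under my own helper names; the real handler builds the inventory CSV, uploads it to S3 and mails it:

```python
def load_config(env):
    """Validate and extract the env vars the Lambda expects (step 3)."""
    required = ["SES_SMTP_USER", "SES_SMTP_PASSWORD",
                "S3_INVENTORY_BUCKET", "MAIL_FROM", "MAIL_TO"]
    missing = [k for k in required if k not in env]
    if missing:
        raise RuntimeError("missing env vars: " + ", ".join(missing))
    return {k: env[k] for k in required}

def lambda_handler(event, context):
    import os
    cfg = load_config(os.environ)
    # ... gather instances/volumes/snapshots/etc. into a CSV,
    # upload it to cfg["S3_INVENTORY_BUCKET"], then email the
    # report from cfg["MAIL_FROM"] to cfg["MAIL_TO"] via SES SMTP ...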
To be able to send emails from AWS you need to allow Amazon Simple Email Service (SES) to send email for addresses in your domain:
  1. Verify the domain: SES | Manage Entities | Verify a New Domain
  2. Follow the instructions. To complete the verification you must add several TXT/CNAME records to your domain's DNS settings
  3. Verify email addresses: SES | click the domain | Verify a New Email Address | Confirm from the email
  4. After getting confirmation go to SES | SMTP Settings | Create my SMTP Credentials | Provide a username like 'anyusername' and Amazon will show you the credentials (SMTP Username and SMTP Password). Keep them in a safe place. You can also download them. This is the last time AWS will share them with you.
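With those SMTP credentials, sending the report is plain `smtplib`. The endpoint host below assumes SES in us-east-1; adjust it to your region:

```python
import smtplib
from email.mime.text import MIMEText

def build_message(mail_from, mail_to, subject, body):
    """Assemble a plain-text email from the Lambda's MAIL_* settings."""
    msg = MIMEText(body)
    msg["Subject"] = subject
    msg["From"] = mail_from
    msg["To"] = mail_to
    return msg

def send_via_ses(msg, smtp_user, smtp_password,
                 host="email-smtp.us-east-1.amazonaws.com", port=587):
    """Send through the SES SMTP interface using STARTTLS."""
    with smtplib.SMTP(host, port) as server:
        server.starttls()
        server.login(smtp_user, smtp_password)
        server.send_message(msg)
```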
You can clone/fetch/checkout my latest version from my fork. Really useful stuff. Thanks, PowerUpCloud, for the quick start!


Prabhu Gopalakrishnan said...

Hi, the script works perfectly. I am looking to get the RDS DB snapshot list within the script, but I was not able to make it work... do you have a way to list the RDS DB snapshots in CSV format?


Nestor Urquiza said...

@Prabhu, I recommend you fork the project on GitHub and add support for RDS DB snapshots. The reason it is not there is that nobody using the script uses that resource.

Sabareesh said...

I tried executing the function and got:

"Calling the Invoke API failed with message: Network Error"

Nestor Urquiza said...

@Sabareesh, I recommend you post your issue in the GitHub project. A network error sounds like a local issue on your side, though.