Monday, July 24, 2017

S3 buckets audit: check a bucket's public access level, etc. - updated with authorised audit support

In my previous post, S3 buckets audit: check bucket existence, public access level, etc - without having access to target AWS account, I described and released a tool to audit S3 buckets even without access to the AWS account these buckets belong to.

But what if I do have access to the bucket's account, or I would like to audit all buckets in my AWS account?

These features have been addressed in the new release of the s3 audit tool:

$python --profile prod-read --bucket bucket2test

$python --profile prod-read --file aws

$python --profile prod-read --file buckets.list

                        Please specify AWS CLI profile
  -B BUCKET, --bucket=BUCKET
                        Please provide bucket name
  -F FILE, --file=FILE  Optional: file with buckets list to check or aws to check all buckets in your account

--profile=AWS_PROFILE - your AWS access profile (from the AWS CLI). This profile might or might not have access to the audited bucket (we need it just to become an Authenticated User from the AWS point of view).

If AWS_PROFILE allows authorised access to the bucket being audited, the tool will fetch the bucket's ACLs, policies and S3 static website settings and perform an authorised audit.

If AWS_PROFILE does not allow authorised access, the tool will work in pentester mode.

You can specify:
  •  one bucket to check, using the --bucket option
  •  a file with a list of buckets (one bucket name per line), using the --file option
  •  all buckets in your AWS account (accessible via AWS_PROFILE), using the --file=aws option

Based on your AWS profile's permissions, the tool will provide you with:
  • indirect scan results (AWS_PROFILE has no API access to the bucket being audited)
  • validated scan results based on your S3 bucket settings, such as the ACL, bucket policy and S3 website config (AWS_PROFILE has API access to the bucket being audited)
Enjoy and stay secure.

PS. Currently the tool does not support bucket checks for the Frankfurt region (AWS Signature Version 4). Working on it.

Wednesday, July 19, 2017

S3 buckets audit: check bucket existence, public access level, etc - without having access to target AWS account

      Currently, publicly accessible buckets have become a big deal and the root cause of many recent data leaks.
All of these events have even driven Amazon AWS to proactively send out emails to the customers who have such S3 configurations. Let's become a bit more proactive as well and audit our S3 buckets.

        First, let's take a look at why a bucket might become publicly available:
- Configured for public access intentionally (S3 static web hosting or just a public resource) or by mistake
- Configured for access by Authenticated Users (an option misinterpreted by many as meaning users from your own account, which is wrong: it means any AWS-authenticated user from any account)
         Auditing an AWS account you have full access to is quite easy: just list the buckets and check their ACLs, users and bucket policies via the AWS CLI or the web GUI.

         What about the cases when you:
- have many accounts and buckets (it would take forever to audit them manually)
- do not have enough permissions in the target AWS account to check bucket access
- have no permissions at all in this account (pentester mode)

To address all of the above, I've created a small tool to do the dirty job for you (updated to v2):

$python --profile prod-read --bucket test.bcuket

                        Please specify AWS CLI profile
  -B BUCKET, --bucket=BUCKET
                        Please provide bucket name
  -F FILE, --file=FILE  Optional: file with buckets list to check

Note: --profile=AWS_PROFILE - any of your AWS access profiles (from the AWS CLI). This profile does NOT have to have access to the audited bucket (we need it just to become an Authenticated User from the AWS point of view).

You can specify one bucket to check, using the --bucket option, or a file with a list of buckets (one bucket name per line), using the --file option.

Based on the bucket's access status, the tool will give you one of the following responses:

Bucket: test.bucktet - The specified bucket does not exist
Bucket: test.bucktet - Bucket exists, but Access Denied
Bucket: test.bucktet - Found index.html, most probably S3 static web hosting is enabled
Bucket: test.bucktet - Bucket exists, publicly available and no S3 static web hosting, most probably misconfigured!
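The decision logic behind these four responses can be sketched with plain HTTP probes against the bucket's REST endpoint. This is a simplification: the real tool makes its requests as an Authenticated User via your profile, while the sketch below is unauthenticated, and `check_bucket`/`classify` are hypothetical helpers:

```python
import urllib.error
import urllib.request
from typing import Optional


def _status(url: str) -> int:
    """HTTP status of a GET request, treating error responses as data."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code


def classify(bucket_status: int, index_status: Optional[int]) -> str:
    """Map the status codes of the two probes to the tool's messages."""
    if bucket_status == 404:
        return "The specified bucket does not exist"
    if bucket_status == 403:
        return "Bucket exists, but Access Denied"
    if index_status == 200:
        return "Found index.html, most probably S3 static web hosting is enabled"
    return ("Bucket exists, publicly available and no S3 static web hosting, "
            "most probably misconfigured!")


def check_bucket(bucket: str) -> str:
    """Probe the bucket root, then index.html if the root is readable."""
    base = f"http://{bucket}.s3.amazonaws.com"
    bucket_status = _status(base)
    index_status = _status(f"{base}/index.html") if bucket_status == 200 else None
    return classify(bucket_status, index_status)
```

A readable bucket root with no index.html is the telltale "listable but not a website" case that most often means a misconfiguration.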


PS. Moreover, you can create a list of buckets to test in a file (even using some DNS/name alterations and permutations) and loop through it, checking each one.

Stay secure.

Thursday, March 30, 2017

Trailing dot in DNS name, incorrect S3 website endpoint work and possible back-end information leak

I discovered that the AWS S3 website endpoint incorrectly interprets a trailing dot (which is actually an essential part of an FQDN according to RFC 1034) in the website FQDN.
Instead of referring to the correct bucket endpoint, it returns a "No such bucket" error, revealing information about the website's back end.
I initially considered this not a security issue but rather a misconfiguration, or even expected undocumented behaviour, but then I found one case that could lead to others:

If a website uses a 3rd-party DDoS and WAF protection service like CloudFlare, this technique (adding a trailing dot) could reveal and expose the website's origin.
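A quick way to test a site for this leak is to request the FQDN with a trailing dot appended and look for an S3 error page in the response. A stdlib sketch; `probe_trailing_dot` and the marker strings are my own illustration, not a published tool:

```python
import urllib.error
import urllib.request

# Strings that appear on S3's own error pages
S3_MARKERS = ("NoSuchBucket", "The specified bucket does not exist")


def looks_like_s3_backend(body: str) -> bool:
    """An S3 error page in the response suggests an S3 origin behind the WAF."""
    return any(marker in body for marker in S3_MARKERS)


def probe_trailing_dot(hostname: str) -> bool:
    """Fetch the site with a trailing dot in the FQDN and report whether
    the response leaks an S3 error page instead of the protected site."""
    url = f"http://{hostname}./"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return looks_like_s3_backend(resp.read().decode("utf-8", "replace"))
    except urllib.error.HTTPError as err:
        return looks_like_s3_backend(err.read().decode("utf-8", "replace"))
```

If `probe_trailing_dot` returns True for a CloudFlare-fronted site, the origin is most probably an S3 bucket, and its name is exposed in the error page.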

Example of the possible information disclosure (screenshots):

DNS name resolution pointing to CloudFlare:

Trailing-dot error pointing to the S3 bucket back end, with the rest of the information pointing to CloudFlare:

PS. One possible use of the S3 back-end information leak could be S3 bucket name squatting, to block possible sub-domain usage due to the uniqueness of S3 bucket names.

Wednesday, February 1, 2017

MediaWiki as a static website and content sharing

Using a wiki for knowledge management, in a team or individually, is easy and often an obvious choice.
       Challenges appear when you need to share information stored in the wiki:
hardening the MediaWiki installation for public access, and sharing wiki content only partially.

If your main goal is just to publish content, you can extract the wiki pages as static HTML pages using a relatively simple wget one-liner. After extracting, you can publish your wiki using AWS S3 static web hosting.

To share only part of the information available in the wiki, you can leverage Categories and restrict user access to specified categories using a special extension. Afterwards, you can use this restricted user access to grab the wiki content.
Another simple way is to use the Category special wiki page as a starting point for a crawler to grab the pages related to a specific category, let's say the Public category.
The code is way shorter than the description above:

# get the wiki content
wget --recursive --level=1 --page-requisites --html-extension --no-directories --convert-links --no-parent -R "*Special*" -R "*action=*" -R "*printable=*" -R "*oldid=*" -R "*title=Talk:*" -R "*limit=*" "http://mywikiprivate:80/wiki/index.php/Category:Public"
# replace sensitive internal links with a link to the stub page
sed -i -E 's/http:\/\/mywikiprivate[^"]*/http:\/\/\/404.html/g' *.html
# remove the sensitive file
rm Category\:Public.1.html
# rename the Public category page to serve as the list of published pages
mv Category:Public.html Public.html
# sync the content to the AWS S3 bucket
aws s3 sync ./ s3://your_bucket/

The result of running this script, along with some public notes from my wiki, can be found here:

Disclaimer: the current wiki publication contains only a small part of the information available and will be updated on an almost daily basis to add more content cleared for publishing. The main purpose of this wiki is to keep technical notes and references in a structured way. Some of them are obvious, outdated or incomplete.

The goal of establishing a public publishing process is to keep the wiki information up-to-date and to be able to publish small useful notes which do not fit the blog format and style.