Cloud Pentesting

Manual Methods for Identifying S3 Buckets

Using Browser URL Inspection:

One of the simplest ways to check whether a website is hosted on AWS is to append an invalid percent-encoded character such as %C0 to the site's URL in the browser address bar:

https://target-site.com/%C0

If you receive an XML error page, the website is likely hosted on Amazon AWS, since S3 returns its errors as XML. To verify further, use browser extensions like Wappalyzer to check for AWS-related technologies.

Checking the Source Code

Inspect the website's source code and search for "s3" to find any hidden S3 bucket URLs. If you find any, open them and check whether bucket listing is enabled.
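A minimal sketch of the same check from the command line, assuming target.com is your target; the regex matches both the virtual-hosted (bucket.s3.amazonaws.com) and path-style (s3.amazonaws.com/bucket) URL forms:

curl -s https://target.com/ | grep -Eio "[a-z0-9.-]+\.s3[a-z0-9.-]*\.amazonaws\.com|s3\.amazonaws\.com/[a-z0-9._-]+" | sort -u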


Google Dorking for AWS S3 Buckets

Google dorking helps uncover exposed S3 buckets. You can use a dork such as the following to find open S3 buckets:
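Commonly used examples ("target" here is a placeholder for the company or domain you are testing):

site:s3.amazonaws.com "target"
site:s3.amazonaws.com filetype:pdf "target"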

If bucket listing is enabled, you'll be able to view the entire directory and its files. If you see an "Access Denied" message, the bucket is private.

Automating Google Dorking with DorkEye

DorkEye automates Google dorking, making reconnaissance faster by quickly extracting multiple AWS URLs for analysis.

GitHub: https://github.com/BullsEye0/dorks-eye

Using S3Misconfig for Fast Bucket Enumeration

S3Misconfig scans a list of URLs for open S3 buckets with listing enabled and saves the results as a user-friendly HTML report for easy review.

GitHub: https://github.com/Atharv834/S3BucketMisconf

Finding S3 Buckets with HTTPX and Nuclei

You can pipe HTTPX output into Nuclei to quickly identify S3 buckets across all subdomains, saving significant recon time:
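A minimal sketch, assuming target.com is your target and Subfinder supplies the subdomain list; Nuclei ships S3-related detection templates, though the exact tag names can vary between template releases:

subfinder -d target.com -silent | httpx -silent -o live.txt
nuclei -l live.txt -tags s3 -o s3-findings.txt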

Extracting S3 URLs from JavaScript Files

Next, we'll use the Katana tool to collect JavaScript files from the target's subdomains and extract S3 URLs with a grep command:
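A minimal sketch, assuming live.txt contains the live subdomains from the previous step; Katana's -em flag limits output to JavaScript URLs, and the grep pattern only matches the common bucket.s3*.amazonaws.com hostname form:

katana -list live.txt -em js -o js-urls.txt
while read -r url; do
  curl -s "$url"
done < js-urls.txt | grep -Eio "[a-z0-9.-]+\.s3[a-z0-9.-]*\.amazonaws\.com" | sort -u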

GitHub: https://github.com/projectdiscovery/katana

Using java2s3 to Find S3 URLs in JS Files

Alternatively, you can use this approach to extract all S3 URLs from the JavaScript files of subdomains. First, combine Subfinder and HTTPX to generate the final list of subdomains, then run the java2s3 tool for extraction.

After this, you can feed all of the discovered S3 URLs to the S3Misconfig tool to identify publicly accessible buckets with listing enabled, as sketched below.
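A minimal sketch of the full pipeline; the Subfinder and HTTPX steps are standard, but the java2s3 and S3Misconfig invocations are assumptions (script names and flags may differ), so check each repo's README:

subfinder -d target.com -silent | httpx -silent -o alive.txt
# Assumed invocations -- verify usage against each tool's README:
python3 java2s3.py alive.txt > s3-urls.txt
python3 s3misconfig.py -f s3-urls.txt -o results.html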

GitHub: https://github.com/mexploit30/java2s3

Brute-Forcing S3 Buckets with LazyS3

You can also use LazyS3, a brute-force tool that guesses AWS S3 bucket names using common permutations. Run the following command, specifying the target domain:
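LazyS3 is a Ruby script; a typical run looks like this, with "target" as the company or domain name:

ruby lazys3.rb target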

GitHub: https://github.com/nahamsec/lazys3

Using CeWL + S3Scanner to Find Open Buckets

Next, use the CeWL tool to generate a custom wordlist from the target domain. Then run S3Scanner with that list to identify which candidate buckets exist. Finally, use grep to filter valid buckets based on permissions and size, and inspect their contents with the AWS CLI:
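A minimal sketch ("target.com" and "target-bucket" are placeholders; the S3Scanner flags follow the Go rewrite of the tool and the grep pattern assumes its default output, so adjust both to your installed version):

cewl -d 2 -w words.txt https://target.com/
s3scanner -bucket-file words.txt -enumerate > results.txt
# Keep only candidate buckets that actually exist:
grep "exists" results.txt
# Inspect a hit anonymously with the AWS CLI:
aws s3 ls s3://target-bucket --no-sign-request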

GitHub: https://github.com/sa7mon/S3Scanner

Extracting S3 Buckets from GitHub Repositories

Use GitHub dorks to find "amazonaws" results in public repositories. Check the S3 URLs for bucket listings and verify access with the AWS CLI. If you discover sensitive data, report it through the relevant bug bounty program.
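For example, search GitHub with dorks such as these ("target" is a placeholder for the company or its GitHub org):

"target" "s3.amazonaws.com"
org:target "amazonaws"

Then verify any bucket you find anonymously:

aws s3 ls s3://discovered-bucket --no-sign-request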

Websites for Public S3 Bucket Discovery

Use these websites to search public AWS buckets for files by keyword. Download and inspect the contents, and if you find any sensitive files, report them responsibly:

GrayhatWarfare:

https://buckets.grayhatwarfare.com/

osint.sh:

https://osint.sh/buckets/

Finding Hidden S3 URLs with Extensions

The S3BucketList Chrome extension scans web pages for exposed S3 URLs, helping researchers quickly identify misconfigured buckets without manually inspecting the source code.

S3BucketList (Chrome Web Store): searches, lists, and checks S3 buckets found in network requests

AWS S3 Bucket Listing & File Management

Easily manage AWS S3 buckets with these AWS CLI commands. They help security researchers, penetration testers, and cloud administrators list, copy, delete, and download files for efficient storage management and security assessments.

Reading Files:
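For example ("target-bucket" and the file names are placeholders; --no-sign-request sends unauthenticated requests, which only succeed against publicly accessible buckets):

aws s3 ls s3://target-bucket --no-sign-request
aws s3 cp s3://target-bucket/file.txt - --no-sign-request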

Copying Files:
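For example, to copy a file out of the bucket, or (where write access exists) upload a harmless proof-of-concept file:

aws s3 cp s3://target-bucket/file.txt . --no-sign-request
aws s3 cp poc.txt s3://target-bucket/ --no-sign-request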

Deleting Files:
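For example (only ever delete test files you uploaded yourself):

aws s3 rm s3://target-bucket/poc.txt --no-sign-request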

Downloading All Files:
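For example, to mirror the entire bucket into a local directory:

aws s3 sync s3://target-bucket ./target-bucket-files --no-sign-request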

Buckets that grant "Full Control" permission allow file uploads and deletions, which can lead to serious security risks. Always follow responsible disclosure policies when reporting vulnerabilities.
