S3Scanner

Scan for misconfigured S3 buckets across S3-compatible APIs!

Author: sa7mon
Category: Bug-bounty
GitHub Stars: 2779
Project Added On: May 26, 2025
Contributors: 9

S3Scanner

Features - Usage - Quick Start - Installation - Discuss



A tool to find open S3 buckets in AWS or other cloud providers:

  • AWS
  • DigitalOcean
  • DreamHost
  • GCP
  • Linode
  • Scaleway
  • Custom

demo

Features

  • ⚡️ Multi-threaded scanning
  • 🔭 Supports many built-in S3 storage providers or custom
  • 🕵️‍♀️ Scans all bucket permissions to find misconfigurations
  • 💾 Save results to Postgres database
  • 🐇 Connect to RabbitMQ for automated scanning at scale
  • 🐳 Docker support

Used By

  • six2dez/reconftw
  • yogeshojha/rengine
  • pry0cc/axiom - "the dynamic infrastructure framework for everybody"

Usage

```
INPUT: (1 required)
  -bucket        string  Name of bucket to check.
  -bucket-file   string  File of bucket names to check.
  -mq                    Connect to RabbitMQ to get buckets. Requires config file key "mq". Default: "false"

OUTPUT:
  -db       Save results to a Postgres database. Requires config file key "db.uri". Default: "false"
  -json     Print logs to stdout in JSON format instead of human-readable. Default: "false"

OPTIONS:
  -enumerate           Enumerate bucket objects (can be time-consuming). Default: "false"
  -provider    string  Object storage provider: aws, custom, digitalocean, dreamhost, gcp, linode, scaleway - custom requires config file. Default: "aws"
  -threads     int     Number of threads to scan with. Default: "4"

DEBUG:
  -verbose     Enable verbose logging. Default: "false"
  -version     Print version. Default: "false"

If a config file is required, these locations will be searched for config.yml: "." "/etc/s3scanner/" "$HOME/.s3scanner/"
```

🚀 Support

If you’ve found this tool useful, please consider donating to support its development. You can find sponsor options on the side of this repo page or in FUNDING.yml

Huge thank you to tines for being an ongoing sponsor of this project.

Quick Start

Scan AWS for bucket names listed in a file, enumerating all objects:

```shell
$ s3scanner -bucket-file names.txt -enumerate
```

Scan a bucket in GCP, enumerate all objects, and save results to a database:

```shell
$ s3scanner -provider gcp -db -bucket my-bucket -enumerate
```

Installation

| Platform | Version | Steps |
| --- | --- | --- |
| BlackArch | BlackArch package | `pacman -S s3scanner` |
| Docker | Docker release | `docker run ghcr.io/sa7mon/s3scanner` |
| Go | Golang | `go install -v github.com/sa7mon/s3scanner@latest` |
| Kali Linux | Kali package | `apt install s3scanner` |
| MacOS | homebrew version | `brew install s3scanner` |
| Parrot OS | Parrot package | `apt install s3scanner` |
| Windows | winget | `winget install s3scanner` |
| NixOS stable | nixpkgs stable package | `nix-shell -p s3scanner` |
| NixOS unstable | nixpkgs unstable package | `nix-shell -p s3scanner` |
| Other - Build from source | GitHub release | `git clone git@github.com:sa7mon/S3Scanner.git && cd S3Scanner && go build -o s3scanner .` |

Using

Input

s3scanner requires exactly one type of input: -bucket, -bucket-file, or -mq.

```
INPUT: (1 required)
  -bucket        string  Name of bucket to check.
  -bucket-file   string  File of bucket names to check.
  -mq                    Connect to RabbitMQ to get buckets. Requires config file key "mq". Default: "false"
```

-bucket


Scan a single bucket:

```shell
$ s3scanner -bucket secret_uploads
```

-bucket-file


Scans every bucket name listed in a file:

```shell
$ s3scanner -bucket-file names.txt
```

where names.txt contains one bucket name per line:

```shell
$ cat names.txt
bucket123
assets
image-uploads
```

Bucket names listed multiple times will only be scanned once.
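A standalone sketch of that de-duplication behavior (an illustration, not s3scanner's actual code): repeated names collapse to a single scan target while input order is preserved.

```python
def unique_buckets(lines):
    """Return bucket names in first-seen order, skipping blanks and repeats."""
    seen = set()
    result = []
    for name in (line.strip() for line in lines):
        if name and name not in seen:
            seen.add(name)
            result.append(name)
    return result


# A names file containing "bucket123" twice yields a single scan target for it.
print(unique_buckets(["bucket123", "assets", "bucket123", "image-uploads"]))
```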

-mq


Connects to a RabbitMQ server and consumes messages containing bucket names to scan.

```shell
$ s3scanner -mq
```

Messages should be JSON-encoded Bucket objects - refer to mqingest for a Golang publishing example.

-mq requires the mq.uri and mq.queue_name config file keys. See Config File section for example.
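A minimal sketch of building such a message body. The README only says messages are JSON-encoded Bucket objects; the exact schema is defined in the s3scanner source, so the single `name` field below is an assumption, and the actual publishing step (via an AMQP client) is only described in comments.

```python
import json


def make_bucket_message(bucket_name: str) -> str:
    """Serialize a bucket name into a JSON message body for the scan queue.

    Hypothetical schema: the real Bucket object's fields live in the
    s3scanner source; "name" is an assumed field for illustration.
    """
    return json.dumps({"name": bucket_name})


# A real publisher would send this body to the queue named by mq.queue_name
# on the server at mq.uri, using an AMQP client library.
print(make_bucket_message("image-uploads"))
```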

Output

```
OUTPUT:
  -db       Save results to a Postgres database. Requires config file key "db.uri". Default: "false"
  -json     Print logs to stdout in JSON format instead of human-readable. Default: "false"
```

-db


Saves all scan results to a PostgreSQL database:

```shell
$ s3scanner -bucket images -db
```

  • Requires the db.uri config file key. See the Config File section for an example.
  • When using -db, results are still printed to the console, in either JSON or the default human-readable format.
  • s3scanner runs Gorm’s Auto Migration feature each time it connects to the database. If the schema already has tables with names Gorm expects, it may change those tables’ structure. It is recommended to create a Postgres schema dedicated to s3scanner results.
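For reference, a config.yml sketch covering the database and RabbitMQ settings. The key names (db.uri, mq.uri, mq.queue_name) come from the help text above; the hostnames, credentials, and queue name are placeholder values, not defaults.

```yaml
# Hypothetical config.yml - key names taken from the CLI help text;
# connection strings and queue name are placeholders.
db:
  uri: "postgresql://user:pass@localhost:5432/s3scanner"
mq:
  uri: "amqp://guest:guest@localhost:5672"
  queue_name: "buckets"
```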

-json


Instead of outputting scan results to console in human-readable format, output machine-readable JSON.

```shell
$ s3scanner -bucket images -json
```

This will print one JSON object per line to the console, which can then be piped to jq or other tools that accept JSON input.

Example: Print bucket name and region for all buckets that exist

```shell
$ s3scanner -bucket-file names.txt -json | jq -r '. | select(.bucket.exists==1) | [.bucket.name, .bucket.region] | join(" - ")'
10000 - eu-west-1
10000.pizza - ap-southeast-1
images_staging -
```


Tool Information

License: Open Source

Tags: aws, bugbounty, gcp, infosec, s3, s3scanner