This dork pulls results that match any of these keywords, so you don't need to search for each keyword separately; you get all relevant results at once.
Filtering by Path, Language and File Type
During reconnaissance, filtering by path, language and file type helps narrow down valuable targets. Below are some common filters to use:
filename: Search by specific file names (e.g. filename:.env)
extension: Filter by file type (e.g. extension:json)
path: Search within specific directories (e.g. path:/config)
org: Limit results to an organization (e.g. org:my-company)
repo: Focus on a specific repository (e.g. repo:my-project)
1. filename: — Search by Specific File Name
filename:.env "DB_PASSWORD"
Finds all .env files containing the keyword DB_PASSWORD. .env files often include credentials, secrets and API keys.
2. Extension: — Filter by File Type
extension:json "access_token"
Searches all .json files across GitHub that contain the string access_token. Great for finding exposed tokens in config files.
3. path: — Search Within Specific Directories
Use the path: filter to find files located in specific folders or subdirectories. This is useful for locating sensitive files commonly stored in predictable paths.
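For example, a dork along these lines (an illustrative query — the directory and keyword are assumptions, not from a real finding) surfaces YAML files stored under config directories:

path:/config extension:yml "password"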
4. repo: — Focus on a Specific Repository
Limits search to the vercel/next.js repo and looks for config.js files. Great when you're auditing a specific open-source project.
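Based on the description above, the dork for this example would be:

repo:vercel/next.js filename:config.js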
Bonus: Combine Filters for Maximum Precision
Find files that contain both the "password" and "domain" keywords anywhere in their content, within files of a specific type such as .php, .jsp or .asp.
This dork searches PHP files that contain both the word "password" and "domain" anywhere in the content. It's useful when you're looking for potential credentials or sensitive data related to your target's domain.
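The dork described above would look something like the following (extension:php is shown; swap in jsp or asp as needed):

extension:php "password" "domain"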
Note: Many of these credentials are committed by random developers. It's crucial to confirm if they belong to your target's assets before reporting.
Keyword Variations
Don't just search for "password." Try variations like:
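Some common variations worth trying (an illustrative list, not exhaustive):
passwd
pwd
pass
secret
credentials
api_key
apikey
token
db_password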
To verify whether exposed API keys are still valid, use the Keyhacks repository. It includes commands and testing methods for more than 50 types of API keys.
Manual recon is great, but for hunting at scale use the GitGraber tool. Install GitGraber and run it against your target:
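A minimal install-and-run sketch (the repository URL and target domain below are assumptions for illustration — substitute your own target):

# Clone and install GitGraber
git clone https://github.com/hisxo/gitGraber.git
cd gitGraber
pip3 install -r requirements.txt
# Run a keyword search against the target domain
python3 gitGraber.py -k wordlists/keywords.txt -q "target.com" -s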
It'll scrape and sort all keyword matches in seconds. Even better, you get direct URLs, timestamps and a raw JSON preview.
Using TruffleHog for Deep Secret Scanning
TruffleHog is another powerful tool for hunting secrets in code repositories. Here's how to use it:
These trufflehog commands help detect exposed secrets (like API keys, credentials, tokens) in Git repositories. You can scan local repos, GitHub repositories or entire organizations. Additional flags allow filtering results, parsing JSON and scanning comments in issues and PRs for deeper coverage.
Mass Hunting .git Directory Exposure
.git directories on public websites are another goldmine. Why? Because they store the entire source code history, including deleted but restorable files.
Using Nuclei private template
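If you don't have a private template handy, the public nuclei-templates collection ships a comparable git-config exposure check (the template path below is assumed from the public repo and may differ across versions):

nuclei -l subdomains.txt -t http/exposures/configs/git-config.yaml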
Find Exposed .git Repositories with httpx-toolkit
This httpx-toolkit command scans a list of domains for exposed .git/ directories. It:
Probes the /.git/ path
Filters responses with status code 200
Matches content with "Index of"
Shows response status code, server header, content length and redirect location
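A sketch of that command, assuming a domains.txt target list and httpx-toolkit's standard flag names:

httpx-toolkit -l domains.txt -path "/.git/" -mc 200 -ms "Index of" -sc -server -cl -location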
Instantly Detect Git Leaks with This Extension
Install the .git browser extension. It automatically alerts you if a site exposes its Git repository, helping you quickly spot misconfigurations and potential attack surfaces during recon.
💡 Tip: Even if a site returns a 403 Forbidden for /.git/, don't give up — some Git files might still be accessible. Use tools like GitDumper to attempt extraction and reconstruction of the repository.
Dumping Git Repositories
Next Step: Once you've identified a valid .git/ folder using the methods above, it's time to dump the repository contents. Use tools like GitTools, git-dumper or git-extractor to recover exposed files and inspect the source code.
After dumping the .git folder the next step is to rebuild the full file structure. This helps uncover deleted files, sensitive data and historical changes that may still exist in the Git history.
Run the following commands to restore and inspect the repository:
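A minimal sequence, assuming the dump landed in a local folder (the folder name, commit hash and file path are placeholders):

cd dumped-site
git checkout -- .                 # rebuild the working tree from the dumped objects
git log --diff-filter=D --summary # list files deleted from the tree but recoverable from history
git log --oneline --all           # review the full commit history
git show <commit>:<path/to/file>  # inspect any historical version of a file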
GitGraber examples (referenced in the mass-hunting section above):
# Search for sensitive data related to the entire organization
python3 gitGraber.py -k wordlists/keywords.txt -q nasa.gov -s
# Search for sensitive data matching the exact domain string (escaped quotes are passed through to GitHub for an exact match)
python3 gitGraber.py -k wordlists/keywords.txt -q \"nasa.gov\" -s
TruffleHog examples (referenced in the secret-scanning section above):
# Scan a local Git repository
trufflehog git file:///home/user/my-repo
# Scan a public GitHub repository
trufflehog git https://github.com/username/repo.git
# Scan with filtering results to show only verified and unknown findings
trufflehog git https://github.com/trufflesecurity/test_keys --results=verified,unknown
# Scan and format output as JSON using jq for readability
trufflehog git https://github.com/trufflesecurity/test_keys --results=verified,unknown --json | jq
# Scan a GitHub repository and include issue and PR comments in the scan
trufflehog github --repo=https://github.com/trufflesecurity/test_keys --issue-comments --pr-comments
# Scan all repositories in a GitHub organization using a personal access token
trufflehog github --org=nasa --token=yourgithubtoken
# Scan a specific GitHub repository (basic usage)
trufflehog github --repo=https://github.com/username/repo