
OneListForAll

Rockyou for web fuzzing

This is a project to generate huge wordlists for web fuzzing. If you just want to fuzz with a good wordlist, use the file onelistforallmicro.txt.

Due to GitHub's file size limitations I had to split every file bigger than 50M into multiple files following the naming scheme technology[1-99]_long.txt. To recreate an original file, just concatenate its parts; for example, for the apache long dict: cat dict/apache* > dict/apache_long.txt
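
If you want to rebuild every split dict at once, here is a minimal shell sketch (the dict/ layout and the [1-99] part numbering come from the description above; the loop itself is only an illustration, not a script shipped with the repo, and it assumes technology names do not end with a digit):

for base in $(ls dict/*[0-9]_long.txt | sed 's/[0-9]*_long\.txt$//' | sort -u); do
  # e.g. base=dict/apache -> cat dict/apache1_long.txt ... > dict/apache_long.txt
  # glob order is lexical (1,10,11,...,2), which is fine since the lists get deduplicated later
  cat "${base}"[0-9]*_long.txt > "${base}_long.txt"
done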

The wordlists mentioned at the bottom of this page are merged by technology/type and differentiated by the _short and _long suffixes, so you can search for any technology or software and fuzz the target site with either the small list or the long one. This project also provides three all-in-one wordlists:

  • onelistforall.txt (everything merged, both _short.txt and _long.txt files, cleaned and deduplicated, shipped as a multi-volume 7z archive)
  • onelistforallshort.txt (merged only _short.txt files, cleaned and deduplicated)
  • onelistforallmicro.txt (my favorite, manually crafted and constantly updated, with interesting files and low-hanging fruit findings)
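
Because onelistforall.txt ships as a multi-volume 7z archive, it has to be extracted before use. A minimal sketch, assuming the parts are named onelistforall.7z.001, .002, and so on (check the actual asset names in the release):

7z x onelistforall.7z.001   # 7-Zip picks up the remaining .00N volumes automatically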

Usage

Method 1

  1. Go to the releases page and download the latest release

  2. Fuzz with the best tool ffuf :)

ffuf -c -w onelistforall.txt -u [target.com]/FUZZ
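
The same scan with a few extra ffuf options you may find handy; -ac (auto-calibration), -t (threads) and -o/-of (save results) are standard ffuf flags, and the values here are just an example:

ffuf -c -w onelistforall.txt -u [target.com]/FUZZ -ac -t 40 -o results.json -of json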

Method 2

Build your own wordlists!

Requirement: install duplicut by yourself :)

  1. Add your wordlists to the dict/ folder with the suffix _short.txt for the short wordlist and _long.txt for the full wordlist.

  2. Run ./olfa.sh (olfa -> One List For All) and you will get the onelistforall.txt and onelistforallshort.txt files.

  3. Fuzz with the best tool ffuf :)

ffuf -c -w onelistforall.txt -u [target.com]/FUZZ
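
For reference, the core of the build can be reproduced by hand: concatenate the lists, then deduplicate them. A minimal sketch assuming duplicut's usual <infile> -o <outfile> interface (olfa.sh may do additional cleanup, so treat this as an illustration only, with hypothetical intermediate filenames):

cat dict/*_short.txt dict/*_long.txt > onelistforall_raw.txt
duplicut onelistforall_raw.txt -o onelistforall.txt
cat dict/*_short.txt > onelistforallshort_raw.txt
duplicut onelistforallshort_raw.txt -o onelistforallshort.txt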

Wordlists summary

  • onelistforallmicro.txt manually crafted wordlist for low-hanging fruits: 18109 lines, 298K
  • onelistforallshort.txt a shortened version that still covers a lot, at a more manageable size: 892361 lines, 15M
  • onelistforall.txt basically everything; launch it and go to sleep: 59076819 lines, 1.2G

Sources

This is a wordlist project for fuzzing purposes, made from the best wordlists currently available, merged and then deduplicated with duplicut, plus additional cleanup with the cleaner from BonJarber. The lists used have been selected from these repositories:

Feel free to contribute, PRs are welcome.