There doesn't seem to be any clear documentation explaining how the -se flag functions, so I am assuming it stops after more than 10 errors.
I'm using a large wordlist for directory enumeration, which may produce numerous errors, potentially up to 500 or 1000. I prefer not to use the -sa and -se flags, as they restrict the scanning capacity. Even with some errors, in my experience FFUF continues to function properly.
The real issues begin once I hit a certain error threshold, around 1500 in my case. At that point, FFUF's throughput drops from 300 req/s to 2 req/s. I'm currently using the -maxtime flag to manage this, but it would be more efficient if there were a flag specifying a maximum error count at which FFUF would stop scanning.
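The limit being requested here could be a simple counter check, sketched below in Go. Note this is purely an illustration of the feature request; no such flag or function exists in ffuf today, and the name `maxErrors` is an assumption.

```go
package main

import "fmt"

// exceededErrorCap is a hypothetical check for the requested flag:
// stop the job once total errors reach a user-supplied cap,
// independent of thread count. This does not exist in ffuf; it only
// illustrates the behavior asked for above. A cap of 0 disables it.
func exceededErrorCap(totalErrors, maxErrors int) bool {
	return maxErrors > 0 && totalErrors >= maxErrors
}

func main() {
	// Stop at 1500 errors, before throughput collapses to 2 req/s.
	fmt.Println(exceededErrorCap(1500, 1500)) // true
	fmt.Println(exceededErrorCap(500, 1500))  // false
}
```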
-sf (Stop on 403 Forbidden): Halts the job when more than 95% of the HTTP responses are 403 Forbidden.
-se (Stop on Spurious Errors): Stops the job when the number of spurious errors (unclassified or unexpected errors) exceeds twice the number of threads used by the job.
-sa (Stop on All): Adds 429 responses to the combo, halting the job if more than 20% of the responses are 429.
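The three thresholds described above can be sketched as one stop check. This is an illustrative Go sketch based only on the percentages and counts stated in this thread, not ffuf's actual source; the struct and field names are assumptions.

```go
package main

import "fmt"

// jobStats holds hypothetical counters for a running fuzzing job.
type jobStats struct {
	threads        int // worker threads in use
	requests       int // total responses seen so far
	count403       int // 403 Forbidden responses
	count429       int // 429 Too Many Requests responses
	spuriousErrors int // unclassified/unexpected errors
}

// shouldStop applies the thresholds described above:
//   -sf: more than 95% of responses are 403
//   -se: spurious errors exceed 2x the thread count
//   -sa: both of the above, plus more than 20% of responses are 429
func shouldStop(s jobStats, stopOn403, stopOnErrors, stopOnAll bool) bool {
	if s.requests == 0 {
		return false
	}
	if (stopOn403 || stopOnAll) && float64(s.count403)/float64(s.requests) > 0.95 {
		return true
	}
	if (stopOnErrors || stopOnAll) && s.spuriousErrors > 2*s.threads {
		return true
	}
	if stopOnAll && float64(s.count429)/float64(s.requests) > 0.2 {
		return true
	}
	return false
}

func main() {
	// With 40 threads, -se triggers once spurious errors exceed 80.
	s := jobStats{threads: 40, requests: 1000, spuriousErrors: 81}
	fmt.Println(shouldStop(s, false, true, false)) // true
}
```

With the default of 40 threads, this is why -se feels like a fairly low cap: the job stops after just 81 spurious errors, far below the 500-1000 expected from a large wordlist.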
Usually, the errors you see on some servers are directly related to the number of requests per second you set (via -threads or via -rate), which makes the server fail. In those cases, the best approach is to find a stable (lower) request rate and fuzz with that.
Changing the % that ffuf uses to halt the job would make you believe it works better, but in reality the server is failing, and you need to avoid that to find proper hits.
Does that make sense to you? Something I'm missing?