Duplicate JWTs are a problem. When you enable the web spider on a JWT-enabled website, you are likely to get two JWT events from every URL (one from excavate and one from badsecrets). That means if you spider a single website with 1,000 URLs, you will get 2,000 JWTs that are effectively all the same.
I think the best way to solve this is to have a dedicated JWT event that intelligently dedupes itself by its contents, disregarding any one-time information like nonces/timestamps. This will ensure that equivalent JWTs won't be duplicated across the scan.
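A dedupe key along these lines could work (just a sketch, not BBOT's actual event code; the set of volatile claim names here is illustrative): decode the JWT's header and payload, drop one-time claims, and hash what's left, ignoring the signature entirely.

```python
import base64
import hashlib
import json

# One-time / per-issuance claims to ignore when comparing JWTs.
# This set is illustrative; the real list might need tuning.
VOLATILE_CLAIMS = {"iat", "exp", "nbf", "jti", "nonce"}


def _b64url_decode(segment: str) -> bytes:
    # JWT segments are base64url without padding; restore it before decoding.
    padding = "=" * (-len(segment) % 4)
    return base64.urlsafe_b64decode(segment + padding)


def jwt_dedupe_key(token: str) -> str:
    """Return a hash that is stable across "equivalent" JWTs.

    Two tokens that differ only in their signature or in volatile
    claims (timestamps, nonces) collapse to the same key.
    """
    header_b64, payload_b64, _signature = token.split(".")
    header = json.loads(_b64url_decode(header_b64))
    payload = json.loads(_b64url_decode(payload_b64))
    stable_payload = {k: v for k, v in payload.items() if k not in VOLATILE_CLAIMS}
    material = json.dumps([header, stable_payload], sort_keys=True)
    return hashlib.sha256(material.encode()).hexdigest()
```

With something like this as the event's dedupe hash, two tokens issued seconds apart for the same subject would produce the same key and only one event would survive scan-wide deduplication.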
Moving JWT parsing into the event validation seems pretty small compared to most of the other reworks that are happening. Unless I'm missing something.