Curl robots.txt not working #190

Open
45im opened this issue Feb 17, 2023 · 4 comments

@45im (Contributor) commented Feb 17, 2023

The curl robots.txt check always reports "There did not appear to be a robots.txt file in the webroot" despite a non-empty robots.txt being present in the webroot.
Is there something amiss here?

async def run(self, service):
	if service.protocol == 'tcp':
		process, stdout, _ = await service.execute('curl -sSikf {http_scheme}://{addressv6}:{port}/robots.txt', future_outfile='{protocol}_{port}_{http_scheme}_curl-robots.txt')

		lines = await stdout.readlines()

		if process.returncode == 0 and lines:
			filename = fformat('{scandir}/{protocol}_{port}_{http_scheme}_curl-robots.txt')
			with open(filename, mode='wt', encoding='utf8') as robots:
				robots.write('\n'.join(lines))
		else:
			service.info('{bblue}[' + fformat('{tag}') + ']{rst} There did not appear to be a robots.txt file in the webroot (/).')
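
One thing I wondered about (just a guess on my part; I haven't verified how service.execute works internally): if it hands the process back before curl has exited, process.returncode could still be None when the check runs, which would send every run down the else branch even when the file exists. A self-contained demo of that pitfall with plain asyncio:

import asyncio

async def main():
	# Assumption: service.execute wraps something like this under the hood.
	proc = await asyncio.create_subprocess_shell(
		'curl -sSikf http://127.0.0.1/robots.txt',
		stdout=asyncio.subprocess.PIPE,
		stderr=asyncio.subprocess.PIPE)
	# Until the process has been awaited, returncode is None, so a
	# `returncode == 0` test fails even if curl ultimately succeeds.
	print('before wait:', proc.returncode)  # None
	await proc.communicate()
	print('after wait:', proc.returncode)   # 0 if robots.txt was fetched

asyncio.run(main())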
@Tib3rius (Owner) commented

I'm unable to reproduce this; however, I could probably move this check over to use requests rather than curl, which might make it more reliable.
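
Roughly the shape I have in mind (a sketch only, not final plugin code; the timeout and TLS settings here are placeholders):

import requests

def fetch_robots(scheme, address, port, timeout=10):
	# Sketch of a requests-based robots.txt check (not the final plugin code).
	url = f'{scheme}://{address}:{port}/robots.txt'
	try:
		# verify=False because scan targets frequently use self-signed certs.
		response = requests.get(url, timeout=timeout, verify=False)
	except requests.RequestException:
		return None
	if response.status_code == 200 and response.text.strip():
		return response.text
	return None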

Could you run the following manually, though, and provide me with a screenshot showing the result, as well as the exit code (e.g. echo $?)?

curl -sSikf http://<target>/robots.txt

[Screenshot 2023-02-17 101149]

@45im (Contributor, Author) commented Feb 17, 2023

[screenshots attached]

Yes, manually it works 100%; the exit code is 0 every time.
I have rerun the same host a few times and it does work about 20% of the time.
The if process.returncode == 0 and lines: check is failing most of the time for some reason.

On localhost it works more reliably (about 80% success) but still sometimes misses it.

And when it does not work, even the tcp_80_http_curl.html file is empty, so the issue seems to be related to the use of curl somehow. Perhaps async/await is not playing nicely with service.execute?
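
If it helps, a standalone loop like this (a sketch; the URL is a placeholder) could show whether plain asyncio plus curl exhibits the same flakiness outside AutoRecon:

import asyncio

URL = 'http://127.0.0.1/robots.txt'  # placeholder target

async def check():
	proc = await asyncio.create_subprocess_shell(
		f'curl -sSikf {URL}',
		stdout=asyncio.subprocess.PIPE,
		stderr=asyncio.subprocess.PIPE)
	stdout, _ = await proc.communicate()  # waits for curl to exit
	return proc.returncode, len(stdout)

async def main():
	results = [await check() for _ in range(20)]
	failures = [r for r in results if r[0] != 0 or r[1] == 0]
	print(f'{len(failures)}/{len(results)} runs failed or returned no output')

asyncio.run(main())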

@45im (Contributor, Author) commented Apr 27, 2023

Any movement on this, Tib3rius?

@gstb989 commented Jan 8, 2024

I've run into the same thing on a fresh install from GitHub using pipx. It will run for a bit, identifying ports, then hang on the last line below.

└─$ sudo env "PATH=$PATH" autorecon 192.168.240.246
[*] Scanning target 192.168.240.246
[*] [192.168.240.246/all-tcp-ports] Discovered open port tcp/443 on 192.168.240.246
[*] [192.168.240.246/all-tcp-ports] Discovered open port tcp/80 on 192.168.240.246
[*] [192.168.240.246/all-tcp-ports] Discovered open port tcp/2222 on 192.168.240.246
[*] [192.168.240.246/tcp/80/http/vhost-enum] The target was not a hostname, nor was a hostname provided as an option. Skipping virtual host enumeration.
[*] [192.168.240.246/tcp/443/http/vhost-enum] The target was not a hostname, nor was a hostname provided as an option. Skipping virtual host enumeration.
[*] [192.168.240.246/tcp/80/http/known-security] [tcp/80/http/known-security] There did not appear to be a .well-known/security.txt file in the webroot (/).
[*] [192.168.240.246/tcp/80/http/curl-robots] [tcp/80/http/curl-robots] There did not appear to be a robots.txt file in the webroot (/).
[*] [192.168.240.246/tcp/443/http/known-security] [tcp/443/http/known-security] There did not appear to be a .well-known/security.txt file in the webroot (/).
[*] [192.168.240.246/tcp/443/http/curl-robots] [tcp/443/http/curl-robots] There did not appear to be a robots.txt file in the webroot (/).
[*] [192.168.240.246/tcp/443/https/vhost-enum] The target was not a hostname, nor was a hostname provided as an option. Skipping virtual host enumeration.
[*] [192.168.240.246/tcp/443/https/known-security] [tcp/443/https/known-security] There did not appear to be a .well-known/security.txt file in the webroot (/).
[*] [192.168.240.246/tcp/443/https/curl-robots] [tcp/443/https/curl-robots] There did not appear to be a robots.txt file in the webroot (/).

Breaking out of the command gives a TypeError: can only concatenate str (not "list") to str.
