So, basically, I run my dockerized crawlers on a Kubernetes cluster with the help of ArgoCD. The rest of my 60 crawlers work just fine, but for these two new crawlers I decided to try MechanicalSoup instead of Selenium WebDriver.
The issue I have is that those two crawlers are not starting at all. No errors, nothing; they just hang and never start. When I pull the Docker image and run it with Docker Compose, the crawler works perfectly fine. What could be the issue with it being run in a Pod on a Kubernetes cluster?
Thanks
Hi, and thanks for your interest in MechanicalSoup! Unfortunately, I have no experience with Kubernetes, and I can't even think of where to start to debug an issue that has no errors or output. The most important thing to do would be to figure out where it is hanging. Can you perhaps run the command with strace?
If it's not starting at all, I'd suggest adding a trivial debug statement (`print("starting")` or so) at the very beginning of the main function of your code to check whether you even enter it. Also try running a trivial program that imports the mechanicalsoup package (just the import and a print statement), to see if the problem is the import itself.
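A minimal sketch of such a check might look like this (the `flush=True` is there so the first line isn't lost in a buffered log if the process hangs later; the `__version__` attribute is what MechanicalSoup exposes for its release number):

```python
# minimal_check.py -- isolate whether the hang happens before main() or at import time
print("starting", flush=True)  # flush so this reaches the pod logs even if we hang next

try:
    import mechanicalsoup
    status = "import ok: " + mechanicalsoup.__version__
except Exception as exc:  # report rather than die silently, mirroring the "no errors" symptom
    status = f"import failed: {exc}"

print(status, flush=True)
```

If "starting" never appears in `kubectl logs`, the problem is before your code runs at all (interpreter, entrypoint, image); if it appears but the import line doesn't, the import is where it hangs.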
Check the shebang of your program (#!/usr/bin/env python3?): is it pointing to a version of Python available on your system?
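A quick way to check this inside the container is to compare the shebang line against what is actually on the PATH. This is just a sketch; `crawler.py` is a stand-in name created here for illustration:

```shell
# Create a stand-in script with a typical shebang (replace with your real entry script)
printf '#!/usr/bin/env python3\nprint("ok")\n' > crawler.py

# Show the shebang line of the script
head -n 1 crawler.py

# Verify the interpreter it names exists; prints a path, or a warning if missing
command -v python3 || echo "python3 missing from PATH"
```

Inside the pod you could run the same two checks via `kubectl exec` against your real script.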