I'm running Node.js v0.10.35 on Ubuntu 14.04.1 LTS.

Following your documentation, I call

scraperjs.DynamicScraper.startFactory()

before requesting multiple pages in parallel to scrape. Once all the scraping completes (tracked with a Promise), I call

scraperjs.DynamicScraper.closeFactory()

However, watching memory usage, after a few hundred rounds of these calls I see a bunch of phantomjs processes sitting there consuming memory; eventually the system runs out of memory (4 GB) and Node.js crashes.

I looked through the source code of the PhantomPoll class, and I don't see anywhere that it closes the "Page". Is a close() on each page needed to release its memory? Could this be the cause of the memory "leak" I'm seeing? Could you please spend a little of your time to help check? Much appreciated.
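For reference, the calling pattern looks roughly like this. This is only a sketch: the URLs and the scrape callback are placeholders, and the exact scraperjs chaining API may differ slightly from what's shown here.

```javascript
var scraperjs = require('scraperjs');

// Start a single PhantomJS factory shared by all DynamicScrapers.
scraperjs.DynamicScraper.startFactory();

// Hypothetical list of pages to scrape in parallel.
var urls = ['http://example.com/a', 'http://example.com/b'];

Promise.all(urls.map(function (url) {
    return scraperjs.DynamicScraper
        .create(url)
        .scrape(function ($) {
            // Runs inside the Phantom page context.
            return $('title').text();
        });
})).then(function (titles) {
    // Shut the factory down once every scrape has finished.
    scraperjs.DynamicScraper.closeFactory();
});
```

After a few hundred repetitions of this cycle is when the stray phantomjs processes pile up.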
I think the leak is not directly from the scraper/node process, but rather a mismanagement of the phantom pages. I had some issues with it in the past.
I'll take a look into it in the next couple of days. Thanks for your input.
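For comparison, in a raw PhantomJS script each WebPage is released explicitly with `page.close()` (this is PhantomJS's documented WebPage API, not scraperjs). If the pool never does the equivalent for the pages it hands out, they would accumulate inside the phantomjs process:

```javascript
// PhantomJS script (run with `phantomjs script.js`), not Node.js.
var page = require('webpage').create();

page.open('http://example.com/', function (status) {
    // ... scrape the page content here ...

    // Without this close(), the page and its memory live on
    // for as long as the phantomjs process does.
    page.close();
    phantom.exit();
});
```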