
How to reduce the file size of the Scrapy log? #54

Closed
xunqirui opened this issue Jun 20, 2019 · 8 comments
Labels
question Further information is requested

Comments

@xunqirui

xunqirui commented Jun 20, 2019

I ran my spider with scrapydweb.
After a few days, I found that the spider's log file had grown so large that it took a long time to open.
Can you give me some advice?
Thank you : )

@my8100 my8100 added the question Further information is requested label Jun 22, 2019
@my8100
Owner

my8100 commented Jun 22, 2019

  1. Visit the corresponding Stats page in scrapydweb, then switch to the Log categorization tab
     to see how many log entries there are at each level.
  2. You may also need to set LOG_LEVEL for your spider to control the logging level (see the docs
     and the example below).

BTW, there's no need to open the spider's original log, since scrapydweb already provides the Stats page.
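For reference, a minimal sketch of item 2: raising LOG_LEVEL in the project's settings.py (the level INFO is just an example, not a value from this thread):

```python
# settings.py of the Scrapy project
# Raising the log level drops DEBUG messages (e.g. the per-request
# "Crawled"/"Scraped" lines), which usually make up the bulk of a large log.
LOG_LEVEL = "INFO"   # one of DEBUG, INFO, WARNING, ERROR, CRITICAL
```

The same setting can also be overridden for a single run, e.g. `scrapy crawl myspider -s LOG_LEVEL=INFO`.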

@my8100 my8100 closed this as completed Jun 22, 2019
@my8100 my8100 changed the title The log is so big How to reduce the file size of the Scrapy log? Jun 22, 2019
@Digenis

Digenis commented Jun 22, 2019

Also see log rotation via the built-in logging module:
scrapy/scrapy#3235

logrotate(8) is also an option (scrapy/scrapy#3628),
but make sure you test it thoroughly with Scrapy.
It corrupted the first few bytes of every new scrapyd log file for me
(that file is written by twisted.python.log, not the built-in logging module).
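A minimal sketch of the logging-module approach (not the exact code from scrapy/scrapy#3235; the file name and size limits are made up): attach a RotatingFileHandler so each log file is capped in size.

```python
import logging
from logging.handlers import RotatingFileHandler

# Keep each file at ~10 MB at most and retain the 5 most recent rotations.
handler = RotatingFileHandler(
    "spider.log",            # hypothetical path
    maxBytes=10 * 1024 * 1024,
    backupCount=5,
)
handler.setFormatter(
    logging.Formatter("%(asctime)s [%(name)s] %(levelname)s: %(message)s")
)

# Attach it to the root logger (e.g. from a custom extension or the
# spider's __init__) so Scrapy's log records propagate into it.
logging.getLogger().addHandler(handler)
```

As noted further down in this thread, rotating the log file that LogParser/scrapydweb reads can break their stats, so weigh that before adopting either approach.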

@xunqirui
Author

xunqirui commented Jun 22, 2019

@my8100
I changed my LOG_LEVEL, and the log has shrunk a lot.
Next I will try LogParser; I find it very useful.
Thank you!!

@xunqirui
Author

xunqirui commented Jun 22, 2019

@Digenis
When I found that the log file was so big, I also wanted to use log rotation, but I couldn't find a good way to do it.
Your suggestions also help me a lot.
Thank you!

@my8100
Owner

my8100 commented Jun 22, 2019

@xunqirui
But log rotation would break LogParser.

@xunqirui
Author

xunqirui commented Jun 22, 2019

@my8100
LogParser is very useful.
If log rotation would break LogParser, I won't use log rotation.
I have already changed the LOG_LEVEL and the log has shrunk a lot, so it meets my needs.
Thank you for reminding me!!!!

@my8100
Owner

my8100 commented Jun 22, 2019

@xunqirui
Don't forget to add a space after the punctuation mark.
Add a new line if needed.
: )

@xunqirui
Author

@my8100
Haha
I haven't chatted in English for a long time.
I will re-edit it.
: )
