
logging: refactor to use logging with client entrypoint #31

Closed
yarikoptic opened this issue May 29, 2020 · 5 comments · Fixed by #35

Comments

@yarikoptic (Contributor)

It is probably possible to configure logging via the ~/.qme/config.ini file, but I find it super useful to be able to configure logging for a specific invocation. For that, the majority of the tools I touch have a -l|--loglevel option at the top level of the interface, which is typically also configurable via an environment variable. E.g.:

$> datalad --help | grep -2 log

*Global options*
  -l LEVEL, --log-level LEVEL
                        set logging verbosity level. Choose among critical,
                        error, warning, info, debug. Also you can specify an
                        integer <10 to provide even more debugging information

The command line option overrides the environment variable, and that one overrides a possible config file setting. This way I could easily start/restart the dashboard with a different logging level while running qme run with the default or another one.
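A minimal sketch of that precedence (CLI option > environment variable > config file > built-in default); the QME_LOG_LEVEL variable name and the [logging] config section here are hypothetical illustrations, not qme's actual settings:

```python
import argparse
import configparser
import logging
import os

LEVELS = ["critical", "error", "warning", "info", "debug"]

def resolve_log_level(argv=None, config_path="~/.qme/config.ini"):
    """Resolve the log level with precedence: CLI > env var > config > default."""
    # 1. built-in default
    level = "info"
    # 2. config file (section/key names are hypothetical)
    cfg = configparser.ConfigParser()
    cfg.read(os.path.expanduser(config_path))
    level = cfg.get("logging", "level", fallback=level)
    # 3. environment variable (name is hypothetical)
    level = os.environ.get("QME_LOG_LEVEL", level)
    # 4. command line option wins over everything
    parser = argparse.ArgumentParser(add_help=False)
    parser.add_argument("-l", "--log-level", choices=LEVELS)
    args, _ = parser.parse_known_args(argv)
    if args.log_level:
        level = args.log_level
    return getattr(logging, level.upper())
```

Resolving once at entrypoint and passing the result to `logging.basicConfig(level=...)` keeps the rest of the code free of configuration logic.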

ATM qme show floods the terminal with some logs; those should be demoted to the DEBUG level, but I could make them visible by quickly restarting with -l debug if interested in seeing them.

@vsoch (Owner) commented May 29, 2020

Related #17 what do you think about logging on the level of different executors, also to file?

@yarikoptic (Contributor, Author)

"to file" -- makes total sense by default for the dashboard! (Eventually we might even want to handle the HUP signal to close/reopen that logging backend so logrotate could do its job ;))
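A minimal sketch of that HUP idea with Python's stdlib logging; the logger name and log path are made up for illustration. (The stdlib also ships logging.handlers.WatchedFileHandler, which reopens the file automatically when logrotate moves it, making an explicit signal handler unnecessary.)

```python
import logging
import signal

LOG_FILE = "/tmp/qme-dashboard.log"  # hypothetical path for the dashboard log

logger = logging.getLogger("qme.dashboard")
logger.setLevel(logging.INFO)
handler = logging.FileHandler(LOG_FILE)
handler.setFormatter(logging.Formatter("%(asctime)s [%(levelname)s] %(message)s"))
logger.addHandler(handler)

def reopen_log(signum, frame):
    # Closing the handler sets its stream to None; the stdlib FileHandler
    # lazily reopens the file on the next emit, so after logrotate renames
    # the old file, new records land in a freshly created one.
    handler.close()

if hasattr(signal, "SIGHUP"):  # SIGHUP does not exist on Windows
    signal.signal(signal.SIGHUP, reopen_log)
```

With this in place, a logrotate `postrotate` script only needs to send `kill -HUP` to the dashboard process.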

But for cmdline tools, even though we support such functionality everywhere, I barely ever use it -- I guess because I do want to see logging lines interleaved with the actual output.

FWIW, I do "love logging" -- or rather, I rely on logging a lot for a variety of use cases, largely troubleshooting, debugging, and even optimization. That is why in DataLad we have quite a number of features bolted onto the logging (here is the implementation, which you might be interested to poke at or just adopt entirely). All of those are controlled via our config (or, where forgotten, purely through env variables); see the _LOG ones in https://github.com/datalad/datalad/blob/master/CONTRIBUTING.md#useful-environment-variables

I do often add timestamps, the traceback, and the PID into the log lines. There is also a little dtime (difference in time) helper which I can feed log output to, avoiding doing algebra on each line to estimate how long it took to get to the next one:
$> DATALAD_LOG_TIMESTAMP=1 DATALAD_LOG_LEVEL=1 datalad install --help 2>&1 | dtime | head
    9 2020-05-29 15:29:29,901 [Level 5] Instantiating ssh manager 
    1 2020-05-29 15:29:29,910 [Level 5] Done importing main __init__ 
    1 2020-05-29 15:29:29,911 [Level 5] Importing cmdline.main 
    0 2020-05-29 15:29:29,912 [Level 5] Done importing cmdline.main 
    0 2020-05-29 15:29:29,912 [Level 5] Starting main(None) 
   11 2020-05-29 15:29:29,912 [Level 5] Starting to setup_parser 
   10 2020-05-29 15:29:29,923 [Level 5] Importing support.network 
    7 2020-05-29 15:29:29,933 [Level 5] Done importing support.network 
    1 2020-05-29 15:29:29,940 [Level 5] Requested to provide version for cmd:git 
    - |   if True, don't return annex special remotes", we split into [('exclude_special_remotes', "exclude_special_remotes: bool, optional\n  if True, don't return annex special remotes")] 
or right away see the most taxing "steps":
$> DATALAD_LOG_TIMESTAMP=1 DATALAD_LOG_LEVEL=1 datalad install --help 2>&1 | dtime | sort -n | tail
    2 2020-05-29 15:29:48,566 [DEBUG] Building doc for <class 'datalad.core.distributed.clone.Clone'> 
    2 2020-05-29 15:29:48,568 [DEBUG] Building doc for <class 'datalad.distribution.get.Get'> 
    2 2020-05-29 15:29:48,574 [Level 5] Finished setup_parser 
    2 2020-05-29 15:29:48,576 [DEBUG] Parsing known args among ['/home/yoh/proj/datalad/datalad-master/venvs/dev3/bin/datalad', 'install', '--help'] 
    4 2020-05-29 15:29:48,570 [DEBUG] Building doc for <class 'datalad.distribution.install.Install'> 
    8 2020-05-29 15:29:48,511 [Level 5] Instantiating ssh manager 
    8 2020-05-29 15:29:48,544 [Level 5] Done importing support.network 
    9 2020-05-29 15:29:48,557 [DEBUG] Building doc for <class 'datalad.local.subdatasets.Subdatasets'> 
   11 2020-05-29 15:29:48,533 [Level 5] Importing support.network 
   12 2020-05-29 15:29:48,521 [Level 5] Starting to setup_parser 
without resorting to full-scale profiling, etc.
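For illustration, a rough reimplementation of such a dtime-style filter (the real datalad helper may well differ): prefix each timestamped line with the milliseconds until the next timestamped line, and `-` where no delta can be computed:

```python
import re
from datetime import datetime

# matches the default "%(asctime)s" timestamp format of Python logging
TS = re.compile(r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})")
FMT = "%Y-%m-%d %H:%M:%S,%f"

def dtime(lines):
    """Prefix each log line with the ms elapsed until the next timestamped line."""
    parsed = []
    for line in lines:
        m = TS.search(line)
        ts = datetime.strptime(m.group(1), FMT) if m else None
        parsed.append((ts, line.rstrip("\n")))
    out = []
    for i, (ts, line) in enumerate(parsed):
        # find the next line that carries a timestamp
        nxt = next((t for t, _ in parsed[i + 1:] if t), None)
        if ts and nxt:
            delta_ms = int((nxt - ts).total_seconds() * 1000)
            out.append(f"{delta_ms:5d} {line}")
        else:
            out.append(f"    - {line}")
    return out
```

Piping `... 2>&1 | dtime | sort -n | tail` then surfaces the slowest steps, as in the transcript above.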
Or look at the traceback without a debugger:
$> DATALAD_LOG_TRACEBACK=4 datalad -l debug install --help 2>&1 | head
[DEBUG] datalad:8>main:450,216  Command line args 1st pass. Parsed: Namespace() Unparsed: ['install', '--help'] 
[DEBUG] ...>main:450,222>interface.base:102>plugin.__init__:30  Discovering plugins 
[DEBUG] ...><frozen importlib._bootstrap_external>:728><frozen importlib._bootstrap>:219>subdatasets:94>interface.base:499  Building doc for <class 'datalad.local.subdatasets.Subdatasets'> 
[DEBUG] ...><frozen importlib._bootstrap_external>:728><frozen importlib._bootstrap>:219>clone:88>interface.base:499  Building doc for <class 'datalad.core.distributed.clone.Clone'> 
[DEBUG] ...><frozen importlib._bootstrap_external>:728><frozen importlib._bootstrap>:219>get:572>interface.base:499  Building doc for <class 'datalad.distribution.get.Get'> 
[DEBUG] ...><frozen importlib._bootstrap_external>:728><frozen importlib._bootstrap>:219>install:60>interface.base:499  Building doc for <class 'datalad.distribution.install.Install'> 
[DEBUG] datalad:8>main:450,332,379  Generating detailed description for the parser 
[DEBUG] datalad:8>main:458  Parsing known args among ['/home/yoh/proj/datalad/datalad-master/venvs/dev3/bin/datalad', '-l', 'debug', 'install', '--help'] 

(BTW - that is how you could quickly figure out where we log any ERROR line we might want to helpme with ;))
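That call-path prefix can be approximated with a logging.Filter that walks the stack; this is only an illustrative sketch (not datalad's actual implementation, and the `tb` record attribute name is made up):

```python
import logging
import os
import traceback

class CompactTraceback(logging.Filter):
    """Attach a compact 'file:lineno>file:lineno' call path to every record."""

    def __init__(self, limit=4):
        super().__init__()
        self.limit = limit  # how many innermost frames to show

    def filter(self, record):
        stack = traceback.extract_stack()
        # drop frames from the logging machinery, and the last remaining
        # frame, which is this filter's own extract_stack() call
        frames = [f for f in stack
                  if os.sep + "logging" + os.sep not in f.filename][:-1]
        record.tb = ">".join(
            f"{os.path.splitext(os.path.basename(f.filename))[0]}:{f.lineno}"
            for f in frames[-self.limit:])
        return True  # never suppress the record, only annotate it
```

Attaching it to a handler whose format string contains `%(tb)s` then prints the abbreviated call chain in front of every message.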

BTW, at least for .debug (and "lower") levels of logging, I would recommend relying on the ability to pass string interpolation arguments directly into the call; e.g. instead of

app.logger.debug("Received deletion request for %s" % json.get("taskid"))

use

app.logger.debug("Received deletion request for %s", json.get("taskid"))

then interpolation is performed only if that line is actually to be logged. This way you could be more generous with .debug log lines while minimizing their negative impact.
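To see the effect of the lazy form, here is a small demonstration with an argument whose __str__ records whether it was ever formatted (all names are illustrative):

```python
import io
import logging

class Expensive:
    """Stands in for an argument that is costly to turn into a string."""
    def __init__(self):
        self.rendered = False
    def __str__(self):
        self.rendered = True
        return "expensive-to-compute value"

log = logging.getLogger("demo.lazy")
log.setLevel(logging.INFO)
log.addHandler(logging.StreamHandler(io.StringIO()))

below = Expensive()
log.debug("got %s", below)    # DEBUG < INFO: call returns early, no formatting

enabled = Expensive()
log.info("got %s", enabled)   # INFO is enabled: interpolation happens now
```

With the eager `%`-formatted variant, both objects would have been rendered regardless of the level, paying the cost even for suppressed lines.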

PS now thinking about it -- I could have had dtime as just yet another _LOG option, d'oh! ;)

@vsoch (Owner) commented May 29, 2020

There is an environment variable you can set for qme -- MESSAGELEVEL, akin to helpme (and I'll update it to be QME_MESSAGELEVEL).

@vsoch (Owner) commented May 30, 2020

From #34 we also want:

Consider using a common prefix for all env vars related to logging, e.g. the shorter QME_LOG_. Then you could uniformly add more settings as needed. I will also appreciate it personally, since that is the convention I used in a good number of projects.
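A tiny sketch of the suggested convention: collect every setting sharing the common prefix in one pass, so that adding a new knob needs no extra plumbing (the QME_LOG_ prefix follows the suggestion above; the helper name is made up):

```python
import os

PREFIX = "QME_LOG_"  # common prefix for all logging-related settings

def log_settings(environ=os.environ):
    """Return all logging settings from the environment as a lowercase dict."""
    return {key[len(PREFIX):].lower(): value
            for key, value in environ.items()
            if key.startswith(PREFIX)}
```

E.g. `QME_LOG_LEVEL=debug QME_LOG_FORMAT='%(message)s'` would yield `{"level": "debug", "format": "%(message)s"}`.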

@vsoch vsoch changed the title logging: make level configurable at invocation time (not only in config) refactor logging to use standard logging with client entrypoint May 30, 2020
@vsoch vsoch changed the title refactor logging to use standard logging with client entrypoint logging: refactor to use logging with client entrypoint May 30, 2020
@vsoch (Owner) commented May 31, 2020

@yarikoptic take a look! #35 -- I added a snippet of documentation for how it works, so be sure to look at that file. I was going for a simple, clean implementation that would make it easy to customize via environment or client.
