Hey man! I hear you, and thanks for taking the time to write in with feedback.
Currently we have this output from bazel shushed on the theory that it created a lot of noise. The thinking was also that if the tool is taking a long time, we should be coaching people to (a) run it after builds w/ the same flags, so it can hit cache for the slow header-search part, and (b) run it on faster-running subsets of their code. Does that seem reasonable/applicable, or do you definitely want more logging?
I sadly don't have bandwidth for this one at the moment. Would you be interested in exploring? If so, there are two internal phases that are (sometimes) slow: (1) aquerying bazel (if the repo is very large) and (2) running header search (if it can't hit cache from prior builds/runs of this tool). For (1), you could play with removing the silencing of bazel: go into the generated python and comment out '--ui_event_filters=-info' and '--noshow_progress'. For (2), we could put tqdm or similar around the big threadpool.map, though we'd have to figure out how to get it to install nicely w/ Bazel (pip_parse and all that).
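For (2), here's a rough stdlib-only sketch of what a progress readout around the threadpool.map could look like. It sidesteps the pip_parse/Bazel install question by not depending on tqdm at all; `run_header_search` is a hypothetical stand-in for the tool's actual per-target work, not its real function name.

```python
# Minimal stdlib-only progress indicator around a thread pool -- a sketch of
# an alternative to tqdm that avoids needing pip_parse wiring under Bazel.
import concurrent.futures
import sys
import threading


def run_header_search(target):
    # Hypothetical placeholder for the (sometimes slow) per-target work.
    return target.upper()


def map_with_progress(fn, items, max_workers=8):
    """Like pool.map(fn, items), but prints 'done/total' to stderr as it goes."""
    items = list(items)
    total = len(items)
    done = 0
    lock = threading.Lock()

    def wrapped(item):
        nonlocal done
        result = fn(item)
        with lock:
            done += 1
            # '\r' overwrites the same line, keeping one status line at the bottom.
            sys.stderr.write(f"\r>>> Analyzed {done}/{total} targets")
            sys.stderr.flush()
        return result

    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = list(pool.map(wrapped, items))
    sys.stderr.write("\n")
    return results
```

Usage would be a one-line swap at the existing map call site, e.g. `map_with_progress(run_header_search, targets)`. Swapping in tqdm later would just mean replacing the `wrapped` bookkeeping with `tqdm(pool.map(...), total=total)`.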
During
>>> Analyzing commands used in @//...
it can take quite a while before anything prints, or between things printing. It might be helpful to have a status printout at the bottom of the screen indicating that it's still working, and showing how many targets have been processed out of the total.