
[caffe2][1/n] migrate global Static Initializer #126688

Closed
wants to merge 1 commit

Conversation

klein-meta
Contributor

Summary:
The Caffe2 lib has 200+ global static initializer usages, which are paper cuts for startup perf. Details in this post: https://fb.workplace.com/groups/arglassesperf/permalink/623909116287154.
This kicks off a stack to migrate all usages of global static initializers in caffe2.

Test Plan: TODO: Please advise how I can test this change.

Differential Revision: D57531083
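
For context, a minimal sketch of the usual migration pattern, using a hypothetical kOpRegistry global rather than the exact code touched by this diff: replace a namespace-scope static object, whose constructor runs at library load time, with a function-local static that is constructed lazily on first use.

#include <map>
#include <string>

// Before: a namespace-scope global whose constructor runs as a static
// initializer at library load time, adding to startup cost.
//   static std::map<std::string, int> kOpRegistry = {{"conv", 0}, {"relu", 1}};

// After: a function-local static, constructed lazily on first use
// (thread-safe since C++11), so nothing runs at load time.
static std::map<std::string, int>& getOpRegistry() {
  static std::map<std::string, int> registry = {{"conv", 0}, {"relu", 1}};
  return registry;
}

Call sites then use getOpRegistry() instead of the global; the first call pays the construction cost, and later calls return the same instance.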


pytorch-bot bot commented May 20, 2024

This appears to be a diff that was exported from Phabricator, but the PR author does not have sufficient permissions to run CI. @klein-meta, please follow step 2 of the internal wiki to get write access so you do not need CI approvals in the future. If you think this is a mistake, please contact the PyTorch Dev Infra team.


linux-foundation-easycla bot commented May 20, 2024

CLA Signed

The committers listed above are authorized under a signed CLA.


pytorch-bot bot commented May 20, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/126688

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 43178a7 with merge base 14dc8d4:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D57531083

klein-meta added a commit to klein-meta/pytorch that referenced this pull request May 20, 2024



pytorch-bot bot commented May 21, 2024

This appears to be a diff that was exported from Phabricator, but the PR author does not have sufficient permissions to run CI. @klein-meta, please follow step 2 of the internal wiki to get write access so you do not need CI approvals in the future. If you think this is a mistake, please contact the PyTorch Dev Infra team.

@facebook-github-bot
Contributor

@pytorchbot merge -f 'Landed internally'

(Initiating merge automatically since Phabricator Diff has merged, using force because this PR might not pass merge_rules.json but landed internally)

@pytorchmergebot
Collaborator

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f as a last resort and instead consider -i/--ignore-current to continue the merge while ignoring current failures. This allows currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: Check the merge workflow status here.
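
Based on the flags described above, the recommended non-force variant would be invoked roughly as follows (a hypothetical example, not taken from this PR):

@pytorchbot merge -i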
