
Question about the implementation of Meta-Joint Optimization #162

Open
zero0kiriyu opened this issue Mar 15, 2024 · 0 comments

zero0kiriyu commented Mar 15, 2024

I tried to re-implement the meta-joint optimization part, but it always outputs an inf loss after several iterations. @cleardusk

import copy
import torch

model_vdc = mobilenet()                # copy trained with the VDC loss
model_wpdc = copy.deepcopy(model_vdc)  # identical copy trained with the WPDC loss

optimizer_vdc = torch.optim.SGD(params=model_vdc.parameters(), lr=lr)
optimizer_wpdc = torch.optim.SGD(params=model_wpdc.parameters(), lr=lr)

for epoch in range(N):
    for batch_idx, batch in enumerate(trainloader):
        if batch_idx == 0 or batch_idx % meta_joint_k != 0:
            # loss_vdc and loss_wpdc are computed from the current batch (not shown here)
            # update one copy by the VDC loss
            loss_vdc.backward()
            optimizer_vdc.step()
            optimizer_vdc.zero_grad()

            # update the other copy by the WPDC loss
            loss_wpdc.backward()
            optimizer_wpdc.step()
            optimizer_wpdc.zero_grad()
        else:
            # every meta_joint_k batches, keep whichever copy has the lower VDC loss
            model_vdc.eval()
            model_wpdc.eval()
            # calculate the VDC loss for both models
            ...

            if loss_vdc_vdc > loss_vdc_wpdc:
                model_vdc.load_state_dict(copy.deepcopy(model_wpdc.state_dict()))
                optimizer_vdc.load_state_dict(copy.deepcopy(optimizer_wpdc.state_dict()))
            else:
                model_wpdc.load_state_dict(copy.deepcopy(model_vdc.state_dict()))
                optimizer_wpdc.load_state_dict(copy.deepcopy(optimizer_vdc.state_dict()))
            model_vdc.train()
            model_wpdc.train()
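
For reference, a minimal sketch of how the per-batch losses could be computed inside the loop. The compute_losses wrapper, the vdc_loss / wpdc_loss helpers, and the (inputs, targets) batch layout are illustrative assumptions, not names from this repo:

def compute_losses(batch, model_vdc, model_wpdc, vdc_loss, wpdc_loss):
    # assumes each batch is an (inputs, targets) pair
    inputs, targets = batch
    params_vdc = model_vdc(inputs)    # 3DMM parameters from the VDC-trained copy
    params_wpdc = model_wpdc(inputs)  # 3DMM parameters from the WPDC-trained copy
    loss_vdc = vdc_loss(params_vdc, targets)     # vertex distance cost
    loss_wpdc = wpdc_loss(params_wpdc, targets)  # weighted parameter distance cost
    return loss_vdc, loss_wpdc

The two losses returned here are what the loop above calls .backward() on before stepping the corresponding optimizer.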