
How to understand the word "parallel". #129

Open
isCopyman opened this issue Nov 29, 2022 · 0 comments
Some papers on heterogeneous domain adaptation mention this word, but no good explanation is given. Does anyone understand what it means?

Paper: Unsupervised Heterogeneous Domain Adaptation with Sparse Feature Transformation

Context:

  1. Some semi-supervised HDA methods even utilize parallel unlabeled instances to learn cross-domain representations
  2. A few unsupervised HDA approaches overcome this dependence limitation on labeled target data by learning a common latent correlation subspace based only on parallel instances
  3. The method uses a linear function to transform the source domain features into the target domain features to match the parallel instances, while minimizing the cross-domain distribution divergence by aligning the transformed source domain covariance matrix with the target domain covariance matrix.
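
In similar HDA and cross-lingual papers, "parallel instances" usually seems to mean pairs of unlabeled instances that describe the same object but are represented in the two different feature spaces (analogous to sentence pairs in a parallel corpus). Under that reading, point 3 above could look roughly like the sketch below. This is only my own illustration with made-up names and dimensions, not the paper's actual objective (which also involves sparsity):

```python
# Minimal sketch, assuming "parallel instances" are paired rows of Xs and Xt
# that correspond to the same underlying objects. All names are illustrative.
import numpy as np

rng = np.random.default_rng(0)

d_s, d_t = 50, 30        # heterogeneous (different-sized) feature spaces
n_parallel = 200         # number of parallel (paired) unlabeled instances

Xs = rng.normal(size=(n_parallel, d_s))  # source-domain features
Xt = rng.normal(size=(n_parallel, d_t))  # target-domain features of the SAME instances

# Learn a linear map W (d_s x d_t) so that Xs @ W matches Xt on the parallel
# pairs: a ridge-regularized least-squares fit.
lam = 1e-2
W = np.linalg.solve(Xs.T @ Xs + lam * np.eye(d_s), Xs.T @ Xt)

# Covariance alignment mentioned in the quote: compare the covariance of the
# transformed source features with the target-domain covariance.
cov_src = np.cov((Xs @ W).T)
cov_tgt = np.cov(Xt.T)
divergence = np.linalg.norm(cov_src - cov_tgt, ord="fro")
print(W.shape, divergence)
```

So the "parallel" part is just the supervision signal for learning the cross-domain mapping: no labels are needed, only correspondences between instances across the two domains. Happy to be corrected if the paper means something else.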