This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

Commit d9f3e41: merge with readme changes
vlad-karpukhin committed Feb 11, 2022
2 parents 10ca720 + 644d4fe
Showing 3 changed files with 5 additions and 4 deletions.
README.md (1 addition, 1 deletion)
@@ -32,7 +32,7 @@ Their toolkit also reports higher BM25 and hybrid scores.
4. Dense retriever component for inference time logic is based on FAISS index.

## New (March 2021) release
- DPR codeabse is upgraded with a number of enhancements and new models.
+ DPR codebase is upgraded with a number of enhancements and new models.
Major changes:
1. [Hydra](https://hydra.cc/)-based configuration for all the command line tools except the data loader (to be converted soon)
2. Pluggable data processing layer to support custom datasets
dpr/data/download_data.py (3 additions, 2 deletions)
@@ -460,6 +460,7 @@ def download(resource_key: str, out_dir: str = None):
  if resource_key not in RESOURCES_MAP:
      # match by prefix
      resources = [k for k in RESOURCES_MAP.keys() if k.startswith(resource_key)]
+     logger.info("matched by prefix resources: %s", resources)
      if resources:
          for key in resources:
              download(key, out_dir)
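The hunk above adds a log line to the existing prefix-matching path: when the requested key is not an exact entry in `RESOURCES_MAP`, every resource whose key starts with that prefix is downloaded, and the new `logger.info` call records which keys matched. A minimal runnable sketch of that logic, using a hypothetical subset of resource keys (the real `RESOURCES_MAP` in `dpr/data/download_data.py` is much larger and maps keys to download specs):

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Hypothetical subset of resource keys, for illustration only.
RESOURCES_MAP = {
    "data.retriever.nq-train": {"desc": "NQ train set"},
    "data.retriever.nq-dev": {"desc": "NQ dev set"},
    "data.wikipedia_split.psgs_w100": {"desc": "Wikipedia passages"},
}

def match_by_prefix(resource_key):
    # Same idea as the diff: a short key selects every resource whose
    # full key starts with it; the added line logs the matched keys.
    resources = [k for k in RESOURCES_MAP.keys() if k.startswith(resource_key)]
    logger.info("matched by prefix resources: %s", resources)
    return resources

matched = match_by_prefix("data.retriever.nq")
```

With this map, `match_by_prefix("data.retriever.nq")` selects both the train and dev resources, which is what lets users download a whole family of datasets with one short key.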
@@ -517,9 +518,9 @@ def main():
      if args.resource:
          download(args.resource, args.output_dir)
      else:
-         print("Please specify resource value. Possible options are:")
+         logger.warning("Please specify resource value. Possible options are:")
          for k, v in RESOURCES_MAP.items():
-             print("Resource key=%s : %s", k, v["desc"])
+             logger.warning("Resource key=%s : %s", k, v["desc"])


if __name__ == "__main__":
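The switch from `print` to `logger.warning` also fixes a latent formatting bug: `print("Resource key=%s : %s", k, v["desc"])` never interpolates the `%s` placeholders; it just writes the format string and the extra arguments separated by spaces. The logging API, by contrast, applies %-style arguments lazily when the record is emitted. A small self-contained demonstration (the handler-to-buffer setup is only there to capture the output):

```python
import io
import logging

# Capture log output in a string buffer so the result is easy to inspect.
logger = logging.getLogger("dpr.download.demo")
logger.setLevel(logging.WARNING)
buf = io.StringIO()
handler = logging.StreamHandler(buf)
handler.setFormatter(logging.Formatter("%(levelname)s %(message)s"))
logger.addHandler(handler)

key, desc = "data.retriever.nq-train", "NQ train set"

# The old call did NOT interpolate; it printed the raw format string:
#   print("Resource key=%s : %s", key, desc)
#   -> Resource key=%s : %s data.retriever.nq-train NQ train set

# logging interpolates the arguments into the message:
logger.warning("Resource key=%s : %s", key, desc)
# buf now holds: WARNING Resource key=data.retriever.nq-train : NQ train set
```

Lazy interpolation is also why logging calls pass `k, v["desc"]` as separate arguments rather than pre-formatting the string: the formatting cost is skipped entirely when the log level filters the record out.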
train_dense_encoder.py (1 addition, 1 deletion)
@@ -740,7 +740,7 @@ def _do_biencoder_fwd_pass(
      if cfg.n_gpu > 1:
          loss = loss.mean()
      if cfg.train.gradient_accumulation_steps > 1:
-         loss = loss / cfg.gradient_accumulation_steps
+         loss = loss / cfg.train.gradient_accumulation_steps
      return loss, is_correct
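This one-line fix corrects the attribute path: under the nested (Hydra-style) config, `gradient_accumulation_steps` lives on `cfg.train`, so the guard on the line above already reads it from there while the division read it from the top-level `cfg`, where it does not exist. The division itself is the standard loss scaling for gradient accumulation: dividing each micro-batch loss by the number of accumulation steps makes the summed gradients equal the gradient of the full batch. A pure-Python sketch with illustrative numbers (a toy mean-squared-error gradient, not DPR's actual biencoder loss):

```python
from types import SimpleNamespace

# Hypothetical stand-in for the nested config: the steps value lives
# under cfg.train, which is why cfg.gradient_accumulation_steps failed.
cfg = SimpleNamespace(n_gpu=1, train=SimpleNamespace(gradient_accumulation_steps=4))

def grad_mse(w, batch):
    # d/dw of mean((w - x)^2) over the batch.
    return sum(2 * (w - x) for x in batch) / len(batch)

w = 0.0
big_batch = [float(i) for i in range(8)]  # one "effective" batch of 8

g_full = grad_mse(w, big_batch)  # gradient of the mean loss, single pass

# Accumulate over 4 micro-batches of 2, scaling each contribution by
# 1/gradient_accumulation_steps -- the corrected division in the diff.
steps = cfg.train.gradient_accumulation_steps
g_accum = 0.0
for i in range(steps):
    chunk = big_batch[2 * i: 2 * i + 2]
    g_accum += grad_mse(w, chunk) / steps

assert abs(g_accum - g_full) < 1e-9  # accumulated == full-batch gradient
```

Without the `/ steps` scaling, the accumulated gradient would be `steps` times too large, effectively multiplying the learning rate by the accumulation factor.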

