This repository has been archived by the owner on Mar 19, 2024. It is now read-only.

Incorrect accuracy when running benchmark on linear_image_classification on imagenet1K #545

Open
VicaYang opened this issue May 11, 2022 · 5 comments

VicaYang commented May 11, 2022

Instructions To Reproduce the Issue:

I used the prebuilt vissl, created a new folder containing tools/run_distributed_engines.py and all the files under the configs folder, and created a new config to run the benchmark with ViT-B/16.

1. The config I used (git diff):
# @package _global_
config:
  VERBOSE: False
  LOG_FREQUENCY: 10
  TEST_ONLY: False
  TEST_MODEL: True
  SEED_VALUE: 0
  MULTI_PROCESSING_METHOD: forkserver
  HOOKS:
    PERF_STATS:
      MONITOR_PERF_STATS: True
      ROLLING_BTIME_FREQ: 313
      PERF_STAT_FREQUENCY: 10
    TENSORBOARD_SETUP:
      USE_TENSORBOARD: True
      EXPERIMENT_LOG_DIR:
      FLUSH_EVERY_N_MIN: 20
  CHECKPOINT:
    DIR: "."
    AUTO_RESUME: True
    CHECKPOINT_FREQUENCY: 10
  DATA:
    NUM_DATALOADER_WORKERS: 5
    TRAIN:
      DATA_SOURCES: [disk_folder]
      LABEL_SOURCES: [disk_folder]
      DATASET_NAMES: [imagenet1k_folder]
      BATCHSIZE_PER_REPLICA: 1024
      TRANSFORMS:
        - name: RandomResizedCrop
          size: 224
        - name: RandomHorizontalFlip
        - name: ToTensor
        - name: Normalize
          mean: [0.485, 0.456, 0.406]
          std: [0.229, 0.224, 0.225]
    TEST:
      DATA_SOURCES: [disk_folder]
      LABEL_SOURCES: [disk_folder]
      DATASET_NAMES: [imagenet1k_folder]
      BATCHSIZE_PER_REPLICA: 1024
      TRANSFORMS:
        - name: Resize
          size: 256
        - name: CenterCrop
          size: 224
        - name: ToTensor
        - name: Normalize
          mean: [0.485, 0.456, 0.406]
          std: [0.229, 0.224, 0.225]
  MODEL:
    GRAD_CLIP:
      USE_GRAD_CLIP: False
    FEATURE_EVAL_SETTINGS:
      EVAL_MODE_ON: True
      FREEZE_TRUNK_ONLY: True
    TRUNK:
      NAME: vision_transformer
      VISION_TRANSFORMERS:
        IMAGE_SIZE: 224
        PATCH_SIZE: 16
        NUM_LAYERS: 12
        NUM_HEADS: 12
        HIDDEN_DIM: 768
        MLP_DIM: 3072
        DROPOUT_RATE: 0.1
        ATTENTION_DROPOUT_RATE: 0
        CLASSIFIER: token
    HEAD:
      PARAMS: [
        ["mlp", {"dims": [768, 1000]}],
      ]
    WEIGHTS_INIT:
      PARAMS_FILE: "specify the model weights"
      STATE_DICT_KEY_NAME: classy_state_dict
    SYNC_BN_CONFIG:
      CONVERT_BN_TO_SYNC_BN: False
      SYNC_BN_TYPE: apex
      GROUP_SIZE: 8
    AMP_PARAMS:
      USE_AMP: True
      AMP_ARGS: {"opt_level": "O1"}
  LOSS:
    name: cross_entropy_multiple_output_single_target
    cross_entropy_multiple_output_single_target:
      ignore_index: -1
  OPTIMIZER:
    name: sgd
    # In the OSS Caffe2 benchmark, RN50 models use 1e-4 and AlexNet models 5e-4
    weight_decay: 0
    momentum: 0.9
    num_epochs: 100
    nesterov: False
    regularize_bias: True
    param_schedulers:
      lr:
        auto_lr_scaling:
          auto_scale: true
          base_value: 0.1
          base_lr_batch_size: 256
        name: composite
        schedulers:
          - name: linear
            start_value: 0.0
            end_value: 0.1
          - name: cosine
            start_value: 0.1
            end_value: 0
        interval_scaling: [rescaled, rescaled]
        update_interval: step
        lengths: [0.1, 0.9]
  METERS:
    name: accuracy_list_meter
    accuracy_list_meter:
      num_meters: 1
      topk_values: [1, 5]
  TRAINER:
    TRAIN_STEP_NAME: standard_train_step
  DISTRIBUTED:
    BACKEND: nccl
    NUM_NODES: 1
    NUM_PROC_PER_NODE: 1
    RUN_ID: auto
  MACHINE:
    DEVICE: gpu
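For reference, the auto_lr_scaling block above follows the linear scaling rule: the base LR of 0.1 is defined for batch size 256 and is rescaled by the actual global batch size. A minimal sketch of the arithmetic (my own illustration, not VISSL's code):

```python
# Linear LR scaling rule as configured above: lr = base_value * global_bs / base_bs.
# This mirrors what auto_lr_scaling is documented to do; it is not VISSL's code.
def scale_lr(base_value, base_lr_batch_size, batchsize_per_replica, num_replicas):
    global_batch_size = batchsize_per_replica * num_replicas
    return base_value * global_batch_size / base_lr_batch_size

# With BATCHSIZE_PER_REPLICA=1024 and NUM_PROC_PER_NODE=1 from this config:
print(scale_lr(0.1, 256, 1024, 1))  # -> 0.4
```

So if the scheduler values are rescaled as documented, the warmup peaks and the cosine schedule starts at an effective LR of 0.4, not the literal 0.1 written in the scheduler.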

And my dataset_catalog.json (I am not sure whether I should use the test split or the val split for "val"; the val folder was created using this script):

{"imagenet1k_folder": {"train": ["/data/vica/dataset/ILSVRC2012/train", "/data/vica/dataset/ILSVRC2012/train"], "val": ["/data/vica/dataset/ILSVRC2012/val", "/data/vica/dataset/ILSVRC2012/val"]}}
2. The exact commands I ran:
python tools/run_distributed_engines.py \
  config=benchmark/linear_image_classification/imagenet1k/2 \
  config.MODEL.WEIGHTS_INIT.PARAMS_FILE="../weight/vit_b16_p16_in22k_ep90_supervised.torch" \
  config.CHECKPOINT.DIR="checkpoint/2"


python tools/run_distributed_engines.py \
  config=benchmark/linear_image_classification/imagenet1k/2 \
  config.MODEL.WEIGHTS_INIT.PARAMS_FILE="../weight/mocov3-vit-b-300ep.pth.tar" \
  config.MODEL.WEIGHTS_INIT.STATE_DICT_KEY_NAME="" \
  config.MODEL.WEIGHTS_INIT.APPEND_PREFIX="trunk._feature_blocks." \
  config.CHECKPOINT.DIR="checkpoint/moco2"
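For the MoCo v3 run, APPEND_PREFIX exists to rename every checkpoint key so it matches VISSL's trunk naming. A toy illustration of the intended remapping (the keys below are hypothetical, not taken from the actual checkpoint):

```python
# Prepend a prefix to every key of a checkpoint state dict, which is what
# WEIGHTS_INIT.APPEND_PREFIX is meant to achieve. Keys are made up for the demo.
def append_prefix(state_dict, prefix):
    return {prefix + key: value for key, value in state_dict.items()}

ckpt = {"patch_embed.proj.weight": 0, "blocks.0.attn.qkv.weight": 1}
print(list(append_prefix(ckpt, "trunk._feature_blocks.")))
# -> ['trunk._feature_blocks.patch_embed.proj.weight',
#     'trunk._feature_blocks.blocks.0.attn.qkv.weight']
```

Note that if the checkpoint keys carry their own prefix (for example "module.base_encoder."), prepending alone would not make them line up, and mismatched keys would leave the trunk randomly initialized.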
3. The full logs I observed:

2/metrics.json
{"iteration": 1252, "phase_idx": 0, "train_accuracy_list_meter": {"top_1": {"0": 48.6703}, "top_5": {"0": 70.0547}}, "train_phase_idx": 0}
{"iteration": 1252, "phase_idx": 1, "test_accuracy_list_meter": {"top_1": {"0": 78.2}, "top_5": {"0": 95.078}}, "train_phase_idx": 0}
{"iteration": 2504, "phase_idx": 2, "train_accuracy_list_meter": {"top_1": {"0": 51.3593}, "top_5": {"0": 72.7701}}, "train_phase_idx": 1}
{"iteration": 2504, "phase_idx": 3, "test_accuracy_list_meter": {"top_1": {"0": 78.338}, "top_5": {"0": 94.95400000000001}}, "train_phase_idx": 1}
{"iteration": 3756, "phase_idx": 4, "train_accuracy_list_meter": {"top_1": {"0": 51.6701}, "top_5": {"0": 73.0482}}, "train_phase_idx": 2}
{"iteration": 3756, "phase_idx": 5, "test_accuracy_list_meter": {"top_1": {"0": 78.172}, "top_5": {"0": 94.75}}, "train_phase_idx": 2}
{"iteration": 5008, "phase_idx": 6, "train_accuracy_list_meter": {"top_1": {"0": 51.7722}, "top_5": {"0": 73.0847}}, "train_phase_idx": 3}
{"iteration": 5008, "phase_idx": 7, "test_accuracy_list_meter": {"top_1": {"0": 78.2}, "top_5": {"0": 94.716}}, "train_phase_idx": 3}
{"iteration": 6260, "phase_idx": 8, "train_accuracy_list_meter": {"top_1": {"0": 51.843799999999995}, "top_5": {"0": 73.1542}}, "train_phase_idx": 4}
{"iteration": 6260, "phase_idx": 9, "test_accuracy_list_meter": {"top_1": {"0": 77.97}, "top_5": {"0": 94.678}}, "train_phase_idx": 4}
{"iteration": 7512, "phase_idx": 10, "train_accuracy_list_meter": {"top_1": {"0": 51.8061}, "top_5": {"0": 73.1753}}, "train_phase_idx": 5}
{"iteration": 7512, "phase_idx": 11, "test_accuracy_list_meter": {"top_1": {"0": 78.034}, "top_5": {"0": 94.726}}, "train_phase_idx": 5}
{"iteration": 8764, "phase_idx": 12, "train_accuracy_list_meter": {"top_1": {"0": 51.9195}, "top_5": {"0": 73.2584}}, "train_phase_idx": 6}
{"iteration": 8764, "phase_idx": 13, "test_accuracy_list_meter": {"top_1": {"0": 77.932}, "top_5": {"0": 94.738}}, "train_phase_idx": 6}
{"iteration": 10016, "phase_idx": 14, "train_accuracy_list_meter": {"top_1": {"0": 51.9531}, "top_5": {"0": 73.237}}, "train_phase_idx": 7}
{"iteration": 10016, "phase_idx": 15, "test_accuracy_list_meter": {"top_1": {"0": 78.038}, "top_5": {"0": 94.624}}, "train_phase_idx": 7}
{"iteration": 11268, "phase_idx": 16, "train_accuracy_list_meter": {"top_1": {"0": 51.96340000000001}, "top_5": {"0": 73.311}}, "train_phase_idx": 8}
{"iteration": 11268, "phase_idx": 17, "test_accuracy_list_meter": {"top_1": {"0": 78.02799999999999}, "top_5": {"0": 94.666}}, "train_phase_idx": 8}
{"iteration": 12520, "phase_idx": 18, "train_accuracy_list_meter": {"top_1": {"0": 51.9783}, "top_5": {"0": 73.3279}}, "train_phase_idx": 9}
{"iteration": 12520, "phase_idx": 19, "test_accuracy_list_meter": {"top_1": {"0": 78.022}, "top_5": {"0": 94.638}}, "train_phase_idx": 9}
{"iteration": 13772, "phase_idx": 20, "train_accuracy_list_meter": {"top_1": {"0": 51.9991}, "top_5": {"0": 73.3466}}, "train_phase_idx": 10}
{"iteration": 13772, "phase_idx": 21, "test_accuracy_list_meter": {"top_1": {"0": 78.172}, "top_5": {"0": 94.616}}, "train_phase_idx": 10}
{"iteration": 15024, "phase_idx": 22, "train_accuracy_list_meter": {"top_1": {"0": 52.0532}, "top_5": {"0": 73.4346}}, "train_phase_idx": 11}
{"iteration": 15024, "phase_idx": 23, "test_accuracy_list_meter": {"top_1": {"0": 78.13}, "top_5": {"0": 94.634}}, "train_phase_idx": 11}
{"iteration": 16276, "phase_idx": 24, "train_accuracy_list_meter": {"top_1": {"0": 52.01090000000001}, "top_5": {"0": 73.3309}}, "train_phase_idx": 12}
{"iteration": 16276, "phase_idx": 25, "test_accuracy_list_meter": {"top_1": {"0": 77.776}, "top_5": {"0": 94.574}}, "train_phase_idx": 12}
{"iteration": 17528, "phase_idx": 26, "train_accuracy_list_meter": {"top_1": {"0": 52.0753}, "top_5": {"0": 73.3931}}, "train_phase_idx": 13}
{"iteration": 17528, "phase_idx": 27, "test_accuracy_list_meter": {"top_1": {"0": 77.912}, "top_5": {"0": 94.594}}, "train_phase_idx": 13}
{"iteration": 18780, "phase_idx": 28, "train_accuracy_list_meter": {"top_1": {"0": 52.162200000000006}, "top_5": {"0": 73.43990000000001}}, "train_phase_idx": 14}
{"iteration": 18780, "phase_idx": 29, "test_accuracy_list_meter": {"top_1": {"0": 78.078}, "top_5": {"0": 94.648}}, "train_phase_idx": 14}
{"iteration": 20032, "phase_idx": 30, "train_accuracy_list_meter": {"top_1": {"0": 52.1456}, "top_5": {"0": 73.4644}}, "train_phase_idx": 15}
{"iteration": 20032, "phase_idx": 31, "test_accuracy_list_meter": {"top_1": {"0": 77.864}, "top_5": {"0": 94.64}}, "train_phase_idx": 15}
{"iteration": 21284, "phase_idx": 32, "train_accuracy_list_meter": {"top_1": {"0": 52.1913}, "top_5": {"0": 73.53110000000001}}, "train_phase_idx": 16}
{"iteration": 21284, "phase_idx": 33, "test_accuracy_list_meter": {"top_1": {"0": 78.208}, "top_5": {"0": 94.768}}, "train_phase_idx": 16}
{"iteration": 22536, "phase_idx": 34, "train_accuracy_list_meter": {"top_1": {"0": 52.186600000000006}, "top_5": {"0": 73.53020000000001}}, "train_phase_idx": 17}
{"iteration": 22536, "phase_idx": 35, "test_accuracy_list_meter": {"top_1": {"0": 78.254}, "top_5": {"0": 94.634}}, "train_phase_idx": 17}
{"iteration": 23788, "phase_idx": 36, "train_accuracy_list_meter": {"top_1": {"0": 52.219300000000004}, "top_5": {"0": 73.5076}}, "train_phase_idx": 18}
{"iteration": 23788, "phase_idx": 37, "test_accuracy_list_meter": {"top_1": {"0": 78.194}, "top_5": {"0": 94.76}}, "train_phase_idx": 18}
{"iteration": 25040, "phase_idx": 38, "train_accuracy_list_meter": {"top_1": {"0": 52.2216}, "top_5": {"0": 73.51520000000001}}, "train_phase_idx": 19}
{"iteration": 25040, "phase_idx": 39, "test_accuracy_list_meter": {"top_1": {"0": 78.102}, "top_5": {"0": 94.728}}, "train_phase_idx": 19}
{"iteration": 26292, "phase_idx": 40, "train_accuracy_list_meter": {"top_1": {"0": 52.3286}, "top_5": {"0": 73.6481}}, "train_phase_idx": 20}
{"iteration": 26292, "phase_idx": 41, "test_accuracy_list_meter": {"top_1": {"0": 78.244}, "top_5": {"0": 94.722}}, "train_phase_idx": 20}
{"iteration": 27544, "phase_idx": 42, "train_accuracy_list_meter": {"top_1": {"0": 52.3783}, "top_5": {"0": 73.66629999999999}}, "train_phase_idx": 21}
{"iteration": 27544, "phase_idx": 43, "test_accuracy_list_meter": {"top_1": {"0": 78.44200000000001}, "top_5": {"0": 94.69999999999999}}, "train_phase_idx": 21}
{"iteration": 28796, "phase_idx": 44, "train_accuracy_list_meter": {"top_1": {"0": 52.3906}, "top_5": {"0": 73.714}}, "train_phase_idx": 22}
{"iteration": 28796, "phase_idx": 45, "test_accuracy_list_meter": {"top_1": {"0": 78.314}, "top_5": {"0": 94.688}}, "train_phase_idx": 22}
{"iteration": 30048, "phase_idx": 46, "train_accuracy_list_meter": {"top_1": {"0": 52.4285}, "top_5": {"0": 73.8044}}, "train_phase_idx": 23}
{"iteration": 30048, "phase_idx": 47, "test_accuracy_list_meter": {"top_1": {"0": 78.292}, "top_5": {"0": 94.76400000000001}}, "train_phase_idx": 23}
{"iteration": 31300, "phase_idx": 48, "train_accuracy_list_meter": {"top_1": {"0": 52.4684}, "top_5": {"0": 73.8225}}, "train_phase_idx": 24}
{"iteration": 31300, "phase_idx": 49, "test_accuracy_list_meter": {"top_1": {"0": 78.236}, "top_5": {"0": 94.704}}, "train_phase_idx": 24}
{"iteration": 32552, "phase_idx": 50, "train_accuracy_list_meter": {"top_1": {"0": 52.481100000000005}, "top_5": {"0": 73.7923}}, "train_phase_idx": 25}
{"iteration": 32552, "phase_idx": 51, "test_accuracy_list_meter": {"top_1": {"0": 78.33200000000001}, "top_5": {"0": 94.78999999999999}}, "train_phase_idx": 25}
{"iteration": 33804, "phase_idx": 52, "train_accuracy_list_meter": {"top_1": {"0": 52.648700000000005}, "top_5": {"0": 73.9105}}, "train_phase_idx": 26}
{"iteration": 33804, "phase_idx": 53, "test_accuracy_list_meter": {"top_1": {"0": 78.268}, "top_5": {"0": 94.686}}, "train_phase_idx": 26}
{"iteration": 35056, "phase_idx": 54, "train_accuracy_list_meter": {"top_1": {"0": 52.635200000000005}, "top_5": {"0": 73.976}}, "train_phase_idx": 27}
{"iteration": 35056, "phase_idx": 55, "test_accuracy_list_meter": {"top_1": {"0": 78.312}, "top_5": {"0": 94.67999999999999}}, "train_phase_idx": 27}
{"iteration": 36308, "phase_idx": 56, "train_accuracy_list_meter": {"top_1": {"0": 52.674699999999994}, "top_5": {"0": 73.9935}}, "train_phase_idx": 28}
{"iteration": 36308, "phase_idx": 57, "test_accuracy_list_meter": {"top_1": {"0": 78.194}, "top_5": {"0": 94.65599999999999}}, "train_phase_idx": 28}
{"iteration": 37560, "phase_idx": 58, "train_accuracy_list_meter": {"top_1": {"0": 52.6903}, "top_5": {"0": 74.01950000000001}}, "train_phase_idx": 29}
{"iteration": 37560, "phase_idx": 59, "test_accuracy_list_meter": {"top_1": {"0": 78.508}, "top_5": {"0": 94.792}}, "train_phase_idx": 29}
{"iteration": 38812, "phase_idx": 60, "train_accuracy_list_meter": {"top_1": {"0": 52.7625}, "top_5": {"0": 74.0935}}, "train_phase_idx": 30}
{"iteration": 38812, "phase_idx": 61, "test_accuracy_list_meter": {"top_1": {"0": 78.43}, "top_5": {"0": 94.652}}, "train_phase_idx": 30}
{"iteration": 40064, "phase_idx": 62, "train_accuracy_list_meter": {"top_1": {"0": 52.8271}, "top_5": {"0": 74.12910000000001}}, "train_phase_idx": 31}
{"iteration": 40064, "phase_idx": 63, "test_accuracy_list_meter": {"top_1": {"0": 78.422}, "top_5": {"0": 94.73}}, "train_phase_idx": 31}
{"iteration": 41316, "phase_idx": 64, "train_accuracy_list_meter": {"top_1": {"0": 52.92210000000001}, "top_5": {"0": 74.17049999999999}}, "train_phase_idx": 32}
{"iteration": 41316, "phase_idx": 65, "test_accuracy_list_meter": {"top_1": {"0": 78.47}, "top_5": {"0": 94.782}}, "train_phase_idx": 32}
{"iteration": 42568, "phase_idx": 66, "train_accuracy_list_meter": {"top_1": {"0": 52.9482}, "top_5": {"0": 74.2158}}, "train_phase_idx": 33}
{"iteration": 42568, "phase_idx": 67, "test_accuracy_list_meter": {"top_1": {"0": 78.50200000000001}, "top_5": {"0": 94.812}}, "train_phase_idx": 33}
{"iteration": 43820, "phase_idx": 68, "train_accuracy_list_meter": {"top_1": {"0": 53.0861}, "top_5": {"0": 74.3485}}, "train_phase_idx": 34}
{"iteration": 43820, "phase_idx": 69, "test_accuracy_list_meter": {"top_1": {"0": 78.604}, "top_5": {"0": 94.83}}, "train_phase_idx": 34}
{"iteration": 45072, "phase_idx": 70, "train_accuracy_list_meter": {"top_1": {"0": 53.040299999999995}, "top_5": {"0": 74.3432}}, "train_phase_idx": 35}
{"iteration": 45072, "phase_idx": 71, "test_accuracy_list_meter": {"top_1": {"0": 78.494}, "top_5": {"0": 94.914}}, "train_phase_idx": 35}
{"iteration": 46324, "phase_idx": 72, "train_accuracy_list_meter": {"top_1": {"0": 53.1142}, "top_5": {"0": 74.3947}}, "train_phase_idx": 36}
{"iteration": 46324, "phase_idx": 73, "test_accuracy_list_meter": {"top_1": {"0": 78.66}, "top_5": {"0": 94.868}}, "train_phase_idx": 36}
{"iteration": 47576, "phase_idx": 74, "train_accuracy_list_meter": {"top_1": {"0": 53.2202}, "top_5": {"0": 74.47409999999999}}, "train_phase_idx": 37}
{"iteration": 47576, "phase_idx": 75, "test_accuracy_list_meter": {"top_1": {"0": 78.556}, "top_5": {"0": 94.808}}, "train_phase_idx": 37}
{"iteration": 48828, "phase_idx": 76, "train_accuracy_list_meter": {"top_1": {"0": 53.3245}, "top_5": {"0": 74.5382}}, "train_phase_idx": 38}
{"iteration": 48828, "phase_idx": 77, "test_accuracy_list_meter": {"top_1": {"0": 78.61}, "top_5": {"0": 94.878}}, "train_phase_idx": 38}
{"iteration": 50080, "phase_idx": 78, "train_accuracy_list_meter": {"top_1": {"0": 53.376599999999996}, "top_5": {"0": 74.6219}}, "train_phase_idx": 39}
{"iteration": 50080, "phase_idx": 79, "test_accuracy_list_meter": {"top_1": {"0": 78.73400000000001}, "top_5": {"0": 94.86}}, "train_phase_idx": 39}
{"iteration": 51332, "phase_idx": 80, "train_accuracy_list_meter": {"top_1": {"0": 53.5072}, "top_5": {"0": 74.66380000000001}}, "train_phase_idx": 40}
{"iteration": 51332, "phase_idx": 81, "test_accuracy_list_meter": {"top_1": {"0": 78.728}, "top_5": {"0": 94.772}}, "train_phase_idx": 40}
{"iteration": 52584, "phase_idx": 82, "train_accuracy_list_meter": {"top_1": {"0": 53.455600000000004}, "top_5": {"0": 74.7445}}, "train_phase_idx": 41}
{"iteration": 52584, "phase_idx": 83, "test_accuracy_list_meter": {"top_1": {"0": 78.678}, "top_5": {"0": 94.882}}, "train_phase_idx": 41}
{"iteration": 53836, "phase_idx": 84, "train_accuracy_list_meter": {"top_1": {"0": 53.6231}, "top_5": {"0": 74.7963}}, "train_phase_idx": 42}
{"iteration": 53836, "phase_idx": 85, "test_accuracy_list_meter": {"top_1": {"0": 78.61399999999999}, "top_5": {"0": 94.87400000000001}}, "train_phase_idx": 42}
{"iteration": 55088, "phase_idx": 86, "train_accuracy_list_meter": {"top_1": {"0": 53.71510000000001}, "top_5": {"0": 74.8763}}, "train_phase_idx": 43}
{"iteration": 55088, "phase_idx": 87, "test_accuracy_list_meter": {"top_1": {"0": 78.786}, "top_5": {"0": 94.948}}, "train_phase_idx": 43}
{"iteration": 56340, "phase_idx": 88, "train_accuracy_list_meter": {"top_1": {"0": 53.782799999999995}, "top_5": {"0": 74.9507}}, "train_phase_idx": 44}
{"iteration": 56340, "phase_idx": 89, "test_accuracy_list_meter": {"top_1": {"0": 78.78}, "top_5": {"0": 94.932}}, "train_phase_idx": 44}
{"iteration": 57592, "phase_idx": 90, "train_accuracy_list_meter": {"top_1": {"0": 53.7792}, "top_5": {"0": 75.0033}}, "train_phase_idx": 45}
{"iteration": 57592, "phase_idx": 91, "test_accuracy_list_meter": {"top_1": {"0": 78.74600000000001}, "top_5": {"0": 95.012}}, "train_phase_idx": 45}
{"iteration": 58844, "phase_idx": 92, "train_accuracy_list_meter": {"top_1": {"0": 53.9329}, "top_5": {"0": 75.1015}}, "train_phase_idx": 46}
{"iteration": 58844, "phase_idx": 93, "test_accuracy_list_meter": {"top_1": {"0": 78.716}, "top_5": {"0": 94.928}}, "train_phase_idx": 46}
{"iteration": 60096, "phase_idx": 94, "train_accuracy_list_meter": {"top_1": {"0": 54.000499999999995}, "top_5": {"0": 75.1875}}, "train_phase_idx": 47}
{"iteration": 60096, "phase_idx": 95, "test_accuracy_list_meter": {"top_1": {"0": 79.07600000000001}, "top_5": {"0": 94.95599999999999}}, "train_phase_idx": 47}
{"iteration": 61348, "phase_idx": 96, "train_accuracy_list_meter": {"top_1": {"0": 54.0794}, "top_5": {"0": 75.2159}}, "train_phase_idx": 48}
{"iteration": 61348, "phase_idx": 97, "test_accuracy_list_meter": {"top_1": {"0": 79.066}, "top_5": {"0": 95.028}}, "train_phase_idx": 48}
{"iteration": 62600, "phase_idx": 98, "train_accuracy_list_meter": {"top_1": {"0": 54.1452}, "top_5": {"0": 75.3116}}, "train_phase_idx": 49}
{"iteration": 62600, "phase_idx": 99, "test_accuracy_list_meter": {"top_1": {"0": 78.99000000000001}, "top_5": {"0": 95.02000000000001}}, "train_phase_idx": 49}
{"iteration": 63852, "phase_idx": 100, "train_accuracy_list_meter": {"top_1": {"0": 54.2848}, "top_5": {"0": 75.4063}}, "train_phase_idx": 50}
{"iteration": 63852, "phase_idx": 101, "test_accuracy_list_meter": {"top_1": {"0": 78.934}, "top_5": {"0": 95.06599999999999}}, "train_phase_idx": 50}
{"iteration": 65104, "phase_idx": 102, "train_accuracy_list_meter": {"top_1": {"0": 54.315000000000005}, "top_5": {"0": 75.4286}}, "train_phase_idx": 51}
{"iteration": 65104, "phase_idx": 103, "test_accuracy_list_meter": {"top_1": {"0": 78.938}, "top_5": {"0": 95.06599999999999}}, "train_phase_idx": 51}
{"iteration": 66356, "phase_idx": 104, "train_accuracy_list_meter": {"top_1": {"0": 54.443200000000004}, "top_5": {"0": 75.4973}}, "train_phase_idx": 52}
{"iteration": 66356, "phase_idx": 105, "test_accuracy_list_meter": {"top_1": {"0": 79.012}, "top_5": {"0": 95.06400000000001}}, "train_phase_idx": 52}
{"iteration": 67608, "phase_idx": 106, "train_accuracy_list_meter": {"top_1": {"0": 54.4863}, "top_5": {"0": 75.60510000000001}}, "train_phase_idx": 53}
{"iteration": 67608, "phase_idx": 107, "test_accuracy_list_meter": {"top_1": {"0": 79.026}, "top_5": {"0": 95.002}}, "train_phase_idx": 53}
{"iteration": 68860, "phase_idx": 108, "train_accuracy_list_meter": {"top_1": {"0": 54.500400000000006}, "top_5": {"0": 75.61630000000001}}, "train_phase_idx": 54}
{"iteration": 68860, "phase_idx": 109, "test_accuracy_list_meter": {"top_1": {"0": 79.042}, "top_5": {"0": 95.038}}, "train_phase_idx": 54}
{"iteration": 70112, "phase_idx": 110, "train_accuracy_list_meter": {"top_1": {"0": 54.6803}, "top_5": {"0": 75.7338}}, "train_phase_idx": 55}
{"iteration": 70112, "phase_idx": 111, "test_accuracy_list_meter": {"top_1": {"0": 79.096}, "top_5": {"0": 95.164}}, "train_phase_idx": 55}
{"iteration": 71364, "phase_idx": 112, "train_accuracy_list_meter": {"top_1": {"0": 54.761}, "top_5": {"0": 75.8537}}, "train_phase_idx": 56}
{"iteration": 71364, "phase_idx": 113, "test_accuracy_list_meter": {"top_1": {"0": 79.066}, "top_5": {"0": 95.122}}, "train_phase_idx": 56}
{"iteration": 72616, "phase_idx": 114, "train_accuracy_list_meter": {"top_1": {"0": 54.8493}, "top_5": {"0": 75.8888}}, "train_phase_idx": 57}
{"iteration": 72616, "phase_idx": 115, "test_accuracy_list_meter": {"top_1": {"0": 79.026}, "top_5": {"0": 95.11}}, "train_phase_idx": 57}
{"iteration": 73868, "phase_idx": 116, "train_accuracy_list_meter": {"top_1": {"0": 54.9263}, "top_5": {"0": 75.9718}}, "train_phase_idx": 58}
{"iteration": 73868, "phase_idx": 117, "test_accuracy_list_meter": {"top_1": {"0": 79.28399999999999}, "top_5": {"0": 95.126}}, "train_phase_idx": 58}
{"iteration": 75120, "phase_idx": 118, "train_accuracy_list_meter": {"top_1": {"0": 54.97410000000001}, "top_5": {"0": 76.0272}}, "train_phase_idx": 59}
{"iteration": 75120, "phase_idx": 119, "test_accuracy_list_meter": {"top_1": {"0": 79.186}, "top_5": {"0": 95.178}}, "train_phase_idx": 59}

moco2/metrics.json

{"iteration": 1252, "phase_idx": 0, "train_accuracy_list_meter": {"top_1": {"0": 0.7965}, "top_5": {"0": 2.8588999999999998}}, "train_phase_idx": 0}
{"iteration": 1252, "phase_idx": 1, "test_accuracy_list_meter": {"top_1": {"0": 1.138}, "top_5": {"0": 4.0120000000000005}}, "train_phase_idx": 0}
{"iteration": 2504, "phase_idx": 2, "train_accuracy_list_meter": {"top_1": {"0": 0.8748}, "top_5": {"0": 3.1144000000000003}}, "train_phase_idx": 1}
{"iteration": 2504, "phase_idx": 3, "test_accuracy_list_meter": {"top_1": {"0": 1.25}, "top_5": {"0": 4.2459999999999996}}, "train_phase_idx": 1}
{"iteration": 3756, "phase_idx": 4, "train_accuracy_list_meter": {"top_1": {"0": 0.8821000000000001}, "top_5": {"0": 3.1538999999999997}}, "train_phase_idx": 2}
{"iteration": 3756, "phase_idx": 5, "test_accuracy_list_meter": {"top_1": {"0": 1.398}, "top_5": {"0": 4.446}}, "train_phase_idx": 2}
{"iteration": 5008, "phase_idx": 6, "train_accuracy_list_meter": {"top_1": {"0": 0.897}, "top_5": {"0": 3.1723}}, "train_phase_idx": 3}
{"iteration": 5008, "phase_idx": 7, "test_accuracy_list_meter": {"top_1": {"0": 1.138}, "top_5": {"0": 4.102}}, "train_phase_idx": 3}
{"iteration": 6260, "phase_idx": 8, "train_accuracy_list_meter": {"top_1": {"0": 0.8961}, "top_5": {"0": 3.1822999999999997}}, "train_phase_idx": 4}
{"iteration": 6260, "phase_idx": 9, "test_accuracy_list_meter": {"top_1": {"0": 1.188}, "top_5": {"0": 4.108}}, "train_phase_idx": 4}
{"iteration": 7512, "phase_idx": 10, "train_accuracy_list_meter": {"top_1": {"0": 0.8949}, "top_5": {"0": 3.1635000000000004}}, "train_phase_idx": 5}
{"iteration": 7512, "phase_idx": 11, "test_accuracy_list_meter": {"top_1": {"0": 1.274}, "top_5": {"0": 4.5}}, "train_phase_idx": 5}
{"iteration": 8764, "phase_idx": 12, "train_accuracy_list_meter": {"top_1": {"0": 0.8926999999999999}, "top_5": {"0": 3.1884999999999994}}, "train_phase_idx": 6}
{"iteration": 8764, "phase_idx": 13, "test_accuracy_list_meter": {"top_1": {"0": 1.1520000000000001}, "top_5": {"0": 4.176}}, "train_phase_idx": 6}
{"iteration": 10016, "phase_idx": 14, "train_accuracy_list_meter": {"top_1": {"0": 0.8864000000000001}, "top_5": {"0": 3.1722}}, "train_phase_idx": 7}
{"iteration": 10016, "phase_idx": 15, "test_accuracy_list_meter": {"top_1": {"0": 1.4200000000000002}, "top_5": {"0": 4.702}}, "train_phase_idx": 7}
{"iteration": 11268, "phase_idx": 16, "train_accuracy_list_meter": {"top_1": {"0": 0.9181}, "top_5": {"0": 3.2216}}, "train_phase_idx": 8}
{"iteration": 11268, "phase_idx": 17, "test_accuracy_list_meter": {"top_1": {"0": 1.206}, "top_5": {"0": 4.424}}, "train_phase_idx": 8}
{"iteration": 12520, "phase_idx": 18, "train_accuracy_list_meter": {"top_1": {"0": 0.9095000000000001}, "top_5": {"0": 3.1975000000000002}}, "train_phase_idx": 9}
{"iteration": 12520, "phase_idx": 19, "test_accuracy_list_meter": {"top_1": {"0": 1.51}, "top_5": {"0": 4.748}}, "train_phase_idx": 9}
{"iteration": 13772, "phase_idx": 20, "train_accuracy_list_meter": {"top_1": {"0": 0.8848999999999999}, "top_5": {"0": 3.1672}}, "train_phase_idx": 10}
{"iteration": 13772, "phase_idx": 21, "test_accuracy_list_meter": {"top_1": {"0": 1.3820000000000001}, "top_5": {"0": 4.424}}, "train_phase_idx": 10}
{"iteration": 15024, "phase_idx": 22, "train_accuracy_list_meter": {"top_1": {"0": 0.9026}, "top_5": {"0": 3.2098}}, "train_phase_idx": 11}
{"iteration": 15024, "phase_idx": 23, "test_accuracy_list_meter": {"top_1": {"0": 1.3639999999999999}, "top_5": {"0": 4.73}}, "train_phase_idx": 11}
{"iteration": 16276, "phase_idx": 24, "train_accuracy_list_meter": {"top_1": {"0": 0.9041}, "top_5": {"0": 3.2253}}, "train_phase_idx": 12}
{"iteration": 16276, "phase_idx": 25, "test_accuracy_list_meter": {"top_1": {"0": 1.384}, "top_5": {"0": 4.328}}, "train_phase_idx": 12}
{"iteration": 17528, "phase_idx": 26, "train_accuracy_list_meter": {"top_1": {"0": 0.9029}, "top_5": {"0": 3.2321000000000004}}, "train_phase_idx": 13}
{"iteration": 17528, "phase_idx": 27, "test_accuracy_list_meter": {"top_1": {"0": 1.4200000000000002}, "top_5": {"0": 4.552}}, "train_phase_idx": 13}
{"iteration": 18780, "phase_idx": 28, "train_accuracy_list_meter": {"top_1": {"0": 0.8968999999999999}, "top_5": {"0": 3.2190999999999996}}, "train_phase_idx": 14}
{"iteration": 18780, "phase_idx": 29, "test_accuracy_list_meter": {"top_1": {"0": 1.24}, "top_5": {"0": 4.2139999999999995}}, "train_phase_idx": 14}
{"iteration": 20032, "phase_idx": 30, "train_accuracy_list_meter": {"top_1": {"0": 0.9075}, "top_5": {"0": 3.2416}}, "train_phase_idx": 15}
{"iteration": 20032, "phase_idx": 31, "test_accuracy_list_meter": {"top_1": {"0": 1.5}, "top_5": {"0": 4.918}}, "train_phase_idx": 15}
{"iteration": 21284, "phase_idx": 32, "train_accuracy_list_meter": {"top_1": {"0": 0.9267}, "top_5": {"0": 3.2439999999999998}}, "train_phase_idx": 16}
{"iteration": 21284, "phase_idx": 33, "test_accuracy_list_meter": {"top_1": {"0": 1.392}, "top_5": {"0": 4.754}}, "train_phase_idx": 16}
{"iteration": 22536, "phase_idx": 34, "train_accuracy_list_meter": {"top_1": {"0": 0.8971}, "top_5": {"0": 3.2405999999999997}}, "train_phase_idx": 17}
{"iteration": 22536, "phase_idx": 35, "test_accuracy_list_meter": {"top_1": {"0": 1.358}, "top_5": {"0": 4.638}}, "train_phase_idx": 17}
{"iteration": 23788, "phase_idx": 36, "train_accuracy_list_meter": {"top_1": {"0": 0.9233999999999999}, "top_5": {"0": 3.2748}}, "train_phase_idx": 18}
{"iteration": 23788, "phase_idx": 37, "test_accuracy_list_meter": {"top_1": {"0": 1.344}, "top_5": {"0": 4.558}}, "train_phase_idx": 18}
{"iteration": 25040, "phase_idx": 38, "train_accuracy_list_meter": {"top_1": {"0": 0.9213000000000001}, "top_5": {"0": 3.2568}}, "train_phase_idx": 19}
{"iteration": 25040, "phase_idx": 39, "test_accuracy_list_meter": {"top_1": {"0": 1.47}, "top_5": {"0": 4.997999999999999}}, "train_phase_idx": 19}
{"iteration": 26292, "phase_idx": 40, "train_accuracy_list_meter": {"top_1": {"0": 0.9216}, "top_5": {"0": 3.2800999999999996}}, "train_phase_idx": 20}
{"iteration": 26292, "phase_idx": 41, "test_accuracy_list_meter": {"top_1": {"0": 1.3679999999999999}, "top_5": {"0": 4.304}}, "train_phase_idx": 20}
{"iteration": 27544, "phase_idx": 42, "train_accuracy_list_meter": {"top_1": {"0": 0.9295}, "top_5": {"0": 3.3072999999999997}}, "train_phase_idx": 21}
{"iteration": 27544, "phase_idx": 43, "test_accuracy_list_meter": {"top_1": {"0": 1.452}, "top_5": {"0": 4.654}}, "train_phase_idx": 21}
{"iteration": 28796, "phase_idx": 44, "train_accuracy_list_meter": {"top_1": {"0": 0.9434}, "top_5": {"0": 3.3158}}, "train_phase_idx": 22}
{"iteration": 28796, "phase_idx": 45, "test_accuracy_list_meter": {"top_1": {"0": 1.18}, "top_5": {"0": 4.19}}, "train_phase_idx": 22}
{"iteration": 30048, "phase_idx": 46, "train_accuracy_list_meter": {"top_1": {"0": 0.9400000000000001}, "top_5": {"0": 3.3093999999999997}}, "train_phase_idx": 23}
{"iteration": 30048, "phase_idx": 47, "test_accuracy_list_meter": {"top_1": {"0": 1.5699999999999998}, "top_5": {"0": 4.936}}, "train_phase_idx": 23}
{"iteration": 31300, "phase_idx": 48, "train_accuracy_list_meter": {"top_1": {"0": 0.9486}, "top_5": {"0": 3.3371999999999997}}, "train_phase_idx": 24}
{"iteration": 31300, "phase_idx": 49, "test_accuracy_list_meter": {"top_1": {"0": 1.3639999999999999}, "top_5": {"0": 4.756}}, "train_phase_idx": 24}
{"iteration": 32552, "phase_idx": 50, "train_accuracy_list_meter": {"top_1": {"0": 0.9544}, "top_5": {"0": 3.3884999999999996}}, "train_phase_idx": 25}
{"iteration": 32552, "phase_idx": 51, "test_accuracy_list_meter": {"top_1": {"0": 1.488}, "top_5": {"0": 4.806}}, "train_phase_idx": 25}
{"iteration": 33804, "phase_idx": 52, "train_accuracy_list_meter": {"top_1": {"0": 0.9636}, "top_5": {"0": 3.3931999999999998}}, "train_phase_idx": 26}
{"iteration": 33804, "phase_idx": 53, "test_accuracy_list_meter": {"top_1": {"0": 1.254}, "top_5": {"0": 4.51}}, "train_phase_idx": 26}
{"iteration": 35056, "phase_idx": 54, "train_accuracy_list_meter": {"top_1": {"0": 0.9585}, "top_5": {"0": 3.3888000000000003}}, "train_phase_idx": 27}
{"iteration": 35056, "phase_idx": 55, "test_accuracy_list_meter": {"top_1": {"0": 1.414}, "top_5": {"0": 4.802}}, "train_phase_idx": 27}
{"iteration": 36308, "phase_idx": 56, "train_accuracy_list_meter": {"top_1": {"0": 0.9471999999999999}, "top_5": {"0": 3.3966000000000003}}, "train_phase_idx": 28}
{"iteration": 36308, "phase_idx": 57, "test_accuracy_list_meter": {"top_1": {"0": 1.428}, "top_5": {"0": 4.72}}, "train_phase_idx": 28}
{"iteration": 37560, "phase_idx": 58, "train_accuracy_list_meter": {"top_1": {"0": 0.9575}, "top_5": {"0": 3.4067}}, "train_phase_idx": 29}
{"iteration": 37560, "phase_idx": 59, "test_accuracy_list_meter": {"top_1": {"0": 1.54}, "top_5": {"0": 5.140000000000001}}, "train_phase_idx": 29}
{"iteration": 38812, "phase_idx": 60, "train_accuracy_list_meter": {"top_1": {"0": 0.9693999999999999}, "top_5": {"0": 3.4554}}, "train_phase_idx": 30}
{"iteration": 38812, "phase_idx": 61, "test_accuracy_list_meter": {"top_1": {"0": 1.3559999999999999}, "top_5": {"0": 4.590000000000001}}, "train_phase_idx": 30}
{"iteration": 40064, "phase_idx": 62, "train_accuracy_list_meter": {"top_1": {"0": 0.9934999999999999}, "top_5": {"0": 3.4757000000000002}}, "train_phase_idx": 31}
{"iteration": 40064, "phase_idx": 63, "test_accuracy_list_meter": {"top_1": {"0": 1.418}, "top_5": {"0": 4.684}}, "train_phase_idx": 31}
{"iteration": 41316, "phase_idx": 64, "train_accuracy_list_meter": {"top_1": {"0": 1.0022}, "top_5": {"0": 3.5020999999999995}}, "train_phase_idx": 32}
{"iteration": 41316, "phase_idx": 65, "test_accuracy_list_meter": {"top_1": {"0": 1.444}, "top_5": {"0": 4.766}}, "train_phase_idx": 32}
{"iteration": 42568, "phase_idx": 66, "train_accuracy_list_meter": {"top_1": {"0": 1.0064}, "top_5": {"0": 3.4833999999999996}}, "train_phase_idx": 33}
{"iteration": 42568, "phase_idx": 67, "test_accuracy_list_meter": {"top_1": {"0": 1.534}, "top_5": {"0": 4.988}}, "train_phase_idx": 33}
{"iteration": 43820, "phase_idx": 68, "train_accuracy_list_meter": {"top_1": {"0": 0.9908}, "top_5": {"0": 3.5222}}, "train_phase_idx": 34}
{"iteration": 43820, "phase_idx": 69, "test_accuracy_list_meter": {"top_1": {"0": 1.4460000000000002}, "top_5": {"0": 4.976}}, "train_phase_idx": 34}
{"iteration": 45072, "phase_idx": 70, "train_accuracy_list_meter": {"top_1": {"0": 1.0078}, "top_5": {"0": 3.5419}}, "train_phase_idx": 35}
{"iteration": 45072, "phase_idx": 71, "test_accuracy_list_meter": {"top_1": {"0": 1.6260000000000001}, "top_5": {"0": 5.382}}, "train_phase_idx": 35}
{"iteration": 46324, "phase_idx": 72, "train_accuracy_list_meter": {"top_1": {"0": 1.0211}, "top_5": {"0": 3.5804}}, "train_phase_idx": 36}
{"iteration": 46324, "phase_idx": 73, "test_accuracy_list_meter": {"top_1": {"0": 1.418}, "top_5": {"0": 4.992}}, "train_phase_idx": 36}
{"iteration": 47576, "phase_idx": 74, "train_accuracy_list_meter": {"top_1": {"0": 1.0205}, "top_5": {"0": 3.6020000000000003}}, "train_phase_idx": 37}
{"iteration": 47576, "phase_idx": 75, "test_accuracy_list_meter": {"top_1": {"0": 1.67}, "top_5": {"0": 5.318}}, "train_phase_idx": 37}
{"iteration": 48828, "phase_idx": 76, "train_accuracy_list_meter": {"top_1": {"0": 1.0396}, "top_5": {"0": 3.608}}, "train_phase_idx": 38}
{"iteration": 48828, "phase_idx": 77, "test_accuracy_list_meter": {"top_1": {"0": 1.494}, "top_5": {"0": 5.138}}, "train_phase_idx": 38}
{"iteration": 50080, "phase_idx": 78, "train_accuracy_list_meter": {"top_1": {"0": 1.0361}, "top_5": {"0": 3.6658000000000004}}, "train_phase_idx": 39}
{"iteration": 50080, "phase_idx": 79, "test_accuracy_list_meter": {"top_1": {"0": 1.472}, "top_5": {"0": 5.1499999999999995}}, "train_phase_idx": 39}
{"iteration": 51332, "phase_idx": 80, "train_accuracy_list_meter": {"top_1": {"0": 1.0626}, "top_5": {"0": 3.6706000000000003}}, "train_phase_idx": 40}
{"iteration": 51332, "phase_idx": 81, "test_accuracy_list_meter": {"top_1": {"0": 1.5959999999999999}, "top_5": {"0": 5.162}}, "train_phase_idx": 40}
{"iteration": 52584, "phase_idx": 82, "train_accuracy_list_meter": {"top_1": {"0": 1.0695}, "top_5": {"0": 3.7204}}, "train_phase_idx": 41}
{"iteration": 52584, "phase_idx": 83, "test_accuracy_list_meter": {"top_1": {"0": 1.744}, "top_5": {"0": 5.34}}, "train_phase_idx": 41}
{"iteration": 53836, "phase_idx": 84, "train_accuracy_list_meter": {"top_1": {"0": 1.0718}, "top_5": {"0": 3.7359999999999998}}, "train_phase_idx": 42}
{"iteration": 53836, "phase_idx": 85, "test_accuracy_list_meter": {"top_1": {"0": 1.6820000000000002}, "top_5": {"0": 5.428}}, "train_phase_idx": 42}
{"iteration": 55088, "phase_idx": 86, "train_accuracy_list_meter": {"top_1": {"0": 1.0744}, "top_5": {"0": 3.7638}}, "train_phase_idx": 43}
{"iteration": 55088, "phase_idx": 87, "test_accuracy_list_meter": {"top_1": {"0": 1.6580000000000001}, "top_5": {"0": 5.48}}, "train_phase_idx": 43}
{"iteration": 56340, "phase_idx": 88, "train_accuracy_list_meter": {"top_1": {"0": 1.0904}, "top_5": {"0": 3.7876}}, "train_phase_idx": 44}
{"iteration": 56340, "phase_idx": 89, "test_accuracy_list_meter": {"top_1": {"0": 1.6400000000000001}, "top_5": {"0": 5.218}}, "train_phase_idx": 44}
{"iteration": 57592, "phase_idx": 90, "train_accuracy_list_meter": {"top_1": {"0": 1.0906}, "top_5": {"0": 3.8153}}, "train_phase_idx": 45}
{"iteration": 57592, "phase_idx": 91, "test_accuracy_list_meter": {"top_1": {"0": 1.78}, "top_5": {"0": 5.568}}, "train_phase_idx": 45}
{"iteration": 58844, "phase_idx": 92, "train_accuracy_list_meter": {"top_1": {"0": 1.1066}, "top_5": {"0": 3.8497000000000003}}, "train_phase_idx": 46}
{"iteration": 58844, "phase_idx": 93, "test_accuracy_list_meter": {"top_1": {"0": 1.712}, "top_5": {"0": 5.4}}, "train_phase_idx": 46}
{"iteration": 60096, "phase_idx": 94, "train_accuracy_list_meter": {"top_1": {"0": 1.1085}, "top_5": {"0": 3.8850000000000002}}, "train_phase_idx": 47}
{"iteration": 60096, "phase_idx": 95, "test_accuracy_list_meter": {"top_1": {"0": 1.6199999999999999}, "top_5": {"0": 5.442}}, "train_phase_idx": 47}
{"iteration": 61348, "phase_idx": 96, "train_accuracy_list_meter": {"top_1": {"0": 1.1219}, "top_5": {"0": 3.9195}}, "train_phase_idx": 48}
{"iteration": 61348, "phase_idx": 97, "test_accuracy_list_meter": {"top_1": {"0": 1.5879999999999999}, "top_5": {"0": 5.372}}, "train_phase_idx": 48}
{"iteration": 62600, "phase_idx": 98, "train_accuracy_list_meter": {"top_1": {"0": 1.1339}, "top_5": {"0": 3.9228}}, "train_phase_idx": 49}
{"iteration": 62600, "phase_idx": 99, "test_accuracy_list_meter": {"top_1": {"0": 1.624}, "top_5": {"0": 5.418}}, "train_phase_idx": 49}
{"iteration": 63852, "phase_idx": 100, "train_accuracy_list_meter": {"top_1": {"0": 1.149}, "top_5": {"0": 4.0063}}, "train_phase_idx": 50}
{"iteration": 63852, "phase_idx": 101, "test_accuracy_list_meter": {"top_1": {"0": 1.55}, "top_5": {"0": 5.489999999999999}}, "train_phase_idx": 50}
{"iteration": 65104, "phase_idx": 102, "train_accuracy_list_meter": {"top_1": {"0": 1.1602}, "top_5": {"0": 3.9985}}, "train_phase_idx": 51}
{"iteration": 65104, "phase_idx": 103, "test_accuracy_list_meter": {"top_1": {"0": 1.68}, "top_5": {"0": 5.548}}, "train_phase_idx": 51}
{"iteration": 66356, "phase_idx": 104, "train_accuracy_list_meter": {"top_1": {"0": 1.179}, "top_5": {"0": 4.0446}}, "train_phase_idx": 52}
{"iteration": 66356, "phase_idx": 105, "test_accuracy_list_meter": {"top_1": {"0": 1.6820000000000002}, "top_5": {"0": 5.71}}, "train_phase_idx": 52}
{"iteration": 67608, "phase_idx": 106, "train_accuracy_list_meter": {"top_1": {"0": 1.1849}, "top_5": {"0": 4.1065}}, "train_phase_idx": 53}
{"iteration": 67608, "phase_idx": 107, "test_accuracy_list_meter": {"top_1": {"0": 1.754}, "top_5": {"0": 5.696}}, "train_phase_idx": 53}
{"iteration": 68860, "phase_idx": 108, "train_accuracy_list_meter": {"top_1": {"0": 1.2097}, "top_5": {"0": 4.181}}, "train_phase_idx": 54}
{"iteration": 68860, "phase_idx": 109, "test_accuracy_list_meter": {"top_1": {"0": 1.7680000000000002}, "top_5": {"0": 5.836}}, "train_phase_idx": 54}
{"iteration": 70112, "phase_idx": 110, "train_accuracy_list_meter": {"top_1": {"0": 1.2144}, "top_5": {"0": 4.1934}}, "train_phase_idx": 55}
{"iteration": 70112, "phase_idx": 111, "test_accuracy_list_meter": {"top_1": {"0": 1.7340000000000002}, "top_5": {"0": 5.742}}, "train_phase_idx": 55}
{"iteration": 71364, "phase_idx": 112, "train_accuracy_list_meter": {"top_1": {"0": 1.2312999999999998}, "top_5": {"0": 4.2592}}, "train_phase_idx": 56}
{"iteration": 71364, "phase_idx": 113, "test_accuracy_list_meter": {"top_1": {"0": 1.678}, "top_5": {"0": 5.876}}, "train_phase_idx": 56}
{"iteration": 72616, "phase_idx": 114, "train_accuracy_list_meter": {"top_1": {"0": 1.2368000000000001}, "top_5": {"0": 4.2545}}, "train_phase_idx": 57}
{"iteration": 72616, "phase_idx": 115, "test_accuracy_list_meter": {"top_1": {"0": 1.8239999999999998}, "top_5": {"0": 6.042}}, "train_phase_idx": 57}
{"iteration": 73868, "phase_idx": 116, "train_accuracy_list_meter": {"top_1": {"0": 1.2469}, "top_5": {"0": 4.293}}, "train_phase_idx": 58}
{"iteration": 73868, "phase_idx": 117, "test_accuracy_list_meter": {"top_1": {"0": 1.7399999999999998}, "top_5": {"0": 6.052}}, "train_phase_idx": 58}
{"iteration": 75120, "phase_idx": 118, "train_accuracy_list_meter": {"top_1": {"0": 1.2686}, "top_5": {"0": 4.3446}}, "train_phase_idx": 59}
{"iteration": 75120, "phase_idx": 119, "test_accuracy_list_meter": {"top_1": {"0": 1.6420000000000001}, "top_5": {"0": 5.736}}, "train_phase_idx": 59}
{"iteration": 76372, "phase_idx": 120, "train_accuracy_list_meter": {"top_1": {"0": 1.3023}, "top_5": {"0": 4.3985}}, "train_phase_idx": 60}
{"iteration": 76372, "phase_idx": 121, "test_accuracy_list_meter": {"top_1": {"0": 1.82}, "top_5": {"0": 6.17}}, "train_phase_idx": 60}

Expected behavior:

  1. For 2/metrics.json, the top-1 accuracies on train and val should not differ as much as they do (currently 0.55 vs. 0.79)
  2. For moco2/metrics.json, the top-1 accuracy should not be greater than 1 (currently 1.3 on train and 1.8 on test)

I also checked the difference between the generated train_config.yaml files in my two folders; the only differences are CHECKPOINT.DIR and WEIGHTS_INIT:

diff --git a/exp/checkpoint/2/train_config.yaml b/exp/checkpoint/moco2/train_config.yaml
index 6a98f31..3eb04fa 100644
--- a/exp/checkpoint/2/train_config.yaml
+++ b/exp/checkpoint/moco2/train_config.yaml
@@ -4,7 +4,7 @@ CHECKPOINT:
   BACKEND: disk
   CHECKPOINT_FREQUENCY: 10
   CHECKPOINT_ITER_FREQUENCY: -1
-  DIR: checkpoint/2
+  DIR: checkpoint/moco2
   LATEST_CHECKPOINT_RESUME_FILE_NUM: 1
   OVERWRITE_EXISTING: false
   USE_SYMLINK_CHECKPOINT_FOR_RESUME: false
@@ -441,12 +441,12 @@ MODEL:
       TOKENS_NORM: true
       name: null
   WEIGHTS_INIT:
-    APPEND_PREFIX: ''
-    PARAMS_FILE: ../weight/vit_b16_p16_in22k_ep90_supervised.torch
+    APPEND_PREFIX: trunk._feature_blocks.
+    PARAMS_FILE: ../weight/mocov3-vit-b-300ep.pth.tar
     REMOVE_PREFIX: ''
     SKIP_LAYERS:
     - num_batches_tracked
-    STATE_DICT_KEY_NAME: classy_state_dict
+    STATE_DICT_KEY_NAME: ''
   _MODEL_INIT_SEED: 0
 MONITORING:
   MONITOR_ACTIVATION_STATISTICS: 0

Environment:


-------------------  ------------------------------------------------------------------------------------
sys.platform         linux
Python               3.8.13 (default, Mar 28 2022, 11:38:47) [GCC 7.5.0]
numpy                1.19.2
Pillow               7.1.2
vissl                0.1.6 @/home/vica/anaconda3/envs/vissl/lib/python3.8/site-packages/vissl
GPU available        True
GPU 0,1,2,3,4,5,6    NVIDIA GeForce RTX 3090
CUDA_HOME            /usr/local/cuda
torchvision          0.10.1 @/home/vica/anaconda3/envs/vissl/lib/python3.8/site-packages/torchvision
hydra                1.0.7 @/home/vica/anaconda3/envs/vissl/lib/python3.8/site-packages/hydra
classy_vision        0.7.0.dev @/home/vica/anaconda3/envs/vissl/lib/python3.8/site-packages/classy_vision
apex                 0.1 @/home/vica/anaconda3/envs/vissl/lib/python3.8/site-packages/apex
PyTorch              1.9.1 @/home/vica/anaconda3/envs/vissl/lib/python3.8/site-packages/torch
PyTorch debug build  False
-------------------  ------------------------------------------------------------------------------------
PyTorch built with:
  - GCC 7.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.2 Product Build 20200624 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.1.2 (Git Hash 98be7e8afa711dc9b66c8ff3504129cb82013cdb)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.1
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86;-gencode;arch=compute_37,code=compute_37
  - CuDNN 8.0.5
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.1, CUDNN_VERSION=8.0.5, CXX_COMPILER=/opt/rh/devtoolset-7/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-sign-compare -Wno-unused-parameter -Wno-unused-variable -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.9.1, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, 

CPU info:
-------------------  -----------------------------------------
Architecture         x86_64
CPU op-mode(s)       32-bit, 64-bit
Byte Order           Little Endian
CPU(s)               80
On-line CPU(s) list  0-79
Thread(s) per core   2
Core(s) per socket   20
Socket(s)            2
NUMA node(s)         2
Vendor ID            GenuineIntel
CPU family           6
Model                85
Model name           Intel(R) Xeon(R) Gold 5218R CPU @ 2.10GHz
Stepping             7
CPU MHz              3247.779
CPU max MHz          4000.0000
CPU min MHz          800.0000
BogoMIPS             4200.00
Virtualization       VT-x
L1d cache            32K
L1i cache            32K
L2 cache             1024K
L3 cache             28160K
NUMA node0 CPU(s)    0-19,40-59
NUMA node1 CPU(s)    20-39,60-79
-------------------  -----------------------------------------
@VicaYang (Author)

One other minor issue: what is the correct way to combine the configs from models/vit_b16.yaml and eval_resnet_8gpu_transfer_in1k_linear.yaml? Currently I manually merge these two files into a new yaml file, but I think that is bad practice. I found "How to override a training component with config files" on this page, but I failed to use it.

@QuentinDuval (Contributor)

Hi @VicaYang,

First of all, thank you for using VISSL :)

Let me first tackle the question of how to combine training components. The way to do it is to use Hydra's override syntax:

python /path/to/vissl/tools/run_distributed_engines.py \
  config=benchmark/linear_image_classification/imagenet1k/eval_resnet_8gpu_transfer_in1k_linear \
  +config/benchmark/linear_image_classification/imagenet1k/models=vit_b16 \
  config.MODEL.WEIGHTS_INIT.PARAMS_FILE=/path/to/checkpoint.torch

The "+" syntax will deal with the merging.

Now the classification accuracy issues:

On the supervised ViT, I am not sure there is any issue. The train accuracy can legitimately differ from the test accuracy because the augmentations (especially those used for ViT, since ViTs lack many of the inductive biases that CNNs have) can be quite aggressive in order to enforce regularisation.

On the MoCo side there is clearly a big issue. I think the model is probably not loaded correctly, so the accuracy you get is that of a linear evaluation on top of a random representation (hence the ~1% top-1 accuracy). So we need to debug why.

Could you please send me the logs that you got (the log.txt created by VISSL)?

Inside it, you can grep for "Extra layer"; this will tell us whether or not the weights were actually loaded. Once I have this information, I can help you fix it.

Thank you,
Quentin

@VicaYang (Author) commented May 14, 2022

Updated: well, I checked moco2/log.txt and found that the weights do not seem to be loaded correctly. I am trying a different prefix to see if it works.

===================

Thank you @QuentinDuval for your help! I will try your config to see how it works.
As for the log data, I found that the extra layers are not loaded in both 2/log.txt and moco2/log.txt. I pasted the grepped results below; the full log can be downloaded from here

cat 2/log.txt | grep Extra

INFO 2022-05-08 18:17:47,578 checkpoint.py: 901: Extra layers not loaded from checkpoint: ['trunk.blocks.0.attn.qkv.bias', 'trunk.blocks.1.attn.qkv.bias', 'trunk.blocks.2.attn.qkv.bias', 'trunk.blocks.3.attn.qkv.bias', 'trunk.blocks.4.attn.qkv.bias', 'trunk.blocks.5.attn.qkv.bias', 'trunk.blocks.6.attn.qkv.bias', 'trunk.blocks.7.attn.qkv.bias', 'trunk.blocks.8.attn.qkv.bias', 'trunk.blocks.9.attn.qkv.bias', 'trunk.blocks.10.attn.qkv.bias', 'trunk.blocks.11.attn.qkv.bias']
INFO 2022-05-08 19:07:01,192 checkpoint.py: 901: Extra layers not loaded from checkpoint: ['trunk.blocks.0.attn.qkv.bias', 'trunk.blocks.1.attn.qkv.bias', 'trunk.blocks.2.attn.qkv.bias', 'trunk.blocks.3.attn.qkv.bias', 'trunk.blocks.4.attn.qkv.bias', 'trunk.blocks.5.attn.qkv.bias', 'trunk.blocks.6.attn.qkv.bias', 'trunk.blocks.7.attn.qkv.bias', 'trunk.blocks.8.attn.qkv.bias', 'trunk.blocks.9.attn.qkv.bias', 'trunk.blocks.10.attn.qkv.bias', 'trunk.blocks.11.attn.qkv.bias']
cat moco2/log.txt | grep Extra

INFO 2022-05-08 18:17:27,381 checkpoint.py: 901: Extra layers not loaded from checkpoint: ['trunk._feature_blocks.epoch', 'trunk._feature_blocks.arch', 'trunk._feature_blocks.state_dict', 'trunk._feature_blocks.optimizer', 'trunk._feature_blocks.type']
INFO 2022-05-08 19:06:36,351 checkpoint.py: 901: Extra layers not loaded from checkpoint: ['trunk._feature_blocks.epoch', 'trunk._feature_blocks.arch', 'trunk._feature_blocks.state_dict', 'trunk._feature_blocks.optimizer', 'trunk._feature_blocks.type']
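
Looking at the moco2 names above, it seems that with STATE_DICT_KEY_NAME set to '' the loader takes the top-level checkpoint dictionary as the state dict and prepends APPEND_PREFIX to every key, which would explain entries like trunk._feature_blocks.epoch. A rough illustration (not VISSL's actual loader code):

```python
# Rough illustration, not VISSL's actual loader code: with
# STATE_DICT_KEY_NAME '' the whole checkpoint dict is treated as weights,
# so bookkeeping entries such as 'epoch' receive APPEND_PREFIX too.
checkpoint = {"epoch": 300, "arch": "vit_base", "state_dict": {}, "optimizer": {}}
append_prefix = "trunk._feature_blocks."
prefixed_keys = sorted(append_prefix + key for key in checkpoint)
print(prefixed_keys)
# ['trunk._feature_blocks.arch', 'trunk._feature_blocks.epoch',
#  'trunk._feature_blocks.optimizer', 'trunk._feature_blocks.state_dict']
```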

@VicaYang (Author) commented May 15, 2022

I spent 2 hours and still failed to load the weights from MoCo v3 correctly :(
The weight file is structured as follows:

In [3]: list(a.keys())
Out[3]: ['epoch', 'arch', 'state_dict', 'optimizer']
In [4]: list(a['state_dict'].keys())
Out[4]: 
['module.base_encoder.cls_token',
 'module.base_encoder.pos_embed',
 'module.base_encoder.patch_embed.proj.weight',
 'module.base_encoder.patch_embed.proj.bias',
 'module.base_encoder.blocks.0.norm1.weight',
 'module.base_encoder.blocks.0.norm1.bias',
 'module.base_encoder.blocks.0.attn.qkv.weight',
 'module.base_encoder.blocks.0.attn.qkv.bias',
 'module.base_encoder.blocks.0.attn.proj.weight',
 'module.base_encoder.blocks.0.attn.proj.bias',
 'module.base_encoder.blocks.0.norm2.weight',
 'module.base_encoder.blocks.0.norm2.bias',
 'module.base_encoder.blocks.0.mlp.fc1.weight',
 'module.base_encoder.blocks.0.mlp.fc1.bias',
 'module.base_encoder.blocks.0.mlp.fc2.weight',
 'module.base_encoder.blocks.0.mlp.fc2.bias',
 'module.base_encoder.blocks.1.norm1.weight',
 'module.base_encoder.blocks.1.norm1.bias',
 'module.base_encoder.blocks.1.attn.qkv.weight',
 'module.base_encoder.blocks.1.attn.qkv.bias',
 'module.base_encoder.blocks.1.attn.proj.weight',
 'module.base_encoder.blocks.1.attn.proj.bias',
 'module.base_encoder.blocks.1.norm2.weight',
 'module.base_encoder.blocks.1.norm2.bias',
 'module.base_encoder.blocks.1.mlp.fc1.weight',
 'module.base_encoder.blocks.1.mlp.fc1.bias',
 'module.base_encoder.blocks.1.mlp.fc2.weight',
 'module.base_encoder.blocks.1.mlp.fc2.bias',
 'module.base_encoder.blocks.2.norm1.weight',
 'module.base_encoder.blocks.2.norm1.bias',
 'module.base_encoder.blocks.2.attn.qkv.weight',
 'module.base_encoder.blocks.2.attn.qkv.bias',
 'module.base_encoder.blocks.2.attn.proj.weight',
 'module.base_encoder.blocks.2.attn.proj.bias',
 'module.base_encoder.blocks.2.norm2.weight',
 'module.base_encoder.blocks.2.norm2.bias',
 'module.base_encoder.blocks.2.mlp.fc1.weight',
 'module.base_encoder.blocks.2.mlp.fc1.bias',
 'module.base_encoder.blocks.2.mlp.fc2.weight',
 'module.base_encoder.blocks.2.mlp.fc2.bias',
 'module.base_encoder.blocks.3.norm1.weight',
 'module.base_encoder.blocks.3.norm1.bias',
 'module.base_encoder.blocks.3.attn.qkv.weight',
 'module.base_encoder.blocks.3.attn.qkv.bias',
 'module.base_encoder.blocks.3.attn.proj.weight',
 'module.base_encoder.blocks.3.attn.proj.bias',
 'module.base_encoder.blocks.3.norm2.weight',
 'module.base_encoder.blocks.3.norm2.bias',
 'module.base_encoder.blocks.3.mlp.fc1.weight',
 'module.base_encoder.blocks.3.mlp.fc1.bias',
 'module.base_encoder.blocks.3.mlp.fc2.weight',
 'module.base_encoder.blocks.3.mlp.fc2.bias',
 'module.base_encoder.blocks.4.norm1.weight',
 'module.base_encoder.blocks.4.norm1.bias',
 'module.base_encoder.blocks.4.attn.qkv.weight',
 'module.base_encoder.blocks.4.attn.qkv.bias',
 'module.base_encoder.blocks.4.attn.proj.weight',
 'module.base_encoder.blocks.4.attn.proj.bias',
 'module.base_encoder.blocks.4.norm2.weight',
 'module.base_encoder.blocks.4.norm2.bias',
 'module.base_encoder.blocks.4.mlp.fc1.weight',
 'module.base_encoder.blocks.4.mlp.fc1.bias',
 'module.base_encoder.blocks.4.mlp.fc2.weight',
 'module.base_encoder.blocks.4.mlp.fc2.bias',
 'module.base_encoder.blocks.5.norm1.weight',
 'module.base_encoder.blocks.5.norm1.bias',
 'module.base_encoder.blocks.5.attn.qkv.weight',
 'module.base_encoder.blocks.5.attn.qkv.bias',
 'module.base_encoder.blocks.5.attn.proj.weight',
 'module.base_encoder.blocks.5.attn.proj.bias',
 'module.base_encoder.blocks.5.norm2.weight',
 'module.base_encoder.blocks.5.norm2.bias',
 'module.base_encoder.blocks.5.mlp.fc1.weight',
 'module.base_encoder.blocks.5.mlp.fc1.bias',
 'module.base_encoder.blocks.5.mlp.fc2.weight',
 'module.base_encoder.blocks.5.mlp.fc2.bias',
 'module.base_encoder.blocks.6.norm1.weight',
 'module.base_encoder.blocks.6.norm1.bias',
 'module.base_encoder.blocks.6.attn.qkv.weight',
 'module.base_encoder.blocks.6.attn.qkv.bias',
 'module.base_encoder.blocks.6.attn.proj.weight',
 'module.base_encoder.blocks.6.attn.proj.bias',
 'module.base_encoder.blocks.6.norm2.weight',
 'module.base_encoder.blocks.6.norm2.bias',
 'module.base_encoder.blocks.6.mlp.fc1.weight',
 'module.base_encoder.blocks.6.mlp.fc1.bias',
 'module.base_encoder.blocks.6.mlp.fc2.weight',
 'module.base_encoder.blocks.6.mlp.fc2.bias',
 'module.base_encoder.blocks.7.norm1.weight',
 'module.base_encoder.blocks.7.norm1.bias',
 'module.base_encoder.blocks.7.attn.qkv.weight',
 'module.base_encoder.blocks.7.attn.qkv.bias',
 'module.base_encoder.blocks.7.attn.proj.weight',
 'module.base_encoder.blocks.7.attn.proj.bias',
 'module.base_encoder.blocks.7.norm2.weight',
 'module.base_encoder.blocks.7.norm2.bias',
 'module.base_encoder.blocks.7.mlp.fc1.weight',
 'module.base_encoder.blocks.7.mlp.fc1.bias',
 'module.base_encoder.blocks.7.mlp.fc2.weight',
 'module.base_encoder.blocks.7.mlp.fc2.bias',
 'module.base_encoder.blocks.8.norm1.weight',
 'module.base_encoder.blocks.8.norm1.bias',
 'module.base_encoder.blocks.8.attn.qkv.weight',
 'module.base_encoder.blocks.8.attn.qkv.bias',
 'module.base_encoder.blocks.8.attn.proj.weight',
 'module.base_encoder.blocks.8.attn.proj.bias',
 'module.base_encoder.blocks.8.norm2.weight',
 'module.base_encoder.blocks.8.norm2.bias',
 'module.base_encoder.blocks.8.mlp.fc1.weight',
 'module.base_encoder.blocks.8.mlp.fc1.bias',
 'module.base_encoder.blocks.8.mlp.fc2.weight',
 'module.base_encoder.blocks.8.mlp.fc2.bias',
 'module.base_encoder.blocks.9.norm1.weight',
 'module.base_encoder.blocks.9.norm1.bias',
 'module.base_encoder.blocks.9.attn.qkv.weight',
 'module.base_encoder.blocks.9.attn.qkv.bias',
 'module.base_encoder.blocks.9.attn.proj.weight',
 'module.base_encoder.blocks.9.attn.proj.bias',
 'module.base_encoder.blocks.9.norm2.weight',
 'module.base_encoder.blocks.9.norm2.bias',
 'module.base_encoder.blocks.9.mlp.fc1.weight',
 'module.base_encoder.blocks.9.mlp.fc1.bias',
 'module.base_encoder.blocks.9.mlp.fc2.weight',
 'module.base_encoder.blocks.9.mlp.fc2.bias',
 'module.base_encoder.blocks.10.norm1.weight',
 'module.base_encoder.blocks.10.norm1.bias',
 'module.base_encoder.blocks.10.attn.qkv.weight',
 'module.base_encoder.blocks.10.attn.qkv.bias',
 'module.base_encoder.blocks.10.attn.proj.weight',
 'module.base_encoder.blocks.10.attn.proj.bias',
 'module.base_encoder.blocks.10.norm2.weight',
 'module.base_encoder.blocks.10.norm2.bias',
 'module.base_encoder.blocks.10.mlp.fc1.weight',
 'module.base_encoder.blocks.10.mlp.fc1.bias',
 'module.base_encoder.blocks.10.mlp.fc2.weight',
 'module.base_encoder.blocks.10.mlp.fc2.bias',
 'module.base_encoder.blocks.11.norm1.weight',
 'module.base_encoder.blocks.11.norm1.bias',
 'module.base_encoder.blocks.11.attn.qkv.weight',
 'module.base_encoder.blocks.11.attn.qkv.bias',
 'module.base_encoder.blocks.11.attn.proj.weight',
 'module.base_encoder.blocks.11.attn.proj.bias',
 'module.base_encoder.blocks.11.norm2.weight',
 'module.base_encoder.blocks.11.norm2.bias',
 'module.base_encoder.blocks.11.mlp.fc1.weight',
 'module.base_encoder.blocks.11.mlp.fc1.bias',
 'module.base_encoder.blocks.11.mlp.fc2.weight',
 'module.base_encoder.blocks.11.mlp.fc2.bias',
 'module.base_encoder.norm.weight',
 'module.base_encoder.norm.bias',
 'module.base_encoder.head.0.weight',
 'module.base_encoder.head.1.weight',
 'module.base_encoder.head.1.bias',
 'module.base_encoder.head.1.running_mean',
 'module.base_encoder.head.1.running_var',
 'module.base_encoder.head.1.num_batches_tracked',
 'module.base_encoder.head.3.weight',
 'module.base_encoder.head.4.weight',
 'module.base_encoder.head.4.bias',
 'module.base_encoder.head.4.running_mean',
 'module.base_encoder.head.4.running_var',
 'module.base_encoder.head.4.num_batches_tracked',
 'module.base_encoder.head.6.weight',
 'module.base_encoder.head.7.running_mean',
 'module.base_encoder.head.7.running_var',
 'module.base_encoder.head.7.num_batches_tracked',
 'module.momentum_encoder.cls_token',
 'module.momentum_encoder.pos_embed',
 'module.momentum_encoder.patch_embed.proj.weight',
 'module.momentum_encoder.patch_embed.proj.bias',
 'module.momentum_encoder.blocks.0.norm1.weight',
 'module.momentum_encoder.blocks.0.norm1.bias',
 'module.momentum_encoder.blocks.0.attn.qkv.weight',
 'module.momentum_encoder.blocks.0.attn.qkv.bias',
 'module.momentum_encoder.blocks.0.attn.proj.weight',
 'module.momentum_encoder.blocks.0.attn.proj.bias',
 'module.momentum_encoder.blocks.0.norm2.weight',
 'module.momentum_encoder.blocks.0.norm2.bias',
 'module.momentum_encoder.blocks.0.mlp.fc1.weight',
 'module.momentum_encoder.blocks.0.mlp.fc1.bias',
 'module.momentum_encoder.blocks.0.mlp.fc2.weight',
 'module.momentum_encoder.blocks.0.mlp.fc2.bias',
 'module.momentum_encoder.blocks.1.norm1.weight',
 'module.momentum_encoder.blocks.1.norm1.bias',
 'module.momentum_encoder.blocks.1.attn.qkv.weight',
 'module.momentum_encoder.blocks.1.attn.qkv.bias',
 'module.momentum_encoder.blocks.1.attn.proj.weight',
 'module.momentum_encoder.blocks.1.attn.proj.bias',
 'module.momentum_encoder.blocks.1.norm2.weight',
 'module.momentum_encoder.blocks.1.norm2.bias',
 'module.momentum_encoder.blocks.1.mlp.fc1.weight',
 'module.momentum_encoder.blocks.1.mlp.fc1.bias',
 'module.momentum_encoder.blocks.1.mlp.fc2.weight',
 'module.momentum_encoder.blocks.1.mlp.fc2.bias',
 'module.momentum_encoder.blocks.2.norm1.weight',
 'module.momentum_encoder.blocks.2.norm1.bias',
 'module.momentum_encoder.blocks.2.attn.qkv.weight',
 'module.momentum_encoder.blocks.2.attn.qkv.bias',
 'module.momentum_encoder.blocks.2.attn.proj.weight',
 'module.momentum_encoder.blocks.2.attn.proj.bias',
 'module.momentum_encoder.blocks.2.norm2.weight',
 'module.momentum_encoder.blocks.2.norm2.bias',
 'module.momentum_encoder.blocks.2.mlp.fc1.weight',
 'module.momentum_encoder.blocks.2.mlp.fc1.bias',
 'module.momentum_encoder.blocks.2.mlp.fc2.weight',
 'module.momentum_encoder.blocks.2.mlp.fc2.bias',
 'module.momentum_encoder.blocks.3.norm1.weight',
 'module.momentum_encoder.blocks.3.norm1.bias',
 'module.momentum_encoder.blocks.3.attn.qkv.weight',
 'module.momentum_encoder.blocks.3.attn.qkv.bias',
 'module.momentum_encoder.blocks.3.attn.proj.weight',
 'module.momentum_encoder.blocks.3.attn.proj.bias',
 'module.momentum_encoder.blocks.3.norm2.weight',
 'module.momentum_encoder.blocks.3.norm2.bias',
 'module.momentum_encoder.blocks.3.mlp.fc1.weight',
 'module.momentum_encoder.blocks.3.mlp.fc1.bias',
 'module.momentum_encoder.blocks.3.mlp.fc2.weight',
 'module.momentum_encoder.blocks.3.mlp.fc2.bias',
 'module.momentum_encoder.blocks.4.norm1.weight',
 'module.momentum_encoder.blocks.4.norm1.bias',
 'module.momentum_encoder.blocks.4.attn.qkv.weight',
 'module.momentum_encoder.blocks.4.attn.qkv.bias',
 'module.momentum_encoder.blocks.4.attn.proj.weight',
 'module.momentum_encoder.blocks.4.attn.proj.bias',
 'module.momentum_encoder.blocks.4.norm2.weight',
 'module.momentum_encoder.blocks.4.norm2.bias',
 'module.momentum_encoder.blocks.4.mlp.fc1.weight',
 'module.momentum_encoder.blocks.4.mlp.fc1.bias',
 'module.momentum_encoder.blocks.4.mlp.fc2.weight',
 'module.momentum_encoder.blocks.4.mlp.fc2.bias',
 'module.momentum_encoder.blocks.5.norm1.weight',
 'module.momentum_encoder.blocks.5.norm1.bias',
 'module.momentum_encoder.blocks.5.attn.qkv.weight',
 'module.momentum_encoder.blocks.5.attn.qkv.bias',
 'module.momentum_encoder.blocks.5.attn.proj.weight',
 'module.momentum_encoder.blocks.5.attn.proj.bias',
 'module.momentum_encoder.blocks.5.norm2.weight',
 'module.momentum_encoder.blocks.5.norm2.bias',
 'module.momentum_encoder.blocks.5.mlp.fc1.weight',
 'module.momentum_encoder.blocks.5.mlp.fc1.bias',
 'module.momentum_encoder.blocks.5.mlp.fc2.weight',
 'module.momentum_encoder.blocks.5.mlp.fc2.bias',
 'module.momentum_encoder.blocks.6.norm1.weight',
 'module.momentum_encoder.blocks.6.norm1.bias',
 'module.momentum_encoder.blocks.6.attn.qkv.weight',
 'module.momentum_encoder.blocks.6.attn.qkv.bias',
 'module.momentum_encoder.blocks.6.attn.proj.weight',
 'module.momentum_encoder.blocks.6.attn.proj.bias',
 'module.momentum_encoder.blocks.6.norm2.weight',
 'module.momentum_encoder.blocks.6.norm2.bias',
 'module.momentum_encoder.blocks.6.mlp.fc1.weight',
 'module.momentum_encoder.blocks.6.mlp.fc1.bias',
 'module.momentum_encoder.blocks.6.mlp.fc2.weight',
 'module.momentum_encoder.blocks.6.mlp.fc2.bias',
 'module.momentum_encoder.blocks.7.norm1.weight',
 'module.momentum_encoder.blocks.7.norm1.bias',
 'module.momentum_encoder.blocks.7.attn.qkv.weight',
 'module.momentum_encoder.blocks.7.attn.qkv.bias',
 'module.momentum_encoder.blocks.7.attn.proj.weight',
 'module.momentum_encoder.blocks.7.attn.proj.bias',
 'module.momentum_encoder.blocks.7.norm2.weight',
 'module.momentum_encoder.blocks.7.norm2.bias',
 'module.momentum_encoder.blocks.7.mlp.fc1.weight',
 'module.momentum_encoder.blocks.7.mlp.fc1.bias',
 'module.momentum_encoder.blocks.7.mlp.fc2.weight',
 'module.momentum_encoder.blocks.7.mlp.fc2.bias',
 'module.momentum_encoder.blocks.8.norm1.weight',
 'module.momentum_encoder.blocks.8.norm1.bias',
 'module.momentum_encoder.blocks.8.attn.qkv.weight',
 'module.momentum_encoder.blocks.8.attn.qkv.bias',
 'module.momentum_encoder.blocks.8.attn.proj.weight',
 'module.momentum_encoder.blocks.8.attn.proj.bias',
 'module.momentum_encoder.blocks.8.norm2.weight',
 'module.momentum_encoder.blocks.8.norm2.bias',
 'module.momentum_encoder.blocks.8.mlp.fc1.weight',
 'module.momentum_encoder.blocks.8.mlp.fc1.bias',
 'module.momentum_encoder.blocks.8.mlp.fc2.weight',
 'module.momentum_encoder.blocks.8.mlp.fc2.bias',
 'module.momentum_encoder.blocks.9.norm1.weight',
 'module.momentum_encoder.blocks.9.norm1.bias',
 'module.momentum_encoder.blocks.9.attn.qkv.weight',
 'module.momentum_encoder.blocks.9.attn.qkv.bias',
 'module.momentum_encoder.blocks.9.attn.proj.weight',
 'module.momentum_encoder.blocks.9.attn.proj.bias',
 'module.momentum_encoder.blocks.9.norm2.weight',
 'module.momentum_encoder.blocks.9.norm2.bias',
 'module.momentum_encoder.blocks.9.mlp.fc1.weight',
 'module.momentum_encoder.blocks.9.mlp.fc1.bias',
 'module.momentum_encoder.blocks.9.mlp.fc2.weight',
 'module.momentum_encoder.blocks.9.mlp.fc2.bias',
 'module.momentum_encoder.blocks.10.norm1.weight',
 'module.momentum_encoder.blocks.10.norm1.bias',
 'module.momentum_encoder.blocks.10.attn.qkv.weight',
 'module.momentum_encoder.blocks.10.attn.qkv.bias',
 'module.momentum_encoder.blocks.10.attn.proj.weight',
 'module.momentum_encoder.blocks.10.attn.proj.bias',
 'module.momentum_encoder.blocks.10.norm2.weight',
 'module.momentum_encoder.blocks.10.norm2.bias',
 'module.momentum_encoder.blocks.10.mlp.fc1.weight',
 'module.momentum_encoder.blocks.10.mlp.fc1.bias',
 'module.momentum_encoder.blocks.10.mlp.fc2.weight',
 'module.momentum_encoder.blocks.10.mlp.fc2.bias',
 'module.momentum_encoder.blocks.11.norm1.weight',
 'module.momentum_encoder.blocks.11.norm1.bias',
 'module.momentum_encoder.blocks.11.attn.qkv.weight',
 'module.momentum_encoder.blocks.11.attn.qkv.bias',
 'module.momentum_encoder.blocks.11.attn.proj.weight',
 'module.momentum_encoder.blocks.11.attn.proj.bias',
 'module.momentum_encoder.blocks.11.norm2.weight',
 'module.momentum_encoder.blocks.11.norm2.bias',
 'module.momentum_encoder.blocks.11.mlp.fc1.weight',
 'module.momentum_encoder.blocks.11.mlp.fc1.bias',
 'module.momentum_encoder.blocks.11.mlp.fc2.weight',
 'module.momentum_encoder.blocks.11.mlp.fc2.bias',
 'module.momentum_encoder.norm.weight',
 'module.momentum_encoder.norm.bias',
 'module.momentum_encoder.head.0.weight',
 'module.momentum_encoder.head.1.weight',
 'module.momentum_encoder.head.1.bias',
 'module.momentum_encoder.head.1.running_mean',
 'module.momentum_encoder.head.1.running_var',
 'module.momentum_encoder.head.1.num_batches_tracked',
 'module.momentum_encoder.head.3.weight',
 'module.momentum_encoder.head.4.weight',
 'module.momentum_encoder.head.4.bias',
 'module.momentum_encoder.head.4.running_mean',
 'module.momentum_encoder.head.4.running_var',
 'module.momentum_encoder.head.4.num_batches_tracked',
 'module.momentum_encoder.head.6.weight',
 'module.momentum_encoder.head.7.running_mean',
 'module.momentum_encoder.head.7.running_var',
 'module.momentum_encoder.head.7.num_batches_tracked',
 'module.predictor.0.weight',
 'module.predictor.1.weight',
 'module.predictor.1.bias',
 'module.predictor.1.running_mean',
 'module.predictor.1.running_var',
 'module.predictor.1.num_batches_tracked',
 'module.predictor.3.weight',
 'module.predictor.4.running_mean',
 'module.predictor.4.running_var',
 'module.predictor.4.num_batches_tracked']

So I think STATE_DICT_KEY_NAME should be state_dict and REMOVE_PREFIX should be module.base_encoder..
I then tried several values for APPEND_PREFIX, including the empty string, trunk., trunk.base_model., trunk._feature_blocks., trunk.base_model._feature_blocks., and trunk._feature_blocks.base_model., but none of them worked.
Could you help me load the weights correctly, for both the linear and fulltune modes? From reading the documents, I found that the two modes differ: I thought .base_model is needed for linear evaluation and should be removed for fulltune, but I am not sure anymore.
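To debug this without launching a full run, here is a minimal sketch (not VISSL's actual loader, just an approximation of what I understand REMOVE_PREFIX/APPEND_PREFIX to do) that previews how a candidate prefix pair rewrites the checkpoint keys. The sample keys are taken from the dump above; in practice the dict would come from `torch.load(path)["state_dict"]`:

```python
def remap_keys(state_dict, remove_prefix, append_prefix):
    """Approximate VISSL's prefix handling: strip remove_prefix, prepend
    append_prefix. Keys without remove_prefix (momentum encoder, predictor)
    are simply dropped here, which is an assumption for brevity."""
    out = {}
    for key, value in state_dict.items():
        if key.startswith(remove_prefix):
            out[append_prefix + key[len(remove_prefix):]] = value
    return out

# Tiny sample mirroring key names from the checkpoint dump above.
sample = {
    "module.base_encoder.blocks.0.norm1.weight": 0,
    "module.momentum_encoder.blocks.0.norm1.weight": 1,
    "module.predictor.0.weight": 2,
}

print(remap_keys(sample, "module.base_encoder.", "trunk.base_model."))
# -> {'trunk.base_model.blocks.0.norm1.weight': 0}
```

Printing the remapped keys next to the target model's `state_dict().keys()` makes it obvious which prefix combination can match.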

python run_distributed_engines.py \
  config=benchmark/linear_image_classification/imagenet1k/eval_resnet_8gpu_transfer_in1k_linear \
  +config/benchmark/linear_image_classification/imagenet1k/models=vit_b16 \
  config.MODEL.WEIGHTS_INIT.PARAMS_FILE="../weight/mocov3-vit-b-300ep.pth.tar" \
  config.MODEL.WEIGHTS_INIT.STATE_DICT_KEY_NAME="state_dict" \
  config.MODEL.WEIGHTS_INIT.APPEND_PREFIX="trunk.base_model." \
  config.MODEL.WEIGHTS_INIT.REMOVE_PREFIX="module.base_encoder." \
  config.CHECKPOINT.DIR="linear/moco"

python run_distributed_engines.py \
  hydra.verbose=true \
  config=benchmark/fulltune/imagenet1k/eval_vit_8gpu_transfer_in1k_finetune \
  +config/benchmark/fulltune/imagenet1k/models=vit_b16_vica \
  config.CHECKPOINT.DIR="fulltune/moco" \
  config.MODEL.WEIGHTS_INIT.PARAMS_FILE="../weight/mocov3-vit-b-300ep.pth.tar" \
  config.MODEL.WEIGHTS_INIT.STATE_DICT_KEY_NAME="state_dict" \
  config.MODEL.WEIGHTS_INIT.REMOVE_PREFIX="module.base_encoder." \
  config.MODEL.WEIGHTS_INIT.APPEND_PREFIX="trunk." \
  config.DISTRIBUTED.NUM_PROC_PER_NODE=4 > log.txt 2>&1
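Rather than re-running the benchmark for each APPEND_PREFIX guess, the candidates can be scored offline. This is a hypothetical helper: `ckpt_keys` would come from the checkpoint's state_dict and `model_keys` from the instantiated VISSL model's `state_dict().keys()`; the sample values below are placeholders:

```python
def count_matches(ckpt_keys, model_keys, remove_prefix, append_prefix):
    """Count how many checkpoint keys, after prefix remapping,
    exist in the model's own state dict."""
    model_keys = set(model_keys)
    stripped = [k[len(remove_prefix):] for k in ckpt_keys
                if k.startswith(remove_prefix)]
    return sum(1 for k in stripped if append_prefix + k in model_keys)

candidates = ["", "trunk.", "trunk.base_model.", "trunk._feature_blocks."]

# Placeholder key sets; replace with the real checkpoint and model keys.
ckpt_keys = ["module.base_encoder.blocks.0.norm1.weight"]
model_keys = ["trunk.base_model.blocks.0.norm1.weight", "heads.0.clf.weight"]

best = max(candidates,
           key=lambda p: count_matches(ckpt_keys, model_keys,
                                       "module.base_encoder.", p))
print(best)  # -> trunk.base_model.
```

Whichever candidate matches the most keys is the prefix worth passing to config.MODEL.WEIGHTS_INIT.APPEND_PREFIX.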

@VicaYang (Author)
The error above is related to #550
