Seunghyun Lee 👋 sseung0703

  • Welcome to my GitHub page. I am a Ph.D. at Inha University in South Korea.
    My research areas are machine learning, deep learning, and especially making convolutional neural networks lightweight through techniques such as knowledge distillation and filter pruning (a minimal distillation example is sketched below).
    You can find my curriculum vitae here.
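
A minimal TensorFlow 2 sketch of the classic soft-target knowledge distillation loss (in the spirit of Hinton et al.); the function name and the temperature value are illustrative assumptions, not taken from any of the papers listed below:

```python
import tensorflow as tf

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    # Soften both distributions with a temperature so the student also
    # learns from the teacher's non-target class probabilities.
    soft_teacher = tf.nn.softmax(teacher_logits / temperature)
    log_soft_student = tf.nn.log_softmax(student_logits / temperature)
    # Soft cross-entropy between teacher and student (equal to the KL
    # divergence up to a constant); the T^2 factor keeps gradient
    # magnitudes comparable across temperatures.
    per_example = -tf.reduce_sum(soft_teacher * log_soft_student, axis=-1)
    return tf.reduce_mean(per_example) * temperature ** 2
```

In practice this term is combined with the ordinary cross-entropy on the ground-truth labels when training the student network.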

ML libraries 🧱

  • TensorFlow (1.x and 2.x): Professional
  • PyTorch: Upper intermediate
  • JAX: Upper intermediate

[GitHub stats card]

Academic activity 🕹

  • Google Developers Expert since May 2022
  • Leader of a deep learning paper study group: link
  • Major contributor to the implementation project for Putting NeRF on a Diet in the 🤗 HuggingFace X GoogleAI Flax/JAX Community Week Event (won 2nd prize! 😆)
  • Have served as a reviewer for CVPR, ICCV, ECCV, and other venues.

Publications 📜

First author of

  • "Fast Filter Pruning via Coarse-to-Fine Neural Architecture Search and Contrastive Knowledge Transfer" on IEEE TNNLS (2023) [paper]
  • "Ensemble Knowledge Guided Sub-network Search and Fine-tuning for Filter Pruning" on ECCV2022 [paper] [code]
  • "Interpretable Embedding Procedure Knowledge Transfer via Stacked Principal Component Analysis and Graph Neural Network" on AAAI2021 [paper] [code]
  • "Knowledge Transfer via Decomposing Essential Information in Convolutional Neural Networks" on IEEE TNNLS (2020) [paper] [TF1 code, TF2 code]
  • "Filter Pruning and Re-Initialization via Latent Space Clustering" on IEEE Access (2020) [paper]
  • "Transformation of Non-Euclidean Space to Euclidean Space for Efficient Learning of Singular Vectors" on IEEE Access (2020) [paper]
  • "Graph-based Knowledge Distillation by Multi-head Attention Network." on BMVC2019 oral [paper] [code]
  • "Self-supervised Knowledge Distillation Using Singular Value Decomposition" on ECCV2018 [paper] [TF1 code, TF2 code]

Co-author of

  • "CFA: Coupled-hypersphere-based Feature Adaptation for Target-Oriented Anomaly Localization" on IEEE Access (2022) [paper] [code]
  • "Balanced knowledge distillation for one-stage object detector" on Neurocomputing (2022) [paper]
  • "Vision Transformer for Small-Size Datasets" on arxiv preprint [paper] [code]
  • "Contextual Gradient Scaling for Few-Shot Learning" on WACV2022 [paper] [code]
  • "Zero-Shot Knowledge Distillation Using Label-Free Adversarial Perturbation With Taylor Approximation" on IEEE Access (2021) [paper] [code]
  • "Channel Pruning Via Gradient Of Mutual Information For Light-Weight Convolutional Neural Networks" on ICIP 2020 [paper]
  • "Real-time purchase behavior recognition system based on deep learning-based object detection and tracking for an unmanned product cabinet" on ESWA (2020) [paper]
  • "Metric-Based Regularization and Temporal Ensemble for Multi-Task Learning using Heterogeneous Unsupervised Tasks" on ICCVW2019 [paper]
  • "MUNet: macro unit-based convolutional neural network for mobile devices" on CVPRW2018 [paper]

and so on 🎓

Pinned

  1. KD_methods_with_TF

     Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added).

     Python · 265 stars · 59 forks

  2. Knowledge_distillation_via_TF2.0

     Code for recent knowledge distillation algorithms and benchmark results using the TF2.0 low-level API.

     Python · 105 stars · 30 forks

  3. codestella/putting-nerf-on-a-diet

     Putting NeRF on a Diet: Semantically Consistent Few-Shot View Synthesis implementation.

     Python · 264 stars · 21 forks

  4. Zero-shot_Knowledge_Distillation

     Zero-Shot Knowledge Distillation in Deep Networks (ICML2019).

     Python · 49 stars · 9 forks

  5. SSKD_SVD

     Python · 49 stars · 10 forks

  6. EKG

     Ensemble Knowledge Guided Sub-network Search and Fine-tuning for Filter Pruning.

     Python · 17 stars · 1 fork