useful_links.txt
This repository contains examples of different machine learning algorithms using scikit-learn, TensorFlow, etc.
TODO
Deep Learning for Hackers:
https://github.com/curiousily/Deep-Learning-For-Hackers
https://github.com/sgrvinod?tab=overview&from=2019-12-01&to=2019-12-31
Auto Differentiation:
http://colah.github.io/posts/2015-08-Backprop/
https://blog.paperspace.com/pytorch-101-understanding-graphs-and-automatic-differentiation/
https://blog.paperspace.com/pytorch-hooks-gradient-clipping-debugging/
https://towardsdatascience.com/pytorch-autograd-understanding-the-heart-of-pytorchs-magic-2686cd94ec95#:~:text=Mathematically%2C%20the%20autograd%20class%20is,with%20respect%20to%20another%20vector.
https://www.youtube.com/watch?v=MswxJw-8PvE
https://ml-cheatsheet.readthedocs.io/en/latest/backpropagation.html
https://www.cs.toronto.edu/~rgrosse/courses/csc321_2018/slides/lec10.pdf
https://medium.com/@14prakash/back-propagation-is-very-simple-who-made-it-complicated-97b794c97e5c
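As a toy illustration of the backprop/autograd articles above, here is a minimal scalar reverse-mode autodiff sketch (all names are made up for illustration; this is not PyTorch's actual implementation):

```python
# Minimal reverse-mode automatic differentiation on scalars.
class Value:
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._local_grads = local_grads

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        # d(a*b)/da = b, d(a*b)/db = a
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def backward(self, grad=1.0):
        # Chain rule: accumulate upstream gradient times local gradient.
        self.grad += grad
        for parent, local in zip(self._parents, self._local_grads):
            parent.backward(grad * local)

x = Value(2.0)
y = Value(3.0)
z = x * y + x      # dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
```

Gradients accumulate with `+=`, which is why a variable used twice (like `x` here) gets the sum of the contributions from each path.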
PyTorch hooks:
https://www.youtube.com/watch?v=syLFCVYua6Q
Batch Normalization:
https://papers.nips.cc/paper/7996-understanding-batch-normalization.pdf
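A sketch of the batch-norm forward pass that the paper above analyzes: normalize each feature over the batch, then apply a learned scale (gamma) and shift (beta):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Per-feature statistics over the batch dimension.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(1)
x = rng.normal(3.0, 5.0, size=(64, 8))   # batch of 64, 8 features
out = batch_norm(x, gamma=2.0, beta=1.0)
```

After normalization the batch statistics of each feature become (approximately) mean = beta and std = gamma, regardless of the input scale.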
Convolutional Neural Network for image recognition:
https://www.youtube.com/playlist?list=PL3FW7Lu3i5JvHM8ljYj-zLfQRF3EO8sYv
https://petewarden.com/2015/04/20/why-gemm-is-at-the-heart-of-deep-learning/
cuDNN: https://arxiv.org/pdf/1410.0759.pdf
caffe optimization: https://arxiv.org/pdf/1504.04343v1.pdf
Conv backpropagation:
https://towardsdatascience.com/backpropagation-in-a-convolutional-layer-24c8d64d8509
Gradient Descent:
https://ruder.io/optimizing-gradient-descent/
https://en.wikipedia.org/wiki/Stochastic_gradient_descent
https://en.wikipedia.org/wiki/Automatic_differentiation
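The momentum variant from Ruder's overview, sketched on a 1-D quadratic (a toy, not a drop-in optimizer):

```python
import numpy as np

def sgd_momentum(grad_fn, w, lr=0.1, momentum=0.9, steps=300):
    # Velocity accumulates an exponentially decaying sum of gradients.
    v = np.zeros_like(w)
    for _ in range(steps):
        v = momentum * v - lr * grad_fn(w)
        w = w + v
    return w

# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w_opt = sgd_momentum(lambda w: 2 * (w - 3.0), np.array([0.0]))
```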
Transposed Convolution:
https://medium.com/activating-robotic-minds/up-sampling-with-transposed-convolution-9ae4f2df52d0
https://people.eecs.berkeley.edu/~jonlong/long_shelhamer_fcn.pdf
https://arxiv.org/abs/1603.07285
GAN: https://arxiv.org/pdf/1511.06434v2.pdf
https://distill.pub/2016/deconv-checkerboard/
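A 1-D sketch of the transposed-convolution mechanics: each input element "paints" the kernel into the output at stride-spaced positions, and the overlap between neighboring copies is what produces the checkerboard artifacts discussed in the distill.pub article:

```python
import numpy as np

def transposed_conv1d(x, k, stride=2):
    # Output length for stride s: s * (len(x) - 1) + len(k).
    out = np.zeros(stride * (len(x) - 1) + len(k))
    for i, v in enumerate(x):
        # Each input value scatters a scaled copy of the kernel.
        out[i * stride : i * stride + len(k)] += v * k
    return out

out = transposed_conv1d(np.array([1.0, 2.0]), np.array([1.0, 1.0, 1.0]))
```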
LSTM:
http://colah.github.io/posts/2015-08-Understanding-LSTMs/
https://towardsdatascience.com/illustrated-guide-to-lstms-and-gru-s-a-step-by-step-explanation-44e9eb85bf21
http://karpathy.github.io/2015/05/21/rnn-effectiveness/
https://github.com/sherjilozair/char-rnn-tensorflow
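One LSTM time step following Colah's walkthrough (a sketch; real implementations fuse these ops). `W`, `U`, `b` stack the weights for the input (i), forget (f), output (o) gates and the candidate cell state (g):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    z = W @ x + U @ h + b                  # shape (4 * hidden,)
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c_new = f * c + i * np.tanh(g)         # forget old state, add new candidate
    h_new = o * np.tanh(c_new)             # expose a filtered view of the cell
    return h_new, c_new

hidden, n_in = 3, 2
rng = np.random.default_rng(0)
W = rng.normal(size=(4 * hidden, n_in))
U = rng.normal(size=(4 * hidden, hidden))
b = np.zeros(4 * hidden)
h, c = lstm_step(rng.normal(size=n_in),
                 np.zeros(hidden), np.zeros(hidden), W, U, b)
```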
Embedding:
https://towardsdatascience.com/neural-network-embeddings-explained-4d028e6f0526
http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf
https://stats.stackexchange.com/questions/270546/how-does-keras-embedding-layer-work
https://developers.google.com/machine-learning/crash-course/embeddings/video-lecture
https://stackoverflow.com/questions/41455101/what-is-the-meaning-of-the-word-logits-in-tensorflow
https://www.tensorflow.org/tutorials/text/word_embeddings
https://arxiv.org/pdf/1507.07998.pdf (Document embedding with paragraph vectors)
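The point the Stack Exchange answer above makes is that an embedding layer is just a trainable lookup table: row i is the vector for token id i. A numpy sketch of what something like `keras.layers.Embedding` does on the forward pass:

```python
import numpy as np

vocab_size, dim = 10, 4
rng = np.random.default_rng(0)
table = rng.normal(size=(vocab_size, dim))   # the trainable weights

token_ids = np.array([1, 5, 1])              # a tiny "sentence"
vectors = table[token_ids]                   # fancy indexing = lookup
```

Repeated token ids (the two 1s here) map to the same row, so their vectors are identical.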
Jupyter:
https://jupyter-notebook.readthedocs.io/en/stable/examples/Notebook/Working%20With%20Markdown%20Cells.html
https://machinelearningmastery.com/use-word-embedding-layers-deep-learning-keras/
Convolutional Neural Network:
https://missinglink.ai/guides/neural-network-concepts/convolutional-neural-network-build-one-keras-pytorch/
Image segmentation:
https://missinglink.ai/guides/computer-vision/image-segmentation-deep-learning-methods-applications/
https://medium.com/@arthur_ouaknine/review-of-deep-learning-algorithms-for-image-semantic-segmentation-509a600f7b57
Distributed training:
https://github.com/tensorflow/examples/blob/master/community/en/docs/deploy/distributed.md
https://lambdalabs.com/blog/tensorflow-2-0-tutorial-05-distributed-training-multi-node/
strong scaling: https://arxiv.org/pdf/1807.09161.pdf
Data normalization:
https://towardsdatascience.com/understand-data-normalization-in-machine-learning-8ff3062101f0
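Z-score standardization, one of the schemes covered in the article above: each feature is shifted to zero mean and scaled to unit variance so that features on very different scales contribute comparably:

```python
import numpy as np

def zscore(x, eps=1e-8):
    # eps guards against division by zero for constant features.
    return (x - x.mean(axis=0)) / (x.std(axis=0) + eps)

data = np.array([[1.0, 200.0],
                 [2.0, 400.0],
                 [3.0, 600.0]])
scaled = zscore(data)
```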
Benchmarking:
https://github.com/baidu-research/DeepBench
Setting random seeds while training:
https://medium.com/@ODSC/properly-setting-the-random-seed-in-ml-experiments-not-as-simple-as-you-might-imagine-219969c84752
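The usual seed-everything helper the article argues for (a sketch; frameworks need their own calls too, e.g. torch.manual_seed / tf.random.set_seed, and full GPU determinism takes extra flags):

```python
import os
import random
import numpy as np

def set_seed(seed: int = 42):
    random.seed(seed)
    np.random.seed(seed)
    # Note: PYTHONHASHSEED only affects newly launched processes.
    os.environ["PYTHONHASHSEED"] = str(seed)

set_seed(0)
a = np.random.rand(3)
set_seed(0)
b = np.random.rand(3)   # same seed, same numbers
```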
Numpy strides:
https://stackoverflow.com/questions/53097952/how-to-understand-numpy-strides-for-layman
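Strides are the byte jumps per axis step. The classic demonstration from the answer above: build overlapping windows of a 1-D array without copying, by reusing the element stride for both axes (treat the result as read-only; `as_strided` can read out of bounds if misused):

```python
import numpy as np
from numpy.lib.stride_tricks import as_strided

x = np.arange(6, dtype=np.int64)
windows = as_strided(x, shape=(4, 3),
                     strides=(x.itemsize, x.itemsize))
# windows[i] is the view x[i:i+3]
```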
COVID-19 India data display:
state plot:
https://towardsdatascience.com/tracking-corona-covid-19-spread-in-india-using-python-40ef8ffa7e31
face recognition:
https://www.hackster.io/mjrobot/real-time-face-recognition-an-end-to-end-project-a10826
Feature Engineering, Data cleaning, Data preparation:
https://machinelearningmastery.com/books-on-data-cleaning-data-preparation-and-feature-engineering/
Recommender System:
https://en.wikipedia.org/wiki/Recommender_system
DLRM: https://ai.facebook.com/blog/dlrm-an-advanced-open-source-deep-learning-recommendation-model/
Content filtering: https://developers.google.com/machine-learning/recommendation/content-based/basics
TensorRT:
https://developer.nvidia.com/gtc/2020/video/s21671
Don't decay the learning rate, increase the batch size:
https://openreview.net/pdf?id=B1Yy1BxCZ
Transformer:
https://nlp.seas.harvard.edu/2018/04/03/attention.html
http://jalammar.github.io/illustrated-transformer/
misc:
https://towardsdatascience.com/how-to-automatically-import-your-favorite-libraries-into-ipython-or-a-jupyter-notebook-9c69d89aa343
yolov5:
https://www.analyticsvidhya.com/blog/2021/08/train-your-own-yolov5-object-detection-model/
https://yanfengliux.medium.com/the-confusing-metrics-of-ap-and-map-for-object-detection-3113ba0386ef
https://github.com/saimj7/People-Counting-in-Real-Time
git clone https://github.com/saimj7/People-Counting-in-Real-Time.git
OpenCV working with ONNX YOLOv5: https://github.com/ultralytics/yolov5/issues/239
python export.py --weights yolov5s.pt --include onnx --simplify
python detect.py --weights yolov5s.onnx # ONNX Runtime inference
python detect.py --weights yolov5s.onnx --dnn # OpenCV DNN inference
object detection:
https://github.com/sgrvinod/a-PyTorch-Tutorial-to-Object-Detection
https://www.kaggle.com/rahulkumarpatro/yolo-object-detection
https://becominghuman.ai/understanding-anchors-backbone-of-object-detection-using-yolo-54962f00fbbb
https://curiousily.com/posts/object-detection-on-custom-dataset-with-yolo-v5-using-pytorch-and-python/
https://pysource.com/object-detection-opencv-deep-learning-video-course/
ANPR:
https://github.com/theAIGuysCode/yolov4-custom-functions
https://www.youtube.com/watch?v=AAPZLK41rek
https://github.com/bharatsubedi/ALPR-Yolov5
https://www.kaggle.com/rkuo2000/yolov5-alpr/data
https://www.kaggle.com/rkuo2000/yolov5-alpr/notebook
mAP: https://towardsdatascience.com/map-mean-average-precision-might-confuse-you-5956f1bfa9e2
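The overlap measure underlying the mAP metric discussed above is intersection-over-union; a self-contained sketch for `[x1, y1, x2, y2]` boxes:

```python
def iou(box_a, box_b):
    # Intersection rectangle (empty if the boxes are disjoint).
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)
```

A detection counts as a true positive when its IoU with a ground-truth box clears a threshold (0.5 in classic PASCAL VOC mAP).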
BERT:
http://nlp.seas.harvard.edu/2018/04/03/attention.html
https://arxiv.org/pdf/1810.04805.pdf
Statistics:
https://eng.libretexts.org/Bookshelves/Industrial_and_Systems_Engineering/Book%3A_Chemical_Process_Dynamics_and_Controls_(Woolf)/13%3A_Statistics_and_Probability_Background/13.01%3A_Basic_statistics-_mean%2C_median%2C_average%2C_standard_deviation%2C_z-scores%2C_and_p-value
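The basic quantities from the statistics reference above can be computed with the standard library alone (`pstdev` is the population standard deviation):

```python
import statistics as st

data = [2, 4, 4, 4, 5, 5, 7, 9]
mean = st.mean(data)
sigma = st.pstdev(data)
z_score = (9 - mean) / sigma   # how many sigmas the value 9 sits above the mean
```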
Distributed BERT training:
https://docs.microsoft.com/en-us/azure/machine-learning/how-to-train-distributed-gpu
https://aws.amazon.com/blogs/machine-learning/multi-gpu-distributed-deep-learning-training-at-scale-on-aws-with-ubuntu18-dlami-efa-on-p3dn-instances-and-amazon-fsx-for-lustre/
https://docs.nvidia.com/ngc/multi-node-bert-user-guide/index.html
https://github.com/microsoft/AzureML-BERT/blob/master/pretrain/PyTorch/notebooks/BERT_Pretrain.ipynb