mirror of
https://github.com/aladdinpersson/Machine-Learning-Collection.git
synced 2026-02-21 11:18:01 +00:00
- [Original Paper - Going Deeper with Convolutions (2014)](https://arxiv.org/abs/1409.4842)
- [Related Video](https://www.youtube.com/watch?v=uQc4Fs7yx5I)
- [Review: GoogLeNet (Inception v1)](https://medium.com/coinmonks/paper-review-of-googlenet-inception-v1-winner-of-ilsvlc-2014-image-classification-c2b3565a64e7)
- [Understanding GoogLeNet Model – CNN Architecture](https://www.geeksforgeeks.org/understanding-googlenet-model-cnn-architecture/)
- [Ensemble Methods in Machine Learning: What are They and Why Use Them?](https://towardsdatascience.com/ensemble-methods-in-machine-learning-what-are-they-and-why-use-them-68ec3f9fef5f)
- [Neural Networks Ensemble](https://towardsdatascience.com/neural-networks-ensemble-33f33bea7df3)
- [Multiscale Methods and Machine Learning](https://www.kdnuggets.com/2018/03/multiscale-methods-machine-learning.html)
- [What do the terms “dense” and “sparse” mean in the context of neural networks?](https://stats.stackexchange.com/questions/266996/what-do-the-terms-dense-and-sparse-mean-in-the-context-of-neural-networks)
- [The Sparse Future of Deep Learning](https://towardsdatascience.com/the-sparse-future-of-deep-learning-bce05e8e094a)
- [Understanding Auxiliary Loss](https://stats.stackexchange.com/a/436203)
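
The auxiliary-loss link above covers the idea in general; in GoogLeNet specifically, each auxiliary classifier's loss is added to the main loss with a discount weight of 0.3 during training, and the auxiliary heads are discarded at inference. A minimal sketch of that combination:

```python
# GoogLeNet-style training loss: the auxiliary classifiers' losses are
# added to the main classifier's loss with a discount weight of 0.3.
# The auxiliary heads themselves are removed at inference time.

def total_loss(main_loss, aux_losses, aux_weight=0.3):
    """Main loss plus discounted auxiliary losses."""
    return main_loss + aux_weight * sum(aux_losses)

print(total_loss(1.0, [0.5, 0.5]))  # 1.0 + 0.3 * (0.5 + 0.5) = 1.3
```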
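
A central idea from the paper linked at the top is that a 1x1 "bottleneck" convolution before the expensive 3x3 and 5x5 branches keeps an Inception module cheap. The arithmetic can be checked directly; the channel sizes below are illustrative assumptions, not the exact figures of any particular layer:

```python
# Weight counts for the 5x5 path of an Inception module, with and
# without the 1x1 "bottleneck" reduction (biases ignored).
# Channel sizes are illustrative placeholders, not from a specific layer.

def conv_params(in_ch, out_ch, kernel):
    """Number of weights in a kernel x kernel convolution."""
    return in_ch * out_ch * kernel * kernel

in_ch, reduce_ch, out_ch = 192, 16, 32

# Naive: a 5x5 convolution applied directly to the input.
naive = conv_params(in_ch, out_ch, 5)

# Inception v1 style: 1x1 reduction first, then the 5x5 convolution.
reduced = conv_params(in_ch, reduce_ch, 1) + conv_params(reduce_ch, out_ch, 5)

print(naive, reduced)  # 153600 15872 -- roughly a 10x saving
```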