Title | Why and when can deep-but not shallow-networks avoid the curse of dimensionality: A review |
Publication Type | Journal Article |
Year of Publication | 2017 |
Authors | Poggio T., Mhaskar H., Rosasco L., Miranda B., Liao Q. |
Journal | International Journal of Automation and Computing |
Pagination | 1-17 |
Date Published | 03/2017 |
Keywords | convolutional neural networks, deep and shallow networks, deep learning, function approximation, machine learning, neural networks |
Abstract | The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which it can be exponentially better than shallow learning. A class of deep convolutional networks represents an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage. Implications of a few key theorems are discussed, together with new results, open problems and conjectures. |
URL | http://link.springer.com/article/10.1007/s11633-017-1054-2 |
DOI | 10.1007/s11633-017-1054-2 |