Why and When Can Deep - but Not Shallow - Networks Avoid the Curse of Dimensionality: a Review

Title: Why and When Can Deep - but Not Shallow - Networks Avoid the Curse of Dimensionality: a Review
Publication Type: CBMM Memo
Year of Publication: 2016
Authors: Poggio T., Mhaskar H., Rosasco L., Miranda B., Liao Q.
Date Published: 11/2016
Abstract

The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which it can be exponentially better than shallow learning. A class of deep convolutional networks represents an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage. Implications of a few key theorems are discussed, together with new results, open problems and conjectures.

URL: https://cbmm.mit.edu/sites/default/files/publications/CBMM-Memo-058_0.pdf
Citation Key: 315

CBMM Memo No.: 58