What If...

Title: What If...
Publication Type: Miscellaneous
Year of Publication: 2015
Authors: Poggio T.
Abstract

Over the last three years, and increasingly so in the last few months, I have seen supervised DCLNs — feedforward and recurrent — do more and more of everything quite well. They seem to learn good representations for a growing number of speech and text problems (for a review by the pioneers in the field, see LeCun, Bengio, and Hinton, 2015). More interestingly, it is increasingly clear, as I will discuss later, that instead of being trained on millions of labeled examples they can be trained in implicitly supervised ways. This breakthrough in machine learning triggers a few dreams. What if we now have the basic answer to how to develop brain-like intelligence and its basic building blocks?...

URL: https://cbmm.mit.edu/sites/default/files/publications/What%20if.pdf