About

Deep learning's recent history has been one of achievement: from triumphing over humans in the game of Go to world-leading performance in image recognition, voice recognition, translation, and other tasks. But this progress has come with a voracious appetite for computing power. Recently, researchers from MIT, UnB, IBM, and Yonsei University published a paper named "The Computational Limits of Deep Learning" warning society about this issue.

The paper reports on the computational demands of Deep Learning applications in five prominent application areas and shows that progress in all five is strongly reliant on increases in computing power. Extrapolating this reliance forward reveals that progress along current lines is rapidly becoming economically, technically, and environmentally unsustainable.

Thus, continued progress in these applications will require dramatically more computationally-efficient methods, which will either have to come from changes to deep learning or from moving to other machine learning methods.

Our project aims to develop a web application for this paper, giving the community access to its data and analysis and allowing people to contribute to it continuously.

Paper link

Do you want to understand the behavior of Deep Learning models in terms of scalability or other Deep Learning issues? Or share your own paper? TCLDL can help you.

TCLDL is a website for submitting papers related to deep learning, sharing them, and visualizing them in graphics.

Team

Document Versioning

| Date | Author(s) | Description | Version |
|----------|------------------------------|------------------------------------------|---------|
| 09/01/20 | Lorrany Azevedo | Document creation | 0.1 |
| 09/01/20 | Lorrany Azevedo | Document edit | 0.2 |
| 09/01/20 | Lorrany Azevedo and Mikhaele | Stylization of documentation | 0.3 |
| 17/01/20 | Lorrany Azevedo | Bugfix in description and members' photos | 0.4 |