The Theory of Perfect Learning

Submitted by: Nonvikan Karl-Augustt Alahassa
University: University of Montreal
E-mail: alahassan@dms.umontreal.ca
Supervisor's Name: Alejandro Murua
Year of award: 2021
Summary of Thesis

Perfect learning exists. By this we mean a learning model that generalizes and, moreover, can always fit the test data as perfectly as the training data. In this thesis we perform many experiments that validate this concept in several ways, and the tools are developed throughout the chapters. The classical Multilayer Feedforward model is re-examined, and a novel N_k-architecture is proposed to fit any multivariate regression task. This model can easily be extended to thousands of layers without loss of predictive power, and has the potential to simultaneously overcome the difficulties of building a model that has a good fit on the ...
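The summary does not spell out the N_k-architecture itself, so the sketch below is only a generic deep feedforward regressor in NumPy, illustrating the kind of multilayer feedforward network for multivariate regression that the thesis revisits. The layer widths, ReLU activation, and the init_mlp/forward helpers are illustrative assumptions, not the author's model.

    # Minimal sketch of a deep feedforward regressor (assumed, not the thesis's N_k model).
    import numpy as np

    def init_mlp(layer_sizes, rng):
        """Initialize weights and biases for a feedforward net with the given widths."""
        params = []
        for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
            W = rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))
            b = np.zeros(n_out)
            params.append((W, b))
        return params

    def forward(params, X):
        """Forward pass: ReLU hidden layers, linear output for multivariate regression."""
        h = X
        for W, b in params[:-1]:
            h = np.maximum(h @ W + b, 0.0)   # hidden layer with ReLU activation
        W_out, b_out = params[-1]
        return h @ W_out + b_out             # linear output, one column per target

    # Usage: 5-dimensional input, two hidden layers, 3-dimensional regression output.
    rng = np.random.default_rng(0)
    params = init_mlp([5, 64, 64, 3], rng)
    X = rng.normal(size=(10, 5))
    Y_hat = forward(params, X)
    print(Y_hat.shape)   # (10, 3)

Adding more hidden widths to the layer_sizes list deepens the network; the thesis argues such depth can be increased substantially without loss of predictive power.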

THESIS CHAPTER SCHEME

1.    Statistical topics

2.    The Potts Model with Complete Shrinkage

3.    Deep learning and Classical Neural Networks

4.    Shallow Potts Neural Network Mixture Models

5.    Nearest Neighbor Multivariate Interpolation (NNMI)

6.    Generalization of similarity measure using Metric Learning

7.    Convolutional Neural Network Gibbs Model

8.    Concluding Remarks, Discussion Notes & Applications