2019 | Rodrigo Fernandes De Mello

Rodrigo Fernandes de Mello is an associate professor in the Department of Computer Science at the Universidade de São Paulo, with a 15-year career. He holds an M.Sc. in Computer Science from the Universidade Federal de São Carlos and a Ph.D. in Electrical Engineering from the Universidade de São Paulo, Brazil. He is also a CNPq level-2B researcher (CNPq is the Brazilian research agency responsible for most of the research funding in the country; in the CS area, about 500 researchers hold this award). In the last five years, he has published over 22 papers in top CS journals, and he has graduated 7 Ph.D. students and 6 master's students. He published the textbook ‘Machine Learning: A Practical Approach on the Statistical Learning Theory’ (Springer International, August 2018) together with Prof. Moacir Antonelli Ponti. His research interests are in the areas of Machine Learning, Statistical Learning Theory, and Time Series Analysis.


Objectives:
Supervised machine learning algorithms have been widely adopted as tools across many areas; however, most practitioners still lack the background needed to prove that an algorithm indeed learns and to understand how it operates on data. That is the main contribution of Statistical Learning Theory (SLT): it provides the theoretical foundation for Supervised Machine Learning, Data Science, and Artificial Intelligence.
Content:
i) Essential concepts and assumptions for the development of the Statistical Learning Theory;
ii) Generalization and Consistency;
iii) Bias-Variance Dilemma;
iv) The Empirical Risk Minimization Principle: The Law of Large Numbers; Inconsistencies of the principle; Uniform convergence; Shattering coefficient; Vapnik-Chervonenkis dimension; Large-margin bounds;
v) Analyzing the most common supervised learning algorithms according to the SLT: Multilayer Perceptron, K-Nearest Neighbors, Support Vector Machines, and Deep Learning.
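To make item (iv) concrete, here is a hypothetical sketch (not taken from the course material) of the Empirical Risk Minimization principle together with a Vapnik-style generalization bound, for the simple class of threshold classifiers on the real line, whose VC dimension is 1. The synthetic data, the noise level, and the grid search are all illustrative:

```python
import math
import random

# Threshold classifiers f_t(x) = 1 if x > t else 0; this class has VC dimension h = 1.
random.seed(0)
n = 1000
# Synthetic i.i.d. sample: true label is 1 when x > 0.3, with 10% label noise.
xs = [random.random() for _ in range(n)]
ys = [(1 if x > 0.3 else 0) ^ (random.random() < 0.1) for x in xs]

def empirical_risk(t):
    # Fraction of training examples the threshold classifier gets wrong.
    return sum((1 if x > t else 0) != y for x, y in zip(xs, ys)) / n

# ERM: pick the threshold minimizing the empirical risk over a grid.
best_t = min((i / 100 for i in range(101)), key=empirical_risk)
r_emp = empirical_risk(best_t)

# One common form of the VC generalization bound, holding with probability 1 - delta.
h, delta = 1.0, 0.05
bound = r_emp + math.sqrt((h * (math.log(2 * n / h) + 1) + math.log(4 / delta)) / n)
print(f"threshold={best_t:.2f} empirical risk={r_emp:.3f} risk bound={bound:.3f}")
```

The point of the sketch is the gap between `r_emp` and `bound`: the bound charges a capacity term that shrinks as the sample grows but grows with the VC dimension `h`.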
References:
Mello, R. F. and Ponti, M. A.; Machine Learning: A Practical Approach on the Statistical Learning Theory, August 2018, Springer International.
Vapnik, V. N.; Statistical Learning Theory, 1998, Wiley-Interscience.
Vapnik, V. N.; The Nature of Statistical Learning Theory, 1999, Springer.
Scholkopf, B.; Smola, A. J.; Learning With Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, 2001, MIT Press Cambridge.
James, G.; Witten, D.; Hastie, T.; Tibshirani, R.; An Introduction to Statistical Learning with Applications in R, Springer, 2013.


His program:

Course: Introduction to Statistical Learning Theory
Audience: Master's and doctoral candidates

  • February 19, 2019 – 2 pm – 3 hours
  • Place: LIX – Computer Science department, Institut Polytechnique de Paris – Route de Saclay – 91120 Palaiseau
  • Room: Grace Hopper
  • Title: An introduction to Statistical Learning Theory.
  • Abstract: This seminar presents the main assumptions and results established by Vapnik. Building on that theoretical background, we will also discuss how those assumptions may fail to support learning (for instance, in temporal scenarios), as well as the biases of different algorithms, including Deep Learning.
  • February 20, 2019 – 2 pm – 3 hours
  • Place: LIX – Computer Science department, Institut Polytechnique de Paris – Route de Saclay – 91120 Palaiseau
  • Room: Henri Poincaré
  • Title: Introducing Deep Learning from the Multilayer Perceptron.
  • Abstract: We start by formalizing learning with the Multilayer Perceptron and then discuss how Convolutional Neural Networks can be designed.
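As a companion to this abstract, the forward pass the lecture starts from can be sketched as follows. This is a minimal illustration (not the lecture's code) of a Multilayer Perceptron with one hidden layer and sigmoid activations; the layer sizes, inputs, and random weights are all illustrative:

```python
import math
import random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def init_layer(n_in, n_out):
    # One row of weights per output unit; the last entry of each row is the bias.
    return [[random.uniform(-0.5, 0.5) for _ in range(n_in + 1)] for _ in range(n_out)]

def forward(layer, inputs):
    # Weighted sum of the inputs plus the bias, squashed through the sigmoid.
    return [sigmoid(sum(w * x for w, x in zip(row[:-1], inputs)) + row[-1])
            for row in layer]

hidden = init_layer(2, 3)   # 2 inputs -> 3 hidden units
output = init_layer(3, 1)   # 3 hidden units -> 1 output unit
y = forward(output, forward(hidden, [0.0, 1.0]))
print(y)  # a single activation in (0, 1)
```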

  • March 13, 2019 – 11 am
  • Place: LIX – Computer Science department, Institut Polytechnique de Paris – Route de Saclay – 91120 Palaiseau
  • Room: Henri Poincaré
  • Title: Embedding of Time Series and Data Streams.
  • Abstract: This seminar discusses Takens' embedding theorem as a way of reconstructing phase spaces in which learning guarantees can be met using the framework provided by Statistical Learning Theory.
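The time-delay embedding at the heart of Takens' theorem can be sketched in a few lines: each phase-space point collects m observations of the series separated by a delay tau. The series and the choice m=2, tau=25 below are illustrative:

```python
import math

def delay_embedding(series, m, tau):
    # Build phase-space points [x[t], x[t+tau], ..., x[t+(m-1)*tau]].
    n = len(series) - (m - 1) * tau
    return [[series[t + i * tau] for i in range(m)] for t in range(n)]

# Example: embedding a sampled sine wave with m=2 and tau equal to a quarter
# period unfolds the one-dimensional signal into a closed, circle-like orbit.
x = [math.sin(2 * math.pi * t / 100) for t in range(300)]
points = delay_embedding(x, m=2, tau=25)
print(len(points), points[0])  # 275 points; the first is [x[0], x[25]]
```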
  • March 21, 2019 – 11 am
  • Place: LIX – Computer Science department, Institut Polytechnique de Paris – Route de Saclay – 91120 Palaiseau
  • Room: Henri Poincaré
  • Title: Decomposition of Time Series and Data Streams.
  • Abstract: This seminar introduces an approach based on Empirical Mode Decomposition (EMD) to separate the stochastic and deterministic influences present in time series and data streams, which supports better modeling of each individual component.
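A crude sketch of one EMD sifting step follows, using piecewise-linear envelopes instead of the cubic splines used in practice; the signal and all helper names are illustrative, not the seminar's implementation:

```python
import math

def local_extrema(x, cmp):
    # Indices of strict local maxima (cmp = greater-than) or minima (less-than).
    return [i for i in range(1, len(x) - 1) if cmp(x[i], x[i - 1]) and cmp(x[i], x[i + 1])]

def interp(indices, values, n):
    # Piecewise-linear envelope through the extrema, extended flat at both ends.
    out, j = [], 0
    for t in range(n):
        while j + 1 < len(indices) and indices[j + 1] <= t:
            j += 1
        if j + 1 < len(indices) and indices[j] <= t:
            i0, i1 = indices[j], indices[j + 1]
            w = (t - i0) / (i1 - i0)
            out.append(values[j] * (1 - w) + values[j + 1] * w)
        else:
            out.append(values[min(j, len(values) - 1)])
    return out

def sift_once(x):
    # Subtract the mean of the upper and lower envelopes, exposing the
    # fast oscillatory component and leaving the slow (trend) part behind.
    maxima = local_extrema(x, lambda a, b: a > b)
    minima = local_extrema(x, lambda a, b: a < b)
    upper = interp(maxima, [x[i] for i in maxima], len(x))
    lower = interp(minima, [x[i] for i in minima], len(x))
    return [xi - (u + l) / 2 for xi, u, l in zip(x, upper, lower)]

# A fast oscillation riding on a slow linear trend; one sifting step
# mostly removes the trend, isolating the oscillatory component.
signal = [math.sin(2 * math.pi * t / 10) + 0.05 * t for t in range(200)]
component = sift_once(signal)
```

Real EMD iterates this sifting until the component satisfies the intrinsic-mode-function conditions, then subtracts it and repeats on the residue.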

  • April 4, 2019 – 10am
  • Place: Télécom ParisTech, 46 rue Barrault – 75013 Paris, Room C229
  • Title: On the implementation of the Convolutional Neural Network: First part
  • Abstract: In this seminar, we will discuss how to implement the CNN. The audience is expected to have already implemented at least the Multilayer Perceptron and know its formulation.
  • April 11, 2019 – 10am
  • Place: Télécom ParisTech, 46 rue Barrault – 75013 Paris, Room C221
  • Title: On the implementation of the Convolutional Neural Network: Second part
  • Abstract: In this seminar, we will discuss how to implement the CNN. The audience is expected to have already implemented at least the Multilayer Perceptron and know its formulation.
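The core operation these two implementation sessions build on can be sketched as a "valid" 2D cross-correlation, which is what deep learning libraries call convolution. The image and kernel below are illustrative, not the seminar's exercise:

```python
# "Valid" 2D cross-correlation: slide the kernel over the image and take
# the elementwise product-sum at each position (no padding, stride 1).
def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(ow)]
            for i in range(oh)]

# A vertical-edge detector applied to an image with a dark-to-bright step.
image = [[0, 0, 1, 1]] * 4
kernel = [[-1, 1],
          [-1, 1]]
feature_map = conv2d(image, kernel)
print(feature_map)  # strongest response (2) along the edge column
```

Stacking such filters, adding a nonlinearity and pooling, and feeding the result into a Multilayer Perceptron head is the usual path from the MLP to a full CNN.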

Only users with a login can read the visit report.