Neurocomputing

Lecture: Monday, 17:15 - 18:45, 1/201 (A10.201) (Dr. J. Vitay)
Exercise: Tuesday, 13:45 - 15:15, 1/B201 (A11.201) (Dr. M. Teichmann)
Exercise: Wednesday, 13:45 - 15:15, 1/203 (A10.203) (T. Follak)
Exercise: Thursday, 15:30 - 17:00, 1/203 (A10.203) (T. Follak)

General Information

This course replaces Machine Learning (573050).

Suggested prerequisites: Mathematics I to IV, basic knowledge of Python.

Exam: written examination (90 minutes), 5 ECTS.

Contact: julien dot vitay at informatik dot tu-chemnitz dot de.

Language: English. The exam can, of course, be taken in German.

Exam WS 2019-2020 on 17.02.20

The exam starts at 13:00!

Room 316: all Informatik students plus Biomedizinische Technik.

Room 204: all others: Data Science, Mathematics, Human Factors, SEKO, Erasmus.

Some notes to prepare for the exam: (pdf).

Consultation on 10.2.20 at 15:30 (room 204).

Content

The course will introduce a variety of methods using neural networks to learn to solve useful problems. The first part of the course covers deep learning, starting with an introduction to machine learning and classical neural networks and moving up to current research trends. The second part introduces other neural network architectures (attractor networks, reservoir computing, unsupervised Hebbian learning and spiking networks) which may be used to build complex cognitive architectures.

The different algorithms presented during the lectures will be studied in more detail during the exercises, through implementations in Python (TensorFlow, Keras).
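To give a flavor of what such an exercise looks like, here is a minimal sketch in plain NumPy (illustrative only, not the course's actual exercise code) of the first algorithm on the plan, linear regression trained by batch gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 2x + 1 plus a little noise
X = rng.uniform(-1, 1, size=100)
y = 2.0 * X + 1.0 + 0.1 * rng.normal(size=100)

w, b = 0.0, 0.0   # parameters to learn
eta = 0.1         # learning rate

for _ in range(500):
    y_hat = w * X + b              # predictions
    err = y_hat - y                # residuals
    w -= eta * (err * X).mean()    # gradient of the MSE w.r.t. w (up to a factor 2)
    b -= eta * err.mean()          # gradient of the MSE w.r.t. b

print(w, b)  # learned parameters, close to the true 2.0 and 1.0
```

The exercises use TensorFlow/Keras for the larger models, where the gradients in the loop above are computed automatically instead of by hand.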

The preliminary plan of the course is:

  1. Deep learning
    1. Linear algorithms (regression, classification)
    2. Neural Networks (MLP, regularization)
    3. Deep Learning (CNN, autoencoders, GAN)
    4. Recurrent neural networks (LSTM, attention)
  2. Neurocomputing
    1. Attractor networks (Hopfield, neural fields)
    2. Reservoir computing
    3. Unsupervised Hebbian learning
    4. Deep spiking networks
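As a glimpse of the second part, a minimal sketch of a Hopfield-style attractor network in plain NumPy (a toy illustration with made-up patterns, not course material): patterns are stored via Hebbian outer products, and a corrupted input is pulled back to the nearest stored pattern by iterated updates.

```python
import numpy as np

# Two binary (+1/-1) patterns to memorize
patterns = np.array([
    [ 1,  1,  1, -1, -1, -1],
    [-1, -1,  1,  1, -1,  1],
])

# Hebbian storage: sum of outer products, no self-connections
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

# Corrupt the first pattern by flipping one unit
state = patterns[0].copy()
state[0] = -1

# Synchronous updates until the state settles into an attractor
for _ in range(10):
    state = np.sign(W @ state).astype(int)

print(state.tolist())  # the first stored pattern, recovered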

Literature

  • Deep Learning, Ian Goodfellow, Yoshua Bengio & Aaron Courville, MIT Press.

FAQ

  • How do I register for the exam?
    If you missed the registration on SBService, you need to fill in one of these forms: (Bachelor), (Master) and send it to the ZPA by internal mail or hand it in personally. Info:
    • Pruefungsnummer: 57318 / 57305
    • Pruefungsname: Neurocomputing (Maschinelles Lernen)
    • Modulname: Maschinelles Lernen (573050)
    • Pruefer: Prof. Hamker, Dr. Vitay
  • I do not have Neurocomputing in my study program, can I take it anyway?
    If you have Machine Learning (573050) in your program, you can replace it with Neurocomputing.
  • I need to take Machine Learning!
    You can only attend the Neurocomputing exam. The grade will be transferred to the Machine Learning course automatically. If the transfer is not automatic, you can use this form (which has to be signed by the head of the Pruefungsausschuss): https://www.tu-chemnitz.de/studentenservice/stusek/formulare/einstufung.pdf.
  • How do I register for the course?
    You can register on OPAL: https://bildungsportal.sachsen.de/opal/auth/RepositoryEntry/21637267460.
  • I cannot attend the exercises. Can I take the exam anyway?
    Yes. The exercises are there to help you understand the concepts covered in the lectures and gain practical experience with neural networks, but they are not mandatory for the exam.
  • Do I have to memorize all these equations?
    No, but you have to understand them, which is basically the same thing.

Slides

  1. Introduction
    1. Introduction to machine learning (html, pdf)
    2. Neurons (html, pdf)
  2. Linear models
    1. Optimization (html, pdf)
    2. Linear regression (html, pdf)
    3. Regularization (html, pdf)
    4. Linear classification (html, pdf)
    5. Maximum Likelihood Estimation and Logistic regression (html, pdf)
    6. Multi-class classification (html, pdf)
    7. Learning theory (html, pdf)
  3. Deep learning
    1. Artificial neural networks (html, pdf)
    2. Deep neural networks (html, pdf)
    3. Convolutional neural networks (html, pdf)
    4. Object detection (html, pdf)
    5. Autoencoders (html, pdf)
    6. Restricted Boltzmann Machines (skipped) (html, pdf)
    7. Segmentation networks (html, pdf)
    8. Generative adversarial networks (html, pdf)
    9. Recurrent neural networks (html, pdf)
  4. Neurocomputing
    1. Limits of deep learning (html, pdf)
    2. Associative memory (html, pdf)
    3. Reservoir computing (html, pdf)
    4. Spiking neural networks (html, pdf)

Exercises

Instructions to setup a virtual environment for Python: pdf

  1. Introduction to Python and Numpy: notebook (zip), solution (zip)
  2. Linear regression: notebook (zip), solution (zip)
  3. Linear classification: notebook (zip), solution (zip)
  4. Cross-validation: notebook (zip), solution (zip)
  5. Backpropagation: notebook (zip), solution (zip)
  6. MNIST: notebook (zip), solution (zip)
  7. CNN: notebook (zip), solution (zip)
  8. Transfer Learning: notebook (zip), solution (zip)
  9. LSTM: notebook (zip), solution (zip)

Press articles