Advanced School and Workshop on Soft Computing and Complex Systems

Program


Broad structure

The workshop will be organised around three main activities: lectures given by well-known international experts; team work by the attendees to solve a particular problem proposed by the lecturers; and, finally, short presentations by the students about their own work and interests.

 

Schedule (Coimbra, June 23-27, 2003)

Monday, June 23
  8:30-9:00    Reception
  9:00-10:30   Opening; G. Dorffner, Neural Computation (DM 2.4)
  10:30-11:00  Coffee break
  11:00-12:30  R. Babuska, Neuro-Fuzzy Modelling (DM 2.4)
  12:30-14:00  Lunch
  14:00-15:30  C. Fonseca, Multi-Criteria Genetic Optimisation (DM 2.4)
  15:30-16:00  Coffee break
  16:00-17:30  J. Félix Costa, Analog Computation (DM 2.4)

Tuesday, June 24
  9:00-10:30   R. Babuska, Intelligent Control (DM 2.4)
  10:30-11:00  Coffee break
  11:00-12:30  J. Schmidhuber, Universal Learning Algorithms (DM 2.4)
  12:30-14:00  Lunch
  14:00-15:30  Short Papers 1-4
  15:30-16:00  Coffee break
  16:00-17:30  Short Papers 5-9

Wednesday, June 25
  9:00-10:30   J. Schmidhuber, Recurrent Neural Networks (DM 2.4)
  10:30-11:00  Coffee break
  11:00-12:30  Teams Formation
  12:30-14:00  Lunch
  14:00-17:30  Visit to the University

Thursday, June 26
  9:00-10:30   Team Work, G. Dorffner (DEI)
  10:30-11:00  Coffee break
  11:00-12:30  Team Work, G. Dorffner (DEI)
  12:30-14:00  Lunch
  14:00-15:30  Team Work, R. Babuska (DEI)
  15:30-16:00  Coffee break
  16:00-17:30  Team Work, R. Babuska (DEI)

Friday, June 27
  9:00-10:30   Team Work, C. Fonseca (DEI)
  10:30-11:00  Coffee break
  11:00-12:30  Team Work, C. Fonseca (DEI)
  12:30-14:00  Lunch
  14:00-15:30  Team Work, J. Félix Costa (DEI)
  15:30-16:00  Coffee break
  16:00-17:30  Team Work, J. Félix Costa (DEI); Closing

  20:30        School Dinner
DM -- Departamento de Matemática (Department of Mathematics)
DEI -- Departamento de Engenharia Informática (Department of Informatics Engineering)
SP1-Piero Baraldi; SP2-Rafaelle Giordano; SP3-José Ramos; SP4-Luís Mujica; SP5-Peter Posík
SP6-Helder Pinho; SP7-Gonçalo Silva; SP8-André Ribeiro; SP9-Amândio Marques

 

Each lecture will last one and a half hours, with a short break between lectures. Each team will have a lecturer assigned to it to provide help if needed. The results of each group will take the form of a written document, which may serve as the basis for a paper to be submitted to an international conference.

Lecture 1A: 
Neural Computation and Applications in Time Series and Signal Processing.
  • Speaker: Georg Dorffner
  • Synopsis: The main architectures for neural computation will be reviewed. The particular architectures used for time series prediction and signal processing will then be studied (a small illustrative sketch follows below).
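
As a rough illustration of the kind of model this lecture will review (not part of the course materials), the sketch below predicts the next sample of a noisy sine wave with a sliding-window feed-forward network trained by plain gradient descent. All sizes, the learning rate and the variable names are illustrative assumptions.

import numpy as np

# One-step-ahead prediction of a noisy sine wave with a sliding window
# of past samples fed to a one-hidden-layer tanh network.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.05 * rng.standard_normal(1000)

window = 8                                   # number of past samples used as input
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]                          # target: the next sample

hidden = 16
W1 = 0.1 * rng.standard_normal((window, hidden))
b1 = np.zeros(hidden)
W2 = 0.1 * rng.standard_normal(hidden)
b2 = 0.0
lr = 0.01

for epoch in range(200):
    h = np.tanh(X @ W1 + b1)                 # hidden activations
    pred = h @ W2 + b2                       # linear output layer
    err = pred - y                           # prediction error
    # Backpropagation of the mean squared error (full-batch gradient descent).
    grad_W2 = h.T @ err / len(y)
    grad_b2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)
    grad_W1 = X.T @ dh / len(y)
    grad_b1 = dh.mean(axis=0)
    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

pred = np.tanh(X @ W1 + b1) @ W2 + b2
print("final mean squared error:", float(np.mean((pred - y) ** 2)))

A recurrent architecture (see Lecture 2C) would instead carry the temporal context in an internal state rather than in an explicit input window.
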
Lecture 2A: 
Analog Computation
  • Speaker: Félix Costa
  • Synopsis: There will be a short introduction to the new and promising area of analog computation. Some mathematical results about the power of this approach will be presented, as well as their implications for the theory of computation.
Lecture 1B: 
Neuro-Fuzzy Modeling
  • Speaker: R. Babuska
  • Synopsis: Clustering in its several forms will be studied as a means of learning the structure of fuzzy systems from data. Parameter optimisation through neural networks, yielding neuro-fuzzy systems, will be reviewed and experimented with (a small clustering sketch follows below).
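
As a rough, illustrative sketch of the kind of clustering used for structure learning from data (fuzzy c-means here; the lecture may rely on different algorithms, e.g. Gustafson-Kessel clustering), assuming only NumPy:

import numpy as np

def fuzzy_c_means(data, n_clusters=2, m=2.0, n_iter=100, seed=0):
    # data: (n_samples, n_features); m > 1 is the fuzziness exponent.
    rng = np.random.default_rng(seed)
    # Random initial memberships, each row normalised to sum to 1.
    U = rng.random((len(data), n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        # Cluster centres: membership-weighted means of the data.
        centres = (Um.T @ data) / Um.sum(axis=0)[:, None]
        # Euclidean distance of every sample to every centre.
        d = np.linalg.norm(data[:, None, :] - centres[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)                # avoid division by zero
        # Standard membership update: u_ik proportional to d_ik^(-2/(m-1)).
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centres, U

# Toy example: two well-separated 2-D blobs.
rng = np.random.default_rng(1)
points = np.vstack([rng.normal([0.0, 0.0], 0.3, size=(50, 2)),
                    rng.normal([3.0, 3.0], 0.3, size=(50, 2))])
centres, U = fuzzy_c_means(points, n_clusters=2)
print("cluster centres:\n", centres)

Each row of U gives the degree to which a sample belongs to each cluster; in fuzzy modelling such memberships are commonly projected onto the input variables to obtain rule antecedents.
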
Lecture 2B: 
Intelligent Control
  • Speaker: R. Babuska.
  • Synopsis: The main recently developed techniques for advanced intelligent control will be studied and experimented with. On-line learning techniques and the problems of dimensionality will be discussed.
Lecture 1C: 
Multi-criteria Genetic Optimisation
  • Speaker: Carlos Fonseca
  • Synopsis: Genetic optimisation will be reviewed in its general formulation. The particular case of multi-criteria optimisation will then be developed and discussed (see the sketch below).
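
The central notion behind multi-criteria genetic optimisation is Pareto dominance. The following minimal sketch (purely illustrative, not the specific algorithm covered in the lecture) filters a small population down to its non-dominated front; the objective values are made-up numbers.

import numpy as np

def dominates(a, b):
    # True if objective vector a Pareto-dominates b (minimisation):
    # a is no worse than b in every objective and strictly better in at least one.
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a <= b) and np.any(a < b))

def non_dominated(points):
    # Indices of the non-dominated (Pareto-optimal) points.
    return [i for i, p in enumerate(points)
            if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

# Toy bi-objective example: minimise both f1 and f2.
population = np.array([[1.0, 5.0], [2.0, 3.0], [3.0, 4.0], [4.0, 1.0], [2.5, 2.5]])
front = non_dominated(population)
print("Pareto front:\n", population[front])

In a multi-objective genetic algorithm, a ranking derived from this kind of dominance test typically drives selection.
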
Lecture 2C: 
Universal Learning Algorithms and Recurrent Neural Networks
  • Speaker: Juergen Schmidhuber
  • Synopsis: Universal learning algorithms based on the theory of universal induction and Kolmogorov complexity, with applications.

Recurrent Neural Networks

RNNs are artificial neural networks with adaptive feedback connections. From training examples they can learn to map input sequences to output sequences, and they can implement almost arbitrary sequential behavior. RNNs are biologically more plausible and computationally more powerful than other adaptive models such as Hidden Markov Models (no continuous internal states), feed-forward networks and Support Vector Machines (no internal states at all).
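
To make the feedback connections described above concrete, here is a minimal NumPy sketch of an RNN forward pass that maps an input sequence to an output sequence through a persistent hidden state. All dimensions, weights and names are illustrative assumptions; training (e.g. by backpropagation through time) is omitted.

import numpy as np

# Minimal recurrent forward pass: the hidden state h carries information
# from earlier inputs forward in time via the feedback matrix W_hh.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 8, 2

W_xh = 0.1 * rng.standard_normal((n_in, n_hidden))      # input  -> hidden
W_hh = 0.1 * rng.standard_normal((n_hidden, n_hidden))  # hidden -> hidden (feedback)
W_hy = 0.1 * rng.standard_normal((n_hidden, n_out))     # hidden -> output
b_h = np.zeros(n_hidden)
b_y = np.zeros(n_out)

def rnn_forward(xs):
    # Map an input sequence of shape (T, n_in) to an output sequence (T, n_out).
    h = np.zeros(n_hidden)          # internal state, absent in feed-forward nets
    ys = []
    for x in xs:
        h = np.tanh(x @ W_xh + h @ W_hh + b_h)
        ys.append(h @ W_hy + b_y)
    return np.array(ys)

sequence = rng.standard_normal((5, n_in))   # toy input sequence of length 5
print(rnn_forward(sequence).shape)          # -> (5, 2)

The hidden state h is exactly the internal state that feed-forward networks and Support Vector Machines lack.
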


Pedro Quaresma de Almeida
Departamento de Matemática
Faculdade de Ciências e Tecnologia
Universidade de Coimbra
3000 COIMBRA, PORTUGAL
softcomplex@hilbert.mat.uc.pt