
Invited talk: Representation learning with trainable COSFIRE filters

19 October 2021 | 10:00 | CYENS co-working space
Abstract:  

In order to be effective, traditional pattern recognition methods typically require careful manual design of features, involving considerable domain knowledge and effort by experts. The effectiveness and popularity of convolutional neural networks (CNNs) and deep learning are largely due to the automatic configuration of effective feature extractors in the intermediate CNN representations. The downside is that a CNN requires a huge number of training examples.
 
To detect a specific pattern that does not belong to any of the classes learned by a CNN, the network needs to be fine-tuned. Furthermore, a CNN can only be applied to images of a fixed format, and larger images need to be analysed in a sliding-window mode.
 
Trainable COSFIRE filters are an alternative to CNNs in applications with a small number of training examples. COSFIRE stands for Combinations of Shifted Filter Responses. Their design was inspired by the function of certain shape-selective neurons in area V4 of visual cortex. A COSFIRE filter is configured by the automatic analysis of a single training pattern. The filter response is computed as a combination of the responses of simpler filters, such as Difference-of-Gaussians or Gabor filters, taken at different positions of the pattern concerned. Recently, we extended this approach by defining contributing filters using intermediate representations in a pre-trained CNN. The identification of the parameters of the contributing filters that are needed for a specific training pattern, and of the positions at which their responses need to be taken, is done automatically. Once configured, a COSFIRE filter can be applied to an image of any size to detect the pattern of interest. A simplified sketch of this idea is given below.
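
To make the "combination of shifted filter responses" idea concrete, the following Python sketch builds a COSFIRE-style response from Gabor filters: each contributing response is rectified, blurred to allow positional tolerance, shifted so that its expected offset coincides with the filter centre, and the parts are then combined with a geometric mean. This is a minimal illustrative sketch under assumed parameter values and function names (cosfire_response, the tuple format), not the speaker's reference implementation.

# Simplified COSFIRE-style response (illustrative sketch, not the official code).
import numpy as np
from skimage.filters import gabor
from scipy.ndimage import gaussian_filter, shift

def cosfire_response(image, tuples, sigma0=0.5, alpha=0.1):
    """Combine shifted Gabor responses; each tuple is (frequency, theta, rho, phi),
    where (rho, phi) is the polar offset at which the response is expected."""
    parts = []
    for (freq, theta, rho, phi) in tuples:
        # Response of the contributing Gabor filter.
        r, _ = gabor(image, frequency=freq, theta=theta)
        r = np.maximum(r, 0)                       # half-wave rectification
        # Blur increases with the offset rho to tolerate small deformations.
        r = gaussian_filter(r, sigma=sigma0 + alpha * rho)
        # Shift so the response expected at offset (rho, phi) moves to the centre.
        dx, dy = rho * np.cos(phi), rho * np.sin(phi)
        r = shift(r, (-dy, -dx), order=1, mode='nearest')
        parts.append(r)
    # Geometric mean: all contributing parts must respond for the filter to fire.
    return np.prod(np.stack(parts), axis=0) ** (1.0 / len(parts))

# Toy usage with two hypothetical contributing Gabor filters:
# response_map = cosfire_response(img, [(0.2, 0.0, 5, np.pi / 2), (0.2, np.pi / 2, 5, 0.0)])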
 
This approach is illustrated with various applications. For visual pattern classification it yields performance results that are comparable to the best results obtained with CNNs. The trainable COSFIRE filter approach offers advantages over CNNs mainly in applications where a specific pattern (one that does not belong to any of the classes learned by a CNN) needs to be detected in an image of arbitrary size.
 
Bio sketch:
 
Nicolai Petkov has been professor of computer science, holding the chair in Intelligent Systems and Parallel Computing at the University of Groningen, since 1991.
 
He received his doctoral degree from the Dresden University of Technology in Germany. After graduation he worked at several universities, and in 1991 he was appointed professor of Computer Science at the University of Groningen. To date, he has supervised the PhD theses of 37 scientists. He was scientific director of the Institute for Mathematics and Computer Science (now the Bernoulli Institute) from 1998 to 2009.
 
Nicolai Petkov is associate editor of several scientific journals (e.g. Image and Vision Computing). He co-organised and co-chaired several conferences in the series CAIP (Computer Analysis of Images and Patterns: 2003, 2009, 2015, 2021), BrainComp (Brain-Inspired Computing: 2013, 2015, 2017, 2019) and APPIS (Applications of Intelligent Systems: 2018, 2019, 2020).
 
His research interests are in the development of pattern recognition and machine learning algorithms for various applications. Recently, he has focused his research on financial data.