Employer
- Radboud University
- Delft University of Technology (TU Delft)
- Eindhoven University of Technology (TU/e)
- Leiden University
- Maastricht University (UM)
- AMOLF
- Amsterdam UMC
- CWI
- Max Planck Institute (MPI)
- University of Amsterdam (UvA)
- University of Groningen
-
addresses this challenge in two ways: We investigate the fundamental neural mechanisms that control movement. We explore engineering-based solutions to restore function when these pathways are disrupted. As a
-
temporarily, as needed. The goal of this project is to advance the understanding of how working memory is implemented in the human brain. To this end, the main objective is to develop a neural
-
to work on the design and implementation of Oscillatory Neural Networks (ONNs) for physics-based computing applications. You as the candidate will be an integral part of the prestigious NWO AiNED AI-on-ONN
-
around us? At Maastricht University, you will investigate how individuals differ in predictive processing by combining behavioural and neural testing with computational modelling. Together with colleagues
-
transparent and intelligible. Although explainable AI methods can shed some light on the inner workings of black-box machine learning models such as deep neural networks, they have severe drawbacks and limitations. The field of interpretable machine learning aims to fill this gap by developing interpretable
-
role of neural rhythms for inter-area brain network communication PhD 2: The neural code for multi-item representation in working memory PhD 3: The dynamic interplay between brain and bodily rhythms in
-
inductive biases, we aim to identify key mechanisms that drive rapid learning in the visual system. The goal is to create a robust mechanistic neural network model of the visual system that not only mimics