- … leadership to align Foundation activities to the College's one-college model and priorities. In addition to leading fundraising and outreach, the Chief Development Officer provides leadership for all …
- … Engineer with strong experience in learning-based control and mobile manipulation. The successful candidate will work on algorithm development for contact-rich robotic manipulation, integrating perception …
- … ), preferably via the TUD SecureMail Portal https://securemail.tu-dresden.de by sending it as a single PDF file with the email subject header: Application JRGL-Human Sensory Perception and Action to ceti@tu …
- … between HDR and SDR worlds, ensuring consistent colour and contrast perception across luminance levels. Building on cutting-edge psychophysical models, this research promises to reshape how we see, capture …
- … environment with strong industrial links and a focus on innovation and sustainability. https://www.tum.de/ Chair of Perception for Intelligent Systems (PercInS): PercInS develops AI-driven perception systems …
- Czech Academy of Sciences, Institute of Information Theory and Automation | Czech Republic | about 2 months ago: Skłodowska-Curie Doctoral Network Researcher position in project EXPLORA: Perception of materials, objects and spaces through active EXPLORAtion. Research topic: capture and model changes in material appearance …
- … systems that combine vision-language-action (VLA) modeling, robotic perception and interaction, and autonomous task execution. The Research Engineer will assist in system design, software implementation …
- … intelligence systems that combine vision-language-action (VLA) modeling, robotic perception and interaction, and autonomous task execution. The Senior Research Engineer will play a key role in bridging …
- … or more of the following areas is required: embodied AI and robot learning; vision-language-action (VLA) modeling; perception, control, and interaction for real-world robotic systems; deep learning, multimodal …
- … intelligent robotic behavior across diverse environments. The position involves developing and testing embodied intelligence systems that combine vision-language-action (VLA) modeling, robotic perception and …