- acceleration methods such as quantization, pruning, knowledge distillation, and cache tuning; demonstrable results with LLMs or large-scale neural networks; demonstrable familiarity with AI/ML optimization tools
- Elhoseiny, Code: https://github.com/yli1/CLCL; Uncertainty-guided Continual Learning with Bayesian Neural Networks (ICLR’20), Sayna Ebrahimi, Mohamed Elhoseiny, Trevor Darrell, Marcus Rohrbach, Code: https
- (computational linguistics, philosophy, and computer modeling of neural networks and brains). The successful candidate will be expected to teach courses at the graduate and undergraduate levels, mentor graduate
- applications of neural networks to the analysis of multi-omic data, models for predicting phenotypes using genotype data, biological data integration, etc. Participation in these projects will include
- ) within robotics, computer science, data science, or related fields. The candidates are expected to have profound knowledge of the majority of the following topics: machine learning (deep neural networks
- solutions, which typically involve data-driven AI approaches relying on neural networks, reinforcement learning, graph neural networks, etc. In this PhD position, you will build on our expertise in
- Reconstruction Algorithms,” ICASSP 2015. (4) D.M. Pelt and J.A. Sethian, “A mixed-scale dense convolutional neural network for image analysis,” PNAS, January 8, 2019. If interested, please contact: Peter