-
OBJECTIVES | FUNCTIONS The candidate will work on the development of novel approaches for clinical natural language processing, specifically looking at: (a) the use of language models for coding multilingual
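As a rough illustration of what coding multilingual clinical text with a language model could look like, the sketch below classifies a clinical note with a pretrained text-classification pipeline; the checkpoint name, label set, and example note are assumptions for illustration only, not part of the posting.

```python
# Hedged sketch: assign a diagnosis code to a multilingual clinical note with a
# pretrained text-classification model. The checkpoint name is a placeholder;
# a real project would fine-tune a multilingual encoder on ICD-coded notes.
from transformers import pipeline

coder = pipeline(
    "text-classification",
    model="my-org/multilingual-clinical-coder",  # hypothetical fine-tuned model
)

# Portuguese note: "patient with chest pain and dyspnoea, suspected acute MI".
note = "Doente com dor torácica e dispneia, suspeita de enfarte agudo do miocárdio."
prediction = coder(note)[0]
print(prediction["label"], round(prediction["score"], 3))
```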
-
, as well as predictive models for context-aware forecasting to guide tactical and strategic planning. (ii) to develop a Node.js–based analytics tool for supporting the management of donations and
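For the forecasting part of the objective, a hedged sketch of context-aware demand forecasting is shown below (in Python rather than Node.js, with a synthetic series; the lag and calendar features are illustrative assumptions, not the project's model).

```python
# Not from the posting: a minimal "context-aware" forecasting sketch in which
# calendar context (weekday, month) is fed alongside lagged demand to a
# standard regressor. The synthetic data and feature choices are assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
dates = pd.date_range("2023-01-01", periods=400, freq="D")
demand = 100 + 20 * (dates.dayofweek >= 5) + rng.normal(0, 5, len(dates))

df = pd.DataFrame({"demand": demand}, index=dates)
df["lag7"] = df["demand"].shift(7)   # demand one week earlier
df["weekday"] = df.index.dayofweek   # calendar context
df["month"] = df.index.month
df = df.dropna()

train, test = df.iloc[:-30], df.iloc[-30:]
features = ["lag7", "weekday", "month"]
model = GradientBoostingRegressor().fit(train[features], train["demand"])
pred = model.predict(test[features])
print("MAE over the last 30 days:", np.abs(pred - test["demand"]).mean().round(2))
```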
-
demand for elastic and scalable cloud resources keeps growing, as evidenced by the emergence of popular cloud computing models such as Serverless. Since this demand cannot be met with existing
-
approaches for the integration of cancer multi-omics data and the analysis of CRISPR-based screens. Responsibilities include designing bioinformatics workflows, processing molecular data, training models, and
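As an illustration of the screen-analysis responsibility (not the project's actual workflow), a common first computation in a pooled CRISPR screen is a normalised per-guide log2 fold-change between treatment and control counts; the guide names and counts below are invented.

```python
# Illustrative sketch only: per-guide log2 fold-changes from raw guide counts,
# followed by a simple gene-level summary. Column names and values are made up.
import numpy as np
import pandas as pd

counts = pd.DataFrame({
    "guide": ["sgGENE1_1", "sgGENE1_2", "sgCTRL_1"],
    "gene":  ["GENE1", "GENE1", "CTRL"],
    "control":   [850, 920, 1000],
    "treatment": [120, 95, 980],
})

# Normalise to counts per million, add a pseudocount, then take log2 ratios.
for col in ("control", "treatment"):
    counts[col + "_cpm"] = counts[col] / counts[col].sum() * 1e6
counts["lfc"] = np.log2((counts["treatment_cpm"] + 1) / (counts["control_cpm"] + 1))

# Gene-level score: median log2 fold-change across guides targeting the gene.
print(counts.groupby("gene")["lfc"].median())
```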
-
the ability of existing vision and language models to reason using language and visual inputs. Particular attention will be given to studying approaches to reasoning in continuous latent spaces and to the issue
-
of the Project GROW-LC - I3AC003101, supported by INESC-ID, is now available under the following conditions: OBJECTIVES | FUNCTIONS The main objective is to develop a model capable of performing a statistical
-
implement scenarios of human-robot interaction using Large Language Models to engage in conversation with victims and first-responders. BINDING LEGISLATION Law 40/2004 of 18th of August (Scientific Research
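A minimal sketch of such an interaction scenario is given below, assuming an OpenAI-compatible chat endpoint; the model name and system prompt are placeholders rather than the project's design.

```python
# Hedged sketch: a turn-based dialogue loop in which an LLM plays the robot's
# conversational role. Assumes an OpenAI-compatible endpoint; the model name
# and system prompt are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

history = [{
    "role": "system",
    "content": ("You are a search-and-rescue robot. Speak calmly, ask short "
                "questions to assess the person's condition and location, and "
                "relay critical details to first-responders."),
}]

while True:
    user_turn = input("person> ")
    if not user_turn:
        break
    history.append({"role": "user", "content": user_turn})
    reply = client.chat.completions.create(model="gpt-4o-mini", messages=history)
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    print("robot>", answer)
```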
-
information to be extracted, the system should generate an XQuery program to perform the extraction. We will leverage Large Language Models and explore various prompting strategies to determine the most
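The generation step could look roughly like the sketch below; the prompt wording, model name, and sample document are assumptions used only to illustrate one possible prompting strategy.

```python
# Hedged sketch of the generation step only: ask an LLM to emit an XQuery
# program for a user-described extraction task. Prompt, model, and document
# are illustrative assumptions.
from openai import OpenAI

client = OpenAI()

document_snippet = "<catalog><book year='2003'><title>XQuery Kick Start</title></book></catalog>"
task = "Return the titles of all books published after 2000."

prompt = (
    "You write XQuery 3.1. Reply with a single XQuery program and nothing else.\n"
    f"Sample of the XML to query:\n{document_snippet}\n"
    f"Extraction task: {task}\n"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
    temperature=0,  # deterministic output eases comparison of prompting strategies
)
xquery_program = response.choices[0].message.content
print(xquery_program)
```

In practice, candidate programs would then be executed against the source documents and their extractions compared, so that different prompting strategies can be evaluated against each other.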
-
the following conditions: OBJECTIVES | FUNCTIONS The UNIFY project consortium is developing an accelerated multi-core computing system, composed of processors and accelerators, based on the RISC-V instruction set
-
to mitigate the risk of triggering anti-scraping mechanisms. To achieve this, the student will first establish a base deployment system for running the generated scrapers and storing their collected data and
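As a hedged sketch of what such a base deployment could involve, the snippet below fetches pages with a randomised delay and an identifiable User-Agent, and stores the raw responses in SQLite; the target URLs, contact address, and table layout are placeholders.

```python
# Illustrative sketch, not the project's system: fetch pages politely (delay
# with jitter, honest User-Agent) and persist raw responses in SQLite.
import random
import sqlite3
import time

import requests

db = sqlite3.connect("scraped.db")
db.execute("CREATE TABLE IF NOT EXISTS pages (url TEXT PRIMARY KEY, status INTEGER, body TEXT)")

session = requests.Session()
session.headers["User-Agent"] = "research-scraper/0.1 (contact: student@example.org)"

urls = ["https://example.org/page1", "https://example.org/page2"]  # placeholder targets
for url in urls:
    resp = session.get(url, timeout=10)
    db.execute("INSERT OR REPLACE INTO pages VALUES (?, ?, ?)", (url, resp.status_code, resp.text))
    db.commit()
    # Randomised delay between requests lowers the chance of tripping rate limits.
    time.sleep(2 + random.random() * 3)
```

Typical next additions once this base is in place are caching of already-fetched URLs, respect for robots.txt, and per-domain rate limits.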