-
Machine Learning components of the CONVERGE project toolset.
- Assist in executing integration tests across different hardware and software modules.
- Contribute to the structured collection and
-
PowerFactory.
• Develop and implement the interface between the test network (simulation model or real-network replica) in DIgSILENT PowerFactory and the SCADA software for real-time measurement acquisition
-
software solutions for precise energy measurement and data communication. In this context, the purpose of this Research Initiation Grant (BII) is to:
• Understand the operation and architecture
-
describe processing sequences, based on Web technologies and a scientific workflow system.
- Implement the software module and its documentation
- Perform usability and validation tests
-
INESC TEC is accepting applications to award 4 Scientific Research Grants - NEXUS - CTM (AE2025-0564)
transmission mechanisms using Regenerative Artificial Intelligence
- Develop a prototype for image transmission based on regenerative AI semantics
- Integrate and test the solution with an emulated HF testbed
-
four years, in the case of students enrolled in a PhD programme.
Scientific advisor: Alípio Jorge
Workplace: INESC TEC, Porto, Portugal
Maintenance stipend: €1,309.64, according to the table of monthly maintenance
-
industrial-grade communication protocols for automation in electric power systems.
- Implement the interface between OPAL-RT and HMI/SCADA software: connecting the real-time model or digital twin with the SCADA
-
The selected candidates will:
- Contribute to the software development, implementation, and validation of methods and tools in the VeriFixer project.
3. BRIEF PRESENTATION OF THE WORK PROGRAMME AND TRAINING
-
visualisation (libraries such as Three.js, OpenGL, VTK, or similar)
- Advanced knowledge of optimisation algorithms
- Previous experience with software development for logistics problems
- In-depth experience
-
:
- Prepare the requirements specification for a software module that allows the use of pre-trained large language models (LLMs)
- Containerize the trained models and make them available
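
For the containerization task above, a minimal sketch of what "containerize and make available" could look like, assuming a Hugging Face model served over HTTP. The model name (`distilgpt2`), the `serve.py` application, the dependency list, and the port are illustrative assumptions, not part of the work programme:

```dockerfile
# Hypothetical sketch: packaging a pre-trained LLM behind an HTTP API.
FROM python:3.11-slim

# Inference and serving dependencies (versions left unpinned for brevity;
# a real requirements file should pin them)
RUN pip install --no-cache-dir transformers torch fastapi uvicorn

WORKDIR /app
# serve.py is an assumed FastAPI app exposing an inference endpoint
COPY serve.py .

# Bake the pre-trained weights into the image at build time so the
# container is self-contained and reproducible at deployment
RUN python -c "from transformers import AutoModelForCausalLM, AutoTokenizer; \
    AutoModelForCausalLM.from_pretrained('distilgpt2'); \
    AutoTokenizer.from_pretrained('distilgpt2')"

EXPOSE 8000
CMD ["uvicorn", "serve:app", "--host", "0.0.0.0", "--port", "8000"]
```

Baking the weights into the image trades image size for reproducibility; mounting a shared model cache as a volume is the usual alternative when images must stay small.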