-
expertise in plant sciences, optical technologies, and data processing. The existing imaging Mueller polarimeter, which is sensitive to the microstructural properties of biological tissues, will serve as the
-
• Exploration of dielectric-based resonant and non-resonant nanophotonic architectures compatible with large-area fabrication • Demonstration of prototype metasurface devices combining optical selectivity
-
systems and Python programming - Skills in data processing and image analysis - Oral and written communication skills, ability to facilitate collaboration at the interface between physics, mathematics, and
-
modality for single-molecule localization. The main responsibilities will include: - designing and developing the optical instrumentation, - adapting and extending existing image and data processing tools
-
effects on the performance of an integrated circuit (IC). 2. Identification of aging and/or performance signatures that can be extracted using non-invasive and non-destructive methods, such as imaging
-
fundamental mechanisms of secretion • Develop and implement advanced imaging and cell biology approaches • Analyse quantitative data and interpret results • Communicate findings within the team and at
-
be to quantitatively compare the predictions made by the digital twins of the patients' faces with the post-operative reality. - Collection and processing of post-operative scanner imaging data
-
characterization of C and SiC fibers - Training on bench use (in particular Joule-effect heating techniques, laser diffraction, infrared imaging, pyrometry, preparation of micrometric samples, ...) - Technical
-
origins (bacterial, biological fluids, hybrid nanovectors, etc.). He/she will develop protocols for fluorescent labeling (e.g. DNA-PAINT...), immunolabeling, immunoimmobilization, acquisition, processing
-
in a highly competitive international context. The project aims to prototype energy-efficient solutions that will enable the HL-LHC and SKA to reliably process and analyze the huge volumes of raw data