- of computational scaling and efficiency for large-scale HPC environments. Strong background in materials science with an emphasis on phase transformations and mechanical behavior of structural materials. Familiarity
- to contribute to the development of alloys with desirable advances in mechanical properties, thermal/electrical properties, and processability. A background in solidification processing, high pressure die casting
- Requisition Id 15603 Overview: The National Center for Computational Sciences (NCCS) at the Oak Ridge National Laboratory (ORNL) is seeking a postdoctoral research associate in the area of HPC
- .) to enable real-time process monitoring of the Directed Energy Deposition (DED) printing process. A background in sensors, instrumentation, and data analytics is preferred. Strong communication and program
- , demand-flexible, and affordable buildings for the DOE Building Technologies Office (BTO), the Federal Energy Management Program (FEMP), and the Office of State and Community Energy Programs (SCEP). Major Duties
- , water, carbon, and materials productivity throughout the U.S. economy and to identify opportunities for improvement. Through the Industrial Energy Efficiency Program, the MEERA Group develops a diverse
- materials that may serve as model systems displaying quantum behaviors. It will also provide opportunities for collaboration with quantum computing efforts within the Quantum Science Center, guiding and
- companies participating in the DOE’s Better Plants program. Deliver ORNL’s mission by aligning behaviors, priorities, and interactions with our core values of Impact, Integrity, Teamwork, Safety, and Service
- program. Deliver ORNL’s mission by aligning behaviors, priorities, and interactions with our core values of Impact, Integrity, Teamwork, Safety, and Service. Promote equal opportunity by fostering a
- such as quantization, model pruning, and approximate attention (linear and sparse), and proposing new mechanisms for tackling speed, accuracy, and energy issues for large language model (LLM) inference
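
As a point of reference for one of the techniques named in the last posting, the following is a minimal, illustrative sketch of post-training symmetric int8 weight quantization in Python with NumPy; the function names and the toy weight matrix are hypothetical and are not drawn from the posting itself.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor int8 quantization: map floats onto [-127, 127]."""
    scale = max(float(np.max(np.abs(weights))) / 127.0, 1e-12)  # guard against all-zero weights
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize_int8(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float32 tensor from the int8 values."""
    return q.astype(np.float32) * scale

# Toy check: quantize a random weight matrix and measure reconstruction error.
w = np.random.randn(4, 8).astype(np.float32)
q, scale = quantize_int8(w)
print("max abs error:", float(np.max(np.abs(w - dequantize_int8(q, scale)))))
```

Production LLM inference stacks typically use per-channel or per-group scales and fused low-precision kernels rather than a single per-tensor scale, but the round-and-rescale idea sketched here is the same.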