
Minisymposium Presentation

Challenges and Opportunities in Combining LLMs with Conventional Simulation Workflows

Tuesday, June 4, 2024, 12:30 - 13:00 CEST
Climate, Weather and Earth Sciences
Chemistry and Materials
Computer Science and Applied Mathematics
Humanities and Social Sciences
Engineering
Life Sciences
Physics

Description

The effectiveness of AI for a variety of scientific tasks has improved rapidly over the last few years, changing the way we can build scientific workflows on HPC. Our recent work deploying a workflow around a large language model (LLM) for generating protein sequences illustrates many of the resulting opportunities and challenges. Embedding the LLM within a larger protein screening workflow enabled us to target simulations more effectively and find better sequences faster, but this was not easily accomplished with conventional workflow tools. Interleaving AI predictions required expressing dynamic actions within the workflow application; the volume of data moved between workflow tasks required adding a secondary data transfer fabric; and the evolving nature of the tasks deployed on HPC required particular attention to caching elements of workflow tasks. In this presentation, we will discuss how we addressed these and other challenges.
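To make the "dynamic actions" point concrete, the sketch below shows one generic way an LLM-driven steering loop can be interleaved with simulation tasks: the model proposes candidates, simulations score them, and the scores condition the next round of proposals. This is an illustrative pattern only, not the workflow system or code used in the work described; the function names (propose_sequences, score_sequence) and the use of Python's concurrent.futures are assumptions made for the example.

```python
# Illustrative sketch of interleaving LLM proposals with simulation tasks.
# propose_sequences and score_sequence are stand-ins (assumed names), not the
# actual LLM or simulation used in the presented workflow.
import random
from concurrent.futures import ProcessPoolExecutor, as_completed

def propose_sequences(best_so_far, n=4):
    """Stand-in for an LLM call that proposes new protein sequences,
    conditioned on the best-scoring results observed so far."""
    alphabet = "ACDEFGHIKLMNPQRSTVWY"
    return ["".join(random.choices(alphabet, k=20)) for _ in range(n)]

def score_sequence(seq):
    """Stand-in for an expensive simulation that scores one sequence."""
    score = (sum(ord(c) for c in seq) % 100) / 100.0
    return seq, score

def steering_loop(rounds=3, workers=4, keep=8):
    """Dynamic steering: each batch of simulations depends on what the
    model proposes given the results of earlier simulations."""
    best = []
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for _ in range(rounds):
            candidates = propose_sequences(best)
            futures = [pool.submit(score_sequence, s) for s in candidates]
            for fut in as_completed(futures):
                seq, score = fut.result()
                best.append((score, seq))
            best = sorted(best, reverse=True)[:keep]
    return best

if __name__ == "__main__":
    for score, seq in steering_loop():
        print(f"{score:.2f}  {seq}")
```

The key design point the sketch tries to capture is that the task graph is not known up front: which simulations run next depends on model output produced mid-run, which is exactly what static workflow descriptions handle poorly.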

Authors