At Asphalion, we are strongly committed to driving digitalization and integrating innovative tools into our work. We believe that the thoughtful and responsible use of digital solutions, such as Artificial Intelligence, can significantly improve project outcomes. That’s why we continuously explore new automation strategies to optimize our processes.
Continuing our review of the key authorities shaping the future of AI in healthcare, we turn to the European Medicines Agency (EMA). The EMA recently published important guidance: the “Guiding principles on the use of large language models in regulatory science and for medicines regulatory activities”.
The document highlights both the potential and pitfalls of these tools. LLMs can assist with language generation, summarization, translation, education, coding support, and data-driven automation. At the same time, they carry risks such as inaccuracies, privacy concerns, and ethical implications like bias or misinformation.
The EMA emphasizes that users must understand the tools they work with, avoid sharing sensitive information, verify outputs carefully, and stay informed about evolving best practices. Organizations, in turn, are encouraged to define clear governance policies, train their teams, and actively monitor how these technologies are used.
Read the whole document here: https://bit.ly/4lcqQlo
If you require any help, you can contact us at: [email protected]