The US Food and Drug Administration’s (FDA) Center for Drug Evaluation and Research (CDER) is bringing together the agency’s artificial intelligence (AI)-based activities into one domain by establishing an AI council.
CDER Director Patrizia Cavazzoni announced that the AI Council would consolidate the FDA’s three existing AI groups — the CDER AI Steering Committee, AI Policy Working Group and CDER AI Community of Practice.
In an email to FDA staff, Cavazzoni noted that the number of regulatory submissions to CDER with AI components has been rapidly rising, saying, “the scope and impact of AI use in drug development are expanding.”
The AI Council aligns with the federal government’s aim of ensuring the safety and security of machine learning (ML) software through its AI Community of Practice program.
Last year, the government launched the AI Safety Institute under the National Institute of Standards and Technology (NIST) to create standards and technical guidance for biotech R&D and regulatory rulemaking.
The AI Council will serve as an interdisciplinary body, bringing together experts from various fields, including regulatory science, data science, ethics and clinical practice. The council’s primary mission is to develop a framework for the oversight of AI and ML technologies in drug development and healthcare, ensuring they meet the rigorous standards required for therapies and medical devices.
Cavazzoni referred to the AI Council as a “decisional body” that would be responsible for ensuring the consistency of internal and external AI-related activities, such as the standards used to evaluate drug safety, effectiveness and quality.
Internally, the group will advise on the appropriate use of AI within the FDA, such as in talent hiring or in creating algorithms and models for data processing. It will also lead the creation of policy initiatives for incorporating AI into regulatory decision-making.
“While multiple CDER groups have previously been involved in AI initiatives spanning policy, regulatory and technology efforts, the changing external and federal environments for AI have brought new governance needs,” wrote Cavazzoni in the email.
“The advent of novel AI applications, such as generative AI and large language models, is likely to expand opportunities for use of AI within CDER, including by non-technical staff. This will require increased AI education and coordination,” noted Cavazzoni. “There has also been a rapid increase in CDER regulatory submissions incorporating AI, and the scope and impact of AI use in drug development are expanding.”
The council will be co-led by Tala Fakhouri, PhD, associate director for data science and AI policy; Qi Liu, PhD, associate director for innovation and partnership in the Office of Clinical Pharmacology; and Sri Mantha, director of the Office of Strategic Programs. It will include representatives from all CDER offices.
One of the council’s first tasks will be to refine the FDA’s existing regulatory pathways for AI and ML-based products. This includes updating guidelines on premarket submissions, post-market surveillance and continuous learning systems where AI can evolve through real-world data.
Cavazzoni said the new council “will soon conduct an inventory of all CDER AI projects, expand tracking of AI efforts in CDER and establish a centralized process through which offices will be required to seek consultation and support for deployment of all future AI projects” while aiming to offer a consistent voice in the drug center’s communications to the public and the industry on AI issues.
She also said the council will carry out “a landscape assessment to better understand CDER staff’s knowledge of and application of AI in their daily work.”
One of the most significant challenges the FDA’s AI Council will face is balancing the need to foster innovation with the imperative to protect public health. The rapid pace of AI development means that regulatory frameworks must be flexible and adaptive, allowing for the safe and timely introduction of new technologies. At the same time, the council must ensure that these innovations do not bypass critical safety checks.