Can academic medical centers compete in the AI arms race?

In the race to develop artificial intelligence in healthcare, academic medical centers face an uphill climb as they compete with the business community for talent.

Around 65% of people who graduated with a Ph.D. in AI in 2021 took a job in industry, while 28% went into academia, according to an April report from researchers at Stanford University’s Institute for Human-Centered Artificial Intelligence. The percentage of graduates choosing the two career paths was roughly even until 2011.

As the technology has gotten more popular, hiring for AI talent at companies has grown eightfold since 2006 while it’s stagnated in academia, according to a separate study from researchers at the Massachusetts Institute of Technology.

Tech leaders at academic medical centers say the trends favoring industry come at a critical time. The introduction of widespread generative AI large language models in late 2022 has led to a financial arms race in healthcare, with both investors and big tech firms spending big. Academic medical centers, meanwhile, are struggling to hire the AI talent needed to produce unbiased research on the technology’s use in medicine.

There were 22 significant machine learning models produced by industry in 2022 compared with just three produced by academia, according to the Stanford researchers.

“The movement from academia to industry is happening in parallel with the need for more AI computing power in academia,” said Dr. Lloyd Minor, dean of the Stanford School of Medicine in Palo Alto, California. “We don’t have the computing power at Stanford, or really any other single academic institution…that’s being used to develop [Google’s] Bard or [OpenAI’s] ChatGPT.”

In June, Stanford Medicine and the AI institute launched an initiative called Responsible AI for Safe and Equitable Health, which aims to accelerate safe, transparent and ethical AI research in medicine. Stakeholders for the initiative will try to create a repository of AI-generated insights at Stanford and other academic medical centers and develop a structural framework for AI use, Minor said.

The initiative can also act as a convener for academic medical center leaders who are trying to navigate the AI gold rush in healthcare, Minor said.

“I think we have to do a better job of communicating what’s rewarding and fulfilling” about working at an academic medical center, Minor said. “And we also have to constantly be aware that what motivated me when I made the decision many years ago to have a life in academic medical centers and research universities may not be the prime motivator for young people today.”

Minor said to attract that talent, academic medical centers need to be more understanding of the generational differences around flexible working environments and childcare.

Dennis Chornenky, a former White House advisor on AI under former President Donald Trump and President Joe Biden, joined Sacramento-based UC Davis Health in June as its first-ever chief AI advisor. His role includes establishing guardrails around AI adoption for safety and regulatory compliance. He also is looking at different ways to make an academic medical center like UC Davis an attractive landing spot for AI talent.

That includes expanding online educational resources to help people use AI more effectively, setting up an environment for developers to test and evaluate AI tools and forming partnerships with external networks.

“There’s an interesting and transformative capability that’s emerging in healthcare technology that I think we in the healthcare space need to work to make people, especially young people, become more aware of…it’s an opportunity to participate in this tremendous transformation,” Chornenky said.

Some academic medical centers have fewer resources to hire talent and are looking for help from larger institutions. State University of New York Downstate Medical Center in Brooklyn, New York, announced in June that it is working with the University at Albany to develop, evaluate and use AI for mental health.

“They are the AI experts and they are the ones that are going to be developing the technology,” said David Christini, vice president for research at SUNY Downstate Medical Center. “And our subject matter experts will be applying the technology and guiding them in terms of what would be useful.”

Christini isn’t surprised that most people with Ph.D.s in AI are joining companies as he’s seen a similar trend in biomedical sciences. But he noted that academic medical centers, which rely on government funding, offer a commitment to transparency that’s critical to AI development.

“We’re not bottom-line profit driven,” Christini said. “Whenever you have a black box tool such as AI in a commercial environment where people are protecting trade secrets, there can be the risk of things not being necessarily done in a way that is most advantageous to the patients and community.”

AI talent ideally should go to a mix of government and academic organizations and the private sector, said Elad Walach, CEO of AI imaging company Aidoc. The advancements OpenAI has made in generative AI came in large part because it has received billions of dollars in investments from Microsoft and others, he said.

“In my mind, there is definitely room for academia to develop this,” Walach said. “That said, when it comes down to actually making [it] into a real [product] and delivering value, I don’t think there’s any better model than industry.”
