
An Empirical Investigation of Pre-Trained Deep Learning Model Reuse in the Scientific Process

arXiv – CS AI | Nicholas M. Synovic, Karolina Ryzka, Alessandra V. Vellucci Solari, Kenny Lyons, James C. Davis, George K. Thiruvathukal
🤖 AI Summary

Researchers conducted the first empirical study of how natural scientists reuse pre-trained deep learning models, analyzing 17,511 peer-reviewed papers published from 2000 to 2025. The study found that biochemistry and molecular biology lead in model reuse, that adaptation is the most common reuse pattern, and that reuse primarily affects the testing phase of the scientific process.

Key Takeaways
  • Biochemistry, genetics and molecular biology fields show the highest adoption of pre-trained deep learning model reuse among natural sciences.
  • Adaptation reuse is the most prevalent pattern across all natural science fields studied.
  • The testing stage of the scientific process has been the most affected by pre-trained model integration.
  • Scientists are increasingly leveraging computational methods for high-throughput, data-driven research approaches.
  • The study establishes baseline metrics for understanding how AI models are being integrated into scientific workflows.