Scientists develop a deep learning-based method for biomedical image segmentation
Author: Xiaoke Robot (小柯机器人)
Published: 2020/12/9 13:32:53
The group of Klaus H. Maier-Hein at Heidelberg University, Germany, has developed a deep learning-based method for biomedical image segmentation. The work was published online in Nature Methods on December 7, 2020.
The researchers developed nnU-Net, a deep learning-based segmentation method that automatically configures itself, including preprocessing, network architecture, training, and post-processing, for any new task. The key design choices in this process are modeled as a set of fixed parameters, interdependent rules, and empirical decisions. Without manual intervention, nnU-Net surpasses most existing approaches, including highly specialized solutions, on 23 public datasets used in international biomedical segmentation competitions. The researchers have released nnU-Net as an out-of-the-box tool: because it requires neither expert knowledge nor computing resources beyond standard network training, it makes state-of-the-art segmentation accessible to a broad audience.
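The core idea, distilling configuration into fixed parameters, interdependent rules, and empirical decisions, can be illustrated with a short sketch. The Python snippet below is a hypothetical illustration, not nnU-Net's actual code: the names DatasetFingerprint and configure, and every numeric heuristic in it, are assumptions invented for this example. It only shows how a dataset "fingerprint" plus a hardware budget could deterministically yield a training configuration with no manual tuning.

```python
# Hypothetical sketch of rule-based self-configuration; all names and
# numeric heuristics here are invented for illustration and are NOT
# nnU-Net's actual implementation.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DatasetFingerprint:
    """Properties measured on a new dataset before any training."""
    median_shape: Tuple[int, int, int]          # median image shape (voxels)
    median_spacing: Tuple[float, float, float]  # voxel spacing (mm)
    num_modalities: int
    num_classes: int

# Fixed parameters: design choices that never change across tasks.
FIXED = {"optimizer": "SGD", "initial_lr": 1e-2, "loss": "Dice+CE"}

def configure(fp: DatasetFingerprint, gpu_memory_gb: float = 11.0) -> dict:
    """Derive an interdependent configuration from the fingerprint.

    Mirrors the paper's idea only in spirit: patch size, batch size and
    network depth follow from dataset properties and a hardware budget
    rather than from per-dataset manual tuning.
    """
    # Rule 1: patch size starts from the median shape, capped so one
    # sample fits in GPU memory (128 voxels per axis as a crude cap).
    patch_size: List[int] = [min(s, 128) for s in fp.median_shape]

    # Rule 2: batch size trades off against patch size under the same
    # memory budget (coarse proxy: bytes per voxel times an overhead).
    voxels = patch_size[0] * patch_size[1] * patch_size[2]
    batch_size = max(2, int(gpu_memory_gb * 1e9 / (voxels * 4 * 80)))

    # Rule 3: pool along each axis until feature maps would shrink
    # below 4 voxels, so network depth adapts to the patch size.
    pooling_per_axis = [max(0, (s // 4).bit_length() - 1) for s in patch_size]

    return {**FIXED,
            "patch_size": patch_size,
            "batch_size": batch_size,
            "pooling_per_axis": pooling_per_axis}

if __name__ == "__main__":
    fp = DatasetFingerprint((128, 256, 256), (1.0, 0.8, 0.8),
                            num_modalities=1, num_classes=3)
    print(configure(fp))
```

In nnU-Net itself, analogous rules derive the preprocessing, architecture, and training settings from dataset properties, which, according to the paper, is why no manual intervention is required when moving to a new task.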
According to the paper, biomedical imaging is a driver of scientific discovery and a core component of medical care, and it is being stimulated by the field of deep learning. While semantic segmentation algorithms enable image analysis and quantification in many applications, designing the respective specialized solutions is non-trivial and highly dependent on dataset properties and hardware conditions.
Appendix: original English abstract and article information
Title: nnU-Net: a self-configuring method for deep learning-based biomedical image segmentation
Author: Fabian Isensee, Paul F. Jaeger, Simon A. A. Kohl, Jens Petersen, Klaus H. Maier-Hein
Issue&Volume: 2020-12-07
Abstract: Biomedical imaging is a driver of scientific discovery and a core component of medical care and is being stimulated by the field of deep learning. While semantic segmentation algorithms enable image analysis and quantification in many applications, the design of respective specialized solutions is non-trivial and highly dependent on dataset properties and hardware conditions. We developed nnU-Net, a deep learning-based segmentation method that automatically configures itself, including preprocessing, network architecture, training and post-processing for any new task. The key design choices in this process are modeled as a set of fixed parameters, interdependent rules and empirical decisions. Without manual intervention, nnU-Net surpasses most existing approaches, including highly specialized solutions on 23 public datasets used in international biomedical segmentation competitions. We make nnU-Net publicly available as an out-of-the-box tool, rendering state-of-the-art segmentation accessible to a broad audience by requiring neither expert knowledge nor computing resources beyond standard network training. nnU-Net is a deep learning-based image segmentation method that automatically configures itself for diverse biological and medical image segmentation tasks. nnU-Net offers state-of-the-art performance as an out-of-the-box tool.
DOI: 10.1038/s41592-020-01008-z
Source: https://www.nature.com/articles/s41592-020-01008-z