Abstract
Background: Chest computed tomography (CT) remains the imaging standard for demonstrating cystic fibrosis (CF) airway structural disease in vivo. However, visual scoring systems as an outcome measure are time-consuming, require training and have limited reproducibility. Our objective was to validate a fully automated artificial intelligence (AI)-driven scoring system of CF lung disease severity.
Methods: Data were retrospectively collected in three CF reference centres, between 2008 and 2020, in 184 patients aged 4–54 years. An algorithm using three 2D convolutional neural networks was trained with 78 patients’ CT scans (23 530 CT slices) for the semantic labelling of bronchiectasis, peribronchial thickening, bronchial mucus, bronchiolar mucus and collapse/consolidation. 36 patients’ CT scans (11 435 CT slices) were used for testing versus ground-truth labels. The method’s clinical validity was assessed in an independent group of 70 patients with or without lumacaftor/ivacaftor treatment (n=10 and n=60, respectively) with repeat examinations. Similarity and reproducibility were assessed using the Dice coefficient, correlations using the Spearman test, and paired comparisons using the Wilcoxon rank test.
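As an aside for readers unfamiliar with the similarity metric: the Dice coefficient between a predicted and a ground-truth segmentation mask is 2×|A∩B| / (|A|+|B|). The following minimal Python sketch (using NumPy; the function name and toy masks are illustrative, not from the study) shows how such a pixelwise score is typically computed.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity between two binary segmentation masks.

    Dice = 2 * |A intersect B| / (|A| + |B|).
    By convention, two empty masks are treated as perfectly similar (1.0).
    """
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0
    return 2.0 * np.logical_and(pred, truth).sum() / total

# Toy 4x4 "CT slice" masks: 4 pixels overlap out of 5 + 5 labelled pixels
a = np.zeros((4, 4), dtype=bool)
b = np.zeros((4, 4), dtype=bool)
a[0, :4] = True; a[1, 0] = True   # 5 labelled pixels
b[0, :4] = True; b[1, 1] = True   # 5 labelled pixels
print(round(dice_coefficient(a, b), 2))  # → 0.8
```

In practice such a score would be computed per structural label (e.g. bronchiectasis, mucus) over all slices of a scan, then summarised across the test set.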
Results: The overall pixelwise similarity of AI-driven versus ground-truth labels was good (Dice 0.71). All AI-driven volumetric quantifications showed moderate-to-good correlations with visual imaging scores (p<0.001) and fair correlations with forced expiratory volume in 1 s % predicted at pulmonary function testing (p<0.001). Significant decreases in peribronchial thickening (p=0.005), bronchial mucus (p=0.005) and bronchiolar mucus (p=0.007) volumes were measured in patients with lumacaftor/ivacaftor. Conversely, bronchiectasis (p=0.002) and peribronchial thickening (p=0.008) volumes increased in patients without lumacaftor/ivacaftor. Reproducibility was almost perfect (Dice >0.99).
Conclusion: AI allows fully automated volumetric quantification of CF-related modifications over an entire lung. The novel scoring system could provide a robust disease outcome in the era of effective CF transmembrane conductance regulator modulator therapy.
Artificial intelligence allows a fully automated volumetric scoring system of lung structural abnormalities in CF using computed tomography. It could be used as a robust quantitative outcome to assess disease changes in the era of CFTR modulators. https://bit.ly/3hlxmnc
Footnotes
This study is registered at ClinicalTrials.gov with identifier number NCT04760548.
Author contributions: J. Macey, S. Bui and A.S. Brody enrolled patients and assessed clinical data. C.S. Hall wrote the artificial intelligence pipeline. G. Dournes, C.S. Hall, M.M. Willmering, F. Laurent, P. Berger, B. Denis de Senneville, I. Benlala and J.C. Woods performed the data analysis and statistical analyses, and conceived the figures/tables. G. Dournes, C.S. Hall, M.M. Willmering, F. Laurent, P. Berger, B. Denis de Senneville, I. Benlala, A.S. Brody and J.C. Woods wrote the manuscript (with significant contributions from G. Dournes, A.S. Brody and J.C. Woods). All authors read and approved the final manuscript.
Conflict of interest: G. Dournes reports an academic grant to undertake a research programme in the USA from the French Society of Radiology and IdEx Bordeaux, for the submitted work; lecture payments from Margaux Orange, outside the submitted work.
Conflict of interest: C.S. Hall reports grants from Boehringer Ingelheim; lecture payment or honoraria from Boehringer Ingelheim and VIDA Diagnostics, outside the submitted work.
Conflict of interest: M.M. Willmering has nothing to disclose.
Conflict of interest: A.S. Brody has nothing to disclose.
Conflict of interest: J. Macey has nothing to disclose.
Conflict of interest: S. Bui has nothing to disclose.
Conflict of interest: B. Denis de Senneville has nothing to disclose.
Conflict of interest: P. Berger has nothing to disclose.
Conflict of interest: F. Laurent reports technical support to conduct lung magnetic resonance imaging research in cystic fibrosis from Siemens Healthineers, outside the submitted work.
Conflict of interest: I. Benlala has nothing to disclose.
Conflict of interest: J.C. Woods reports investigator-initiated support and consulting fees from Vertex Pharmaceuticals, outside the submitted work.
Support statement: Gael Dournes received academic funding from the IdEx (grant ANR-10-IDEX-03-02) and the French Society of Radiology (grant Alain Rahmouni 2019–2020). Funding information for this article has been deposited with the Crossref Funder Registry.
- Received March 22, 2021.
- Accepted July 2, 2021.
- Copyright © The authors 2022. For reproduction rights and permissions contact permissions{at}ersnet.org