
BrainMass: Advancing Brain Network Analysis for Diagnosis With Large-Scale Self-Supervised Learning

Record Details

Resource type:
WOS series:

Indexed in: ◇ SCIE

Institutions: [1]Harbin Inst Technol Shenzhen, Sch Elect & Informat Engn, Shenzhen 518000, Peoples R China [2]Peng Cheng Lab, Shenzhen 518066, Guangdong, Peoples R China [3]Harbin Inst Technol Shenzhen, Shenzhen 518057, Peoples R China [4]Tencent Data Platform, Shenzhen 518057, Peoples R China [5]Chinese Acad Sci, Shenzhen Inst Adv Technol, Paul C Lauterbur Res Ctr Biomed Imaging, Shenzhen 518000, Guangdong, Peoples R China [6]Capital Med Univ, Xuanwu Hosp, Beijing 100053, Peoples R China [7]Harbin Inst Technol Shenzhen, Int Res Inst Artificial Intelligence, Shenzhen 518000, Peoples R China
Source:
ISSN:

Keywords: Brain modeling; Task analysis; Adaptation models; Self-supervised learning; Biological system modeling; Data models; Transformers; brain network transformer; large-scale pretrain

Abstract:
Foundation models pretrained on large-scale datasets via self-supervised learning demonstrate exceptional versatility across various tasks. Because medical data are heterogeneous and hard to collect, this approach is especially beneficial for medical image analysis and neuroscience research, as it streamlines broad downstream tasks without requiring numerous costly annotations. However, brain network foundation models have received little investigation, which limits their adaptability and generalizability for broad neuroscience studies. In this study, we aim to bridge this gap. In particular, 1) we curated a comprehensive dataset by collating images from 30 datasets, comprising 70,781 samples from 46,686 participants. Moreover, we introduce pseudo-functional connectivity (pFC), which generates millions of augmented brain networks by randomly dropping certain timepoints of the BOLD signal; 2) we propose the BrainMass framework for brain network self-supervised learning via mask modeling and feature alignment. BrainMass employs Mask-ROI Modeling (MRM) to bolster intra-network dependencies and regional specificity. Furthermore, a Latent Representation Alignment (LRA) module regularizes augmented brain networks of the same participant, which share similar topological properties, to yield similar latent representations by aligning their latent embeddings. Extensive experiments on eight internal tasks and seven external brain disorder diagnosis tasks show BrainMass's superior performance, highlighting its significant generalizability and adaptability. Moreover, BrainMass demonstrates powerful few/zero-shot learning abilities and provides meaningful interpretations for various diseases, showcasing its potential for clinical applications.
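The pFC augmentation described in the abstract (randomly dropping timepoints of the BOLD signal and recomputing functional connectivity) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name `pseudo_functional_connectivity` and the `drop_ratio` parameter are assumptions for demonstration.

```python
import numpy as np

def pseudo_functional_connectivity(bold, drop_ratio=0.2, seed=None):
    """Sketch of pFC augmentation (illustrative, not the paper's code).

    bold: array of shape (n_rois, n_timepoints) holding ROI-averaged
    BOLD time series. A random fraction `drop_ratio` of timepoints is
    dropped, and a Pearson-correlation connectivity matrix is computed
    from the remaining signal.
    """
    rng = np.random.default_rng(seed)
    n_rois, n_t = bold.shape
    n_keep = int(n_t * (1.0 - drop_ratio))
    keep = rng.choice(n_t, size=n_keep, replace=False)
    keep.sort()  # preserve temporal order of the retained timepoints
    return np.corrcoef(bold[:, keep])  # (n_rois, n_rois), symmetric

# Toy example: 4 ROIs, 100 timepoints; each call with a different seed
# yields a different augmented brain network from the same scan.
bold = np.random.default_rng(0).standard_normal((4, 100))
pfc_a = pseudo_functional_connectivity(bold, drop_ratio=0.2, seed=1)
pfc_b = pseudo_functional_connectivity(bold, drop_ratio=0.2, seed=2)
```

Because each random drop pattern produces a distinct yet topologically similar connectivity matrix, repeated calls generate the many augmented networks per participant that the LRA module aligns during pretraining.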

Funding:
Language:
Citation count:
WOS:
PubMed ID:
CAS (Chinese Academy of Sciences) ranking:
Publication year [2023] edition:
Major category | Zone 1 Medicine
Subcategories | Zone 1 Computer Science: Interdisciplinary Applications; Zone 1 Engineering: Biomedical; Zone 1 Engineering: Electrical & Electronic; Zone 1 Imaging Science & Photographic Technology; Zone 1 Nuclear Medicine
Latest [2025] edition:
Major category | Zone 1 Medicine
Subcategories | Zone 1 Computer Science: Interdisciplinary Applications; Zone 1 Engineering: Biomedical; Zone 1 Engineering: Electrical & Electronic; Zone 1 Imaging Science & Photographic Technology; Zone 1 Nuclear Medicine
JCR quartiles:
Publication year [2022] edition:
Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS; Q1 ENGINEERING, BIOMEDICAL; Q1 ENGINEERING, ELECTRICAL & ELECTRONIC; Q1 IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY; Q1 RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING
Latest [2024] edition:
Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS; Q1 ENGINEERING, BIOMEDICAL; Q1 ENGINEERING, ELECTRICAL & ELECTRONIC; Q1 IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY; Q1 RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING

Impact factor: Latest [2024 edition] | Latest five-year average | Publication year [2022 edition] | Publication-year five-year average | One year before publication [2021 edition] | One year after publication [2023 edition]

First author:
First author's institutions: [1]Harbin Inst Technol Shenzhen, Sch Elect & Informat Engn, Shenzhen 518000, Peoples R China [2]Peng Cheng Lab, Shenzhen 518066, Guangdong, Peoples R China
Co-first authors:
Corresponding author:
Corresponding author's institutions: [1]Harbin Inst Technol Shenzhen, Sch Elect & Informat Engn, Shenzhen 518000, Peoples R China [2]Peng Cheng Lab, Shenzhen 518066, Guangdong, Peoples R China [7]Harbin Inst Technol Shenzhen, Int Res Inst Artificial Intelligence, Shenzhen 518000, Peoples R China
Recommended citation (GB/T 7714):
APA:
MLA:

Updated: 2025-09-01

Copyright © 2020 Xuanwu Hospital, Capital Medical University. Technical support: Chongqing Juhe Technology Co., Ltd. Address: Xuanwu Hospital, 45 Changchun Street, Xicheng District, Beijing