
PMotion: An advanced markerless pose estimation approach based on novel deep learning framework used to reveal neurobehavior

Document details

Resource type:
WOS system:
Pubmed system:

Indexed in: SCIE

Affiliations: [1]Department of Neurology, Xuanwu Hospital, Capital Medical University, Beijing 100053, China. [2]Beijing Advanced Innovation Center for Intelligent Robot and System, Beijing Institute of Technology, Beijing 100871, China. [3]Department of Neurosurgery, Xuanwu Hospital, Capital Medical University, Beijing 100053, China. [4]Clinical Research Center for Epilepsy, Capital Medical University, Beijing 100053, China. [5]Beijing Municipal Geriatric Medical Research Center, Beijing 100053, China.
Source:
ISSN:

Keywords: PMotion; deep learning framework; pose estimation; limb motion; motion trajectory maps

Abstract:
The evaluation of animal motion behavior plays a vital role in neuromuscular biomedical research and clinical diagnostics, as it reflects changes caused by neuromodulation or neural damage. Existing animal pose estimation methods, however, are unreliable, impractical, and inaccurate. Data augmentation (random scaling, random-standard-deviation Gaussian blur, random contrast, and random uniform color quantization) is adopted to augment the image dataset. For key-point recognition, we present a novel, efficient convolutional deep learning framework (PMotion), which combines a modified ConvNext using multi-kernel feature fusion with a self-defined stacked Hourglass block using the SiLU activation function. PMotion predicts the key points of unmarked animal body joints in real time with high spatial precision. Gait quantification (step length, step height, and joint angle) was performed to study lateral lower-limb movements of rats on a treadmill. The accuracy of PMotion on the rat joint dataset improved by 1.98, 1.46, and 0.55 pixels over DeepPoseKit, DeepLabCut, and stacked Hourglass, respectively. The approach may also be applied to neurobehavioral studies of freely moving animals in challenging environments (e.g., Drosophila melanogaster and openfield-Pranav) with high accuracy. © 2023 IOP Publishing Ltd.
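The four augmentations the abstract names, plus the SiLU activation it attributes to the Hourglass block, can be sketched in plain NumPy. This is a minimal illustration, not the paper's implementation: the function names, parameter ranges (scale 0.8–1.2, σ ≤ 2.0, 8/16/32 quantization levels), and the grayscale-only simplification are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_scale(img, lo=0.8, hi=1.2):
    """Nearest-neighbour rescale by a random factor (range is an assumption)."""
    s = rng.uniform(lo, hi)
    h, w = img.shape
    rows = np.clip((np.arange(int(h * s)) / s).astype(int), 0, h - 1)
    cols = np.clip((np.arange(int(w * s)) / s).astype(int), 0, w - 1)
    return img[rows][:, cols]

def random_gaussian_blur(img, max_sigma=2.0):
    """Separable Gaussian blur with a random standard deviation."""
    sigma = rng.uniform(0.1, max_sigma)
    radius = int(3 * sigma) + 1
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"),
                              1, img.astype(float))
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"),
                               0, out)

def random_contrast(img, lo=0.7, hi=1.3):
    """Scale pixel deviations from the mean by a random factor, clipped to [0, 255]."""
    a = rng.uniform(lo, hi)
    m = img.mean()
    return np.clip(m + a * (img.astype(float) - m), 0.0, 255.0)

def random_quantize(img, levels=(8, 16, 32)):
    """Uniform color quantization to a randomly chosen number of levels."""
    n = rng.choice(levels)
    step = 256.0 / n
    return np.floor(img / step) * step + step / 2

def silu(x):
    """SiLU (swish) activation: x * sigmoid(x)."""
    return x / (1.0 + np.exp(-x))
```

In practice, each training image would be passed through a random subset of these transforms (with matching coordinate adjustments for the key-point labels under scaling) before being fed to the network.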

Funding:
Language:
WOS:
PubmedID:
CAS (Chinese Academy of Sciences) division:
Publication year [2022] edition:
Major category | Zone 2 Engineering & Technology
Subcategories | Zone 3 Neuroscience; Zone 3 Engineering: Biomedical
Latest [2023] edition:
Major category | Zone 3 Medicine
Subcategories | Zone 3 Engineering: Biomedical; Zone 3 Neuroscience
JCR quartile:
Publication year [2021] edition:
Q2 ENGINEERING, BIOMEDICAL; Q2 NEUROSCIENCES
Latest [2023] edition:
Q2 ENGINEERING, BIOMEDICAL; Q2 NEUROSCIENCES

Impact factor: Latest [2023 edition] | Latest 5-year average | Publication year [2021 edition] | Publication-year 5-year average | Year before publication [2020 edition] | Year after publication [2022 edition]

First author:
First author affiliations: [1]Department of Neurology, Xuanwu Hospital, Capital Medical University, Beijing 100053, China. [2]Beijing Advanced Innovation Center for Intelligent Robot and System, Beijing Institute of Technology, Beijing 100871, China.
Co-first authors:
Corresponding author:
Corresponding author affiliations: [3]Department of Neurosurgery, Xuanwu Hospital, Capital Medical University, Beijing 100053, China. [4]Clinical Research Center for Epilepsy, Capital Medical University, Beijing 100053, China. [5]Beijing Municipal Geriatric Medical Research Center, Beijing 100053, China.
Recommended citation format (GB/T 7714):
APA:
MLA:


Copyright © 2020 Xuanwu Hospital, Capital Medical University. Technical support: Chongqing Juhe Technology Co., Ltd. Address: Xuanwu Hospital, 45 Changchun Street, Xicheng District, Beijing.