Deep learning-based workflow for hip joint morphometric parameter measurement from CT images

Keywords: computer science, intraclass correlation, segmentation, landmark, artificial intelligence, workflow, joint, Pearson product-moment correlation coefficient, deep learning, reproducibility, statistics, engineering
Authors
Haoyu Zhai,Jin Huang,Lei Li,Hairong Tao,Jinwu Wang,Kang Li,Mingqi Shao,Xingwang Cheng,Jing Wang,Xiang Wu,Chuan Wu,Xiao Zhang,Hongkai Wang,Yan Xiong
Source
Journal: Physics in Medicine and Biology [IOP Publishing]
Volume/Issue: 68(22): 225003
Identifier
DOI: 10.1088/1361-6560/ad04aa
Abstract

Objective. Precise hip joint morphometry measurement from CT images is crucial for successful preoperative arthroplasty planning and biomechanical simulations. Although deep learning approaches have been applied to clinical bone surgery planning, there is still a lack of relevant research on quantifying hip joint morphometric parameters from CT images.

Approach. This paper proposes a deep learning workflow for CT-based hip morphometry measurement. For the first step, a coarse-to-fine deep learning model is designed for accurate reconstruction of the hip geometry (3D bone models and key landmark points). Based on the geometric models, a robust measurement method is developed to calculate a full set of morphometric parameters, including the acetabular anteversion and inclination, the femoral neck shaft angle and the inclination, etc. Our methods were validated on two datasets with different imaging protocol parameters and further compared with the conventional 2D x-ray-based measurement method.

Main results. The proposed method yields high bone segmentation accuracies (Dice coefficients of 98.18% and 97.85%, respectively) and low landmark prediction errors (1.55 mm and 1.65 mm) on both datasets. The automated measurements agree well with the radiologists' manual measurements (Pearson correlation coefficients between 0.47 and 0.99 and intraclass correlation coefficients between 0.46 and 0.98). This method provides more accurate measurements than the conventional 2D x-ray-based measurement method, reducing the error of acetabular cup size from over 2 mm to less than 1 mm. Moreover, our morphometry measurement method is robust against the error of the previous bone segmentation step. As we tested different deep learning methods for the prerequisite bone segmentation, our method produced consistent final measurement results, with only a 0.37 mm maximum inter-method difference in the cup size.

Significance. This study proposes a deep learning approach with improved robustness and accuracy for pelvis arthroplasty planning.
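The segmentation accuracy reported above is the Dice coefficient, a standard overlap measure between a predicted binary mask and the ground-truth mask. The paper does not publish its evaluation code; the following is a minimal NumPy sketch of the standard Dice formula on toy 1D masks (real use would apply it to 3D CT label volumes).

```python
import numpy as np

def dice_coefficient(pred, gt):
    """Dice similarity between two binary segmentation masks:
    2 * |pred AND gt| / (|pred| + |gt|)."""
    pred = np.asarray(pred, dtype=bool)
    gt = np.asarray(gt, dtype=bool)
    intersection = np.logical_and(pred, gt).sum()
    total = pred.sum() + gt.sum()
    # Convention: two empty masks count as a perfect match.
    return 2.0 * intersection / total if total else 1.0

# Toy 1D masks: 3 overlapping voxels, 4 predicted and 4 true
pred = [1, 1, 1, 1, 0, 0]
gt   = [0, 1, 1, 1, 1, 0]
print(dice_coefficient(pred, gt))  # 0.75
```

A Dice score of 98.18%, as reported for the first dataset, means the predicted bone mask and the radiologist's mask overlap almost completely.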
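The measurement step computes angles such as the femoral neck shaft angle from the predicted 3D landmarks. The paper's exact landmark definitions and code are not given here; the sketch below is an illustrative assumption, computing the neck shaft angle as the angle between a neck axis (neck base to femoral head center) and a distally pointing shaft axis, with made-up landmark coordinates.

```python
import numpy as np

def angle_between(u, v):
    """Angle in degrees between two 3D direction vectors."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def neck_shaft_angle(head_center, neck_base, shaft_proximal, shaft_distal):
    """Hypothetical neck-shaft angle: angle between the neck axis
    (neck base -> head center) and the shaft axis pointing distally."""
    neck_axis = np.asarray(head_center, float) - np.asarray(neck_base, float)
    shaft_axis = np.asarray(shaft_distal, float) - np.asarray(shaft_proximal, float)
    return angle_between(neck_axis, shaft_axis)

# Toy landmark coordinates in mm (not from the paper)
head = [30.0, 0.0, 105.0]   # femoral head center
neck = [10.0, 0.0, 90.0]    # base of the femoral neck
print(round(neck_shaft_angle(head, neck, [0, 0, 90], [0, 0, 0]), 1))  # 126.9
```

With vectors in hand, other reported parameters (acetabular anteversion and inclination) reduce to similar angle computations between landmark-derived axes and anatomical reference planes.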