Computer science
Visualization
Overlay
Computer vision
Workflow
Pipeline (software)
Pixel
Artificial intelligence
Computer graphics (images)
Virtual reality
Augmented reality
Database
Programming language
Authors
Zaid Abbas Al‐Sabbag, Chul Min Yeum, Sriram Narasimhan
Identifier
DOI: 10.1016/j.aei.2021.101473
Abstract
In this study, a new visual inspection method that can interactively detect and quantify structural defects using an Extended Reality (XR) device (headset) is proposed. The XR device, which is at the core of this method, supports an interactive environment using a holographic overlay of graphical information on the spatial environment and physical objects being inspected. By leveraging this capability, a novel XR-supported inspection pipeline, called eXtended Reality-based Inspection and Visualization (XRIV), is developed. Key tasks supported by this method include detecting visual damage from sensory data acquired by the XR device, estimating its size, and visualizing (overlaying) information on the spatial environment. The crucial step of real-time interactive segmentation—detection and pixel-wise damage boundary refinement—is achieved using a feature Back-propagating Refinement Scheme (f-BRS) algorithm. Then, a ray-casting algorithm is applied to back-project the 2D image pixel coordinates of the damage region to their 3D world coordinates for damage area quantification in real-world (physical) units. Finally, the area information is overlaid and anchored to the scene containing damage for visualization and documentation. The performance of XRIV is experimentally demonstrated by measuring surface structural damage of an in-service concrete bridge with less than 10% errors for two different test cases, and image processing latency of 2–3 s (or 0.5 s per seed point) from f-BRS. The proposed XRIV pipeline underscores the advantages of real-time interaction between expert users and the XR device through immersive visualization so that a human–machine collaborative workflow can be established to obtain better inspection outcomes in terms of accuracy and robustness.
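The ray-casting step described in the abstract can be illustrated with a minimal sketch: 2D pixel coordinates on a damage boundary are back-projected through a pinhole camera model onto a surface, and the enclosed physical area is computed from the resulting 3D polygon. This is not the authors' XRIV implementation or the f-BRS segmentation; the intrinsics, camera pose, plane (standing in for the XR device's spatial mesh), and pixel boundary below are illustrative assumptions.

```python
# Minimal sketch of back-projecting a 2D damage boundary to 3D and measuring its area.
# All numerical values (intrinsics, pose, plane, pixel boundary) are assumptions.
import numpy as np

K = np.array([[1400.0, 0.0, 960.0],        # assumed pinhole intrinsics
              [0.0, 1400.0, 540.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                              # assumed camera-to-world rotation
t = np.zeros(3)                            # assumed camera position in the world frame
plane_point = np.array([0.0, 0.0, 2.0])    # a point on the inspected surface (2 m ahead)
plane_normal = np.array([0.0, 0.0, -1.0])  # surface normal facing the camera

def backproject(pixel_uv):
    """Cast a ray through a pixel and intersect it with the surface plane."""
    uv1 = np.array([pixel_uv[0], pixel_uv[1], 1.0])
    ray_dir = R @ (np.linalg.inv(K) @ uv1)          # ray direction in the world frame
    ray_dir /= np.linalg.norm(ray_dir)
    s = np.dot(plane_point - t, plane_normal) / np.dot(ray_dir, plane_normal)
    return t + s * ray_dir                          # 3D intersection point

def polygon_area(points_3d):
    """Area of a planar 3D polygon via the cross-product (shoelace) formula."""
    total = np.zeros(3)
    for i in range(len(points_3d)):
        total += np.cross(points_3d[i], points_3d[(i + 1) % len(points_3d)])
    unit_normal = plane_normal / np.linalg.norm(plane_normal)
    return 0.5 * abs(np.dot(total, unit_normal))

# Example: a rectangular damage boundary traced in the image (pixel coordinates).
boundary_px = [(800, 400), (1100, 400), (1100, 700), (800, 700)]
boundary_3d = [backproject(p) for p in boundary_px]
print(f"Estimated damage area: {polygon_area(boundary_3d):.4f} m^2")
```

In the actual pipeline the ray would be intersected with the XR headset's reconstructed spatial mesh rather than a single plane, and the boundary would come from the f-BRS pixel-wise segmentation instead of being hard-coded.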