Software deployment
Gripper
Computer science
Machine vision
Computer vision
Artificial intelligence
Engineering
Mechanical engineering
Software engineering
Authors
L. Yang, Ishara Paranawithana, Kamal Youcef-Toumi, U-Xuan Tan
Source
Journal: IEEE Transactions on Automation Science and Engineering
Publisher: Institute of Electrical and Electronics Engineers
Date: 2017-10-23
Volume/Issue: 15 (4): 1609-1620
Citations: 14
Identifier
DOI: 10.1109/tase.2017.2754517
Abstract
In this paper, an automatic vision-guided micromanipulation approach that facilitates versatile deployment and portable setup is proposed. The work is motivated by the importance of micromanipulation and the limitations of existing automation technology in this field. Despite significant advancements in micromanipulation techniques, there remain bottlenecks in integrating and adopting automation for this application. An underlying reason for these gaps is the difficulty of deploying and setting up such systems. To address this, we identified two important design requirements, namely, portability and versatility of the micromanipulation platform. A self-contained vision-guided approach requiring no complicated preparation or setup is proposed. This is achieved through an uncalibrated, self-initializing workflow algorithm that is also capable of assisted targeting. The feasibility of the solution is demonstrated on a low-cost portable microscope camera and compact actuated microstages. Results suggest subpixel accuracy in localizing the tool tip during the initialization steps. The self-focus mechanism could recover from intentional blurring of the tip by autonomously moving it 95.3% closer to the focal plane. The average visual servoing error is less than a pixel, and the depth compensation mechanism better maintains the similarity score during tracking. The cell detection rate in a 1637-frame video stream is 97.7%, with subpixel localization uncertainty. Our work addresses gaps in existing automation technology for robotic vision-guided micromanipulation and potentially contributes to the way cell manipulation is performed.
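The abstract mentions a self-focus step that moves the tip toward the focal plane and a tracker whose similarity score degrades with defocus. The sketch below is not the authors' implementation; it illustrates, under stated assumptions, two standard building blocks such a pipeline could use: a variance-of-Laplacian focus measure to drive a coarse autofocus search, and normalized cross-correlation template matching whose peak value serves as the tracking similarity score. The stage and camera callables (`stage_move_z`, `grab_frame`) are hypothetical placeholders.

```python
# Minimal sketch of sharpness-driven autofocus and similarity-score tracking.
# Assumptions: a camera callable grab_frame() returning a BGR frame, and a
# stage callable stage_move_z(z) that positions the tool tip along the optical
# axis. These names are illustrative, not from the paper.

import cv2
import numpy as np


def focus_measure(gray: np.ndarray) -> float:
    """Variance of the Laplacian: larger values indicate a sharper image."""
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())


def coarse_autofocus(grab_frame, stage_move_z, z_positions):
    """Step the tip through candidate z positions and settle at the one
    whose image is sharpest, i.e. closest to the focal plane."""
    best_z, best_score = None, -np.inf
    for z in z_positions:
        stage_move_z(z)                       # hypothetical stage command
        gray = cv2.cvtColor(grab_frame(), cv2.COLOR_BGR2GRAY)
        score = focus_measure(gray)
        if score > best_score:
            best_z, best_score = z, score
    stage_move_z(best_z)
    return best_z, best_score


def track_tip(frame: np.ndarray, template: np.ndarray):
    """Locate the tip template in the current frame via normalized
    cross-correlation; the peak value is the similarity score whose decay
    can signal a depth (defocus) change to compensate for."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc, max_val
```

A tracker built this way would re-trigger the autofocus search (or a depth correction) whenever the returned similarity score drops below a chosen threshold, which is one plausible reading of the depth compensation behavior described in the abstract.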