Lameness
Milking
Key (lock)
Gait
Computer science
Artificial intelligence
Animal welfare
Dairy cattle
Automation
Machine learning
Physical medicine and rehabilitation
Engineering
Medicine
Animal science
Computer security
Ecology
Biology
Mechanical engineering
Surgery
Authors
Marjaneh Taghavi,Helena Russello,W. Ouweltjes,C. Kamphuis,Ines Adriaens
Identifier
DOI:10.3168/jds.2023-23680
Abstract
Lameness in dairy cattle is a costly and highly prevalent problem that impacts all aspects of sustainable dairy production, including animal welfare. Automating gait assessment would allow locomotion monitoring in which the cows' walking patterns can be evaluated frequently and with limited labor. With the right interpretation algorithms, this could result in more timely detection of locomotion problems, which in turn would facilitate timely intervention and early treatment, crucial to reducing the impact of abnormal behavior and pain on animal welfare. Gait features of dairy cows can potentially be derived from key points that locate crucial anatomical landmarks on a cow's body. The aim of this study is 2-fold: (1) demonstrate automated detection of dairy cows' key points in a practical indoor setting with natural occlusions from gates and races, and (2) propose the steps needed to post-process these key points so they are suitable for subsequent gait feature calculations. Both the automated detection of key points and their post-processing are crucial prerequisites for camera-based automated locomotion monitoring in a real farm environment. Side-view video footage of 34 Holstein Friesian dairy cows, captured as they exited the milking parlor, was used for model development. From these videos, 758 samples of 2 successive frames were extracted. A previously developed deep learning model called T-LEAP was trained to detect 17 key points on cows in our indoor farm environment with natural occlusions. To this end, the data set of 758 samples was randomly split into a training set (n = 22 cows; 388 samples), a validation set (n = 7 cows; 108 samples), and a test set (n = 15 cows; 262 samples). The performance of T-LEAP in automatically assigning key points in our indoor setting was assessed as the average percentage of correctly detected key points, using a threshold of 0.2 times the head length (PCKh@0.2).
On the test set, the model achieved a good PCKh@0.2 of 89% over all 17 key points together. Key points on the back of the cow (n = 3 key points) were detected worst (PCKh@0.2: 59%). In addition to this indoor performance, a more detailed study of the detection performance was conducted to formulate the post-processing steps needed to use these key points for gait feature calculations and subsequent automated locomotion monitoring. This detailed study included evaluating the detection performance in multiple directions, and revealed that the key points on a cow's back performed worst in the horizontal direction. Based on this in-depth analysis, we recommend implementing the outlined post-processing techniques to address the following issues: (1) correcting camera distortion, (2) rectifying erroneous key point detections, and (3) establishing the procedures needed to translate hoof key points into gait features.
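The PCKh@0.2 metric described in the abstract (a prediction counts as correct when it falls within 0.2 times the head length of the ground truth) could be computed roughly as follows. This is a minimal NumPy sketch, not the authors' code; the array shapes are assumptions:

```python
import numpy as np

def pckh(pred, gt, head_len, alpha=0.2):
    """Percentage of Correct Keypoints, head-normalized (PCKh@alpha).

    pred, gt : arrays of shape (n_samples, n_keypoints, 2), pixel coords
    head_len : array of shape (n_samples,), per-sample head length in pixels
    alpha    : threshold as a fraction of the head length (0.2 in the study)
    Returns the fraction of key points whose Euclidean error is within
    alpha * head length of the ground truth.
    """
    dists = np.linalg.norm(pred - gt, axis=-1)      # (n_samples, n_keypoints)
    correct = dists <= alpha * head_len[:, None]    # broadcast per-sample threshold
    return correct.mean()
```

Per-keypoint scores (e.g. the 59% reported for the 3 back key points) would follow from averaging `correct` over the sample axis only, `correct.mean(axis=0)`, instead of over everything.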
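Post-processing step (1), correcting camera distortion, might be sketched as below using a simple inverse Brown radial model applied directly to key-point coordinates. All intrinsics and distortion coefficients here are invented placeholders, not values from the study; a real pipeline would estimate them by calibrating the parlor-exit camera:

```python
import numpy as np

# Hypothetical calibration values (NOT from the study); in practice these
# come from calibrating the actual camera, e.g. with a checkerboard target.
FX, FY = 900.0, 900.0   # focal lengths (px)
CX, CY = 640.0, 360.0   # principal point (px)
K1, K2 = -0.25, 0.07    # radial distortion coefficients

def undistort(points, iters=5):
    """Remove radial lens distortion from detected key-point coordinates.

    points : (n, 2) array of distorted pixel coordinates
    Returns undistorted pixel coordinates, via fixed-point iteration of
    the inverse of the radial model x_d = x * (1 + k1*r^2 + k2*r^4).
    """
    pts = np.asarray(points, dtype=float)
    xd = (pts[:, 0] - CX) / FX          # normalized distorted coordinates
    yd = (pts[:, 1] - CY) / FY
    x, y = xd.copy(), yd.copy()         # initial guess: distorted = undistorted
    for _ in range(iters):
        r2 = x * x + y * y
        factor = 1.0 + K1 * r2 + K2 * r2 * r2
        x, y = xd / factor, yd / factor
    return np.stack([x * FX + CX, y * FY + CY], axis=1)
```

With a barrel-distorting lens (negative k1), detected points are pulled toward the image center, so undistortion pushes them back outward; hoof key points near the frame edges are affected most, which is why this correction matters before computing step-length-type gait features.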