
V.TEC.1731 - Video based automated cattle weighing system

Technology now exists to automatically measure cattle weight with vision systems.

Project start date: 03 May 2024
Project end date: 01 November 2025
Publication date: 28 January 2026
Project status: Completed
Livestock species: Grain-fed Cattle, Grass-fed Cattle
Relevant regions: National, International

Summary

This project was undertaken to address the limitations of traditional cattle weighing methods, which require physical handling, are labour-intensive, and can cause stress to the animals. The objective was to develop an automated, non-intrusive, video-based system capable of estimating the weight of live cattle as they move naturally through a walking path or high-visitation area, without requiring direct contact or disrupting their behaviour. The project involved five on-site data collection sessions at the Elizabeth Macarthur Agricultural Institute, where synchronised RGB video footage and ground-truth weight measurements were collected from 1,685 cattle. Depth cameras were used initially, but due to alignment issues and degradation over time, the system transitioned to an RGB-only approach for greater consistency and reliability.

Objectives

The primary objective of this project was to develop a non-intrusive, automated video-based system for estimating the liveweight of cattle, enabling practical deployment on commercial farms without disrupting animal movement. The key goals were to:

  • design and implement an imaging system using RGB (and initially depth) cameras to capture top, side, and angled views of cattle as they walk through a monitoring zone
  • collect a comprehensive dataset of synchronised video footage and accurate ground-truth weight measurements from cattle in a realistic farm setting
  • develop and train AI models capable of accurately predicting cattle weight from single- and multi-view images, with particular emphasis on robustness and generalisation to new cattle (an illustrative model sketch follows this list)
  • evaluate model performance using metrics such as Mean Absolute Error (MAE) and percentage error, and assess the trade-offs between model complexity and prediction accuracy
  • identify a commercially viable solution, emphasising configurations (like single-view top-down) that offer strong predictive performance with simpler, cost-effective hardware for practical field deployment.
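
As a rough illustration of the multi-view modelling goal above, the sketch below shows one plausible way to lay out a two-view (top + side) weight-regression network. The report does not specify the project's architecture or framework; PyTorch, the backbone depth, the layer sizes and the feature-fusion strategy used here are assumptions made for illustration only.

    # Illustrative only: a minimal two-view liveweight regressor.
    # PyTorch and all layer sizes are assumptions; the project's actual
    # model is not described in this summary.
    import torch
    import torch.nn as nn


    class TwoViewWeightRegressor(nn.Module):
        def __init__(self):
            super().__init__()
            # Shared convolutional backbone applied to each view independently.
            self.backbone = nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1),
                nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
                nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),  # -> (batch, 64, 1, 1)
                nn.Flatten(),             # -> (batch, 64)
            )
            # Concatenated features from both views regress a single weight (kg).
            self.head = nn.Sequential(
                nn.Linear(128, 64),
                nn.ReLU(),
                nn.Linear(64, 1),
            )

        def forward(self, top_view, side_view):
            features = torch.cat(
                [self.backbone(top_view), self.backbone(side_view)], dim=1
            )
            return self.head(features).squeeze(1)  # one predicted weight per sample


    if __name__ == "__main__":
        model = TwoViewWeightRegressor()
        top = torch.randn(2, 3, 224, 224)   # dummy RGB top-down frames
        side = torch.randn(2, 3, 224, 224)  # dummy RGB side-on frames
        print(model(top, side).shape)       # torch.Size([2])

A single-view variant would simply drop one branch and halve the fused feature size, which mirrors the simpler top-view-only configuration discussed in the key findings.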

Key findings

Through extensive data collection – spanning five farm visits and 1,685 multi-view samples – the project trained and evaluated multiple model configurations. The two-view model (top + side views) achieved the best overall performance, with a Mean Absolute Error (MAE) of 19.83kg and an average percentage error of 6.06%. Notably, the single-view (top-view) model also performed competitively, with an MAE of 23.19kg, while offering a more practical setup for remote or resource-constrained deployments. The results confirmed the feasibility of using AI and computer vision for non-intrusive cattle monitoring, presenting a promising alternative to traditional weighing systems. The system’s ability to generalise to unseen cattle and operate under varied farm conditions supports its application in commercial on-farm settings.
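
For reference, the two reported metrics can be computed as follows. This is a generic sketch that assumes matching lists of predicted and ground-truth liveweights in kilograms; the example figures are placeholders, not project data.

    # Generic definitions of the reported metrics; the example values are
    # made-up placeholders, not data from the project.
    def mean_absolute_error(predicted, actual):
        return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)


    def mean_percentage_error(predicted, actual):
        return 100.0 * sum(abs(p - a) / a for p, a in zip(predicted, actual)) / len(actual)


    if __name__ == "__main__":
        actual = [410.0, 385.0, 502.0]     # ground-truth scale weights (kg)
        predicted = [432.0, 371.0, 489.0]  # model estimates (kg)
        print(f"MAE: {mean_absolute_error(predicted, actual):.2f} kg")
        print(f"Average % error: {mean_percentage_error(predicted, actual):.2f}%")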

Benefits to industry

The system operates using affordable, easy-to-deploy hardware, such as a single RGB camera and a lightweight computing device. Its non-intrusive nature ensures stress-free monitoring without the need for manual weighing, which is particularly beneficial for improving animal welfare and reducing labour demands. Once set up, the system can function autonomously, continuously capturing weight estimates as cattle move naturally through the camera’s field of view.
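
As a sketch of what that autonomous operation could look like in code, the loop below reads frames from an RGB camera and passes each one to a weight estimator. OpenCV is assumed for camera access, and predict_weight_kg is a hypothetical stand-in for the trained model, not the project's actual software.

    # Sketch of a continuous capture-and-estimate loop. OpenCV is assumed for
    # camera access; predict_weight_kg is a hypothetical placeholder for the
    # trained model, not the project's implementation.
    import cv2


    def predict_weight_kg(frame) -> float:
        """Placeholder: substitute the trained weight-estimation model here."""
        return 0.0


    def run(camera_index: int = 0) -> None:
        cap = cv2.VideoCapture(camera_index)
        try:
            while True:
                ok, frame = cap.read()
                if not ok:
                    break  # camera disconnected or stream ended
                # In practice a detector would first confirm an animal is fully
                # in view; here every frame goes straight to the estimator.
                print(f"Estimated weight: {predict_weight_kg(frame):.1f} kg")
        finally:
            cap.release()


    if __name__ == "__main__":
        run()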


For practical farm applications, the tool allows producers to monitor individual animal growth trajectories, detect irregular patterns that may signal health or feeding issues, and make more informed decisions around nutrition, treatment, and breeding. Because the system does not require physical contact or specialised equipment, it is well-suited to farms in remote or resource-limited areas and can be scaled from small to large operations with minimal technical overhead.

MLA action

MLA is working with the project team to attract further funding to continue the development of the technology.

Future research

Future research will likely focus on determining the preferred camera setup for maximum commercial applicability, and on extending data collection to Bos taurus breeds to ensure accuracy across animal types.

More information

Project manager: Jack Cook
Contact email: reports@mla.com.au
Primary researcher: University of Technology Sydney