The event took place from September to December 2020 and brought together students from more than 35 universities. The participants' goal was to create minimal working examples, and the main topics for projects and classes were the IT sphere and modern technologies in agriculture.
A team from St. Petersburg Electrotechnical University “LETI” presented the Smart Cow Monitoring project. The team consists of Daria Valenkova and Tatiana Kuznetsova, students of the Faculty of Information Measurement and Biotechnical Systems, and Maxim Nigmatulin, Alexander Ershov, Evgeny Shalugin, and Maxim Sobolev, students of the Faculty of Computer Science and Technology. The project coordinators are Dmitry Kaplun, Associate Professor at the Department of Automation and Control Processes, and Alexander Sinitsa, a graduate student at the same department. The authors, who strive to digitalize agriculture, competed in the food track of the University 20.35 event.
The project makes cattle tracking less time-consuming and more cost-effective by automating the process and reducing the cost of veterinary services. The development will be useful for agricultural enterprises, individual farmers, and research institutes that study animal behavior.
To address the task of video-based animal monitoring for the FOODNET initiative, the team applied machine learning and built a biometrics and computer vision system. Cameras record what is happening in the barn or pasture in real time and transmit the footage to a computer that recognizes cows' behavior patterns using artificial intelligence. Based on this information, the system assesses each animal's health and, if necessary, generates recommendations and reports changes to the operator.
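The article does not disclose the system's internals, but the described pipeline (per-cow behavior recognition feeding simple health rules that alert an operator) can be sketched roughly as follows. Everything here is an assumption for illustration: the behavior labels, the `Detection` record, and the thresholds in `health_report` are hypothetical, standing in for the outputs of the team's recognition and tracking modules.

```python
from dataclasses import dataclass
from collections import Counter

# Hypothetical behavior classes a vision model might emit per detection.
BEHAVIORS = {"grazing", "lying", "walking", "limping"}

@dataclass
class Detection:
    cow_id: int    # identity from the cow-recognition module (assumed)
    behavior: str  # behavior class from the AI model (assumed)

def health_report(detections,
                  alert_behaviors=frozenset({"limping"}),
                  lying_ratio_limit=0.7):
    """Aggregate per-cow behavior counts and flag cows whose observed
    patterns suggest a vet check. The rules are illustrative only."""
    per_cow = {}
    for d in detections:
        per_cow.setdefault(d.cow_id, Counter())[d.behavior] += 1

    alerts = {}
    for cow_id, counts in per_cow.items():
        total = sum(counts.values())
        if any(counts[b] for b in alert_behaviors):
            alerts[cow_id] = "abnormal gait observed"
        elif counts["lying"] / total > lying_ratio_limit:
            alerts[cow_id] = "prolonged lying time"
    return alerts

# Example: cow 1 mostly lies down, cow 2 behaves normally.
frames = ([Detection(1, "lying")] * 8 + [Detection(1, "grazing")] * 2 +
          [Detection(2, "grazing")] * 9 + [Detection(2, "walking")])
print(health_report(frames))  # {1: 'prolonged lying time'}
```

In a real deployment the alerts would be pushed to the operator interface rather than printed, and the thresholds would come from veterinary expertise rather than constants.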
ETU “LETI” team members worked at the EkoNivaAgro agricultural holding in the Voronezh Region, where they collected data and tested their solutions. They have already developed cow recognition and continuous tracking modules with an accuracy of up to 98%; a prototype interface is ready and is undergoing UX testing. The developers have received datasets with information about cows' diseases and are collecting video data.
“As part of the event, the team developed the prototype program at the hackathon, conducted user testing, and received feedback from experts from various fields.”

Maxim Nigmatulin, a third-year student of the Faculty of Computer Science and Technology of ETU “LETI”
The project was among the best at the University 20.35 event and took third place in the Agrotechnologies area. On December 23, ETU “LETI” students received diplomas and prizes at the online awarding ceremony.
“The project received high praise and good feedback from experts, including representatives of the FOODNET initiative, industry, leading agricultural research organizations, and universities. That is especially gratifying if you consider that we started the project from scratch in September. In a few months, the students managed to do a great job, including preparing the conference paper. We plan to develop the project both commercially and scientifically. In particular, we are already writing an article together with a research group led by Professor Sarma from Gauhati University (India). I would also like to mention the support the project is receiving from the AgTechInventum platform, jointly organized by the Association of Agricultural Equipment Dealers and the VDMA Agricultural Machinery Union.”

Dmitry Kaplun, Associate Professor at the Department of Automation and Control Processes
Together with leading scientists in veterinary informatics, the developers plan to train a model to identify each cow, diagnose diseases by recognizing behavioral patterns, and determine critical physiological indicators. They will then integrate the solution with existing non-contact sensors and scale it to other animal species.
Earlier, the project developed by students of St. Petersburg Electrotechnical University “LETI” was recognized as the best at an international online conference and pitch session for agricultural start-ups and received a grant of 250 thousand rubles.