
dc.contributor.authorColl Ribes, Gabriel
dc.contributor.authorTorres Rodriguez, Ivan Jesús
dc.contributor.authorGrau Saldes, Antoni
dc.contributor.authorGuerra Paradas, Edmundo
dc.contributor.authorSanfeliu Cortés, Alberto
dc.contributor.otherUniversitat Politècnica de Catalunya. Departament d'Enginyeria de Sistemes, Automàtica i Informàtica Industrial
dc.date.accessioned2023-11-30T13:51:15Z
dc.date.available2023-11-30T13:51:15Z
dc.date.issued2023-12
dc.identifier.citationColl, G. [et al.]. Accurate detection and depth estimation of table grapes and peduncles for robot harvesting, combining monocular depth estimation and CNN methods. "Computers and electronics in agriculture", December 2023, vol. 215, no. 108362.
dc.identifier.issn1872-7107
dc.identifier.urihttp://hdl.handle.net/2117/397425
dc.description.abstractPrecision agriculture is a growing field in the agricultural industry and holds great potential in fruit and vegetable harvesting. In this work, we present a robust, accurate method for the detection and localization of the peduncle of table grapes, with direct application to automatic grape harvesting with robots. The bunch and peduncle detection methods presented in this work rely on a combination of instance segmentation and monocular depth estimation using Convolutional Neural Networks (CNN). Regarding depth estimation, we propose a combination of different depth techniques that allows precise localization of the peduncle using traditional stereo cameras, even with the particular complexity of grape peduncles. The methods proposed in this work have been tested on the WGISD (Embrapa Wine Grape Instance Segmentation) dataset, improving on the results of state-of-the-art techniques. Furthermore, within the context of the EU project CANOPIES, the methods have also been tested on a dataset of 1,326 RGB-D images of table grapes, recorded at the Corsira Agricultural Cooperative Society (Aprilia, Italy) using a Realsense D435i camera mounted on the arm of a CANOPIES two-manipulator robot developed in the project. The detection results on the WGISD dataset show that the use of RGB-D information leads to superior performance compared to the use of RGB data alone. This trend is also evident in the CANOPIES Grape Bunch and Peduncle dataset, where the mAP for RGB-D images outperforms that of RGB data. Regarding depth estimation, our method achieves a mean squared error of 2.66 cm within a distance of 1 m on the CANOPIES dataset.
dc.language.isoeng
dc.publisherElsevier
dc.rightsAttribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.urihttp://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subjectÀrees temàtiques de la UPC::Informàtica::Robòtica
dc.subject.lcshAgriculture--Automation
dc.subject.otherImage segmentation
dc.subject.otherMonocular depth
dc.subject.otherGrape bunch and peduncle detection
dc.subject.otherGrape bunch and peduncle depth estimation
dc.subject.otherRobot harvesting
dc.titleAccurate detection and depth estimation of table grapes and peduncles for robot harvesting, combining monocular depth estimation and CNN methods
dc.typeArticle
dc.subject.lemacAgricultura--Automatització
dc.contributor.groupUniversitat Politècnica de Catalunya. VIS - Visió Artificial i Sistemes Intel·ligents
dc.identifier.doi10.1016/j.compag.2023.108362
dc.description.peerreviewedPeer Reviewed
dc.relation.publisherversionhttps://www.sciencedirect.com/science/article/pii/S0168169923007500
dc.rights.accessOpen Access
local.identifier.drac37737718
dc.description.versionPostprint (published version)
local.citation.authorColl, G.; Torres, I.; Grau, A.; Guerra, E.; Sanfeliu, A.
local.citation.publicationNameComputers and electronics in agriculture
local.citation.volume215
local.citation.number108362

