dc.contributor.author | Coll Ribes, Gabriel |
dc.contributor.author | Torres Rodriguez, Ivan Jesús |
dc.contributor.author | Grau Saldes, Antoni |
dc.contributor.author | Guerra Paradas, Edmundo |
dc.contributor.author | Sanfeliu Cortés, Alberto |
dc.contributor.other | Universitat Politècnica de Catalunya. Departament d'Enginyeria de Sistemes, Automàtica i Informàtica Industrial |
dc.date.accessioned | 2023-11-30T13:51:15Z |
dc.date.available | 2023-11-30T13:51:15Z |
dc.date.issued | 2023-12 |
dc.identifier.citation | Coll, G. [et al.]. Accurate detection and depth estimation of table grapes and peduncles for robot harvesting, combining monocular depth estimation and CNN methods. "Computers and electronics in agriculture", December 2023, vol. 215, no. 108362. |
dc.identifier.issn | 1872-7107 |
dc.identifier.uri | http://hdl.handle.net/2117/397425 |
dc.description.abstract | Precision agriculture is a growing field in the agricultural industry and holds great potential for fruit and vegetable harvesting. In this work, we present a robust and accurate method for the detection and localization of the peduncle of table grapes, with direct application to automatic grape harvesting with robots. The bunch and peduncle detection methods presented in this work rely on a combination of instance segmentation and monocular depth estimation using Convolutional Neural Networks (CNN). Regarding depth estimation, we propose a combination of different depth techniques that allows precise localization of the peduncle using traditional stereo cameras, even with the particular complexity of grape peduncles. The methods proposed in this work have been tested on the WGISD (Embrapa Wine Grape Instance Segmentation) dataset, improving on the results of state-of-the-art techniques. Furthermore, within the context of the EU project CANOPIES, the methods have also been tested on a dataset of 1,326 RGB-D images of table grapes, recorded at the Corsira Agricultural Cooperative Society (Aprilia, Italy), using a Realsense D435i camera mounted on the arm of a CANOPIES two-manipulator robot developed in the project. The detection results on the WGISD dataset show that the use of RGB-D information () leads to superior performance compared to the use of RGB data alone (). This trend is also evident in the CANOPIES Grape Bunch and Peduncle dataset, where the mAP for RGB-D images () outperforms that of RGB data (). Regarding depth estimation, our method achieves a mean squared error of 2.66 cm within a distance of 1 m in the CANOPIES dataset. |
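The abstract describes fusing an instance-segmentation mask of the peduncle with a monocular depth map, grounded by stereo measurements. The snippet below is a minimal, illustrative sketch of that general idea only, not the paper's actual implementation; the function name, inputs, and the scale-and-shift alignment step are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of fusing an instance-segmentation mask with monocular
# and stereo depth to localize a peduncle. Not the authors' implementation.
import numpy as np

def peduncle_depth(mask: np.ndarray,
                   mono_depth: np.ndarray,
                   stereo_depth: np.ndarray) -> float:
    """Estimate the metric depth (in metres) of a peduncle.

    mask         -- boolean HxW peduncle mask from an instance-segmentation CNN
    mono_depth   -- HxW relative depth map from a monocular depth network
    stereo_depth -- HxW metric depth map from the stereo camera (0 = invalid)
    """
    # Pixels where both the mask and a valid stereo measurement are available.
    valid = mask & (stereo_depth > 0)
    if not np.any(valid):
        raise ValueError("no valid stereo depth inside the peduncle mask")

    # Least-squares scale/shift aligning relative monocular depth to metric depth.
    A = np.stack([mono_depth[valid], np.ones(valid.sum())], axis=1)
    scale, shift = np.linalg.lstsq(A, stereo_depth[valid], rcond=None)[0]

    # Apply the alignment inside the mask and take a robust (median) estimate.
    metric = scale * mono_depth[mask] + shift
    return float(np.median(metric))
```

Taking the median over the masked pixels is one simple way to stay robust to segmentation spill onto background leaves; the paper's own fusion strategy may differ.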
dc.language.iso | eng |
dc.publisher | Elsevier |
dc.rights | Attribution-NonCommercial-NoDerivatives 4.0 International |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/4.0/ |
dc.subject | Àrees temàtiques de la UPC::Informàtica::Robòtica |
dc.subject.lcsh | Agriculture--Automation |
dc.subject.other | Image segmentation |
dc.subject.other | Monocular depth |
dc.subject.other | Grape bunch and peduncle detection |
dc.subject.other | Grape bunch and peduncle depth estimation |
dc.subject.other | Robot harvesting |
dc.title | Accurate detection and depth estimation of table grapes and peduncles for robot harvesting, combining monocular depth estimation and CNN methods |
dc.type | Article |
dc.subject.lemac | Agricultura--Automatització |
dc.contributor.group | Universitat Politècnica de Catalunya. VIS - Visió Artificial i Sistemes Intel·ligents |
dc.identifier.doi | 10.1016/j.compag.2023.108362 |
dc.description.peerreviewed | Peer Reviewed |
dc.relation.publisherversion | https://www.sciencedirect.com/science/article/pii/S0168169923007500 |
dc.rights.access | Open Access |
local.identifier.drac | 37737718 |
dc.description.version | Postprint (published version) |
local.citation.author | Coll, G.; Torres, I.; Grau, A.; Guerra, E.; Sanfeliu, A. |
local.citation.publicationName | Computers and electronics in agriculture |
local.citation.volume | 215 |
local.citation.number | 108362 |