dc.contributor.author: Marbán González, Arturo
dc.contributor.author: Srinivasan, Vignesh
dc.contributor.author: Samek, Wojciech
dc.contributor.author: Fernández Ruzafa, José
dc.contributor.author: Casals Gelpí, Alicia
dc.contributor.other: Universitat Politècnica de Catalunya. Departament d'Enginyeria de Sistemes, Automàtica i Informàtica Industrial
dc.date.accessioned: 2019-02-15T13:30:32Z
dc.date.available: 2021-01-29T01:28:25Z
dc.date.issued: 2019-04
dc.identifier.citation: Marbán, A. [et al.]. A recurrent convolutional neural network approach for sensorless force estimation in robotic surgery. "Biomedical signal processing and control", April 2019, vol. 50, p. 134-150.
dc.identifier.issn: 1746-8094
dc.identifier.uri: http://hdl.handle.net/2117/129232
dc.description.abstract: Providing force feedback as relevant information in current Robot-Assisted Minimally Invasive Surgery systems constitutes a technological challenge due to the constraints imposed by the surgical environment. In this context, force estimation techniques represent a potential solution, as they make it possible to sense the interaction forces between the surgical instruments and soft tissues. Specifically, if visual feedback is available for observing soft-tissue deformation, this feedback can be used to estimate the forces applied to these tissues. To this end, a force estimation model based on Convolutional Neural Networks and Long Short-Term Memory networks is proposed in this work. The model is designed to process both the spatiotemporal information present in video sequences and the temporal structure of tool data (the surgical tool-tip trajectory and its grasping status). A series of analyses is carried out to reveal the advantages of the proposed approach and the challenges that remain for real applications. This work focuses on two surgical task scenarios, referred to as pushing and pulling tissue. For these two scenarios, different input data modalities and their effect on force estimation quality are investigated: tool data, video sequences, and a combination of both. The results suggest that force estimation quality is better when the neural network model processes both tool data and video sequences. Moreover, this study reveals the need for a loss function designed to promote the modeling of both the smooth and the sharp details found in force signals. Finally, the results show that modeling the forces arising in pulling tasks is more challenging than modeling those of the simpler pushing actions.
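As a rough illustration of the kind of recurrent convolutional architecture the abstract describes, and not the authors' exact network, the sketch below (PyTorch) extracts per-frame CNN features, concatenates them with the tool-data vector at each time step, and lets an LSTM regress a force signal. All layer sizes, input dimensions, and names are assumptions, and the authors' tailored loss for smooth and sharp force details is not reproduced here.

    import torch
    import torch.nn as nn

    class ForceEstimator(nn.Module):
        """Illustrative CNN + LSTM force-estimation sketch (hypothetical sizes)."""
        def __init__(self, tool_dim=7, cnn_feat=128, hidden=256, force_dim=3):
            super().__init__()
            # Small CNN applied to each video frame to encode soft-tissue deformation.
            self.cnn = nn.Sequential(
                nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
                nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(32, cnn_feat), nn.ReLU(),
            )
            # LSTM models the temporal structure of the fused video/tool features.
            self.lstm = nn.LSTM(cnn_feat + tool_dim, hidden, batch_first=True)
            self.head = nn.Linear(hidden, force_dim)

        def forward(self, frames, tool):
            # frames: (batch, time, 3, H, W); tool: (batch, time, tool_dim)
            b, t = frames.shape[:2]
            feats = self.cnn(frames.flatten(0, 1)).view(b, t, -1)
            seq, _ = self.lstm(torch.cat([feats, tool], dim=-1))
            return self.head(seq)  # per-time-step force estimate

    # Example: 8-frame clips at 64x64 resolution, 7-D tool data, 3-D force output.
    model = ForceEstimator()
    force = model(torch.randn(2, 8, 3, 64, 64), torch.randn(2, 8, 7))
    print(force.shape)  # torch.Size([2, 8, 3])

Early fusion (concatenating CNN features with tool data before the LSTM) is only one plausible design; separate recurrent streams for video and tool data merged at the output would be another.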
dc.format.extent: 17 p.
dc.language.iso: eng
dc.rights: Attribution-NonCommercial-NoDerivs 3.0 Spain
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/3.0/es/
dc.subject: Àrees temàtiques de la UPC::Informàtica::Robòtica
dc.subject.lcsh: Neural networks (Computer science)
dc.subject.lcsh: Robotics in medicine
dc.subject.other: Robotic surgery
dc.subject.other: Force estimation
dc.subject.other: Convolutional neural networks
dc.subject.other: LSTM networks
dc.title: A recurrent convolutional neural network approach for sensorless force estimation in robotic surgery
dc.type: Article
dc.subject.lemac: Xarxes neuronals (Informàtica)
dc.subject.lemac: Robòtica en medicina
dc.contributor.group: Universitat Politècnica de Catalunya. GRINS - Grup de Recerca en Robòtica Intel·ligent i Sistemes
dc.identifier.doi: 10.1016/j.bspc.2019.01.011
dc.description.peerreviewed: Peer Reviewed
dc.relation.publisherversion: https://www.sciencedirect.com/science/article/abs/pii/S1746809419300114
dc.rights.access: Open Access
local.identifier.drac: 23661537
dc.description.version: Postprint (author's final draft)
dc.relation.projectid: info:eu-repo/grantAgreement/MINECO//DPI2015-70415-C2-1-R/ES/ESTRATEGIAS DISTRIBUIDAS DE CONTROL Y COOPERACION PERSONA-ROBOT EN ENTORNOS ASISTENCIALES/
local.citation.author: Marbán, A.; Srinivasan, V.; Samek, W.; Fernández, J.; Casals, A.
local.citation.publicationName: Biomedical signal processing and control
local.citation.volume: 50
local.citation.startingPage: 134
local.citation.endingPage: 150

