Show simple item record

dc.contributor.author: Avilés Rivero, Angélica
dc.contributor.author: Alsaleh, Samar M.
dc.contributor.author: Casals Gelpí, Alicia
dc.contributor.other: Universitat Politècnica de Catalunya. Departament d'Enginyeria de Sistemes, Automàtica i Informàtica Industrial
dc.date.accessioned: 2018-03-07T08:20:46Z
dc.date.available: 2018-03-07T08:20:46Z
dc.date.issued: 2017
dc.identifier.citation: Avilés, A., Alsaleh, S., Casals, A. Sight to touch: 3D diffeomorphic deformation recovery with mixture components for perceiving forces in robotic-assisted surgery. A: IEEE/RSJ International Conference on Intelligent Robots and Systems. "2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS): 24-28 Sept. 2017". Institute of Electrical and Electronics Engineers (IEEE), 2017, p. 160-165.
dc.identifier.isbn: 978-1-5386-2682-5
dc.identifier.uri: http://hdl.handle.net/2117/114875
dc.description.abstract: Robotic-assisted minimally invasive surgical systems suffer from one major limitation: the lack of interaction force feedback. The restricted sense of touch hinders the surgeons' performance and reduces their dexterity and precision during a procedure. In this work, we present a sensory substitution approach that relies on visual stimuli to transmit the tool-tissue interaction forces to the operating surgeon. Our approach combines a 3D diffeomorphic deformation mapping with a generative model to precisely label the force level. The main highlights of our approach are that the use of diffeomorphic transformation ensures anatomical structure preservation, and that the label assignment is based on a parametric form of several mixture elements. We performed experiments on both ex-vivo and in-vivo datasets and offer careful numerical results evaluating our approach. The results show that our solution has an error measure of less than 1 mm in all directions and an average labeling error of 2.05%. It is also applicable to other scenarios that require force feedback, such as microsurgery, knot tying or needle-based procedures.
dc.format.extent: 6 p.
dc.language.iso: eng
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.subject: Àrees temàtiques de la UPC::Informàtica::Robòtica
dc.subject.lcsh: Biomechanics
dc.subject.lcsh: Three dimensional imaging in medicine
dc.subject.lcsh: Robotics in medicine
dc.subject.other: Vision based force estimation
dc.subject.other: Robot assisted minimally invasive surgery
dc.subject.other: Topology preservation
dc.title: Sight to touch: 3D diffeomorphic deformation recovery with mixture components for perceiving forces in robotic-assisted surgery
dc.type: Conference report
dc.subject.lemac: Biomecànica
dc.subject.lemac: Imatges tridimensionals en medicina
dc.subject.lemac: Robòtica en medicina
dc.contributor.group: Universitat Politècnica de Catalunya. GRINS - Grup de Recerca en Robòtica Intel·ligent i Sistemes
dc.identifier.doi: 10.1109/IROS.2017.8202152
dc.description.peerreviewed: Peer Reviewed
dc.relation.publisherversion: http://ieeexplore.ieee.org/document/8202152/
dc.rights.access: Open Access
local.identifier.drac: 21992218
dc.description.version: Postprint (author's final draft)
local.citation.author: Avilés, A.; Alsaleh, S.; Casals, A.
local.citation.contributor: IEEE/RSJ International Conference on Intelligent Robots and Systems
local.citation.publicationName: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS): 24-28 Sept. 2017
local.citation.startingPage: 160
local.citation.endingPage: 165

