i3PosNet: instrument pose estimation from X-ray in temporal bone surgery

Kügler, David ; Sehring, Jannik ; Stefanov, Andrei ; Stenin, Igor ; Kristin, Julia ; Klenzner, Thomas ; Schipper, Jörg ; Mukhopadhyay, Anirban (2020)
i3PosNet: instrument pose estimation from X-ray in temporal bone surgery.
In: International Journal of Computer Assisted Radiology and Surgery, 15 (7)
doi: 10.1007/s11548-020-02157-4
Article, Bibliography

Abstract

PURPOSE: Accurate estimation of the position and orientation (pose) of surgical instruments is crucial for delicate minimally invasive temporal bone surgery. Current techniques either lack accuracy, suffer from line-of-sight constraints (conventional tracking systems) or expose the patient to prohibitive ionizing radiation (intra-operative CT). A possible solution is to capture the instrument with a C-arm at irregular intervals and recover the pose from the image. METHODS: i3PosNet infers the position and orientation of instruments from images using a pose estimation network. The framework operates on localized patches and outputs pseudo-landmarks; the pose is reconstructed from these pseudo-landmarks by geometric considerations. RESULTS: We show that i3PosNet reaches errors [Formula: see text] mm. It outperforms conventional image registration-based approaches, reducing average and maximum errors by at least two thirds. i3PosNet trained on synthetic images generalizes to real X-rays without any further adaptation. CONCLUSION: The translation of deep learning-based methods to surgical applications is difficult because large representative datasets for training and testing are not available. This work empirically demonstrates sub-millimeter pose estimation trained solely on synthetic training data.
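
The geometric reconstruction of a pose from pseudo-landmarks mentioned in the abstract can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration, not the authors' implementation: it assumes the network predicts a small set of 2D pseudo-landmarks ordered from the instrument tip along its main axis, takes the tip landmark as the position estimate, and recovers the in-plane orientation from a least-squares line fit (via SVD) over the landmarks.

```python
import numpy as np

def pose_from_pseudo_landmarks(landmarks):
    """Recover a 2D pose (tip position, in-plane angle in degrees) from
    pseudo-landmarks predicted for an image patch.

    landmarks: (N, 2) array of image coordinates, assumed ordered from the
    instrument tip along its main axis (illustrative convention only).
    """
    landmarks = np.asarray(landmarks, dtype=float)
    tip = landmarks[0]  # position estimate: the tip pseudo-landmark

    # In-plane orientation: principal direction of the landmark cloud,
    # i.e. a least-squares line fit via SVD of the centered coordinates.
    centered = landmarks - landmarks.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction = vt[0]

    # Make the axis point from the rearmost landmark towards the tip.
    if np.dot(direction, tip - landmarks[-1]) < 0:
        direction = -direction
    angle = np.degrees(np.arctan2(direction[1], direction[0]))
    return tip, angle

# Toy usage: six landmarks spaced 5 px apart along a 30-degree axis.
steps = np.arange(6)[:, None]
axis = np.array([np.cos(np.radians(30.0)), np.sin(np.radians(30.0))])
points = np.array([100.0, 50.0]) + 5.0 * steps * axis
tip, angle = pose_from_pseudo_landmarks(points[::-1])  # tip listed first
print(tip, angle)  # approx. [121.65, 62.5] and 30.0
```

Note that this sketch only recovers the in-plane components of the pose; a full instrument pose would additionally need out-of-plane information, which is not modeled here.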

Type of entry: Article
Published: 2020
Author(s): Kügler, David ; Sehring, Jannik ; Stefanov, Andrei ; Stenin, Igor ; Kristin, Julia ; Klenzner, Thomas ; Schipper, Jörg ; Mukhopadhyay, Anirban
Type of record: Bibliography
Title: i3PosNet: instrument pose estimation from X-ray in temporal bone surgery
Language: English
Year of publication: July 2020
Journal or publication title: International Journal of Computer Assisted Radiology and Surgery
Volume of journal: 15
Issue number: 7
DOI: 10.1007/s11548-020-02157-4
URL / URN: https://doi.org/10.1007/s11548-020-02157-4

Division(s)/Department(s): 20 Department of Computer Science
20 Department of Computer Science > Graphisch-Interaktive Systeme
Date deposited: 26 Jun 2020 07:52
Last modified: 26 Jun 2020 07:52