Structured patch model for a unified automatic and interactive segmentation framework

Sang Hyun Park, Soochahn Lee, Il Dong Yun, Sang Uk Lee

Research output: Contribution to journal › Article › peer-review


Abstract

We present a novel interactive segmentation framework that incorporates a priori knowledge learned from training data. The knowledge is learned as a structured patch model (StPM) comprising sets of corresponding local patch priors and their pairwise spatial distribution statistics, which represent the local shape and appearance along the object boundary and the global shape structure, respectively. When successive user annotations are given, the StPM is adjusted accordingly in the target image and used together with the annotations to guide the segmentation. The StPM reduces the dependency on the placement and quantity of user annotations with little increase in complexity, since the time-consuming StPM construction is performed offline. Furthermore, a seamless incremental learning system can be established by directly adding the patch priors and pairwise statistics of segmentation results to the StPM. The proposed method was evaluated on three datasets of 2D chest CT, 3D knee MR, and 3D brain MR images. The experimental results demonstrate that, given an equal amount of time, the proposed interactive segmentation framework outperforms recent state-of-the-art methods in accuracy, while requiring significantly less computing and editing time to obtain results of comparable accuracy.
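As a rough illustration of the model structure described in the abstract (not the authors' implementation), the following Python sketch organizes an StPM as a set of local patch priors indexed by corresponding boundary locations, plus lists of pairwise spatial offsets from which distribution statistics can be computed. All class, method, and field names (StructuredPatchModel, PatchPrior, add_instance, pairwise_statistics) are hypothetical, and summarizing the pairwise offsets by a mean and covariance is an assumption; the paper itself defines the actual patch representation and spatial model.

    # Minimal sketch of an StPM-like data structure, under the assumptions stated above.
    from dataclasses import dataclass, field
    import numpy as np

    @dataclass
    class PatchPrior:
        """Local appearance/shape prior collected at one corresponding boundary location."""
        location_id: int
        patches: list = field(default_factory=list)  # example patches gathered from training images

    @dataclass
    class StructuredPatchModel:
        # location_id -> PatchPrior holding the local patch priors
        priors: dict = field(default_factory=dict)
        # (id_i, id_j) -> list of relative offset vectors between the two locations
        pairwise_offsets: dict = field(default_factory=dict)

        def add_instance(self, patches_by_location, positions_by_location):
            """Add one training image (or an accepted segmentation result, in the
            spirit of the incremental learning described in the abstract)."""
            for loc, patch in patches_by_location.items():
                self.priors.setdefault(loc, PatchPrior(loc)).patches.append(patch)
            ids = sorted(positions_by_location)
            for a in range(len(ids)):
                for b in range(a + 1, len(ids)):
                    i, j = ids[a], ids[b]
                    offset = (np.asarray(positions_by_location[j], float)
                              - np.asarray(positions_by_location[i], float))
                    self.pairwise_offsets.setdefault((i, j), []).append(offset)

        def pairwise_statistics(self, i, j):
            """Mean and covariance of the relative offsets between locations i and j
            (requires at least two added instances for a meaningful covariance)."""
            offsets = np.stack(self.pairwise_offsets[(i, j)])
            return offsets.mean(axis=0), np.cov(offsets, rowvar=False)

In an interactive setting, user annotations would then be used to re-anchor the patch priors in the target image before the segmentation is computed; how that adjustment and the final labeling are performed is specified in the article, not in this sketch.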

Original language: English
Pages (from-to): 297-312
Number of pages: 16
Journal: Medical Image Analysis
Volume: 24
Issue number: 1
State: Published - 29 Jan 2015

Bibliographical note

Publisher Copyright:
© 2015 Elsevier B.V.

Keywords

  • Adaptive prior
  • Incremental learning
  • Interactive segmentation
  • Markov random field
  • Structured patch model

