Algorithms for Tissue Image Analysis using Multifractal Techniques

Type of content
Theses / Dissertations
Thesis discipline
Computer Science
Degree name
Master of Science
Publisher
University of Canterbury. Computer Science and Software Engineering
Date
2012
Authors
Tay, ChiangHau
Abstract

Histopathological classification and grading of biopsy specimens play an important role in early cancer detection and prognosis. The Nottingham Grading System (NGS) is one of the standard grading procedures used in breast cancer assessment, in which three parameters, Mitotic Count (MC), Nuclear Pleomorphism (NP), and Tubule Formation (TF), provide prognostic information. The grading takes into account deviations in cellular structure and appearance between tumour and normal cells, using measures such as density, size, colour, and regularity. Cell structures in tissue images are also known to exhibit multifractal characteristics. This research focused on the multifractal properties of graded biopsy specimens and analysed how the fractal parameters depend on and vary with the scores pre-assigned by pathologists. The effectiveness of multifractal techniques for breast cancer grading was measured with a set of quantitative evaluations for the MC, NP, and TF criteria. The developed method for MC scoring achieved an 82.87% true positive rate in detecting mitotic cells. The overall positive classification rates for the NP and TF analyses were 67.38% and 71.82%, respectively, with false classification rates of 30.26% for NP and 27.17% for TF. These results show that multifractal formalism is a feasible and novel approach for the automatic grading of biopsy sections.
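For readers unfamiliar with the multifractal formalism referred to above, the following is a minimal sketch of one standard way to estimate generalized (Renyi) dimensions D_q of a grayscale image by box counting. It is a generic illustration only: the function name, box sizes, and q values are arbitrary choices, and the thesis's actual feature-extraction and scoring pipeline is not described in this record.

import numpy as np

def generalized_dimensions(image, qs=(-2.0, 0.0, 1.0, 2.0), box_sizes=(2, 4, 8, 16, 32)):
    """Estimate generalized (Renyi) dimensions D_q of a grayscale image
    by box counting. Generic illustration, not the thesis's exact method."""
    img = np.asarray(image, dtype=float)
    dims = {}
    for q in qs:
        log_eps, log_z = [], []
        for s in box_sizes:
            h = (img.shape[0] // s) * s          # crop so the grid of s-by-s boxes tiles evenly
            w = (img.shape[1] // s) * s
            boxes = img[:h, :w].reshape(h // s, s, w // s, s).sum(axis=(1, 3))
            p = boxes / boxes.sum()              # box masses p_i(eps), normalised per scale
            p = p[p > 0]
            if q == 1.0:                         # D_1 uses the entropy (information) limit
                log_z.append(np.sum(p * np.log(p)))
            else:
                log_z.append(np.log(np.sum(p ** q)))
            log_eps.append(np.log(s / max(img.shape)))
        slope = np.polyfit(log_eps, log_z, 1)[0] # slope of log Z_q(eps) against log eps
        dims[q] = slope if q == 1.0 else slope / (q - 1.0)
    return dims

# Example call on a random test image; in practice the input would be a stain
# channel or segmented nuclei mask from a biopsy section.
rng = np.random.default_rng(0)
print(generalized_dimensions(rng.random((256, 256))))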

Keywords
Breast cancer assessment, Multifractal spectra, Image analysis, Histopathologic classification, Feature detection, Cancer grading
Rights
Copyright ChiangHau Tay