Lymph Node Gross Tumor Volume Detection in Oncology Imaging via
Relationship Learning Using Graph Neural Network
- URL: http://arxiv.org/abs/2008.13013v1
- Date: Sat, 29 Aug 2020 16:59:23 GMT
- Title: Lymph Node Gross Tumor Volume Detection in Oncology Imaging via
Relationship Learning Using Graph Neural Network
- Authors: Chun-Hung Chao, Zhuotun Zhu, Dazhou Guo, Ke Yan, Tsung-Ying Ho,
Jinzheng Cai, Adam P. Harrison, Xianghua Ye, Jing Xiao, Alan Yuille, Min Sun,
Le Lu, Dakai Jin
- Abstract summary: We propose a unified LN appearance and inter-LN relationship learning framework to detect the true GTV$_{LN}$.
The proposed method significantly improves over the state-of-the-art (SOTA) LN classification method by $5.5\%$ in F1 score and $13.1\%$ in averaged sensitivity.
- Score: 37.88742249495087
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Determining the spread of GTV$_{LN}$ is essential in defining the respective
resection or irradiating regions for the downstream workflows of surgical
resection and radiotherapy for many cancers. Different from the more common
enlarged lymph node (LN), GTV$_{LN}$ also includes smaller ones if associated
with high positron emission tomography signals and/or any metastasis signs in
CT. Detecting these smaller, metastasis-suspicious LNs is a daunting task. In this work, we propose a unified LN appearance
and inter-LN relationship learning framework to detect the true GTV$_{LN}$.
This is motivated by the prior clinical knowledge that LNs form a connected
lymphatic system, and the spread of cancer cells among LNs often follows
certain pathways. Specifically, we first utilize a 3D convolutional neural
network with ROI-pooling to extract the GTV$_{LN}$'s instance-wise appearance
features. Next, we introduce a graph neural network to further model the
inter-LN relationships where the global LN-tumor spatial priors are included in
the learning process. This leads to an end-to-end trainable network that detects
GTV$_{LN}$ by classification. We operate our model on a set of GTV$_{LN}$
candidates generated by a preliminary first-stage method, which has a sensitivity
of $>85\%$ at the cost of a high false-positive (FP) rate ($>15$ FPs per patient). We
validate our approach on a radiotherapy dataset with 142 paired PET/RTCT scans
containing the chest and upper abdominal body parts. The proposed method
significantly improves over the state-of-the-art (SOTA) LN classification
method by $5.5\%$ and $13.1\%$ in F1 score and the averaged sensitivity value
at $2, 3, 4, 6$ FPs per patient, respectively.
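As a rough illustration of the pipeline described above (instance-wise appearance features from a 3D CNN with ROI-pooling, followed by a graph neural network that models inter-LN relationships with LN-tumor spatial priors), here is a minimal, hypothetical sketch. All module names, layer sizes, the mean-aggregation message passing, and the random placeholder inputs are assumptions for illustration, not the authors' implementation:

```python
# Minimal sketch (not the authors' code): hypothetical 3D CNN appearance features
# per LN candidate, followed by a simple GNN that refines each candidate using
# its neighbours and LN-tumor spatial priors.
import torch
import torch.nn as nn


class AppearanceEncoder(nn.Module):
    """Toy 3D CNN mapping an LN candidate patch to an appearance embedding."""

    def __init__(self, emb_dim: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # stand-in for ROI pooling over the candidate
        )
        self.fc = nn.Linear(16, emb_dim)

    def forward(self, patches):            # patches: (N, 1, D, H, W)
        x = self.conv(patches).flatten(1)  # (N, 16)
        return self.fc(x)                  # (N, emb_dim)


class RelationGNN(nn.Module):
    """One round of mean-aggregation message passing over the LN graph."""

    def __init__(self, emb_dim: int = 64, prior_dim: int = 2):
        super().__init__()
        # prior_dim is a placeholder for LN-tumor spatial priors (e.g. distance).
        self.msg = nn.Linear(emb_dim + prior_dim, emb_dim)
        self.cls = nn.Linear(2 * emb_dim, 1)

    def forward(self, feats, priors, adj):
        # feats: (N, emb_dim), priors: (N, prior_dim), adj: (N, N) 0/1 adjacency
        h = torch.relu(self.msg(torch.cat([feats, priors], dim=1)))
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neigh = adj @ h / deg                      # mean over connected LNs
        logits = self.cls(torch.cat([feats, neigh], dim=1))
        return logits.squeeze(1)                   # per-candidate GTV_LN score


if __name__ == "__main__":
    n = 8                                          # candidates from a 1st-stage detector
    patches = torch.randn(n, 1, 16, 16, 16)        # hypothetical cropped CT sub-volumes
    priors = torch.rand(n, 2)                      # hypothetical LN-tumor spatial priors
    adj = (torch.rand(n, n) > 0.5).float()         # hypothetical inter-LN connectivity
    enc, gnn = AppearanceEncoder(), RelationGNN()
    scores = gnn(enc(patches), priors, adj)
    print(scores.shape)                            # torch.Size([8])
```

In the paper, the graph connectivity and the spatial priors are derived from the lymphatic anatomy and the primary tumor location, and the candidates come from the first-stage detector; here they are random stand-ins.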
Related papers
- Integrating features from lymph node stations for metastatic lymph node
detection [21.259023907494395]
It is desirable to leverage recent developments in deep learning to automatically detect metastatic LNs.
We introduce an additional branch to leverage information about LN stations, an important reference for radiologists during metastatic LN diagnosis.
We validate our method on a dataset containing 114 intravenous contrast-enhanced Computed Tomography (CT) images of oral squamous cell carcinoma (OSCC) patients.
arXiv Detail & Related papers (2023-01-09T08:35:58Z) - A deep local attention network for pre-operative lymph node metastasis
prediction in pancreatic cancer via multiphase CT imaging [22.57399272278884]
We propose a fully-automated LN segmentation and identification network to directly facilitate the LN metastasis status prediction task.
We explore the anatomical spatial context priors of pancreatic LN locations by generating a guiding attention map from related organs and vessels.
We develop a LN metastasis status prediction network that combines the patient-wise aggregation results of LN segmentation/identification and deep imaging features extracted from the tumor region.
arXiv Detail & Related papers (2023-01-04T05:14:31Z) - Moving from 2D to 3D: volumetric medical image classification for rectal
cancer staging [62.346649719614]
Preoperative discrimination between T2 and T3 stages is arguably both the most challenging and the most clinically significant task in rectal cancer treatment.
We present a volumetric convolutional neural network to accurately discriminate T2 from T3 stage rectal cancer with rectal MR volumes.
arXiv Detail & Related papers (2022-09-13T07:10:14Z) - Localizing the Recurrent Laryngeal Nerve via Ultrasound with a Bayesian
Shape Framework [65.19784967388934]
Tumor infiltration of the recurrent laryngeal nerve (RLN) is a contraindication for robotic thyroidectomy and can be difficult to detect via standard laryngoscopy.
We propose a knowledge-driven framework for RLN localization, mimicking the standard approach surgeons take to identify the RLN according to its surrounding organs.
Experimental results indicate that the proposed method achieves superior hit rates and substantially smaller distance errors compared with state-of-the-art methods.
arXiv Detail & Related papers (2022-06-30T13:04:42Z) - CNN Filter Learning from Drawn Markers for the Detection of Suggestive
Signs of COVID-19 in CT Images [58.720142291102135]
We propose a method that requires neither large annotated datasets nor backpropagation to estimate the filters of a convolutional neural network (CNN).
For a few CT images, the user draws markers at representative normal and abnormal regions.
The method generates a feature extractor composed of a sequence of convolutional layers, whose kernels are specialized in enhancing regions similar to the marked ones.
arXiv Detail & Related papers (2021-11-16T15:03:42Z) - Controlling False Positive/Negative Rates for Deep-Learning-Based
Prostate Cancer Detection on Multiparametric MR images [58.85481248101611]
We propose a novel PCa detection network that incorporates a lesion-level cost-sensitive loss and an additional slice-level loss based on a lesion-to-slice mapping function.
Our experiments on 290 clinical patients show that the lesion-level FNR was effectively reduced from 0.19 to 0.10 and the lesion-level FPR from 1.03 to 0.66 by adjusting the lesion-level cost.
arXiv Detail & Related papers (2021-06-04T09:51:27Z) - Esophageal Tumor Segmentation in CT Images using Dilated Dense Attention
Unet (DDAUnet) [3.0929226049096217]
We present a fully automatic, end-to-end esophageal tumor segmentation method based on convolutional neural networks (CNNs).
The proposed network, called Dilated Dense Attention Unet (DDAUnet), leverages spatial and channel attention in each dense block to selectively concentrate on determinant feature maps and regions.
arXiv Detail & Related papers (2020-12-06T11:42:52Z) - Lymph Node Gross Tumor Volume Detection and Segmentation via
Distance-based Gating using 3D CT/PET Imaging in Radiotherapy [18.958512013804462]
We propose an effective distance-based gating approach to simulate and simplify the high-level reasoning protocols conducted by radiation oncologists.
A novel multi-branch detection-by-segmentation network is trained, with each branch specializing in learning the features of one GTVLN category.
Our results show a significant improvement in mean recall, from $72.5\%$ to $78.2\%$, over previous state-of-the-art work; a minimal sketch of this distance-based gating idea appears after the related-papers list.
arXiv Detail & Related papers (2020-08-27T00:37:50Z) - Detecting Scatteredly-Distributed, Small, and Critically Important
Objects in 3D Oncology Imaging via Decision Stratification [23.075722503902714]
We focus on the detection and segmentation of oncology-significant (i.e., suspected of cancer metastasis) lymph nodes (OSLNs).
We propose a divide-and-conquer decision stratification approach that divides OSLNs into tumor-proximal and tumor-distal categories.
We present a novel global-local network (GLNet) that combines high-level lesion characteristics with features learned from localized 3D image patches.
arXiv Detail & Related papers (2020-05-27T23:12:11Z) - Segmentation for Classification of Screening Pancreatic Neuroendocrine
Tumors [72.65802386845002]
This work presents comprehensive results to detect in the early stage the pancreatic neuroendocrine tumors (PNETs) in abdominal CT scans.
To the best of our knowledge, this task has not been studied before as a computational task.
Our approach outperforms state-of-the-art segmentation networks and achieves a sensitivity of $89.47\%$ at a specificity of $81.08\%$.
arXiv Detail & Related papers (2020-04-04T21:21:44Z)
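Two of the related entries above, the distance-based gating paper and the decision-stratification paper, share a simple underlying mechanism: LN candidates are partitioned by their distance to the primary tumor into tumor-proximal and tumor-distal groups, and each group is then handled by its own specialized branch. The sketch below is a hypothetical illustration of that gating step only; the 70 mm threshold, function names, and random inputs are assumptions, not values or code from either paper.

```python
# Minimal sketch (assumptions, not the papers' code) of distance-based gating:
# split LN candidates into tumor-proximal and tumor-distal groups by their
# Euclidean distance to the primary tumor centroid.
import numpy as np


def gate_by_tumor_distance(ln_centroids, tumor_centroid, threshold_mm=70.0):
    """Split LN candidate centroids (in mm) into proximal/distal index arrays.

    The 70 mm threshold is a hypothetical placeholder, not the papers' value.
    """
    dists = np.linalg.norm(np.asarray(ln_centroids) - np.asarray(tumor_centroid), axis=1)
    proximal = np.where(dists <= threshold_mm)[0]
    distal = np.where(dists > threshold_mm)[0]
    return proximal, distal, dists


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    candidates = rng.uniform(0, 300, size=(10, 3))   # hypothetical LN centroids in mm
    tumor = np.array([150.0, 150.0, 150.0])          # hypothetical tumor centroid
    prox, dist, d = gate_by_tumor_distance(candidates, tumor)
    # Each subset would then be scored by its specialized branch/classifier.
    print(len(prox), "tumor-proximal and", len(dist), "tumor-distal candidates")
```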