Network Pruning Assisted Dense Convolution Knowledge Distillation Transformer for Brain Cancer Classification with Effective Tumor Identification Approach
DOI:
https://doi.org/10.64252/ge0nmw65
Keywords:
Brain tumor, Trimmed pixel, Dense Residual, Tripartite Attention, Knowledge distillation, Pruning
Abstract
The survival rate of patients is reduced when brain tumor types are misdiagnosed, which can prevent an effective response to medical intervention. Traditionally, brain tumors have been differentiated by inspecting MRI images of the patient's brain. However, with large volumes of data and many tumor types, manual identification is time-consuming and prone to human error. Attention-based mechanisms in medical imaging have been shown to provide accurate diagnosis, especially in the identification and classification of brain tumors. In this study, a novel hybrid knowledge distillation transformer model combined with an improved U-Net model is used for brain tumor detection. The proposed method consists of four phases: pre-processing, identification, feature extraction, and classification. Initially, the input images are pre-processed using a trimmed pixel density based median filter (Pix-TrMed). The tumor is then identified from the pre-processed image using a grouped dense residual convolution based U-Net (Grp-DRcU-Net) model. After tumor identification, the important features are extracted from the identified tumor regions using an improved ResNet (Im-ResN) model. Based on these extracted features, the various tumor types, namely necrotic tumor core (NCR), peritumoral edema (ED), and enhancing tumor (ET), are classified. The classification of brain tumors is performed by a network pruning assisted tripartite attention based knowledge distillation in dense convolutional transformers (Netr-HDCViT). Experiments are conducted on two datasets, BRATS 2020 and BRATS 2021, on which the proposed approach achieves accuracies of 98.83% and 98.56%, respectively.
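As a rough illustration of the pre-processing phase only, the Python sketch below applies a trimmed median filter to an image: within each window, the most extreme intensities are discarded before the median is taken. The abstract does not define the exact Pix-TrMed formulation, so the function name, window size, and trim fraction here are assumptions for illustration, not the authors' implementation.

import numpy as np

def trimmed_median_filter(image, window=3, trim_fraction=0.2):
    # Hypothetical stand-in for the Pix-TrMed step: for each pixel,
    # sort the neighbourhood, drop the top and bottom trim_fraction of
    # values, and take the median of what remains.
    pad = window // 2
    padded = np.pad(image, pad, mode="reflect")
    out = np.empty_like(image, dtype=float)
    n = window * window
    k = int(n * trim_fraction)  # values trimmed from each tail
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            block = np.sort(padded[i:i + window, j:j + window], axis=None)
            trimmed = block[k:n - k] if n - 2 * k > 0 else block
            out[i, j] = np.median(trimmed)
    return out

# Example on a noisy synthetic slice
noisy = np.random.rand(64, 64)
denoised = trimmed_median_filter(noisy, window=3, trim_fraction=0.2)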