Original research
Exploring prognostic indicators in the pathological images of hepatocellular carcinoma based on deep learning
  1. Jie-Yi Shi1,
  2. Xiaodong Wang2,
  3. Guang-Yu Ding1,
  4. Zhou Dong3,
  5. Jing Han4,
  6. Zehui Guan3,
  7. Li-Jie Ma5,
  8. Yuxuan Zheng2,
  9. Lei Zhang2,
  10. Guan-Zhen Yu6,
  11. Xiao-Ying Wang1,
  12. Zhen-Bin Ding1,
  13. Ai-Wu Ke1,
  14. Haoqing Yang2,
  15. Liming Wang2,
  16. Lirong Ai3,
  17. Ya Cao7,
  18. Jian Zhou1,8,
  19. Jia Fan1,8,
  20. Xiyang Liu2,
  21. Qiang Gao1,8,9
  1. Department of Liver Surgery and Transplantation, Liver Cancer Institute, Zhongshan Hospital, and Key Laboratory of Carcinogenesis and Cancer Invasion (Ministry of Education), Fudan University, Shanghai, P. R. China
  2. School of Computer Science and Technology, Xidian University, Xi’an, P. R. China
  3. School of Computer Science, Northwestern Polytechnical University, Xi'an, P. R. China
  4. Department of Pathology, Zhongshan Hospital Fudan University, Shanghai, P. R. China
  5. Department of General Surgery, Zhongshan Hospital (South), Public Health Clinical Centre, Fudan University, Shanghai, P. R. China
  6. Shanghai Key Laboratory of Multidimensional Information Processing, East China Normal University, Shanghai, P. R. China
  7. Cancer Research Institute, Xiangya School of Medicine, Central South University, Hunan, P. R. China
  8. Institute of Biomedical Sciences, Fudan University, Shanghai, P. R. China
  9. State Key Laboratory of Genetic Engineering at Fudan University, Shanghai, P. R. China
  1. Correspondence to Dr Qiang Gao, Liver Cancer Institute, Zhongshan Hospital Fudan University, Shanghai 200032, China; gao.qiang{at}; Professor Xiyang Liu; xyliu{at}


Objective Tumour pathology contains rich information, including tissue structure and cell morphology, that reflects disease progression and patient survival. However, phenotypic information is subtle and complex, making the discovery of prognostic indicators from pathological images challenging.

Design An interpretable, weakly supervised deep learning framework incorporating prior knowledge was proposed to analyse hepatocellular carcinoma (HCC) and explore new prognostic phenotypes on pathological whole-slide images (WSIs) from the Zhongshan cohort of 1125 HCC patients (2451 WSIs) and The Cancer Genome Atlas (TCGA) cohort of 320 HCC patients (320 WSIs). A ‘tumour risk score’ (TRS) was established to evaluate patient outcomes, and risk activation mapping (RAM) was then applied to visualise the pathological phenotypes underlying the TRS. The multi-omics data of TCGA HCC were used to assess the potential pathogenesis underlying the TRS.
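The full method is described in the article body; as a purely illustrative sketch (not the authors' code), a slide-level risk score in weakly supervised WSI analysis is often obtained by aggregating per-tile predictions, for example averaging the top-k highest-scoring tiles. The function name, the top-k choice and all values below are hypothetical.

```python
# Hypothetical sketch: aggregate tile-level risk predictions from one
# whole-slide image (WSI) into a single slide-level "tumour risk score".
# Averaging the top-k tiles is a common weakly supervised aggregation
# choice; the paper's actual model may differ.

def tumour_risk_score(tile_scores, top_k=10):
    """Return the mean of the top_k highest tile-level risk scores."""
    if not tile_scores:
        raise ValueError("slide has no scored tiles")
    k = min(top_k, len(tile_scores))
    top = sorted(tile_scores, reverse=True)[:k]
    return sum(top) / k

# Example: a slide with mostly low-risk tiles and a few high-risk ones.
scores = [0.1, 0.2, 0.15, 0.9, 0.85, 0.12]
trs = tumour_risk_score(scores, top_k=3)
```

Averaging only the highest-scoring tiles lets a few strongly prognostic regions dominate the slide-level score, which suits the weak-supervision setting where only a patient-level outcome label is available.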

Results Survival analysis revealed that TRS was an independent prognosticator in both the Zhongshan cohort (p<0.0001) and TCGA cohort (p=0.0003). The predictive ability of TRS was superior to and independent of clinical staging systems, and TRS could evenly stratify patients into up to five groups with significantly different prognoses. Notably, sinusoidal capillarisation, prominent nucleoli and karyotheca, the nucleus/cytoplasm ratio and infiltrating inflammatory cells were identified as the main underlying features of TRS. The multi-omics data of TCGA HCC hinted at the relevance of TRS to tumour immune infiltration and genetic alterations such as FAT3 and RYR2 mutations.
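The "up to five groups" stratification above can be sketched as quintile binning by TRS. This is an illustrative example only (the paper's exact cut-points are in the full text); the function name and the input values are made up.

```python
# Illustrative sketch: stratify patients into five risk groups (quintiles)
# by their tumour risk score (TRS). Group 1 = lowest risk, 5 = highest.

def quintile_groups(trs_values):
    """Assign each patient a 1-based quintile group by TRS rank."""
    n = len(trs_values)
    order = sorted(range(n), key=lambda i: trs_values[i])
    groups = [0] * n
    for rank, idx in enumerate(order):
        groups[idx] = rank * 5 // n + 1  # quintile index, 1-based
    return groups

trs = [0.1, 0.9, 0.4, 0.7, 0.2, 0.55, 0.33, 0.81, 0.05, 0.62]
risk_groups = quintile_groups(trs)
```

In practice each group would then be compared with Kaplan-Meier curves and a log-rank test to confirm the significantly different prognoses reported above.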

Conclusion Our deep learning framework is an effective and labour-saving method for decoding pathological images, providing a valuable means for HCC risk stratification and precise patient treatment.

  • cancer
  • liver


  • J-YS, XW, G-YD and ZD are joint first authors.

  • Contributors QG and XL conceived and directed the project. JS and GD collected the original slides and manually labelled whole-slide images, and JH and GY checked the annotation. XW led the deep learning algorithm development and evaluation. ZD, ZG, YZ, LZ and HY wrote the code for different tasks. LM collected and analysed the TCGA data. GY helped to scan all the hepatocellular carcinoma slides. XW, ZD and AK provided clinical guidance. HY, LW and LA contributed to the analysis of the data. JZ, YC and JF provided strategic guidance. JS and XW wrote the manuscript with the assistance and feedback of the other co-authors.

  • Funding This work was supported by the National Natural Science Foundation of China (No. 91859105, 8196112802, 81872321 and 81802302), National Key R&D Program of China (2018YFC0116500), Basic Research Project from Technology Commission of Shanghai Municipality (No. 17JC1402200), Shanghai Municipal Key Clinical Specialty and Program of Shanghai Academic Research Leader (19XD1420700).

  • Competing interests None declared.

  • Patient and public involvement Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.

  • Patient consent for publication Not required.

  • Ethics approval Ethical approval was obtained from the Zhongshan Hospital Research Ethics Committee.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data availability statement The data from Zhongshan Hospital that support the findings of this study are available upon reasonable request from the corresponding author (QG). The data from Zhongshan Hospital are not publicly available, because they contain protected patient privacy information. The external validation TCGA data set is publicly available at the TCGA portal. We provide a manifest linking to the sample IDs considered in the study, and we also provide annotated files of TCGA tumour regions. Code availability: all code related to this method was written in Python. Custom code related to the image extraction, preprocessing pipeline, deep-learning model builder, data provider and experimenter driver is available.
