Editorial

Virtual tissue staining in pathology using machine learning

Pages 987-989 | Received 03 Oct 2022, Accepted 25 Nov 2022, Published online: 29 Nov 2022

1. Introduction

Pathology is the medical discipline concerned with diagnosing and studying diseases. By recognizing structural histological alterations, pathologists acquire valuable information on how these changes affect cellular and tissue function. Pathologists perform their evaluation by examining histologically stained tissue mounted on a glass slide under an optical microscope or, in recent years, a digitized version of the histological slide (i.e. digital whole slide imaging, WSI).

The long-established pathology workflow consists of a series of processes to prepare stained tissue samples, involving fixation, processing, embedding, sectioning, and staining [Citation1]. Staining is used to highlight important features of the tissue and to enhance tissue contrast. It is generally a time-consuming, laborious process that must be performed in dedicated laboratory infrastructure by trained technicians because of the toxicity of most chemical staining reagents. The semi-automated or manual staining processes and the use of different chemical reagents lead to high technical variability in sample preparation, which sometimes causes diagnostic challenges.

Furthermore, the staining process distorts the tissue and precludes both additional staining of the same section and further molecular analysis on it. This is highly important in small tissue biopsies of diagnostically challenging cases, where multiple stains are often needed, followed by ancillary tests (e.g. DNA/RNA sequencing) that may be required to reach a diagnosis. If all of the biopsy material is consumed by staining, such molecular analysis cannot be performed.

2. Advancing pathology through virtual staining

Numerous optical imaging methods with alternative contrast mechanisms have been explored over the last decades [Citation2]. Most of these were aimed at eliminating the tissue fixation step and providing an intraoperative or bedside instrument for tumor margin assessment during surgery [Citation3]. Some of these methods have also augmented their results with a post-processing step that generates hematoxylin and eosin (H&E)-like images [Citation4,Citation5]. H&E is the principal stain in pathology, providing high contrast for the nucleus, cytoplasm, and extracellular constituents. However, the quality of these pseudo-H&E images usually lags behind the quality that pathologists are accustomed to working with, and they have limitations in representing pigments [Citation6]. Furthermore, this type of pseudo-staining is in general much more challenging to apply successfully to stains other than H&E.

Recently, our research group introduced a method to virtually stain autofluorescence images of unstained tissue sections, eliminating the need for chemical staining while creating high-quality virtually stained slides (see Figure 1) [Citation3,Citation7]. In this virtual staining workflow, we acquired autofluorescence images of unstained tissue slides using a conventional fluorescence scanning microscope. Next, we used a deep convolutional neural network (CNN), trained using the concept of generative adversarial networks (GANs) [Citation8], to learn the transformation from a label-free, unstained autofluorescence input image to the corresponding bright-field image of the histologically stained version of the same sample. The network output consisted of virtually stained images that closely matched the images of the same tissue samples labeled with different stains, including H&E, Jones, Masson’s trichrome, and PAS (periodic acid–Schiff). The virtually stained slides achieved a high degree of agreement with the histologically stained images of the same samples when blindly evaluated by a group of pathologists, while significantly reducing the staining time, manual labor, and cost associated with lab-based chemical staining of tissue. This virtual staining framework has also been applied to other contrast mechanisms, including holography and quantitative phase imaging [Citation9] as well as reflectance confocal microscopy [Citation10].
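
To make the image-to-image mapping concrete, the sketch below outlines a pix2pix-style training step in PyTorch, in which a generator learns to map multi-channel autofluorescence patches to bright-field images of the chemically stained tissue, supervised by a discriminator and a pixel-wise L1 loss. The Generator and Discriminator classes, channel counts, and loss weighting are illustrative assumptions for a minimal sketch, not the authors' actual architecture or training code.

import torch
import torch.nn as nn

class Generator(nn.Module):
    # Toy stand-in for the image-to-image generator used for virtual staining.
    def __init__(self, in_ch=4, out_ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, out_ch, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    # Toy PatchGAN-style discriminator scoring (input, output) pairs.
    def __init__(self, in_ch=4 + 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),
        )

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
l1, bce = nn.L1Loss(), nn.BCEWithLogitsLoss()

# x: 4-channel autofluorescence patch (e.g. DAPI/FITC/TxRed/Cy5 channels),
# y: co-registered bright-field patch of the chemically stained section.
x = torch.rand(1, 4, 256, 256)
y = torch.rand(1, 3, 256, 256)

# Discriminator step: distinguish real stained patches from generated ones.
fake = G(x).detach()
pred_real, pred_fake = D(x, y), D(x, fake)
loss_d = bce(pred_real, torch.ones_like(pred_real)) + \
         bce(pred_fake, torch.zeros_like(pred_fake))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# Generator step: adversarial loss plus pixel-wise fidelity to the target stain.
fake = G(x)
pred_fake = D(x, fake)
loss_g = bce(pred_fake, torch.ones_like(pred_fake)) + 100.0 * l1(fake, y)
opt_g.zero_grad(); loss_g.backward(); opt_g.step()

In practice, the key requirement for such supervised training is that the autofluorescence inputs and the chemically stained target images be accurately co-registered, so that the pixel-wise loss compares corresponding tissue structures.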

Figure 1. Illustration of the virtual histology workflow using deep learning. The unstained tissue slide is imaged by a standard optical microscope with different fluorescence channels (DAPI/FITC/TxRed/Cy5, grayscale images). A GAN-based CNN transforms the unstained tissue slide into a virtually stained image [a: Jones stain], a digital blend of histological stains [b: H&E, Jones, and Masson’s trichrome stains], or an IHC stain [c: HER2 stain].

The computational nature of this deep learning-based virtual staining technology enabled us to generate stains that would be impossible to create using traditional histochemical staining [Citation11]. For example, we used what we refer to as a ‘digital staining matrix,’ which allowed us to generate and digitally blend multiple stains using a single deep neural network by specifying, at the pixel level, which stain should be performed [Citation11]. Not only could this novel framework be used to perform multiple stains on a single tissue section, it could also be used to create micro-structured stains, digitally staining different areas of the label-free tissue with different stains. Furthermore, the digital staining matrix enabled these stains to be blended together by setting the encoding matrix to a mixture of the possible stains. This technology can be used to distill the most relevant information from the various virtual stains being performed, so that pathologists obtain more diagnostically relevant information while reducing their slide examination time.
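
The sketch below illustrates one plausible way to realize such a digital staining matrix: a per-pixel map of stain weights is concatenated to the autofluorescence channels, so that a single conditioned generator can render different stains, or blends of stains, in different regions of the same section. The stain set, helper function, and channel layout are hypothetical, and the Generator class refers to the illustrative model in the earlier sketch rather than the published implementation.

import torch

STAINS = ["H&E", "Jones", "Masson"]  # illustrative stain set and ordering

def make_stain_matrix(height, width, regions):
    # regions: list of ((row_slice, col_slice), {stain_name: weight}) pairs.
    # Weights within a region that sum to 1 request a blend of stains there.
    m = torch.zeros(len(STAINS), height, width)
    for (rows, cols), weights in regions:
        for stain, w in weights.items():
            m[STAINS.index(stain), rows, cols] = w
    return m

# Example: left half rendered as pure H&E, right half as a 50/50 blend of
# Jones and Masson's trichrome (micro-structured, per-pixel stain selection).
H, W = 256, 256
stain_matrix = make_stain_matrix(H, W, [
    ((slice(None), slice(0, W // 2)), {"H&E": 1.0}),
    ((slice(None), slice(W // 2, W)), {"Jones": 0.5, "Masson": 0.5}),
])

autofluorescence = torch.rand(1, 4, H, W)  # 4 label-free input channels
conditioned_input = torch.cat(
    [autofluorescence, stain_matrix.unsqueeze(0)], dim=1  # 4 + 3 = 7 channels
)
# A generator trained with this conditioning (e.g. Generator(in_ch=7) from the
# earlier sketch) would then render the requested stain(s) region by region:
# virtual_image = G(conditioned_input)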

Building on the success of virtual staining of label-free tissue sections, we also created neural networks that enable stain-to-stain transformations [Citation12,Citation13]. This deep learning-based framework digitally transforms existing images of a tissue biopsy stained with one type of stain into images of many other stain types. A stain-to-stain transformation takes less than one minute per tissue sample, as opposed to several hours or even more than a day when performed by human experts, and it does not require additional tissue sections. This speed advantage enables faster preliminary diagnoses for cases that require special stains, while also providing significant cost savings.
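
As a rough illustration of how such a stain-to-stain transformation can run in well under a minute per sample, the sketch below tiles an existing H&E image, passes each tile through a trained transformation network, and stitches the outputs back together. The non-overlapping tiling scheme, tile size, and reuse of the illustrative Generator are assumptions for this sketch, not the published implementation.

import torch

def stain_to_stain(he_image, net, tile=256):
    # he_image: (3, H, W) tensor of an existing H&E image, with H and W
    # assumed to be multiples of `tile` for this simple non-overlapping scheme.
    _, H, W = he_image.shape
    out = torch.zeros_like(he_image)
    with torch.no_grad():
        for r in range(0, H, tile):
            for c in range(0, W, tile):
                patch = he_image[:, r:r + tile, c:c + tile].unsqueeze(0)
                out[:, r:r + tile, c:c + tile] = net(patch).squeeze(0)
    return out

# Usage with the toy generator from the first sketch, retargeted to RGB input
# (in reality the network would be trained on registered H&E/special-stain pairs):
# net = Generator(in_ch=3, out_ch=3)
# virtual_special_stain = stain_to_stain(he_rgb_tensor, net)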

Another recent advancement of our virtual staining methodology was the introduction of virtual immunohistochemistry (IHC) staining [Citation14]. IHC is one of the pillars of modern diagnostic pathology and a fundamental research tool in both pathology and translational research laboratories. Conventional IHC staining is a delicate process that requires accurate control of time, temperature, and reagent concentrations at each tissue handling step; in fact, IHC stains have technical failure rates of up to 30% [Citation15]. Focusing on HER2, a pivotal breast cancer-related protein, we generated virtual HER2 IHC images from the autofluorescence images of unlabeled breast tissue sections, matching the bright-field images captured after standard IHC staining while reducing the turnaround time to minutes.

3. Future developments and utilization of virtual histology technology

The ability to virtually stain label-free tissue sections can potentially restructure the clinical workflow in pathology (Table 1). The current virtual staining process takes a few minutes per slide, and this image synthesis and virtual stain creation time could be dramatically shortened with dedicated hardware to reach real-time performance, which would be especially useful in intraoperative pathology consultation. By applying multiple stains to a single tissue section, alongside the added capabilities of stain blending, synthesis, and micro-structured virtual staining, we will be able to create novel stains that may better highlight cellular structures and organelles. Pathologists will be able to create customized stains that maximize their morphological diagnostic abilities while minimizing examination time.

Table 1. Comparison of conventional and virtual tissue staining.

We believe that this deep learning-based virtual staining framework paves the way for new applications in the life sciences and biomedical diagnostics. For example, the significantly reduced IHC staining time could help clinicians provide better patient care, especially in short-staffed pathology departments where IHC can take nearly a week to perform, resulting in significant delays in patient treatment. Furthermore, this technology could potentially be used to virtually stain cellular elements that current IHC methods fail to highlight, such as heavily masked antigens and proteins with low expression levels, and could possibly assist in the detection of genomic aberrations (e.g. oncogene amplifications, deletions, and fusions) that otherwise require expensive ancillary tests not available in many pathology departments.

In summary, virtual staining technology has transformative potential in tissue-based diagnostics, helping clinicians provide better patient care while reducing cost, labor, and time-to-diagnosis.

Declaration of interest

A.O. has pending patent applications on virtual staining of tissue and is a co-founder of a company (Pictor Labs) that aims to commercialize virtual tissue staining related technologies. The authors have no other relevant affiliations or financial involvement with any organization or entity with a financial interest in or financial conflict with the subject matter or materials discussed in the manuscript. This includes employment, consultancies, honoraria, stock ownership or options, expert testimony, grants or patents received or pending, or royalties.

Reviewers disclosure

Peer reviewers on this manuscript have no relevant financial relationships or otherwise to disclose.

Additional information

Funding

The authors acknowledge funding from the US National Science Foundation (PI: Aydogan Ozcan).

References

  • Alturkistani HA, Tashkandi FM, Mohammedsaleh ZM. Histological Stains: a literature review and case study. Glob J Health Sci. 2015;8(3):72–79.
  • Gurcan MN, Boucheron LE, Can A, et al. Histopathological image analysis: a review. IEEE Rev Biomed Eng. 2009;2:147–171.
  • Rivenson Y, de Haan K, Wallace WD, et al. Emerging advances to transform histopathology using virtual staining. BME Frontiers. 2020. https://spj.sciencemag.org/journals/bmef/2020/9647163/
  • Kang L, Li X, Zhang Y, et al. Deep learning enables ultraviolet photoacoustic microscopy based histological imaging with near real-time virtual staining. Photoacoustics. 2022;25:100308.
  • Zhang Y, Kang L, Wong IHM, et al. High-throughput, label-free and slide-free histological imaging by computational microscopy and unsupervised learning. Adv Sci. 2022;9:e2102358.
  • Mayerich D, Walsh MJ, Kadjacsy-Balla A, et al. Stain-less staining for computed histopathology. Technology. 2015;3(1):27–31.
  • Rivenson Y, Wang H, Wei Z, et al. Virtual histological staining of unlabelled tissue-autofluorescence images via deep learning. Nat Biomed Eng. 2019;3(6):466–477.
  • Goodfellow IJ, Pouget-Abadie J, Mirza M, et al. Generative adversarial networks. arXiv preprint arXiv:1406.2661. 2014. http://arxiv.org/abs/1406.2661.
  • Rivenson Y, Liu T, Wei Z, et al. PhaseStain: the digital staining of label-free quantitative phase microscopy images using deep learning. Light Sci Appl. 2019;8(1):23.
  • Li J, Garfinkel J, Zhang X, et al. Biopsy-free in vivo virtual histology of skin using deep learning. Light Sci Appl. 2021;10(1):233.
  • Zhang Y, de Haan K, Rivenson Y, et al. Digital synthesis of histological stains using micro-structured and multiplexed virtual staining of label-free tissue. Light Sci Appl. 2020;9(1):78.
  • de Haan K, Zhang Y, Zuckerman JE, et al. Deep learning-based transformation of H&E stained tissues into special stains. Nat Commun. 2021;12(1):4884.
  • Yang X, Bai B, Zhang Y, et al. Virtual stain transfer in histology via cascaded deep neural networks. ACS Photonics. 2022;9(9):3134–3143.
  • Bai B, Wang H, Li Y, et al. Label-free virtual HER2 immunohistochemical staining of breast tissue using deep learning. BME Frontiers. 2022. https://spj.sciencemag.org/journals/bmef/2022/9786242/
  • Kim S-W, Roh J, Park C-S. Immunohistochemistry for pathologists: protocols, pitfalls, and tips. J Pathol Transl Med. 2016;50(6):411–418.
