Abstract
Toxicologic pathology is one of the most valuable fields contributing to the advancement of animal and human health. With the ever-changing technological and economic environment, the basic skill set with which pathologists are equipped may require refinement to address current and future needs. Periodically, pathologists must add relevant, new skills to their toolbox. The Career Development and Outreach Committee of the Society of Toxicologic Pathology (STP) sponsored a career development workshop entitled “Looking Forward: Cutting-edge Technologies and Skills for Pathologists in the Future” in conjunction with the STP 38th Annual Symposium. Experts were chosen to speak on artificial intelligence, clustered regularly interspaced short palindromic repeats technology, microRNAs, and next-generation sequencing. This article provides a summary of the talks presented at the workshop.
Introduction
Toxicologic pathology and toxicologic pathologists play a major role in the advancement of human and animal health. 1–3 The role of the toxicologic pathologist is ever changing with advances in our understanding of disease mechanisms and rapid technological progress leading to new tools. Over the years, the pathologist’s role has shifted from an emphasis on routine clinical pathology end points and morphological characterization of gross and microscopic tissue changes to an era of increasing pressure to integrate large amounts of molecular pathology data with traditional toxicologic pathology end points. This has been well articulated in a publication by Dr Maronpot. 1 He emphasizes the importance of organizing continuing education (CE) courses to create awareness of, and train toxicologic pathologists in, new and emerging fields, as most training programs are currently not designed to incorporate newer technologies and tools into their curricula. Such CE courses will enable toxicologic pathologists to stay current and to be members and/or leaders of the multidisciplinary teams engaged in advancing human and animal health. In this regard, the Career Development and Outreach Committee organized a career development workshop entitled “Looking Forward: Cutting-edge Technologies and Skills for Pathologists in the Future” as part of the activities at the 2019 Society of Toxicologic Pathology (STP) Annual Symposium in Raleigh, North Carolina. Oliver Turner, BSc (Hons), BVSc, MRCVS, PhD, DACVP, DABT, opened the workshop with a talk on “Artificial Intelligence, Pathologist’s Friend or Foe?” This was followed by a talk on clustered regularly interspaced short palindromic repeats (CRISPR) technology by Channabasavaiah Gurumurthy, BVSc, MVSc, PhD, Exec MBA.
Rebecca Kohnken, DVM, PhD, DACVP, presented on “microRNA: biomarkers and therapies,” while Ramesh Kovi, BVSc&AH, MVSc, PhD, DACVP, DABT, spoke on “Next-Generation Sequencing.” The workshop concluded with a panel discussion involving all the speakers and an enthusiastic audience.
Artificial Intelligence, Pathologist’s Friend or Foe
Dr Oliver Turner (Novartis) tackled the provocative field of artificial intelligence (AI). In his presentation, he considered whether AI should be viewed as “friend” or “foe” to the pathologist by offering answers to the following questions: What is AI? What are some of the current applications of AI in pathology and toxicologic pathology? What are the perceived advantages and limitations of AI? How and where can one obtain training? And how will it affect a pathologist’s career?
John McCarthy first defined AI in 1956 as “the science and engineering of making intelligent machines.” 4 This definition still holds for weak (single-task) or narrow AI (ANI) but is often conflated with strong, deep, or general AI (AGI), in which AI’s ability to mimic human intelligence and/or behavior is indistinguishable from that of a human. Dr Turner made the point that all existing AI systems are ANI and that it will likely be decades to centuries before AGI is realized, if ever. Important and relevant terms within the world of AI include machine learning (ML), which in turn includes neural networks, which in turn include deep learning. Since approximately 2012, the convolutional neural network (CNN) subtype has become the de facto standard in image recognition and is approaching human performance on a number of tasks. 5 These systems function by learning relevant features directly from huge image training sets, often containing millions of images. Dr Turner introduced a variety of CNNs and gave an overview of how they work.
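The convolution operation at the heart of a CNN can be illustrated with a minimal sketch (a generic illustration, not any of the networks Dr Turner described): a small filter slides across an image and responds strongly wherever the local pixel pattern matches the filter.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution: slide the kernel over the image
    and sum the elementwise products at each position."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    oh, ow = ih - kh + 1, iw - kw + 1
    out = np.zeros((oh, ow))
    for y in range(oh):
        for x in range(ow):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# A hand-designed vertical-edge filter applied to a tiny binary "image"
image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=float)
kernel = np.array([[-1, 1],
                   [-1, 1]], dtype=float)
response = conv2d(image, kernel)  # strong response at the 0-to-1 edge
```

A real CNN learns the values of thousands of such filters directly from labeled training images rather than using hand-designed kernels.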
Current applications of AI in medical disciplines such as oncology and radiology are many and varied. Those in pathology and toxicology are also increasing in number. There are individuals who now herald AI as the next great revolution in pathology. 6 In recent years, with its application to drug discovery, toxicology, and related fields, ML has led to what some see as a “deep learning revolution” in drug discovery and development. Dr Turner highlighted some work that he has been involved with at Novartis, demonstrating that CNNs can mine normal histology with a high degree of accuracy and offer a path for new possibilities in data mining, establishing morphological signatures, correlations with other modalities (omics), and content-based image retrieval.
Perceived advantages and limitations to AI, deep learning, and digital pathology were considered from 2 recent publications. 7,8 On balance, it seems that AI is viewed as having the potential to favorably modify the pathologist’s role in medicine but has yet to extensively impact the landscape.
Opportunities for pathologists to continue their education in this field are myriad: from traditional books and conferences to online tools such as webinars, YouTube content, and websites such as https://brilliant.org/ and https://openai.com/. It is also possible to learn how to write computer code and build one’s own neural networks using languages and libraries such as Python (https://www.learnpython.org/) or TensorFlow (https://www.tensorflow.org/).
It was proposed that elements of AI are already impacting many pathologists’ careers, for example, in the fields of image analysis, 3D reconstruction, multispectral imaging, and augmented reality, and that education and regulation in these areas will only increase. For example, the 2018-2022 American College of Veterinary Pathologists (ACVP) strategic plan has as its #1 goal to “Shape the future practice of veterinary pathology in the age of digital pathology, AI and advanced molecular tools.” Dr Turner cited quotes from recent articles to show how other medical professions are dealing with this issue, for example, “the only radiologists whose jobs may be threatened are the ones who refuse to work with AI.” 9
In conclusion, Dr Turner suggested that we consider Amara’s law, “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run,” but also acknowledged several important concepts, notably: AI is here to stay; there is huge potential to increase productivity and accuracy across a wide range of pathology end points; and the next stage will see AI utilized in the pathology decision-making process, with more and more qualitative decisions being impacted, although we are still at “ground zero.” At the policy level, there will be many aspects to consider; for example, it will be important to establish who is responsible for a misdiagnosis leading to potential health risks for humans: the software engineer/vendor or the pathologist?
Dr Turner proposed that AI should be seen not as a foe but as a friend, albeit with caveats. Toxicologic pathologists should take the view that AI and ML will synergistically improve their individual and collective diagnostic efficacy and competency rather than replace them in the future.
Clustered Regularly Interspaced Short Palindromic Repeats Technology
Dr Channabasavaiah Gurumurthy spoke on how mouse transgenesis can be redefined using CRISPR technology. He explained the different types of gene editing and their pros and cons, how older gene-editing methods compare to CRISPR, how CRISPR can be used to make rapid progress in science, and how editing efficiency can be increased.
History of Genetic Engineering
Genetic engineering is a highly specialized field of research concerned with technologies to modify the genomes of organisms. The use of genetic engineering technologies began in the 1970s. Gene-modified laboratory animal models have greatly advanced biomedical research and have promoted the understanding of how genes function. Genetically engineered models constitute 3 broad categories: (1) transgenic models, in which a new DNA copy is inserted into the genome; 10 (2) knockout (KO) models, in which a specific gene is deleted or disrupted; 11 and (3) knock-in (KI) models, in which a new DNA copy is replaced/inserted at a specific genomic locus. 12 Transgenic models could be created simply by injecting recombinant DNA into fertilized zygotes and implanting the zygotes into pseudopregnant mice via surgical techniques, whereas creating KO and KI models involves the use of embryonic stem (ES) cells: recombinant targeting DNA is first introduced at a specific site in the genome via homologous recombination in ES cells, the cells are injected into early-stage embryos (blastocysts), and the embryos are implanted into pseudopregnant females to create founder KO/KI lines. These techniques were established using the laboratory mouse as a model system, in which ES cells perform robustly. The 2007 Nobel Prize in Physiology or Medicine was awarded to the developers of gene-targeting technologies in the mouse.
The CRISPR-Cas9 Genome Editing Technology
The CRISPR-Cas9 tools for genome editing were introduced in 2012 to 2013, 13–15 which greatly augmented and simplified the traditional genetic engineering technologies. The CRISPR system consists of a guide RNA, 20 bases long, that identifies the target genomic region and recruits the Cas9 nuclease, which creates a double-strand break at that site; repair of the break by the cell’s endogenous machinery is then exploited to introduce the desired genetic change.
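As a simple illustration of how target sites are identified (a sketch with illustrative sequences and function names, not a production guide-design tool), candidate protospacers for SpCas9 can be found by scanning a DNA sequence for 20-base stretches immediately followed by an NGG protospacer-adjacent motif (PAM):

```python
# Sketch: enumerate candidate CRISPR-Cas9 target sites on the forward strand.
# SpCas9 requires an "NGG" PAM immediately 3' of the 20-base protospacer;
# the guide RNA is designed to match the protospacer sequence.

def find_protospacers(seq, guide_len=20):
    """Return (position, protospacer, PAM) tuples for NGG PAM sites."""
    sites = []
    seq = seq.upper()
    for i in range(len(seq) - guide_len - 2):
        pam = seq[i + guide_len: i + guide_len + 3]
        if pam[1:] == "GG":  # any base at N, then GG
            sites.append((i, seq[i:i + guide_len], pam))
    return sites

dna = "ATGCATGCATGCATGCATGCAGGTTTT"  # illustrative sequence
hits = find_protospacers(dna)       # one 20-mer followed by "AGG"
```

Real guide-design tools additionally scan the reverse strand and score candidates for off-target potential across the whole genome.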
Newer Improvements in Genetic Engineering Technologies Employing the CRISPR Tool
To address the need for newer, improved CRISPR tools, Dr Gurumurthy’s laboratory, in collaboration with Dr Masato Ohtsuka, Tokai University, Japan, developed 2 new improvements in CRISPR-based methods, including the Easi-CRISPR technique described below.
Usefulness of Easi-CRISPR Technique in Biomedical Research
The most commonly and widely used mouse models are conditional KO models involving the Cre-loxP system, in which loxP sites flanking a critical exon allow the gene to be deleted by Cre recombinase in a tissue- or time-specific manner. Easi-CRISPR, which uses long single-stranded DNA donors, enables such conditional (floxed) alleles to be generated in a single step with high efficiency.
MicroRNAs: Biomarkers and Therapeutics
Dr Rebecca Kohnken spoke on microRNAs (miRs), specifically addressing their role in diagnosis and prognosis, their usefulness in understanding disease mechanisms, and their use as pharmacodynamic and/or toxicity biomarkers and as therapeutics. She concluded by discussing several miRs that are currently under investigation in clinical trials, some of the delivery systems, and preclinical safety assessment.
Twenty-five years after the discovery of small noncoding regulatory miRs, the field of RNA biology has greatly expanded, revealing a vast regulatory network that governs all aspects of cellular function. Their potential uses in toxicologic pathology as pharmacodynamic and/or toxicity biomarkers, in medicine as diagnostic and/or prognostic biomarkers, and as therapeutics are intriguing and of great interest to drug development scientists.
Micro-RNAs are small (18-22 nucleotide) single-stranded RNA molecules with a wide diversity of cellular and extracellular functions. The miR sequences are found within coding and noncoding regions of the genome. Following transcription by RNA polymerase II, serial cleavage and processing leads to an miR duplex that then integrates with the RNA-induced silencing complex (RISC). The seed sequence of the single-stranded mature miR within the RISC then binds RNA targets. 26 It was previously thought that miRs bound either in perfect or imperfect complementarity to the 3′ untranslated region of a target messenger RNA (mRNA) and caused degradation or translational repression, respectively, of the target transcript. It is now known that miRs can have more complex effects on their targets, including cleavage, promotion of mRNA decay, and others. 27 Importantly, a single miR can interact with and regulate many targets (more than 100 on average), while most transcripts of protein-coding genes can be regulated by multiple mature miRs. 28
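The seed-based target recognition described above can be sketched in a few lines (sequences and function names here are illustrative and not drawn from a curated target database): the reverse complement of the miR seed (bases 2-8 of the mature miR) is searched against a 3′ untranslated region.

```python
# Sketch of seed-match scanning: the miR "seed" (bases 2-8 of the mature
# miR) base-pairs with complementary sites in target mRNAs.

COMPLEMENT = str.maketrans("AUCG", "UAGC")

def seed_sites(mir, utr):
    """Return 0-based positions in `utr` that match the reverse
    complement of the miR seed (positions 2-8, 1-based)."""
    seed = mir[1:8]                            # bases 2-8 of the mature miR
    target = seed.translate(COMPLEMENT)[::-1]  # reverse complement
    return [i for i in range(len(utr) - len(target) + 1)
            if utr[i:i + len(target)] == target]

mir = "UGGAGUGUGACAAUGGUGUUUG"  # illustrative miR-122-like sequence
utr = "AAACACUCCAAAACACUCCUUU"  # illustrative 3' UTR with two seed sites
sites = seed_sites(mir, utr)
```

Real target-prediction tools layer additional evidence on top of seed matching, such as site conservation and local sequence context, which is why a single miR can be assigned well over 100 targets.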
Micro-RNAs have varied uses as biomarkers for disease diagnosis and/or pharmacologic response in an early discovery setting, of toxicity in a drug development setting, and of disease progression, treatment response, and prognosis in a clinical setting. The characteristics of miRs that make them ideal for these applications include cell type and disease specificity, noninvasive methods of collection, and standardized and rapid quantitative analytical methods. 29 The miRs are accessible in many biofluids, are released from cells during injury from various toxic mechanisms, and have improved stability when compared to traditional protein biomarkers. 30 Furthermore, miRs have relatively conserved sequences across species, thus making quantitative analysis in a preclinical setting manageable. Specific applications of miRs in the toxicology field are varied. Hepatotoxicity, nephrotoxicity, and cardiotoxicity, in particular, have had the largest advances in quantitative miR profiling for xenobiotic-induced alterations in miR expression. Micro-RNA-122, for example, is best studied as a biomarker of hepatocellular injury and has been shown to correlate with cellular injury from multiple toxic mechanisms in a manner more consistent and more sensitive than increases in serum aminotransferase activity. 31
There are currently several early-phase clinical trials for miR-based therapies for cancer as well as other diseases, including an inhibitor of miR-155 for leukemia, miR-122 inhibitor for hepatitis C virus infection, and miR-29b mimic for fibrosis. Based on the miR expression profile of the indication of interest, it may be advantageous to either rescue or inhibit the expression of a single miR. For example, there are several approaches available and in development to replace an miR with tumor-suppressive function using miR mimics as well as to inhibit the function of an oncogenic miR with antagomir (anti-miR) or other inhibitor therapies.
Delivery of the active agent presents a significant challenge for the development of miR-based therapy. In general, RNA-based therapies have poor pharmacological properties, such as off-target effects, low stability in serum, and triggering of innate immune responses. 32 Strategies for overcoming low bioavailability and inefficient cellular delivery include chemical modifications, such as phosphorothioate oligonucleotide backbones, as well as liposomes, nanocells, and viral vectors. 33 These modifications and delivery systems also provide stability and protect the miR therapeutic from nuclease digestion in blood. However, weaknesses of in vivo delivery systems remain, particularly with regard to binding affinity to the target and off-target effects owing to the high doses required to achieve clinical effects.
Other important considerations for development of miR-based therapies include target organ microenvironment, context specificity, and toxicity. Toxicities primarily arise from the delivery system and may be related to immune activation; however, the potential for toxicity from the miR itself remains. Anti-miRs and any components used in delivery of these anti-miRs can be recognized by the innate immune system leading to immunostimulation and potential toxicity. RNAs are recognized by endolysosomal Toll-like receptors (TLRs), particularly TLR7 and TLR8, that bind single-stranded RNAs. 34 Liposomes and related delivery systems have been shown to result in various forms of immunotoxicity. 35 With regard to chemical modifications, sequence-independent toxicity may also manifest as inhibition of coagulation, activation of the complement cascade, cytokine release, and/or immune cell activation. 36
With the extensive uses of miRs as biomarkers in various fields and the great therapeutic potential of miR-based therapies, it will be exciting to see how development and regulatory strategies evolve for these applications.
Next-Generation Sequencing for Pathologists
Dr Ramesh Kovi (EPL, Inc), a toxicologic and molecular pathologist, provided an overview of the fundamentals of next-generation sequencing (NGS). Dr Kovi discussed the central dogma of molecular biology, which describes the unidirectional, residue-by-residue transfer of sequential information from DNA to RNA to proteins. 37 He further provided technical details and basic concepts about various types of sequencing, namely sequencing by chain termination (first-generation or Sanger sequencing) and sequencing by synthesis (NGS), along with various applications, recent advances, and limitations of NGS technologies.
With the discovery of the structure of DNA, 38 significant advances have been made in understanding the complexity and diversity of genomes in health and disease. Completion of the Human Genome Project revealed the need for more advanced technologies and data analysis tools to answer complex biological questions; however, limited throughput and the high cost of sequencing remained major barriers to the advancement of genomic studies. Next-generation sequencing is a DNA sequencing technology that uses massively parallel sequencing of many small DNA fragments to determine the nucleotide sequence. 39 The Sanger sequencing method is considered the “gold standard,” with a very low error rate of 1 in 100 000 and longer read lengths of up to 1 kb. However, the speed of sequencing and the amount of DNA sequence data generated with NGS, a high-throughput technology, are exponentially greater and are produced at significantly reduced cost. 40 Limitations of NGS include relatively higher error rates (∼0.1%-15%), shorter read lengths (<500 bp), and the requirement for sophisticated bioinformatics support.
An NGS experiment consists of a series of discrete steps (nucleic acid isolation, library preparation, cluster or bridge amplification, sequencing, and alignment and data analysis), each of which uniquely contributes to the overall quality of the data. 41 Therefore, quality assessment at each of these steps is critical to obtaining high-quality sequencing data. Sequencing quality metrics can provide important information about the accuracy of each step in NGS, including nucleic acid quality (dsDNA concentration by Qubit [ThermoFisher Scientific, San Jose, CA], rather than by the NanoDrop method, and fragment analysis and DNA integrity number by TapeStation [Agilent Technologies, Santa Clara, CA]), library pool quality (concentration, expected size, and adaptor duplications), base calling (Phred quality score, Q score), read alignment score, percentage mappability, read depth, and single-nucleotide variant (SNV) or copy number variation validation. To date, more than 15 000 genomes have been sequenced and deposited in National Center for Biotechnology Information (NCBI) genome repositories. 42 Approximately 15 PB of sequence data are generated globally every year, which poses an enormous challenge for data analysis as well as for the infrastructure required for storage and bioinformatics solutions. 43
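The Phred quality (Q) score mentioned above encodes the probability P that a base call is wrong as Q = -10 log10(P); in FASTQ files, each Q is stored as a single ASCII character (commonly with an offset of 33). A minimal sketch of decoding such a quality string:

```python
def phred_to_error_prob(qual_string, offset=33):
    """Decode a Phred+33 FASTQ quality string into per-base error
    probabilities: Q = -10*log10(P), so P = 10**(-Q/10)."""
    return [10 ** (-(ord(c) - offset) / 10) for c in qual_string]

# 'I' encodes Q40 (1 error in 10 000 calls); '+' encodes Q10 (1 in 10)
probs = phred_to_error_prob("I+")
```

A Q30 threshold (1 error in 1000 calls) is a common per-base quality cutoff when filtering raw reads.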
Data analysis pipelines have been evolving and improving in recent years to assess the quality of raw sequencing data, preprocessing steps, mapping quality, variant analysis, and visualization of the data. The algorithms and models are becoming increasingly complex in order to describe the multistage process of NGS and to address different types of artifacts by modeling. Several variant-calling algorithms have been developed, including SNV callers and unique molecular identifier (UMI)-based variant callers, to enable high-confidence detection of SNVs, indels, structural variants, and complex variants from sequence data. 44–46
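As a toy illustration of what an SNV caller does (thresholds and names here are illustrative; real callers model sequencing error statistically rather than using fixed cutoffs), a variant can be called from a pileup of base calls at one genomic position when read depth and alternate-allele fraction exceed minimum values:

```python
from collections import Counter

def call_snv(ref_base, pileup, min_depth=30, min_alt_frac=0.2):
    """Toy SNV call from a pileup of base calls at one position.
    Returns the alternate base if called, otherwise None."""
    depth = len(pileup)
    if depth < min_depth:
        return None  # insufficient coverage to call confidently
    counts = Counter(pileup)
    alt, alt_count = max(
        ((b, n) for b, n in counts.items() if b != ref_base),
        key=lambda x: x[1], default=(None, 0))
    if alt is not None and alt_count / depth >= min_alt_frac:
        return alt
    return None

pileup = "A" * 40 + "G" * 15      # 55 reads: 40 reference, 15 alternate
variant = call_snv("A", pileup)   # 15/55 is about 0.27, above the cutoff
```

This also shows why read depth matters: at low coverage, a true variant cannot be separated from sequencing error, which is one rationale for the per-platform depth recommendations discussed below.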
Whole-genome sequencing (WGS) is one of the most widely used applications of NGS. 47 However, other applications include whole-exome sequencing (WES), targeted sequencing, whole-transcriptome sequencing (RNAseq), small RNA sequencing (miRNAseq), whole-genome bisulfite sequencing (methylation profiling), mapping protein–DNA interactions (ChIPseq), and mitochondrial genome sequencing (mtDNAseq).
The key messages for pathologists from this presentation were that quality assessment at every step of NGS, randomization of samples, a matched genomic control (nontarget organ) from the same animal, adequate sequencing read depth or coverage for the platform used (at least 30× for WGS, 100× for WES, and variable depth for RNAseq/miRNAseq), and phenotypic anchoring of variants to the histomorphology of the region of interest are critical to generating high-quality sequencing data and deciphering the associated biological implications.
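The read-depth recommendations above can be related to sequencing throughput with the standard mean-coverage formula C = N × L / G (number of reads times read length, divided by genome size). A sketch with illustrative numbers:

```python
def mean_coverage(n_reads, read_length_bp, genome_size_bp):
    """Expected mean coverage: C = N * L / G."""
    return n_reads * read_length_bp / genome_size_bp

# Illustrative numbers: ~600 million 150-bp reads over a ~3 Gb genome
# give roughly the 30x depth suggested above for WGS.
cov = mean_coverage(600_000_000, 150, 3_000_000_000)  # 30.0
```

Actual coverage varies along the genome, so pipelines report the per-base depth distribution rather than this single average.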
Conclusion
In summary, this workshop provided valuable information on some of the cutting-edge technologies that will be an essential and integral part of toxicologic pathology in the near future. It also encouraged and provided information on the resources available to prepare pathologists for future challenges and to stay relevant in the current field of toxicologic pathology.
Footnotes
Acknowledgments
The authors thank STP’s Career Development and Outreach Committee members for providing critical inputs.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
