Abstract

Systems biology and synthetic biology are emerging disciplines which are becoming increasingly utilised in several areas of bioscience. Toxicology is beginning to benefit from systems biology and we suggest that in the future it will also benefit from synthetic biology. Thus, a new era is on the horizon. This review illustrates how a suite of innovative techniques and tools can be applied to understanding complex health and toxicology issues. We review limitations confronted by the traditional computational approaches to toxicology and epidemiology research, using polycyclic aromatic hydrocarbons (PAHs) and their effects on adverse birth outcomes as an illustrative example. We introduce how systems toxicology (and its subdisciplines, genomic, proteomic, and metabolomic toxicology) will help to overcome such limitations. In particular, we discuss the advantages and disadvantages of mathematical frameworks that computationally represent biological systems. Finally, we discuss the nascent discipline of synthetic biology and highlight relevant toxicology-centred applications of this technique, including improvements in personalised medicine. We conclude this review by presenting a number of opportunities and challenges that could shape the future of these rapidly evolving disciplines.

1. Introduction

Many areas of biological sciences and clinical medicine are benefitting from applying the emerging disciplines of systems biology [1–4] and synthetic biology [5]. Toxicology research is no different, and in recent years toxicological procedures have begun to incorporate a wide array of computational techniques and artificial biological approaches for assessing the toxicological risk of chemicals. Historically, population-based investigations of disease risk associated with environmental exposures relied on statistical associations for causal inference. The introduction of novel integrated approaches to toxicology investigations will sharpen our ability to distinguish causally relevant events between environmental exposures and disease outcomes. Systems biology is such an approach.

Systems biology encompasses a discipline that investigates the complex mechanisms underlying biological systems by treating the behaviour of genes, proteins, biochemical networks, and physiological responses as integrated parts within a whole system [6]. As a result, the term systems toxicology was coined to describe the application of systems biology approaches to toxicological studies [7]. In practice this approach involves collecting large data sets from an array of sources including genomic, biochemical, proteomic, and metabolomic data. These data are then used to inform computational models that are capable of examining quantitatively and qualitatively the behaviour of biological systems under a wide variety of conditions [8, 9]. The major advantage of this approach lies in the researcher’s ability to model a multitude of complex biochemical events, many of which occur simultaneously [10]. This contrasts with the reductionist approach of studying biological systems by focusing on a small component operating in isolation. Synthetic biology is another approach that could change the face of toxicology. It represents the interface between scientific disciplines such as chemical and electrical engineering, biology, bioengineering, and computational modeling. In fact, computational modeling is the glue that joins together the fields of systems and synthetic biology. Recent progress in this field has witnessed the engineering of synthetic genetic circuits [11], gene promoters [12], proteins [13], and a variety of synthetic biomolecules [14].

This review briefly outlines some of the more traditional approaches to toxicological research and will then discuss recent developments in systems and synthetic biology which are important for toxicology research. Systems biology and synthetic biology, which comprise modeling toxicological effects using a mathematical framework and artificial cellular networks/tissues, respectively, are the emerging face of toxicology research. In this review we primarily focus on the role of computational modeling in the future of toxicology research. The rationale for this focus is that it is increasingly necessary to integrate the vast data generated from systems toxicology into a cohesive computational framework [15]. We anticipate that such a framework will lead to an improved understanding of how individual toxicants or combinations of toxicants interact with intracellular, physiological, and whole-body biological systems [15]. This paradigm shift could lead to a reduction in the number of animals used in toxicant risk assessment, allow for the elusive modeling of the effects of complex mixtures, create a template for individual toxicological exposure assessment by age, gender, or genetic background, and improve risk assessment in the future.

2. Traditional Approaches for Toxicant-Health Outcomes Research

Some of the more traditional approaches to toxicology testing and risk assessment have relied heavily on in vitro and in vivo experiments to generate data to assess health outcomes. Dose-response investigations are commonly used to study the biological effect of toxicants on a cell culture, animal, or both over a period of time [16]. The types of negative biological responses can range from molecular and physiological perturbations, through alterations in the organism’s behaviour, to mortality. However, there exists a problem with this method, as the response curve can alter significantly when the species is changed. This is crucial because the utility of any toxicological test depends on its consistency and its potential to determine the extent of the hazard associated with exposure to humans [16]. Not only are there problems with integrating data from different species for one contaminant, but there are also issues when integrating data from different chemicals in different species.
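To make the dose-response concept concrete, the short Python sketch below fits a four-parameter Hill-type curve to a hypothetical set of concentration-response data; the data points, starting values, and the resulting EC50 are purely illustrative assumptions and are not drawn from any study cited here.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, bottom, top, ec50, slope):
    """Four-parameter log-logistic (Hill-type) dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (dose / ec50) ** slope)

# Hypothetical concentration (mg/L) and response (% of control) data
dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
response = np.array([98.0, 95.0, 88.0, 70.0, 45.0, 20.0, 8.0])

# Fit the curve; p0 gives rough starting guesses for the optimiser
params, _ = curve_fit(hill, dose, response, p0=[5.0, 100.0, 5.0, 1.0])
bottom, top, ec50, slope = params
print(f"Estimated EC50 = {ec50:.2f} mg/L (slope = {slope:.2f})")
```

Comparing curves (and EC50 values) fitted in this way across species is precisely where the consistency problem described above becomes visible.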

Toxicology studies investigating mixtures routinely employ statistical techniques to fit mathematical functions to toxicology data. These statistical techniques are based on certain assumptions. For example, Chen et al. (2012) used statistical nonlinear regression models to estimate acrylamide concentrations in French fries and the associated lifetime cancer risk [17]. Such methods are useful for identifying statistical associations; however, empirical models have limitations as they do not capture toxicant-toxicant interactions. Nor do they capture the interplay of such chemicals with physiological or intracellular biological mechanisms, or how those mechanisms depend on the level of toxicant exposure. To address this issue, toxicology research employs a number of mechanistic computational approaches. For example, the well-established physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) approach is capable of incorporating physiological mechanisms and predicting the rate of change in the amount of chemical in tissues, chemical distribution, metabolic turnover, and the excretion rate of toxicants in a wide variety of animals including humans; all of this information is encapsulated in the output from model simulations for a particular time course. A recent interesting application of PBPK modeling was to investigate variations in cytochrome P450-mediated pharmacokinetics between Chinese and Caucasian populations [18]. The model was able to predict plasma drug concentration-time profiles in both population groups.
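As a simplified illustration of the mechanistic philosophy behind PBPK modeling, the following sketch simulates a minimal flow-limited blood-liver model as a pair of ordinary differential equations; the compartment volumes, blood flow, partition coefficient, and clearance value are invented for illustration and are not taken from the studies cited above.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative (invented) physiological parameters
Q_liver = 90.0   # blood flow to the liver (L/h)
V_blood = 5.0    # blood volume (L)
V_liver = 1.8    # liver volume (L)
P_liver = 4.0    # liver:blood partition coefficient
CL_int = 30.0    # intrinsic hepatic clearance (L/h)

def pbpk(t, y):
    """Rate of change of toxicant amount (mg) in blood and liver."""
    A_blood, A_liver = y
    C_blood = A_blood / V_blood
    C_liver = A_liver / V_liver
    # Blood-to-liver delivery, liver-to-blood return, and hepatic metabolism
    dA_blood = Q_liver * (C_liver / P_liver - C_blood)
    dA_liver = Q_liver * (C_blood - C_liver / P_liver) - CL_int * C_liver / P_liver
    return [dA_blood, dA_liver]

# Single 10 mg bolus into blood, simulated over 24 h
sol = solve_ivp(pbpk, [0.0, 24.0], [10.0, 0.0], dense_output=True)
print("Blood concentration at 24 h:", sol.y[0, -1] / V_blood, "mg/L")
```

A full PBPK model simply extends this structure to many perfused tissues, with parameters scaled to the species or population of interest.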

3. Single Exposures with Multiple Outcomes or Multiple Exposures and Single Outcomes

One valuable translational aspect of toxicological studies is their capacity to inform risk assessment to improve human and ecological health. Risk assessment is directly dependent on the empirical data generated, whether at a benchtop or laptop. Current strategies for risk assessment are informed by toxicity and exposure estimates and are limited by gaps in our data and also areas of uncertainty. To deal with uncertainty, such as how mixtures of chemicals with similar structures or mechanisms of toxicity interact, assumptions have to be made. For example, one might assume that two endocrine disrupters with similar chemical structures would have an additive effect on the endocrine system. Indeed, Miller et al. found that coexposure to two endocrine disrupters, polychlorinated biphenyls (PCBs) and polybrominated diphenyl ethers (PBDEs), had additive effects on reducing thyroid hormones in developing rats [19]. However, that data set, where an assumed additivity was validated by experimental results, pertained only to thyroid hormone reductions. Moreover, the neurological effects of hypothyroidism are complex and oftentimes sex-specific [20]. Thus looking at only one biological outcome in one sex for a mixture of toxicants may be misleading in terms of understanding the total toxicological health effect of the mixture. In order to deal with this complexity, it is vital that we employ new methodologies that are transparent, based on clear assumptions, and built on a consistent mathematical framework that can be shared readily across disciplines.

Understanding whether complex mixtures of chemicals will have additive, synergistic, or antagonistic effects on biological pathways is critical for predicting toxicological consequences on populations [21, 22]. Yet it also remains one of the most difficult and often neglected areas of toxicology, simply because it is neither cost nor time efficient to run the necessary experimental conditions to consider every potential mixture of contaminants in different situations. In this scenario, computational modeling may offer hope for the future. It is possible to predict the effects of new chemicals based on chemical properties that they may share with previously well-characterised chemicals. For example, the effects of mixtures of PFOS have been characterised with respect to structure-functional responses and T3-related activity in cells [23], and such data may provide a useful building block when simulating other similar molecules using an in silico framework. The array of interactions that can be modelled with respect to additivity, synergy, and antagonism is beyond the scope of this review and is well discussed by Singh et al. (2014) in their review [24]. However, it is clear that anticipating and modeling potential interactions of mixtures in silico within a mathematically based framework is in essence one of the problems that in silico toxicology can begin to resolve.
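To indicate what such in silico mixture predictions can look like, the sketch below applies the two standard reference models, concentration addition and independent action, to two hypothetical chemicals with Hill-shaped concentration-response curves; all EC50, slope, and exposure values are placeholder assumptions.

```python
import numpy as np
from scipy.optimize import brentq

def effect(conc, ec50, slope):
    """Fractional effect (0-1) of a single chemical under a Hill model."""
    return conc**slope / (ec50**slope + conc**slope)

# Hypothetical single-chemical parameters and mixture concentrations (arbitrary units)
chem = {"A": {"ec50": 2.0, "slope": 1.5}, "B": {"ec50": 8.0, "slope": 1.0}}
mixture = {"A": 1.0, "B": 3.0}

# Independent action: individual effects combine like independent probabilities
e_single = {k: effect(c, **chem[k]) for k, c in mixture.items()}
e_ia = 1.0 - np.prod([1.0 - e for e in e_single.values()])

# Concentration addition: find the effect level at which the toxic units sum to 1
def toxic_unit_sum(e):
    total = 0.0
    for k, c in mixture.items():
        p = chem[k]
        ecx = p["ec50"] * (e / (1.0 - e)) ** (1.0 / p["slope"])  # conc giving effect e
        total += c / ecx
    return total - 1.0

e_ca = brentq(toxic_unit_sum, 1e-6, 1.0 - 1e-6)
print(f"Independent action: {e_ia:.2f}, concentration addition: {e_ca:.2f}")
```

Departures of measured mixture effects from either reference prediction are one common operational definition of synergy or antagonism.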

4. Systems Level Thinking Is Needed in Population-Based Investigations of Health Outcomes

Extrapolating toxicological data to understand human health effects is a complex issue. Firstly, there are few methods to study human health effects resulting from exposure to mixtures of toxicants, likely partly because of the scant data on mixtures of chemicals. As a result, epidemiologic risk estimation of toxicant mixture exposure is based on in vitro/in vivo toxicological equivalency factors (TEFs). This approach assumes interspecies equivalence in risk and that risks are additive (a minimal computational illustration of this additivity assumption is given at the end of this section). However, polycyclic aromatic hydrocarbons (PAHs), for example, can have synergistic or antagonistic effects depending on the individual PAH compound studied [25]. Secondly, toxicant effects on multiple organ systems present a huge multiscale and temporal challenge. Many different organs can be affected by PAHs. As a case in point, prenatal exposure to PAHs through maternal inhalation is associated with a wide range of fetotoxic effects, including intrauterine growth restriction [26], preterm delivery [27], DNA damage [28], shorter stature at the age of 3 [29], and neurocognitive impairments during childhood. In addition, some toxicological effects are only manifested years after exposure. For example, when a prenatally monitored group of newborns was followed to school age, the prenatal PAH exposure further impaired neurodevelopmental performance [30, 31] and increased the likelihood of asthma-related symptoms [32].

Thirdly, a number of environmental toxicants, in particular an emerging class of endocrine-disrupting chemicals, are observed to exert nonmonotonic dose responses as well as low dose effects, particularly when exposure occurs during early life [33]. Lastly, the timing of exposure during pregnancy influences the toxicity of the exposure. For example, exposure to benzo[a]pyrene (B[a]P) is most detrimental to the brain during the first trimester, whereas the fetal liver is most vulnerable to the toxicant during the second trimester [34]. A systems biology centred approach would overcome such temporal and tissue specific problems by integrating how the toxicant impacts the temporal behaviour of cells, tissues, and whole organ systems. With this comprehensive analysis the behaviour of a toxic substance(s) can be inferred depending on the species that it is interacting with. Thus, systems toxicology, which integrates traditional techniques within the systems biology paradigm, provides us with a means to tackle some of the major challenges in the field of toxicology. Recently, Warner and colleagues (2012) used a systems toxicology approach to examine the mechanisms which underpin species-specific sensitivity to 1,3,5-trinitroperhydro-1,3,5-triazine (RDX), a neurotoxicant [35]. Toxicity was quantified via transcriptional, morphological, and behavioural markers in zebrafish and fathead minnow fry exposed for 96 h to RDX concentrations ranging from 0.9 to 27.7 mg/L. Using this holistic approach, it was established that zebrafish and minnow fry had different degrees of sensitivity to this neurotoxicant. More recently, Lu and colleagues (2014) utilised the systems toxicology paradigm effectively when they integrated traditional toxicology approaches with transcriptomics and metabonomics to determine the mechanisms underpinning hepatic erythromycin estolate (EE) injury [36]. Hepatic microarray analysis of the EE-treated rats showed that differentially expressed genes were enriched in ATP-binding cassette (ABC) transporter, cell cycle, and p53 signaling pathways. Metabonomics analysis showed that EE exposure could disrupt amino acid metabolism, lipid metabolism, and nucleotide metabolism, which the authors suggest is a result of the EE toxicological effects on the liver through oxidative stress.
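Returning briefly to the TEF approach criticised at the start of this section, its additivity assumption can be stated computationally in a few lines; the sketch below uses placeholder congener names and TEF values chosen purely for illustration, not values from any published TEF scheme.

```python
# Hypothetical TEF scheme: potency of each congener relative to a reference compound
tef = {"reference_PAH": 1.0, "congener_1": 0.1, "congener_2": 0.01}

# Hypothetical measured concentrations (ng/m3) in an exposure sample
measured = {"reference_PAH": 0.5, "congener_1": 4.0, "congener_2": 30.0}

# The TEQ assumes simple additivity: each concentration is weighted by its TEF and summed
teq = sum(measured[c] * tef[c] for c in measured)
print(f"Toxic equivalent concentration: {teq:.2f} ng/m3 (reference-compound equivalents)")
```

The convenience of this calculation is exactly why it is so widely used, and the synergistic or antagonistic PAH interactions noted above are exactly what it cannot represent.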

5. Systems Toxicology: Recent Applications

The last few years have witnessed a significant increase in the use of systems biology centred approaches for toxicology research. Such procedures involve incorporating the so-called -omics techniques, which include transcriptomics, proteomics, and metabolomics. These are data rich techniques that typically employ microarray analysis, mass spectrometry, and nuclear magnetic resonance (NMR) to generate a myriad of quantitative data. Bioinformatics techniques are used to manage and archive these data, while computational systems modeling, in turn, utilises the information obtained from these diverse sources to assemble mechanistic pathway models that are capable of making quantitative and qualitative predictions about the behaviour of toxicity pathways. In this section, we describe recent examples of how each of these “-omics” methods has been applied to toxicology research. Moreover, we detail how these data are being used to fuel the construction of novel toxicant centred computational models.

5.1. Genomic Toxicology (Toxicogenomics)

Toxicogenomics seeks to apply global techniques to evaluate how the genome is regulated during transcription and replication in response to the exposure of a biological system to a toxic chemical. For example, transcriptomics has been used in toxicological research for over 15 years, since DNA microarrays were first proposed as a tool for toxicology research [37]. The field has progressed significantly in recent years and there has been a plethora of studies that have examined gene expression levels to determine transcriptomic responses to toxicants [38, 39]. A recent interesting investigation by Song and colleagues (2013) studied the transcriptomic response of blood cells in workers exposed to the volatile chemicals toluene and trichloroethylene [40]. The analysis was able to establish a unique transcriptomic signature that differentiated exposed subjects from unexposed subjects, while 378 genetic markers were identified predicting the exposure to each of the toxicants [40].

System-wide studies have been changing as new techniques become available; sequencing has changed the way genome-wide toxicology data are generated. Where microarrays have previously been used to generate expression profiles, RNA collection followed by deep sequencing (RNA-seq) has enhanced several toxicological studies [41–43]. For example, a recent study by Yang and colleagues (2014) used RNA-seq to elucidate the molecular effects of crotonaldehyde exposure on macrophage-like cells [44]. Analysis of the transcriptome revealed that the expression of 342 genes was significantly altered (173 genes upregulated and 169 genes downregulated) at the first timepoint in the study. The categories of genes affected by crotonaldehyde exposure included oxidative stress, apoptosis, immune response, and inflammatory response pathways.
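A minimal sketch of the kind of genome-wide comparison that yields such counts of up- and down-regulated genes is given below; it uses randomly generated expression values and a crude fold-change plus t-test filter rather than the dedicated negative-binomial statistics normally applied to RNA-seq counts, and every number in it is illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical normalised expression matrix: 1,000 genes x 3 control and 3 exposed samples
control = rng.lognormal(mean=5.0, sigma=1.0, size=(1000, 3))
exposed = control * rng.lognormal(mean=0.0, sigma=0.3, size=(1000, 3))
exposed[:50] *= 4.0    # spike in 50 artificially up-regulated genes

# Log2 fold change and a per-gene Welch t-test on log-scale values
log2fc = np.log2(exposed.mean(axis=1) / control.mean(axis=1))
_, pvals = stats.ttest_ind(np.log2(exposed), np.log2(control), axis=1, equal_var=False)

# Flag genes passing crude fold-change and significance thresholds
up = np.sum((log2fc > 1.0) & (pvals < 0.05))
down = np.sum((log2fc < -1.0) & (pvals < 0.05))
print(f"{up} genes up-regulated, {down} genes down-regulated (illustrative thresholds)")
```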

Epigenetic modifications to DNA can result from environmental exposures, can alter the expression and structure of the DNA, and can be inherited. For example, DNA methylation can be induced by a number of toxicants, natural and synthetic, and can be observed in a genome-wide manner through sequencing. MeDIP-seq (methylated DNA immunoprecipitation followed by deep sequencing) was used in a study by Cheng et al. (2014) to examine the changes in the methylome of the lungs before and after exposure to environmental irritants, to better understand the relationship between DNA methylation and asthma [45]. The researchers found that 213 genes were differentially methylated, 83 of which mapped to the reference genome. Further analysis of the 83 candidates revealed the transforming growth factor beta (TGFβ) signalling pathway as one of the epigenetically altered regions, providing a link between DNA methylation and asthma. Though sequencing the methylome is a new field, it is beginning to provide new insights into toxicological research [46].

A significant challenge for this field is to use transcriptomic signatures, such as those described above, to identify gene regulation events that are linked mechanistically to the mode of action of a toxicant. To this end, a recent project called the Comparative Toxicogenomics Database (CTD; http://ctdbase.org/) seeks to investigate the impact of toxin exposure on human health by using data from curated scientific literature to understand the interactions of toxins with genes and proteins and then looking for disease associations [47]. Moreover, toxicology studies may also benefit from using Reactome (http://www.reactome.org), an open-data resource of human pathways and reactions [48].

5.2. Proteomic Toxicology (Toxicoproteomics)

Proteomics is the global quantification and analysis of proteins. Toxicoproteomics is the application of proteomic techniques to isolate proteins whose behaviour or normal function is detrimentally affected by exposure to a toxicant. There are a number of subdisciplines within proteomics and many examples of their successful application to toxicology research. For example, protein profiling/quantification has been ubiquitously applied to many areas of toxicology. For instance, in a recent ecotoxicology study, a liquid chromatography-tandem mass spectrometry- (LC-MS/MS-) based proteomic approach was used to identify and quantify differentially expressed hepatic proteins from female fathead minnows exposed to fadrozole, an inhibitor of estrogen synthesis [49].

Other investigations have used functional proteomics to characterize changes in enzyme activity as a result of toxicant exposure. For example, Chen and colleagues (2012) utilised functional proteomics to investigate the inhibition of plant growth by mercury exposure and were able to demonstrate time dependent changes in the activity of a cluster of antioxidant enzymes, including superoxide dismutase (SOD), ascorbate peroxidase, catalase, and peroxidase, as a result of exposure to mercury [50]. Proteome mapping has also been used to investigate changes to proteins exposed to toxicants. A recent study by Hispard and colleagues (2011) utilised hepatic proteome maps from control rats and rats exposed to cadmium to highlight the reduction in SOD in the exposed rodents [51]. Structural proteomics involves the large-scale analysis of protein structures, with the overall goal of constructing a complete three-dimensional reference map of the proteome, and has been applied to toxicological research [52]. In addition, posttranslational modifications to proteins, such as protein phosphorylation status, have been determined in response to toxicant exposure [53]. Thus there is a vast body of proteomics based data available that could be integrated into a mathematical framework to understand the systemic effects of toxicant exposures on proteins.

5.3. Metabolomic Toxicology (Toxicometabolomics)

Metabolomics involves globally profiling the metabolism of an organism, focusing on the plethora of metabolites that are collectively referred to as the metabolome. In contrast to proteomics and transcriptomics, metabolomics focuses on analysing molecules of low molecular weight that characterize metabolic activity, such as free fatty acids (FFA), amino acids, carbohydrates, and certain lipids [54]. The application of metabolomics techniques to toxicology is referred to as toxicometabolomics [55]. A recent study used metabolomics analysis in mice to examine the hypothesis that two of the metabolic intermediates of trichloroethylene (TCE) were involved in liver toxicity by activating peroxisome proliferator-activated receptor α (PPARα), a key receptor involved in fat metabolism. TCE exposure resulted in a decrease in urinary metabolites involved in fatty acid metabolism, resulting from altered expression of PPARα target genes [56]. Other recent toxicometabolomic uses have included testing for the metabolic response to low-level toxicant exposure in a novel renal tubule epithelial cell system [57] and metabolomic profiling of in vivo plasma responses to dioxin-associated dietary contaminant exposure in rats [58].

5.4. Computational Systems Modelling

Computational modeling is the epicentre of systems biology and incorporates a wide variety of quantitative techniques that can aid toxicology studies (Figure 1). A model can quantitatively represent the components of a particular cellular pathway and how it responds to toxicant exposures. Computational systems modeling integrates with other disciplines under the systems biology umbrella, as quantitative data from diverse fields including genomics, metabolomics, and proteomics can be utilized to inform model construction and refinement. There is an established rationale for using computational models. Biological pathways are intrinsically detailed and complicated. This level of detail gives rise to networks of interacting nodes. Many of the nodes interact in a nonlinear fashion and often communicate with each other via sophisticated feedback or feedforward loops. Retaining this level of complexity and detail places a significant cognitive burden on the human brain, and it is highly improbable that one can reason about such complex systems by intuition alone. Therefore, computational systems modeling offers a complementary means of dealing with this complexity. At the centre of all computational models is mathematics, and there are many theoretical frameworks which can be adopted to deal with the complexities of temporal variation in physiological responses. The theoretical framework that is employed will depend on the nature of the system to be modelled. Table 1 details the different modeling approaches and gives examples of where these approaches have been applied to toxicology research in recent years.
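As a deliberately simple example of the nonlinear feedback behaviour described above, the sketch below simulates a toxicant that induces a detoxifying enzyme which in turn removes the toxicant, forming a negative feedback loop; the species names and rate constants are invented for illustration rather than taken from any pathway discussed in this review.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative (invented) rate constants
k_in = 1.0       # toxicant influx (uM/h)
k_basal = 0.05   # basal enzyme synthesis (uM/h)
k_ind = 0.5      # maximal toxicant-induced enzyme synthesis (uM/h)
K_half = 2.0     # toxicant level for half-maximal induction (uM)
k_deg = 0.1      # enzyme degradation (1/h)
k_cat = 0.4      # enzyme-mediated toxicant removal (1/(uM*h))

def pathway(t, y):
    toxicant, enzyme = y
    induction = k_ind * toxicant / (K_half + toxicant)   # saturable induction
    d_tox = k_in - k_cat * enzyme * toxicant              # influx minus removal
    d_enz = k_basal + induction - k_deg * enzyme          # synthesis minus turnover
    return [d_tox, d_enz]

sol = solve_ivp(pathway, [0.0, 72.0], [0.0, k_basal / k_deg],
                t_eval=np.linspace(0.0, 72.0, 145))
print("Toxicant level after 72 h:", round(sol.y[0, -1], 2), "uM")
```

Even this two-variable system shows why intuition alone struggles: the steady state reached depends jointly on influx, induction, and turnover, and adding further nodes quickly makes simulation the only practical way to reason about the network.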

5.5. Standards for Model Exchange in Systems Biology

Although several standards exist, the Systems Biology Markup Language (SBML) has been the leading exchange format for biological models since 2002 [66]. Models encoded in this format are becoming increasingly common, and its growing significance is emphasized by the steady growth in the number of SBML models. The format is designed specifically to enable the portability of a computer model regardless of what computing tool has been used to develop it. The BioModels database was established to act as a repository for SBML models that have been published in peer-reviewed journals. This database now consists of both a curated and a noncurated section [67]. Curated models are both syntactically and semantically correct, while the latter are syntactically correct but awaiting semantic verification. We note that no toxicological model is currently in this database and propose that such a platform provides an easy-to-use approach for toxicologists to begin to integrate their data on a systems level.
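To indicate how little is involved in making a model exchangeable, the following sketch uses the python-libsbml bindings (assumed to be installed) to encode a one-reaction toxicant elimination model in SBML; the identifiers and rate constant are placeholders and the model is purely illustrative.

```python
import libsbml

doc = libsbml.SBMLDocument(3, 1)          # SBML Level 3 Version 1
model = doc.createModel()
model.setId("toxicant_elimination")

comp = model.createCompartment()
comp.setId("liver")
comp.setSize(1.0)
comp.setConstant(True)

tox = model.createSpecies()
tox.setId("toxicant")
tox.setCompartment("liver")
tox.setInitialConcentration(10.0)
tox.setConstant(False)
tox.setBoundaryCondition(False)
tox.setHasOnlySubstanceUnits(False)

k = model.createParameter()
k.setId("k_el")
k.setValue(0.2)
k.setConstant(True)

rxn = model.createReaction()
rxn.setId("elimination")
rxn.setReversible(False)
rxn.setFast(False)
reactant = rxn.createReactant()
reactant.setSpecies("toxicant")
reactant.setConstant(True)
law = rxn.createKineticLaw()
law.setMath(libsbml.parseL3Formula("k_el * toxicant * liver"))

print(libsbml.writeSBMLToString(doc))     # the portable XML representation
```

The resulting XML can be opened unchanged in any SBML-aware simulator, which is precisely the portability that makes a shared repository such as BioModels attractive for toxicology.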

5.6. Utilizing Adverse Outcome Pathways (AOPs) Modeling

A significant challenge in biological modeling is to ensure that a model is a valid mechanistic reflection of the underlying biological processes that it describes. This is difficult as gaps in our understanding are commonplace. The last few years have witnessed the advent of AOPs [68, 69]. An AOP is a theoretical structure that provides links between biological events which lead to adverse health consequences. In practice an AOP consists of a series of events connected together in a linear but holistic fashion, for example, toxicant/initiator → molecular initiating event → altered cellular pathway → altered tissue behaviour → organ system dysfunction → adverse outcome. In essence, therefore, AOPs are a means of providing an unambiguous (where possible mechanistic) representation which is informed by our current understanding of a biological pathway and of how a perturbation to that pathway could have an adverse outcome at the level of a whole organism or even a population [69]. The AOP approach has been applied to a number of areas including drug-induced cholestasis [70], skin sensitisation, and respiratory allergies [71]. In terms of skin sensitisation, a recent AOP developed by Maxwell et al. (2014) linked a model of total haptenated protein with a CD8+ T cell response model [72]. The goal of this AOP is to establish the quantitative relationship between the toxin that an individual is sensitive to and the magnitude of the corresponding immune response [72]. Assembling an AOP involves deciding on the information that needs to be included for that particular pathway, from the details of the molecular initiating event through to organism-level effects. This information is then summarized in the form of a flow chart that depicts the AOP from the molecular initiating event to its effect on the whole organism or population, with relationships between the various entities represented by arrows (see above) that illustrate the nature of the interaction. The weight of each interaction is then established before a confidence evaluation is determined [68]. There are a number of online resources and software tools which can help when developing an AOP. These include eChemPortal (http://www.echemportal.org/echemportal/), which archives information on chemical substances. Pathway information can also be obtained using WikiPathways (http://wikipathways.org/index.php/WikiPathways), which was established as a curated resource housing the details of a diverse range of biological pathways [73]. The Toxin and Toxin Target Database (T3DB; http://www.t3db.org/) is also a useful resource that archives the details of >3000 toxins and also contains toxin target information [74]. The construction of AOPs could be further facilitated by the development of additional software tailored specifically to their needs. Moreover, some of the existing computational systems biology tools could be adapted to suit the nature of the AOP framework.
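Because an AOP is essentially a directed chain of key events, it maps naturally onto a graph data structure, and general-purpose graph libraries can serve as a starting point before dedicated AOP software matures. The sketch below encodes the generic chain described above using the networkx library; the event names and weight-of-evidence scores are placeholders rather than any published AOP.

```python
import networkx as nx

# Key events (nodes) and key-event relationships (edges) of a generic, hypothetical AOP
aop = nx.DiGraph(name="generic AOP (illustrative)")
events = ["toxicant exposure", "molecular initiating event", "altered cellular pathway",
          "altered tissue behaviour", "organ system dysfunction", "adverse outcome"]
aop.add_nodes_from(events)

# Weight-of-evidence scores (0-1) for each key-event relationship are invented here
for upstream, downstream, weight in zip(events, events[1:],
                                        [0.9, 0.7, 0.6, 0.5, 0.8]):
    aop.add_edge(upstream, downstream, weight_of_evidence=weight)

# Walk the pathway from the molecular initiating event to the adverse outcome
path = nx.shortest_path(aop, "molecular initiating event", "adverse outcome")
print(" -> ".join(path))
```

Holding the pathway in a machine-readable form like this makes it straightforward to attach quantitative models to individual key-event relationships later, which is exactly the direction the Maxwell et al. work points towards.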

5.7. The Exposome

The exposome is a concept that fits neatly within a systems biology way of thinking. The exposome takes into account the toxic effects of both intrinsic and extrinsic chemical activities. This view of a biological system considers the internal environment as a set of chemical activities that have the potential to cause damage, for example, oxidative stress, in addition to external stressors such as environmental exposure to toxic chemicals. The exposome encapsulates both these sources of biological damage from conception onwards [75]. It is suggested that the impact of toxicants on a broad range of physiological parameters of health could be measured to give a profile of the exposome over time. This could involve measuring toxic chemicals, such as reactive electrophiles, metals, metabolic products, hormones and their derivatives, and persistent organic compounds [76]. It must be emphasised that this concept is very much in its infancy; however, it is likely to become a prominent feature of toxicant focused studies in the years to come.

6. Synthetic Biology: Current and Potential Applications to Toxicology Research

Synthetic biology is a nascent discipline that is provoking a readjustment of the boundaries between the physical sciences and the biological sciences [5]. The aim of this new discipline is to engineer novel biological entities from genes [77] and to generate virtual organs to improve health [78]. These synthetic biological entities can be the result of a reconfiguration and reassembly of preexisting biological systems. There are a number of recent examples whereby toxicology has come into contact with this area. In this section we will discuss these and will also highlight areas of toxicology that could benefit from synthetic biology in the future.

6.1. Synthetic Biological Circuits and Synthetic Bacteria

Synthetic biological circuits are engineered systems designed to evaluate the effect that an input has on an output. In a toxicogenomics study, these synthetic gene circuits permit the impact of an environmental influence on transcription to be assessed. One recent example relating specifically to toxicology research involved the construction of a synthetic mammalian gene circuit that detected the interaction between EthR and its operator O(ethR) in human cells and produced a quantitative reporter gene expression readout. Challenging the synthetic network with compounds from a rationally designed chemical library revealed 2-phenylethyl-butyrate as a nontoxic substance that abolished EthR’s repressor function inside human cells, in mice, and within M. tuberculosis, where it triggered derepression of ethA and increased the sensitivity of this pathogen to ethionamide [79]. Another example of a synthetic biological circuit was demonstrated by Moser et al. (2013), when the detection of methylating chemicals by the E. coli Ada protein was performed in S. cerevisiae [80]. In this study, the Ada protein detection system, which normally identifies methyl adducts on the DNA backbone in bacteria, was tuned for a lower detection threshold, coupled to a reporter construct, and transferred into yeast. They found that not only does the sensing mechanism work correctly in a eukaryotic cell, but, at a 28 μM detection threshold for methyl iodide, it is also suitable for detecting methylating compounds at concentrations found in environmental samples. Advanced DNA cloning methods have also been applied to toxicology. For example, a synthetic bacterium has been engineered that is capable of degrading the herbicide atrazine. Briefly, this involved screening a pool of synthesized RNAs that were capable of binding the toxin. The RNAs were, in turn, incorporated into the regulatory region of cheZ, a chemotaxis gene. Those cells that expressed this gene, indicated by motility, were then isolated. When cheZ was expressed in the engineered strain in response to atrazine, flagellar rotation was properly regulated, allowing cells to display chemotaxis. As the engineered cells degrade atrazine, they generate a gradient of the herbicide, which encourages the bacteria to migrate along the concentration gradient at a rate dependent on cheZ [81]. More recently, Zhao and colleagues (2014) reported the use of the synthetic Deg-On system (a system consisting of two plasmids) that converts proteasomal degradation of the transcriptional regulator TetR into a fluorescent signal, translating ubiquitin proteasome system (UPS) activity into a readable output [82]. According to the authors, by connecting UPS activity to a fluorescence signal, this engineered circuit will have a number of applications, including screening for UPS activating molecules and selecting for mammalian cells with different levels of proteasome activity. One potential application of this might be in toxicant studies, as UPS activation can be used as a means of detecting toxicant exposure; for example, it is well known that exposure to arsenic trioxide activates the UPS system [83, 84].
To conclude this section, synthetic biological circuits and synthetic bacteria could offer significant future benefits for toxicology studies. Their potential remains open to considerable debate as we await the development of additional methodologies and circuitry that are specifically designed to deal with toxicant detection.
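From a modeling standpoint, sensor circuits such as the EthR and Ada examples above are input-output devices whose steady-state behaviour is often approximated by a Hill-type transfer function at the design stage. The sketch below computes the reporter output of a hypothetical repressor-based sensor that is derepressed by its analyte; all parameter values are invented and do not describe either of the published circuits.

```python
import numpy as np

# Illustrative (invented) parameters of a repressor-based biosensor
beta_max = 100.0   # maximal reporter expression (arbitrary units)
leak = 2.0         # leaky expression when fully repressed
K_d = 10.0         # analyte concentration at half-maximal derepression (uM)
n = 2.0            # Hill coefficient (cooperativity of derepression)

def reporter_output(analyte):
    """Steady-state reporter signal as the analyte inactivates the repressor."""
    derepression = analyte**n / (K_d**n + analyte**n)
    return leak + (beta_max - leak) * derepression

for conc in [0.0, 1.0, 10.0, 100.0]:
    print(f"analyte = {conc:6.1f} uM -> reporter = {reporter_output(conc):6.1f}")
```

Fitting such a transfer function to calibration data is how a circuit's detection threshold and dynamic range are characterised before it is deployed against environmental samples.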

6.2. Computational Software, Standards, and Resources for Toxicology

Computational modeling is the glue that joins together the fields of systems and synthetic biology. Modeling within the synthetic biology context plays a pivotal role in the design stages of the biological entity that is to be engineered [85, 86]. Moreover, the utility of modeling extends to being able to predict the dynamics of a network under a variety of different parameters and diverse environments. As a consequence, a wide range of computational software applications has been created to deal with these two key stages in the engineering process. For example, TinkerCell is a computer-aided design (CAD) software tool for synthetic biology which combines a visual interface with an application programming interface (API) [87]. This permits developers to exchange their code with others via a central repository (http://www.tinkercell.com/). GenoCAD is a tool for designing synthetic DNA sequences based on grammatical models of genetic parts (http://genocad.org/). Synthetic Biology Open Language (SBOL; http://www.sbolstandard.org/) is a data exchange standard for descriptions of genetic parts, devices, modules, and systems. A list of software applications that support this standard can be found at http://www.sbolstandard.org/sbolstandard/software-tools-using-sbol/. In terms of DNA assembly, there is a large number of methods; two of the most commonly used are BioBricks assembly (http://biobricks.org/) and Gibson assembly, which is underpinned by the joining of DNA sequences in a single isothermal reaction. There are also a number of resources available for synthetic biology, including the registry of standard biological parts (http://parts.igem.org/Main_Page). This collection of parts is a source of components for the construction of novel biological systems. Moreover, there is also the Standard Biological Parts Knowledgebase (SBPkb) (http://www.sbolstandard.org/libsbol/sbpkb).

7. Problems Associated with Integrating Data

Sequencing results from system-wide studies often generate a large amount of data that requires specific computer programs for analysis. Some researchers can deftly switch between the benchtop and laptop to analyze data. However, many need assistance in extracting answers to the questions asked of these large datasets, and this becomes a roadblock. There are websites which function as exchanges for guidelines and tools to help analyze sequencing data; some examples are GenePattern (http://www.broadinstitute.org/cancer/software/genepattern) [88], UCSC Genome Bioinformatics (http://genome.ucsc.edu/cite.html) [89], and HOMER (http://homer.salk.edu/homer/ngs/index.html). One site in particular, the Galaxy Server (http://www.usegalaxy.org), has become a particularly well-known reference site, an analysis tool, and a place to share programs that help make sense of large sets of data [90–92]. The aforementioned websites have now become useful resources that reduce the difficulties of analyzing sequence data, thus enabling more researchers to use systems approaches to address research questions. We anticipate similar problems in the field of toxicology research, whereby it is unlikely that a benchtop scientist will have the computational knowledge needed to begin integrating their data with that extracted from the literature to understand the systemic effects of toxicant exposures. It is also a challenge from a computational systems biology perspective to incorporate data from a diverse array of sources which often originate from different spatial and temporal scales. For example, these data may include everything from gene expression data to metabolites to chemical dose or concentrations of exposure. Integrating these data is a significant barrier to the future success of computational systems toxicology. Recent examples from other areas of bioscience could provide the template for this area. For example, Chen and colleagues (2014) have developed a multiscale mathematical model of immunogenicity for therapeutic proteins [93, 94]. The model was assembled using key biological mechanisms, including antigen presentation, activation, proliferation, and differentiation of immune cells, secretion of antidrug antibodies (ADA), and in vivo disposition of ADA and therapeutic proteins. The model has three scales: a subcellular level representing antigen presentation processes by dendritic cells; a cellular level accounting for cell kinetics during the humoral immune response; and a whole-body level accounting for therapeutic protein in vivo disposition. This is a multiscale template that could in theory be applied to toxicology-focused dynamic computational models.

8. Limitations of Systems/Synthetic Biology and Future Opportunities

Systems and synthetic biology have a number of limitations. Solving these challenges will go a long way towards determining how effectively these new disciplines continue to integrate with toxicology research. A major current limitation of systems biology concerns the calibration of computational models [95]. This has significant implications for toxicology research, as evaluation of risk is paramount and models that are exceptionally sensitive to uncertain parameter values undermine their potential utility. However, mechanistic computational systems modeling will likely be applied more widely once optimization techniques to calibrate systems models and inference techniques such as Bayesian computational methods are employed. Fortunately, recent initiatives have focused on addressing this key limitation of computational systems models. Recently, computational biology has seen the application of a broad range of both optimization and inference techniques to this area [77, 96, 97], including most notably the introduction of approximate Bayesian techniques [98]. It also has to be recognised that, in order for computational systems to adequately represent toxicant interactions with biological systems, it is necessary to accept that these interactions take place at a variety of temporal and spatial scales. This is true if one considers that rates of reactions can vary dramatically, from microseconds for cellular processes to years for whole organisms. This is also the case when spatial scale is considered, as we are dealing with everything from nanometre-scale cellular structures through to metres for whole organisms. A number of projects have attempted to address this issue, but a resolution of the problem remains very much in its infancy.
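To indicate what such calibration can look like in practice, the sketch below implements a minimal rejection-based approximate Bayesian computation step for estimating a single elimination rate constant from noisy synthetic data; the "observed" values, prior bounds, and tolerance are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
times = np.array([1.0, 2.0, 4.0, 8.0, 24.0])

def simulate(k_el, c0=10.0):
    """First-order elimination: concentration over time for rate constant k_el."""
    return c0 * np.exp(-k_el * times)

# Synthetic 'observed' data generated with a known rate constant plus measurement noise
observed = simulate(0.3) + rng.normal(0.0, 0.2, size=times.size)

# Rejection ABC: sample from a uniform prior and keep parameters whose
# simulated output lies within a tolerance of the observations
accepted = []
for _ in range(100_000):
    k_candidate = rng.uniform(0.01, 1.0)
    distance = np.sqrt(np.mean((simulate(k_candidate) - observed) ** 2))
    if distance < 0.3:           # tolerance epsilon
        accepted.append(k_candidate)

accepted = np.array(accepted)
print(f"Approximate posterior mean k_el = {accepted.mean():.3f} (n = {accepted.size} accepted)")
```

In practice, sequential Monte Carlo variants of this scheme are used so that fewer simulations are wasted, which matters when each "simulation" is a full mechanistic toxicology model rather than a single exponential.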

A significant limitation of synthetic biology from a toxicology perspective is that no synthetic structure has yet been constructed that is sufficiently biologically detailed to represent how a toxicant or group of toxicants might interact with a whole organ system. There is a need for a novel means of quantifying the toxicity of natural and synthetic chemicals on organ and whole body systems. Limitations also extend to the assembly of synthetic genetic components, as this is often a drawn-out process.

If one considers the potential future opportunities of these disciplines, then personalised medicine is a buzzword that is worth considering, and one might acknowledge that personalised toxicology has similar connotations. However, systems biology holds the promise of personalised toxicology. Consider, for example, how a model could be adjusted to reflect the different activity of CYP enzymes for drug metabolism, in addition to aging-related alterations in enzyme activity. In such a scenario, it would be possible to predict toxicological exposure effects not only for a given population but also for a single individual, thus reducing large datasets into personalised simulations. Thus historical methods of prediction, such as the lifetime average daily dose of a compound, could be improved to estimate lifetime toxicological effects across systems. Such approaches will be heavily dependent on assumptions, which will need to be clearly stated and readily understandable by interested parties. However, with the expected influence that cheaper methods of gene sequencing will have on personalised medicine, one can expect that similar influences will be felt as individuals request personalised toxicological reports, not only for drug metabolism but also for environmental exposures. It is highly probable that systems and synthetic biology approaches are the next natural phase of toxicology development. Approaches like the AOP are primed to evolve into methods that integrate systems and synthetic biology, and AOPs could provide the schematic and mechanistic underpinning for developing cohesive systems toxicology methods.
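As a small sketch of this idea, the example below scales a population-average clearance by hypothetical CYP-phenotype and age multipliers before computing an individualised elimination half-life; the multiplier values are illustrative assumptions rather than measured pharmacogenomic data.

```python
import numpy as np

# Hypothetical multipliers applied to a population-average hepatic clearance
cyp_phenotype_factor = {"poor": 0.3, "intermediate": 0.6, "extensive": 1.0, "ultrarapid": 1.8}

def age_factor(age_years):
    """Illustrative decline in clearance with age after 40 (invented relationship)."""
    return 1.0 if age_years <= 40 else max(0.5, 1.0 - 0.01 * (age_years - 40))

def personalised_half_life(base_clearance, volume, phenotype, age_years):
    """Elimination half-life (h) from an individualised clearance (L/h) and volume (L)."""
    cl = base_clearance * cyp_phenotype_factor[phenotype] * age_factor(age_years)
    return np.log(2.0) * volume / cl

for phenotype in cyp_phenotype_factor:
    t_half = personalised_half_life(base_clearance=5.0, volume=40.0,
                                    phenotype=phenotype, age_years=70)
    print(f"{phenotype:>12}: half-life = {t_half:5.1f} h")
```

The same parameter-scaling logic, applied to a full systems model rather than a single clearance term, is what would turn a population-level risk estimate into the kind of personalised toxicological report anticipated above.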

9. Discussion

In the last decade systems biology has embedded itself within bioscience research, and in recent years its impact has begun to be felt increasingly by those conducting toxicology studies. Systems toxicology is a novel discipline that seeks to use traditional toxicology approaches in conjunction with the systems biology paradigm to provide an integrated interpretation and understanding of biological processes from the molecular to the systemic, and to use these to assess disease risk. Broadly, the aims of this approach are to improve our understanding of the mechanisms of toxicity, to use dynamic computational models to predict the toxicity of unknown compounds or the long-term effects of exposure, and to use this nascent approach to improve public health and protect the environment for the betterment of society. There are many recent worthwhile examples of how toxicant studies have benefitted from adopting this integrative way of probing biological systems. This has given rise to the term systems toxicology and its -omic centred subdisciplines of toxicogenomics, toxicoproteomics, and toxicometabolomics. Recently, these techniques have been applied ubiquitously in toxicology research, ranging from studies that have determined unique transcriptomic profiles following volatile chemical exposure to investigations that used metabolomic profiling to examine in vivo plasma responses to dioxin-associated dietary contaminant exposure in rodents. Computational systems biology is perhaps the least utilised systems biology technique to date. Currently, there is a paucity of toxicology themed computational models available in the BioModels database, an archive for models encoded in SBML, the leading exchange format for systems biology. This is notable, as toxicology investigations have historically used mathematical modeling to assess the health risk of toxicant exposure. Therefore, it is surprising that some of the modeling approaches routinely used in systems biology, such as stochastic intracellular modeling, have not been explored more widely in toxicology research. We are currently using our previously published work, which examined the impact of PCBs and PBDEs on T4 levels in rodents, as a template for the assembly of a comprehensive mechanistic computational model.

This review also uncovered various examples of how the nascent area of synthetic biology, a discipline whose goal is to design and assemble novel synthetic biological entities, is also beginning to impact toxicology research. For instance, it is apparent that this new field has been used to benefit toxicology research through the design and engineering of novel bacterial organisms which express genes that are capable of degrading harmful chemicals such as the herbicide atrazine. In terms of the future application of synthetic biology to toxicology research, we suggest an ambitious endeavour: encouraging the development of a synthetic liver capable of mimicking the actions of glucuronidation. There are clear advantages and benefits to the development of such a synthetic construct, for instance, a reduction in the number of rodents used in toxicology studies and a reduction in the cost of these investigations. However, such a project has a number of limitations and obstacles to overcome before it can come to fruition. For instance, it would be necessary to further improve our understanding of the biological mechanisms that underpin glucuronidation. Moreover, it would be necessary to develop our knowledge of how toxicant-toxicant interactions impinge on the behaviour of the pathway. Such a system would have huge potential and would without doubt impact the way toxicant studies are conducted in the future.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

Veronica M. Miller would like to acknowledge funding from Alexander and Bo McInnis and the Autism Research Institute for her toxicological studies and support.