Green Techniques for Organic Synthesis and Medicinal Chemistry
Ebook: 1,590 pages, 15 hours


About this ebook

An updated overview of the rapidly developing field of green engineering techniques for organic synthesis and medicinal chemistry

Green chemistry remains a high priority in modern organic synthesis and pharmaceutical R&D, with important environmental and economic implications. This book presents comprehensive coverage of green chemistry techniques for organic and medicinal chemistry applications, summarizing the available new technologies, analyzing each technique’s features and green chemistry characteristics, and providing examples to demonstrate applications for green organic synthesis and medicinal chemistry.

The extensively revised edition of Green Techniques for Organic Synthesis and Medicinal Chemistry includes 7 entirely new chapters on topics including green chemistry and innovation, green chemistry metrics, green chemistry and biological drugs, and the business case for green chemistry in the generic pharmaceutical industry. It is divided into 4 parts. The first part introduces readers to the concepts of green chemistry and green engineering, global environmental regulations, green analytical chemistry, green solvents, and green chemistry metrics. The other three sections cover green catalysis, green synthetic techniques, and green techniques and strategies in the pharmaceutical industry.

  • Includes more than 30% new and updated material—plus seven brand new chapters
  • Edited by highly regarded experts in the field (Berkeley Cue is one of the fathers of Green Chemistry in Pharma) with backgrounds in academia and industry
  • Brings together a team of international authors from academia, industry, government agencies, and consultancies (including John Warner, one of the founders of the field of Green Chemistry)

Green Techniques for Organic Synthesis and Medicinal Chemistry, Second Edition is an essential resource on green chemistry technologies for academic researchers, R&D professionals, and students working in organic chemistry and medicinal chemistry.

Language: English
Publisher: Wiley
Release date: May 10, 2012
ISBN: 9781118308653
    Book preview

    Green Techniques for Organic Synthesis and Medicinal Chemistry - Wei Zhang

    Part I

    Introduction

    Chapter 1

    Green Toxicology

    Nicholas D. Anastas

    Poseidon's Trident, LLC, Milton, Massachusetts, USA

    1.1 Introduction

    Toxicology is the study of the adverse effects of chemical, biological and physical agents on organisms. In other words, it is the study of poisons. This chapter focuses on the principles and practices of chemical toxicology for an intended audience of synthetic and medicinal chemists. A single chapter is clearly inadequate to present any area of toxicology in the depth needed to become skilled in it, especially one as complex as toxicology, so only the aspects most critical for informing safer chemical design are presented here. Several exceptional textbooks are available that serve as outstanding resources for those who want to investigate this fascinating field further [1–4]. This chapter provides a foundation upon which to build a conceptual dossier and a core skill set for characterizing toxicity.

    Currently, the majority of chemists lack appropriate training in toxicology and are often unaware of the potential hazards associated with the chemicals they use. Molecular designers are uniquely positioned to design less hazardous molecules. There is a growing demand to incorporate the intentional act of informed molecular design into the chemical enterprise. This can only happen when chemists are trained in the principles of toxicology.

    There are very few synthetic chemists who are also trained in toxicology, and vice versa. This gap needs to be filled by scientists comprehensively trained in both disciplines. The anastomosis of these two disciplines can be termed green toxicology. Green toxicology is the application of the principles of toxicology to chemicals with the specific intent of deriving design protocols for hazard reduction. This is a step on a continuum linking chemistry, medicinal chemistry, toxicology, and finally green toxicology. This chapter outlines the principles of toxicology that can be applied to synthetic design to construct less hazardous molecules ab initio.

    The choice of toxicology topics to emphasize in this chapter was the result of an assessment by the author of the concepts most useful to molecular designers involved in the practice of green chemical design. There are many ways to present this information; however, the sections in this chapter are arranged to include: (1) the scope and history of toxicology; (2) principles; (3) the disposition of chemicals; (4) mechanistic toxicology; (5) environmental toxicology; and (6) risk assessment.

    Examples of the application of toxicology data in designing safer chemicals are provided throughout the chapter as concept demonstration exercises.

    1.2 History and Scope of Toxicology

    Toxicology, in some form, has been known and practiced by humans for many thousands of years. Early humans had to distinguish plants that were edible, nutritional or medicinal from those that were poisonous. How were these choices made? Human ingenuity was challenged early in evolution to use observational toxicology to survive and propagate the species; animals were already making these decisions and had adapted to make the right choices. Early approaches to toxicology were often crude, based on fear, magic and folklore, with inexperience often leading to unintended consequences. With time, the art and science of toxicology developed steadily.

    Toxicology is the study of adverse effects on living systems resulting from chemical, biological or physical agents [5]. This definition implies the enormity of the scope, the scientific complexity and the depth of understanding required to effectively practice toxicology. Toxicology is both an art and a science similar to organic synthesis and medicine. The science of modern toxicology has gone beyond the traditional practice of a primarily descriptive discipline based on observations in whole animals, to the current practice of using knowledge of the mechanisms of toxicity to describe, predict and, ultimately, mitigate toxicity through molecular design. The art is in the interpretation of the results and ranking a toxicant relative to other compounds.

    Chemicals are used in commerce and industry, as medicines, and may be naturally occurring or synthetic. Under certain circumstances of exposure, they present a hazard to humans and the environment. The nature and severity of these hazards is determined by the physicochemical properties of the agent that determine its interaction with its ultimate target (intrinsic hazard) and its ability to come into contact with receptors (exposure).

    Determining whether a substance is toxic or nontoxic requires a set of metrics representing both assessment and measurement endpoints for clearly defined adverse outcomes. A cornerstone of toxicology, articulated by the sixteenth-century physician Paracelsus, states that every compound is toxic at sufficient dose; in other words, the dose makes the poison [6]. This central message has been expanded and appropriately refined to include time as a core component of the manifestation of toxicity. The idea that a chemical can be nontoxic is actually a misconception because all chemicals are toxic at some defined dose. There is also an inherent assumption that there is a threshold dose below which adverse effects do not occur. This concept is significant when applied to determining an acceptable dose as part of a risk assessment, which is discussed in more detail later in this chapter.

    The structure–hazard relationship forms the nexus between molecular design and toxicology. Because chemistry studies the properties and transformations of matter and toxicology is tasked with understanding the effects of chemicals on human health and the environment, understanding both chemistry and toxicology is unquestionably necessary to design safer chemicals.

    Toxicology comprises a diverse collection of subdisciplines, each focused on a specialized area of investigation. Modern toxicology evolved from the related field of pharmacology, a mature science that investigates the effects of chemicals on living systems for the purpose of therapeutics and other medicinal endpoints. Toxicology can be thought of as pharmacology at high doses. Both disciplines are based on similar fundamental principles.

    Mechanistic toxicologists focus on elucidating the mechanisms by which chemicals exert their toxic effects on living organisms. Understanding the mechanisms and modes of action can serve as the basis for developing approaches for reducing intrinsic chemical hazard, for risk assessment, and for forensic investigations. Descriptive toxicologists investigate the overt signs of toxicity that result from traditional direct testing methods. Clinical toxicologists examine the potential toxicity of chemicals used in therapeutic situations. Environmental or ecological toxicologists investigate the hazards to organisms other than humans (including wildlife and plants) as well as describe the fate and transport of chemicals in the environment. Regulatory toxicologists apply the data provided by descriptive, mechanistic and environmental toxicologists in risk assessments to determine acceptable levels of exposure in domestic, industrial and global situations.

    Green toxicologists, as described above, use the principles of chemistry to identify opportunities to design molecules with reduced hazard by establishing design rules.

    1.2.1 The need for green toxicology

    Several subdisciplines of toxicology have evolved to fill specific research and regulatory needs and to meet science and policy objectives as part of green chemistry. The goal of design for hazard reduction, or benign-by-design, is to minimize the intrinsic toxicity associated with exposure to a chemical. The success of these efforts relies on cooperative efforts among toxicologists, synthetic chemists and environmental scientists. This chapter focuses on describing the principles and practice of green toxicology; however the same approach can be applied to physical hazards and global hazards.

    The evolution of toxicology from a primarily descriptive discipline into a well-developed predictive science relies on the newest approaches of molecular toxicology, including the incorporation of toxicogenomics and other tools focused at the genetic level of organization to uncover toxic mechanisms of action at the biochemical, cellular, tissue and systems levels [7]. Systems biology has been an essential tool in framing the picture of toxicity in a more holistic way by describing and predicting adverse outcomes. A framework for designing safer chemicals has been described and is a useful tool for identifying opportunities for safer chemical design [8, 9]. Designing safer chemicals requires incorporating toxicology into the molecular design process.

    1.3 Principles of Toxicology

    The central maxim of toxicology is that there is a quantitative relationship between the dose of a toxicant, toxin or xenobiotic in an organism and the biological response it produces. This fundamental association is called the dose–response relationship and is essential to both toxicology and pharmacology. Before continuing, a few definitions of terms used throughout this chapter are necessary to preserve subtle but important distinctions among potentially toxic compounds. Toxicity is a relative property of a molecule's potential to cause harm. A toxicant is any agent capable of producing adverse responses in an organism. A toxin is a toxicant of natural origin, for example a natural product from a plant or a toxin from a venomous animal. A xenobiotic is a compound that is foreign to the organism. The terms toxicant, toxin and xenobiotic are often used interchangeably, but incorrectly so.

    The definitive determinant of toxicity is the concentration of the ultimate toxicant at the target site for a sufficient period of time, which is governed by the time course of the chemical in the organism (kinetics) and the response to the interaction at the target site (dynamics). Both concepts are discussed in more detail later in this chapter.

    1.3.1 Characteristics of exposure

    Toxic responses cannot occur unless an organism is exposed to a chemical and the ultimate toxicant reaches its site of action. The route by which the toxicant enters an organism can profoundly influence its ultimate fate. The major routes of exposure are inhalation, ingestion, dermal contact, and uptake by other species-specific organs, for example, gills in fish.

    The frequency and duration of exposure influence the concentration in the organism (body burden) and therefore the ultimate concentration at the site of action. Exposure periods are generally classified into four categories: acute, subacute, subchronic, and chronic (Table 1.1). These well-established duration categories are approximate; they usually apply to well-designed animal studies and rarely to actual exposure scenarios in residential and occupational settings. The intervals can be adjusted to satisfy experimental conditions or regulatory requirements. Acute effects occur immediately upon exposure or within a very short period of time post exposure. These adverse effects can result from a single exposure or from multiple exposures within a very short period. Some examples are dermal corrosivity of strong acids, inhalation toxicity of carbon monoxide, and ingestion of high doses of arsenic.

    Table 1.1 Exposure categories and time durations.

    Chronic effects manifest after repeated exposures, from several months to the organism's entire lifetime. For the same chemical, the acute effects are often vastly different from chronic effects.

    1.3.2 Spectrum of toxic effects

    Virtually every chemical is toxic at a sufficiently elevated dose, given an appropriate duration and route of exposure. Chemicals have a spectrum of undesired effects depending on the dose, frequency and duration of exposure, the intrinsic toxicity of the molecule and the influence of protective or adaptive mechanisms. The dose needed to produce a particular deleterious effect can range over more than eight orders of magnitude among chemicals. No chemical demonstrates a single, well-defined and exclusive adverse effect. For example, acute exposure to volatile anesthetics results in dizziness and anesthesia, whereas chronic exposure to lower concentrations can result in liver and kidney damage.

    The potential for reversible toxicity is important for characterizing the significance of a toxicant. A compound that demonstrates irreversible effects, for example permanent corrosive tissue damage, or covalent binding to macromolecules, is of more concern than those compounds that demonstrate adverse effects that are reversible upon cessation of the exposure.

    Toxicants can adversely affect a limited anatomical or physiological space or manifest toxicity throughout the system (i.e., systemically). These effects are not limited exclusively to a particular toxicant or class of toxicant and xenobiotics can manifest both types of characteristics.

    Some toxicants are so reactive that the damage is manifested directly at the site of exposure, for example, strong acids and bases and strong oxidizers or reducers are capable of causing immediate and irreversible necrosis of skin. Most other toxicants must be absorbed and transported to their site of action.

    1.3.3 The dose–response relationship

    The dose–response relationship describes the correlation between an increase in the dose of a chemical and the resulting increased response, which can be either beneficial or adverse. Though the relationship is elegant in its simplicity, it remains a formidable assignment to fully characterize the complex and often subtle nature of toxic responses. Toxicity is a function of dose, exposure, and time [10]. Consequences of the interaction of a molecule with a biological target will propagate through molecular, biochemical, cellular and organism levels of organization ultimately resulting in a biological consequence. This consequence can be detrimental in the case of toxicity, or beneficial in the case of therapeutic compounds.

    The term dose refers to the total amount of a substance to which an organism is exposed. Dose is commonly expressed as mass of substance per weight of the organism per time (e.g., mg/kg/d). The entire dose is not necessarily absorbed and distributed to its site of action. The external or applied dose is the amount of a chemical at the interface between the environment and the organism. The biologically effective or internal dose is the amount of toxicant actually reaching the target. The total dose can be calculated if the duration and frequency of exposure are known.
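
    As a minimal arithmetic sketch of the preceding paragraph, the Python snippet below converts a dose rate expressed in mg/kg/d into a total administered dose over an exposure period. The dose rate, body weight and duration are hypothetical values chosen only for illustration.

        # Minimal sketch: total dose from a dose rate in mg/kg/d.
        # All numbers below are hypothetical illustration values.

        def total_dose_mg(dose_rate_mg_per_kg_day: float,
                          body_weight_kg: float,
                          duration_days: float) -> float:
            """Total dose (mg) = dose rate x body weight x exposure duration."""
            return dose_rate_mg_per_kg_day * body_weight_kg * duration_days

        # Example: 0.5 mg/kg/d to a 70 kg adult over 90 days of exposure
        print(total_dose_mg(0.5, 70.0, 90.0))  # 3150.0 mg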

    Responses are generally normally distributed reflecting the variation within a population of responses. Those responding at lower doses or concentrations reflect sensitive individuals or hyper-responders. Individuals that are more resistant to the effects are hypo-responders, whereas most of the members of the exposed population respond to similar doses reflected in the median or average response.

    In both toxicology and pharmacology it is customary to plot the dose as the independent variable on the x-axis and the response as the dependent variable on the y-axis. When the dose is plotted arithmetically, a hyperbolic curve is generated showing the increased response with increased dose (Figure 1.1). If the dose is log-transformed and plotted against response, an approximately straight line is obtained over much of the range, making statistical analysis easier.

    Figure 1.1 Typical dose–response curve.

    An advantage to the log dose–response plot is a much more straightforward interpretation of differences in potency among a group of toxicants acting through similar modes or mechanisms of action. Chemicals producing the same maximal effect but at a lower dose will occupy a position farther to the left on the plot of the dose–response curve indicating greater potency.

    Most chemicals follow a threshold response, that is, the probability of a response is essentially zero below a certain dose or concentration. This can also be defined as less than an observable response for a population. The position of the dose–response curve provides information on the amount of a chemical that is necessary to elicit a maximal response.
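
    To make the shape of the curve concrete, the sketch below evaluates a generic sigmoidal (Hill-type) dose–response model at several doses and shows the corresponding log-transformed dose scale. The Hill form and the ED50 and slope values are illustrative assumptions, not data from this chapter.

        import math

        def hill_response(dose: float, ed50: float = 10.0, n: float = 2.0,
                          max_response: float = 100.0) -> float:
            """Illustrative sigmoidal (Hill-type) dose-response model.

            ed50, n and max_response are hypothetical parameters: the dose
            giving a half-maximal response, the slope factor and the
            maximal response (%).
            """
            return max_response * dose**n / (ed50**n + dose**n)

        # Tabulate response against dose and log10(dose); the log scale
        # spreads out the low-dose region and makes the central portion of
        # the curve approximately linear.
        for dose in (1, 3, 10, 30, 100):
            print(f"dose={dose:>4}  log10(dose)={math.log10(dose):4.1f}  "
                  f"response={hill_response(dose):5.1f}%")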

    Many adverse reactions involve the interactions of xenobiotics with receptors. The concept of a receptor was described by Langley in 1878, and the term receptor was first used by Paul Ehrlich in the early twentieth century. A receptor is any component of an organism, generally macromolecular, that reacts with endogenous or exogenous ligands. Some examples of receptors include those for hormones, neurotransmitters, small proteins, and opioids.

    Selective toxicity refers to agents that produce injury to an undesirable entity (the uneconomic species) without causing harm to the desired entity (the economic species) [11]. The concept of selectivity can be applied across species, as in the case of pesticides that are designed to eliminate pests without harming humans or other nontarget species, or within a single organism, as in the case of antineoplastic chemicals that are designed to target vulnerable features of cancer cells while not damaging noncancer cells.

    1.4 Disposition of Toxicants in Organisms

    Two divisions of reactions describe the chemical journey from exposure to its final destination at its target: toxicodynamics and toxicokinetics. Toxicodynamics is the study of the interactions and subsequent responses of an organism arising from exposure to a toxicant. Potency and efficacy are two attributes associated with the toxicodynamic phase of the dose–response relationship. These interactions include the full range of available chemical bonding schemes, including covalent bonding, hydrogen bonding, ionic and other noncovalent interactions, and so on. Toxicodynamics can be thought of as what the chemical does to the body. Potency is defined as the dose of a chemical required to achieve a maximal response.

    Efficacy, or intrinsic activity, is related to the affinity that a toxicant has for a particular receptor and to the resulting biological response [12]. The affinity of a toxicant for a receptor reflects its tendency to form a stable complex that results in a biological response. This concept explains the differences between full agonists, partial agonists, and antagonists. An agonist binds to a target site and produces a complete response. A partial agonist binds to a target site and produces a predictable but diminished response. An antagonist binds to a receptor with no resulting response. A xenobiotic that achieves the same maximal response at a lower dose than another compound requires to reach that maximal response is considered to be more potent.

    Toxicokinetics describes the processes associated with the time course of a xenobiotic along its pathway to its receptor site or sites. Generally, kinetics is the study of the time course of movement and the time course of chemical reactions including those processes associated with toxicity. In familiar terms, toxicokinetics describes the processes that the body performs on the xenobiotic.

    How do toxicants access their sites of biological action? Unless they act directly at the exposure site, then they must be transported to the site of action through Absorption, Distribution, Metabolism, and Excretion, commonly referred to by the acronym ADME. All of these factors have a role in determining the amount of toxicant reaching the target site as well as the length of time the xenobiotic remains in the organism (Figure 1.2).

    Figure 1.2 Factors influencing the concentration at the target site.

    The applied dose of a chemical refers to the amount of a chemical that comes in contact with an organism or receptor. The journey to the site of biological action is governed by four primary processes that control the amount of xenobiotic that will ultimately reach the site of biological action. The internal dose or biologically effective dose is the amount of a chemical that reaches the site of action. In general terms, the flux of a chemical in an organism is determined by: (1) the extent and rate of uptake at the site of exposure; (2) the rate of distribution to the tissues; (3) the extent of biotransformation; and (4) how quickly and efficiently the compound is eliminated from the organism. The totality of these interactions can be described as what the organism does to the chemical.

    1.4.1 Absorption

    Absorption is the process by which chemical, biological and physical agents cross biological membranes. In most organisms chemical absorption is governed by the reactions occurring at biological membranes, which are generally composed of lipid bilayers with polar head groups reflecting their amphipathic nature. The major sites of absorption are the gastrointestinal (GI) tract, the lungs, the skin (dermal absorption) and the gills of aquatic organisms. Absorption across each of these anatomic structures is dictated by the properties of the compound and the properties of the membrane itself. Nonpolar, unionized organic chemicals can traverse biological membranes through passive diffusion because of the lipid nature of biological membranes. Absorption is primarily dependent upon the lipophilicity and charge of the compound and the presence of any specific transporter systems. Specific membrane transporters include those for small molecules and certain amino acids.

    Absorption from the GI tract is one of the most common and most well described routes of exposure, due mainly to studies on pharmaceuticals. The primary factors influencing absorption from the GI tract are the pH at the particular site within the GI tract, and the pKa and lipid solubility of the molecule.

    For toxicants that are weak acids and bases, the pH partition theory can be used to determine the extent of ionization of a toxicant, which in turn helps characterize the extent of absorption. The Henderson–Hasselbalch equation relates the pH and the pKa of a compound to the ratio of ionized to nonionized species (and hence the percent ionized) at a given pH:

    (1.1) For a weak acid (HA ⇌ H+ + A−):  pH = pKa + log([A−]/[HA])

    (1.2) For a weak base (BH+ ⇌ H+ + B):  pH = pKa + log([B]/[BH+])

    Nonionized compounds are absorbed more efficiently than ionized compounds. The pH obviously influences the ratio of ionized to nonionized species; therefore, the change in pH along the GI tract profoundly influences the extent of absorption at a particular anatomical location. At physiological pH, most weak organic acids and bases will exist in various proportions of ionized and nonionized forms based on their pKa.
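
    A short numerical sketch of the pH partition idea follows: it computes the fraction of a weak acid that remains nonionized at two representative GI pH values using the Henderson–Hasselbalch relationship. The pKa of roughly 3.5 is an approximate literature value for aspirin, used here only as an illustration.

        # Fraction of a weak acid (HA <-> H+ + A-) that is nonionized at a
        # given pH, from the Henderson-Hasselbalch relationship:
        #   [ionized]/[nonionized] = 10**(pH - pKa)

        def fraction_nonionized_weak_acid(pH: float, pKa: float) -> float:
            ionized_over_nonionized = 10 ** (pH - pKa)
            return 1.0 / (1.0 + ionized_over_nonionized)

        pKa_aspirin = 3.5  # approximate value, for illustration only
        for pH in (2.0, 6.5):  # roughly stomach vs. small intestine
            frac = fraction_nonionized_weak_acid(pH, pKa_aspirin)
            print(f"pH {pH}: {frac:.1%} nonionized")
        # At pH 2 the acid is mostly nonionized (favoring absorption);
        # at pH 6.5 it is almost entirely ionized.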

    Special Topic 1: Design for reduced oral absorption

    The process of oral absorption in humans is dependent upon a number of physicochemical characteristics of the molecule, the absorptive surface of the membrane and the surrounding conditions (e.g., pH). Absorption is often a first and necessary step in a complete pathway to toxicity, therefore, any molecular modifications that reduce the potential for absorption will likely reduce or eliminate toxicity.

    Lipinski's Rule of Five

    Lipinski and colleagues examined the influence of a selected set of physicochemical properties on the extent of oral absorption for a group of pharmaceuticals to determine whether a quantitative structure–activity relationship (QSAR) could be established for predicting the success of new drugs as part of the drug discovery process [13]. The result is a set of guidelines for deciding whether an unknown molecule is drug-like, that is, likely to be absorbed into the general circulation. Compounds will be well absorbed if they possess the following characteristics:

    Partition coefficient log P < 5

    Molecular weight <500 g/mol

    Fewer than 5 hydrogen bond donors

    Fewer than 10 hydrogen bond acceptors

    Reverse design.

    These guidelines can be used to design safer molecules by exploiting molecular features that decrease the absorption of a potentially toxic molecule. This idea of reverse design applies the inverse of Lipinski's rules to reduce the likelihood of a toxicant reaching a site of action: increasing log P above 5, increasing molecular weight above 500 g/mol, and increasing the number of hydrogen bond donors and acceptors. Recent work has shown that the physicochemical properties of known toxic compounds are statistically different from those of bulk chemicals.
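
    The sketch below encodes the four criteria listed above as a simple rule-of-five check and shows the reverse-design reading: accumulating violations suggests reduced oral absorption. The property values must be supplied from elsewhere (for example, a cheminformatics toolkit); the numbers in the example calls are hypothetical.

        # Count violations of the Rule of Five criteria listed above.
        # Property values are supplied by the caller; the example numbers
        # below are hypothetical.

        def rule_of_five_violations(log_p: float, mol_weight: float,
                                    h_bond_donors: int,
                                    h_bond_acceptors: int) -> int:
            violations = 0
            if log_p >= 5:
                violations += 1
            if mol_weight >= 500:
                violations += 1
            if h_bond_donors >= 5:
                violations += 1
            if h_bond_acceptors >= 10:
                violations += 1
            return violations

        # A small drug-like molecule (hypothetical values): no violations.
        print(rule_of_five_violations(log_p=2.1, mol_weight=320,
                                      h_bond_donors=2, h_bond_acceptors=4))
        # A large, very lipophilic molecule: several violations, i.e. the
        # "reverse design" region where systemic absorption is less likely.
        print(rule_of_five_violations(log_p=7.5, mol_weight=820,
                                      h_bond_donors=6, h_bond_acceptors=12))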

    1.4.1.1 Respiratory Tract

    The lungs of mammals and the gills of aquatic organisms are the primary routes of respiratory exposure. The lungs have a high surface area (approximately 140 m² in humans) and receive the entire cardiac output, resulting in extremely high exposure to volatile, inhalable compounds. The rate of absorption in the lungs is controlled by the rate of diffusion across the alveolar membranes. Alveoli are small sac-like structures that have very thin membranes, on the order of 0.2–0.4 μm in humans, and a very high surface area. The blood:gas partition coefficient therefore dominates absorption across pulmonary tissue. Chemicals with a high blood:gas partition coefficient will be rapidly distributed to the circulation, whereas chemicals with low blood:gas partition coefficients will not be distributed efficiently from inhaled air to the blood.

    Particle size is another factor in pulmonary absorption efficiency. Larger particles are not absorbed as efficiently as smaller particles. Particles less than 1 μm will reach the deep lung and eventually be absorbed by the alveoli. Particles approximately 5 μm will be deposited in the tracheobronchiolar region and cleared by the mucociliary system. Particles greater than 5–10 μm are cleared by the nasopharyngeal system and are ultimately swallowed and deposited in the GI tract.
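
    The size cut-offs described above can be written as a simple classifier; the thresholds are taken directly from the preceding paragraph and are approximate.

        # Approximate deposition region of an inhaled particle, using the
        # size ranges given in the text above.

        def deposition_region(diameter_um: float) -> str:
            if diameter_um < 1.0:
                return "alveolar (deep lung); may be absorbed"
            if diameter_um <= 5.0:
                return "tracheobronchiolar; cleared by the mucociliary system"
            return "nasopharyngeal; cleared, swallowed and sent to the GI tract"

        for d in (0.5, 3.0, 8.0):
            print(f"{d} um -> {deposition_region(d)}")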

    1.4.1.2 Dermal Absorption

    The blood flow and surface area of the skin are much smaller than those of either the GI tract or the respiratory system; therefore the kinetics of absorption are very different. The rate-determining step for dermal absorption is transfer across the stratum corneum, the outer layer of the skin. The permeability coefficient is used to describe mathematically the rate of transfer across the dermal barrier. The properties that most influence dermal permeability are molecular size, water solubility, lipophilicity, and the presence of solvent carriers. Smaller, lipid-soluble molecules will traverse the skin of mammals or the chitin exoskeleton of insects more readily than larger, water-soluble compounds.

    Bioavailability is a measure of the rate and extent to which a compound reaches the general circulation; it is a property of a compound that depends upon physicochemical factors similar to those used to evaluate absorption [13]. Predicting the extent of bioavailability requires knowledge of a compound's particle size and charge, lipophilicity, water solubility, and ionization potential. In evaluating potential bioavailability in aquatic organisms and other wildlife, the extent of partitioning to soils and sediments, the organic carbon content, and volatility must also be considered [3]. Designing chemicals that possess properties that limit bioavailability will result in reduced concentrations in the organism and ultimately in a lower hazard.

    1.4.2 Distribution

    Distribution is defined as the movement of a compound from its site of exposure to other sites in an organism including the sites of action [14]. The rate of distribution is determined primarily by blood flow and the rate of diffusion into target cells. Well-perfused tissues and organs such as the liver, lungs, and heart essentially have instantaneous distribution. Less well-perfused tissues, such as fat and bone, may require several weeks to reach equilibrium.

    The amount of a chemical in an organism can be quantitatively measured by using a relationship between the initial dose of a toxicant and the volume into which it appears to distribute. The volume of distribution (Vd) is a convenient way to measure the apparent volume into which a compound is distributed in an organism. The Vd does not correspond to any specific physiological compartment but is a theoretical volume used to estimate the extent of distribution within biological compartments:

    (1.3) Vd = total amount of compound in the body (mg) / concentration in plasma (mg/L)

    The Vd relates the total amount of compound in the body to its concentration in the blood. In general, hydrophilic compounds will remain in the aqueous central compartment, whereas more lipophilic compounds will slowly partition into fatty compartments. The major compartments and volumes are presented in Table 1.2.

    Table 1.2 Distribution of body water in humans [12].

    Compounds that have a high affinity for fatty tissue may have a Vd that is much greater than the total body water, whereas compounds that bind well to plasma proteins are effectively removed from distribution to the tissues and are largely restricted to the plasma volume. The usefulness of Vd lies in indicating the extent to which a chemical is available for distribution.
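
    A worked example of Equation (1.3) follows. The dose and plasma concentration are hypothetical round numbers, and the compartment volumes used for comparison are approximate values for a 70 kg adult, in the spirit of Table 1.2.

        # Worked example of Eq. (1.3): apparent volume of distribution,
        # Vd (L) = amount of compound in the body (mg) / plasma
        # concentration (mg/L). All numbers are hypothetical.

        def volume_of_distribution_L(amount_in_body_mg: float,
                                     plasma_conc_mg_per_L: float) -> float:
            return amount_in_body_mg / plasma_conc_mg_per_L

        vd = volume_of_distribution_L(amount_in_body_mg=100.0,
                                      plasma_conc_mg_per_L=0.5)
        print(f"Vd = {vd:.0f} L")  # 200 L

        # Approximate comparison volumes for a ~70 kg adult:
        # plasma ~3 L, total body water ~42 L. A Vd far above total body
        # water suggests extensive partitioning into tissues (e.g., fat).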

    The blood–brain barrier (BBB) is a specialized barrier, formed largely by capillary endothelial cells joined by tight junctions, that protects the central nervous system from the potential adverse effects of xenobiotics. These tight junctions prevent many chemicals from entering the brain and central nervous system; however, the barrier is not perfect. In young and developing organisms the BBB is not fully developed and is therefore not a robust barrier to many toxicants. Some small molecules can cross the BBB in adults [15].

    1.4.3 Metabolism

    Metabolism, or biotransformation, is the process of chemical transformation of a toxicant to different structures, called metabolites, which likely possess a different toxicity profile than the parent compound. Biotransformation affects both endogenous chemicals and xenobiotics. Metabolism can result in a transformation product that is less toxic, more toxic, or equitoxic but in general more water soluble and more easily excreted.

    Chemical modification can alter biological effects through toxication, also called bioactivation, which refers to the situation where the metabolic process results in a metabolite that is more toxic than the parent. If the metabolite demonstrates lower toxicity than the parent compound, the metabolic process is termed detoxication. These processes can involve both enzymatic and nonenzymatic reactions. Excellent detailed reviews of metabolism can be found in Casarett and Doull's Toxicology [16].

    The liver is the primary organ of metabolism in mammals for both exogenous and endogenous chemicals. The enzymes associated with biotransformation have broad substrate specificity, which allows a few enzymes to interact with a myriad of xenobiotics. This attribute is part of the adaptable defense mechanism present in all organisms. The synthesis of these enzymes is inducible, meaning that exposure to the compound itself or to other compounds can increase the production of metabolic enzymes to deal with exposures to potentially toxic compounds.

    Metabolism is divided into two major phases based on the general reactions associated with each category. Phase I metabolism prepares compounds for excretion by revealing or adding a functional group that makes a compound more water soluble and therefore more easily excreted. Phase I enzymes are located in almost all tissues; however, the greatest concentration is in the liver. Other sites include the gut, skin, and lung. The major classes of Phase I reactions are hydrolysis, reduction, and oxidation. Hydrolysis reactions include those catalyzed by the cholinesterase and pseudocholinesterase enzymes involved in the metabolism of pesticides and other ester-containing molecules. The reduction of azo- and nitro-containing compounds is an essential detoxication mechanism as well as a necessary step in the generation of the therapeutically active molecule sulfanilamide from its precursor, prontosil.

    The cytochrome P450 enzymes are the major catalysts responsible for the variety of oxygenation reactions associated with the biotransformation of many xenobiotics [17]. Cytochrome P450 enzymes have high catalytic versatility and are located in the endoplasmic reticulum, recovered in the microsomal fraction of cells. The main function of this group of isozymes is to insert one atom of oxygen into a substrate, thereby increasing hydrophilicity. These enzymes are heme-containing proteins in which a reduced iron species is essential for transferring electrons; they work in concert with NADPH and NADPH–cytochrome P450 reductase.

    Phase II metabolism is associated with synthetic conjugation reactions including glucuronidation, sulfation, methylation and conjugation with glutathione. Phase I metabolites are often used as substrates for Phase II reactions. Phase II products are water soluble and therefore more easily excreted by the kidney.

    1.4.4 Excretion

    Excretion is the removal of a chemical from an organism through any of the available processes including respiration, or urinary, biliary and fecal excretion. Toxicants or xenobiotics can be eliminated from organisms by several routes depending on the anatomy and physiology of the organism.

    For water soluble compounds, or compounds that have been metabolized to water soluble compounds, the kidney is the main organ of excretion in most organisms. Compounds with molecular weights less than 60 kDa are filtered by the glomerulus. Protein binding will decrease filtering by the kidney resulting in a greater amount of the chemicals remaining in the general circulation. Volatile compounds are excreted primarily through exhalation. Lipid soluble compounds are excreted through bile and feces.

    1.5 Non-Organ System Toxicity

    Not every compound exerts its toxicity through an anatomical receptor or tissue that we refer to as an organ. Many xenobiotics act through toxic pathways that are not organ-system specific. Important categories of non-organ-system toxicity include carcinogenesis, reproductive and developmental toxicity, and immunotoxicity.

    1.5.1 Carcinogenesis

    Carcinogenesis is a multistage process associated with the induction of neoplasms, or new growths, that leads to a family of disease states commonly termed cancers. Cancer is not a single disease but is made up of multiple conditions that share common traits.

    Three main steps that comprise carcinogenesis are initiation, promotion, and progression. Initiation involves a permanent change in the fundamental nature of a cell. A cell can remain in the initiated state indefinitely until acted upon by a promoter. Mutagens are compounds that act directly on DNA causing mutations.

    A promoter is the trigger for an initiated cell to multiply into larger groups of neoplastic cells. Promoters can act in a number of ways, for example by acting on oncogenes, killing normal cells that surround initiated cells, or inhibiting the action of suppressor genes, thereby resulting in the loss of cell-cycle control and unchecked cell proliferation. Promotion, unlike the initiation phase of carcinogenesis, is thought to follow a dose–response relationship. A complete carcinogen acts as both an initiator and a promoter.

    Progression is the final step in the carcinogenic triad. Cells that have reached this stage tend to demonstrate the familiar attributes of malignant tumors, specifically invasion of nearby tissues, metastasis, and loss of differentiation.

    Many carcinogenic molecules are electrophiles, or undergo bioactivation to electrophiles [18], that can bind to cellular nucleophiles such as DNA and proteins, causing myriad adverse effects including covalent binding to DNA, disruption of critical enzyme pathways, and destruction of structural cellular components.

    1.5.2 Reproductive and developmental toxicity

    The processes associated with reproduction and development in all organisms, including plants, are extremely complex, awe-inspiring events requiring flawless execution of elegant combinations of timing and process fidelity. Any errors at critical stages in the process can lead to devastating consequences including physical malformations, increased reproductive failures, physiological deficits, and death.

    Reproductive toxicology is the study of the adverse effects on the male and female reproductive systems resulting from exposure to biological, chemical or physical agents. The endocrine system is integral to reproduction and other physiological processes.

    The endocrine system is a group of specialized organs and tissues that regulates many of the activities of other organs and tissues and is the primary system responsible for homeostasis. This regulation takes place through a complex, tightly controlled system of hormones and small peptides that govern physiological functions including reproduction, energy production, metabolism, and growth. The entire endocrine system is ultimately controlled by a neurohumoral feedback system directed by the hypothalamus and the pituitary gland at the base of the brain [4]. The primary regulatory glands of the endocrine system are the gonads, the thyroid gland, and the adrenal glands.

    The endocrine system naturally produces endogenous hormones that are responsible for homeostasis. Several natural disease states and conditions can disrupt the endocrine system, including hyper- and hypothyroidism, estrogen-induced breast cancer and overproduction of adrenocorticosteroids.

    Exposure to certain xenobiotics is associated with adverse endocrine effects. These chemicals are commonly referred to as endocrine disrupting chemicals (EDCs). Endocrine disruption can occur at all levels of physiological organization and with all endocrine organs, for example by mimicking natural hormones, blocking endogenous receptors, or directly affecting the glands themselves.

    Estrogen active or estrogenic compounds have received a great deal of attention from the research community and from the public because of their putative role in breast and ovarian cancers as well as with disrupting reproduction and development in aquatic communities. Many compounds are suspected endocrine disruptors including pesticides, industrial chemicals (e.g., nonylphenol), plasticizers (e.g., phthalates), and some pharmaceuticals (e.g., DES).

    Developmental toxicology is the study of adverse effects in a developing organism. Teratology is a subdiscipline of developmental toxicology that focuses on the specific time period between conception and birth. The word teratology is derived from the Greek word teratos, meaning monster. Many teratological effects are quite obvious, for example, cleft palate and missing limbs. The timing of exposure is of paramount concern, especially for developmental toxicity. Critical exposure periods of susceptibility correlate with the timing of organ development and are quite precise. Any perturbation of the normal timing significantly increases the likelihood of adverse developmental effects including teratogenesis. Guidelines have been developed to assess developmental risks [19].

    Thalidomide is among the most notorious examples of how a slight change in molecular structure can influence biological effects. Thalidomide was a drug prescribed to pregnant women to mitigate morning sickness. Within a year of its introduction onto the market, reports of severe limb malformations in newborns appeared, and the drug was withdrawn soon after these adverse effects became known.

    Thalidomide exerts its teratogenic effects by interfering with organogenesis between days 24 and 33 of development [20]. The exact mechanism is not known; however, intercalation into DNA is one of the leading hypotheses among the more than thirty proposed. Thalidomide exists in two isomeric forms, the S- and the R-enantiomers (Scheme 1.1). Research has indicated that only the S-enantiomer is capable of intercalating into DNA, resulting in toxicity. Understanding the relationship between the structural requirements and the exposure limitations necessary for developmental toxicity has enabled pharmacologists to identify new clinical uses for thalidomide, including the treatment of leprosy, AIDS and some aggressive forms of cancer [21].

    Scheme 1.1 R- and S-isomers of thalidomide.

    1.5.3 Immunotoxicology

    Concern for potential toxicity to the immune system arises from the central role the immune system plays in maintaining and protecting health. Immunotoxicology is a blend of toxicology and immunology, concerned with adverse changes in immune structure or function resulting from exposure to a toxicant [22]. Immunotoxicity can manifest as hypersensitivity or allergic reactions (e.g., dermatitis, inflammation), immunodeficiency (e.g., that caused by HIV/AIDS) and autoimmunity (e.g., systemic lupus erythematosus). Substances that provoke immune responses are termed antigens. Antigens can be beneficial, for example in neutralizing infectious agents, or the responses can be adverse and result in anaphylaxis or immune deficiency.

    The immune system is a highly complex and well-regulated system of cells, tissues, organs and molecular mediators that respond to both endogenous and exogenous challenges. The cells of the immune system are white blood cells, including macrophages and lymphocytes, specifically T-cells and B-cells. These cells can directly destroy invading substances or begin a cascade of responses against the toxicant through chemical signaling. B-cells secrete antibodies (immunoglobulins) that recognize and bind specific antigens. T-cells regulate the magnitude of the immune response through signal modulation.

    The organs associated with the immune system include the bone marrow, the thymus, the lymph nodes, the spleen and the tonsils.

    1.6 Mechanistic Toxicology

    Mechanistic toxicology focuses on elucidating and describing the molecular events that connect exposure to the disruption of biological targets and describes the resulting adverse outcomes in living systems. A mechanism of action is defined as a detailed description of the key molecular events associated with a toxic response. A mode of action is a generic description of the key events and processes, starting with the interaction of an agent with a cell, through functional and anatomical changes, resulting in toxicity. Advances in elucidating mechanisms and modes of action at the molecular level have informed decisions regarding the relationship between molecular structure and adverse outcomes. Ideally, the steps in the pathway of toxic response are identified and connected to the manifestation of toxicity. Making this connection is extremely challenging, and few mechanisms have been described in detail. Significant progress has been made in the past several decades in identifying mechanisms and modes of action; however, there is much work to be done. Chemists are familiar with reaction mechanisms as they pertain to synthetic reactions, but are often less familiar with those same fundamental mechanisms applied to biological situations [23, 24].

    Determining individual steps involved with the manifestation of an adverse response provides a starting point for documenting opportunities to control the structure–toxicity relationship which is a critical step in the process for designing safer chemicals.

    The manifestation of toxicity reflects a complex sequence of connected events. This journey can be thought of as a molecular itinerary that describes a chain of related events linking exposure with each step in a toxicity pathway, leading ultimately to the manifestation of toxicity. Elucidating these mechanisms forms the nexus between synthetic chemistry and toxicology and provides the basis of green toxicology and safer chemical design.

    Mechanisms are rarely single events occurring in isolation. In reality, toxic mechanisms are complex multifaceted events involving feedback and repair mechanisms, transport to the site of action, molecular transformation, reaction with target molecules and ultimately excretion.

    Modes of action can be divided into two general categories: nonspecific (narcosis) and specific. Narcosis, also called baseline toxicity, is a generalized depression of biological activity resulting from the presence of toxicants. The exact mechanism is not known; however, several theories have been advanced to characterize the process more precisely. Some examples of chemicals that act through baseline narcosis include ethanol and general anesthetics [25]. Narcosis has been further divided into polar and nonpolar modes.

    Specific toxicity reflects the interaction of compounds with identifiable biological targets. Examples of specific endpoints include oxidative uncouplers, DNA alkylators, acetylcholinesterase inhibitors and central nervous system active compounds.

    Progress in the rapidly expanding, multidisciplinary field of toxicogenomics has provided essential insight into the action of xenobiotics at the level of gene expression. Toxicogenomics holds the promise of revealing the effects of toxicants on the genome and the expression of these alterations at the biochemical, cellular, tissue and organism levels. Tens of thousands of genes exist in the human genome and not all of them are expressed. The differential expression of these genes is responsible for normal function as well as for the responses to exposure to a xenobiotic. Toxicants perturb normal functioning at all levels of organization in an organism.

    1.7 Quantitative Structure–Activity Relationships

    The relationship between chemical structure and biological activity has been a source of curiosity since the late nineteenth century. In 1893, Richet described the relationship between toxicity and structure in short chain length alcohols. Meyer and Overton established the first relationship between the lipid solubility of a chemical and narcosis in tadpoles. A more quantitative treatment of structure–activity relationships using n-octanol as the preferred lipophilic solvent was developed by Hansch et al. [26, 27]. These studies established the octanol–water partition coefficient, Kow, as the standard for characterizing lipophilicity in biological systems. A more in-depth treatment and discussion of the history of QSAR can be found in the work of Selassie et al. [28]. QSAR models can be used for establishing quantitative relationships between structure and activity or structure and property, for predicting the potential activity of compounds of unknown toxicity, and for designing safer chemicals.

    A molecular descriptor is any parameter used in the development of a SAR or QSAR to relate a molecular property (e.g., an aspect of molecular structure, or log P) to some biological attribute or response (e.g., toxicity, LC50, carcinogenicity, and so on). A plethora of potential property–response combinations exist and many have been developed with a high degree of repeatability [29]. Some of the more common descriptors are listed in Table 1.3.

    Table 1.3 Examples of molecular descriptors.

    The partition coefficient between water and n-octanol occupies a central role in predicting the behavior of nonpolar organic molecules and is the most important of the molecular descriptors used to date. Hansch, Leo and Fujita performed the seminal work investigating the influence of partition coefficients on biological action [30]. In a biphasic system containing immiscible liquids, a partition coefficient can be determined by measuring the concentration of the chemical under investigation in each of the two phases at equilibrium. This relationship can be represented by the equation:

    (1.4) P = X0/Xaq

    where P is the partition coefficient, X0 is the toxicant concentration in n-octanol, and Xaq is the toxicant concentration in the aqueous phase.

    When the organic phase is n-octanol, the partition coefficient is represented by Kow. The relationship is parabolic when the log of the inverse of the concentration is plotted against the log of the partition coefficient and reflects the influence that the degree of lipophilicity has on the movement of compounds across biological membranes. At low lipid solubilities, compounds do not pass as readily through membranes as do compounds that are more lipid soluble.

    Partition coefficient values can span five or six orders of magnitude; therefore, log (base 10) values are often used. Most xenobiotics with a log Kow between 2 and 6 will cross membranes easily and effectively. Above a log Kow of about 6, compounds are said to be super-lipophilic, and their passage across membranes diminishes because they dissolve so strongly in the membrane lipids that they effectively remain there, or because the time to reach equilibrium is too great to be measured on biologically important time frames.
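
    Equation (1.4) can be applied directly to measured equilibrium concentrations, as sketched below; the concentrations are hypothetical, and the 2–6 window used in the interpretation comes from the paragraph above.

        import math

        # Eq. (1.4): P = X0 / Xaq, where X0 and Xaq are the equilibrium
        # concentrations of the chemical in n-octanol and in water.
        # The concentrations below are hypothetical illustration values.

        def log_kow(conc_octanol: float, conc_water: float) -> float:
            return math.log10(conc_octanol / conc_water)

        lk = log_kow(conc_octanol=250.0, conc_water=0.05)  # same units
        print(f"log Kow = {lk:.1f}")
        if 2.0 <= lk <= 6.0:
            print("expected to cross biological membranes readily")
        elif lk > 6.0:
            print("super-lipophilic; membrane passage diminishes")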

    QSARs have been used to develop mathematical relationships between structure and biological effect using regression analysis. An early regression equation was derived using data generated by Overton for the ability of several alcohols to induce anesthesia in tadpoles. The regression analysis produced the QSAR model:

    (1.5) log(1/C) = a log P + b

    where C is the concentration of the chemical needed to produce anesthesia, P is the partition coefficient, and a and b are constants obtained from the regression.
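
    The sketch below fits the linear form of Equation (1.5) to a small, entirely hypothetical data set by ordinary least squares, simply to show how the constants a and b would be obtained; it does not reproduce Overton's tadpole data.

        import numpy as np

        # Fit log(1/C) = a*log P + b on a hypothetical data set.
        # (Illustration only; these are not Overton's narcosis data.)
        log_P     = np.array([0.5, 1.0, 1.5, 2.0, 2.5])   # hypothetical
        log_inv_C = np.array([1.3, 1.8, 2.2, 2.7, 3.1])   # hypothetical

        a, b = np.polyfit(log_P, log_inv_C, deg=1)
        print(f"log(1/C) = {a:.2f} * log P + {b:.2f}")

        # Predict the anesthetic concentration C for a compound with
        # log P = 1.8 (same units as the hypothetical data set).
        pred = a * 1.8 + b
        print(f"predicted C = {10 ** (-pred):.4f}")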

    A QSAR is only as good as the quality of the data used to develop the model. There are potentially significant limitations in using QSARs to describe structure–activity relationships and in applying these relationships to predict the responses of untested molecules with the same or similar physicochemical properties. One must understand the underlying mechanisms and modes of action as completely as possible, as well as all the other factors that modify the toxicity of the chemical under study. Modifying factors include anatomical and physiological differences among species (interspecies differences) and intraspecies differences that account for variability within populations, associated with genetic polymorphism, age, gender, and disease state. A structure–activity relationship is usually derived using molecules of similar molecular structure, for example congeners of polychlorinated biphenyls or substituted halogenated aromatics. Extrapolating these results to predict the toxicity of compounds with different molecular attributes is often unsuccessful. Special Topic 2 describes a method developed by DeVito and colleagues for designing safer nitriles using a mechanistic and QSAR approach [31, 32].

    Special Topic 2: Designing Safer Nitriles

    Designing safer nitriles provides an example of how an understanding of the mechanistic nature and QSAR of a class of toxic chemicals can lead to a statement of design rules that inform the design of safer chemicals. Nitriles are a class of chemicals widely used for a variety of applications, including as solvents, in medicines and in other industrial applications. Nitriles occur naturally in both plants and animals and are also synthesized. Their ubiquitous nature and volume of use mean that the number of individuals potentially exposed to nitriles is significant; therefore, evaluating and reducing the risk associated with exposure to this class of compounds is warranted.

    All nitriles contain the cyano functional group (CN). The toxicity of nitriles is similar to that of cyanide intoxication, implying that the cyanide moiety released from the molecule is the ultimate toxicant. DeVito and others have developed a mechanism-based model for predicting the acute toxicity of nitriles based on the rate of α-hydrogen atom abstraction by cytochrome P450, with radical stability as the primary variable [31]. An evaluation of toxicity, as measured by the LD50, compared with the structural characteristics of selected nitriles revealed that the critical mechanistic step is the rate of α-hydrogen abstraction: a higher rate led to greater acute toxicity [32]. From this evaluation, structural modifications for reduced hazard, or design rules, were derived. Among the molecular attributes associated with lower acute toxicity were: (1) steric hindrance around the α-hydrogen to restrict cytochrome P450 enzyme access; (2) groups that reduce the stability of the α radical; and (3) avoidance of heteroatom-containing groups on the α carbon.
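
    Design rule (1) can be screened for computationally. The sketch below uses RDKit (an assumed toolkit choice; any cheminformatics library would do) with a SMARTS pattern that approximates "an sp3 α-carbon bearing at least one hydrogen adjacent to a nitrile group". The pattern and the example structures are illustrative only and are not the published model of DeVito and colleagues.

        from rdkit import Chem

        # Approximate SMARTS for an sp3 alpha-carbon with at least one
        # hydrogen attached to a nitrile carbon (a flag for potentially
        # facile alpha-hydrogen abstraction).
        ALPHA_H_NITRILE = Chem.MolFromSmarts("[CX4;!H0][CX2]#N")

        def has_abstractable_alpha_h(smiles: str) -> bool:
            mol = Chem.MolFromSmiles(smiles)
            return mol is not None and mol.HasSubstructMatch(ALPHA_H_NITRILE)

        examples = {
            "acetonitrile (CH3CN)": "CC#N",           # alpha-H present
            "pivalonitrile (t-BuCN)": "CC(C)(C)C#N",  # no alpha-H (quaternary)
        }
        for name, smi in examples.items():
            print(f"{name}: alpha-H next to CN? {has_abstractable_alpha_h(smi)}")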

    1.8 Environmental Toxicology

    Green toxicology also uses the principles of environmental chemodynamics and environmental (ecological) toxicology to evaluate the behavior of chemicals in the environment, incorporating this information into the evaluation of biological, chemical and physical hazards to wildlife and plants. Morphology, specialized biochemical mechanisms, persistence and bioaccumulation can all modify toxicity. Insects have a protective outer layer of chitin, a polysaccharide, that provides a barrier to penetration by most organic compounds. Aquatic toxicology has developed sufficiently over the past several decades to emerge as a separate, focused and independent discipline examining the adverse effects of xenobiotics on aquatic organisms. Many aquatic organisms use gills to deliver oxygen to critical sites. Plant morphology and biochemistry present features that must be examined and evaluated when characterizing phytotoxicity. Specialized cell walls provide a protective barrier that animal cells do not possess, which affects the ability of toxicants to reach targets in plants.

    1.8.1 Persistence and bioaccumulation

    Highly persistent and bioaccumulative chemicals are generally recognized as environmental hazards, prompting national and international regulatory agencies to develop standards and guidelines for characterizing these chemical attributes [33].

    Persistent chemicals resist natural environmental breakdown and may remain in the environment for many decades. Persistence is determined by evaluating chemical half-lives in the various environmental media. The inherent stability of a molecule toward the degradation pathways available in each medium (i.e., air, water, soil/sediment, organisms) determines its half-life and consequently its persistence and bioaccumulation potential.
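    For a pollutant that degrades by (pseudo-)first-order kinetics, the half-life follows directly from the rate constant, t1/2 = ln 2 / k, and can be compared with screening criteria. The sketch below uses a hypothetical rate constant and illustrative thresholds; actual regulatory criteria differ by medium and program.

        import math

        def half_life_days(k_per_day: float) -> float:
            """Half-life for first-order degradation: t1/2 = ln(2) / k."""
            return math.log(2) / k_per_day

        # Illustrative screening thresholds in days (actual criteria vary by medium and jurisdiction)
        PERSISTENT_DAYS = 60
        VERY_PERSISTENT_DAYS = 180

        k = 0.005                       # hypothetical first-order rate constant in water, per day
        t_half = half_life_days(k)      # roughly 139 days
        print(f"half-life: {t_half:.0f} days")
        print("flagged as persistent" if t_half >= PERSISTENT_DAYS else "not flagged as persistent")
        print("flagged as very persistent" if t_half >= VERY_PERSISTENT_DAYS else "not flagged as very persistent")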

    Chemicals that are slow to transform and thus resist degradation in the environment are termed recalcitrant or refractory pollutants. Persistence is a relative distinction dictated by the measurement parameters used to assess degradation under standardized conditions and by the circumstances of use. Environmental persistence is therefore operationally defined, often guided by regulatory requirements, by comparing the media-specific half-lives of the chemical in question with a series of predetermined criteria. Chemical transformations can occur biotically (e.g., enzymatic metabolism), abiotically (e.g., hydrolysis, photolysis, oxidation) or through a combination of both pathways. Boethling et al. have developed an approach for designing chemicals that degrade quickly in the environment to innocuous substances [34].

    Special Topic 3: Design for Biodegradability

    Chemicals that resist environmental degradation persist in the environment. Designing chemicals for faster biodegradation to innocuous products reduces risk by limiting the time over which organisms can be exposed to hazardous agents. Incorporating molecular characteristics that promote biodegradation is therefore a prudent risk reduction strategy.

    The influence of structure on biodegradability has been investigated recently and generalizations for designing biodegradable molecules have been proposed [34]. As with other types of environmental behavior, certain molecular properties correlate with the rate of biodegradation; these include molecular weight, the degree of branching versus straight-chain character, the position and identity of substituents, and lipophilicity. Examples of molecular features that either increase or decrease biodegradation rates are provided below.

    Bioaccumulation is defined as the net uptake of chemicals from the environment by all possible routes of exposure [3]. Bioaccumulation occurs when the rate of uptake of a chemical exceeds the rate of elimination. Chemicals that bioaccumulate tend to be lipophilic and are stored in fatty tissue, resulting in an increased body burden [35]. The bioaccumulation factor (BAF) is a measure of the tendency of a chemical to bioaccumulate in an organism, expressed as the ratio of the concentration of the chemical in the organism at steady state to its concentration in the environment; a higher BAF indicates a greater likelihood of bioaccumulation. The bioconcentration factor (BCF) considers accumulation from exposure to water only and is therefore generally limited to aquatic toxicology; it is the ratio of the concentration of a chemical in an organism to its concentration in water.
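    Both factors are simple concentration ratios. The sketch below computes them from hypothetical monitoring data; the concentrations and units are invented, and the thresholds used to interpret such values in practice are program-specific and not implied here.

        # Illustrative BAF and BCF calculation (all concentrations hypothetical)
        c_organism = 50.0       # concentration in the organism at steady state, mg/kg wet weight
        c_environment = 0.02    # total environmental concentration (all exposure routes), mg/L
        c_water = 0.01          # dissolved concentration in water only, mg/L

        baf = c_organism / c_environment    # bioaccumulation factor -> 2500
        bcf = c_organism / c_water          # bioconcentration factor (water exposure only) -> 5000
        print(f"BAF = {baf:.0f}, BCF = {bcf:.0f}")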

    1.9 Risk Assessment

    Minimizing risks to human health and the environment from exposure to chemicals is a natural extension of the principles and practice of toxicology and is a central goal of green chemistry and of green toxicology. Risk is defined as the probability that an adverse outcome will occur after exposure to a toxicant. Hazard is the intrinsic toxicity of a molecule, an inherent property of its structure. Risk is a function of both hazard and exposure and can be expressed as a simple mathematical relationship:

    Risk = f(Hazard × Exposure)   (1.6)

    Reducing intrinsic hazard, exposure, or both results in an overall risk reduction; green chemistry seeks to reduce intrinsic hazard [36]. Although this equation appears simple, each component can be extremely complex and challenging to characterize with certainty. Intrinsic or inherent hazard depends on molecular structure, including geometric features, electronic attributes and other physicochemical properties. The regulatory process of chemical risk assessment can be performed in several ways, but the tiered risk assessment paradigm developed by the National Research Council has become the most widely used [37, 38]. Risk assessment integrates qualitative and quantitative information into statements of relative risk. The process consists of four components: (1) hazard identification; (2) dose–response assessment; (3) exposure assessment; and (4) risk characterization. This science-based approach is inherently iterative, in that evaluating existing data reveals needs for future data collection, and it integrates information from various sources to characterize risks to humans and to ecological receptors. Risk assessment is undertaken to achieve a number of goals, including balancing risks and benefits, setting acceptable risk levels, ranking chemicals to prioritize research, and identifying risk reduction opportunities such as green chemical design.

    Hazard identification, the first step in the process, determines whether a chemical has the inherent capacity to cause harm. In theory any compound can meet this criterion, so hazard identification is intended to focus the evaluation on chemicals used with high frequency or in large volumes [39]. Data are obtained from a variety of sources including in vitro and in vivo assays, QSARs and, most usefully, epidemiological studies that characterize effects in the target organism directly. In vitro tests include developmental toxicity assays and the Ames test for mutagenicity.

    Animal testing, or in vivo assays, involves exposing experimental animals to the chemical of concern at several predetermined concentrations over appropriate time periods. Currently, data generated from animal testing are a key and indispensable component of the risk assessment process. Extrapolation from animals to humans carries inherent uncertainty that is accounted for using uncertainty or safety factors. Variables including body weight, the homology of genetic material and enzymes, and differences in anatomy and physiology must all be considered to apply animal data confidently to human hazard evaluation.

    Animal testing has several drawbacks: it is expensive (2–4 million US dollars per assay), time consuming (2–5 years per assay), and fraught with animal welfare issues [40]. Alternatives to animal testing, including the use of QSARs, therefore have many advantages. Using QSARs effectively requires input data of sufficient quantity and quality to support structure–toxicity relationships for risk assessment. Many research efforts are now underway to generate these data and to identify and address research gaps. Computerized models have progressed rapidly, allowing data from high-throughput screening efforts to be evaluated [41, 42]. A QSAR approach to risk assessment has been applied successfully to a number of chemical classes, including dioxins and furans, polychlorinated biphenyls (PCBs) and polycyclic aromatic hydrocarbons (PAHs).

    1.9.1 Noncancer risk assessment

    The process of risk assessment has been separated into noncancer and cancer methods for both biological and policy reasons. Noncancer risk assessment identifies a point of departure (POD), such as the no observed (adverse) effect level [NO(A)EL], the lowest observed (adverse) effect level [LO(A)EL] or the benchmark dose (BMD), for the critical toxicological endpoint in the most sensitive and appropriate species. In almost all cases the POD is derived from animal test data. Because uncertainty exists in all point estimates, uncertainty or safety factors, generally a value of 10 per factor, are applied to the chosen POD to account for a number of variables including:

    interspecies (animal to human);

    intraspecies (sensitive individual);

    LO(A)EL to NO(A)EL;

    subchronic to chronic extrapolation;

    database modifying factors.

    The POD is divided by the product of the applicable uncertainty factors to yield a reference dose (RfD), defined as a daily dose of a toxicant at or below which no adverse effects are expected [43]. A lower RfD reflects a more toxic compound.
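    A worked example makes the arithmetic concrete. The values below are hypothetical; in practice additional factors (LOAEL-to-NOAEL, subchronic-to-chronic, database deficiencies) may also apply, and the individual factors are multiplied together before dividing the POD.

        # Illustrative RfD derivation from a NOAEL point of departure (all values hypothetical)
        noael = 10.0                 # NOAEL from a chronic animal study, mg/kg-day
        uf_interspecies = 10         # animal-to-human extrapolation
        uf_intraspecies = 10         # protection of sensitive individuals

        total_uf = uf_interspecies * uf_intraspecies   # uncertainty factors combine multiplicatively
        rfd = noael / total_uf
        print(f"RfD = {rfd} mg/kg-day")                # 0.1 mg/kg-day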

    1.9.2 Cancer risk assessment

    Cancer risk assessment differs from noncancer risk assessment in that no threshold dose is assumed; in other words, a single molecule can theoretically initiate carcinogenesis. Models are used to generate point estimates, with associated confidence intervals, termed cancer slope factors (CSFs), that represent the potency of a carcinogen. A larger CSF indicates a more potent carcinogen, in contrast to the RfD, where a lower value reflects a more potent compound.

    Risk cannot occur without a complete exposure pathway between a chemical and a receptor, so determining the concentration and magnitude of exposure is critical to the risk assessment process. Exposure assessment involves characterizing exposure point concentrations (EPCs), the frequency and duration of exposure, and the body weights and other attributes of the exposed population. If measured values are not available, default assumptions for these exposure parameters are often used, which may reduce confidence in the risk estimates because the population characteristics are estimated rather than observed. The process of exposure assessment is described in guidance produced by the USEPA [44]; a schematic intake calculation is sketched below.
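    A common general form of the intake calculation for an ingestion pathway combines the exposure point concentration with an intake rate, exposure frequency and duration, body weight and an averaging time. The sketch below applies that form to a drinking-water scenario; all parameter values are hypothetical, and the defaults actually used should come from guidance such as [44].

        # Illustrative average daily dose from drinking-water ingestion (all inputs hypothetical)
        c_water = 0.005     # exposure point concentration, mg/L
        ir = 2.0            # water ingestion rate, L/day
        ef = 350            # exposure frequency, days/year
        ed = 30             # exposure duration, years
        bw = 70.0           # body weight, kg
        at = ed * 365       # averaging time for noncancer effects, days

        average_daily_dose = (c_water * ir * ef * ed) / (bw * at)
        print(f"average daily dose = {average_daily_dose:.2e} mg/kg-day")   # about 1.4e-4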

    Risk characterization, the final step in the risk assessment paradigm, integrates the exposure estimates with the hazard information from the hazard identification and dose–response elements to estimate the overall potential risk to human health and the environment. The types of adverse effects (cancer and noncancer) and the magnitude of those effects are part of this analysis, which provides information that risk managers can compare with predetermined risk management criteria.
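    For screening purposes, the comparison is often reduced to two simple quantities: a hazard quotient for noncancer effects (intake divided by the RfD, with values below 1 suggesting adverse effects are unlikely) and an excess lifetime cancer risk under the linear low-dose assumption (intake multiplied by the CSF, compared with a target range such as one in a million to one in ten thousand). The sketch below uses hypothetical values throughout.

        # Illustrative risk characterization (toxicity values and intakes are hypothetical)
        noncancer_intake = 1.4e-4       # average daily dose, mg/kg-day
        rfd = 0.1                       # reference dose, mg/kg-day
        hazard_quotient = noncancer_intake / rfd            # about 1.4e-3, well below 1

        lifetime_intake = 5.9e-5        # lifetime average daily dose, mg/kg-day
        csf = 0.05                      # cancer slope factor, (mg/kg-day)^-1
        excess_cancer_risk = lifetime_intake * csf          # about 3e-6

        print(f"hazard quotient = {hazard_quotient:.1e}")
        print(f"excess lifetime cancer risk = {excess_cancer_risk:.1e}")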

    1.10 Conclusions

    We are at the dawn of a perfect sunrise: as citizens and scientists, environmentalists and engineers, we have the opportunity to act upstream of pollution, decreasing the pressure downstream from sources of inherently toxic pollution, by advocating for and applying the principles of green chemistry and green toxicology.

    The desire to design safer chemicals has been articulated for a number of years and has been incorporated into pharmaceutical and industrial chemistry research strategies [32, 45]. The first step in designing a safer chemical is to establish the relationships between molecular structure, functionality and adverse biological (toxicological) outcomes. Chemists are familiar with the properties required for functionality, for example in the preparation of dyes, solvents, surfactants, pharmaceuticals and other commercially important products. They may be less familiar with the process of evaluating the structure–hazard relationship for potential toxicity.

    The same chemical principles apply to designing for reduced hazard as to designing for functionality. The difference is that the spectrum of potential toxic effects must be considered in the design phase and treated as design flaws in the same way as deficiencies in performance.

    Advances in mechanistic toxicology, and in identifying the factors that affect toxicokinetics and toxicodynamics, have provided opportunities to exploit molecular soft spots in order to design safer chemicals.

    There are potential challenges to designing safer chemicals. Comprehensive hazard evaluation is an extremely complex undertaking reflecting the inherent complexity of biological systems. Some of these challenges include:

    1. The lack of specificity and selectivity of highly reactive chemicals. Such chemicals present unique challenges to green toxicologists because their potential interactions with biological targets are unpredictable: they react quickly and indiscriminately, leading them to be called promiscuous molecules.

    2. Structural diversity is vast, making the prediction of toxicity more challenging, especially when it is based on two-dimensional inspection of a structure.

    3. The fate of a chemical entity depends on numerous processes operating at the molecular level, which complicates prediction of where, and in what form, it will reach biological targets.

    References

    1. Klaassen, C.D. (2008) Casarett and Doull's Toxicology: The Basic Science of Poisons, 7th edn (ed. C.D. Klaassen), McGraw-Hill, New York.

    2. Hayes, A.W. (1994) Principles and Methods of Toxicology, 3rd edn (ed. A.W. Hayes), Raven Press, New York.

    3. Rand, G. (1996) Fundamentals of Aquatic Toxicology: Effects, Environmental Fate and Risk Assessment, 2nd edn, Taylor and Francis, Washington, DC.

    4. Hodgson, E.A. (2010) A Textbook of Modern Toxicology, 4th edn, John Wiley & Sons, Ltd, Hoboken, NJ.

    5. Eaton, D.L. and Gilbert, S.G. (2008) Principles of toxicology, in Casarett and Doull's Toxicology: The Basic Science of Poisons, 7th edn (ed. C.D. Klaassen), McGraw-Hill, New York, pp. 11–43.

    6. Borzelleca, J.F.
