Handbook of Mineral Elements in Food

Ebook, 2,262 pages
About this ebook

Mineral elements are found in foods and drink of all different types, from drinking water through to mothers’ milk. The search for mineral elements has shown that many trace and ultratrace-level elements present in food are required for a healthy life. By identifying and analysing these elements, it is possible to evaluate them for their specific health-giving properties, and conversely, to isolate their less desirable properties with a view to reducing or removing them altogether from some foods. The analysis of mineral elements requires a number of different techniques – some methods may be suitable for one food type yet completely unsuited to another.

The Handbook of Mineral Elements in Food is the first book to bring together the analytical techniques, the regulatory and legislative framework, and the widest possible range of food types into one comprehensive handbook for food scientists and technologists. Much of the book is based on the authors’ own data, most of which is previously unpublished, making the Handbook of Mineral Elements in Food a vital and up-to-the-minute reference for food scientists in industry and academia alike. Analytical chemists, nutritionists and food policy makers will also find it an invaluable resource.

Showcasing contributions from international researchers, and constituting a major resource for our future understanding of the topic, the Handbook of Mineral Elements in Food is an essential reference and should be found wherever food science and technology are researched and taught.

Language: English
Publisher: Wiley
Release date: Apr 20, 2015
ISBN: 9781118654330
Author

Miguel de la Guardia

Prof. Dr. Miguel de la Guardia has been Full Professor at the University of Valencia (Department of Analytical Chemistry) since 1991. He has published more than 550 papers in journals of the Science Citation Index, with 8747 citations, holds 5 Spanish patents, and has written 3 books on green analytical chemistry (Elsevier, RSC and Wiley) and 2 books on food analysis (Elsevier and Wiley), in addition to more than 15 book chapters. His h-index is 39. He has supervised 33 PhD theses and is a member of the editorial boards of TrEAC Trends in Environmental Analytical Chemistry (The Netherlands), Bioimpacts (Iran), Spectroscopy Letters (USA), Current Green Chemistry (United Arab Emirates), Ciencia (Venezuela), J. Braz. Chem. Soc. (Brazil), Journal of Analytical Methods in Chemistry, Chemical Speciation & Bioavailability (UK), SOP Transactions on Nano-technology (USA) and SOP Transactions on Analytical Chemistry (USA). He was a member of the advisory board of Analytica Chimica Acta (The Netherlands) between 1995 and 2000, and has edited five special issues of Spectroscopy Letters (USA), on Quantitative Vibrational Spectrometry (2005), Spectrometry and Automation (2006), Research on Spectroscopy in Morocco (2007), the RISO Conference (2008) and Green Spectroscopy (2009), as well as a special issue of TrAC on Green Analytical Chemistry (2011). He was invited editor of a special issue on green analytical methods of Analytical and Bioanalytical Chemistry published in 2012, and co-editor, with S. Garrigues, of a special issue on analytical diagnostics for Analytical Methods (RSC) to be published in 2014. He was decorated Chevalier dans l’Ordre des Palmes Académiques by the Minister Council of France.


    Handbook of Mineral Elements in Food - Miguel de la Guardia

    Preface

    Publishing another book on the mineral composition of foods might seem a redundant task, given the many excellent publications in the scientific literature on both the nutritional aspects of essential and toxic metals in foods and the analytical approaches for determining and evaluating the concentration of mineral elements in foods.

    However, when we proposed this project to Wiley-Blackwell our conception was to assemble all the different aspects of minerals and foods, inviting a series of international specialists in different fields to cover the nutritional and chemical perspectives together with the legal international framework and the specific aspects of different elements and different classes of foods.

    Thus this book is not just an update of the data available in the international journals and databases regarding the mineral concentrations of foods produced around the world. It is intended to be a modern reference handbook, with the clear intention to address all aspects of the mineral elements present in the products utilized in human nutrition, from mother’s milk to baby food, from water to vegetables, dairy products, meat, fish and sea products, eggs, cereals and pulses, fruits and drinks, and from the analytical tools available to determine the mineral profile of foods to the evaluation of the nutritional aspects of such mineral composition.

    Our intention to move from a classical consideration, usually based on exposition element by element, to one focused on the different types of foods can be justified by an appreciation of the contemporary analytical tools available for mineral determination. These have evolved from those based on single-element determination to ones that provide detailed information on multiple elements present in samples at minor, trace and ultratrace levels. Thus modern analytical methodologies for mineral analysis have permitted complete evaluation of the mineral profile of foods, while also allowing the proportion of each element and the content of a specific target element to be determined.

    In addition to introducing a change in the presentation of the available data in different fields, we have also considered the nutritional and legal framework within which the reported data can be interpreted and have also added a section on the most useful analytical techniques suitable for evaluating the mineral profile of foods, incorporating the now well-known methods that have emerged in recent years.

    In short, we present this book to the reader in the hope of providing a complete picture of the mineral profile of food, so that nutritionists as well as developers and users of analytical methodology may find inspiration in addressing the main problems related to the presence of minerals in foods and their determination.

    Finally, we would like to acknowledge the generous participation of all the authors and their outstanding contributions, together with the continuous support of the professionals at Wiley-Blackwell, who helped to improve our initial proposal and ensured the book’s smooth production during this last year.

    Miguel de la Guardia and Salvador Garrigues

    Valencia

    CHAPTER 1

    The importance of minerals in the human diet

    Késia Diego Quintaes¹ & Rosa Wanda Diez-Garcia²

    ¹ Federal University of Ouro Preto, Nutrition School, Department of Clinical and Social Nutrition, Ouro Preto, MG, Brazil

    ² University of São Paulo, Ribeirão Preto Medical School, Ribeirão Preto, SP, Brazil

    Abstract:

    Mineral nutrients are indispensable to the maintenance of life. A mineral element is considered essential when deficient ingestion results in harm or suboptimal function, and if supplementation with physiological levels of this specific element prevents or repairs this damage. Human nutritional requirements demand at least 23 mineral elements, and there are various methods available to establish the nutritional status of minerals. The required daily quantities of mineral nutrients are small, particularly when compared with nutrients such as carbohydrates and lipids. The minimum quantities required and the maximum contents that produce adverse effects can vary widely between different mineral nutrients. Chosen food regimens are related to geographical availability and the corresponding biodiversity. Biological adaptive processes, constrained by the regional diversity of diets, have over time established existing nutritional requirements. Nutritional recommendations define mineral consumption values that are not easily achieved with the contemporary Western diet. Scientific evidence suggests that nutrient supplements cannot replace a healthy diet, with the consumption of a wide variety of nutritious foods being the best way to maintain health and prevent chronic disease. There is scope for significant additional study of the role of minerals in the human diet, and their impact on human health.

    Keywords: mineral supplements; bioavailability; nutritional recommendations; human diet; food culture; micronutrients

    1.1 Historical aspects

    Mineral nutrients are essential for the proper functioning of every organism on earth. The interactions between mineral elements in biological systems and their role in mediating the chemical and biological reactions fundamental to life are still being discovered. Archaeological evidence of the feeding habits adopted by our human predecessors has been discovered by fossil studies from different periods and sites. Determination of the minerals in mineralized prehistoric human remains reveals the dietary conditions and food habits, and also the environmental and living conditions, of the population. Understanding the feeding habits adopted by our ancestors helps to elucidate the evolution of the species [1–3].

    The availability and distribution of foods and their preparation indicate the preferences of the population and provide knowledge of the social organization practised [1]. The main tool used to unravel the past is archaeological chemistry (archaeochemistry), which has been useful in discovering the practices and lifestyles of past human populations, including their feeding habits [3]. Human remains studied using archaeochemistry, be it via the recovery of tools or paintings or analysis of the concentration of the chemical elements and their isotopic forms, help us understand the role of the biocultural system resulting from the interaction between humans and their environment [4]. The definition of marker elements is essential for archaeochemistry to contribute to the reconstruction of human history, and of the respective feeding practices adopted.

    Bones and teeth are body structures considered indicators of the exposure of humans to minerals [5, 6]. Although bones present numerous analytical difficulties with regard to the separation and characterization of the constituent minerals, they are the target of much research to determine the form of feeding practised by our ancestors. It is known that the geochemistry of modern vertebrate bones is directly related to the consumption of food and water: the composition of bone represents the principal food items ingested over the 10–30 years preceding death, varying as a function of the rate of bone renewal of the anatomical part under study. Because of its minimum annual renewal rate of 10%, bone is considered a monitor of some trace elements throughout life. However, because of its morphology, bone exhibits a marked turnover of cortical tissue (32%) and of trabecular tissue (4.3%) and thus shows greater susceptibility to post-mortem changes [2, 6].

    In addition, the form of the mineral incorporated into the bones can be distinct for the various elements. Reconstruction of the diet consumed based on the mineral composition of bones requires that the diagenetic process has not altered bone composition over time. In fossils, the concentrations of the elements Fe, Mn and Cu can be increased due to diagenesis of the soil in which the individuals were buried [2].

    The bone concentration of trace elements such as Sr and Ba has been used to discriminate between herbivorous and omnivorous food patterns, where a low Sr level indicates low consumption of foods of animal origin, especially fish [2, 5, 7]. It is worth pointing out that the Sr and F contents suffer progressive post-mortem alterations, and can be enriched due to diffusion–adsorption processes [3], whereas variations in the Sr content of bones of the same individual suggests seasonal or geographical changes in the diet [8].

    It is known that the majority of Mg in the human organism comes from foods of animal origin [7]. One of the functions of Mg in the human is its involvement in cell metabolism, and insufficient amounts appear to affect the senescence process negatively [9]. Because of the high level of Zn in blood and meat, a diet based on meat could induce a higher level of this element. However, high levels of Zn can also be found in certain legumes and vegetables and can lead to erroneous interpretation of the diet [2].

    The choice of a food regime or types of food is related to the foods available locally and to the biodiversity encountered, and both factors contribute to the nutritive mineral values of the diets [10]. In medieval culture, foods shared as presents were an integral part of the diet. Small items such as fruits, offered by anyone, characterized the attributes of commensality, sociability, hospitality and charity, and were also important factors in developing networks of human relationships. The form of donation demonstrated the singularity and adequacy of some food categories and made the intention of the donors obvious [11].

    In addition to the diet practised with selected or donated foods, age and geographical location can lead to variations in the concentration of certain elements in the bones, not only of those classified as nutrients but also of non-nutrient minerals [6]. The levels of the latter in human bones in distinct eras can provide indications of the relative importance of a particular source of contamination. The levels of heavy metals in food, water and air depend predominantly on nature and the local geochemical conditions, while synthetic materials in the immediate environment (e.g. home or workplace) or further afield constitute additional sources. Useful information concerning the relative importance of these sources can be obtained by comparing the levels of contaminating metals in human bones with those in animals not exposed to the same sources of contamination. Additional information can be provided by comparing the temporal alterations in the levels of metal contaminants in human and animal bones with variations in the availability and use of particular elements, and also in the rate of anthropogenic dispersion in the environment [7].

    With the objective of reconstructing the diet of our human ancestors, researchers have analysed the variability in the trace element contents of pre-Columbian human bones found in Chupicuaro, Mexico. A comparison of the mineral contents extracted from the skeletons with the mineral content of hydrothermal waters and carbonates available in the Chupicuaro environment provides evidence that Chupicuaro humans very probably consumed these waters, although it is not possible to know if the ‘hydrothermal diet’ was consumed directly or indirectly or if this diet was the result of cultural habits or the availability of environmental resources [3].

    The concentrations of Pb, Cu, Zn, Cd and Fe have been determined in the residual bones of 30 individuals buried in the region of Cartagena (Spain), dated from different historical periods, and also in eight individuals who died in the contemporary era. The evolution of the concentrations of the metals over time indicates an increase between the Neolithic period and the period covering the Bronze Age, Roman domination and the Byzantine era. The maximum content for Cu was in the Byzantine era, whereas for Fe it was the Islamic era. Zinc showed a tendency to increase throughout the periods studied, whereas Cd was the only metal showing a consistent tendency to decrease. Of the elements analysed, the lowest values found corresponded with the Neolithic period, showing similar values to the contemporary bone samples with respect to Pb, Cu, Cd and Fe [12].

    Another study involved the analysis of Pb by histological light microscopy and synchrotron radiation X-ray fluorescence (SR-XRF) in the bones of individuals who had documented health problems as a function of exposure to this contaminating element between 1793 and 1822 in Antigua (Antilles). This study indicated agreement between the results obtained by the two techniques [8].

    From a historical perspective, dietary intakes of minerals such as Fe, Zn, Cu and Mg have fallen in recent years as a function of both reduced energy requirements due to the sedentary lifestyle adopted by most people and changes in dietary patterns combined with lower mineral element densities in the diet. Compositional data indicate a downward trend in the mineral content of foods which has been attributed, at least in part, to intensive farming practices that result in depletion of the mineral content of the soil. In the UK, research has demonstrated that the concentrations of Fe, Zn, Cu and Mg in wheat grain remained stable between 1845 and the mid 1960s. However, after this period, and coinciding with the introduction of semi-dwarf high-yielding cultivars, the levels of these minerals decreased significantly, whereas concentrations in the soil either increased or remained stable. Similarly, decreasing trends in mineral concentrations in wheat grain have also been seen whether the crops received no fertilizer, inorganic fertilizers or organic manure [13].

    From the eighteenth century, advances in determining the chemical composition of the fluids and tissues of animal organisms made it possible to recognize the importance of mineral elements in living organisms. By the nineteenth century, calcium phosphate was known to be the principal constituent of bone, which also includes Ba and Mg but in much smaller proportions. Iron was also recognized to be part of haemoglobin and haem, a constituent of blood, which also contained P, Na, Cu, Mg, Pb and F. The question derived from this knowledge was whether the diet should provide these inorganic elements [14].

    At the end of the eighteenth century, the idea that vegetables were capable of producing inorganic substances was common. An experiment carried out by L.N. Vauquelin with chickens in 1799 aimed to verify whether the animal organism was also capable of generating inorganic material. He showed that the ingress of mineral substances such as calcium phosphate, calcium carbonate and silica via the feed was smaller than the egress via excrement and eggs. It was concluded that there had been a transmutation of the elements by the chickens. These results were accepted for several years. At the end of the 1830s it was shown experimentally that vegetables must obtain their inorganic elements from the soil. A study carried out by Dumas and Prévost with eggs demonstrated conclusively that an embryo did not create the minerals present in its body. Similarly, in 1842 C.J.E. Chossat showed that the supply of Ca via the diet was important for maintaining bone composition, and in a publication in 1844 Boussingault considered the nutritional need for minerals to be evident and also noted the importance of knowing the amounts obtained from food [14].

    1.2 Types and metabolic function of mineral nutrients

    Human nutritional requirements demand at least 49 nutrients to meet organic metabolic needs. Of these, 23 mineral elements are involved in physiological and biochemical activities [15, 16]. The daily quantities of mineral nutrients are, by nature, small, especially when compared with nutrients such as carbohydrates, proteins and lipids. Since minerals are indispensable to the functioning of the organism, they must be regularly present in the diet. The macrominerals are those present in greater proportions in the body tissues, and hence required in greater amounts in the diet. Microminerals are equally essential to the human diet, although required in smaller amounts. A third group also exists, that of the essential trace elements or oligoelements, considered thus when the daily requirement is very small.

    Mertz [17] defined as essential trace elements those elements with daily requirements below 18 mg. However, elements such as Zn, Cu, Mo, Fe and Se, which occur in relatively low concentrations in plant and animal tissues (<100 ppm) and in rocks, are also classified as trace elements. Altogether, the trace elements make up less than 1% of the total composition of elements on the Earth and are essential for the correct functioning of many plant, animal and human biological systems [18].

    Verdú and Marín [19] used the term ‘macrominerals’ for elements with a daily requirement equal to or above 100 mg, and the term ‘microminerals’ for those with daily requirements generally below 100 mg, which includes the 11 essential trace elements within this group. Of these 11 elements (V, Cr, Mn, Fe, Co, Cu, Zn, Mo, Se, F and I), eight are in period 4 of the periodic table, and the positive relationship between nucleus size and electron availability of the elements in this period allows them to interact with organic molecules present in biological systems. The remaining trace elements (Se, F and I) are non-metals [16]. Table 1.1 exhibits the principal functions and biochemical modes of action of the macrominerals and microminerals, although not all have defined requirements or even functions.

    Table 1.1 Metabolic functions and biochemical modes of action of macrominerals and microminerals.

    It is important to highlight the fact that each mineral has a unique function, regardless of its categorization as either macromineral or micromineral. The microminerals, which are essential components of biological structures, can be toxic in amounts only slightly above the amounts required for their physiological action. In addition, this principle of toxicity can be extended to other elements which are not considered essential nutrients: by possessing similar atomic characteristics, they can imitate the reactivity of a micromineral [16].

    The discovery of these elements was due to deficient states observed in different regions of the world. The deficiencies can be reversed with the use of supplements, primarily by consumption of specific foods which naturally contain the element, and subsequently by consumption of the isolated chemical component, since this can be obtained and purified as a result of advances in analytical methods [20]. Although the existence of nutritional requirements for other microminerals (e.g. B, Mo) has not been confirmed (the case of fluorine being partly a question of semantics), the evidence for nutritional requirements of some of the ‘essential’ microminerals is tenuous or limited [21]. It is possible that in the not too distant future, B and V may be included in the list of essential microminerals.

    The biochemical investigations carried out with minerals in recent decades have led to a relatively good understanding of their mode of action, justifying their essential presence in the human diet. Initial studies identified a role for mineral nutrients in the activity of practically all enzymes, whether at the active site of the enzyme or as cofactors, it being apparent that adequate amounts are central to the ability to deal with the derived metabolic response. Other functions have been established for these nutrients in recent years, for example the capacity to modulate genetic transcription and the antioxidant capacity of metalloenzymes such as glutathione peroxidase (which contains Se) and superoxide dismutase (which contains Zn and Cu), with both catalysing reactions responsible for the removal of oxidizing species [20].

    Of the macrominerals, Na, K and Cl are the principal electrolytes of the intracellular and extracellular fluids. They are responsible for creating the electrochemical environment around almost all cells, across which metabolites and gases pass from one side of the membrane to the other. The electrolyte gradient across the cell membrane is a prerequisite for cell excitability, signal transport, cell transport and movement processes. The macromineral daily requirements may need adjustment according to the clinical situation. For example, the requirement could increase when there are excessive gastrointestinal losses or even decrease in cases of kidney failure [22].

    1.3 Essentiality and toxicological aspects

    An organism has the capacity to recognize a mineral nutrient and deal with the dichotomy between its essentiality and its toxicity by regulating the absorption and excretion of these essential nutrients and also by resorting to storage systems as a way of regulating the maintenance of levels compatible with life, while at the same time minimizing the likelihood that these mineral elements take part in toxic reactions. In this process, proteins are fundamental for the recognition and transport of mineral nutrients [16].

    One definition of nutrient essentiality considers as essential an element that is indispensible to the maintenance of life [17]. However, even if a mineral is classified as essential for the functioning of the organism, macrominerals and microminerals can display toxicity in concentrations above those required for their biological functions. Such toxicity can also be displayed in elements not considered as nutrients but which mimic the functions of those that are due to the similarity of their chemical characteristics [16].

    A mineral element is considered essential when deficient ingestion results in harm or suboptimal function and if supplementation with physiological levels of this specific element prevents or repairs this damage. The absence of an essential mineral can lead to death, particularly with those elements required in very low concentrations. Essentiality is considered to be present when demonstrated by more than one independent investigator in more than one animal model, it being easier to control exposure to macrominerals than to elements required in smaller concentrations. The criterion for essentiality is not related to the degree of debility resulting from nutritional deficit [17].

    The animal model is among the methods employed to estimate nutritional requirements, and these have changed over time. Four approaches are among the principal methods used currently: the clinical approach, nutrient balance, functional indicators of nutritional sufficiency (biochemical, physiological, molecular), and optimal nutrient intake. The dietary reference intakes (DRIs) for minerals are a set of recommendations for apparently healthy individuals (referring to the absence of disease based on clinical signs and symptoms and function) and are determined experimentally under controlled conditions, usually with regular or light physical activity [23–25].

    The requirement is understood to be an intake level that will meet specified adequacy criteria, preventing any risk of deficit or excess, and encompassing a gradient of biological effects related to the intake of the nutrient in question. The dose–response is assumed to have a Gaussian distribution unless known to be otherwise, and a risk function (a probability between 0 and 1) of deficiency or excess can be derived [26]. Risk assessment was introduced for essential trace elements about 20 years ago, establishing that every essential trace element must have a range of intakes that is safe from toxicity yet adequate to meet nutritional requirements [27]. The lower and upper limits of this range are delineated as a function of the nutritional and toxicological data respectively, so that intakes within it carry a low risk of both deficiency and excess [27].
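The risk-function idea described above can be sketched numerically. If, as assumed in the absence of other information, both the individual requirement and the individual toxicity threshold follow Gaussian distributions across the population, then the risk of deficiency at a given intake is the fraction of individuals whose requirement exceeds that intake, and the risk of excess is the fraction whose toxicity threshold falls below it. The distribution parameters below are invented for illustration and are not real DRI values.

```python
from statistics import NormalDist

# Hypothetical population distributions (units could be, e.g., µg/day).
# These numbers are assumptions for illustration only.
requirement = NormalDist(mu=55.0, sigma=8.0)   # individual requirements
toxicity = NormalDist(mu=350.0, sigma=40.0)    # individual toxicity thresholds

def risk_of_deficiency(intake: float) -> float:
    """Probability (0-1) that this intake fails to meet an individual's requirement."""
    return 1.0 - requirement.cdf(intake)

def risk_of_excess(intake: float) -> float:
    """Probability (0-1) that this intake exceeds an individual's toxicity threshold."""
    return toxicity.cdf(intake)

# The acceptable range of intakes is where both risks are simultaneously low:
for intake in (40.0, 75.0, 200.0, 400.0):
    print(intake, round(risk_of_deficiency(intake), 3), round(risk_of_excess(intake), 3))
```

With these assumed distributions, an intake of 75 carries a deficiency risk below 1% and a negligible excess risk, whereas 40 is clearly deficient and 400 clearly excessive, which is the two-sided trade-off the risk-assessment framework formalizes.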

    The recommended value represents the recommended dietary allowance (RDA) or adequate intake (AI), whereas the tolerable upper intake level (UL) represents the limiting value above which the risk of toxicity is present. To establish a safe UL, the presence of a no-observed-adverse-effect level (NOAEL) in the intake response is assumed [24, 25]. Table 1.2 presents the DRIs of the nutrients essential for human life.

    Table 1.2 Nutrients essential to human life, showing dietary reference intakes (DRIs) for macronutrients and micronutrients.

    * AMDR, acceptable macronutrient distribution ranges.

    † EAR, estimated average requirement.

    ‡ AI, adequate intake.

    The minimum and maximum quantities necessary and compatible with the normal functions of the organism determine the RDA for safe daily intake. The AI range is situated in values that imply a low incidence of adverse health effects, avoiding both health risks resulting from inadequate ingestion and the risks of excessive ingestion (toxicity). There is an AI range for each element considered essential, although there can be great variation between the ranges for distinct elements [28].
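As a worked illustration of how such recommendations are commonly derived: when the requirement distribution is assumed Gaussian, the RDA is conventionally set two standard deviations above the EAR (so RDA ≈ 1.2 × EAR for a 10% coefficient of variation), and a UL can be obtained by dividing a NOAEL by an uncertainty factor. The function names and all numeric values below are illustrative assumptions, not values from this chapter.

```python
def rda_from_ear(ear: float, cv: float = 0.10) -> float:
    """RDA covering ~97.5% of individuals, assuming a Gaussian requirement
    distribution: RDA = EAR + 2*SD, where SD = CV * EAR (CV assumed 10%)."""
    return ear * (1.0 + 2.0 * cv)

def ul_from_noael(noael: float, uncertainty_factor: float) -> float:
    """Tolerable upper intake level as NOAEL divided by an uncertainty
    factor chosen by the risk assessor (a simplification of practice)."""
    return noael / uncertainty_factor

# Example with an assumed EAR of 100 (arbitrary units) and default 10% CV:
print(round(rda_from_ear(100.0), 6))        # prints 120.0
# Example with an assumed NOAEL of 3000 and uncertainty factor of 2:
print(ul_from_noael(3000.0, 2.0))           # prints 1500.0
```

The gap between the RDA and the UL computed this way is one concrete expression of the safety margin discussed above; when it is narrow, recommendations become correspondingly harder to set.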

    For some essential nutrients, the interval between the maximum and minimum limits may be very small, some even showing overlapping values, which makes it difficult to define the safety margins for nutritional recommendations. For Se, a new modelling approach has been developed that relates intake to the risks of either deficiency or excess, using an observed incidence for each effect and population distribution characteristics. Using this model it was possible to formulate advice for risk managers on the incidence of adverse effects due to either deficiency or excess at different intake levels [28]. Nevertheless, other elements such as Fe show wide variation in their bioavailability, and their biomarkers, such as ferritin, can be affected by infectious and/or inflammatory processes, confounding the relationship between intake and nutritional status [29].

    1.4 Diagnosis of mineral status

    Optimal mineral status is essential for maintaining health and ensuring optimal body function. Since there are many minerals essential to metabolism, there are many ways to determine the nutritional status of the minerals [21]. The nutritional status can be understood as the body condition that results from the process of using the nutrients contained in foods, leading to an equilibrium between supply and assimilation of the nutrients, and the nutrient consumption of the organism [30], which is influenced by the sex and age of the individual. The nutritional mineral status can be diagnosed from diverse biomarkers which indicate a status that can be modified intentionally (e.g. dietary modifications) or unintentionally (e.g. presence of pathology) (Figure 1.1).

    c1-fig-0001

    Figure 1.1 Methods for diagnosing individual mineral status.

    For certain mineral elements, the concentration in human tissue is one of the current ways of determining nutritional status. The blood plasma can be collected and analysed for specific elements (e.g. Se, Mn) in clinical and population studies. However, the plasma concentration may show inadequate sensitivity due to the absence of reference values for many minerals, or even as a function of homeostatic adjustment mechanisms that maintain levels stable even where there is deficient intake (e.g. Ca, Zn). For Cu, Zn and Fe, a lack of specificity can also compromise the value determined in the plasma. For others, such as Mn, the plasma concentration does not represent adequate evaluation of the nutritional status of these elements [21, 31]. It was recently shown that dietary supplementation with Co produces an alteration in the blood level of this element and can be used to predict exposure to or intake of this metal [32].

    The cellular components of the blood can also be used as indicators of mineral nutritional status, although this type of analysis is more usual in research studies. The method requires sample preparation and has limited application for a variety of elements, so its use is restricted. A study analysing the nutritional status of Mn in women undergoing menopause showed, after adjustment for various covariates, that both plasma ferritin and the menopause can be used to predict the Mn content of the blood, and that hormonal state (before and after menopause) acts as a modifier of blood Mn concentration [32].

    Use of the enterocyte membrane to determine the nutritional status of Zn and Co could be feasible in experimental nutrition, since this membrane contains these elements and their content is associated with the amount of mineral administered [33]. The mineral contents of hair and nails represent interesting alternatives for determining the prior nutritional status of certain minerals. However, improper commercial use of this method has reduced the credibility of these tissues as indicators, and it is frequently difficult to infer a mineral nutritional deficiency from mere observation of alterations in the hair and nails. A prolonged mineral deficit can lead to anaemia or severe hypocalcaemia, caused by organic Fe or Ca deficits respectively, both of which affect nail structure. It should be noted that Ca does not contribute to nail hardness and represents only 0.2% of nail plate weight, a region that also contains other minerals such as Mg, Fe, Zn, Na and Cu [34].

    The determination of mineral homeostasis and metabolism is another way to estimate body mineral status. Mineral homeostasis can be evaluated using daily values for mineral intake and excretion as the parameter: the renal system is of fundamental importance in maintaining the homeostasis of the majority of minerals, while the intestine is the primary site of homeostasis for Ca. In pathological situations, alterations in excretion can occur, disturbing the homeostasis of a mineral or group of minerals. Urinary excretion is only a useful biomarker for minerals in which the renal system has an active role, such as I and Co [21].

    In cases where endogenous mineral excretion is regulated by the gastrointestinal tract, including its accessory organs the liver and pancreas, the quantification of minerals in the faeces is, in theory, a biomarker of dietary intake and nutritional status, Zn being the main target of this type of investigation [35]. No less important is the fact that the mineral absorption process is, to a great extent, regulated by the amount present in the digestive tract, the relationship being inversely proportional. Considering that endogenous mineral may be excreted concomitantly with unabsorbed dietary mineral, labelling (tracer) techniques combined with metabolic collection help in estimating homeostasis. Although labelling techniques are useful in determining the organic turnover of minerals such as Ca, I and Zn, their applicability is limited outside clinical trials or restricted samples [21].
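    The balance logic described above can be sketched numerically. Apparent balance compares intake with total excretion, while tracer studies allow endogenous faecal losses to be separated from unabsorbed dietary mineral. The function names and the Zn figures below are invented for illustration only.

```python
def apparent_balance(intake, faecal, urinary):
    """Apparent daily mineral balance (mg/day); positive = net retention."""
    return intake - (faecal + urinary)

def true_absorption(intake, faecal, endogenous_faecal):
    """Correct apparent absorption for endogenous mineral secreted into
    the gut (separable from unabsorbed dietary mineral only with tracers)."""
    apparent_absorption = intake - faecal
    return apparent_absorption + endogenous_faecal

# Invented Zn figures (mg/day), for illustration only
intake, faecal, urinary, endogenous = 10.0, 7.0, 0.5, 1.5
print("apparent balance:", apparent_balance(intake, faecal, urinary))   # 2.5
print("true absorption:", true_absorption(intake, faecal, endogenous))  # 4.5
```

    The gap between the two numbers shows why faecal quantification alone underestimates absorption: without a tracer, the 1.5 mg/day of endogenous Zn in the faeces is indistinguishable from unabsorbed dietary Zn.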

    Advances in molecular biology have been promising in the area of mineral biomarkers. A recent study indicated that not only was it possible to determine the nutritional status of iron via transferrin receptors but that this was also useful in differentiating types of anaemia, whether due to iron deficiency or inflammation [36]. With respect to toxicological questions, osteopontin is considered a biomarker for exposure to U, suggesting kidney damage when its content is reduced in the urine [37].

    The response to an increased intake can be used as a biomarker for some essential minerals, although it is insufficient in many cases (e.g. Ca, Na). Evaluation of the physiological response requires a current, detailed determination of the nutritional status of the mineral in question, for example Zn and Co: a pre-existing dietary deficiency or nutritional status deficit must be confirmed before a randomized intervention with the mineral under investigation can be carried out [21, 32]. On the positive side, the usefulness of analysing the response to an increased mineral intake lies in the fact that it combines mineral deficit data with non-specific disturbances of normal physiology and with non-specific morbidity [21].

    Functional indices are of particular value because they indicate when intake and nutritional status are so severely compromised as to cause measurable changes in the normal biochemical and physiological processes of the organism. The specificity of these alterations should be exclusive to the deficient element in question. Thyroid diseases due to iodine deficiency are an example: they have been recognized for decades and are effectively prevented and treated [38].

    Body mineral stores can be useful in determining nutritional status. Alterations in a store can reflect an increase in intake or bioavailability, or even a reduction in excretion. This is classically observed for Fe, whose organic values are rigidly controlled by the content of ferritin, the storage protein of this element, since Fe is an oxidizing element. Another element whose storage level can be measured is Ca, by quantifying the amount contained in the bones. More recently, such an evaluation has been studied for Zn, an element that occurs in hundreds of enzymes and thousands of protein domains. It was found that Zn possesses a complex homeostatic protein system that regulates the amount of cellular Zn, coordinating the export, import and sensing of the organic status of the metal [39].

    1.5 Food culture and mineral diet content

    Nutritional recommendations define mineral consumption values that are not easily achieved with the contemporary Western diet [40, 41]. If current nutritional needs are not easily met even today, with food diversity, improved access to food, improved processing and supplementation, access to foods from various geographical regions, improved agricultural technologies and the availability of foods with high nutrient availability, one might wonder how our ancestors survived. In the past, diets were more susceptible to food shortages, seasonal variations in food supply, the geographical restriction of human groups, unpredictable access to food, and other factors that denote the precarious food conditions of earlier times. These factors raise the interesting question of what the ideal characteristics of our ancestors' diet were [42]. The diversity of diets spread across the continents over different historical periods indicates that biological adaptive processes were fundamental to meeting nutritional needs.

    Milton [43] suggests that studies on human food practices should use the reconstituted diets of our ancestors, arguing that changes in diet are comparatively recent in relation to the human evolutionary process. By contrast, forms of cooking, the domestication of grains, ultra-processed foods, and the increased consumption of sugar and saturated fat are recent events. The human gut, for example, derived from ancestral lines associated with a given range of possible food sources. For a better understanding of nutritional requirements and of the intake patterns of nutrients and other active components present in the diet, Milton investigated the diet of primates and the composition of wild foods in comparison with that of cultivated foods, comparing the mineral content of vegetarian food sources eaten by groups of non-human primates. The concentration of minerals was higher in wild specimens, and it was concluded that the mixture of chemical constituents in the plant, consumed simultaneously, seems to explain the advantages of this type of diet compared with the Western diet [43].

    Based on the same premise, some studies have shown that the diet of Palaeolithic humans, comprising fruits, roots, meat and seafood, was dense in micronutrients, especially I, Fe, Zn, Cu and Se, although Ca and vitamin D were scarce [44, 45]. Another study found a decrease in the concentration of Mg in wheat, with a decline in the range of 15–23% since 1850, coinciding with the increase in cardiovascular disease and mortality [46]. These findings lead us to believe that a higher concentration of micronutrients in the diet of our ancestors ensured the conditions for their survival, and that the modification of these conditions may be associated with current diseases.

    However, malnutrition and food shortages are factors that have contributed to the evolution and health of humankind. The diversity of human groups spread across different parts of the world, with differing climates and geographies, has resulted in varied eating patterns, and possibly in adaptive mechanisms suited to these factors.

    Lazenby [47] suggests that the accelerated bone loss among the Inuit and Inupiat populations of the Arctic regions is due to the higher production and use of thyroid hormones (T3 and T4) as a mechanism for adaptation to cold rather than as a result of a high-protein diet, also called the ‘acid-ash’ diet. Studies with different ethnicities illustrate nutritional differences and adaptive mechanisms that add to the complexity of establishing nutritional recommendations for minerals [47].

    1.6 Health consequences of human mineral malnutrition or excessive intake

    Two distinct aspects need to be considered in order to determine the optimum interval of values for the RDA of a given mineral nutrient: one refers to the minimum content necessary for the human organism to maintain its normal functions, and the other to the maximum content considered compatible with normal functioning (risk). The interval between the upper and lower limits may be highly restricted, and can even show overlapping values, principally because the upper limit is a derived value, with different uncertainty values being used in the toxicological evaluation and in the evaluation of the risk of nutrient deficiency [29].

    Thus the DRIs – the nutrition recommendations of the Institute of Medicine (IOM) of the National Academy of Sciences of the USA – establish for mineral nutrients a tolerable upper intake level (UL) that is safe and carries a negligible risk of toxicity, and a minimum value that guarantees a low risk of nutritional deficiency [24, 25]. The health risk arising from inadequate intake and the health risk caused by excessive intake are both considered in the range of acceptable doses for daily intake [48].

    The estimated average requirement (EAR) values, the data on which the RDA values are based, require the existence of a dose–response relationship between one or more status biomarkers (which are predictive of optimal health) and dietary intake. However, for some elements (e.g. Cu) there are no sensitive and specific biomarkers of status [29]. In part this difficulty is due to the complex network of transport routes by which mineral nutrients reach the cells. The route chosen by the organism is determined by the physiological need for each element [49], which is dynamic rather than static: processes such as pregnancy, breast-feeding or intense exercise can alter the daily requirements for minerals [23].
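    The quantitative step from EAR to RDA can be made concrete. Under the IOM convention, when requirements are assumed to be normally distributed, the RDA is set at the EAR plus twice its standard deviation; with a coefficient of variation (CV) of 10%, this reduces to RDA = 1.2 × EAR. The function name below is chosen here for illustration.

```python
def rda_from_ear(ear, cv=0.10):
    """RDA = EAR + 2 SD, with SD taken as CV x EAR (IOM convention).
    With the default 10% CV this reduces to RDA = 1.2 x EAR."""
    sd = cv * ear
    return ear + 2.0 * sd

# Example: the IOM Zn EAR for adult men is 9.4 mg/day; the derived
# RDA of ~11.3 mg/day corresponds to the published value of 11 mg/day.
print(round(rda_from_ear(9.4), 1))  # -> 11.3
```

    The convention makes explicit why a missing or unreliable biomarker undermines the whole chain: without a defensible EAR and an estimate of its variability, neither term of the formula can be fixed.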

    The AI value should be associated with a low incidence of adverse health effects due to very low intake (deficiency or absence of a health benefit) or very high intake (toxicity) [48]. To carry out a risk–benefit analysis of the mineral nutrients, information is required on the impact of both excessive and minimal intake on health, and on the dietary consumption of different population groups, together with measures of variation, including:

    determination of the risks and benefits for health on a short- and long-term basis (identifiable/measurable end points, evidence of causality, dietary reference values);

    consumption data, with respective variation between population groups and the dose–response relationship;

    information on the appropriate type of risk–benefit analysis (qualitative or quantitative), including phenotype and genotype data [29].

    Insufficient and excessive mineral nutrition represent opposite states, both deleterious to the adequate functioning of the organism. The degree of effect varies as a function of the mineral in question, the level of deficit or excess, and the requirements of the individual, which themselves vary with age, sex and physiological state. The amount of mineral stored in the organism could be relevant in the case of dietary deficiency, when the organism depends on the organic reserve to supply its daily needs. An initially low mineral store could represent a smaller toxicological risk than an excessive intake, which, if it becomes chronic, can have negative repercussions for the individual.

    It is important to point out that the organism must deal efficiently, on a daily basis, with elements that are simultaneously essential and toxic, depending on the quantity consumed and the amount stored. Given an adequate mineral store, a chronic intake deficit or an increase in metabolic demand would initially diminish the amount stored in the organism. For minerals such as Ca, the result of a chronic deficit will be observed in the long term as osteoporosis. For other elements such as Fe, deficiency will result in iron-deficiency anaemia, which usually develops after a few months of low dietary intake or increased daily requirements. An increase in metabolic demand due to sporting activities has been proposed for Fe, Ca and Na, although stronger scientific evidence is needed in the case of Ca [23].

    Mineral toxicity goes beyond the UL parameter and can be expanded to include non-essential elements with atomic characteristics similar to those of mineral nutrients, allowing them to mimic their reactivity. The use of proteins, isolated or combined with other biomolecules, that specifically recognize the target mineral enables the organism to manage this situation, capturing the element of nutritional interest and conducting it to the target tissue while preventing the mineral nutrient from taking part in undesirable, harmful reactions [16]. The absorption of the vitamin cyanocobalamin by mammals can be cited as an example, since this vitamin contains Co in its structure [50]. However, protein transporters are not always capable of separating minerals of nutritional interest from non-nutrient elements, as in the case of Fe and Al, which can compete with each other for transport by transferrin [51], a situation that can favour iron-deficiency anaemia.

    The minimum and maximum values necessary to produce adverse effects can vary widely between distinct mineral nutrients. In deficiency, it is important to note that the adverse effects observed are the result of continued low intake, and that their severity and/or incidence would be attenuated by an increase in dietary intake [48]. Conversely, adverse effects resulting from excessive consumption will be aggravated if intake increases further [48]. However, research has shown that reaching or surpassing the UL for essential mineral nutrients through a varied diet is uncommon among human population groups [52, 53], with the exception of Na, for which the Western diet exceeds the established UL, in contrast to K, whose content is below the nutritional recommendation and therefore favours high blood pressure [54]. This has been observed even in oral therapeutic diets served to hospital patients [55].

    With respect to dietary deficiencies, apart from K the most frequent involve Ca, Fe, Se, Zn and I, the latter in specific regions. Considering that minerals such as Se have an antioxidant role, and that gender and age are individual characteristics that can affect the capacity to prevent oxidative stress [56], the proposed daily intake levels should be adjusted to meet individual organic requirements.

    An understanding of the consequences to human health of a deficient or excessive mineral intake, whether of a single mineral or a group, is also necessary for food production and even for the production of nutritional supplements. Fortification of food could be an alternative, implemented with a view to meeting the nutritional demands of the target population.

    1.7 Minerals, health and ageing

    To convert organic needs into dietary requirements, adjustments must take into account diet- and host-related factors. The diet-related factors depend on the nature of the usual diet and may include the chemical form of the nutrient in question, the nature of the dietary matrix, interactions between nutrients and/or other components, and food preparation and processing practices. The most important host-related factors are gastric secretion and intestinal motility, since both change with age, ethnicity, genotype and sex [57].

    Ageing is characterized by the progressive deterioration of physiological functions, including weakening of innate and acquired immunity, even in the 'healthy' elderly. The current view is that improvements in the nutritional status of elderly individuals will enhance their immune system, which could in turn improve their nutritional status by preventing the consequences of infectious diseases, such as nutrient malabsorption, loss of nutrient and energy storage capacity, and reduced appetite [58].

    However, the reduction in food intake that occurs between the ages of 20 and 80 years can be attributed to physiological actions [59], which can partly be explained by the age-related decline in olfaction and taste capacities. Chronic diseases and/or drug use can also have a negative effect on food consumption, resulting in the elderly showing a greater risk for micronutrient deficiency than healthy adults [60].

    Micronutrient depletion leads to specific and well-known clinical symptoms in younger adults, which can be diagnosed and treated, whereas in the elderly mild micronutrient deficiencies and their consequences are difficult to assess [60]. For example, deficiencies of trace elements such as Zn, Fe and Cu can affect the organic response of the host and could confuse the diagnosis or the interpretation of the symptoms in older individuals [61].

    Using Ca as an example, it is known that daily Ca intake tends to decline with advancing age, and that intestinal absorption is reduced in older women compared with younger women as a function of oestrogen deficiency, a condition aggravated by vitamin D deficiency and favouring an increased risk of osteoporosis. Many factors, including various disease conditions and medications, can also affect Ca absorption and become more relevant with increasing age [62].

    Researchers working on the European Survey on Nutrition and the Elderly, a Concerted Action (SENECA) study examined 1005 elderly people from eight countries in relation to the adequacy of their intake of energy, some vitamins, and Ca and Fe. The prevalence of inadequate micronutrient intake was high at all energy intake levels, especially in women. Inadequate intake was most prevalent for Fe, with Portugal the country with the most individuals showing inadequate Fe intake [63]. It is important to note that the body's requirement for Fe in postmenopausal women is reduced, since there are no menstrual blood losses [64].

    The minerals Zn and Mg and some other trace elements may be present in less than optimal concentrations in the diet of older people [65]. It is known that Zn deficiency, common in the elderly, is linked to impaired immune function and an increased risk of acquiring infection, and can be rectified by zinc supplementation [66]. However, administration of Zn above the recommended UL may adversely affect immune function, so much more work is needed before recommendations can be made about Zn intake [65, 66].

    With respect to Mg, it seems likely that Mg inadequacy interferes with cell metabolism, which could accelerate the senescence of human endothelial cells and fibroblasts, negatively affecting cell division. The Western diet is relatively deficient in Mg, and correcting the intake might contribute to healthier ageing and the prevention of age-related diseases [9]. For most minerals, such as Mg, Ca, P, Zn, I, Cr, Mo and Se, body requirements do not seem to alter with age; the exception is Fe, the requirement for which is lower in older than in younger women. Dietary Na and its role in hypertension have stimulated both concern and controversy [65].

    However, the provision of additional minerals as a nutritional supplement can offer older subjects some health benefits. A study of 2 years of daily supplementation with trace elements (zinc sulfate and selenium sulfide) examined immunity and the incidence of infections in 725 institutionalized elderly patients (>65 years), and showed that the number of patients without respiratory tract infections during the study was higher in the groups that received the elements. The authors concluded that low-dose supplementation with Zn and Se provides significant improvement in elderly patients by increasing the humoral response after vaccination, and could have considerable public health importance by reducing morbidity from respiratory tract infections [67].

    A recent randomized, blind, placebo-controlled trial of 910 non-institutionalized men and women, aged 65 or over, who received a daily placebo or a multivitamin and multimineral supplement for 1 year [14 mg Fe (fumarate), 150 μg I (potassium iodide), 0.75 mg Cu (gluconate), 15 mg Zn (oxide) and 1 mg Mn (sulfate)], showed that supplementation did not alter the quality of life of those who received it, their contacts with primary care, or their days of infection. The study concluded that routine multivitamin and multimineral supplementation of older people living at home does not affect self-reported infection-related morbidity [68].

    Important differences among the study interventions include the use of institutionalized or non-institutionalized subjects, and of supplements containing minerals alone or a mixture of minerals and vitamins, factors that may explain some of the discrepancies in the findings. However, a systematic review of multivitamin and multimineral supplementation to reduce infections among the elderly showed no significant effect of micronutrient mixture supplements. The review also showed that elderly people aged 65 years or over who were undernourished at baseline could benefit from 6 months of supplementation, compared with other elderly individuals [69].

    1.8 Foods or supplements as a source of minerals

    Several epidemiological studies, observational studies and interventional trials have shown a significant inverse relationship between the intake of foods rich in mineral antioxidants, particularly Se, or intake of the mineral itself, and the risk of several age-related chronic diseases, including some types of cancer, eye and neurodegenerative diseases, and ischaemic cardiovascular diseases [57, 70]. The link between health and mineral nutrients is responsible, in part, for the increase in their use since the 1970s. The multivitamin–multimineral is the most frequently reported dietary supplement among North Americans, and its use increases with advancing age [71].

    In the USA, nutrient and dietary supplements are regulated as a food subcategory by the Food and Drug Administration (FDA), Center for Food Safety and Applied Nutrition. The Dietary Supplement Health and Education Act's safety and labelling requirements define dietary supplements as, in part, products intended to supplement the diet, containing any of the following dietary ingredients: a vitamin; a mineral; a herb or other botanical; an amino acid; a dietary substance for use by humans to supplement the diet by increasing total dietary intake; or a concentrate, metabolite, constituent, extract, or combination of any of the above ingredients. Dietary supplements are intended to be taken orally and can be presented in many forms, including pills, capsules, tablets, liquids or powders, so long as they are not presented for use as a conventional food or as the sole item of a meal or diet [72].

    In the European Union (EU), the European Commission has established rules to help ensure that food supplements are safe and properly labelled. In the EU, food supplements are regulated as foods and the legislation focuses on the vitamins and minerals used as ingredients in food supplements. The main EU legislation is Directive 2002/46/EC, which is related to food supplements containing vitamins and minerals [73]. The European Commission defined food supplements as a concentrated source of nutrients or other substances with a nutritional or physiological effect, whose purpose is to supplement the normal diet. Food supplements are marketed in ‘dose’ form, for example as pills, tablets, capsules or liquids in measured doses. Supplements may be used to correct nutritional deficiencies or maintain an adequate intake of certain nutrients [73].

    A recent analysis of data from the National Center for Health Statistics' National Health and Nutrition Examination Survey (NHANES 2003–2006) on the use of mineral supplements showed that North American individuals are taking Fe, Se, Cr, Zn and Mg supplements: about 18–19% were using Fe, Se and Cr, while 26–27% were using Zn and Mg supplements. Overall, half of the North American population, and 70% of adults aged 71 years or older, use dietary supplements, and one-third use multivitamin–multimineral dietary supplements [71].

    However, recently published randomized placebo-controlled primary prevention trials have been unable to demonstrate the same beneficial effects as those observed when exposure occurs through the ingestion of natural food mineral sources. The apparently contradictory results between the observational studies and the randomized trials could be explained by the fact that the doses used in clinical trials were much higher than the highest levels achieved by usual dietary intake [57].

    In China, a 6-year prospective intervention study including supplementation with 14 vitamins and 12 minerals or a placebo among adults with precancerous lesions of the oesophagus showed that the cumulative cancer incidence rates were nearly the same. The authors concluded that there was no substantial short-term beneficial effect on the incidence or mortality for oesophageal/gastric cancer following daily supplementation [74].

    Another randomized, double-blind, placebo-controlled primary prevention trial (SU.VI.MAX), involving a total of 13,017 French adults of both genders, tested the efficacy of a single daily dose of a nutritional supplement (120 mg ascorbic acid, 30 mg vitamin E, 6 mg β-carotene, 100 μg Se and 20 mg Zn) or a placebo on the incidence of cancer and ischaemic cardiovascular disease. The median follow-up time was 7.5 years. No significant differences were detected between the groups in total cancer incidence (4.1% for the study group vs. 4.5% for the placebo group), ischaemic cardiovascular disease incidence (2.1% vs. 2.1%) or all-cause mortality (1.2% vs. 1.5%). A sex-stratified analysis showed that low-dose antioxidant supplementation lowered total cancer incidence and all-cause mortality in men but not in women [75], while at the same time a fourfold higher melanoma risk was observed in women but not in men [76]. The authors suggested that the effectiveness of supplementation in men could be attributed to their lower baseline status of certain antioxidant nutrients [75].

    Concerning the incidence of melanoma, another population-based prospective study of 69,671 men and women who self-reported the intake of multivitamins and supplemental antioxidants, including Se and β-carotene, over 10 years detected no significant association between multivitamin use and melanoma risk in women or in men, and no increased melanoma risk was noted with the use of supplemental β-carotene or Se at doses comparable with those used in the SU.VI.MAX study [77].

    In India, a double-blind, matched-pair, cluster randomization study enrolled 214 pre-menarche girls to investigate the effect on bone mass of 1 year of supplementation with Ca, with or without a Zn-containing multivitamin, plus vitamin D. Three groups were established: a Ca group (500 mg/day Ca); a Ca + MZ group (500 mg/day Ca plus a multivitamin tablet containing 15 mg/day Zn); and a control group (multivitamin tablet with no minerals). All subjects received vitamin D supplementation. The mean percentage increase in total body bone mineral content was higher in the two Ca-supplemented groups (13.6–22.0%) than in the control group, which showed no improvement [78].

    Recently, the literature on the effect of antioxidant supplementation on mortality and health in randomized trials was re-examined using data from 66 randomized clinical trials: 36% showed a positive outcome, 60% no outcome and 4% a negative outcome [70]. One of these, a randomized controlled trial supported by epidemiological and preclinical evidence, examined whether Se, vitamin E or both could prevent prostate cancer and other diseases with little or no toxicity in relatively healthy men. The results showed that neither Se nor vitamin E, alone or in combination, at the doses and formulations used, prevented prostate cancer in this population of relatively healthy men [79].

    In the Biesalski group meta-analysis, the interventions in the studies analysed were categorized as primary prevention (risk reduction in healthy populations), secondary prevention (slow pathogenesis or prevention of recurrent events and/or cause-specific mortality) or therapeutic (treatment to improve quality of life, limit complications, and/or provide rehabilitation). The authors found positive outcomes in 8 of 20 primary prevention studies, 10 of 34
