Which of the following are devices used to measure personal exposure to radiation?

Anesthesia for Procedures in Non–Operating Room Locations

Manuel C. Pardo MD, in Basics of Anesthesia, 2018

Monitoring the Radiation Dose

Anesthesia providers, like all other health care workers who are at risk for radiation exposure, can monitor their monthly dose by wearing radiation exposure badges. The SI unit of measurement for a biologic radiation dose is the sievert (Sv): 100 rem = 1 Sv. Because some types of ionizing radiation are more injurious than others, the biologic radiation dose is the product of the type-specific radiation weighting factor (or “quality factor”) and the ionizing energy absorbed per unit mass of tissue. Radiation exposure can be monitored with one or more film badges. In the United States, the average annual dose from cosmic rays and naturally occurring radioactive materials is about 3 mSv (300 mrem). Patients undergoing a chest radiograph receive a dose of 0.04 mSv, whereas those undergoing a computed tomography (CT) scan of the head receive 2 mSv. Federal guidelines set a limit of 50 mSv for the maximum annual occupational dose.
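
To make these unit relationships concrete, the following is a minimal sketch (not from the cited text) of converting an absorbed dose to an equivalent dose in millisieverts and millirem and comparing it with the 50 mSv annual occupational limit; the ICRP-style weighting factors (1 for photons and electrons, 20 for alpha particles) and the example values are assumptions for illustration.

```python
# Minimal sketch: equivalent dose = absorbed dose x radiation weighting factor.
# ICRP-style weighting factors assumed (photons/electrons = 1, alpha = 20).
# 100 rem = 1 Sv, so 1 mSv = 100 mrem.

RADIATION_WEIGHTING_FACTOR = {"x_ray": 1.0, "gamma": 1.0, "beta": 1.0, "alpha": 20.0}
ANNUAL_OCCUPATIONAL_LIMIT_MSV = 50.0  # federal occupational limit cited above

def equivalent_dose_msv(absorbed_dose_mgy: float, radiation_type: str) -> float:
    """Equivalent dose (mSv) from absorbed dose (mGy) and weighting factor."""
    return absorbed_dose_mgy * RADIATION_WEIGHTING_FACTOR[radiation_type]

def msv_to_mrem(dose_msv: float) -> float:
    """Convert mSv to mrem (1 Sv = 100 rem)."""
    return dose_msv * 100.0

if __name__ == "__main__":
    dose = equivalent_dose_msv(2.0, "gamma")  # e.g., ~2 mGy of photons (head CT scale)
    print(f"{dose:.2f} mSv = {msv_to_mrem(dose):.0f} mrem")
    print(f"Fraction of annual occupational limit: {dose / ANNUAL_OCCUPATIONAL_LIMIT_MSV:.1%}")
```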

Radiation safety, waste management and transportation

Rachel Bidder, Neil Hartman, in Reference Module in Biomedical Sciences, 2021

Personal radiation monitoring

Personal radiation monitoring is used to monitor the external radiation dose received by the radiation worker. Instantaneous and passive monitors are available, and both have their merits. Instantaneous dose-meters give the wearer a live radiation dose rate (μSv/h) or cumulative dose (μSv). These are particularly useful during radiation incidents, where the radiation dose may be unknown, but instantaneous monitors are expensive. Conversely, passive personal dose-meters are relatively cheap and well suited to routine work in a radiopharmacy. Passive monitors are usually worn for a set period (such as a month) and then replaced so that they can be analyzed. Personal dosimetry results are returned several weeks later, which is why instantaneous dose-meters are preferable during radiation incidents.

Personal dosimetry is essential, and dose-meters must be worn appropriately by all radiopharmacy personnel. The body dose-meter is worn underneath protective clothing at chest or waist height, and it must face the correct direction. In addition to the body meter, an extremity dose-meter is worn to measure the highest radiation dose received by the extremities (usually the fingers, although an eye dose can also be measured with these). An audit is recommended to review the location of the highest extremity dose for each worker so that the dose-meter can be placed appropriately to measure the peak dose.

Strictly, personal radiation monitoring is required only for classified radiation workers. However, personal radiation dosimetry is recommended for all radiopharmacy staff so that evidence is available to support the classification status of radiation workers and subsequent dose assessments in the event of a radiation incident.

URL: https://www.sciencedirect.com/science/article/pii/B9780128229606000545

Interventional Radiology

Hani H. Abujudeh MD, MBA, FSIR, FACR, in Radiology Noninterpretive Skills: The Requisites, 2018

Monitoring Radiation Dose

Data-driven improvements in radiation safety require collecting data and analyzing them. Modern fluoroscopy units provide multiple measures of radiation use (Table 24.4). The usefulness of these measures varies. Although older units might provide only fluoroscopy time, fluoroscopy time is not a useful measure of radiation exposure because it does not take into account dose per image or the number of images per second.

One of the chief aims of monitoring radiation exposure is to avoid radiation-induced skin injuries. These are most common following complex procedures in which an intense x-ray beam repeatedly passes through the same skin entry point. Tracking peak skin dose during a procedure requires not only assessing the intensity of the beam but also mapping the skin entry point. Reference point air kerma (Ka,r) serves as a surrogate measure of beam intensity at the skin entry point, but it must be combined with patient position, table position, beam angles, and field of view during each exposure to create skin dose maps. Newer systems are capable of aggregating this information and calculating maps of estimated skin doses. A second goal of monitoring radiation exposure is monitoring overall exposure. Kerma area product (PKA) estimates the total x-ray energy deposited in the field of view. When PKA is combined with information on which organs were within the field of view and the imaging angles, it can be used to begin estimating organ doses. Although such calculations are not yet routinely available, they provide the best insight into the possibility of future carcinogenesis.
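
As a rough illustration of how Ka,r and geometric information can be aggregated into a skin dose map, the sketch below buckets exposure runs by beam angle, sums Ka,r per bucket, and flags a peak dose near the commonly cited ~2 Gy region for skin effects. This is a simplified, assumption-laden example, not a vendor algorithm; the exposure records and the coarse angle binning are hypothetical.

```python
# Minimal sketch, not a vendor algorithm: aggregate reference point air kerma
# (Ka,r) into a crude skin dose map by bucketing exposure runs by beam angle.
# The exposure records, the angle binning, and the ~2 Gy flagging threshold
# are illustrative assumptions.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Exposure:
    ka_r_mgy: float            # reference point air kerma for this run (mGy)
    primary_angle_deg: float   # beam angulation (hypothetical geometry model)
    secondary_angle_deg: float

def skin_site(exp: Exposure, bin_deg: float = 15.0) -> tuple:
    """Bucket beam angles into coarse skin entry 'sites' (ignores table motion)."""
    return (round(exp.primary_angle_deg / bin_deg), round(exp.secondary_angle_deg / bin_deg))

def skin_dose_map(exposures: list) -> dict:
    dose_map = defaultdict(float)
    for exp in exposures:
        dose_map[skin_site(exp)] += exp.ka_r_mgy
    return dict(dose_map)

if __name__ == "__main__":
    runs = [Exposure(350, 0, 0), Exposure(900, 0, 0), Exposure(400, 30, 0)]
    doses = skin_dose_map(runs)
    peak_site, peak_mgy = max(doses.items(), key=lambda kv: kv[1])
    print(f"Estimated peak skin dose {peak_mgy:.0f} mGy at site {peak_site}")
    if peak_mgy >= 2000:  # ~2 Gy is often cited as the threshold region for skin effects
        print("Peak skin dose exceeds ~2 Gy; document and plan follow-up.")
```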

Recording these dose metrics provides two benefits. First, it allows predictions at the level of individual patients, such as the probability of skin injury from the current procedure and how that risk might be compounded by subsequent procedures in the coming months. Indeed, the procedure team might use skin dose maps to plan beam position and orientation to avoid skin sites that received the highest dose in the prior procedure. The team might also use those skin dose maps to counsel patients about the delayed but observable effects of the recent procedure on their skin. Second, capturing dose metrics provides the data needed to drive improvement efforts. In the same way that image-guided procedures have an expected duration (e.g., 5 minutes for a central venous catheter or 60 minutes for a hepatic chemoembolization), fluoroscopic procedures should have expected values for the different radiation metrics. Quality improvement projects use deviations from expectation as signals of potential failures in the efficient and effective use of radiation or in the predictive model. A single large deviation (30 minutes for a central venous catheter where the Ka,r was 200 mGy) is usually readily apparent. Smaller but systematic deviations (routinely requiring 10 minutes for a central venous catheter with a Ka,r of 75 mGy) are best detected using control charts and other tools from statistical process control. Effective monitoring also requires flexibility in who does the monitoring. During procedures, the performing physician’s attention is usually focused on catheters, needles, guidewires, and other devices, not the portion of the fluoroscopy screen where radiation metrics are displayed. As a result, the other team members, especially technologists, are usually better able to monitor radiation metrics. This includes acknowledging alarms that occur at preset thresholds and alerting the physician when key thresholds are crossed (Table 24.5).
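
A minimal sketch of the statistical-process-control idea mentioned above: per-procedure Ka,r values are compared against control limits derived from a baseline series, and cases falling outside the limits are flagged for review. The procedure type, baseline values, and 3-sigma limits are illustrative assumptions, not institutional thresholds.

```python
# Minimal sketch of a Shewhart-style control chart over per-procedure Ka,r values.
# The baseline data, procedure name, and 3-sigma control limits are illustrative
# assumptions, not institutional thresholds.
import statistics

def control_limits(baseline_mgy: list) -> tuple:
    mean = statistics.mean(baseline_mgy)
    sd = statistics.stdev(baseline_mgy)
    return mean - 3 * sd, mean + 3 * sd

def flag_deviations(baseline_mgy: list, new_cases_mgy: list) -> list:
    """Return indices of new cases falling outside the baseline control limits."""
    lower, upper = control_limits(baseline_mgy)
    return [i for i, x in enumerate(new_cases_mgy) if x < lower or x > upper]

if __name__ == "__main__":
    # Hypothetical Ka,r values (mGy) for routine central venous catheter placements
    baseline = [20, 25, 18, 30, 22, 27, 24, 19, 26, 23]
    new_cases = [21, 75, 200, 24]
    for i in flag_deviations(baseline, new_cases):
        print(f"Case {i}: Ka,r = {new_cases[i]} mGy outside control limits; review technique and equipment")
```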

Radiation Decontamination*

George A. Alexander, in Ciottone's Disaster Medicine (Second Edition), 2016

Monitoring Instruments

A variety of instruments are available for detecting and measuring radiation. Radiation monitoring entails the measurement of radiation fields in the vicinity of a radiation source, measurement of surface contamination, and measurement of airborne radioactivity. Such monitoring methods are also known as radiation surveys. Radiation survey meters are used to evaluate radiation contamination of patients, equipment, or the environment. Old civil defense instruments, such as the CD V-700 and CD V-715 survey meters, can be used. The CD V-700 meter is used to detect low-intensity gamma and most beta radiation. It can measure only up to 50 mR/h. The CD V-715 meter is used to measure high-intensity gamma radiation. It can measure up to 500 R/h; however, it cannot detect beta or alpha radiation. These instruments are also called Geiger counters or Geiger-Mueller meters.

Newer portable and compact radiation monitor units with digital readouts and alarm systems are commercially available to measure alpha, beta, and/or gamma radiation. Because alpha radiation travels a very short distance in air and is not penetrating, radiation survey instruments cannot detect alpha radiation through even a thin layer of water, blood, dust, paper, or other materials. Most beta emitters can be detected with a survey instrument, such as a CD V-700, provided the metal probe cover is opened. Because gamma radiation or x-rays frequently accompany the emission of alpha and beta radiation from radioactive isotopes, such isotopes constitute both an external and an internal hazard to humans. Gamma radiation is readily detected with survey instruments.
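
For readers working between the legacy units above (mR/h, R/h) and SI units, the following is a minimal sketch using the common field approximation for gamma radiation that 1 R ≈ 1 rad ≈ 1 rem ≈ 10 mSv. This is a rule of thumb, not an exact dosimetric relationship, and it does not apply to alpha or beta radiation; the instrument ranges shown are those quoted above.

```python
# Minimal sketch converting legacy survey-meter readings (mR/h, R/h) to SI units,
# using the rough gamma-radiation rule of thumb 1 R ~ 1 rad ~ 1 rem ~ 10 mSv.
# Not valid for alpha or beta radiation.

MSV_PER_R = 10.0  # rule-of-thumb conversion for photons

def mr_per_h_to_usv_per_h(reading_mr_h: float) -> float:
    """1 mR/h ~ 10 uSv/h under the gamma rule of thumb above."""
    return reading_mr_h * MSV_PER_R

def r_per_h_to_msv_per_h(reading_r_h: float) -> float:
    """1 R/h ~ 10 mSv/h under the same rule of thumb."""
    return reading_r_h * MSV_PER_R

if __name__ == "__main__":
    # CD V-700 upper limit: 50 mR/h; CD V-715 upper limit: 500 R/h (both cited above)
    print(f"CD V-700 full scale ~ {mr_per_h_to_usv_per_h(50):.0f} uSv/h")
    print(f"CD V-715 full scale ~ {r_per_h_to_msv_per_h(500):.0f} mSv/h")
```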

URL: https://www.sciencedirect.com/science/article/pii/B9780323286657000844

Plain Radiographic Imaging

Robert Percuoco, in Clinical Imaging (Third Edition), 2014

Protection of Personnel

Laws entitle occupational radiation workers to a radiation-safe environment. A state inspector scrutinizes each facility for radiation safety annually or biannually. Basic principles of radiation protection for personnel include time, distance, and shielding. Time spent in the vicinity of x-ray exposure should be kept to a minimum. In accordance with the inverse square law, radiation exposure decreases significantly as distance is increased. Whenever possible, workers should maintain a safe distance from sources of ionizing radiation. Unfortunately, radiographers work very close to x-ray machines, especially fluoroscopes, and fluoroscopic exposures run intermittently for minutes to an hour or more. In these cases, shielding is best applied. Shielding includes anything from a lead apron and gloves to a protective barrier placed between the source of radiation and the exposed individual.
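
A short sketch of the inverse square law mentioned above: the dose rate from a point-like source falls with the square of the distance, so doubling the distance cuts exposure to one quarter. The reference dose rate and distances below are hypothetical.

```python
# Minimal sketch of the inverse square law: dose rate falls with the square of
# distance from a point-like source. The example dose rate and distances are
# illustrative assumptions.

def dose_rate_at(distance_m: float, reference_rate_usv_h: float, reference_distance_m: float = 1.0) -> float:
    """Dose rate at distance_m, scaled from a rate measured at reference_distance_m."""
    return reference_rate_usv_h * (reference_distance_m / distance_m) ** 2

if __name__ == "__main__":
    rate_at_1m = 400.0  # uSv/h measured 1 m from the source (hypothetical)
    for d in (1.0, 2.0, 3.0, 4.0):
        print(f"{d:.0f} m: {dose_rate_at(d, rate_at_1m):.0f} uSv/h")
    # Doubling the distance cuts the dose rate to one quarter.
```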

For personal radiation protection, radiographers should follow these guidelines:

Apply the rules of time, distance, and shielding.

Maintain the smallest collimation field appropriate for the examination.

Always wear a radiation monitoring device (e.g., film badge, thermoluminescent dosimeter [TLD], pocket dosimeter) to detect exposure.

Avoid holding a patient during a radiographic examination. If a restraining device cannot be used and a patient must be held, rotate the task so that no one person routinely holds patients.

Ensure that anyone holding a patient is properly protected with a full lead apron and lead gloves.

Dose Limits.

The federal government of the United States sets dose limits for radiation workers and the general public. The federal agency that enforces radiation safety laws and dose limits is the Nuclear Regulatory Commission (NRC). Dose limits are reported in NCRP publications. NCRP Report No. 116, Limitation of Exposure to Ionizing Radiation (1993), provides the most up-to-date information on occupational and nonoccupational dose limits.1 A summary of Report No. 116 can be found in Table 1-3. These dose limits do not pertain to patients receiving medical exposures for diagnostic or therapeutic purposes. The annual effective dose limit for occupational exposures is 5 rem (50 mSv). The cumulative effective dose limit for occupational exposures is

1 rem × age of the worker in years (i.e., 10 mSv × age)
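
A minimal sketch applying the NCRP Report No. 116 limits quoted above (50 mSv annual effective dose; cumulative limit of 1 rem × age, i.e., 10 mSv × age in years); the worker's age and dose history are illustrative assumptions.

```python
# Minimal sketch applying the NCRP Report No. 116 limits quoted above:
# annual effective dose limit of 50 mSv, cumulative limit of 10 mSv x age in years.
# The worker's age and dose history are illustrative assumptions.

ANNUAL_LIMIT_MSV = 50.0

def cumulative_limit_msv(age_years: int) -> float:
    """Cumulative occupational limit: 1 rem x age = 10 mSv x age in years."""
    return 10.0 * age_years

def check_limits(age_years: int, annual_doses_msv: list) -> None:
    latest = annual_doses_msv[-1]
    lifetime = sum(annual_doses_msv)
    print(f"Latest annual dose: {latest:.1f} mSv (limit {ANNUAL_LIMIT_MSV:.0f} mSv)")
    print(f"Lifetime dose: {lifetime:.1f} mSv (limit {cumulative_limit_msv(age_years):.0f} mSv)")

if __name__ == "__main__":
    check_limits(age_years=35, annual_doses_msv=[2.1, 3.4, 1.8, 2.7])
```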

Dose limits are established primarily to minimize the risks of stochastic and nonstochastic radiation effects. A stochastic radiation effect is one in which the probability of occurrence, rather than severity, increases as the dose increases. No threshold dose exists below which risk is eliminated. Examples of stochastic effects are cancer and genetic effects.

A nonstochastic radiation effect is one that manifests with certainty after a certain dose and the severity increases as the dose increases. A threshold dose exists below which nonstochastic effects do not manifest. Examples include sterility changes, radiation burns, and cataract formation in the lens of the eye.

URL: https://www.sciencedirect.com/science/article/pii/B9780323084956000014

Dirty Bomb (Radiological Dispersal Device)*

George A. Alexander, in Ciottone's Disaster Medicine (Second Edition), 2016

Pre-Incident Actions

One of the most important preemptive actions that emergency medical service agencies, hospital-based emergency departments, and outpatient facilities should take is to determine whether their community is a possible target for a terrorist dirty bomb attack. Coordinating with local and state law enforcement and response agencies should provide a framework in which to assess the dirty bomb threat and develop a medical radiation incident or injury protocol. The protocol should be incorporated into the overall disaster plan. The radiation disaster plan should address decontamination, security, radiation monitoring, and decorporation of radioactive materials. The hospital radiation safety officer should be included in the medical radiation response team. Hospital staff should understand the hazards of radioactive contamination and be trained in radiation-monitoring techniques. Staff need access to dosimeters, Geiger-Mueller counters, and personal protective equipment. Radiation-detection capabilities are critical to an effective medical response. Hospitals should have a realistic decontamination plan for patients,9 a lockdown plan to control access, and evacuation plans. A radiation risk communication program is required for the public. As the Fukushima nuclear plant accident in 2011 showed, the most important task for scientists is to make scientific knowledge about the level of radiation risk available to the public so that people better understand their potential health risks.10

URL: https://www.sciencedirect.com/science/article/pii/B9780323286657000777

Three Mile Island

J. Sorensen, in Encyclopedia of Toxicology (Third Edition), 2014

Introduction

The accident at the Three Mile Island Unit 2 (TMI-2) nuclear power plant near Middletown, Pennsylvania, on Wednesday, 28 March 1979, was the most serious accident at a commercial nuclear power plant in the United States, even though it led to no deaths or injuries to plant workers or members of adjacent communities. It did have major implications for the nuclear industry because it resulted in major changes in the regulatory requirements involving emergency response planning, reactor operator training, radiation monitoring, human factors engineering, radiation protection, and many other areas of nuclear power plant operations. It also caused the US Nuclear Regulatory Commission (NRC) to tighten its regulatory oversight.

The accident at TMI-2 began at 4 a.m. with a minor malfunction, or transient, in the nonnuclear part of the reactor: the main feed-water pumps stopped running because of either a mechanical or an electrical failure, which prevented the steam generators from removing heat. This minor event evolved into a series of automated responses in the reactor's coolant system, during which the relief valve on top of a piece of equipment called the pressurizer became stuck open. Operators misread the plant conditions for about 2.25 hours before the relief valve was closed, and an automatic emergency cooling system was turned off; as a result, the reactor core became partially uncovered and was severely damaged. The major consequences of the accident unfolded over the next week, and it took a month to bring the reactor to a cold shutdown.

URL: https://www.sciencedirect.com/science/article/pii/B9780123864543000828

Radiological Hazards and Lasers

Michael Sheetz, in Research Regulatory Compliance, 2015

4.1 Radioactive Materials License

An institution's use of RAM for research purposes requires a specific license for the type and quantity of RAM used unless the material is “exempt” or covered under a “general license.” The specific license will be issued by either the NRC or an agreement state. The license is issued to a legal entity such as a corporation, partnership, or individual. Once licensed, the entity must ensure that the radiation activities conducted are in compliance with the terms and conditions of the license and applicable regulations. Accordingly, for educational, medical, and research institutions, the license is issued to the respective corporate entity and not to a department or individual employee.

To obtain a RAM license, an application must be submitted to the applicable regulatory agency for review and approval before a license document is provided to the entity. The information required in the application includes:

1. Isotope, chemical or physical form, and maximum activity;

2. Purpose for which the RAM will be used;

3. Individuals responsible for the radiation protection program and their training and experience;

4. Description of the facilities and equipment used for controlling exposures and protecting individuals;

5. Description of radiation monitoring equipment;

6. Description of the RPP;

7. Radioactive waste disposal methods.

For institutions that perform a variety of research, the purpose for which RAM will be used is often listed as “research and development,” which is defined in 10 CFR 30 as: (1) theoretical analysis, exploration, or experimentation; or (2) the extension of investigative findings and theories of a scientific or technical nature into practical application for experimental and demonstration purposes, including the experimental production and testing of models, devices, equipment, materials, and processes. Research and development does not include the internal or external administration of radiation or RAM to human beings. This will essentially cover the licensee for any type of research involving the use of RAM. One exception to this would be for tracer studies involving the release of RAM to the environment. This use would require specific review and approval by the regulatory agency.

The license application must also identify an individual to be named as the RSO. This individual must have the requisite training and work experience to be responsible for, and to manage on a day-to-day basis, the RPP. Depending on the types and uses of RAM under the license, an RSC may also be required for oversight and management of the RPP. Licenses will be issued referencing applicable regulatory requirements and will also include specific conditions for use of the requested RAM. Licenses are issued for periods of 5 to 10 years, after which they must be renewed by submission of an application if the licensed activities are to continue. Changes to a license, such as adding a new radioisotope or type of use or changing the RSO, are made by submitting an amendment request to the respective regulatory agency. A licensee may not transfer control to another person or entity, or dispose of the license, without prior written consent from the regulatory agency. Licensing fees are associated with the application, amendment, and renewal process, along with an annual license fee assessment, the amount of which is based on the type of materials license.

The following NRC regulations are applicable to the licensing process. Although agreement state licensing regulations will be compatible with or similar to these, the specific agreement state regulations should be consulted if the licensee falls under that state's authority:

1. 10 CFR part 2, “Rules of Practice for Domestic Licensing Proceedings and Issuance of Orders”;

2. 10 CFR part 19, “Notices, Instructions and Reports to Workers: Inspection and Investigations”;

3. 10 CFR part 20, “Standards for Protection Against Radiation”;

4. 10 CFR part 21, “Reporting of Defects and Noncompliance”;

5. 10 CFR part 30, “Rules of General Applicability to Domestic Licensing of Byproduct Material”;

6. 10 CFR part 31, “General Domestic Licenses for Byproduct Material”;

7. 10 CFR part 32, “Specific Domestic Licenses to Manufacture or Transfer Certain Items Containing Byproduct Material”;

8. 10 CFR part 33, “Specific Domestic Licenses of Broad Scope for Byproduct Material”;

9. 10 CFR part 35, “Medical Use of Byproduct Material”;

10. 10 CFR part 40, “Domestic Licensing of Source Material”;

11. 10 CFR part 70, “Domestic Licensing of Special Nuclear Material” (for pacemaker devices);

12. 10 CFR part 71, “Packaging and Transportation of Radioactive Material.”

URL: https://www.sciencedirect.com/science/article/pii/B9780124200586000058

Radiation Sources and Detectors

F. Sauli, in Comprehensive Biomedical Physics, 2014

8.22.1 Gaseous Detectors: Historical Background

The history of gaseous counters goes back to the early years of the twentieth century, when Ernest Rutherford and Hans Geiger conceived an instrument capable of detecting the tiny ionization trails released in a gas by natural radiation (Rutherford and Geiger, 1908). The instrument consists of a thin wire centered in a cylindrical tube; on application of a positive potential to the wire (the anode), avalanche multiplication in the gas provides an amplified, detectable signal proportional to the original ionization, hence the name proportional counter. Further developments by Geiger and Walther Müller led to a device capable of detecting single electrons released in the gas; simple, reliable, and cheap, Geiger–Müller counters are still widely used for radiation monitoring.

Single-wire gas counters have been used for decades, with a design and response matching the experimental needs. The concurrent study of the mechanisms of collisions between electrons and molecules under the effect of electric fields (the so-called gaseous electronics) provided a theoretical background for the understanding of the complex processes encountered within the detectors, and of their efficiency, timing, and energy resolution properties (see, e.g., Loeb, 1961).

Although examples exist of arrays of proportional counters, the use of the devices remained confined to detectors of limited geometrical coverage. In the fast expanding field of particle physics experiments, the need to instrument large detection areas with localization capability led to the development of other tools exploiting the avalanche charge multiplication in gases, such as spark and streamer chambers, where a high‐voltage pulse applied between electrodes synchronously with the presence of charged tracks causes a detectable breakdown along the ionization trails. Originally recorded with optical means, the position of sparks could be sensed electronically with the development of various methods of localization, replacing the continuous electrodes with wire structures (for a survey of these technologies, see Rice-Evans, 1974).

Although very powerful at the time, detectors based on the growth of a spark have modest rate capability, due to the time needed to remove the large amount of charge generated by a spark and avoid refiring; in optimal conditions, event rates could not exceed a few per second, a rather drastic limitation for experiments.

The multiwire proportional chamber (MWPC), introduced in 1968 by Georges Charpak, revolutionized the field of fast position-sensitive detectors (Charpak et al., 1968). Continuously active, efficient at particle fluxes up to several MHz per cm2 and with sub-mm position accuracy, the device met the most stringent experimental requirements of the time. The development of large‐area MWPC manufacturing technologies, and the emerging availability of high-density electronics led soon to a new generation of detectors. Exploitation of the electrons' drift time and of the cathode‐induced signals originated a variety of other devices fulfilling the needs of high-energy physics experimentation (see, e.g., Charpak and Sauli, 1984).

The commissioning of high-luminosity colliders, and the quest for rare events embedded in a large flux of combinatorial background, revealed several weaknesses of detectors based on wire structures. The discrete spacing of wires implies limited accuracy and multitrack separation; the long time taken by the ions produced in the avalanches to clear the region of multiplication results in a field-distorting space-charge accumulation, with a consequent fast drop of gain at high fluxes. More seriously, permanent damage to the structures due to the formation of solid deposits on electrodes (the so-called aging) affects the detectors after long-term exposure to radiation. After decades of research on the subject, aside from a generic set of dos and don'ts, a general solution to the aging problem in wire chambers has not yet been found (Capeáns, 2003).

An innovative detector named micro-strip gas counter (MSGC), introduced in 1988 (Oed, 1988), seemed to fulfill the more demanding requirements: a substantial improvement in position resolution, and an increase by several orders of magnitude of rate capability, as compared to MWPCs. Consisting of alternating anode and cathode strips engraved on an insulating support, although limited in size, MSGCs could be manufactured industrially with photolithographic processes. Several experiments were designed to make use of large arrays of MSGCs (Sauli, 1998). Disappointingly, and despite a large effort in optimizing structures and operating conditions (Bouclier et al., 1995), the devices appeared prone to fast degradation and discharges, with devastating effects on the fragile electrodes, and have been virtually discontinued.

The problems encountered with the MSGCs spawned the development of alternative structures, collectively named micro-pattern gas detectors (MPGDs), promising comparable performance while being more resilient to radiation and spark damage: microgap, microwire, microdot, field gradient lattice, Compteur à Trous, and others; for an overview of these devices and related references, see Sauli and Sharma (1999). In most cases, however, the new structures proved difficult to manufacture in reasonable sizes and quantities. Notable exceptions are the micro-mesh gaseous structure (MICROMEGAS) (Giomataris et al., 1996) and the gas electron multiplier (GEM) developed by Sauli (1997).

Already in use in many experimental setups worldwide, the new devices are still the subject of extensive development work aimed at improving performance and manufacturing methods within the framework of the RD51 international collaboration (Duarte Pinto, 2009). A review of the progress with MPGDs can be found in Titov (2007).

URL: https://www.sciencedirect.com/science/article/pii/B9780444536327006250

Reproductive Hazards of Occupational and Environmental Exposures

Marja-Liisa Lindbohm, ... Helena Taskinen, in Women and Health (Second Edition), 2013

Physical Agents

Ionizing Radiation

The adverse effects of exposure to high doses of ionizing radiation on reproductive health are well known. Electromagnetic ionizing radiation (x-rays and gamma rays) penetrates tissues and reaches the sexual organs and fetus easily, whereas alpha and beta particulate radiation (helium nuclei and electrons, respectively) does not penetrate tissues deeply, although these particles do generate ions along their short path in the tissue. Radionuclides also emit ionizing radiation. Occupational exposure to ionizing radiation may occur at nuclear power plants and in health care professions (x-rays and radionuclides). Flight crews may be exposed to cosmic radiation.

Evidence on the adverse reproductive effects of ionizing radiation in humans has been gathered primarily from environmental exposure, medical diagnostics, and curative procedures, rather than from studies of occupational exposure, where findings have been inconsistent (Table 40.2). High exposure has been related to amenorrhoea and early menopause,101 growth retardation, miscarriages, congenital anomalies, mental retardation, and childhood cancer.102 In occupational studies an increased risk of miscarriage among veterinarians and veterinary assistants using diagnostic x-rays76 and among cabin attendants103 has been reported. The results of a study among laboratory technicians suggested that exposure to radioisotopes may carry a high risk of preterm birth and birth defects.104

Table 40.2. Physical and Biological Occupational Hazards Associated with Adverse Reproductive Effects

Hazard | Reported Effects | Industry or Occupation
Ionizing radiation | Fetal loss | Health care personnel
Microwaves, shortwaves, high-frequency EMF | Fetal loss, low birth weight, altered sex ratio | Physiotherapists
Low-frequency EMF | Fetal loss, childhood cancer | Industrial work
Noise | Menstrual disorders, reduced fertility, preterm birth, intrauterine growth retardation, low birth weight | Industrial manufacturing
Cosmic radiation | Menstrual disorders, fetal loss | Flight attendants
Infectious disease (a) | Fetal loss, intrauterine growth retardation, birth defects | Health care personnel, childcare workers

(a) Evidence based mainly on studies in the general population.

Most studies considering maternal exposure to radiation have found no excess cancer risk in the children of nuclear workers or medical radiographers. A record linkage study showed a weak association between maternal radiation work during pregnancy and childhood cancer in offspring, although the evidence is limited by the small number of study subjects.105 Overall, evidence of the hazardous effects of ionizing radiation indicates that the fetus is sensitive to them. If a female worker has declared that she is pregnant, the International Commission on Radiological Protection recommends that the level of protection for the embryo/fetus should be broadly similar to that provided for members of the public.106

A large cohort study among nuclear industry workers exposed to low-level ionizing radiation found no evidence of a link between exposure and birth defects, but the risks of early fetal death and stillbirth were higher in women whose jobs required radiation monitoring before conception than in unmonitored women. No trend with dose was observed,107 so the findings require further investigation. There was also weak evidence of an association between exposure and primary infertility, but the relatively small number of monitored women prevented detailed examination of these data.108

Electromagnetic Fields

The potential harmful effects of occupational exposure to electromagnetic fields have primarily been investigated among physiotherapists exposed to shortwave, microwave, and ultrasound radiation from physiotherapy equipment, but results for adverse reproductive effects are inconsistent. Some, but not all, studies have found increased risk of miscarriage. A low boy-to-girl birth ratio and low male newborn birth weight have been reported in physiotherapists exposed to high-frequency electromagnetic radiation in some studies.109

Evidence on the adverse reproductive effects of electromagnetic fields among physiotherapists and static magnetic fields among MRI technologists is inconclusive. Workers using magnetic resonance imaging (MRI) in diagnostic medicine may be exposed to strong static magnetic fields. A survey of MRI technologists did not show any major reproductive hazards associated with MRI work, although a slight but not significant increase in miscarriages was observed.110 Studies on MRI examinations during pregnancy have also indicated no excess of adverse pregnancy outcome, but the study sizes have been very small.111

Some studies have looked at the effects of exposure to extremely low frequency (ELF) magnetic fields from electric bed heating devices, other residential sources, and video display terminals (VDTs). The findings on the effects of the use of electric blankets or heated waterbeds on miscarriage and other adverse pregnancy outcomes have been inconsistent. Most studies on residential exposure have also reported no clear association with fetal growth or birth defects.112 Two studies used personal measurements of ELF magnetic fields in the exposure assessment. Both of them reported an increased risk of miscarriage related to maximum measured magnetic field levels (≥1.6 μT or ≥2.3 μT).113,114

Several studies have been done on video display terminal workers, and exposure to the low-frequency electromagnetic fields of VDTs has been suspected as a potential reproductive hazard. Most studies suggest that VDT work is not associated with miscarriage, congenital anomalies, fetal growth retardation, or other pregnancy complications.115,116 The miscarriage risk for women who worked with older, high-emission VDTs (ELF >3 mG), however, remains uncertain.117

The epidemiologic evidence does not, taken as a whole, suggest a strong association between exposure to ELF magnetic fields and adverse reproductive outcome, although an effect at high levels of exposure cannot be excluded. No generally accepted mechanism for biological effects on reproduction exists. The association between exposure to ELF magnetic fields during pregnancy and childhood cancer is inconsistent. One study suggests that children of mothers occupationally exposed to the highest levels of electromagnetic fields during pregnancy have an increased risk of leukemia.118 In another study, the contribution of ELF magnetic field exposure on the incidence of childhood brain tumors was examined using individual exposure estimations or a job exposure matrix.119 Results are suggestive of a possible association between maternal occupational ELF magnetic field exposure and certain brain tumors in their offspring. However, a pooled analysis of 10 studies did not reveal consistent evidence of increased childhood brain tumor risk associated with exposure.120

Noise

Current evidence suggests that high noise exposure should be considered a potential reproductive hazard. Most epidemiologic studies on the effects of occupational noise exposure concern preterm birth and low birth weight. Elevated risks of these outcomes have been observed among noise-exposed workers, but negative results have also been found. Exposure to high noise levels (an 8-hour time-weighted average of approximately 85 dB(A) or higher) has been associated with fetal growth retardation.121 Excesses of hormonal disturbances, delayed conception, infertility, and miscarriage have also been reported for occupational noise exposure. A possible mechanism is that the stress response induces an increase in maternal catecholamine secretion, which may stimulate or retard uterine contractions and affect utero-placental blood flow.

Noise is also suspected to adversely affect fetal auditory response. Environmental noise penetrates the tissues and fluids surrounding the fetal head and stimulates the inner ear by bone conduction, allowing the fetus to hear predominantly low-frequency sounds (high-frequency sound is greatly attenuated by the abdomen).122 Two early reports, which were subsequently heavily criticized on methodological grounds, suggested that occupational noise exposure (85–95 dB(A) or >100 dB(A)) might increase the risk of hearing deterioration in the children. In a more recent study, no hearing impairment was observed in children of mothers exposed to occupational noise (>80 but <90 dB(A)) during pregnancy compared with the children of unexposed mothers.123

URL: https://www.sciencedirect.com/science/article/pii/B9780123849786000406

What devices are used to measure exposure to radiation?

Geiger counters are commonly used to measure the amount of radioactivity, but there are other types of detectors that may be used.

What is used to measure exposure rate of radiation in humans?

The biological risk of exposure to radiation is measured using the conventional unit rem or the SI unit sievert (Sv).