Biological effects from exposure to electromagnetic radiation emitted by cell tower base stations and other antenna arrays

B. Blake Levitt (a), Henry Lai (b)

(a) P.O. Box 2014, New Preston, CT 06777, USA (e-mail: bbl353355@gmail.com).

(b) Department of Bioengineering, Box 355061, University of Washington, Seattle, WA 98195, USA.

Received April 30, 2010. Accepted August 6, 2010.

Environmental Reviews, 2010, 18(NA): 369-395, https://doi.org/10.1139/A10-018

Contents

1. Introduction
2. A changing industry
3. Cell towers in perspective: some definitions
4. Specific absorption rate (SAR)
5. Transmission facilities
6. Government radiofrequency radiation (RFR) guidelines: how spatial energy translates to the body’s absorption
7. Biological effects at low intensities
8. Long-term exposures and cumulative effects
9. Effects below 4 W/kg: thermal versus nonthermal
10. Studies on exposure to cell tower transmissions
11. Risk perception, electrohypersensitivity, and psychological factors
12. Assessing exposures
13. Discussion

Abstract

The siting of cellular phone base stations and other cellular infrastructure such as roof-mounted antenna arrays, especially in residential neighborhoods, is a contentious subject in land-use regulation. Local resistance from nearby residents and landowners is often based on fears of adverse health effects despite reassurances from telecommunications service providers that international exposure standards will be followed. Both anecdotal reports and some epidemiology studies have found headaches, skin rashes, sleep disturbances, depression, decreased libido, increased rates of suicide, concentration problems, dizziness, memory changes, increased risk of cancer, tremors, and other neurophysiological effects in populations near base stations. The objective of this paper is to review the existing studies of people living or working near cellular infrastructure and other pertinent studies that could apply to long-term, low-level radiofrequency radiation (RFR) exposures. While specific epidemiological research in this area is sparse and contradictory, and such exposures are difficult to quantify given the increasing background levels of RFR from myriad personal consumer products, some research does exist to warrant caution in infrastructure siting. Further epidemiology research that takes total ambient RFR exposures into consideration is warranted.
Symptoms reported today may be classic microwave sickness, first described in 1978. Nonionizing electromagnetic fields are among the fastest growing forms of environmental pollution. Some extrapolations can be made from research other than epidemiology regarding biological effects from exposures at levels far below current exposure guidelines.

Keywords: radiofrequency radiation (RFR), antenna arrays, cellular phone base stations, microwave sickness, nonionizing electromagnetic fields, environmental pollution

1. Introduction

Wireless technologies are ubiquitous today. According to the European Information Technology Observatory, an industry-funded organization in Germany, the threshold of 5.1 billion cell phone users worldwide will be reached by the end of 2010, up from 3.3 billion in 2007. That number is expected to increase by another 10% to 5.6 billion in 2011, out of a total worldwide population of 6.5 billion. In 2010, cell phone subscribers in the U.S. numbered 287 million, Russia 220 million, Germany 111 million, Italy 87 million, Great Britain 81 million, France 62 million, and Spain 57 million. Growth is strong throughout Asia and South America, but especially so in developing countries where landline systems were never fully established. The investment firm Bank of America Merrill Lynch estimated that the worldwide penetration of mobile phone customers is twice that of landline customers today and that America has the highest minutes of use per month per user. Today, 94% of Americans live in counties with four or more wireless service providers, and 99% of Americans live in counties where next generation 3G (third generation), 4G (fourth generation), and broadband services are available. All of this capacity requires an extensive infrastructure that the industry continues to build in the U.S., despite a 93% wireless penetration of the total U.S. population.
Next generation services are continuing to drive the build-out of both new infrastructure and the adaptation of pre-existing sites. According to the industry, there are an estimated 251 618 cell sites in the U.S. today, up from 19 844 in 1995. There is no comprehensive data for antennas hidden inside of buildings, but one industry-maintained Web site (www.antennasearch.com) allows people to type in an address and all antennas within a 3 mile (1 mile = 1.6 km) area will come up. There are hundreds of thousands in the U.S. alone. People are increasingly abandoning landline systems in favor of wireless communications. One estimate in 2006 found that 42% of all wireless subscribers used their wireless phone as their primary phone. According to the National Center for Health Statistics of the U.S. Centers for Disease Control (CDC), by the second half of 2008, one in every five American households had no landlines but did have at least one wireless phone (Department of Health and Human Services 2008). The figures reflected a 2.7% increase over the first half of 2008, the largest jump since the CDC began tracking such data in 2003, and represented a total of 20.2% of the U.S. population, a figure that coincides with industry estimates of 24.5% of completely wireless households in 2010. The CDC also found that approximately 18.7% of all children, nearly 14 million, lived in households with only wireless phones. The CDC further found that one in every seven American homes, 14.5% of the population, received all or almost all of their calls via wireless phones, even when there was a landline in the home. They called these “wireless-mostly households.” The trend away from landline phones is obviously increasing as wireless providers market their services specifically toward a mobile customer, particularly younger adults who readily embrace new technologies. One study (Silke et al.
2010) in Germany found that children from lower socioeconomic backgrounds not only owned more cell phones than children from higher economic groups, but also used their cell phones more often, as determined by the test groups’ wearing of personal dosimetry devices. This was the first study to track such data, and it found an interesting contradiction to the assumption that higher socioeconomic groups were the largest users of cell services. At one time, cell phones were a status symbol of the wealthy. Today, they are also a status symbol among lower socioeconomic groups. The CDC found in the survey discussed above that adults living in or near poverty (65.3%) were more likely than higher income adults to be living in households with wireless-only telephones. There may be multiple reasons for these findings, including a shift away from cell phone dialogues to texting among younger adults in higher socioeconomic categories. In some developing countries where landline systems have never been fully developed outside of urban centers, cell phones are the only means of communication. Cellular technology, especially the new 3G, 4G, and broadband services that allow real-time voice communication, text messaging, photos, Internet connections, music and video downloads, and TV viewing, is the fastest growing segment of many economies that are otherwise in sharp decline due to the global economic downturn. There is some indication that although the cellular phone markets of many European countries are more mature than the U.S. market, people there may be maintaining their landline use while augmenting it with mobile phone capability.
This may be a consequence of the more robust media coverage regarding health and safety issues of wireless technology in the European press, particularly in the UK, as well as recommendations by European governments, such as those of France and Germany, that citizens not abandon their landline phones or wired computer systems because of safety concerns. According to OfCom’s 2008 Communications Market Interim Report (OfCom 2008), which provided information up to December 2007, approximately 86% of UK adults use cell phones. While four out of five households have both cell phones and landlines, only 11% use cell phones exclusively, down from 28% noted by this group in 2005. In addition, 44% of UK adults use text messaging on a daily basis. Fixed landline services fell by 9% in 2007, but OfCom notes that landline services continue to be strong even though mobile services also continued to grow by 16%. This indicates that people in the UK are continuing to use both landlines and wireless technology rather than choosing one over the other. There were 51 300 UK base station sites at the beginning of 2009 (two-thirds installed on existing buildings or structures), with an estimated 52 900 needed to accommodate new 3G and 4G services by the end of 2009. Clearly, this is an enormous global industry. Yet no money has ever been appropriated by the industry in the U.S., or by any U.S. government agency, to study the potential health effects on people living near the infrastructure. The most recent research has all come from outside of the U.S.
According to the CTIA − The Wireless Association, “If the wireless telecom industry were a country, its economy would be bigger than that of Egypt, and, if measured by GNP (gross national product), [it] would rank as the 46th largest country in the world.” They further say, “It took more than 21 years for color televisions to reach 100 million consumers, more than 90 years for landline service to reach 100 million consumers, and less than 17 years for wireless to reach 100 million consumers.” In lieu of building new cell towers, some municipalities are licensing public utility poles throughout urban areas for Wi-Fi antennas that allow wireless Internet access. These systems can require hundreds of antennas in close proximity to the population with some exposures at a lateral height where second- and third-storey windows face antennas. Most of these systems are categorically excluded from regulation by the U.S. Federal Communications Commission (FCC) or oversight by government agencies because they operate below a certain power density threshold. However, power density is not the only factor determining biological effects from radiofrequency radiation (RFR). In addition, when the U.S. and other countries permanently changed from analog signals used for television transmission to newer digital formats, the old analog frequencies were reallocated for use by municipal services such as police, fire, and emergency medical dispatch, as well as to private telecommunications companies wanting to expand their networks and services. This creates another significant increase in ambient background exposures. Wi-Max is another wireless service in the wings that will broaden wireless capabilities further and place additional towers and (or) transmitters in close proximity to the population in addition to what is already in existence. 
Wi-Max aims to make wireless Internet access universal without tying the user to a specific location or “hotspot.” The rollout of Wi-Max in the U.S., which began in 2009, uses lower frequencies at higher power densities than those currently used for cellular phone transmission. Many in the scientific and activist communities are worried, especially those concerned about electromagnetic-hypersensitivity syndrome (EHS). It remains to be seen what additional exposures “smart grid” or “smart meter” technology proposals to upgrade the electrical powerline transmission systems will entail regarding total ambient RFR increases, but they will add another ubiquitous low-level layer. Some of the largest corporations on earth, notably Siemens and General Electric, are involved. Smart grids are being built out in some areas of the U.S. and in Canada and throughout Europe. That technology plans to alter certain aspects of powerline utility metering from a wired system to a partially wireless one. The systems require a combination of wireless transmitters attached to homes and businesses that will send radio signals of approximately 1 W output in the 2.4000–2.4835 GHz range to local “access point” transceivers, which will then relay the signal to a more distant information center (Tell 2008). Access point antennas will require additional power density and will be capable of interfacing with frequencies between 900 MHz and 1.9 GHz. Most signals will be intermittent, operating between 2 and 33 seconds per hour. Access points will be mounted on utility poles as well as on free-standing towers. The systems will form wide area networks (WANs), capable of covering whole towns and counties through a combination of “mesh-like” networks from house to house. Some meters installed on private homes will also act as transmission relays, boosting signals from more distant buildings in a neighborhood. Eventually, WANs will be completely linked.
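The intermittency figures above can be turned into a rough duty-cycle estimate. The sketch below uses only the numbers cited from Tell (2008), roughly 1 W output transmitting for a total of 2 to 33 seconds per hour, and is illustrative arithmetic, not a measurement:

```python
# Rough duty-cycle arithmetic for a smart meter transmitter, using only
# the figures cited above (Tell 2008): ~1 W output, transmitting for a
# total of 2 to 33 seconds per hour. Illustrative, not a measurement.

def duty_cycle(seconds_on_per_hour: float) -> float:
    """Fraction of each hour the transmitter is actually emitting."""
    return seconds_on_per_hour / 3600.0

def time_averaged_power_w(peak_watts: float, seconds_on_per_hour: float) -> float:
    """Emitted power averaged over an hour, in watts."""
    return peak_watts * duty_cycle(seconds_on_per_hour)

for seconds in (2.0, 33.0):
    print(f"{seconds:>4.0f} s/h: duty cycle {duty_cycle(seconds):.3%}, "
          f"average power {time_averaged_power_w(1.0, seconds):.5f} W")
```

On these cited figures the time-averaged emission is well under 1% of peak output, which is why such meters are often assessed by their average rather than peak power.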
Smart grid technology also proposes to allow homeowners to attach additional RFR devices to existing indoor appliances to track power use, with the intention of reducing usage during peak hours. Manufacturers like General Electric are already making appliances with transmitters embedded in them. Many new appliances will be incapable of having their transmitters deactivated without disabling the appliance and voiding the warranty. People will be able to access their home appliances remotely by cell phone. The WAN-based smart grids described earlier differ significantly from the upgrades that many utility companies have initiated in recent years, which already use low-power RFR meters attached to homes and businesses. Those first generation RFR meters transmit to a mobile van that travels through an area and “collects” the information on a regular billing cycle. Smart grids do away with the van and the meter reader and work off a centralized RFR antenna system capable of blanketing whole regions with RFR. Another new technology in the wings is broadband over powerlines (BPL). It was approved by the U.S. FCC in 2007, and some systems have already been built out. Critics of the latter technology warned during the approval process that radiofrequency interference could occur in homes and businesses, and those warnings have proven accurate. BPL technology couples radiofrequency bands with extremely low frequency (ELF) bands that travel over powerline infrastructure, thereby creating a multi-frequency field designed to extend some distance from the lines themselves. Such couplings follow the path of conductive material, including secondary distribution lines, into people’s homes. There is no doubt that wireless technologies are popular with consumers and businesses alike, but all of this requires an extensive infrastructure to function.
Infrastructure typically consists of freestanding towers (either preexisting towers to which cell antennas can be mounted, or new towers specifically built for cellular service) and myriad methods of placing transceiving antennas near the service being called for by users. This includes antenna panels attached to the sides of buildings, roof-mounted arrays, and antennas hidden inside church steeples, barn silos, elevator shafts, and any number of other “stealth sites.” It also includes camouflaging towers to look like trees indigenous to the areas where they are placed, e.g., pine trees in northern climates, cacti in deserts, and palm trees in temperate zones, or as chimneys, flagpoles, silos, or other tall structures (Rinebold 2001). Often the rationale for stealth antenna placement or camouflaging of towers is based on the aesthetic concerns of host communities. An aesthetic emphasis is often the only perceived control of a municipality, particularly in countries like America where there is an overriding federal preemption that precludes taking the “environmental effects” of RFR into consideration in cell tower siting, as stipulated in Section 704 of The Telecommunications Act of 1996 (USFCC 1996). Citizen resistance, however, is most often based on health concerns regarding the safety of RFR exposures for those who live near the infrastructure. Many citizens, especially those who claim to be hypersensitive to electromagnetic fields, state that they would rather know where the antennas are and that hiding them greatly complicates society’s ability to monitor for safety. Industry representatives try to reassure communities that facilities emit at levels many orders of magnitude below what is allowed by standards-setting boards, and studies bear that out (Cooper et al. 2006; Henderson and Bangay 2006; Bornkessel et al. 2007).
These include the standards of the International Commission on Non-Ionizing Radiation Protection (ICNIRP), used throughout Europe, Canada, and elsewhere (ICNIRP 1998), and the standards currently adopted by the U.S. FCC, which uses a two-tiered system of recommendations put out by the National Council on Radiation Protection and Measurements (NCRP) for civilian exposures (referred to as uncontrolled environments) and the Institute of Electrical and Electronics Engineers (IEEE) for professional exposures (referred to as controlled environments) (U.S. FCC 1997). The U.S. may eventually adopt standards closer to ICNIRP’s. The current U.S. standards are more protective than ICNIRP’s in some frequency ranges, so any harmonization toward the ICNIRP standards would make the U.S. limits more lenient. All of the standards currently in place are based on RFR’s ability to heat tissue, called thermal effects. A longstanding criticism, going back to the 1950s (Levitt 1995), is that such acute heating effects do not take potentially more subtle nonthermal effects into consideration. And based on the number of citizens who have tried to stop cell towers from being installed in their neighborhoods, laypeople in many countries do not find adherence to existing standards adequate to address their health concerns. Therefore, infrastructure siting does not have the confidence of the public (Levitt 1998).

2. A changing industry

Cellular phone technology has changed significantly over the last two decades. The first wireless systems began in the mid-1980s and used analog signals in the 850–900 MHz range. Because those wavelengths were longer, infrastructure was needed on average every 8 to 10 miles. Then came the digital personal communications systems (PCS) in the late 1990s, which used higher frequencies, around 1900 MHz, and digitized signals. The PCS systems, using shorter wavelengths and with more stringent exposure guidelines, require infrastructure approximately every 1 to 3 miles. Digital signals work on a binary method, mimicking a wave, that allows any frequency to be split in several ways, thereby carrying more information far beyond just voice messages. Today’s 3G network can send photos and download music and video directly onto a cell phone screen or iPod. The new 4G systems digitize and recycle some of the older frequencies in the 700 to 875 MHz bands to create another service for wireless Internet access. The 4G network does not require a customer who wants to log on wirelessly to locate a “hot spot” as is the case with private Wi-Fi systems. Today’s Wi-Fi uses a network of small antennas, creating coverage of a small area of 100 ft (∼30 m) or so at homes or businesses.
Wi-Fi can also create a small wireless computer system in a school, where such systems are often called wireless local area networks (WLANs). Whole cities can make Wi-Fi available by mounting antennas on utility poles. Large-scale Wi-Fi systems have come under increasing opposition from citizens concerned about health issues, who have legally blocked such installations (Antenna Free Union). Small-scale Wi-Fi has also come under more scrutiny as governments in France and elsewhere in Europe have banned such installations in libraries and schools, based on precautionary principles (REFLEX Program 2004).

3. Cell towers in perspective: some definitions

Cell towers are considered low-power installations when compared to many other commercial uses of radiofrequency energy. Wireless transmission for radio, television (TV), satellite communications, police and military radar, federal homeland security systems, emergency response networks, and many other applications all emit RFR, sometimes at millions of watts of effective radiated power (ERP). Cellular facilities, by contrast, use a few hundred watts of ERP per channel, depending on the use being called for at any given time and the number of service providers co-located at any given tower. No matter what the use, once emitted, RFR travels through space at the speed of light and oscillates during propagation. The number of times the wave oscillates in one second determines its frequency. Radiofrequency radiation covers a large segment of the electromagnetic spectrum and falls within the nonionizing bands. Its frequency ranges from 10 kHz to 300 GHz; 1 Hz = 1 oscillation per second; 1 kHz = 1000 Hz; 1 MHz = 1 000 000 Hz; and 1 GHz = 1 000 000 000 Hz. Different frequencies of RFR are used in different applications. Some examples include the frequency range of 540 to 1600 kHz used in AM radio transmission and 88 to 108 MHz used for FM radio. Cell-phone technology uses frequencies between 800 MHz and 3 GHz.
The RFR of 2450 MHz is used in some Wi-Fi applications and in microwave cooking. Any signal can be digitized. All of the new telecommunications technologies are digitized, and in the U.S. all TV is broadcast in 100% digital formats: digital television (DTV) and high definition television (HDTV). The old analog TV signals, primarily in the 700 MHz range, will now be recycled and relicensed for other applications and additional users, creating additional layers of ambient exposures. The intensity of RFR is generally measured and noted in the scientific literature in watts per square metre (W/m2), milliwatts per square centimetre (mW/cm2), or microwatts per square centimetre (μW/cm2). All are energy relationships that exist in space. However, biological effects depend on how much of the energy is absorbed in the body of a living organism, not just what exists in space.
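Because the literature quotes intensities in several units, a small conversion helper makes studies easier to compare (1 mW/cm2 = 10 W/m2 = 1000 μW/cm2); the wavelength function also illustrates the frequency definitions given above. This is a minimal sketch, not a calculation from the paper:

```python
# Unit helpers for the quantities defined above. Power densities:
# 1 mW/cm^2 = 10 W/m^2 = 1000 uW/cm^2 (since 1 m^2 = 10^4 cm^2).
# Wavelength follows from frequency via lambda = c / f.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def mw_cm2_to_w_m2(mw_cm2: float) -> float:
    return mw_cm2 * 10.0

def mw_cm2_to_uw_cm2(mw_cm2: float) -> float:
    return mw_cm2 * 1000.0

def wavelength_m(freq_hz: float) -> float:
    return SPEED_OF_LIGHT_M_S / freq_hz

# Examples: an intensity of 0.001 mW/cm^2 expressed in the other common
# units, and the wavelengths of two cellular-band frequencies.
print(mw_cm2_to_w_m2(0.001), "W/m^2")
print(mw_cm2_to_uw_cm2(0.001), "uW/cm^2")
print(round(wavelength_m(850e6), 3), "m at 850 MHz")
print(round(wavelength_m(1900e6), 3), "m at 1900 MHz")
```

The roughly 35 cm wavelength at 850 MHz versus 16 cm at 1900 MHz is the "longer wavelength" difference between the original analog and PCS bands discussed in Section 2.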

4. Specific absorption rate (SAR)

Absorption of RFR depends on many factors, including the transmission frequency and power density, one’s distance from the radiating source, and one’s orientation toward the radiation of the system. Other factors include the size, shape, and mineral and water content of an organism. Children absorb energy differently than adults because of differences in their anatomies and tissue composition. Children are not just “little adults.” For this reason, and because their bodies are still developing, children may be more susceptible to damage from cell phone radiation. For instance, radiation from a cell phone penetrates deeper into the head of children (Gandhi et al. 1996; Wiart et al. 2008), and certain tissues of a child’s head, e.g., the bone marrow and the eye, absorb significantly more energy than those in an adult head (Christ et al. 2010). The same can be presumed for proximity to towers, even though exposure will be lower from towers under most circumstances than from cell phones because of the distance from the source: the transmitter is placed directly against the head during cell phone use, whereas proximity to a cell tower involves an ambient exposure at a distance. There is little difference between cell phones and the domestic cordless phones used today. Both use similar frequencies and involve a transmitter placed against the head.
However, the newer digitally enhanced cordless technology (DECT) domestic phones transmit a constant signal even when the phone is not in use, unlike the older domestic cordless phones, although some DECT brands stop transmitting if the mobile units are placed in their docking station. The term used to describe the absorption of RFR in the body is specific absorption rate (SAR), the rate at which energy is absorbed per unit mass of tissue. Specific absorption rates (SARs) are generally expressed in watts per kilogram (W/kg) of tissue. The SAR measurements are averaged either over the whole body or over a small volume of tissue, typically between 1 and 10 g. The SAR is used to quantify energy absorption for fields typically between 100 kHz and 10 GHz and encompasses RFR from devices such as cellular phones up through diagnostic MRI (magnetic resonance imaging). Specific absorption rate is a more reliable determinant and index of RFR’s biological effects than power density, the intensity of the field in space, because SARs reflect what is actually being absorbed rather than the energy in space. However, while SARs may be a more precise model, at least in theory, only a handful of animal studies were used to determine the threshold values of SAR for the setting of human exposure guidelines (de Lorge and Ezell 1980; de Lorge 1984). (For further information see Section 8.) Those values are still reflected in today’s standards. It is presumed that, by controlling the field strength from the transmitting source, SARs will automatically be controlled too, but this may not be true in all cases, especially with far-field exposures such as those near cell or broadcast towers. Actual measurement of SARs is very difficult in real life, so measurements of electric and magnetic fields are used as surrogates because they are easier to assess.
In fact, it is impossible to conduct SAR measurements directly in living organisms, so all values are inferred from dead animal measurements (thermography, calorimetry, etc.), phantom models, or computer simulation, e.g., finite-difference time-domain (FDTD) methods. However, according to the Scientific Committee on Emerging and Newly Identified Health Risks (SCENIHR) Health Effects of Exposure to EMF, released in January of 2009: … recent studies of whole body plane wave exposure of both adult and children phantoms demonstrated that when children and small persons are exposed to levels which are in compliance with reference levels, exceeding the basic restrictions cannot be excluded [Dimbylow and Bloch 2007; Wang et al. 2006; Kuhn et al., 2007; Hadjem et al., 2007]. While the whole frequency range has been investigated, such effects were found in the frequency bands around 100 MHz and also around 2 GHz. For a model of a 5-year-old child it has been shown that when the phantom is exposed to electromagnetic fields at reference levels, the basic restrictions were exceeded by 40% [Conil et al., 2008]…. Moreover, a few studies demonstrated that multipath exposure can lead to higher exposure levels compared to plane wave exposure [Neubauer et al. 2006; Vermeeren et al. 2007]. It is important to realize that this issue refers to far field exposure only, for which the actual exposure levels are orders of magnitude below existing guidelines. (p. 34–35, SCENIHR 2009) In addition to average SARs, there are indications that biological effects may also depend on how energy is actually deposited in the body. Different propagation characteristics, such as modulation or different wave-forms and shapes, may have different effects on living systems. For example, the same amount of energy can be delivered to tissue continuously or in short pulses, and different biological effects may result depending on the type and duration of the exposure.
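As a worked illustration of the quantity being defined (not a dosimetry method described in this paper), SAR at a point in tissue can be estimated from the internal electric field as SAR = σE²/ρ. The tissue values below are assumptions, roughly muscle-like numbers:

```python
# Worked illustration of SAR (not a method from this paper): at a point
# in tissue, SAR = sigma * E^2 / rho, where sigma is the tissue
# conductivity (S/m), E is the RMS electric field *inside* the tissue
# (V/m), and rho is the tissue density (kg/m^3). The tissue constants
# below are assumed, roughly muscle-like values near 900 MHz.

def sar_w_kg(sigma_s_m: float, e_rms_v_m: float, rho_kg_m3: float) -> float:
    """Point SAR in W/kg from the internal RMS electric field."""
    return sigma_s_m * e_rms_v_m ** 2 / rho_kg_m3

SIGMA = 0.9   # S/m, assumed conductivity
RHO = 1000.0  # kg/m^3, assumed density

for e_field in (1.0, 10.0, 100.0):  # V/m inside the tissue
    print(f"E = {e_field:>6.1f} V/m -> SAR = {sar_w_kg(SIGMA, e_field, RHO):.4f} W/kg")
```

Because SAR scales with the square of the internal field, modest reductions in field strength produce large reductions in absorbed power, which is one reason field strength is used as the controllable surrogate described above.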

5. Transmission facilities

The intensity of RFR decreases rapidly with distance from the emitting source; therefore, exposure to RFR from transmission towers is often of low intensity, depending on one’s proximity. But intensity is not the only factor. Living near a facility will involve long-duration exposures, sometimes for years, at many hours per day. People working at home or the infirm can experience low-level 24 h exposures. Nighttime alone will create 8 h of continuous exposure. The current ICNIRP, IEEE, and NCRP standards (the latter adopted by the U.S. FCC) are for whole-body exposures averaged over a short duration (minutes) and are based on results from short-term exposure studies, not on long-term, low-level exposures such as those experienced by people living or working near transmitting facilities. For such populations, these can be involuntary exposures, unlike cell phones, where user choice is involved. There have been some recent attempts to quantify human SARs in proximity to cell towers, but these are primarily for occupational exposures in close proximity to the sources, and the questions raised were dosimetry-based, regarding the accuracy of antenna modeling (van Wyk et al. 2005). In one study by Martínez-Búrdalo et al.
(2005), however, the researchers used high-resolution human body models placed at different distances to assess SARs in worst-case exposures to three different frequencies: 900, 1800, and 2170 MHz. Their focus was to compute whole-body averaged SARs and the maximum 10 g averaged SAR inside the exposed model. They concluded that for … antenna–body distances in the near zone of the antenna, the fact that averaged field values are below reference levels, could, at certain frequencies, not guarantee guidelines compliance based on basic restrictions. (p. 4125, Martínez-Búrdalo et al. 2005) This raises questions about the basic validity of predicting SARs in real-life exposure situations, or compliance with guidelines according to standard modeling methods, at least when one is very close to an antenna. Thus, the relevant questions for the general population living or working near transmitting facilities are: Do biological and (or) health effects occur after exposure to low-intensity RFR? Do effects accumulate over time, since the exposure is of long duration and may be intermittent? What precisely is the definition of low-intensity RFR? What might its biological effects be, and what does the science tell us about such exposures?
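The rapid falloff of intensity with distance noted above can be sketched with a free-space far-field estimate, S = P/(4πr²). This ignores antenna gain pattern, downtilt, and ground reflections, so it is a rough illustration rather than a compliance calculation; the 500 W figure is an assumption in the range of the per-channel ERP mentioned in Section 3:

```python
import math

# Free-space far-field sketch of how intensity falls off with distance:
# S = P / (4 * pi * r^2), with P the radiated power in watts. This
# ignores antenna gain pattern, downtilt, and ground reflections, so it
# is a rough illustration only; 500 W is an assumed per-channel figure.

def power_density_w_m2(p_watts: float, distance_m: float) -> float:
    """Free-space power density at distance r from an isotropic source."""
    return p_watts / (4.0 * math.pi * distance_m ** 2)

def w_m2_to_uw_cm2(s_w_m2: float) -> float:
    return s_w_m2 * 100.0  # 1 W/m^2 = 100 uW/cm^2

for r in (30.0, 60.0, 150.0):  # ~100 ft, ~200 ft, ~500 ft
    s = power_density_w_m2(500.0, r)
    print(f"r = {r:>5.0f} m: {w_m2_to_uw_cm2(s):.3f} uW/cm^2")
```

Doubling the distance quarters the intensity; real ground-level exposures near towers are typically far below this main-beam estimate because the antennas direct most of their energy toward the horizon.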

6. Government radiofrequency radiation (RFR) guidelines: how spatial energy translates to the body’s absorption

The U.S. FCC has issued guidelines for both power density and SARs. For power density, the U.S. guidelines are between 0.2 and 1.0 mW/cm2. For cell phones, SAR guidelines require hand-held devices to be at or below 1.6 W/kg measured over 1.0 g of tissue. For whole-body exposures, the limit is 0.08 W/kg. In most European countries, the SAR limit for hand-held devices is 2.0 W/kg averaged over 10 g of tissue; the whole-body exposure limit is likewise 0.08 W/kg. At 100–200 ft (∼30–60 m) from a cell phone base station, a person can be exposed to a power density of 0.001 mW/cm2 (i.e., 1.0 μW/cm2). The SAR at such a distance can be 0.001 W/kg (i.e., 1.0 mW/kg). The U.S. guidelines for SARs are between 0.08 and 0.40 W/kg. For the purposes of this paper, we define low-intensity exposure as RFR at a power density of 0.001 mW/cm2 or a SAR of 0.001 W/kg.
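The studies reviewed below quote exposures interchangeably in electric field strength (V/m) and power density (μW/cm2), e.g., 10 V/m given as 26.53 μW/cm2. These equivalences follow from the plane-wave relation S = E²/Z₀, with Z₀ ≈ 377 Ω the impedance of free space; a minimal sketch of the conversion:

```python
FREE_SPACE_IMPEDANCE_OHM = 377.0  # Z0, impedance of free space (approx.)

def e_field_to_power_density_uw_cm2(e_v_per_m: float) -> float:
    """Convert a far-field E-field strength (V/m) to power density (uW/cm^2).

    Uses the plane-wave relation S = E^2 / Z0 (valid in the far field only);
    1 W/m^2 = 100 uW/cm^2.
    """
    s_w_m2 = e_v_per_m ** 2 / FREE_SPACE_IMPEDANCE_OHM
    return s_w_m2 * 100.0

# The conversions quoted in the studies reviewed in this paper:
print(e_field_to_power_density_uw_cm2(10.0))  # ~26.53 uW/cm^2 (10 V/m)
print(e_field_to_power_density_uw_cm2(1.0))   # ~0.2653 uW/cm^2 (1 V/m)
print(e_field_to_power_density_uw_cm2(0.05))  # ~0.00066 uW/cm^2 (0.05 V/m)
```

These reproduce the parenthetical values cited later for Regel et al. (2006), Furubayashi et al. (2009), and Viel et al. (2009); note the conversion is only meaningful in the far field of a source.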

7. Biological effects at low intensities

Many biological effects have been documented at very low intensities, comparable to what the population experiences within 200 to 500 ft (∼60–150 m) of a cell tower, including effects observed in studies of cell cultures and animals after exposure to low-intensity RFR. Effects reported include: genetic, growth, and reproductive effects; increases in permeability of the blood–brain barrier; behavioral effects; molecular, cellular, and metabolic changes; and increases in cancer risk. Some examples are as follows: Dutta et al. (1989) reported an increase in calcium efflux in human neuroblastoma cells after exposure to RFR at 0.005 W/kg. Calcium is an important component in normal cellular functions. Fesenko et al. (1999) reported a change in immunological functions in mice after exposure to RFR at a power density of 0.001 mW/cm2. Magras and Xenos (1997) reported a decrease in reproductive function in mice exposed to RFR at power densities of 0.000168–0.001053 mW/cm2. Forgacs et al. (2006) reported an increase in serum testosterone levels in rats exposed to GSM (global system for mobile communication)-like RFR at a SAR of 0.018–0.025 W/kg. Persson et al. (1997) reported an increase in the permeability of the blood–brain barrier in mice exposed to RFR at 0.0004–0.008 W/kg.
The blood–brain barrier is a physiological mechanism that protects the brain from toxic substances, bacteria, and viruses. Phillips et al. (1998) reported DNA damage in cells exposed to RFR at a SAR of 0.0024–0.024 W/kg. Kesari and Behari (2009) also reported an increase in DNA strand breaks in brain cells of rats after exposure to RFR at a SAR of 0.0008 W/kg. Belyaev et al. (2009) reported changes in DNA repair mechanisms after RFR exposure at a SAR of 0.0037 W/kg. A list of publications reporting biological and (or) health effects of low-intensity RFR exposure is given in Table 1. Of the 56 papers in the list, 37 provided the SAR of exposure. The average SAR at which biological effects occurred in these studies is 0.022 W/kg, a value below the current standards.

Table 1. List of studies reporting biological effects at low intensities of radiofrequency radiation (RFR).

Ten years ago, there were only about a dozen studies reporting such low-intensity effects; currently, there are more than 60. This body of work cannot be ignored. These are important findings with implications for anyone living or working near a transmitting facility. However, again, most of the studies in the list concern short-term (minutes to hours) exposure to low-intensity RFR. Long-term exposure studies are sparse. In addition, we do not know whether all of these reported effects occur in humans exposed to low-intensity RFR, or whether the reported effects are health hazards. Biological effects do not automatically mean adverse health effects, and many biological effects are reversible. However, it is clear that low-intensity RFR is not biologically inert.
Clearly, more needs to be learned before the presumption of safety that currently governs the placement of antenna arrays near the population can continue to be made.

11. Risk perception, electrohypersensitivity, and psychological factors

Others have followed up on what role risk perception might play in populations near cell base stations, to see whether it is associated with health complaints. Blettner et al. (2008) conducted a cross-sectional, multi-phase study in Germany. In the initial phase, 30 047 people out of a total of 51 444 who took part in a nationwide survey were also asked about their health and attitudes towards mobile phone base stations. A list of 38 potential health complaints was used. Of those who responded (response rate 58.6%), 18.0% were concerned about adverse health effects from base stations, and 10.3% directly attributed personal adverse effects to them. It was found that people living within 500 m, or those concerned about personal exposures, reported more health complaints than others. The authors concluded that even though a substantial proportion of the German population is concerned about such exposures, the observed higher health complaints cannot be attributed to those concerns alone. Kristiansen et al. (2009) also explored the prevalence and nature of concerns about mobile phone radiation, especially since the introduction of new 3G–UMTS (universal mobile telecommunications system) networks, which require many more towers and antennas, has sparked debate throughout Europe.
Some local governments have prohibited mobile antennas on public buildings due to concerns about cancer, especially brain cancer in children, and impaired psychomotor functions. One aim of the researchers was risk assessment: to compare people’s perceptions of risk from cell phones and masts with other fears, such as being struck by lightning. In Denmark, they used data from a 2006 telephone survey of 1004 people aged 15+ years. They found that 28% of the respondents were concerned about exposure to mobile phone radiation and 15% about radiation from masts. In contrast, 82% of respondents were concerned about other forms of environmental pollution. Nearly half of the respondents considered the mortality risk of 3G phones and masts to be of the same order of magnitude as being struck by lightning (0.1 fatalities per million people per year), while 7% thought it was equivalent to tobacco-induced lung cancer (approximately 500 fatalities per million per year). Among women, concerns about mobile phone radiation, perceived mobile phone mortality risk, and concerns about unknown consequences of new technologies increased with educational level. More than two thirds of the respondents felt that they had not received adequate public information about the 3G system. The results of the study indicated that the majority of the survey population had little concern about mobile phone radiation, while a minority was very concerned. Augner et al. (2009) examined the effects of short-term GSM base station exposure on psychological parameters, including good mood, alertness, and calmness, as measured by a standardized well-being questionnaire. Fifty-seven participants were randomly assigned to one of three different exposure scenarios. Each scenario subjected participants to five 50 min exposure sessions, with only the first four relevant for the study of psychological symptoms.
Three exposure levels were created by shielding devices, which could be installed or removed between sessions to create double-blind conditions. The overall median power densities were 0.00052 μW/cm2 during low-exposure, 0.0154 μW/cm2 during medium-exposure, and 0.2127 μW/cm2 during high-exposure sessions. Participants in the high- and medium-exposure scenarios were significantly calmer during those sessions than participants in low-exposure scenarios throughout. However, no significant differences between exposure scenarios were found in the “good mood” or “alertness” factors. The researchers concluded that short-term exposure to GSM base station signals may have an impact on well-being by reducing psychological arousal. Eltiti et al. (2007) looked into GSM and UMTS exposures from base stations and their effects on 56 participants who self-reported sensitivity to electromagnetic fields. Some call this electro-hypersensitivity (EHS) or just electrosensitivity. People with EHS report that they suffer negative health effects when exposed to electromagnetic fields from everyday objects such as cell phones, mobile phone base stations, and many other common devices in modern societies. EHS is a recognized functional impairment in Sweden. This study used both open provocation and double-blind tests to determine whether electrosensitive and control individuals experienced more negative health effects when exposed to base-station-like signals compared with sham exposures. Fifty-six electrosensitive and 120 control participants were first tested in an open provocation test. Of these, 12 electrosensitive and six control participants withdrew after the first session. Some of the electrosensitive subjects later issued a statement saying that the initial exposures made them too uncomfortable to continue participating in the study. This means that the study may have lost its most vulnerable test subjects right at the beginning, possibly skewing later outcomes.
The remainder completed a series of double-blind tests. Subjective measures of well-being and symptoms, as well as physiological measures of blood-volume pulse, heart rate, and skin conductance were obtained. They found that during the open provocation, electrosensitive individuals reported lower levels of well-being to both GSM and UMTS signals compared with sham exposure, whereas controls reported more symptoms during the UMTS exposure. During double-blind tests the GSM signal did not have any effect on either group. Electrosensitive participants did report elevated levels of arousal during the UMTS condition, but the number or severity of symptoms experienced did not increase. Physiological measures did not differ across the three exposure conditions for either group. The researchers concluded that short-term exposure to a typical GSM base-station-like signal did not affect well-being or physiological functions in electrosensitive or control individuals even though the electrosensitive individuals reported elevated levels of arousal when exposed to a UMTS signal. The researchers stated that this difference was likely due to the effect of the order of the exposures throughout the series rather than to the exposure itself. The researchers do not speculate about possible data bias when one quarter of the most sensitive test subjects dropped out at the beginning. In follow-up work, Eltiti et al. (2009) attempted to clarify some of the inconsistencies in the research with people who report sensitivity to electromagnetic fields. Such individuals, they noted, often report cognitive impairments that they believe are due to exposure to mobile phone technology. They further said that previous research in this area has revealed mixed results, with the majority of research only testing control individuals. 
Their aim was to clarify whether short-term (50 min) exposure at 1 μW/cm2 to typical GSM and UMTS base station signals affects attention, memory, and physiological endpoints in electrosensitive and control participants. Data were analyzed from 44 electrosensitive and 44 matched control participants who performed the digit symbol substitution task (DSST), digit span task (DS), and a mental arithmetic task (MA) while being exposed to GSM, UMTS, and sham signals under double-blind conditions. Overall, the researchers concluded that cognitive functioning was not affected by short-term exposure to either GSM or UMTS signals. Nor did exposure affect the physiological measurements of blood-volume pulse, heart rate, and skin conductance that were taken while participants performed the cognitive tasks. The GSM signal was a combined signal of 900 and 1800 MHz frequencies, each with a power flux density of 0.5 μW/cm2, which resulted in a combined power flux density of 1 μW/cm2 over the area where test subjects were seated. Previous measurements in 2002 by the National Radiological Protection Board in the UK, which measured power density from base stations at 17 sites and 118 locations (Mann et al. 2002), found that the power flux density was generally between 0.001 and 0.1 μW/cm2, with the highest power density being 0.83 μW/cm2. The researchers deemed the higher exposure used in this study comparable to the maximum exposure a person would encounter in the real world. But many electrosensitive individuals report that they react to much lower exposures too. Overall, the electrosensitive participants had a significantly higher level of mean skin conductance than control subjects while performing cognitive tasks. The researchers noted that this is consistent with other studies that hypothesize sensitive individuals may have a general imbalance in autonomic nervous system regulation.
Generally, cognitive functioning was not affected in either electrosensitives or controls. When Bonferroni corrections were applied to the data, the effects on mean skin conductance disappeared. A criticism is that this averaging of test results hides more subtle effects. Wallace et al. (2010) also tried to determine if short-term exposure to RFR had an impact on well-being and what role, if any, psychological factors play. Their study focused on “Airwave”, a new communication system being rolled out across the UK for police and emergency services. Some police officers have complained about skin rashes, nausea, headaches, and depression as a consequence of using Airwave two-way radio handsets. The researchers used a small group of self-reported electrosensitive people to determine if they reacted to the exposures, and to determine if exposures to specific signals affect a selection of the adult population who do not report sensitivity to electromagnetic fields. A randomized double-blind provocation study was conducted to establish whether short-term exposure to a terrestrial trunked radio (TETRA) base station signal has an impact on health and well-being in individuals with electrosensitivity and controls. Fifty-one individuals with electrosensitivity and 132 age- and gender-matched controls participated first in an open provocation test, while 48 electrosensitive and 132 control participants went on to complete double-blind tests in a fully screened semi-anechoic chamber. Heart rate, skin conductance, and blood pressure readings provided objective indices of short-term physiological response. Visual analogue scales and symptom scales provided subjective indices of well-being. Their results found no differences on any measure between TETRA and sham (no signal) under double-blind conditions for either control or electrosensitive participants and neither group could detect the presence of a TETRA signal above chance (50%). 
The researchers noted, however, that when conditions were not double-blinded, the electrosensitive individuals did report feeling worse and experienced more severe symptoms during TETRA compared with sham exposure. They concluded that the adverse symptoms experienced by electrosensitive individuals are caused by the belief of harm from TETRA base stations rather than by the low-level EMF exposure itself. It is interesting to note that the three previously mentioned studies were all conducted at the same Electromagnetics and Health Laboratory at the University of Essex, Essex, UK, by largely the same group of investigators. Those claiming to be electrosensitive are a small subgroup in the population, often in touch through Internet support groups. In the first test, many electrosensitives dropped out because they found the exposures used in the study too uncomfortable. The drop-out rate decreased in the subsequent studies, which raises the question of whether the electrosensitive participants in the latter studies were truly electrosensitive. There is a possibility that a true subgroup of electrosensitives cannot tolerate such study conditions, or that potential test subjects are networking in a way that precludes their participation in the first place. In fact, the researchers were not able to recruit their target numbers of electrosensitive participants in any of the studies. The researchers also do not state whether any of the same electrosensitive participants were used across the three studies. Nor do they offer comment on the possibility that the order of the test methods skewed results. Because of uncertainty regarding whether EMF exposures actually cause the symptoms that electrosensitives report, and since many electrosensitives also report sensitivities to myriad chemicals and other environmental factors, it has been recommended (Hansson Mild et al.
2006) that a new term be used to describe such individuals — idiopathic environmental intolerance with attribution to electromagnetic fields (IEI-EMF). Furubayashi et al. (2009) also tried to determine if people who reported symptoms to mobile phones are more susceptible than control subjects to the effect of EMF emitted from base stations. They conducted a double-blind, cross-over provocation study, sent questionnaires to 5000 women and obtained 2472 valid responses from possible candidates. From those, they were only able to recruit 11 subjects with mobile phone related symptoms (MPRS) and 43 controls. The assumption was that individuals with MPRS matched the description of electrosensitivity by the World Health Organization (WHO). There were four EMF exposure conditions, each of which lasted 30 min: (i) continuous, (ii) intermittent, (iii) sham exposure with noise, and (iv) sham exposure without noise. Subjects were exposed to EMF of 2.14 GHz, 10 V/m (26.53 μW/cm2) wideband code division multiple access (W-CDMA), in a shielded room to simulate whole-body exposure to EMF from base stations, although the exposure strength they used was higher than that commonly received from base stations. The researchers measured several psychological and cognitive parameters immediately before and after exposure, and monitored autonomic functions. Subjects were asked to report on their perception of EMF and level of discomfort during the experiment. The MPRS group did not differ from the controls in their ability to detect exposure to EMF. They did, however, consistently experience more discomfort in general, regardless of whether or not they were actually exposed to EMF, and despite the lack of significant changes in their autonomic functions. The researchers noted that others had found electrosensitive subjects to be more susceptible to stress imposed by task performance, although they did not differ from normal controls in their personality traits. 
The researchers concluded that the two groups did not differ in their responses to real or sham EMF exposure according to any psychological, cognitive, or autonomic assessment. They said they found no evidence of any causal link between hypersensitivity symptoms and exposure to EMF from base stations. However, this study had few MPRS participants. Regel et al. (2006) also investigated the influence of UMTS base-station-like signals on well-being and cognitive performance in subjects with and without self-reported sensitivity to RFR. The researchers performed a controlled exposure experiment in a randomized, double-blind crossover study, with 45 min exposures at an electric field strength of 0 V/m, 1.0 V/m (0.2653 μW/cm2), or 10.0 V/m (26.53 μW/cm2), incident with a polarization of 45° from the left-rear side of the subject, at weekly intervals. A total of 117 healthy subjects, including 33 self-reported sensitive and 84 nonsensitive subjects, participated in the study. The team assessed well-being, perceived field strength, and cognitive performance with questionnaires and cognitive tasks, and conducted statistical analyses using linear mixed models. Organ-specific and brain-tissue-specific dosimetry, including uncertainty and variation analysis, was performed. Their results showed that in both groups, well-being and perceived field strength were not associated with actual exposure levels. They observed no consistent condition-induced changes in cognitive performance except for two marginal effects. At 10 V/m (26.53 μW/cm2) they observed a slight effect on speed in one of six tasks in the sensitive subjects and an effect on accuracy in another task in the nonsensitive subjects. Both effects disappeared after multiple endpoint adjustments. They concluded that they could not confirm a short-term effect of UMTS base-station-like exposure on well-being. The reported effects on brain functioning were marginal, which they attributed to chance.
Peak spatial absorption in brain tissue was considerably smaller than during use of a mobile phone. They added that no conclusions could be drawn regarding short-term effects of cell phone exposure or the effects of long-term base-station-like exposures on human health. Siegrist et al. (2005) investigated risk perceptions associated with mobile phones, base stations, and other sources of EMFs through a telephone survey conducted in Switzerland. Participants assessed both risks and benefits associated with nine different sources of EMF. Trust in the authorities regulating these hazards was also assessed. Participants answered a set of questions related to attitudes toward EMF and toward mobile phone base stations. Their results showed that high-voltage transmission lines were perceived as the most risky source of EMF, while mobile phones and base stations received lower risk ratings. Trust in authorities was positively associated with perceived benefits and negatively associated with perceived risks. Also, people who used their mobile phones frequently perceived lower risks and higher benefits than people who used their mobile phones infrequently. People who believed they lived close to a base station did not differ significantly in their perceived level of risk associated with mobile phone base stations from people who did not believe they lived close to one. A majority of participants favored limits to exposures based on worst-case scenarios. The researchers also correlated perceived risks with other beliefs and found that belief in paranormal phenomena is related to the level of perceived risk associated with EMF. In addition, people who believed that most chemical substances cause cancer also worried more about EMF than people who did not believe that chemical substances are harmful. This study found the obvious: some people worry more about environmental factors than others, across a range of concerns. Wilen et al.
(2006) investigated the effects of exposure to mobile phone RFR on people who experience subjective symptoms when using mobile phones. Twenty subjects with MPRS were matched with 20 controls without MPRS. Each subject participated in two experimental sessions, one with true exposure and one with sham exposure, in random order. In the true exposure condition, the test subjects were exposed for 30 min to an RFR field generating a maximum SAR (1 g) in the head of 1 W/kg, delivered through an indoor base station antenna carrying signals from a 900 MHz GSM mobile phone. Physiological and cognitive parameters were measured during the experiment: heart rate and heart rate variability (HRV), respiration, local blood flow, electrodermal activity, critical flicker fusion threshold (CFFT), short-term memory, and reaction time. No significant differences related to RFR exposure conditions, and no differences in baseline data, were found between subject groups, with the exception of reaction time, which was significantly longer among the test subjects than among the controls the first time the test was performed. This difference disappeared when the test was repeated. However, the test subjects differed significantly from the controls with respect to HRV as measured in the frequency domain. The test subjects displayed a shift in the low/high frequency ratio towards sympathetic dominance in the autonomic nervous system during the CFFT and memory tests, regardless of exposure condition. The researchers interpreted this as a sign of differences in autonomic nervous system regulation between persons with MPRS and persons with no such symptoms.

12. Assessing exposures

Quantifying, qualifying, and measuring radiofrequency (RF) energy both indoors and outdoors has frustrated scientists, researchers, regulators, and citizens alike. The questions involve how best to capture actual exposure data: through epidemiology, computer estimates, self-reporting, or actual dosimetry measurements. Determining how best to do this is more important than ever, given the increasing background levels of RFR. Distance from a generating source has traditionally been used as a surrogate for probable power density, but that is imperfect at best, given how RF energy behaves once it is transmitted; complicating factors and numerous variables come into play. The wearing of personal dosimetry devices appears to be a promising approach for capturing cumulative exposure data. Neubauer et al. (2007) asked whether epidemiology studies are even possible now, given the increasing deployment of wireless technologies. They examined the methodological challenges and used experts in engineering, dosimetry, and epidemiology to critically evaluate dosimetric concepts and specific aspects of exposure assessment regarding epidemiological study outcomes. They concluded that, at least in theory, epidemiology studies near base stations are feasible but that all relevant RF sources have to be taken into account.
They called for pilot studies to validate exposure assessments and recommended that short-to-medium-term effects on health and well-being are best investigated by cohort studies. They also said that for long-term effects, groups with high exposures need to be identified first, and that for immediate effects, human laboratory studies are the preferred approach. In other words, multiple approaches are required. They did not make specific recommendations on how to quantify long-term, low-level effects on health and well-being. Radon et al. (2006) compared personal RF dosimetry measurements against recall to ascertain the reliability of self-reporting near base stations. Their aim was to test the feasibility and reliability of personal dosimetry devices. They conducted a 24 h assessment of 42 children, 57 adolescents, and 64 adults who wore a Maschek dosimeter prototype, then compared the self-reported exposures with the measurements. They also compared the readings of the Maschek prototype with those of the Antennessa DSP-090 in 40 test subjects. They found that self-reported exposures did not correlate with actual readings. The two dosimeters were in moderate agreement. Their conclusion was that personal dosimetry, or the wearing of measuring devices, is a feasible method in epidemiology studies. A study by Frei et al. (2009) also used personal dosimetry devices to examine the total exposure levels of RFR in the Swiss urban population. What they found was startling: nearly a third of the test subjects’ cumulative exposures came from cell base stations. Prior to this study, exposure from base stations was thought to be insignificant due to their low power densities and to affect only those living or working in close proximity to the infrastructure. This study showed that the general population moves in and out of these particular fields with more regularity than previously expected.
In a sample of 166 volunteers from Basel, Switzerland, who agreed to wear personal exposure meters (called exposimeters), the researchers found that nearly one third of total exposure came from base stations. Participants carried an exposimeter for 1 week (2 separate weeks for 32 participants) and also completed an activity diary. Mean values were calculated using the robust regression on order statistics (ROS) method. Mean weekly exposure to all RFR and (or) EMF sources was 0.013 μW/cm2 (range of individual means 0.0014–0.0881 μW/cm2). Exposure was mainly from mobile phone base stations (32.0%), mobile phone handsets (29.1%), and digital enhanced cordless telecommunications (DECT) phones (22.7%). People owning a DECT phone (total mean 0.015 μW/cm2) or mobile phone (0.014 μW/cm2) were exposed more than those not owning a DECT or mobile phone (0.010 μW/cm2). Mean values were highest in trains (0.116 μW/cm2), airports (0.074 μW/cm2), and tramways or buses (0.036 μW/cm2), and were higher during daytime (0.016 μW/cm2) than nighttime (0.008 μW/cm2). The Spearman correlation coefficient between mean exposure in the first and second week was 0.61. Another surprising finding of this study contradicted Neubauer et al. (2008), who found that a rough dosimetric estimate of a 24 h exposure from a base station (1–2 V/m, i.e., 0.2653–1.061 μW/cm2) corresponded to approximately 30 min of mobile phone use. Frei et al. (2009) found, using the exposimeter, that the exposure contribution from cell phone use was 200 times higher than the average base station contribution in self-selected volunteers (0.487 versus 0.002 μW/cm2). This implies that, at the belt, backpack, or in close vicinity to the body, the mean base station contribution corresponds to about 7 min of mobile phone use (24 h divided by 200), not 30 min. They concluded that exposure to RFR varied considerably between persons and locations but was fairly consistent for individuals.
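The back-of-envelope equivalence above follows directly from the reported means; a minimal check of the arithmetic, using only the figures quoted from Frei et al. (2009) and their rounded factor of 200:

```python
# Figures quoted above from Frei et al. (2009): mean personal exposure
# during phone use vs. mean base station contribution (uW/cm^2).
phone_use_uw_cm2 = 0.487
base_station_uw_cm2 = 0.002

ratio = phone_use_uw_cm2 / base_station_uw_cm2  # ~244, reported as ~200
minutes_per_day = 24 * 60

# If phone use delivers roughly 200x the base station intensity, a full
# day of base station exposure equals this many minutes of phone use:
equivalent_phone_minutes = minutes_per_day / 200
print(round(ratio), equivalent_phone_minutes)
```

The exact quotient is about 244; using the paper's rounded 200 gives 1440/200 = 7.2 min per day, hence the "about 7 min, not 30 min" in the text.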
They noted that cell phones, base stations, and cordless phones were important sources of exposure in urban Switzerland but that people could reduce their exposures by replacing their cordless domestic phones with conventional landlines at home. They determined that it was feasible to combine diary data with personal exposure measurements and that such data were useful in evaluating RFR exposure during daily living, as well as helpful in reducing exposure misclassification in future epidemiology studies. Viel et al. (2009) also used personal exposure meters (EME SPY 120 made by Satimo and ESM 140 made by Maschek) to characterize actual residential exposure from antennas. Their primary aim was to assess personal exposures, not ambient field strengths. Two hundred randomly selected people were enrolled to wear measurement meters for 24 h and asked to keep a time–location–activity diary. Two exposure metrics for each radiofrequency were then calculated: the proportion of measurements above the detection limit of 0.05 V/m (0.0006631 μW/cm2) and the maximum electric field strength. Residential addresses were geocoded and distances from each antenna were calculated. They found that much of the time-recorded field strength was below the detection level of 0.05 V/m, with the exception of the FM radio bands, which were above the detection limit 12.3% of the time. The maximum electric field was always lower than 1.5 V/m (0.5968 μW/cm2). Exposure to GSM and digital cellular system (DCS) frequencies peaked around 280 m in urban areas and 1000 m from antennas in more suburban/rural areas. A downward trend in exposures was found within a 10 km distance for FM exposures. Conversely, UMTS, TV3, TV4, and TV5 signals did not vary with distance.
The difference in peak exposures for cell frequencies was attributed to microcell antennas being more numerous in urban areas, often mounted a few meters above ground level, whereas macrocell base stations in less urban areas are placed higher (between 15 and 50 m above ground level) to cover distances of several kilometres. They concluded that despite the limiting factors and high variability of RF exposure assessments, by using sound statistical techniques they were able to determine that exposures from GSM and DCS cellular base stations actually increase with distance in the near source zone, with a maximum exposure where the main beam intersects the ground. They noted that such information should be available to local authorities and the public regarding the siting of base stations. Their findings coincide with Abdel-Rassoul et al. (2007), who found field strengths to be less in the building directly underneath antennas, with reported health complaints higher in inhabitants of the building across the street. Amoako et al. (2009) conducted a survey of RFR at public access points close to schools, hospitals, and highly populated areas in Ghana near 50 cell phone base stations. Their primary objective was to measure and analyze field strength levels. Measurements were made using an Anritsu model MS 2601A spectrum analyzer to determine the electric field level in the 900 and 1800 MHz frequency bands. Using a GPS (global positioning system), various base stations were mapped. Measurements were taken at 1.5 m above ground to maintain line of sight with the RF source. Signals were measured during the day over a 3 h period, at a distance of approximately 300 m. The results indicated that power densities for 900 MHz at public access points varied from as low as 0.000001 μW/cm2 to as high as 0.001 μW/cm2. At 1800 MHz, the variation of power densities was from 0.000001 to 0.01 μW/cm2. There are no specific RFR standards in Ghana.
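The urban–rural difference in peak exposure distance described above is consistent with simple beam geometry: for a downward-tilted main beam, the ground intersection falls at roughly antenna height divided by the tangent of the downtilt angle. The heights and tilt angles below are illustrative assumptions, not values reported by Viel et al.:

```python
import math

def main_beam_ground_distance(antenna_height_m: float, downtilt_deg: float) -> float:
    """Horizontal distance at which a downtilted main beam reaches the ground."""
    return antenna_height_m / math.tan(math.radians(downtilt_deg))

# Illustrative values only (assumed, not from the studies cited):
# a 50 m macrocell with ~3 degrees of downtilt peaks near 1 km,
print(round(main_beam_ground_distance(50, 3)))  # ~954 m
# while a low-mounted urban microcell peaks much closer in.
print(round(main_beam_ground_distance(5, 1)))   # ~286 m
```

This is only a geometric sketch; side lobes, scattering, and terrain shift the real maximum.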
These researchers determined that while their results in most sites were compliant with the ICNIRP standards, levels were still 20 times higher than values typically found in the UK, Australia, and the U.S., especially for Ghana base stations in rural areas with higher power output. They determined that there is a need to reduce RFR levels since an increase in mobile phone usage is foreseen. Clearly, predicting actual exposures based on simple distance from antennas using standardized computer formulas is inadequate. Although power density undoubtedly decreases with distance from a generating source, actual exposure metrics can be far more complex, especially in urban areas. Contributing to the complexity is the fact that the narrow vertical spread of the beam creates a low RF field strength at the ground directly below the antenna. As a person moves away or within a particular field, exposures can become complicated, creating peaks and valleys in field strength. Scattering and attenuation alter field strength in relation to building placement and architecture, and local perturbation factors can come into play. Power density levels can be 1 to 100 times lower inside a building, depending on construction materials, and exposures can differ greatly within a building, depending on numerous factors such as orientation toward the generating source and the presence of conductive materials. Exposures can be twice as high on upper floors as on lower floors, as found by Anglesio et al. (2001). However, although distance from a transmitting source has been shown to be an unreliable determinant for accurate exposure predictions, it is nevertheless useful in some general ways. For instance, it has been shown that radiation levels from a tower with 15 nonbroadcast radio systems will fall off to hypothetical natural background levels at approximately 1500 ft (∼500 m) (Rinebold 2001).
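The general fall-off with distance can be illustrated with the free-space spreading relation S = EIRP/(4πd²), which ignores downtilt, scattering, and building attenuation. The EIRP figure below is an assumed illustration, not a value from Rinebold (2001):

```python
import math

def free_space_power_density(eirp_watts: float, distance_m: float) -> float:
    """Free-space power density (uW/cm^2) at a given distance, isotropic source."""
    s_w_per_m2 = eirp_watts / (4 * math.pi * distance_m ** 2)
    return s_w_per_m2 * 100.0  # 1 W/m^2 = 100 uW/cm^2

# Assumed 1000 W combined EIRP for a multi-tenant tower (illustrative only):
for d in (50, 150, 500):
    print(d, free_space_power_density(1000, d))
# Density drops 100-fold between 50 m and 500 m (inverse-square law).
```

Real ground-level exposure near a tower deviates strongly from this idealization, as the studies above show, but the quadratic decline dominates at larger distances.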
This would be in general agreement with the lessening of symptoms in people living near cell towers at a distance over 1000 ft (∼300 m) found by Santini et al. (2002). The previously mentioned studies indicate that accuracy in both test design and personal dosimetry measurements is possible in spite of the complexities and that a general safer distance from a cell tower for residences, schools, daycare centers, hospitals, and nursing homes might be ascertained.

13. Discussion

Numerous biological effects do occur after short-term exposures to low-intensity RFR, but potential hazardous health effects from such exposures on humans are still not well established, despite increasing evidence as demonstrated throughout this paper. Unfortunately, not enough is known about biological effects from long-term exposures, especially as the effects of long-term exposure can be quite different from those of short-term exposure. It is the long-term, low-intensity exposures that are most common today, and they are increasing significantly from myriad wireless products and services. People near cell towers and in proximity to other RFR-generating sources, including consumer products such as wireless computer routers and Wi-Fi systems, are reporting symptoms that appear to be classic “microwave sickness syndrome,” also known as “radiofrequency radiation sickness.” First identified in the 1950s by Soviet medical researchers, symptoms included headache, fatigue, ocular dysfunction, dizziness, and sleep disorders. In Soviet medicine, clinical manifestations include dermographism, tumors, blood changes, reproductive and cardiovascular abnormalities, depression, irritability, and memory impairment, among others. The Soviet researchers noted that the syndrome is reversible in early stages but is considered lethal over time (Tolgskaya et al. 1973).
Johnson-Liakouris (1998) noted there are both occupational studies conducted between 1953 and 1991 and clinical cases of acute exposure between 1975 and 1993 that offer substantive verification for the syndrome. Yet, U.S. regulatory agencies and standards-setting groups continue to quibble about the existence of microwave sickness because it does not fit neatly into engineering models for power density, even as studies are finding that cell towers are creating the same health complaints in the population. It should be noted that before cellular telecommunications technology, no such infrastructure exposures between 800 MHz and 2 GHz existed this close to so many people. Microwave ovens are the primary consumer product utilizing a high RF intensity, but their use is for very brief periods of time and ovens are shielded to prevent leakage above 1000 μW/cm2, the current FDA standard. In some cases, following the U.S. Telecommunications Act of 1996 preemption of local health considerations in infrastructure siting, antennas have been mounted within mere feet of dwellings. And, on buildings with roof-mounted arrays, exposures can be lateral, reaching the top floors of adjacent buildings at close range. It makes little sense to keep denying health symptoms that are being reported in good faith. Though the prevalence of such exposures is relatively new to a widespread population, we nevertheless have a 50-year observation period to draw from. The primary questions now involve specific exposure parameters, not the reality of the complaints or attempts to attribute such complaints to psychosomatic causes, malingering, or beliefs in paranormal phenomena. That line of argument is insulting to regulators, citizens, and their physicians. Serious mitigation efforts are overdue. There is early Russian and U.S.
documentation of long-term, very low-level exposures causing microwave sickness as contained in The Johns Hopkins Foreign Service Health Status Study done in 1978 (Lilienfield et al. 1978; United States Senate 1979). This study contains both clinical information and clear exposure parameters. Called the Lilienfield study, it was conducted between 1953 and 1976 to determine what, if any, effects there had been to personnel in the U.S. Embassy in Moscow after it was discovered that the Soviet government had been systematically irradiating the U.S. government compound there. The symptoms reported were not due to any known tissue heating properties. The power densities were not only very low but the propagation characteristics were remarkably similar to what we have today with cell phone base stations. Lilienfield recorded exposures for continuous-wave, broadband, modulated RFR in the frequency ranges between 0.6 and 9.5 GHz. The exposures were long-term and low-level at 6 to 8 h per day, 5 days per week, with the average length of exposure time per individual between 2 and 4 years. Modulation information contained phase, amplitude, and pulse variations with modulated signals being transmitted for 48 h or less at a time. Radiofrequency power density was between 2 and 28 μW/cm2, levels comparable to recent studies cited in this paper. The symptoms that Lilienfield found included four that fit the Soviet description for dermographism: eczema, psoriasis, and allergic and inflammatory reactions. Also found were neurological problems with diseases of peripheral nerves and ganglia in males; reproductive problems in females during pregnancy, childbearing, and the period immediately after delivery (puerperium); tumor increases (malignant in females, benign in males); hematological alterations; and effects on mood and well-being including irritability, depression, loss of appetite and concentration, and eye problems.
This description of symptoms in the early literature is nearly identical to the Santini, Abdel-Rassoul, and Navarro studies cited earlier, as well as the current (though still anecdotal) reports in communities where broadcast facilities have switched from analog to digital signals at power intensities that are remarkably similar. In addition, the symptoms in the older literature are also quite similar to complaints in people with EHS. Such reports of adverse effects on well-being are occurring worldwide near cell infrastructure and this does not appear to be related to emotional perceptions of risk. Similar symptoms have also been recorded at varying distances from broadcast towers. It is clear that something else is going on in populations exposed to low-level RFR that computer-generated RFR propagation models and obsolete exposure standards, which protect only against acute exposures, fail to capture. With the increase in so many RFR-emitting devices today, as well as the many in the wings that will dramatically increase total exposures to the population from infrastructure alone, it may be time to approach this from a completely different perspective. It might be more realistic to consider ambient outdoor and indoor RFR exposures in the same way we consider other environmental hazards such as chemicals from building materials that cause sick building syndrome. In considering public health, we should concentrate on aggregate exposures from multiple sources, rather than continuing to focus on individual source points like cell and broadcast base stations. In addition, categorically excluded technologies such as Wi-Fi, Wi-Max, smart grids, and smart metering must be included, as these can greatly increase ambient radiation levels. Only in that way will low-level electromagnetic energy exposure be understood as the broad environmental factor it is.
Radiofrequency radiation is a form of energetic air pollution and it should be controlled as such. Our current predilection for addressing one product or service at a time does not encompass what we already know beyond reasonable doubt. Only when aggregate exposures are better understood by consumers will disproportionate resistance to base station siting bring more intelligent debate into the public arena and help create safer infrastructure. That can also benefit the industries trying to satisfy customers who want such services. Safety to populations living or working near communications infrastructure has not been given the kind of attention it deserves. Aggregate ambient outdoor and indoor exposures should be emphasized by summing up levels from different generating source points in the vicinity. Radiofrequency radiation should be treated and regulated like radon and toxic chemicals, as aggregate exposures, with appropriate recommendations made to the public including for consumer products that may produce significant RFR levels indoors. When indoor consumer products such as wireless routers, cordless/DECT phones, leaking microwave ovens, wireless speakers, and (or) security systems are factored in with nearby outdoor transmission infrastructure, indoor levels may rise to exposures that are unsafe. The contradictions in the studies should not be used to paralyze movement toward safer regulation of consumer products, new infrastructure creation, or better tower siting. Enough good science exists regarding long-term low-level exposures, the most prevalent today, to warrant caution. The present U.S. guidelines for RFR exposure are not up to date. The most recent IEEE and NCRP guidelines used by the U.S. FCC have not taken many pertinent recent studies into consideration because, they argue, the results of many of those studies have not been replicated and thus are not valid for standards setting. That is a specious argument.
It implies that someone tried to replicate certain works but failed to do so, indicating the studies in question are unreliable. However, in most cases, no one has tried to exactly replicate the works at all. It must be pointed out that the 4 W/kg SAR threshold, based on the de Lorge studies, has also not been replicated independently. In addition, effects of long-term exposure, modulation, and other propagation characteristics are not considered. Therefore, the current guidelines are questionable in protecting the public from possible harmful effects of RFR exposure and the U.S. FCC should take steps to update its regulations by taking all recent research into consideration without waiting for replication that may never come because of the scarcity of research funding. The ICNIRP standards are more lenient in key exposures to the population than current U.S. FCC regulations. The U.S. standards should not be “harmonized” toward more lenient allowances. The ICNIRP should become more protective instead. All standards should be biologically based, not dosimetry based as is the case today. Exposure of the general population to RFR from wireless communication devices and transmission towers should be kept to a minimum and should follow the “As Low As Reasonably Achievable” (ALARA) principle. Some scientists, organizations, and local governments recommend exposure levels so low that many wireless industries claim they cannot function without many more antennas in a given area. However, a denser infrastructure may be impossible to attain because of citizen unwillingness to live in proximity to so many antennas. In general, the lowest regulatory standards currently in place aim to accomplish a maximum exposure of 0.02 V/m, equal to a power density of 0.0001 μW/cm2, which is in line with Salzburg, Austria’s indoor exposure value for GSM cell base stations.
Other precautionary target levels aim for an outdoor cumulative exposure of 0.1 μW/cm2 for pulsed RF exposures where they affect the general population and an indoor exposure as low as 0.01 μW/cm2 (Sage and Carpenter 2009). In 2007, The BioInitiative Report, A rationale for a biologically based public exposure standard for electromagnetic fields (ELF and RF), also made this recommendation, based on the precautionary principle (Bioinitiative Report 2007). Citizens and municipalities often ask for firm setbacks from towers to guarantee safety. There are many variables involved with safer tower siting, such as how many providers are co-located, at what frequencies they operate, the tower’s height, surrounding topographical characteristics, the presence of metal objects, and others. Hard and fast setbacks are difficult to recommend in all circumstances. Deployment of base stations should be kept as efficient as possible to avoid exposure of the public to unnecessarily high levels of RFR. As a general guideline, cell base stations should not be located less than 1500 ft (∼500 m) from the population, and should be at a height of about 150 ft (∼50 m). Several of the papers previously cited indicate that symptoms lessen at that distance, despite the many variables involved. However, with new technologies now being added to cell towers such as Wi-Max networks, which add significantly more power density to the environment, setback recommendations offer unpredictable reassurance at best. New technology should be developed to reduce the energy required for effective wireless communication. In addition, regular RFR monitoring of base stations should be considered. Some communities require that ambient background levels be measured at specific distances from proposed tower sites before, and after, towers go online to establish baseline data in case adverse effects in the population are later reported.
The establishment of such baselines would help epidemiologists determine what changed in the environment at a specific point in time and help better assess if RFR played a role in health effects. Unfortunately, with so much background RFR today, it is almost impossible to find a clean RFR environment. Pretesting may have become impossible in many places. This will certainly be the case when smart grid technologies create a whole new blanket of low-level RFR, with millions of new transceivers attached to people’s homes and appliances, working off centralized RFR hubs in every neighborhood. That one technology alone has the ability to permanently negate certain baseline data points. The increasing popularity of wireless technologies makes understanding actual environmental exposures more critical with each passing day. This also includes any potential effects on wildlife. There is a new environmental concept taking form, that of “air as habitat” (Manville 2007) for species such as birds, bats, and insects, in the same way that water is considered habitat for marine life. Until now, air has been considered something “used” but not necessarily “lived in” or critical to the survival of species. However, when air is considered habitat, RFR is among the potential pollutants with an ability to adversely affect other species. It is a new area of inquiry deserving of immediate funding and research.


References