Cellphone Industry Jargon

  • EMF = ElectroMagnetic Force/Field/Frequency. These terms cover the whole spectrum from Extremely Low Frequency (ELF) emissions in the 50/60 Hz mains-power range, through conventional radio signals in the kilohertz and megahertz ranges, to microwaves in the gigahertz range -- and on to infrared, visible light, ultraviolet, X-rays, gamma rays, etc.
        EMF signals within this spectrum fundamentally differ only in the frequency of the wave oscillation -- but simply by their higher frequencies microwaves pack more punch: they carry more energy by virtue of the faster vibrations.
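The "more punch" point can be made concrete with Planck's relation E = h·f: the energy of each quantum rises in direct proportion to frequency. A minimal sketch -- the frequencies chosen are illustrative, not measurements:

```python
# Photon energy rises linearly with frequency: E = h * f
PLANCK_H = 6.626e-34  # Planck's constant, in joule-seconds

def photon_energy_joules(frequency_hz: float) -> float:
    """Energy carried by one quantum at the given frequency."""
    return PLANCK_H * frequency_hz

mains = photon_energy_joules(50)      # ELF mains power
cell = photon_energy_joules(900e6)    # ~900 MHz cellular band (illustrative)
uv = photon_energy_joules(1e15)       # near-ultraviolet light

# A 900 MHz quantum carries 18 million times the energy of a 50 Hz one
print(f"{cell / mains:.1e}")
```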

  • EMC = Electro-Magnetic Compatibility. How much interference does it produce with nearby electrical or computerized equipment?

  • ELF = Extremely Low Frequency. Waves emitted by mains power lines oscillate at either 50Hz (Europe, Asia, etc.) or 60Hz (USA/Canada; Japan uses both). While these are part of the EMF spectrum, they have wavelengths of thousands of kilometres and operate entirely in the 'near field', which gives them different characteristics from cellphone signals. The potential health effects of high-tension powerlines are a quite different problem from that of cellphones.
        The electrical power industry had its own Edison Electric Institute and Electric Power Research Institute, which matched the CTIA and its WTR, and the Tobacco Institute and its Council for Tobacco Research. They all learned from each other and supported each other.
        Questions as to the safety of living under high-tension powerlines and sleeping on electric blankets paralleled those of cellphones, but were quite distinct -- although many researchers took an interest in both.

  • VHF = Very High Frequency -- the official name for the band within the radio spectrum used by television, FM radio, etc. The older cellular phones actually operated just above this band, in the UHF (Ultra High Frequency) range.

  • RF = Radio Frequency. This is the portion of the spectrum from just above the Extremely Low Frequency waves emitted by mains power lines, through the normal radio broadcasting bands (say, above 10,000Hz), to FM radio, television and the microwave bands. RF stops at infrared, just below visible light. The term is often used vaguely so as to exclude microwaves.

  • Non-ionising Radiation: It has long been assumed that radio and light waves below the frequency of ultra-violet lack enough energy to damage or influence biological cells ... except by over-heating them. This is called the "thermal assumption", and it has always been the basis for setting exposure standards.
        Radio engineers apparently have never bothered to question how humans can see using non-ionising light waves if these can't affect biological tissues -- or how sunlight affects our biological clocks and re-adjusts them after jet-lag -- or how photosynthesis works in cold climates.

  • Wavelength: This is nothing more than an inverse calculation of the radio frequency (the speed of light divided by the frequency) -- but it represents the actual distance between the peaks of successive waves.
        Radio and broadcast television operate at frequencies low enough to produce wavelengths of hundreds of metres to hundreds of kilometres (which allow them to curve around objects in their path); while the wavelengths of cellphone signals are measured in centimetres (roughly the size of the handset). These waves don't curve around objects, but they do reflect easily, which is why you can sometimes get signals without a clear direct line back to the base station.
        These higher frequencies mean that the brain and other organs in the body roughly match the cellphone's wavelength -- and this is thought to be highly significant because of 'resonance'. Objects absorb more energy when their size is a simple fraction or multiple (half, twice, one-third) of the wavelength.
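The inverse calculation above is simply the speed of light divided by the frequency. A minimal sketch, with illustrative frequencies:

```python
SPEED_OF_LIGHT = 3.0e8  # metres per second (rounded)

def wavelength_m(frequency_hz: float) -> float:
    """Distance between successive wave peaks: lambda = c / f."""
    return SPEED_OF_LIGHT / frequency_hz

print(wavelength_m(50))      # mains power: 6,000,000 m (~6,000 km)
print(wavelength_m(100e6))   # FM radio: 3.0 m
print(wavelength_m(1800e6))  # 1800 MHz cellphone band: ~0.17 m (17 cm)
```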

  • Near-fields/Far Fields: Within one or more 'wavelengths' of the source, radio waves act as separate electrical and magnetic fields (the "near field"). They are only considered a 'radiation' entity at greater distances -- usually taken to be two to three wavelengths (the "far field"). So cellphone handsets are a 'near-field' problem against the side of the head, while base-stations are at 'far field' distances.
        In the near field the tissues are dealing with the separate components of both the electrical field and the magnetic field (at right-angles to each other).
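Using the rule of thumb above (far field beginning at roughly two to three wavelengths), the boundary distance is easy to estimate. A minimal sketch; the three-wavelength cutoff follows the text, not any formal standard:

```python
SPEED_OF_LIGHT = 3.0e8  # metres per second (rounded)

def far_field_boundary_m(frequency_hz: float, wavelengths: float = 3) -> float:
    """Distance beyond which the separate E and M fields merge into 'radiation'."""
    return wavelengths * SPEED_OF_LIGHT / frequency_hz

# At 900 MHz the far field starts about a metre out -- so a handset held
# ~2 cm from the ear is well inside the near field, while a base-station
# hundreds of metres away is squarely far-field.
print(far_field_boundary_m(900e6))  # 1.0
```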

  • Dosimetry: The measurement of the ways in which radio waves are absorbed by different parts of the body.

  • Phantoms: Dosimetry measurements use jello-filled mock heads and torsos with implanted sensors. These are known as phantoms.

  • Power-density: This is the radio equivalent of surface-brightness in measuring light. It is the output power (measured in watts) spread over a certain area (in square metres) = W/sq m.
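For an omnidirectional source in the far field, the output power spreads over the surface of a sphere, so the density at distance r is P / (4πr²). A rough sketch -- the 0.25 W transmitter figure is an illustrative assumption, and the spherical-spreading formula is only strictly valid in the far field:

```python
import math

def power_density_w_per_m2(output_power_w: float, distance_m: float) -> float:
    """Output power spread over a sphere's surface: W / (4 * pi * r^2)."""
    return output_power_w / (4 * math.pi * distance_m ** 2)

# Illustrative: a 0.25 W transmitter measured 1 metre away
print(power_density_w_per_m2(0.25, 1.0))  # ~0.02 W/sq m
```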

  • Specific Absorption Rate (SAR): This is a measure of the radio energy absorbed per kilogram of tissue ... in practice derived from the temperature change in the tissue of organs when subjected to radio waves. It is thought to be relevant as a measure of the 'impact' of RF on biological tissue. Signal power which is not absorbed is assumed to simply pass through.
        However it is important to remember that the power-output of a cellphone in use at any one time is set by the remote base-station it is utilizing. Therefore the SAR published as a measure of supposed 'cellphone safety' is a quite useless figure, since all handsets will be turned up to the same power if used in the same location.

  • Omnidirectional antenna: Cellphones are designed to radiate their transmitted signals equally in all directions. If they weren't, you'd need to constantly swivel your head to track various base-stations and signal-reflections as you walked down a street. Base-stations, by contrast, deliberately beam their signal into an area, with specially designed reflecting antennae which attempt to spread the signal as evenly as possible across its transmission range.

  • Shielding devices: If you put an obstacle in the way of cellphone radiation (to block head absorption) then most of the time the cellphone will only be able to talk to a distant base-station in a direction away from your body, and you'll not be able to move your head without losing the signal.
        This means that the cellphone will need to rack up to full output power, and it will constantly drop out if you move -- completely defeating the whole exercise.
        Fortunately, despite their price, most shielding devices and directional antennae don't work, so the user still has a workable cellphone.

  • Inverse-square law: A fundamental principle of all radiating devices. If you double the distance from a transmitter (omnidirectional or directional), the more distant receiver experiences one-quarter the power-density of the closer. At three times the distance, the power-density will be one-ninth. Four times = 1/16th. One thousand times = one millionth.
        So a small battery-powered cellphone can impose a signal on the ear at a power-density which is millions of times higher than that of the high-powered cellular base-station a few hundred metres away. It may also be billions of times higher than that of TV transmitters on high towers at suburban distances. [Most radio signals we use are extraordinarily low in power-density.]
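The arithmetic in the entry above can be sketched directly:

```python
def relative_power_density(distance_ratio: float) -> float:
    """Inverse-square law: power-density falls with the square of the distance."""
    return 1.0 / distance_ratio ** 2

print(relative_power_density(2))     # double the distance -> 0.25 (one-quarter)
print(relative_power_density(3))     # triple -> ~0.111 (one-ninth)
print(relative_power_density(1000))  # 1000x the distance -> one millionth
```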

  • Ear pieces: To get the transmitter away from the head while retaining full functionality, people use ear-piece extensions equipped with a microphone. Even a difference of half a metre will reduce the power-density at the side of the head by a very significant amount.
        Be aware that the cable can conduct radio-waves back to the earpiece (to a limited extent) and this 'solution' often just shifts the concentration of the transmitted signal from the brain to the kidneys or testicles.
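The inverse-square law gives a rough sense of the earpiece gain. Assuming (illustratively) that the handset sits ~2 cm from the head when held to the ear and ~50 cm away when used with an earpiece -- and ignoring the near-field complications this simple ratio glosses over:

```python
def reduction_factor(near_m: float, far_m: float) -> float:
    """Approximate inverse-square reduction from moving the transmitter away."""
    return (far_m / near_m) ** 2

# From 2 cm (at the ear) to 50 cm (earpiece): roughly a 625-fold reduction
# in power-density at the head.  Distances are illustrative assumptions.
print(reduction_factor(0.02, 0.50))
```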

Cellular wireless types

AMPS: The original cellphones in use in 1993 were usually of the standard known as the Advanced Mobile Phone System, or AMPS. These were little more than refined walkie-talkies, with some computer controls back at the base-station which allowed signals to be switched around from one base to another, and for control to be exerted over the transmitting power of each handset so that the signal arriving from a close-by handset didn't flood out those from distant handsets.
    AMPS phones operated at fairly high signal levels, mainly because in the early days base-stations were few and widely spaced, and therefore the transmission range needed was much greater (remember the inverse-square law also).
    There was also a version of AMPS used in Europe known as TACS (Total Access Communication System). It was essentially the same.

TDMA: The first of the digital cellular phones used techniques known as Time Division Multiple Access. Some were promoted under the name D-TDMA, but the majority were of a variation known as GSM (originally Groupe Spécial Mobile, from the French). There were some other variations, such as DECT, still used in low-power wireless home handsets.
    The European digital cellphone standard GSM specified both the radio signals and controls, and also how these signals would be carried across the Continent by landlines between locations or countries -- and how simple data could also be carried.
    The term Time-Division means that a number of users would be sharing the same radio channel, with each being allocated a fraction of the time to send out a highly compressed digital version of the voice. So the handsets of GSM were very much like disco strobe-lights blinking 217 times a second -- and it was this pulsing of power which produced radio interference in many nearby electrical devices such as pacemakers and hospital equipment.
    The 217Hz pulses were also suspected of playing a part in generating problems like headaches and facial numbness, etc. ... and possibly in DNA breaks, and therefore (maybe) in brain cancer. Obviously fears were exaggerated, but not without reason.
    It became known that the European developers of GSM and the US developers of D-TDMA had known about these problems before 1990 and had simply ignored them when they couldn't be easily fixed. Nor had they bothered to look for possible biological effects until forced to do so in 1994. This was the problem that the CTIA tried to fix with its WTR operation -- which was more PR than genuine research.
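The 217 pulses a second mentioned above come straight from the GSM frame structure: each frame lasts a nominal 4.615 milliseconds and is divided into eight time-slots, and a handset transmits in only one slot per frame -- so it fires one burst per frame. A minimal sketch of that arithmetic:

```python
GSM_FRAME_SECONDS = 4.615e-3  # nominal GSM TDMA frame duration (8 time-slots)

# One transmit burst per frame gives the pulse rate:
pulse_rate_hz = 1 / GSM_FRAME_SECONDS
print(round(pulse_rate_hz))  # 217

# The handset transmits only 1/8th of the time (one slot in eight),
# which is why its average power sits well below its peak burst power.
duty_cycle = 1 / 8
```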

CDMA: A US company called Qualcomm developed an entirely new transmission scheme for cellphones known as Code Division Multiple Access. This signal didn't have the pulsed power variations, and it relied on more sophisticated computer coding techniques to separate out individual conversation channels from a multi-user radio band.
    It also worked well alongside the old AMPS since it was able to share the channels, so many dual-mode CDMA/AMPS handsets were sold and worked extremely well -- with CDMA being used in the cities, and AMPS out in the country. CDMA very largely solved the interference, battery-power, and user-capacity problems.

W-CDMA: Eventually the world changed over to a Wideband version of CDMA (with the GSM landline protocols) which allows higher data-rates, SMS messaging, etc. This is the system which is now virtually universal. It doesn't pulse its signals, and it tends to operate at much lower powers than any system before -- although it is now being used for much longer times every day.
    W-CDMA may have solved all of the potential health problems, but we won't know for many years because there is virtually no cellphone money supporting research.

Wi-Fi: The fears about Wi-Fi are probably a legacy of the cellphone industry's lack of genuine research. Most Wi-Fi transmissions are well away from the body, and the power required to reach a 'base' is only a tiny fraction of that used by cellphones simply because of the distance.

Cordless telephones: Some of these are of the DECT (TDMA) standard and some CDMA. In both cases the distance between the handset and the base is generally so small compared with a cellphone's range to its base-station that the transmission power needed is relatively insignificant.
    This can only be our best-guess because the research hasn't been done.