ARCHIVED MATERIAL
Usually material that has since disappeared from the web

The Olive comet-assay fiasco

This is a peer-reviewer's report on the drafts of two papers prepared by a Motorola-funded group at Washington University in St Louis. They had been commissioned to check the quality of the Lai-Singh finding of DNA breaks in rat-brain cells after the rats had been exposed to low-intensity (cellphone-level) microwaves for a two-hour period.

The two papers were essentially very similar, except that in the first paper the cells were exposed to a transmitter at the same frequency (2,450 MHz) as that used by Lai and Singh, while in the second, the cells were exposed to both analog AMPS-type radiation and Code Division Multiple Access (CDMA) radiation in the 800-900 MHz band.


History of the dispute:
Drs Henry Lai and Narendra Singh at the University of Washington in Seattle had reported DNA breaks in the brain cells of rats after relatively brief exposure to low-intensity microwaves at 2450 MHz. They had identified and measured the DNA breaks by using a form of high-sensitivity assaying (testing) called 'single-cell alkaline electrophoresis.' This was a form of 'comet assay' which had been developed by Narendra Singh.

With Motorola funding, the St Louis group had first tried to duplicate the Singh research techniques and failed for a number of reasons. So they decided to use the more automated technique developed for mass-screening by Peggy Olive instead. The alternate Olive comet-assay technique was widely thought to be noticeably inferior to the technique developed by Singh for specialized laboratory use.

Prof. Joseph Roti-Roti's team at Washington University in St Louis [Malyapa et al] also changed the research protocols used by Lai-Singh:

  • they exposed a cultured cell-line in vitro (in petri dishes), rather than irradiating live rats (in vivo) and then dissecting out brain tissue to test.
  • they also used different lysis chemicals (for nucleus separation)
  • there were a number of other minor differences in their use of other chemicals, temperatures and timing of procedures.

By making these changes they completely defeated the purpose of 'replication', and simply did a vaguely similar study in a rather dubious way, since it was known that even the more sensitive Singh technique was operating at the limit of its ability to detect DNA breaks from such low-intensity radiation.

For this reason, their so-called 'replication' study was treated with scorn by many scientists at the BioElectroMagnetics Society conference in 1996. The term 'replication' carries with it the implication that a study is testing the integrity and competence of the original study team, and the St Louis group had failed in this completely. An alternate explanation was that they had failed because of their own incompetence.


Comet Assays:
    There is nothing really difficult about the comet assay technique as such, but, in the case of possible cellphone-radiation damage to brain cells, researchers are clearly working close to the limits of the assay. To maintain maximum sensitivity, Singh's technique required strict adherence to a well-tested and standardised laboratory environment, and a highly experienced assayer to examine the cells one by one using a high-powered measuring microscope.

Narendra Singh is well known to be obsessive about detail, and he is the acknowledged world expert in comet assays. He works in research-laboratory conditions with highly standardised chemicals and other controls for breaking down the cell membrane to get access to the nucleus and allow the chromosomes to unravel. His version of the comet assay uses special enzymes to break what are known as DNA crosslinks, but mainly it requires scrupulous attention to detail.

Roti-Roti's St Louis laboratory had only a few of these essential requirements, and they didn't break the crosslinks.
    While never admitting that this was not a true replication, Motorola funded this two-part study (discussed below). The idea was
  1) to check the Lai-Singh DNA strand-break findings without actually replicating the study [and so possibly confirming the original result, which would have been disastrous for them], and
  2) to expose their cells, still using the Olive comet assay and petri dishes, to different cellphone radiation types (AMPS and CDMA).

The main criticism of this dual attempt came from the fact that the St Louis team still didn't understand the essential elements of the comet-assay work, and, in their own draft paper, they demonstrated their incompetence. This was pointed out to John Moulder, the editor of Radiation Research, by Peggy Olive herself in the document below. See also the Microwave News reporting of February 1998.

However, since the study supported the industry's side in this fierce faction-fight between cellphone researchers and cellphone companies, Moulder went ahead and published the study anyway. Later, Motorola also funded the same team to establish their claim that the Olive comet-assay technique was as sensitive as the Singh method, thus attempting to restore the credibility of the first two studies.

This raises a series of questions about scientific integrity.



Explanatory:

Motorola gave Joseph Roti-Roti's team at Washington University in St Louis $4 million over a short period of time to do a series of studies... including these near-replications of Lai-Singh, which represented the most significant threat to industry claims of 'no possible harm' from cellphone radiation.

After Motorola provided the funding, Singh had flown down to show Roti-Roti's team how to duplicate his test procedures, but they found the process time-consuming, tedious and finicky, and decided to use the Olive computerised 'comet-assay' technique instead. This had been developed by Peggy Olive and her husband as a way of mass-screening for breast cancer, and it relied on computerised image-analysis of batches of cells. In essence, it dealt with calculating the average blurring of the images of cell nuclei, since this 'comet-tail' blurring was produced by broken DNA strands.

The rival techniques each had advantages and disadvantages: the Olive technique removed human judgement (and therefore personal bias) from the measurement of strands, but with an associated loss of sensitivity when attempting to detect low-level effects.

Singh relied on seeing the nucleus in his epifluorescence microscope and measuring the length of DNA particles in the comet-tails. But his laboratory technique relied on personal inspection and human categorisation and calibration, and this opened Lai-Singh to the claim that their judgement may have been colored by a human inclination to find what one wants to find. [To overcome this, Lai and Singh used a blinding technique in handling exposed and control slides.]
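
To make the blinding step concrete, the following is a minimal sketch of the general approach (my own illustration, not the actual Lai-Singh procedure; the slide names and code format are invented), written in Python:

    import random

    def blind_slides(slides):
        # slides: dict mapping a slide identifier to 'exposed' or 'control'.
        # Returns the coded scoring order plus a key that stays sealed until
        # all comet-tail measurements have been recorded.
        ids = list(slides)
        random.shuffle(ids)
        codes = {slide_id: f"S{i:03d}" for i, slide_id in enumerate(ids)}
        key = {codes[slide_id]: group for slide_id, group in slides.items()}
        return sorted(codes.values()), key

    scoring_order, key = blind_slides({"rat1_a": "exposed", "rat1_b": "exposed",
                                       "rat2_a": "control", "rat2_b": "control"})
    # The assayer scores the slides in 'scoring_order' without seeing 'key';
    # the key is consulted only when the measurements are analysed statistically.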




Types of testing:
    There are three different types of testing being done here:

  • Sensitivity testing: The laboratory needs first of all to calculate the sensitivity of its assay system using:
    • a standard culture of cells which have been controlled so that their phases are synchronous.
    • exposure of these cells to progressively lower doses from a known isotope source (ionising gamma radiation) until the limit of detection is reached
  • Normal cell testing: They are now testing normal cells, either from the brains of the rats or those grown in petri dish cultures:
    • the cells are not in synchronisation (they will be randomly distributed through all phases of the mitosis cycle)
    • the exposure comes from the radiation source being researched (cellphones)
  • Positive control testing: This confirms the laboratory's assay quality is maintained during the course of the research. They will use
    • the same cells as in the Normal cell testing — but from unexposed 'control' animals,
    • they are exposed to a known source of isotope ionising radiation, as used in the sensitivity testing.
    • the regularity of the positive tests, and the comparison between the positive results and the sensitivity results provides a basis for confirming that standards have been maintained.

    Obviously, Sensitivity testing is done using optimal techniques to establish a base-line — the best that could possibly be expected. Positive control tests are done under the same conditions, except for the fact that the cells are not synchronised, so the sensitivity achievable will be far less than optimal. Positive controls are important because they provide assurance that all the other factors (chemicals, rest-periods, times, temperatures, voltages, etc) have been maintained at optimal levels during the normal cell testing.
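
As a rough illustration of this logic, here is a toy numerical sketch (invented numbers throughout; the dose-response slope, cell-to-cell scatter and choice of statistical test are assumptions, not values taken from any of the studies discussed here). It simply steps through a series of gamma doses and reports the lowest one at which exposed cells can be distinguished from unexposed controls:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def tail_lengths(n_cells, dose_cgy, noise_sd, baseline=20.0, slope=0.8):
        # Simulated per-cell comet-tail lengths (arbitrary units): a baseline plus
        # a hypothetical linear dose response plus random cell-to-cell scatter.
        return baseline + slope * dose_cgy + rng.normal(0.0, noise_sd, n_cells)

    def detection_limit(doses_cgy, noise_sd, n_cells=100, alpha=0.05):
        # Lowest dose at which exposed cells differ significantly from controls,
        # stepping upwards from the smallest dose tested.
        control = tail_lengths(n_cells, 0.0, noise_sd)
        for dose in sorted(doses_cgy):
            exposed = tail_lengths(n_cells, dose, noise_sd)
            if stats.mannwhitneyu(exposed, control, alternative="greater").pvalue < alpha:
                return dose
        return None

    doses = [0.3, 0.6, 1.25, 2.5, 5.0, 10.0, 25.0, 50.0]
    print("synchronised cells :", detection_limit(doses, noise_sd=2.0), "cGy")
    print("asynchronous cells :", detection_limit(doses, noise_sd=10.0), "cGy")

The same routine run with the larger scatter expected of unsynchronised cells (the second call) illustrates why positive controls should be expected to need higher doses than the optimal base-line sensitivity tests.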


Standardisation and synchronisation:

It is important that the research laboratory tests and standardises its assaying techniques, including the lysing chemicals used to break down the cell membranes; the careful timing of each stage; defined resting periods for the DNA to unravel; the resistance of the agarose gel (physically and electrically); the electrical charging differentials used to drag the DNA strands through the medium; the flow and conductivity of the electrolyte; the temperature and oxygen levels of the microscope slides; and even the room brightness. Variations in any of these factors can affect results.

The problems are complex. Cells growing freely in a culture medium are in a continuous cycle of replication and division. And while the chromosomes are being replicated, the nucleus is surrounded by fragments of DNA being constructed to replicate the chromosomes — thus allowing the duplicated cells to each have the full complement.

To measure the sensitivity of the assay beforehand, they therefore need to be confident that more than 90% of the cells undergoing this test are in the same "S-phase" (Synthesis). This is the [largely] quiescent period before the cell begins to create measurable lengths of proteins leading to longer gene sequences. The cells then enter a mitosis phase in which they duplicate their chromosomes and then divide. The most common means of getting large numbers of cells into synchronisation is called the 'mitotic shake-off method', which also requires the culture medium to be carefully standardised and controlled.

When seeking to identify the lowest level of radiation damage detectable, exposed cells will obviously be compared progressively with control (unexposed) cells. To find the limits of the assay, both groups must be in the S-phase. This is the phase in the mitosis cycle with the fewest possible natural chromosome breaks. Cells have enzymes which perform DNA repairs, and these go on constantly in cells.

So, to measure the sensitivity of the assay, the laboratory technicians need to cultivate a group of cells which have synchronised division phases, and then to choose only those in the S-phase at the time of testing. Without this careful synchronisation, results from cell-group studies, such as those using the Olive technique, will be contaminated by 'noise' — random sections of genes being synthesised as a prelude to cell division.

This would give a false reading of the assay's sensitivity, since Olive's computer calculations can't readily distinguish the engineered breaks caused by the standardised isotope/gamma-ray exposure from those involved in the natural synthesis processes. With the Singh single-cell method and direct observation, the same variations would give more obvious indications that they had made a mistake or had a problem somewhere in the process.
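
A small simulation makes the point (the numbers are entirely hypothetical and only illustrate the shape of the problem, not any measured values): a modest fraction of cells caught synthesising DNA is enough to shift and widen a population-averaged metric, while per-cell scoring shows the contamination directly as a second cluster of large comets.

    import numpy as np

    rng = np.random.default_rng(1)

    def tail_moments(n_cells, s_phase_fraction):
        # Hypothetical tail moments: most cells sit near a quiet baseline, while
        # cells caught replicating their DNA contribute large "natural" tails.
        n_s = int(n_cells * s_phase_fraction)
        quiet = rng.normal(1.0, 0.2, n_cells - n_s)
        replicating = rng.normal(4.0, 1.0, n_s)
        return np.concatenate([quiet, replicating])

    clean = tail_moments(200, 0.02)   # well-synchronised culture
    mixed = tail_moments(200, 0.30)   # asynchronous culture

    # A single batch average (as in automated scoring) hides the mixture:
    print(f"mean tail moment: clean {clean.mean():.2f}, mixed {mixed.mean():.2f}")
    # Per-cell inspection reveals it at once as a cluster of outsized comets:
    print("cells above 3.0 :", int((clean > 3.0).sum()), "vs", int((mixed > 3.0).sum()))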


Comet assay testing procedures:

The Lai-Singh protocols
Immediately after the exposure period, the whole brain was dissected out and a single-cell suspension was made. A small amount of the cell suspension was then mixed with agarose, put on a fully frosted slide, and immediately covered with a coverglass.

These slides were kept in an ice-cold steel tray on ice for 1 min. to allow the agarose to gel. After lysis at 4 degrees C, electrophoresis was run at 250 mA (25 V) for 60 min., and the slides were then stained with benzoxazolium-4-quinolium oxalo yellow dimer. Two microscope slides per animal were prepared, and these were examined and analyzed with a Reichert vertical fluorescence microscope. DNA damage was measured as the length of the comet tail with the help of an ocular micrometer.

The standardised sensitivity-testing process for comet assays used in a laboratory consists of:

  • White blood cells are cultivated and synchronised using the mitotic shake-off method to ensure that all of the cells are in the same part of the S-phase of the mitosis cycle.
  • The cells are exposed progressively to lower and lower levels/periods of radiation from a known standard source of ionizing radiation (gamma rays from a radioactive isotope) which can be guaranteed to create DNA breaks in proportion to the exposure.
  • The cells are then suspended in agarose gel on special microscope slides.
  • A lysis solution breaks down the cell membrane proteins and liberates the DNA. [variations between methods]
  • The slides are then rested for a period in an alkaline or neutral solution to allow the supercoils of DNA to unravel. A fluorescent stain allows the strands to become visible under UV light.
  • A DC potential of about 24 volts is applied via a circulating electrolyte to create anode attraction along the length of the slide. Electrical attraction drags the negatively charged DNA strands for varying distances through the agarose gel.
  • The lowest-level exposure at which breaks are detectable indicates the sensitivity of the technique.
  • General testing procedures differ in that the rat-brain cells will not be synchronised (therefore not optimal), and the breaks are induced by exposure to the cellphone radiation. The length of each strand is measured and used as an indicator of damage.
  • With positive tests: the breaks are induced by the isotope's gamma rays, and the cells used will not generally be synchronised. This makes it much more difficult for radiation breaks to be identified among the general 'noise' of natural DNA pieces undergoing synthesis. They will therefore require higher exposures before detection than in sensitivity testing, but the difference can be anticipated.
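
For concreteness, here is a minimal sketch of how per-comet measurements such as tail length and a tail-moment-style metric can be derived from a one-dimensional fluorescence intensity profile (the background threshold and the exact moment definition are assumptions for illustration; real image-analysis packages differ in detail):

    import numpy as np

    def comet_metrics(profile, head_end):
        # profile : 1-D array of fluorescence intensity along the migration direction.
        # head_end: index at which the comet head ends and the tail begins.
        profile = np.asarray(profile, dtype=float)
        tail = profile[head_end:]
        total = profile.sum()
        tail_dna_fraction = tail.sum() / total if total > 0 else 0.0
        # Tail length: distance from the head boundary to the last pixel clearly
        # above background (5% of the peak is an arbitrary illustrative threshold).
        above = np.nonzero(tail > 0.05 * profile.max())[0]
        tail_length = int(above[-1]) + 1 if above.size else 0
        # Tail-moment-style metric: fraction of DNA in the tail multiplied by the
        # distance of the tail's intensity centroid from the head boundary.
        centroid = np.average(np.arange(tail.size), weights=tail) if tail.sum() > 0 else 0.0
        return tail_length, tail_dna_fraction, tail_dna_fraction * centroid

    # Example: a comet whose head peaks at pixel 10, with a faint tail out to ~40.
    x = np.arange(60)
    profile = np.exp(-((x - 10) ** 2) / 20) + 0.3 * np.exp(-((x - 25) ** 2) / 120)
    print(comet_metrics(profile, head_end=18))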



Positive controls:
    Even after sensitivity testing has been done to establish a base-line for the laboratory's use of a technique, it is also wise for the researcher to perform essentially the same type of sensitivity testing during the course of the study. In this case, the same cells used in the study need to be exposed to standardized gamma rays to check that the total process has worked as expected. Because the research team should be able to predict the results from previous sensitivity testing, these are known as positive controls; they check the viability of the techniques being used, without the complexity of cellphone radiation.

Obviously, the one main difference between the base-line sensitivity testing and the positive-control testing is that it is not possible to use synchronised cells, since these need to be harvested from the brains of control animals [in Lai-Singh] in exactly the same way as the cellphone-radiation-exposed brain cells.
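
As a sketch of the bookkeeping involved (the tolerance and the interpolated baseline curve here are invented for illustration), the observed positive-control response is compared with what the laboratory's own sensitivity curve predicts for that dose, and any large deviation flags the run for re-evaluation:

    import numpy as np

    def check_positive_control(dose_cgy, observed_mean,
                               baseline_doses, baseline_means, tolerance=0.25):
        # Interpolate the expected response from the laboratory's own baseline
        # sensitivity curve and flag any run deviating by more than 'tolerance'.
        expected = float(np.interp(dose_cgy, baseline_doses, baseline_means))
        deviation = abs(observed_mean - expected) / expected
        return deviation <= tolerance, expected, deviation

    # Hypothetical baseline curve (dose in cGy vs mean tail length, arbitrary units):
    doses = [0.0, 5.0, 10.0, 25.0, 50.0]
    means = [20.0, 24.0, 28.0, 40.0, 60.0]
    print(check_positive_control(10.0, observed_mean=29.1,
                                 baseline_doses=doses, baseline_means=means))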







Commentary:
    This is one of four returns from peer-reviewers who had been asked to comment on two conjoined papers which had been submitted under the 'Malyapa et al' name to John Moulder, the editor of the Radiation Research journal.

    John Moulder is a vocal proponent of the "no effects" faction in cellphone research. He has openly admitted to acting as a consultant to cellphone companies, and he provides witness services in legal and ordinance cases objecting to the siting of cellphone towers. Naturally, he is a fierce opponent of cellphone activism, but he maintains that his pro-industry views are not based on financial considerations. Most scientists with a nuclear-science background share his view that non-ionizing radiation cannot possibly harm biological tissue — but few accept funding from the cellphone industry to maintain a web-site to proclaim this as fact.

These peer-review documents resulted from the Roti-Roti/St Louis group's second attempt to replicate the Lai-Singh study. The submission was dual in nature because one of the papers was intended to prove that the Olive comet-assay technique used by the team was as sensitive as the Singh method — thus justifying their previous claims that no DNA breaks could be detected during their first Lai-Singh replication.

They had failed to detect breaks. However, the claim to have 'replicated-and-failed' the Lai-Singh study was hard to substantiate — and largely ridiculed — because they hadn't established that the Olive technique was as sensitive as the Singh technique. Almost every recognised comet-assay research laboratory uses the Singh technique because it is regarded as the most sensitive.

Peggy Olive was given the task of peer-reviewing these two conjoined papers — despite the fact that both rest on claims about the sensitivity of the Olive comet-assay technique developed by herself and her husband. Narendra Singh, who is credited with the rival Singh technique, was also a peer-reviewer, and had an equal vested interest in the outcome. Obviously he was highly critical of the paper.

A third peer-reviewer was a psychologist working for the US Air Force, who didn't seem to have any understanding of comet assays, and who spent her time correcting the paper's grammar. Why she was chosen to peer-review a paper in one of the most specialised areas of biological laboratory technique is something that only John Moulder could explain.

The fourth peer-reviewer has chosen to remain anonymous — but is widely believed to have been Mays Swicord of Motorola.
[So much for the claimed value of "peer review!"]



Peggy Olive's comments to John Moulder:
  • She refers to the fact that "this paper could also be topical (because of the radiation "positive control" data, not the microwave results.)"

    She means by this that the team's positive-control procedures must have been faulty, because they found DNA damage at 0.3 cGy, which meant that the positive-control test (done on asynchronous cells) was only possible if their technique was 150 times more sensitive than her own laboratory had found with its careful sensitivity testing.

    She says "I would love to have sensitivity [even at] at 5 cGy with asynchronous cells! With asynchronous cells, we're talking 50 cGy." [Exposure is measured in SI units of radiation absorption called centiGrays (cGy) — which are one-hundredth of a Gray.] Olive herself and others in her laboratory had "tried many times over the last 10 years" to find DNA breaks with exposures 150 times as much as the St Louis group are claiming.

  • She feels that making such high sensitivity claims would result in ridicule since... "all of us (and there are a lot of other labs) do not see such exquisite sensitivity with this method", and, if asked, they would not be able to support the claims of the novice St Louis group. "Obviously I am concerned with the statement..." she says.

    The whole purpose of positive tests is that they serve as a guarantee that the actual cell-testing has been conducted in a scientific manner... at the time. You can't do later positive testing and retrospectively apply it to your study... that defeats the purpose... and is regarded as a form of cheating because it creates a false sense of research-protocol stringency which didn't exist at the time of the study.

    In fact, the main purpose of positive tests is to trap findings like this one, where the claimed sensitivity is 150 times what is known to be achievable, and to force a complete re-evaluation of the whole study.

  • She accepts that there are two opposing factions in this dispute, with Motorola, Roti-Roti's group, John Moulder and herself on one side, and the Lai-Singh team and many other scientists on the other. [Roti-Roti's group had come under severe attack at the BioElectroMagnetics Society conference when they presented their first 'replication' paper.]

    In this case, the existence of the two factions had them in a bind. Narendra Singh already had a copy of the draft paper, and therefore they couldn't just produce a quick rewrite, alter a few figures, and resubmit the paper to Moulder. The consequence was that the publication of the paper was delayed a year — allowing time for potential critics to forget about its existence — and then it was published with a new positive-control figure of 0.6 cGy.

  • She points out that following the release of the original Lai-Singh paper, "Motorola subsequently funded Joe's work since they were concerned with Singh's results," which suggests that there were factional meetings to discuss replicating the Lai-Singh work involving, not only the St Louis team, but also the editor of the journal which was to publish the study: "I talked to [Roti-Roti] a year ago at Radiation Research, concerning the problem with S-phase cells, but he did not seem to understand my concern (he should since the same problem occurs in his halo assay.)"

    The halo assay was an early form of assay used by Joseph Roti-Roti. The main difference appears to be the absence of an electrical charge which creates the comet-tail. It is classed as a "non-electrophoretic assay".

This raises the question: where did the St Louis group's extraordinary 0.3 cGy figure come from?

And... why did Olive recommend publication 'after substantial revision'? Clearly the whole program should have been junked and done again from scratch. If they didn't know the basics of base-line testing and the need for synchronous cell-lines, then every aspect of the research is suspect.

And why did John Moulder publish the paper after receiving this and the other peer-reviewers' comments? The Singh comments ran to many pages of criticism and explanation.

Yet the sensitivity-checking paper was published a year later (April 1998) in Radiation Research, with its claim modified from a sensitivity limit of 0.3 cGy to 0.6 cGy, which is still ten times better than Olive herself was able to achieve.

Detection of DNA damage by the alkaline comet assay after exposure to low-dose gamma radiation.

The alkaline comet assay as described by Olive et al. (Exp. Cell Res. 198, 259-267, 1992) was used to detect DNA damage in cells exposed to low doses (0-5 cGy) of gamma radiation. Experiments were performed using lymphocytes isolated from whole blood of rats. The comet parameters, normalized comet moment and comet length, described by Kent et al. (Int. J. Radiat. Biol. 67, 655-660, 1995), were used as measurements of DNA damage.

    It was observed that the alkaline comet assay can detect DNA damage at doses as low as 0.6 cGy. The results of the experiments using low-dose gamma radiation are comparable with published results obtained using the alkaline comet assay according to the method of Singh et al. (Int. J. Radiat. Biol. 66, 23-28, 1994).

    Based on this observation and analysis of results published previously, we conclude that the version of the alkaline comet assay described by Olive et al. is as sensitive as other modifications of the comet assay reported in literature for the detection of DNA damage in cells exposed to low doses of ionizing radiation. [Later research has shown that the Singh technique is one or two orders of magnitude more sensitive than the Olive and other techniques.]



Miscellaneous items of interest:
  • Note that a subsequent replication done by J Behari and R Paulraj at Jawaharlal Nehru University in India, using the Singh technique, reported that their "study shows that the chronic exposure to these radiations cause statistically significant (p<0.001) increase in DNA single strand breaks in brain cells of rat."
  • A good outline of the comet assay, with a brief note on the history of development.
  • The St Louis attempts at discrediting the Lai-Singh research did not go without criticism from the bioelectromagnetics community.

    This is an analytical piece by Professor Neil Cherry, from Lincoln University in New Zealand.

    Also an overview piece by Henry Lai.


  • See the abstracts of the St Louis team's studies for Motorola:

        The principal authors (Malyapa, Ahern, Straube, et al, working under Professor Joe Roti-Roti) published three papers looking for DNA damage after EMF exposure, and a fourth on the sensitivity of the Olive technique.
    1. Measurement of DNA Damage after Exposure to 2450 MHz Electromagnetic Radiation
      The first study (1997a) used the same frequency as that employed by Lai and Singh in their studies - 2450 MHz. SARs were calculated to be 0.7 and 1.9 W/kg. Two types of mammalian cells were used - human glioblastoma cells and mouse fibroblast cells. The cells were irradiated for 2 hours, or 2 hours followed by incubation for 4 hours, or 4 hours followed by 24 hours incubation. No significant differences were observed between the test group and controls. [PART 1 of the above]
    2. Measurement of DNA damage after exposure to electromagnetic radiation in the cellular phone communication frequency band (835.62 and 847.74 MHz).
      In the second study (1997b), done at the same time, the same types of cells were exposed to either frequency-modulated continuous waves (FMCW - AMPS) at a frequency of 835.62 MHz or to code-division multiple access (CDMA) radiation at 847.74 MHz. The cells were exposed for varying periods up to 24 hours. The SAR was 0.6 W/kg.
      No significant differences were seen between the test groups and controls. [PART 2 of the above]
    3. Detection of DNA damage by the alkaline comet assay after exposure to low-dose gamma radiation. Radiat Res. 149(4):396-400, 1998.
      This 'proved' that the Olive assay was as sensitive as the Singh technique — a fact still not recognised by other comet assay researchers.
    4. DNA damage in rat brain cells after in vivo exposure to 2450 MHz electromagnetic radiation and various methods of euthanasia. Radiat Res. 1998.
      This later study (1998) also attempted to replicate the studies by Lai and Singh, and examined the effect of 2450 MHz continuous-wave radiation on rat brain cells. They did not confirm the observation that DNA damage is produced after 2 hours' exposure to the radiation, or at 4 hours after the exposure.
