If the recommendations of some scientists in the early 1950s had been followed, we would now be eating food treated with waste from nuclear power stations.
In 1953 the Low Temperature Research Station (forerunner to the Institute of Food Research) predicted that nuclear waste could be used to treat food, saying that “the use of such radiation in the food industry appears to be a much more realistic possibility” than other methods of food preservation available at that time.
They concluded that it was impossible to eliminate the dangers of irradiated food completely, and that “the presence of some background x-radiation would be virtually inevitable”. Furthermore, “the acceptability of irradiated food for long-term human consumption is still not demonstrated” and would require costly research to establish. Even in the best-case scenario, “the cost and inconvenience of the process might well outweigh any advantages of quality even if these could be clearly demonstrated”. They also found that exposure to radiation could greatly reduce the flavour of some foods, whilst giving others a distinct smell.
This information comes from a newly catalogued file in the archives: MS.8950, a collection of documents on radiation dating from the 1940s and 1950s. It provides an interesting insight into the thinking of scientists at a time when research into the potential uses of nuclear energy was at its height.
One theme that recurs throughout the file is the difficulty of agreeing on acceptable doses of radiation exposure. The question is complicated by the fact that different types of radiation have varying effects, and that scientists around the world used different ways of measuring exposure. It is further complicated by the fact that different forms of exposure, such as long-term, short-term, or whole-body exposure, require different permissible limits.
One effect of long-term exposure to radiation is described in a 1949 letter to the Medical Research Council (MRC) chairman Sir Edward Mellanby (whose archives are also held by the Wellcome Library, as PP/MEL) from Dr Lewis H. Weed of the U.S. National Research Council. Weed reported that a number of otherwise fit and healthy nuclear physicists had developed cataracts; the only thing these men had in common was exposure to neutron radiation from a cyclotron.
The physicists had been undergoing regular blood tests to check their health, but “in no case was there a change in blood picture to indicate over exposure to radiation”. The usefulness, or lack thereof, of blood tests for monitoring the health of people working with radioactive materials is debated by H. H. Mole in his paper The Value of Blood Counts as a Radiation Protection Measure. Mole concludes that there is no evidence that routine periodic blood tests are of any value as a protective measure, as changes in the blood are often not seen at all, or are seen only when an illness has reached the terminal stage.
The threat of nuclear war cast a shadow over much of the twentieth century, and scientists were not immune to it. In 1949 Jacobson, Stone and Allen produced a paper looking at the possible role of physicians in the event of an atomic war. They suggested ways of preparing for such an event and a method of triaging casualties should it occur, and discussed potential injuries and their treatment. The authors concluded that, despite millions of dollars being spent on “methods of diagnosing, treating and preventing radiation injury”, “we would hardly be better off to cope with an atomic bomb attack on one of our cities than the Japanese were at Nagasaki”.
Others were more optimistic in their predictions: in 1946 the MRC Research Committee on the Medical and Biological Applications of Nuclear Physics suggested that, in the event of an atomic accident, “mixed fission products” would contaminate water supplies for no more than a month.
Author: Natalie Walters is an archivist at the Wellcome Library.