...or how the ethical minefield of biotechnology has grown over time
From beer to recombinant DNA
DINOSAURS roar and galumph across the widescreen and the world watches spellbound. Steven Spielberg's classic film, Jurassic Park, tells the story of extinct dinosaurs brought to life in a petri dish from genetic material that is all of 65 million years old. This is fiction with a generous dose of creative licence, of course. But can it ever turn into fact? Can the science of biotechnology coax life out of simple, petrified genetic material?
From the simple art of brewing to the rigorous science of genetics, the story of biotechnology is a complicated and wondrous thing. Ethically the most precarious of sciences, its every new discovery -- and there is one every other day -- evokes a confused sense of both wonder and angst in a lay public brought up on a fearful pulp diet of Frankensteins and artificial viruses. And it has more than its fair share of monomaniacs and lunatics among its proponents. Thinkers from biologist and town planner Patrick Geddes to scientist Edward Tatum have warned biotechnologists -- and by extension the human race -- against succumbing to the easy temptation of playing God.
Biotechnology has its roots in "zymotechnology". Derived from the Greek "zyme" meaning leaven, a substance added to dough to make it ferment and rise, zymotechnology was originally the name of a process of brewing and fermentation. The word was coined by a Prussian court physician, Georg Ernst Stahl (1659-1734), who used it in his book Zymotechnia Fundamentalis. Stahl predicted that zymotechnology would form the basis of Germany's famous lager and liquor industry and thereby link research to commerce.
The early 19th century saw chemistry -- earlier denigrated as "alchemy", a madman's pursuit of the "philosopher's stone" -- rapidly gain importance as the technological exploitation of living processes began. In 1828, Friedrich Wohler, a professor of chemistry at the University of Gottingen, showed that urea, until then obtained only from urine, could be made artificially. In 1856, the English chemist William Perkin synthesised mauveine, the first synthetic organic dye, and German chemical firms soon came to dominate the dye industry.
By 1883, chemistry was heading for the belly: beer had become a major industry -- the Germans consume roughly 3.9 billion litres a year -- and leading chemists were employed to give it the body and bouquet that has made German beer world famous. Louis Pasteur had already proved that microbes were responsible for fermentation. And in a major breakthrough in 1883, chemists at the Carlsberg Institute in Copenhagen discovered that wild yeast was ruinous for beer. Techniques were then developed to control fermentation.
John Ewald Siebel, a German émigré to America, extended fermentation techniques from beer to food in 1884. His company, the Zymotechnic Institute, trained neophytes in the delicate art of brewing. Siebel offered his expertise to manufacturers of a wide, unconnected range of products stretching from wine to glue and cheese. Siebel's work is a landmark. He sought to and succeeded in linking zymotechnology to chemistry and microbiology, and by World War I the sluice gates from one to the other were irrevocably open.
In 1910, Max Delbruck, a German scientist, pioneered the use of yeast as animal feed. A food shortage in Germany during World War I was combated with specially cultivated yeast, which took care of 60 per cent of Germany's fodder needs. In 1915, the Germans synthesised glycerol by fermentation, a chemical crucial for producing explosives.
The immense variety of possible applications, both benign and horrifically genocidal, transformed zymotechnology into what Hungarian agricultural scientist Karl Ereky first called "biotechnology" in 1919. Ereky sought to transform his country into an agricultural showpiece, and his conviction that the chemical industry could provide lasting answers to any food shortage became the hymn of hope of the 20th century.
That biotechnology didn't live up to his promise and took on a runaway life of its own is the Frankenstein syndrome that has plagued 20th century sci-tech. Europe tried to yoke the beast, but failed.
As usual, German enterprise unsettled the British. BASF, a German firm, had already put paid to the Indian indigo plantation industry (India was then under the British) when it invented synthetic indigo. News that the Germans were working on synthetic rubber raised British anxieties to fever pitch.
Then a partnership between Chaim Weizmann, later the first president of Israel, the British and the Pasteur Institute in Paris led to a bacterial fermentation process that yielded acetone, a chemical used in making explosives, and butanol, which could be used to produce synthetic rubber.
The mass destruction caused by the poison gases the Germans used in the muddy trenches of Ypres during World War I brought about a sea change in attitudes in Europe. Uncontrolled technology was seen as not only ruining the environment but disrupting a traditional, bucolic life. The interwar years rippled with debates over health, declining population, feminism, birth control and nutrition.
From Britain's smog-filled environment swam out a group of writers and biologists who looked at biotechnology afresh. Biologist and town planner Patrick Geddes spoke in terms of the "paleotechnic" and "neotechnic" eras. The year 1915, he said, fenced off these two eras. "Paleotechnic" was the age of the "iron monster", the steam engine, and filthy, dangerous coalmines, while "neotechnic" marked a change for the better -- a period of the clean technology of electricity. Geddes then added a third future era, the "geotechnic", in which technology would harmonise with Earth's needs.
In his 1924 book Daedalus, or Science and the Future, scientist J B S Haldane warned that every biological invention would inevitably begin as blasphemy. In the same book, he first described what is a common conversation piece today: "ectogenesis", which combines in vitro fertilisation with the development of the foetus outside the womb.
Alexander Fleming's discovery of penicillin in 1928 made healthcare almost dependent on chemical engineering, a state of affairs that persisted till a medical market and consumer rebellion a decade ago. Penicillin had become a wonder drug with a market that has never stopped growing. Developed for clinical use by British scientists at Oxford, it was being produced commercially on a massive scale, and by 1950 it had been used to treat 21 million people.
Microbiologist Selman Waksman extracted streptomycin from soil bacilli to cure tuberculosis. By 1957, the number of antibiotics had shot up to 350. Cortisone soon burst into the market and was used to treat everything from insect bites to arthritis. Polio vaccines were grown in monkey tissue.
In the years after World War II, biotechnology was seen as the wonder tool with which the rich countries could solve the problems of the poor. But 20 years into the burgeoning industry, the magic wand had become a sword in the belligerent hands of the North.
Biotechnology's eventual marriage to genetics was sealed in America. In 1953, James Watson -- who would later head Long Island's Cold Spring Harbor Laboratory -- and Francis Crick, working at the Cavendish Laboratory in Cambridge, England, discovered the double-helical structure of DNA. Edward Tatum, genius bacteriologist and Nobel Prize winner, said that biological engineering could recombine, modify and produce new genes. Tatum urged scientists to use genetics to counter hereditary diseases and for the fledgling science of eugenics.
Yet Tatum's ostensibly philanthropic vision terrified America. Seeing biotech against the backdrop of the My Lai massacre in Vietnam, the genocidal power of atomic fission, and the mortal hazards of the chemical industry's backwash, America reacted with cynicism and alarm. British journalist Gordon Rattray Taylor wrote Biological Time Bomb, a bestseller that warned bleakly on its book jacket: you may marry an artificial man or woman, choose your children's sex, live to be 150 years old, if the scientific revolution does not destroy us first.
In 1973, Stanley Cohen and Herbert Boyer managed to cut and splice DNA from different sources, creating "recombinant DNA", the running theme of applied genetic research today. But to many scientists, recombinant DNA reeked of monster-making, of Faustian hubris. At Asilomar in California, a group of molecular biologists led by biochemist Paul Berg called for a pause in biotech research till regulation and checks became viable. A 16-month moratorium led to the 1976 US National Institutes of Health guidelines. Only geneticist Joshua Lederberg opposed the freeze, emphasising the benefits of genetic research: therapeutic medicines, human proteins, antibiotics, nutrients. Two years later, biotech tightened its hold, with the microbial production of human insulin on an industrial basis. The possibilities of mindboggling horror haven't yet been buried. Only, the benefits of biotech are now more upfront, less deniable.
Rita Anand wrote this piece based on The Uses Of Life: A History of Biotechnology, by Robert Bud.