At the end of World War II the technological advancements that were a product of the war began to filter into the commercial economy. The growing demand for food brought about the need for chemicals to grow, preserve, and package food products as agriculture moved from the family farm into large-scale operations. This era witnessed the introduction of DDT at a time when its long-term effects were unknown, and in 1950 the US House of Representatives opened hearings to investigate the use of chemicals and additives in food products.3 In 1962 Rachel Carson wrote her landmark book Silent Spring, which brought public scrutiny to the safety of the fertilizer, insecticide, and pesticide programs being used in domestic agriculture. Since that time the US has escalated its drive to monitor the use of chemicals in the food chain and has maintained a policy of evaluating and licensing the use of hazardous chemicals with the goal of creating safer consumer products. While this policy has brought thousands of products under the scrutiny of the Environmental Protection Agency (EPA), DDT was one of the first and most visible victims of this program.
During the 1950s the World Health Organization (WHO) pursued a policy of widespread use of DDT in Asia, Latin America, and Africa in an effort to eliminate the mosquitoes that transmitted the deadly disease of malaria. By 1971 the WHO estimated that as many as 1 billion people had been freed from the risk of contracting malaria.4 However, there were dangers lurking in the shadows of this success. Because there was a chance of the insects building up a resistance to DDT over time, it was necessary to spray the infected areas on a regular and diligent schedule. In addition, the WHO failed to account for several variables that worked against the program. Local bureaucratic governments failed to spray regularly, infected individuals imported the disease,