Discovery of Leukemia
Generally, the ancient Greeks are credited with being the first to recognize cancer, sometime in the 4th or 5th century B.C.E. (Greaves, 2000). In 1932, Louis Leakey discovered a mandible in Kenya, from Australopithecus or Homo erectus, that George Stathopoulos believes bore a malignant tumor, possibly Burkitt's lymphoma, a cancer of the jaw still common in east Africa today. Dinosaur bones some 150 million years old have also been found that may show signs of cancer (Greaves, 2000).
Leukemia was not officially diagnosed until 1845, when John Hughes Bennett diagnosed it in Edinburgh (Greaves, 2000). Other European physicians in the 19th century observed that their patients had abnormally high levels of white blood cells, and they called the disease "weisses Blut," German for "white blood." The modern term "leukemia" comes from the Greek words "leukos" and "haima," also meaning "white blood" (Leukemia and Lymphoma Society, 2005). In 1913, four types of leukemia were classified: chronic lymphocytic leukemia, chronic myelogenous leukemia, acute lymphocytic leukemia, and erythroleukemia. In 1970, it was first confirmed that some patients could be cured of leukemia, and by the 1980s and 1990s cure rates for leukemia were around 70% (Pui, 2003).
Although cancer was less common before the twentieth century, humans have been getting cancer for a long time. Some explanations for the increase during the twentieth century are that more people now survive who once would have died of infectious disease, and that in the past cancer often went undiagnosed or misdiagnosed (Greaves, 2000). William Harvey's recognition of the circulatory system of the blood and Gaspare Aselli's explanation of the lymphatic system were important for the diagnosis and understanding of cancer (Greaves, 2000). In the 19th and 20th centuries, physicians first began recognizing that certain factors could increase the risk of developing cancer. In the 1920s, Hermann Muller demonstrated that ionizing radiation could cause genetic mutations, damage that contributes to cancer (Greaves, 2000).
The increase in childhood leukemia in modern times may be lifestyle-related. In developed countries, families are usually smaller and hygiene has improved, so infants are no longer exposed to infections at an early age (Greaves, 2000). Our immune systems evolved to respond to infections shortly after birth, typically primed by the mother's antibodies during breast-feeding. The immune systems of children first exposed at later ages, without having confronted microbes earlier, may not respond as well, and these children may be at increased risk of developing leukemia. The incidence of leukemia is higher in more industrialized nations, and within those nations among people of higher socioeconomic status, perhaps because such people live in an environment least like the one humans evolved to fit: they are exposed to more pesticides and chemicals, and to fewer infectious diseases, than other people.
Historically, one of the most common treatments for leukemia was arsenic. This remedy is mentioned in the ancient Ramayana of India, and was used by Hippocrates (460-370 BC), who gave cancer its name, from the Greek term "carcinos" or "karkinos" for crab (Waxman and Anderson, 2001). In the 18th century, Thomas Fowler created what became known as "Fowler's Solution," a combination of arsenic trioxide and potassium bicarbonate, which "became a standard remedy to treat anemia, Hodgkin's disease, and leukemia" (Waxman and Anderson, 2001). Arsenic became the primary therapy for leukemia, used into the early 20th century, when it fell out of favor with the advent of radiation therapy (Waxman and Anderson, 2001). In 1865, a German physician named Lissauer used Fowler's Solution to treat chronic myelocytic leukemia (Burns, 2004). In the 1970s in China, arsenic was revived as a treatment for acute promyelocytic leukemia, and today some researchers continue to investigate arsenic as an effective treatment for leukemia.
In the early 20th century, leukemia was considered an incurable, chronic disease. Around 1897, shortly after the discovery of x-rays, studies showed that radiation could reduce the size of tumors: "it was discovered that daily doses of radiation over several weeks would greatly improve therapeutic response" (American Cancer Society). After further medical investigation, "x-ray radiation for patient therapy moved into the clinical routine in the early 1920s" (Imaginis). Soon after radiation became widely used, however, it was shown to be a cause as well as a cure for leukemia. According to the American Cancer Society, "many early radiologists used the skin of their arms to test the strength of radiation from their radiotherapy machines, looking for a dose that would produce a pink reaction (erythema) that looked like sunburn." This was considered an accurate measure of the dose, called the "erythema dose." Unfortunately, but unsurprisingly, many of these early radiologists later developed leukemia (American Cancer Society).
Until after World War II, there were no adequate treatments for leukemia. One of the most important treatments for cancer, chemotherapy, actually developed from an agent of chemical warfare used by the Germans during WWI: mustard gas, which attacks rapidly dividing cells, including white blood cells. Scientists discovered the tumor-fighting potential of mustard gas when a group of soldiers during WWII accidentally came in contact with it and "were later found to have very low white blood cell counts" (American Cancer Society). This observation led to the development of nitrogen mustard as a drug, turning the compound's cell-killing effect into a controlled, targeted treatment and inaugurating chemotherapy.
In the 1940s new treatments emerged, such as aminopterin, first used by Sidney Farber of Boston to treat acute childhood leukemia. Aminopterin, a compound chemically related to folic acid, blocks DNA replication in tumor cells and produced remissions in children with acute lymphoblastic leukemia (ALL). After the discovery of aminopterin, "other researchers discovered drugs that blocked different functions involved in cell growth and replication. The era of chemotherapy had begun" (American Cancer Society).
George Hitchings (1905-1998) and Gertrude Elion (1918-1999) used rational drug design to create 6-mercaptopurine, the first truly effective leukemia drug. By 1950, the pair had developed diaminopurine and thioguanine, purine analogs that disrupted DNA synthesis by substituting for adenine and guanine. "Elion later substituted an oxygen atom with a sulfur atom on a purine molecule, thereby creating 6-mercaptopurine (also known as 6-MP)" (Bowden). Despite these new treatments, leukemia was not conquered: many patients entered remission but later relapsed and died. Elion researched the drug intensively, with the result that therapy today uses 6-MP in combination with other drugs and continues during remission.
The greatest advances of the 20th century came with James Watson and Francis Crick's discovery of the structure of DNA. This led to a greater understanding of the detailed mechanisms of cancer, answering some questions and raising a number of others. In addition to identifying environmental risk factors, "as our understanding of DNA and genes increased, [scientists] learned that it was the damage to DNA by chemicals and radiation or introduction of new DNA sequences by viruses that often led to the development of cancer" (American Cancer Society). In the future, genetic analysis may uncover new treatments for diseases like leukemia.
Created by Shannon McGlauflin, Jolene Munger, and Rebecca Nelson, 2005.