An HIV test—yellow is positive, clear is negative—seen at Roche company offices in Switzerland. Ullstein bild / Getty Images
But although the percentage of people living in the U.S. with undiagnosed HIV has been decreasing, the Centers for Disease Control and Prevention (CDC) estimates that 1 in 8 people who are HIV positive still don’t know it.
The virus was identified in 1984 by French and American scientists, which meant that companies could begin to develop a test for antibodies produced in response to the virus. The first test used blood and was known as an enzyme-linked immunosorbent assay or ELISA test. It was approved for use on March 2, 1985. A second test of the same type was approved on March 9.
By that point, the science of testing for HIV was relatively straightforward. The complications came with implementation.
These early tests weren’t actually designed to diagnose patients with AIDS or HIV. Instead, they were designed to screen donated blood for possible infection.
As TIME reported in April 1985, although the 142 Americans who had contracted AIDS from blood transfusions were just a small fraction of the 9,600 people who had AIDS in the U.S., fear of contaminated blood was running high. To keep necessary blood transfusions from becoming a growing source of infection, blood donation centers began using the test in April 1985, and by the end of July the blood supply was declared free of AIDS.
Because the first tests were intended to ensure blood donations would not transmit the virus, they were designed to be very sensitive and so had a high rate of false positive results. At the time, medical uncertainty also surrounded the question of whether a positive result meant the blood donor had already developed AIDS or had simply been exposed to the virus. But, given the low rate of positive results in general, these questions didn’t matter very much in the context of blood donations—after all, the blood in question could simply be disposed of.
The situation was very different for someone wondering if they had a disease that, at the time, had no proven treatment.
While the current public-health view frames testing as a source of empowering knowledge, enabling a person to help themselves and protect others, the first test wasn’t framed that way. In fact, as the Smithsonian Museum of American History explains in its documentation of the ELISA test, the test kit carried a label: “It is inappropriate to use this test as a screen for AIDS or as a screen for members of groups at increased risk for AIDS in the general population. The presence of HTLV III antibody is NOT a diagnosis of AIDS.”
In addition to these medical questions and the high false-positive rate, early HIV tests were surrounded by the very real threat of stigma and discrimination—not just from a positive result, but even from being tested at all, which could be interpreted as a sign of belonging to a high-risk group (which included homosexual men, intravenous drug users and prostitutes, among others). Others worried that, if donating blood were the only way for someone to be tested for the virus, individuals worried about their exposure would be encouraged to donate blood, increasing the risk of introducing HIV into the blood supply.
So it should be no surprise that the focus of testing soon moved from the blood supply to individual people. Though the same type of test continued to be used, new protocols were added—like retesting positive results—to make the process more appropriate for concerned patients. By March of the following year the government had issued a recommendation that people in all “high-risk groups” undergo periodic testing to determine if they were infected with the virus. And when TIME reported on the change, that recommendation wasn’t the only AIDS news covered: the article mentioned a promising study that indicated that an experimental drug, AZT, improved patients’ immune systems.
In 1987, the FDA approved AZT, the first drug to combat AIDS.