Since the dawn of time, there has been a need for healthcare, and thus a need to protect ourselves from unforeseen costs due to illness and injury. The early human who discovered fire 400,000 years ago probably could have used some bandaging up at one point or another. And the guy who invented the wheel? We’re pretty sure he misjudged a few swings of his copper mallet, resulting in a broken digit and some colorful language. We’ve all been there.
Yet despite the need for insurance since cavemen roamed the earth, the product didn’t actually take hold until about 100 years ago. And even then, it existed in a state that would be almost unrecognizable to today’s consumers. What started as a simple, straightforward system—need care, get care, pay or barter for the cost of care—has evolved into a complex supply chain, with many variables and stakeholders.
The late 19th century
Average price of a quart of milk: 3 cents
Fun fact: The first U.S. Golf Open was held in Newport, RI on October 4, 1895
The earliest American forerunners of formal healthcare coverage and health insurance are tied to the Industrial Revolution, which brought steel mill jobs to the country. The physically taxing nature of the work resulted in frequent workplace injuries. As dangerous manufacturing jobs grew in number, unions formed and began to offer some protection from illness and injury. It wasn’t quite health insurance, but it was a step in the right direction.
1900 – 1919
Average price of a pound of steak: 26 cents
Fun fact: Kodak introduced the “Brownie,” America’s first camera for amateurs, in February of 1900
Near the dawn of the century, Teddy Roosevelt began campaigning for mandatory health insurance. Despite opposition from the American Medical Association, the idea gained support from progressive groups such as the American Association for Labor Legislation. The AALL proposed a bill that would provide sick pay and maternity benefits to qualified recipients, with the cost split between employers, employees, and the state. Ultimately, the bill didn’t receive enough support to move forward before World War I took center stage in 1917.
At the same time, American hospitals were crossing over from provincial facilities to modern scientific institutions, and surgeries like tonsillectomies, appendectomies, and tumor removal started to become standard. Perhaps most significantly, physicians were no longer expected or encouraged to offer free procedures to all patients.
1920 – 1939
Average price of a gallon of gas: 24 cents
Fun fact: The Jazz Singer, the first motion picture with sound, debuted in 1927
After WWI ended, physicians and hospitals began to charge more, and the cost of healthcare became a major problem for the average citizen. In 1929, a group of teachers collaborated with Baylor University Hospital to form an agreement wherein they could pay a set monthly amount to receive the medical care they needed. That arrangement was the precursor to Blue Cross. Over the next decade, other Blue Cross plans sprang up across the country, and by 1939, doctors decided to follow suit and formed their own insurance group, called Blue Shield. Eventually, the two groups combined, and the resulting organization was called Blue Cross Blue Shield.
1940 – 1959
Median value of a single-family home: $2,938
Fun fact: After 14 years of work, Mount Rushmore’s carving was completed in October 1941
Once again, a major war caused attention to drift away from the health insurance conversation. As the U.S. entered World War II, the labor market grew tight: the supply of workers shrank while demand for goods surged. That squeeze led to a major advance toward health insurance as we now know it. Because wage controls prevented American employers from competing for talent with higher pay, they began offering health benefits instead, giving rise to the first employer-sponsored system. Between 1940 and 1960, the total number of Americans enrolled in insurance plans grew from 20.5 million to 142.3 million, and by the end of the 1950s, three-quarters of Americans had some form of health coverage.
1960 – 1979: An answer for the underserved
Average price of a new car: $2,752
Fun fact: The Woodstock Music and Art Fair drew more than 450,000 people to Bethel, NY in the summer of 1969
Although employer-sponsored plans were a huge benefit to workers, the elderly, the unemployed, and the poor were left out because private insurance remained unaffordable. Prior to 1965, only half of seniors had health care coverage, but in July of that year, President Lyndon B. Johnson laid the groundwork for what we now know as Medicare and Medicaid by signing the Social Security Amendments of 1965. As a result, public health insurance became available to the poor and the elderly; Medicare later expanded to cover people with long-term disabilities and conditions such as end-stage renal disease.
In 1971, Senator Ted Kennedy proposed a tax-funded, single-payer plan, but then-President Richard Nixon felt it went too far, so he proposed an alternative that would require employers to offer health insurance to employees. He felt that the combination of non-negotiable employer insurance and Medicare would ensure that most Americans had access to insurance coverage throughout their lives. The two politicians tried to work together but failed to reach agreement. Then Watergate hit, essentially erasing all support Nixon had accumulated to date. His successor, Gerald Ford, tried to distance himself from Nixon as much as possible, and the bill disappeared.
1980 – 1999
Price of a McDonald’s cheeseburger: 34 cents
Fun fact: Apple launched the Macintosh computer in 1984
By 1980, national healthcare expenditures (NHE) had risen to 8.9% of the gross domestic product (GDP), representing the largest increase yet. Under President Ronald Reagan, the privatization of healthcare grew in favor. In 1986, Reagan signed what we know as COBRA (the Consolidated Omnibus Budget Reconciliation Act), allowing employees to stay on a former employer’s plan for a limited time after leaving a job. The catch? They had to pay the entire premium, but for many people with pre-existing conditions who would have had trouble qualifying for private insurance, it was worth it.
While the 90s were the best time for music (Nirvana, anyone?), they were an extremely trying time for the insurance industry. Health expenditures reached a whopping 12.1% of the GDP, healthcare costs rose at double the rate of inflation, and yet again, federal health care reform legislation failed to pass. Shortly after being sworn in, President Bill Clinton proposed a Health Security Act that echoed many of FDR’s and Nixon’s efforts. It would have combined universal coverage with private insurance, prevented companies from denying coverage based on pre-existing conditions, and required employers to offer insurance to full-time employees. The bill died, in large part due to a growing national deficit and pushback from big business, but also because of its extreme complexity.
There were, however, two bright spots during the Clinton administration when it came to healthcare. In 1996, the President signed the Health Insurance Portability and Accountability Act (HIPAA), which established industry-wide standards for healthcare information and placed restrictions on how pre-existing conditions could be handled in group health plans. The Clinton administration also created the Children’s Health Insurance Program in 1997, extending coverage to “uninsured children up to age 19 in families with incomes too high to qualify them for Medicaid.”
Despite these advances, in 1999, 44 million Americans still had no health insurance coverage.
Modern Day
The bad: In our current state, healthcare costs are continuing to rise, Medicare is in need of an overhaul, and many people lack confidence in the long-term viability of employer-based insurance due to changing demographics in the workplace.
The good: In a word, consumerization. Up until recently, physicians, facilities, and insurance companies held most of the power in healthcare. In recent years, however, patients have started demanding more. More convenience. More transparency. More choice. Gone are the days when a person would simply take a doctor’s word for a test or treatment. People are starting to ask “why” and “how.” Why is that test necessary? And how much will it cost?
At the forefront of this trend are health savings accounts, or HSAs, which were created in 2003 to give individuals covered by high-deductible health plans tax-preferred treatment of the money they save for their medical expenses. For the first decade or so of their existence, most people treated HSAs as a mysterious yet useful tool, using them to pay for office visits and prescriptions but doing little research on their other merits.
Little by little, word started spreading about much more valuable benefits, such as the triple tax advantage HSAs offer. Contributions to an HSA are tax-deductible, and the accounts also combine the advantages of traditional and Roth retirement accounts: funds grow tax-free, and distributions remain untaxed as long as they’re used for qualified medical expenses.
Now, people are starting to use their HSAs as an addition to traditional retirement savings accounts. During retirement, the money in an HSA can be withdrawn for non-healthcare expenses and taxed just as it would be if it were invested in a traditional IRA, with the added perk of no required minimum distributions.
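To make that arithmetic concrete, here’s a minimal back-of-the-envelope sketch comparing pre-tax HSA savings to ordinary after-tax savings. The 24% marginal tax rate, 6% annual return, and $3,000 yearly contribution are illustrative assumptions, not current IRS limits or tax advice.

```python
# Back-of-the-envelope comparison of saving $3,000/year for 20 years in an HSA
# versus a regular taxable account. Rates and returns are illustrative
# assumptions, not a statement of current IRS rules.

MARGINAL_TAX_RATE = 0.24   # assumed income tax rate on contributions/withdrawals
ANNUAL_GROWTH = 0.06       # assumed investment return
YEARS = 20
ANNUAL_SAVINGS = 3_000     # pre-tax dollars earmarked for savings each year


def grow(contribution: float, years: int, rate: float) -> float:
    """Future value of a fixed annual contribution compounded yearly."""
    balance = 0.0
    for _ in range(years):
        balance = (balance + contribution) * (1 + rate)
    return balance


# HSA: contribute pre-tax, grow tax-free.
hsa_balance = grow(ANNUAL_SAVINGS, YEARS, ANNUAL_GROWTH)

# Taxable account: income tax is paid before the money is saved.
# (Growth is also simplified here; real taxable accounts owe tax on gains too.)
taxable_balance = grow(ANNUAL_SAVINGS * (1 - MARGINAL_TAX_RATE), YEARS, ANNUAL_GROWTH)

print(f"HSA balance after {YEARS} years:     ${hsa_balance:,.0f}")
print(f"Taxable balance after {YEARS} years: ${taxable_balance:,.0f}")

# Withdrawals:
#  - Qualified medical expenses: the full HSA balance comes out tax-free.
#  - Non-medical withdrawals in retirement (age 65+): taxed as ordinary income,
#    much like a traditional IRA.
print(f"HSA spent on medical care (tax-free): ${hsa_balance:,.0f}")
print(f"HSA spent on non-medical expenses:    ${hsa_balance * (1 - MARGINAL_TAX_RATE):,.0f}")
```

Under those assumptions, every dollar routed through the HSA and spent on qualified care is worth noticeably more than the same dollar saved after taxes; the exact gap depends entirely on your tax bracket, your returns, and how the money is eventually spent.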
It’s taken a decade or two, but it’s dawning on people that the future of health insurance lies in accountability and ownership. The funds in their health savings account aren’t sheets of Monopoly money. The money is real. The money is theirs forever. And the money can grow or dwindle based on their behavior, choices, and demand for control over every stage of their healthcare journey.