I've been thinking about turning 26 for several years now.
To any non-Americans reading this, that might sound a bit strange. After all, 26 might be the age when many people start to experience their first aches and pains, but it's also an age at which people still laugh when you try to complain about said aches and pains.
And yet, what should be nothing more than another formal reminder that I'm aging is something else entirely.
It's the age I lose my parents' health insurance.
I've already had a few tastes of this reality (a recent trip to the dentist required me to pay out of pocket, although my dental coverage is so bad this wasn't much more than I expected), but 26 is the age when things become official.
As I'm a freelancer, I'm now faced with a dilemma. Do I pay an absurdly high monthly payment for a plan that covers next to nothing? Or do I eat my vegetables, take a multivitamin, and hope that I don't get sick?
To understand why being uninsured and underinsured is a reality for myself and tens of millions of other Americans, I decided to crack into the topic a bit.
A Brief Look at the History of Healthcare in America
The idea that institutions like governments or businesses should provide people with health insurance was still novel in the early 20th century. So novel, in fact, that by 1920, only 16 European countries had adopted such a system.
Around that time, the demand for health insurance was low. Medical technology was rudimentary at best, and because of that, the public didn't trust it. Most sick people received treatment at home instead of at local hospitals.
However, as medical technology advanced, the idea of receiving professional medical care became more accepted. Whereas a hospital visit might have been a death sentence in the past, people started to associate doctors with saving lives instead of taking them.
At the same time, urban houses were becoming smaller and smaller. Households that had the space to cater to sick family members before now struggled to do so. Receiving medical treatment outside of the house became the only option for many.
In 1929, Blue Cross launched the first modern health insurance plan in the United States. This covered 21 days' worth of hospitalization at a fixed rate of $6 per year, providing people with a safety net in case of emergency.
While health insurance enrollment started out slow, insurance plans gained popularity over the next couple of decades. By the 40s and 50s, enrollment began to skyrocket.
Businesses began to understand that offering health insurance was a great way to entice people to work for them, especially during World War II, when federal wage freezes made benefits one of the few legal ways to compete for workers. Over the next few decades, the relationship between employment and health insurance only grew stronger.
It wasn't until 2010 that the U.S. finally expanded healthcare coverage under the Affordable Care Act. While the ACA has helped tens of millions of Americans (including myself), it's faced steep opposition ever since its inception.
The Supreme Court shot down the most recent attempt to overturn the ACA in June of this year.
Get a Job!
On paper, working for healthcare doesn't sound like too bad of a deal. You can argue that healthcare is a human right and that it's cruel to deny it to people, but provided you can find a full-time job, you should have no problem getting the coverage you need.
Of course, the reality is quite different. Many Americans (myself included) can't find jobs that offer healthcare. Often, this isn't because the jobs we end up taking are for unskilled workers or high schoolers.
It's because businesses know that providing healthcare and other benefits to employees cuts into their bottom line.
Companies like Uber and Amazon have made headlines in recent months for their crackdown on workers attempting to unionize. Besides demanding a livable wage and enough time to use the bathroom, many workers are also fighting for healthcare.
Businesses hire people as "gig workers" or freelancers because they know that doing so exempts them from providing benefits. As of 2017, gig workers made up 43% of the U.S. workforce. Today, around half of all American workers are freelancers.
On top of that, many companies also classify the workers they hire as part-time. Even if an employee ends up working the equivalent of a full-time workweek, keeping them around as a part-time employee prevents the company from having to offer healthcare and other benefits.
A private healthcare system doesn't work when you have businesses doing everything in their power to avoid having to provide insurance.
The Big Pharma Problem
While it's bad enough that people have to live without health insurance, Big Pharma makes the situation even more difficult.
As its insidious-sounding name suggests, Big Pharma refers to the global pharmaceutical industry and, in recent years, to the way a handful of companies hold the health and wellbeing of the world in their hands.
Through patents and other legalities, companies are able to charge high prices for drugs that people need to survive. This includes everything from Humira, which helps treat arthritis, to Lantus, which treats diabetes.
As is the case with many parts of our healthcare system, this is another distinctly American problem. One quick look at drug prices can tell you that.
Pharmaceutical companies are self-aware enough to recognize that if the average American knew the extent of the price gouging they commit, they'd be ruined. Because of that, they've perpetuated the idea that drug prices reflect medical breakthroughs.
Congresswoman Katie Porter has gone viral several times for using a whiteboard to confront CEOs and other business executives. In one particularly enjoyable exchange, Porter confronted the CEO of the biopharmaceutical company AbbVie.
As Porter explains, only a fraction of the company's multi-billion-dollar budget goes toward R&D. The vast majority goes to executives and shareholders who are willing to deny others the medications they need in order to enrich themselves.
Ever-increasing drug prices are a direct consequence of the private health care system in the United States. Through a lack of regulation, pharmaceutical companies have the power to take advantage of all Americans—both those with and without health insurance.
Where We Go From Here
As I've discussed in past articles, politicians and the media often make certain issues seem like divisive topics. Every time Medicare for All gets brought up on the news, we hear arguments from "both sides".
Someone on the Left claims that healthcare is a human right, and someone on the Right responds that universal healthcare is too expensive and that Americans don't want to spend their money paying for other people's healthcare. The panelists agree to disagree (or the debate spirals into something messy), and the host thanks them both.
While these sorts of verbal boxing matches can be fun, they create the illusion that many people are still unsure how they feel towards certain issues, Medicare for All included.
The reality is that most Americans support Medicare for All and that this support continues to grow. I'd be willing to bet that if people knew more about universal healthcare and vile mouthpieces stopped using buzzwords like "socialism", even more people would be open to it.
Either way, having people against healthcare accessibility on the news is akin to inviting flat Earthers on air. The vast majority of people recognize that the planet is round, so why should we waste time giving a ludicrous minority a platform?
Nearly 70% of Americans support Medicare for All in some way, shape, or form, and as mentioned, many of those who don't simply lack an understanding of the topic. So why are the politicians who supposedly represent the will of the people so vehemently against universal healthcare?
The Arguments Against Medicare for All
Media pundits and politicians aside, it's important to recognize that many of those nervous about Medicare for All do come from places of sincerity. The reality, however, is that their fears are usually exaggerated or inaccurate, and often planted by people who do know better.
Let's take a look at a few of them.
Medicare for All is Expensive
This is a common concern that everyone from ordinary people to so-called "liberal progressive socialist Marxist" presidents seem to share. People worry that a universal healthcare system like Medicare for All would cause the average American to pay more in taxes.
The unfortunate reality is that our current healthcare system is the most expensive in the developed world. Between private insurance premiums, co-pays, and other fees, the average U.S. citizen pays around twice what people in other developed countries pay.
But with higher costs comes better results, right? Not exactly.
Besides having the highest costs in the developed world, the U.S. also ranks near the bottom in healthcare outcomes. Essentially, the average American pays (much) more for less.
On account of its fractured healthcare system, the U.S. has a declining life expectancy, comparatively high infant and maternal mortality rates, and high rates of medical error.
Medicare for All could help combat all of those issues while still lowering costs for the average person.
Lengthy Wait Times
I've heard many people claim that universal healthcare causes wait times to increase. After all, if everyone can visit the doctor or the dentist without having to fork over thousands of dollars, won't it just make healthcare less accessible?
The irony of this fear is that wait times for American doctors are already long—especially when it comes to seeing certain medical professionals. If you haven't ever been to a dermatologist, try scheduling an appointment with one after finishing this article. Short of demanding that they check out that suspicious mole on your chest, there's a good chance you'll have to wait several weeks or even months to get in.
According to data collected by the OECD, the U.S. ranks on the higher side for healthcare wait times. While some countries with universal healthcare like Canada do too, having to wait for healthcare is already a reality for many Americans.
It's therefore not a valid argument against expanding coverage.
Universal Healthcare Is Socialism
Decades of McCarthyism have caused socialism to become a buzzword used by people who, most of the time, don't know what it means. While I don't expect everyone to go and read socialist literature, I do hope that people can refrain from using Facebook memes as their source of information.
Socialism is an economic system where the means of production are in the hands of the people. Leftists have been arguing about the exact definition for centuries, but most would agree that creating an equitable society is the long-term goal.
They'd also agree that the Soviet Union, Venezuela, and other commonly-cited examples of "socialist" countries are not worth emulating.
It's worth noting that most European politicians, Bernie Sanders, and yes—even Alexandria Ocasio-Cortez, are not socialists. They're social democrats who support a watered-down, less-buzzwordy version of socialism that meshes with a free market system.
While the United States remains a shining beacon of capitalism for the world, the reality is that our society has many socialist policies we take for granted.
Here are just a few of them: public schools and libraries, fire departments and other emergency services, Social Security, Medicare and Medicaid, and the U.S. Postal Service.
You'll notice that none of these institutions, laws, or policies are profit-motivated. In fact, they're often at odds with what greedy corporations and individuals would like, yet we keep them regardless since we know they improve people's lives.
Medicare for All/universal healthcare/whatever you want to call it would do the same thing, so don't fall for the "it's socialism" line. Don't let people use their faulty understanding of an economic system as an excuse to end talks of giving people healthcare.
It's Time for a Change
No matter what metric you look at, it's clear that our healthcare system needs a revamp. It's also clear that many of the arguments used to defend it in its current state don't hold up.
I don't want to live in a country where I and millions of others have to put our health on the line to avoid having to spend hundreds or thousands of dollars. I don't want to live in a country where private insurance companies and Big Pharma get to enrich themselves while real people suffer and die as a result.
If the wealthiest country on Earth can't provide even the most basic of services to its citizens, it needs a reality check.
If you enjoyed reading this article, take a moment to check out some of the other Study With Chandler posts. You can also support my writing, which encourages me to do more!
No taxation without representation!
Today, American students across the country learn those words at a young age. They refer, of course, to the chief grievance that the Thirteen Colonies had towards Great Britain: taxes like the Stamp Act and the Tea Act were unconstitutional since colonists had no say in the British parliament.
This idea of illegitimate taxation helped lead to the American Revolution and, ultimately, the founding of the United States.
It's interesting that while that history lesson has become sensationalized, few people realize that, even today, several places in the country do have taxation without representation. One of them is the nation's capital, Washington, D.C.
In fact, starting in 2000, D.C. license plates read "Taxation Without Representation." Since 2017, they've read "End Taxation Without Representation."
But why is D.C. not a state? Should it become one? Let's find out.
This Land Is *My* Land
As we should do anytime we look at U.S. history, let's remember that the English, French, and other colonial powers were not the first people to live in the Americas. Indigenous peoples occupied the continent for thousands of years before the first European caravel arrived.
Some modern experts estimate that as many as 100 million people lived across North and South America before Christopher Columbus arrived. (For a great deep dive into the pre-Columbian Americas, I recommend 1491: New Revelations of the Americas Before Columbus by Charles C. Mann.)
The area we know today as Washington, D.C. was home to the Nacotchtank, or Anacostan, people. The region, abundant in natural resources, allowed them to create a thriving community that traded with peoples as far away as present-day New York.
The Nacotchtank people's first contact with Europeans was in 1608 with Captain John Smith from Jamestown. Although the first encounter was friendly, later interactions were not.
Less than 40 years later, only a quarter of the original indigenous people remained in the area. Europeans killed or drove off the majority, while many of the remaining people died from diseases brought by the foreigners.
A New Capital for a New Country
After displacing the remaining natives, the colonies of Maryland and Virginia absorbed the area of D.C., and it remained a part of them until 1790.
In that year, the new American Congress passed the Residence Act. This allowed the United States to create a capital on the banks of the Potomac River. President George Washington himself chose the exact location and signed it into law.
Virginia and Maryland donated land to create the new federal district, which measured no more than 100 square miles in total. The new city's name honored the first president, while "Columbia," a poetic, feminine derivation of Columbus common at the time, gave the district its name.
The Framers hoped that such a setup would prevent single states from becoming too powerful. An isolated district that housed the federal government could help keep the various parts of the country in check.
One Step Forward, Many Steps Back
Decades later, in 1865, the American Civil War ended and the country entered the Reconstruction era. For the first time in American history, the Constitution (at least on paper) protected the rights of African Americans.
Unsurprisingly, the transition from chattel slavery to American citizenship was anything but an easy, simple, or fair process. Freed people continued to face discrimination all across the country, especially in the South.
President Andrew Johnson (a former slaveholder himself) gave the former Confederate states the freedom to decide the rights of African Americans. As slavery had been the principal cause of the Civil War, the Southern states quickly made racial discrimination a priority.
Given the city's location between two former slave states, African American residents of D.C. (who made up around one-third of the population) suffered heavily during this era. Black residents who managed to obtain local political positions were removed from power less than a decade later.
Many white politicians weren't shy about why they were passing these restrictions. John Tyler Morgan, a U.S. Senator, said that Congress needed to "burn down the barn to get rid of the rats...the rats being the negro population and the barn being the government of the District of Columbia."
These sorts of Jim Crow laws became codified in the U.S. legal system, where they remained until the latter half of the 20th century.
A String of Hard-Fought Victories (Plus More Setbacks)
For the next century, D.C.'s voting restrictions remained in effect even as the city's demographics evolved. By the late 1950s, D.C. had become the first major American city with a predominantly Black population.
Around that time, years of activism and nonviolent protest began to pay off. The ratification of the 23rd Amendment in 1961 gave D.C. residents the right to vote in presidential elections for the first time, along with three electors to cast votes in the Electoral College. In 1964, they voted overwhelmingly for Lyndon B. Johnson, the sitting president who had pushed for the Civil Rights Act. On top of that, in 1973, the Home Rule Act gave D.C. residents the right to elect their own city council and mayor.
Yet while these victories were big, they were not without limitations.
The 23rd Amendment caps the District's electors at the number allotted to the least populous state, which in practice means three. If the city one day grows into a megacity with millions of people, its electoral power will remain the same—at least in the current legal framework.
The federal government also heavily intervened in local elections and policy early on. Many members of Congress doubted whether a predominantly Black city could govern itself.
The Situation Today
Over the past few decades, D.C. residents have continued to demand the rights that other U.S. citizens enjoy. Now, with the Democrats in control of both the White House and Congress, residents are hopeful that they might finally get the political representation they deserve.
They believe that the best way to do this is by granting the District statehood.
In April of this year, a bill granting Washington D.C. statehood passed through the House. Although the bill would benefit more than 700,000 people, it's unlikely that it will survive the filibuster and pass through the Senate.
Proponents of the bill argue that while D.C. has seen its rights expanded, it's still not enough. The District needs congressional representation—namely in the form of senators and representatives (ones who hold actual legislative power).
Those against D.C. statehood cite two main arguments. The first is that a city of 68 square miles shouldn't become a state.
Yet while that argument might sound valid when looking at a geographic map, it doesn't hold up when looking at one that shows population.
A population of 700,000 might not sound huge, but it's larger than the populations of both Vermont and Wyoming.
Regardless, whether those people are spread out across a massive state or reside in a tiny urban area, don't they deserve the same rights as the rest of the country?
The second argument is that the Residence Act is clear: D.C. needs to be a separate federal district, independent from the states it ties together.
This type of originalist thinking might seem strong, but again, it falls apart under even a quick examination. It's true that the Constitution calls for a federal district no larger than 100 square miles. But what about the land outside of that area?
The area that houses government buildings and monuments could easily remain under federal control. At the same time, the other parts of the District could become a state—one that has the same rights as the other 50 states in the Union.
51: An Odd Number, But the Right Number
While tossing out our old flags to buy updated ones might sound strange, it's a sacrifice we should be willing to make. Having a say in the democratic process is the epitome of what it means to be an American. No U.S. citizens should be denied that fundamental right—no matter where they live.
For the United States to function as a democracy, it needs to treat all its citizens fairly. Granting Washington D.C. statehood is one step that can help the country get closer to achieving that goal.
Make sure to leave me your thoughts, feelings, and hate messages in the comments below. For more deep dives into the American past, check out the other content that Study With Chandler offers.
by Chandler Webster