Predictions from the last 10 years in health tech, and how they turned out

Healthtech Retrospective 

Prior to the Covid-19 pandemic, the United States was widely considered one of the countries best prepared for epidemics or biological crises. With a death toll surpassing half a million, that prediction clearly did not hold up. However, that is not to discount the changes for the better that the pandemic has forced on the industry.

Join us on a journey down memory lane as we look at the healthcare industry’s predictions for today from the early 2010s, and see how we measure up.

Big Data

The last decade has seen a genuine data renaissance. 

Ten years ago, McKinsey predicted that big data could be a huge driver of efficiency and quality in healthcare. The unprecedented growth in computing horsepower, and the fact that people leave a trail of data behind them like fairy dust, suggested that the industry was approaching an inflection point where big data could start to be leveraged usefully.

In numbers, McKinsey estimated roughly $300 billion in value per year from data in healthcare, along with a reduction in national healthcare expenditures of around 8%. More recently, the global big data analytics in healthcare market was valued at $16.87 billion in 2017 and is growing quickly, with a compound annual growth rate of 19%.
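To put that growth rate in perspective, a compound annual growth rate compounds year over year. The minimal sketch below projects the market forward from the $16.87 billion 2017 figure at a constant 19% CAGR; the projected values are purely illustrative, not figures from the article or from McKinsey.

```python
# Illustrative only: project a market size forward at a constant CAGR.
# Base figure and rate come from the article; projected values are hypothetical.

BASE_VALUE_BILLION = 16.87  # global big data analytics in healthcare market, 2017
CAGR = 0.19                 # 19% compound annual growth rate

def project(base: float, rate: float, years: int) -> float:
    """Compound the base value forward by the given number of years."""
    return base * (1 + rate) ** years

if __name__ == "__main__":
    for year in range(2017, 2026):
        value = project(BASE_VALUE_BILLION, CAGR, year - 2017)
        print(f"{year}: ${value:.2f}B")
```

At that pace the market roughly doubles about every four years, which is why the figure is often cited as evidence of an inflection point.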

Local vs Federal Power

Ten years ago, it was predicted that state and local governments would gain increasing power over healthcare at the expense of the federal government, and that public health would receive greater focus.

Prior to the Covid-19 pandemic, it could be said that those goals were on track. Under the McCarran-Ferguson Act of 1945, the federal government cedes primary responsibility for regulating insurance to the states. While the federal government still has considerable power in setting and enforcing response and regulation, many of its actions are executed and filtered through state governments, which most often have the final word on matters ranging from mask mandates to reproductive rights.

Digital care

In the early 2010s, “e-healthcare” (telemedicine and digital healthcare) was seen as useful but very difficult and costly to implement, requiring specialized training for each user to input, understand, and share information.

However, the last decade’s rise in digitization has pushed many companies to store and share information digitally, and has also brought more or less everybody up to a baseline of digital literacy. The pandemic accelerated that process by several years, forcing much of the healthcare industry remote or online and increasing reliance on video conferencing, medical devices, and digital sharing that would otherwise have taken years to implement.

The ever-growing power and security of technology make it a natural partner for the healthcare industry, as the fast-growing overlap between the two sectors shows.

Restructuring the healthcare system

A decade ago, industry specialists predicted that the healthcare system would be restructured to be more responsive, resilient, and flexible to change. Despite the industry’s reputation for being stuck in the past, there have been three notable changes: the 2010 Affordable Care Act, the move to digitization, and increasingly common programs to lower employee healthcare spending.

The 2010 ACA brought the US as close as it had ever gotten to universal healthcare, with more affordable insurance available to more people. It also expanded the Medicaid program and supported innovative medical care designed to lower costs. Since the ACA, big companies have taken on an increasingly large role in their employees’ healthcare, as evidenced by the cost-optimizing programs beginning to populate the healthcare landscape.

As discussed previously, the move to digitization was revolutionary for speed, cost-efficiency, and information availability, and it will continue to pay off as data management, sharing, and access improve.

 

Author: Vicert