Developers and coders are a hot commodity, and the boom in digital health start-ups, combined with increased health IT spending, is making qualified developers harder to find. It’s a classic case of supply failing to keep pace with demand: according to the Bureau of Labor Statistics, approximately 1.4 million computing jobs are expected by 2020, but only about 400,000 new computer science graduates are expected to fill them.
With the growth of mobile device applications and the continued rollout of electronic health records, privacy and security are growing concerns. Naturally, patients expect healthcare organizations to take the necessary steps to safeguard their personal information.
While privacy is certainly assumed, it is not always delivered. Many of the high-profile data breaches involve thousands of patients and some of the country’s premier payers or providers. According to the Office for Civil Rights (OCR) at the Department of Health and Human Services, the top 10 data breaches alone accounted for just over 111 million records lost, stolen or inappropriately disclosed, with fully 90% reported as a “hacking/IT incident.”
One of the more high-profile breaches of the last year occurred when hackers managed to break into a database containing personal information in the form of 80 million records from Anthem Insurance.
Digital health applications that tap into or integrate with electronic health records (EHRs) may pose the biggest data risk going forward. Since EHRs typically contain detailed personal information, they are worth more to hackers, averaging $10 to $50 per record. A breach of network access at an average hospital could potentially cost between $10 million and $12 million.
“Secure development is definitely a challenge,” said Chad Holmes, principal and cybersecurity leader, EY. “And finding the talent is always a challenge.”
According to industry experts, failure to encrypt mobile devices is the leading exposure behind health-related data breaches. Faulty coding poses fundamental security risks and exposes organizations to lawsuits for violating patients’ health information. Programming shortcuts can also open code to breaches on several fronts. Because of high demand, companies are settling for average or, in some cases, below-average developers and coders. For coders accustomed to the open-source world, entering a security-first environment such as healthcare can be quite a culture shock.
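Encryption at rest is the kind of safeguard those experts are pointing to. As a minimal, illustrative sketch only, using Python’s third-party `cryptography` package; real deployments hinge on key management (storing the key in a secrets manager, rotating it), which is omitted here:

```python
from cryptography.fernet import Fernet

# Generate a key once and keep it in a secrets manager,
# never on the same device as the data it protects.
key = Fernet.generate_key()
f = Fernet(key)

# Encrypt a record before it is written to a mobile device or backup.
token = f.encrypt(b"patient_id=12345; dx=E11.9")

# Decrypt only inside the trusted application boundary.
plaintext = f.decrypt(token)
```

Fernet bundles AES encryption with integrity checking, so a tampered ciphertext raises an exception rather than silently decrypting to garbage.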
“Code writers should understand the ‘business of healthcare’ and resist the urge to write specifications that are general business focused,” said Harry Rhodes, director of national standards at American Health Information Management Association. “Collaboration with the healthcare end-user is critical.”
Vetting engineers for healthcare-related companies is difficult, since referrals from other coders are typically the most reliable signal, and chief technology officers often lack the time to vet incoming programmers properly. Time and resource pressures sometimes lead to coding errors or faulty security measures. Strict development policies and procedures are a requirement in healthcare, where roughly 50 aspects of Protected Health Information (PHI) must be covered. At a minimum, developers should be familiar with HIPAA, Fast Healthcare Interoperability Resources (FHIR) and Health Level Seven (HL7) standards.
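To make the PHI-handling point concrete, here is a small, hypothetical sketch of stripping direct identifiers from a record before it leaves a trusted boundary. The field names are invented for illustration; HIPAA’s Safe Harbor de-identification method actually enumerates 18 categories of identifiers, so a real implementation would be driven by that list rather than an ad hoc set:

```python
# Hypothetical set of direct-identifier field names (illustrative only).
DIRECT_IDENTIFIERS = {"name", "address", "phone", "email", "ssn", "mrn"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {"name": "Jane Doe", "mrn": "12345", "dx": "E11.9", "age": 44}
clean = deidentify(record)
```

Even this toy version shows why developers need the standards: deciding which fields count as identifiers is a regulatory question, not a coding one.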
Solid development is a critical component of every digital health product and service, and the underlying code is the foundation that must be beyond reproach. Testing is now a critical part of the development process for ensuring digital health products are secure, and it can identify as much as 80% of security defects. Adopting an agile (incremental or rapid-cycle) approach can offer additional benefits, given the continuous improvement built into that process. Meanwhile, application developers have slowly begun to take greater care in writing secure code to design safer applications.
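One concrete example of the kind of defect such testing catches is SQL injection, which parameterized queries prevent. A small, self-contained Python sketch, using an in-memory SQLite database purely for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (id INTEGER, name TEXT)")
conn.execute("INSERT INTO patients VALUES (1, 'Jane Doe')")

def find_patient(name: str):
    # Parameterized query: user input is bound as data,
    # so it can never alter the SQL statement itself.
    return conn.execute(
        "SELECT id FROM patients WHERE name = ?", (name,)
    ).fetchall()

# A hostile input that would break a string-concatenated query is inert here.
attack_result = find_patient("' OR '1'='1")
normal_result = find_patient("Jane Doe")
```

A security test suite would assert exactly this: that malicious input returns nothing, while legitimate lookups still work.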
While many data breaches and their resulting fines have been small in scope, the largest fine levied by the Office for Civil Rights to date was the combined $4.8 million paid by New York-Presbyterian Hospital and Columbia University after the PHI of 6,800 people was inadvertently made available through search engines.