HIPAA Compliance for Medical Practices
HIPAA Compliance and HIPAA Risk management Articles, Tips and Updates for Medical Practices and Physicians

OCR clarifies omnibus HIPAA questions | Government Health IT

Last week I had the opportunity to attend and present at this year’s American Health Lawyers Association (AHLA) annual conference in New York. It proved an excellent forum for exchanging ideas and for in-depth discussion of many of healthcare's compliance challenges. It was also a chance to hear first-hand from the Office for Civil Rights (OCR) regarding various aspects of the rules and their interpretation. What OCR is thinking, and how it interprets the rules, has always been a major topic of interest.

That was indeed the case as the discussion turned to the subject of Business Associates (BAs). The rule is clear: BAs must comply with the technical, administrative and physical safeguard requirements under the security rule and those use or disclosure limitations expressed in the contract and the privacy rule. 

Right? Well, maybe that’s not so clear.

There was a fair amount of discussion around the fact that not all BA relationships are equal and that in some cases not all security provisions apply. Where this can be determined at the point of contracting, it was suggested that the contract and/or the Business Associate Agreement (BAA) can include the phrase “as applicable” to recognize that not every security rule provision may apply. When going this route, though, it is important to clearly identify what does and does not apply, so that expectations are set and both sides know how to perform.

[See also: Top 10 Government Health IT stories of 2014, thus far.]

Another hot topic (and one seemingly put to bed by OCR) was whether encrypting data relieves a vendor of its BA responsibilities. You might recall that shortly after the Omnibus Rule was released, OCR let it be known that it would consider encryption as a relevant factor when determining BA responsibilities or status. The question posed was this: if a vendor simply hosts the data, or the system containing the data, and the data is encrypted by the Covered Entity (CE), which retains the keys so that the vendor cannot access the information, should that not obviate BA status? The short answer was no, it does not relieve those responsibilities. The rationale was simple. Organizations that host a CE’s electronic protected health information (ePHI), or systems containing ePHI, have security responsibilities that go beyond simple access management. They are responsible for areas such as contingency planning and physical security that have little to do with access management and are absolutely required of anyone maintaining critical systems or ePHI. So, final answer: encryption does not relieve vendors of BA responsibilities.
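To picture the access-management half of that argument: when the CE encrypts before handing data to the host and retains the keys, the vendor stores only ciphertext it cannot read. The sketch below uses a toy one-time pad purely for illustration (a real deployment would use a vetted cipher such as AES-GCM from an established library); the record contents and variable names are invented.

```python
import os

def xor_bytes(data: bytes, keystream: bytes) -> bytes:
    # Toy one-time pad: XOR each byte with a keystream byte.
    return bytes(d ^ k for d, k in zip(data, keystream))

# Covered Entity side: generates and retains the key.
record = b"Patient: Jane Doe, Dx: hypertension"
key = os.urandom(len(record))          # CE keeps this; the vendor never sees it
ciphertext = xor_bytes(record, key)    # only this blob is sent to the hosting vendor

# Vendor side: stores the ciphertext but cannot recover the plaintext without the key.
stored_blob = ciphertext

# CE side, later: retrieves the blob and decrypts with its retained key.
recovered = xor_bytes(stored_blob, key)
```

Even with this arrangement, the vendor still has contingency-planning and physical-security duties for the blob it holds, which is exactly why OCR said key retention does not obviate BA status.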

Several other BA topics were discussed, such as how liability flows from CE to BA to each successive layer of subcontractor, and how the Federal Common Law of Agency (discussed in the Preamble of the Rule) applies: the CE is responsible for the actions of those it elects to designate as its agents. As always, it is important to read not only the body of the rule but the Preamble as well, because the Preamble typically explains and expands on the rule for interpretation purposes.

The last topic I’ll address is that of the Conduit Exemption. OCR’s representative provided a good explanation of how to apply this exemption and the background for the provision. When the exemption was first conceived electronic transmission of data was not the issue. The issue was the transportation of hard copy PHI through mail services. The conduit provision is very limited. It focuses on transportation or transmittal of PHI. There is no retention of the data contemplated.

[From sister site Medical Practice Insider: 3 crazy HIPAA breaches.]

Storage, if it occurs, is incidental to the transmission or transportation process and lasts only the minimal time necessary to support that process. When asked how the exemption applies to electronic transmissions, OCR said the criteria do not change. If a vendor simply provides transport of ePHI, then storage, if necessary, should occur only for the brief period needed for the information to pass through the vendor’s environment. If the vendor hosts the information or holds it for any reason beyond what is required to move it through its environment, it is a BA and the Conduit Exemption does not apply.

I leave you with one interesting “conundrum,” as the participants in the discussion described it, to ponder. It concerns the use of personal email at work, that is, allowing workforce members to use their personal email (Google, Yahoo, etc.) while on the job. The question posed was whether the CE’s permitted use of personal services creates a BA relationship by default should workforce members use their personal service to transmit ePHI. Think not only email, but texting, images, etc. Several were of the opinion that this did create a BA relationship.

While you are wrapping your head around the potential implications of that one, let me say this was a great conference. AHLA organized a first-class agenda and an excellent faculty that generated a lot of very interesting, thought-provoking and relevant discussions. I heartily recommend it for any internal counsel or law team working with healthcare.


Meaningful Use Audits, RAC Audits, and HIPAA Audits | EMR and HIPAA


Healthcare has always been a deeply regulated industry, so in many ways healthcare organizations are already used to dealing with government scrutiny. However, we’ve recently seen a number of new audit programs hit the healthcare world that didn’t exist even a few years ago. Here’s a look at a few of them you should be prepared for.

Meaningful Use Audits
This is one of the newest audit programs to hit healthcare. Depending on your attestation history, it could have a tremendous impact on your organization’s financial health. These EHR incentive audits have been happening across organizations of every size and are conducted by the CMS-hired auditing firm Figliozzi and Company of Garden City, N.Y. If you get a letter or email from Figliozzi, you’ll know what it is right away. An EHR incentive audit is a big deal since the meaningful use program is all or nothing. If auditors find even one thing wrong with your meaningful use attestation, you could lose ALL of your EHR incentive money.

CMS recently released an informative guidance document outlining the supporting documentation needed for an EHR incentive audit. Pages 4 and 5 of the document go through the attestation objectives, detailing the audit validation and suggested documentation needed for each. If you’ve attested to meaningful use, take some time to go through the document and make sure you can provide the necessary documentation if needed. In many cases this simply means dated screenshots to prove measure completion. While many EHR vendors can be helpful in the meaningful use audit process, you should not rely on them entirely.

In a recent blog post, Jim Tate makes a compelling case for why you might want to consider doing a mock EHR incentive audit and how to make sure that the audit is effective. Although smaller organizations won’t likely be able to afford an outside audit, having it done by someone in your organization who wasn’t involved in the attestation is beneficial. The CMS guidance document could be used as a guide. A mock audit could help discover any potential issues and help you put mitigation strategies in place before you have a real audit and your hands are tied.

Recovery Audit Contractor (RAC) Audits
RAC audits are currently on hold as CMS works to improve the program and deal with the enormous audit backlog. We still haven’t heard from CMS about when the RAC audits will resume, but we should hear something later this summer. While no RAC audits are occurring right now, the claims you’re filing today can still be audited once the program resumes.

The best thing you can do to be prepared for RAC audits is to make sure that your documentation and billing ducks are in a row. A great place to start is to look at your most common denials and look at how you can improve your clinical documentation, coding and billing for each of these denials. Also, make sure that your process for responding to audits is standardized and effective. The RAC audit is just one example of an audit performed by payers. Don’t be surprised if you’re subjected to audits from other agencies or commercial payers.

RAC audits recovered billions of dollars in overpayments in recent years. You can be sure that they will continue and that other similar initiatives are coming our way. There’s just too much incentive for the government not to do it.

HIPAA Audits
The US Department of Health and Human Services’ Office for Civil Rights (HHS OCR) first started doing HIPAA audits as part of a 2011 pilot program. It’s fair to say that HHS OCR’s audit program was one of discovery as much as it was of compliance. However, the HITECH Act and Omnibus Rule have started to up the ante when it comes to enforcement of HIPAA. HHS OCR announced that it would be surveying 800 covered entities and 400 business associates to select the next round of audit subjects. An OCR spokesperson said, “We hope to audit 350 covered entities and 50 BAs in this first go around.”

Unlike previous audits that were done by KPMG, these HIPAA audits will be done by OCR staff. One area these audits will likely focus on is the HIPAA Security Risk Assessment. The importance of doing this cannot be overstated, and is illustrated by the fact that it’s a requirement for meaningful use. I will be surprised if these audits don’t also focus on the new HIPAA Omnibus Rule requirements. I’m sure many of the HIPAA audits will catch organizations that never updated their HIPAA policies to comply with HIPAA Omnibus.
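For context, a security risk assessment boils down to enumerating threats and ranking them. OCR does not mandate a particular methodology, but a common approach (in the spirit of NIST SP 800-30) scores each threat by likelihood and impact. The threat register, field names, and 1-to-3 scales below are illustrative assumptions, not a prescribed format:

```python
# Hypothetical threat register; the 1 (low) to 3 (high) scales are an assumption.
threats = [
    {"name": "lost unencrypted laptop", "likelihood": 3, "impact": 3},
    {"name": "misdirected fax",         "likelihood": 2, "impact": 2},
    {"name": "data-center flood",       "likelihood": 1, "impact": 3},
]

def risk_score(threat: dict) -> int:
    # Simple likelihood x impact product, as in NIST SP 800-30-style matrices.
    return threat["likelihood"] * threat["impact"]

# Rank threats so remediation effort goes to the highest risks first.
ranked = sorted(threats, key=risk_score, reverse=True)
for t in ranked:
    print(f'{t["name"]}: {risk_score(t)}')
```

However you score it, the documented output (threats, scores, and the remediation decisions they drove) is the artifact an auditor will ask to see.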

No one enjoys an audit of any sort. However, being well prepared for an audit will provide some level of comfort to yourself and your organization. Now is your opportunity to make sure you’re well prepared for these audits that could be coming your way. These audit programs likely aren’t going anywhere, so take the time to make sure you’re prepared.


Stanford physician's startup makes it a breeze to build HIPAA-compliant mobile health apps


A plethora of health-related apps and devices should be hitting the market in the next year or two. And the data that these apps and devices collect could help your doctor provide a more holistic picture of your health.

But, as I wrote a few weeks ago, when that health data crosses the line from consumer health cloud into the healthcare delivery system, HIPAA privacy rules will come into play.

One company, started by a Stanford physician, has foreseen this challenge to device and app developers, and is offering a way to easily comply with HIPAA’s often stringent rules. These “medical grade” apps can then safely share data with clinical systems.

“With Medable, mobile apps can make it easy for users to communicate with their doctors, nurses, and caregivers, and also to provide them with any kind of data originating from their mobile devices,” company co-founder Dr. Michelle Longmire tells VentureBeat. “That lets everyone receive the data, visualize it, and then communicate about it in a very natural way.”


Health app developers can use the platform to build new applications or to integrate Medable features into existing applications, Longmire says. Medable also offers numerous application features like patient and provider profiles, two-factor authentication, and “push” messaging. These features are delivered through a software development kit (SDK) and an application programming interface (API).

“If push messages are sent to care providers, they contain only the metadata, not any identifiable information,” Longmire explains. “So a physician might receive a message saying ‘an image is available for you,’ but the doctor would need to log in to get the image.”
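That metadata-only pattern is easy to picture in code. The sketch below is hypothetical (the field names and payload schema are invented, not Medable's actual API), but it shows the idea of stripping PHI before a notification leaves the platform:

```python
# Hypothetical PHI field names; a real system would derive this from its data model.
PHI_FIELDS = {"patient_name", "mrn", "image_url", "diagnosis"}

def to_push_payload(event: dict) -> dict:
    """Return a notification payload containing only non-PHI metadata."""
    return {k: v for k, v in event.items() if k not in PHI_FIELDS}

event = {
    "type": "image_available",
    "timestamp": "2014-07-01T12:00:00Z",
    "patient_name": "Jane Doe",                   # PHI: must not leave the platform
    "mrn": "12345",                               # PHI
    "image_url": "https://example.invalid/img/1", # fetched only after login
}

payload = to_push_payload(event)
```

The physician's push notification carries only `type` and `timestamp`; the image itself is retrieved behind authentication, as Longmire describes.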

Longmire says Medable uses the HL7 clinical data format, so it can integrate with, and exchange data with, any electronic health record system that uses HL7 format, and the majority of them do.
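HL7 v2 messages are pipe-delimited text, which is part of why this kind of integration is tractable. The following is a deliberately simplified parsing sketch (real HL7 handling involves escape sequences, repeating fields, and component separators that a production library would cover); the sample message content is invented:

```python
# A minimal HL7 v2 message: segments separated by \r, fields by |.
msg = (
    "MSH|^~\\&|MEDABLE|CLINIC|EHR|HOSP|201407011200||ORU^R01|0001|P|2.3\r"
    "PID|1||12345||DOE^JANE"
)

def parse_hl7(message: str) -> dict:
    """Index each segment of a simple HL7 v2 message by its segment ID."""
    segments = {}
    for seg in message.split("\r"):
        fields = seg.split("|")
        segments[fields[0]] = fields
    return segments

parsed = parse_hl7(msg)
patient_id = parsed["PID"][3]  # PID-3, the patient identifier field
```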

The main concern of HIPAA rules is guarding “protected health information” or “PHI” from the eyes of those who don’t need to see it for clinical purposes.

Longmire says the Medable platform encrypts all PHI in several ways — on the device, in transit and then on the Medable platform.

The Medable platform can also anonymize large amounts of clinical data so that researchers can study it. Additionally, Medable provides all of the capability needed for HIPAA auditing and clinical data reporting.
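Anonymization for research typically follows HIPAA's Safe Harbor method: remove the 18 categories of identifiers. The sketch below is a simplified illustration, not Medable's implementation; the record layout and the subset of identifier fields are assumptions:

```python
# Subset of the 18 Safe Harbor identifier categories, modeled as dict keys
# (an assumption about record layout; real de-identification covers all 18).
IDENTIFIERS = {"name", "address", "birth_date", "ssn", "email", "phone"}

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and aggregate ages of 90 and over."""
    clean = {k: v for k, v in record.items() if k not in IDENTIFIERS}
    # Safe Harbor permits ages, but ages 90+ must be collapsed into one bucket.
    if clean.get("age", 0) >= 90:
        clean["age"] = "90+"
    return clean

record = {"name": "Jane Doe", "ssn": "000-00-0000", "age": 93, "dx": "I10"}
research_row = deidentify(record)
```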

The bottom line is that Longmire’s platform gets app developers out of the privacy and compliance business, at least where it concerns sharing data with hospitals or medical groups.

“Medable allows developers to focus on the content of their apps, instead of on data security, which is not their specialty,” Longmire says.

The global health market was at $6 billion in 2013, but it’s projected to be a $26 billion market by 2017.


Health Data Startup Addresses HIPAA Issues Apple Hasn’t


Health and fitness-tracking apps and devices are set to take off. The growth of the area, further propelled by platforms developed by Apple (NASDAQ:AAPL) and Samsung (SSNLF.PK), will propel the adoption of these apps and services both by consumers and by healthcare systems and providers. As VentureBeat’s Mark Sullivan reports, these health apps and services will come under the jurisdiction of the Health Insurance Portability and Accountability Act (HIPAA) regulations over the privacy of personal health data.

Those regulations were widened last year to safeguard users’ “protected health information” not only at clinics, hospitals, and insurance companies, but also in computer systems that manage health data, and as apps blur the distinction between services created for consumers and services created for the healthcare industry, developers will need to consider how to make their apps’ handling of data HIPAA-compliant.

Medable, a Palo Alto startup co-founded by Stanford physician Michelle Longmire, is prepared for that challenge. It offers a platform that enables the easy development of apps that comply with HIPAA’s security and privacy regulations. Apps built on Medable’s platform will be able to safely and legally share users’ data with healthcare providers, making it possible for developers to build apps where users will be able to communicate with doctors, nurses, and caregivers, plus track, visualize, and share the health-related data that they collect with their smartphone and any connected devices. Medable offers developers a variety of options, including its platform as a service, cloud services, an assortment of APIs, and an integrator partner program. Though Medable is a mobile-first service, the platform enables developers to build desktop, tablet, and web apps as well.

In a post on Medable’s blog, Trevor Goss writes that the company’s mission is “to make health data universally accessible and connected.” Goss refers to Medable as “the world’s first medical-grade platform-as-a-service,” which will enable developers, doctors, hospitals, medical device manufacturers, and others to quickly and “easily build HIPAA-compliant applications and services.” Medable says “medical-grade” refers to the platform’s ability to support both clinical applications — with features such as communication between healthcare providers and patients, collaboration among patients and multiple providers or providers and multiple patients or other providers, and patient-controlled data sharing — plus personal health information — compliant with HIPAA and compatible with wearables, implantables, and in-home devices.

Sullivan reports that developers will be able to use Medable to build new apps, or integrate its features into existing apps. Longmire told VentureBeat that Medable offers features like patient and provider profiles, two-factor authentication, and security-conscious push messaging. They’re all available in Medable’s software development kit and API. Medable uses the HL7 clinical data format, also used by the majority of health record systems, so that it can integrate with or exchange data with any record system that uses the format. The platform can also anonymize large amounts of data for clinical study, and enables both HIPAA auditing and clinical data reporting. 

As Sullivan puts it, the Medable platform “gets app developers out of the privacy and compliance business, at least where it concerns sharing data with hospitals or medical groups.” Longmire tells him that, “Medable allows developers to focus on the content of their apps, instead of on data security.” 

Sullivan noted in June that what determines whether HIPAA requirements apply to a given app is “who is handling the data.” In the past, consumer apps have been clearly separated from apps intended for doctors and other healthcare providers. But with the growing prevalence of cloud services that enable the uploading and sharing of data, those lines blur. Apps that enable consumers to transmit their data to the cloud, where healthcare providers access it and can provide feedback, will likely need to be HIPAA-compliant because the widening regulations can be interpreted to include app developers whose apps “manage and transmit” protected health information.

Both Apple and Samsung will have HIPAA regulations to contend with as they develop HealthKit and SAMI, as both platforms are clearly intended to collect and send patient health data. That’s especially clear given that both companies are reportedly working with Epic, an electronic health record software provider. But neither company has yet unveiled detailed plans for data security in those platforms.

Since Apple’s HealthKit allows for apps to share data with each other, HIPAA compliance should become especially important for apps integrated with the platform. But just as app developers are unlikely to want to get into the privacy compliance business, Apple and Samsung aren’t likely to want to actively enforce HIPAA compliance as a requirement for apps to be accepted into their app stores. That’s partially because HIPAA was written well before the development of the iPhone, and even with last year’s amendments, the terminology it uses leaves some room for interpretation.

Following Apple’s announcement of HealthKit and the corresponding Health app at the Worldwide Developers Conference, several websites have posted guides for developers who are researching the daunting task of complying with HIPAA’s regulations without much guidance from Apple. Most developers simply don’t know much about the regulations, and how they relate to the apps and services that new technologies and platforms make possible. But HIPAA places responsibility on the shoulders of developers, who will need to make sure that apps that deal with protected health information account for privacy and security in communications, notifications, data sharing, and data storage.

Developers have a few options, like Medable, to remove the burden of compliance from their shoulders. TrueVault provides a secure and HIPAA-compliant API for the storage of health data, and Accountable offers HIPAA compliance management as a service. Medable’s Longmire told Stephanie Baum of MedCity News last year that she wanted Medable to be “one of the key utilities for clinical health.” At the time, Longmire estimated that HIPAA compliance represented as much as 80 percent of app development costs, and said that the process could delay an app’s release for up to a year. Longmire told Baum that she envisions Medable as a solution to save developers both time and money:

She sees plenty of scope for Medable’s platform to be used by everyone from small developers to health systems and companies across the health ecosystem. Longmire likens the company’s business model to Dropbox: there’s a freemium tier, but pricing scales with data utilization.

Though the development of the health app sphere is largely still in its infancy, it seems inevitable that many, if not most, of the health-related apps that consumers will use in the future will share data with doctors, clinics, or hospitals. That makes HIPAA regulation of apps and services not only inevitable but truly necessary, and Medable seems to have hit on an idea that could turn out to be a smart and far-reaching solution for developers building for Apple, Samsung, and a variety of other platforms.

The introduction of platforms like HealthKit and SAMI should represent a turning point in discussions about privacy and security compliance, so that it will be more clear what apps and services need to do to be compliant and secure while delivering innovation to consumers and healthcare professionals. But for individual app developers, a service like Medable may be all they need.

If several — or even one — health apps are built on Medable’s platform, that could set a precedent and get more developers on board, both with HIPAA and with Medable. HIPAA-compliant apps that collect patient data, enable better communication between doctor and patient, and are compatible with the records systems that most healthcare providers already use are a benefit for patients, healthcare providers, and regulators. These apps and services, intended for both consumers and providers, will very likely represent the future of health apps.


Why Are Telemedicine Systems So Expensive? | EMR and HIPAA


Like many other enabling-technologies in healthcare, telemedicine has vast unrealized potential.

If we make location completely irrelevant and can deliver care virtually, we can address the supply and demand imbalance plaguing healthcare. The benefits to patients would be enormous: lower costs and improved access in ways that are unimaginable in the analog era.

However, one of the many roadblocks to adoption is the cost of the legacy technology powering clinical telemedicine use. In this post, I’ll outline why the telemedicine systems are so expensive, even in the era of Skype and other free video-conferencing systems.

The Telemedicine Industry Is Old…School

Telemedicine as an industry has existed for about 15 years, although uses of telemedicine certainly predate that by another 10-20 years. A decade and a half ago, the foundational technologies that enable video-conferencing simply weren’t broadly available. Specifically, early telemedicine companies had to:

1) Develop and maintain proprietary codecs
2) Design and assemble hardware (e.g. proprietary cameras) and device drivers
3) Deploy hardware at each client site and train end users on its management
4) Build an expensive outside sales force to carry these systems door-to-door to sell them
5) Endure long, grant funding-driven sales cycles

Though some of these challenges have been commoditized over the years, many of the legacy players still manage and maintain the above functions in-house. This drives up costs, which in turn must be passed on to customers. Since many customers initially paid for telemedicine systems with grant money (which telemedicine technology companies helped them write for and receive), the market has historically lacked forces to drive down prices. Funny how that seems to be a recurring theme in healthcare!

But, there’s a better way

Today, many startups are building robust telemedicine platforms with dramatically lower cost overhead by taking advantage of a number of technologies and trends:

1) Technologies such as WebRTC commoditize the codec layer
2) The smartphones, tablets, and laptops already owned by hospitals (and individual providers) have high quality cameras built into them
3) Cloud providers like Amazon Web Services make it incredibly easy for young companies to build cloud-based technologies
4) Digital and inbound marketing enable smaller (and inside) sales forces to succeed at scale.
5) To reduce the cost of care, providers are increasingly seeking telemedicine systems now, without wading (and waiting) through the grant process of yesteryear.

In short, telemedicine companies today can build dramatically more cost-effective solutions because they don’t have to incur the costs that the legacy players do.

Why don’t the old players adapt?

The simple answer: switching business models is exceedingly difficult. Consider the following:

1) Laying off hardware and codec development teams is not easy, especially given how tightly integrated they are to the rest of the technology stack that has evolved over the past decade

2) Letting go of an outside sales force to drive crafty, cost-effective inside sales is an enormous operational risk

3) Lobbying the government to provide telemedicine grants provides an effectively unlimited well to drink from

Changing business models is exceedingly difficult, and few companies can do it successfully. But telemedicine is no different from all the other businesses that thought they were un-disruptable. Like all other technologies, telemedicine must adapt from legacy, desktop-centric, on-premise solutions to modern, cloud-based, mobile- and wearable-first solutions.


Jocelyn Samuels to head HHS Office for Civil Rights

Jocelyn Samuels has been named the next director of the Office for Civil Rights, the unit within the U.S. Department of Health and Human Services that enforces HIPAA compliance, an OCR spokeswoman confirmed Tuesday in an email to FierceHealthIT.

Samuels replaces Leon Rodriguez, who was confirmed as director of U.S. Citizenship and Immigration Services, a unit of the Department of Homeland Security. He had held the top OCR post since 2011.

The OCR is expected to ramp up HIPAA audits this fall, though with a narrower focus, and an OCR attorney has warned that the whopping fines of the past year will "pale in comparison" to those coming in the next 12 months.

Samuels comes from the Department of Justice, where she is the acting assistant attorney general for the Civil Rights Division.

Former OCR senior privacy and security adviser David Holtzman, considered a prime candidate to replace Rodriguez, left in November. Another potential candidate, Susan McAndrew, OCR's deputy director for health information privacy and security, who had worked on the HIPAA Privacy Rule for HHS since May 2000, has retired.

Massive change is coming to HHS. In January, Karen DeSalvo took over as National Coordinator for Health IT. Sylvia Mathews Burwell was named HHS secretary in early June.

A number of HHS executives are leaving, including Mike Hash, director of the Office of Health Reform; Gary Cohen, director of the Center for Consumer Information and Insurance Oversight; CMS principal deputy administrator Jonathan Blum; Lygeia Ricciardi, director of the Office of Consumer eHealth at the Office of the National Coordinator for Health IT; and Joy Pritts, ONC's first chief privacy officer.

In late May, the agency revealed plans to reorganize, cutting the number of offices within the agency from 17 to 10.


Health system caught up in an $800,000 breach | HIPAA Update


The hits just keep on coming. HHS announced June 23 that OCR entered into a resolution agreement and an $800,000 settlement with Parkview Health System, Inc., of Fort Wayne, Indiana, for alleged HIPAA Privacy Rule violations.

Parkview obtained the medical records of 5,000–8,000 patients while helping Dr. Christine Hamilton transition her patients to new providers upon her retirement. It was believed that the health system was interested in purchasing a portion of Dr. Hamilton’s practice. Parkview failed to safeguard the PHI of these patients when its employees left 71 cardboard boxes of these medical records outside the physician’s home while she was not there. The home is within 20 feet of a public road and is near a shopping center, according to the press release.

The resolution agreement notes that Dr. Hamilton filed the complaint against Parkview. The investigation revealed that when Parkview employees left the medical records at Dr. Hamilton’s home, they were aware that she was not there and had previously refused delivery of the records.

Parkview’s corrective action plan states that it will do the following:

  • Develop, maintain, and revise written HIPAA Privacy Rule policies and procedures for its workforce with HHS approval
  • Distribute HHS-approved policies and procedures to members of its workforce
  • Ensure that new, approved policies and procedures provide for administrative, technical, and physical safeguards to protect PHI
  • Notify HHS in writing within 30 days of a violation of the new, approved policies and procedures
  • Provide general safeguards training for its workforce members who have access to PHI

Hospitals 'very sloppy' about security efforts


Healthcare facilities are constantly in danger of being hacked and having data stolen, but two researchers have found that many hospitals themselves leak valuable information online.

The data leaks result from network administrators enabling Server Message Block (SMB) in a configuration that broadcasts the data externally, researchers Scott Erven, head of information security for Essentia Health, and Shawn Merdinger, an independent healthcare security researcher and consultant, explained in a recent Wired article.

SMB is a protocol used by administrators to quickly identify, locate and communicate with computers and equipment connected to an internal network, according to the article. Erven and Merdinger found that hospitals misconfigure the SMB service, which allows outsiders to see it. 
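The misconfiguration the researchers describe shows up at the network edge as SMB's TCP port 445 answering on a public address. The sketch below is a generic reachability probe (not the researchers' tooling, and the address used is a reserved test address); it should only be pointed at hosts you are authorized to test:

```python
import socket

def smb_port_open(host: str, timeout: float = 1.0) -> bool:
    """Return True if TCP 445 (SMB) accepts connections from this vantage point."""
    try:
        with socket.create_connection((host, 445), timeout=timeout):
            return True
    except OSError:
        # Connection refused, unreachable, or timed out: port not exposed from here.
        return False

# 192.0.2.1 is a reserved TEST-NET-1 address, so this should report closed.
exposed = smb_port_open("192.0.2.1", timeout=0.5)
```

Running a check like this against an organization's own public address range, from outside the firewall, is one quick way to confirm SMB is not leaking to the internet.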

Security issues at healthcare facilities are nothing new, and the SMB protocol vulnerability is just another problem to add to a growing list of ways information can be compromised.

"It goes to show that healthcare [organizations are] very sloppy in configuring their external edge networks and are not really taking security seriously," Erven told Wired.

He added that the problems can occur because of too much focus on HIPAA compliance--which causes providers to pay too little attention to testing and securing their systems.

With a spike in HIPAA fines possible, healthcare facilities may be even more focused on compliance with those standards than on properly securing their networks.

To that end, even a recent White House report pointed out that HIPAA compliance might not be enough to ensure privacy in the electronic age.



Tracking confidential data a major worry in healthcare security


Uncertainty about where sensitive and confidential data is located causes more worry for security pros than hackers or malicious employees, according to a new survey from the Ponemon Institute.

The report, based on a poll of 1,587 IT security practitioners in 16 countries, focuses on the state of data-centric security, which it describes as a security policy that follows data wherever it is replicated, copied or integrated.



OCR attorney predicts spike in HIPAA fines


The Office for Civil Rights' crackdown on HIPAA violations over the past year will "pale in comparison" to the next 12 months, a U.S. Department of Health and Human Services attorney recently told an American Bar Association conference.

Jerome B. Meites, OCR chief regional counsel for the Chicago area, said that the office wants to send a strong message through high-impact cases, according to Data Privacy Monitor.

The Office for Civil Rights has been levying fines to make healthcare entities take notice: nine settlements since June 1, 2013, have totaled more than $10 million. That includes a record $4.8 million fine announced in May against New York-Presbyterian Hospital and Columbia University.


"Knowing what's in the pipeline, I suspect that that number will be low compared to what's coming up," Meites said in the article.

The OCR has said that when it resumes HIPAA audits this fall, the investigations will have a narrow focus and there will be fewer onsite visits. Meites told the American Bar Association that the OCR still has to decide which organizations it will select for an audit from a list of 1,200 candidates: 800 healthcare providers, health plans or clearinghouses, and 400 of their business associates.

A report last December from the Office of Inspector General criticized the OCR's enforcement of the HIPAA provisions, including inadequate focus on system and data security.

Meanwhile, the number of breaches on the U.S. Department of Health and Human Services' "wall of shame" topped 1,000 this month, with at least 34 breaches so far in June. The records of nearly 31.7 million people have been exposed since federal reporting was mandated in September 2009.


iOS changes will address HIPAA risk | Healthcare IT News


Imagine if almost everyone walking into your hospital – patients, doctors, visitors, salespeople – was carrying an active homing beacon, which broadcast, unencrypted, their presence and repeatedly updated exact location to anyone who chose to listen.

[See also: Where will HIT security be in 3 years?]

That's where things stand today, courtesy of the mobile MAC address signal (it stands for media access control), a unique ID coming from every smartphone, tablet and wearable device.

But not for long, given upcoming changes to how Apple products will handle MAC address broadcasts –  a move almost certain to be copied by Google's Android.

[See also: 'Troubling disconnect' between mobile security threats and protections in place]

Apple's iOS 8 change, focusing initially on how MAC addressing interacts with Wi-Fi scans, will shift to using "randomly, locally administered" MAC addresses. The result, according to Apple: "The MAC address used for Wi-Fi scans may not always be the device's real – universal – address." (That description is on page 18 of an Apple PDF, available here.)

As a practical matter, using this kind of a randomized bogus address approach will make tracking people via mobile devices impossible or, at best, impractical, depending on the level of randomization used and how often – if ever – the true MAC address is broadcast.
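Apple has not published its exact algorithm, but the general technique, setting the "locally administered" bit and clearing the multicast bit in the first octet so the address is marked as non-universal, can be sketched as follows (a toy illustration, not Apple's implementation):

```python
import random

def random_private_mac():
    """Generate a randomized, locally administered unicast MAC address.

    Setting the locally-administered bit (0b10) and clearing the
    multicast bit (0b01) in the first octet marks the address as
    non-universal, so it cannot be traced back to real hardware.
    """
    octets = [random.randrange(256) for _ in range(6)]
    octets[0] = (octets[0] | 0b10) & ~0b01 & 0xFF
    return ":".join(f"{o:02x}" for o in octets)
```

Because any compliant randomization scheme produces addresses flagged as locally administered, a tracker can no longer assume two sightings of the same address are the same device.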

It will still be months before Apple releases this new version of its mobile OS publicly (it's now solely with developers), weeks and maybe months before most consumers will upgrade and longer still before others – especially Google's Android – mimic the move.

That means that, for now, this security privacy risk is still a very real threat.

The risk is twofold. First, there is the potential for a renegade member of the hospital's staff to track people. Second, there exists the possibility that hospital visitors could wirelessly track other hospital visitors.

In the first scenario, tracking doctors and other hospital staff is not as much of a concern, as they could just as easily be tracked the instant they log into the hospital's local area network, so the MAC address broadcast is not necessary. With visiting cyberthieves or stalkers, however, anyone with a mobile device is a potentially tracked victim.

The security risk is that a specific MAC address would be tracked over time, showing all travel activity within the hospital. Retail offers a great example of the risk: Retailers work with vendors who have contracts with lots of other retailers. This allows those companies to create – and to then sell – detailed reports of every store and mall and parking lot that a MAC address visits. By overlaying it with purchase records, that address can be associated with specific purchases. If those purchases used a payment card or loyalty card, that MAC address can then be associated with a specific person.
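As a purely hypothetical illustration of the retail-tracking risk described above, correlating a sensor log of MAC sightings with time-stamped purchase records is a trivial join; every name and record below is invented:

```python
from collections import defaultdict

# Hypothetical data: MAC sightings logged by in-store Wi-Fi sensors,
# and time-stamped purchases made at the same locations.
sightings = [
    {"mac": "aa:bb:cc:dd:ee:ff", "store": "pharmacy", "t": 1000},
    {"mac": "aa:bb:cc:dd:ee:ff", "store": "mall", "t": 2000},
]
purchases = [
    {"store": "pharmacy", "t": 1010, "card": "card-123"},
]

def link_mac_to_cards(sightings, purchases, window=60):
    """Associate a MAC address with payment cards used nearby in time and place."""
    linked = defaultdict(set)
    for s in sightings:
        for p in purchases:
            if p["store"] == s["store"] and abs(p["t"] - s["t"]) <= window:
                linked[s["mac"]].add(p["card"])
    return linked
```

Once a card (or loyalty account) is linked to a MAC address this way, the address, and hence the device's entire movement history, is tied to a named person.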


Big Data, My Data - iHealthBeat


"The routine operation of modern health care systems produces an abundance of electronically stored data on an ongoing basis," Sebastian Schneeweis writes in a recent New England Journal of Medicine Perspective.

Is this abundance of data a treasure trove for improving patient care and growing knowledge about effective treatments? Is that data trove a Pandora's black box that can be mined by obscure third parties to benefit for-profit companies without rewarding those whose data are said to be the new currency of the economy? That is, patients themselves?

In this emerging world of data analytics in health care, there's Big Data and there's My Data ("small data"). Who most benefits from the use of My Data may not actually be the consumer.

Big focus on Big Data. Several reports published in the first half of 2014 talk about the promise and perils of Big Data in health care. The Federal Trade Commission's study, titled "Data Brokers: A Call for Transparency and Accountability," analyzed the business practices of nine "data brokers," companies that buy and sell consumers' personal information from a broad array of sources. Data brokers sell consumers' information to buyers looking to use those data for marketing, managing financial risk or identifying people. There are health implications in all of these activities, and the use of such data generally is not covered by HIPAA. The report discusses the example of a data segment called "Smoker in Household," which a company selling a new air filter for the home could use to target-market to an individual who might seek such a product. On the downside, without the consumers' knowledge, the information could be used by a financial services company to identify the consumer as a bad health insurance risk.

"Big Data and Privacy: A Technological Perspective," a report from the President's Office of Science and Technology Policy, considers the growth of Big Data's role in helping inform new ways to treat diseases and presents two scenarios of the "near future" of health care. The first, on personalized medicine, recognizes that not all patients are alike or respond identically to treatments. Data collected from a large number of similar patients (such as digital images, genomic information and granular responses to clinical trials) can be mined to develop a treatment with an optimal outcome for the patients. In this case, patients may have provided their data based on the promise of anonymity but would like to be informed if a useful treatment has been found. In the second scenario, detecting symptoms via mobile devices, people wishing to detect early signs of Alzheimer's Disease in themselves use a mobile device connecting to a personal coach in the Internet cloud that supports and records activities of daily living: say, gait when walking, notes on conversations and physical navigation instructions. For both of these scenarios, the authors ask, "Can the information about individuals' health be sold, without additional consent, to third parties? What if this is a stated condition of use of the app? Should information go to the individual's personal physicians with their initial consent but not a subsequent confirmation?"

The World Privacy Foundation's report, titled "The Scoring of America: How Secret Consumer Scores Threaten Your Privacy and Your Future," describes the growing market for developing indices on consumer behavior, identifying over a dozen health-related scores. Health scores include the Affordable Care Act Individual Health Risk Score, the FICO Medication Adherence Score, various frailty scores, personal health scores (from WebMD and OneHealth, whose default sharing setting is based on the user's sharing setting with the RunKeeper mobile health app), Medicaid Resource Utilization Group Scores, the SF-36 survey on physical and mental health and complexity scores (such as the Aristotle score for congenital heart surgery). WPF presents a history of consumer scoring beginning with the FICO score for personal creditworthiness and recommends regulatory scrutiny on the new consumer scores for fairness, transparency and accessibility to consumers.

At the same time these three reports went to press, scores of news stories emerged discussing the Big Opportunities Big Data present. The June issue of CFO Magazine published a piece called "Big Data: Where the Money Is." InformationWeek published "Health Care Dives Into Big Data," Motley Fool wrote about "Big Data's Big Future in Health Care" and WIRED called "Cloud Computing, Big Data and Health Care" the "trifecta."

Well-timed on June 5, the Office of the National Coordinator for Health IT's Roadmap for Interoperability was detailed in a white paper, titled "Connecting Health and Care for the Nation: A 10-Year Vision to Achieve an Interoperable Health IT Infrastructure." The document envisions the long view for the U.S. health IT ecosystem enabling people to share and access health information, ensuring quality and safety in care delivery, managing population health, and leveraging Big Data and analytics. Notably, "Building Block #3" in this vision is ensuring privacy and security protections for health information. ONC will "support developers creating health tools for consumers to encourage responsible privacy and security practices and greater transparency about how they use personal health information." Looking forward, ONC notes the need for "scaling trust across communities."

Consumer trust: going, going, gone? In the stakeholder community of U.S. consumers, there is declining trust between people and the companies and government agencies with whom people deal. Only 47% of U.S. adults trust companies with whom they regularly do business to keep their personal information secure, according to a June 6 Gallup poll. Furthermore, 37% of people say this trust has decreased in the past year. Who's most trusted to keep information secure? Banks and credit card companies come in first place, trusted by 39% of people, and health insurance companies come in second, trusted by 26% of people. 

Trust is a basic requirement for health engagement. Health researchers need patients to share personal data to drive insights, knowledge and treatments back to the people who need them. PatientsLikeMe, the online social network, launched the Data for Good project to inspire people to share personal health information imploring people to "Donate your data for You. For Others. For Good." For 10 years, patients have been sharing personal health information on the PatientsLikeMe site, which has developed trusted relationships with more than 250,000 community members.

On the bright side, there is tremendous potential for My Data to join other peoples' data to drive better health for "Me" and for public health. On the darker side, there is also tremendous financial gain to be made by third-party data brokers to sell people's information in an opaque marketplace of which consumers have no knowledge. Individuals have the most to gain from the successful use of Big Data in health. But people also have a great deal to lose if that personal information is used against them unwittingly.

Deven McGraw, a law partner in the health care practice of Manatt, Phelps & Phillips, recently told a bipartisan policy forum on Big Data in health care, "If institutions don't have a way to connect and trust one another with respect to the data that they each have stewardship over, we won't have the environment that we need to improve health and health care." This is also true for individual consumers when it comes to privacy rights over personal health data.


Will Healthcare Ever Take IT Security Seriously?


CIO - In the years since the HITECH Act, the number of reported healthcare data breaches has been on the rise - partly because organizations have been required to disclose breaches that, in the past, would have gone unreported and partly because healthcare IT security remains a challenge.

Recent research from Experian suggests that 2014 may be the worst year yet for healthcare data breaches, due in part to the vulnerability of the poorly assembled Healthcare.gov.

Hacks and other acts of thievery get the attention, but the root cause of most healthcare data breaches is carelessness: Lost or stolen hardware that no one bothered to encrypt, protected health information emailed or otherwise exposed on the Internet, paper records left on the subway and so on.

What will it take for healthcare to take data security seriously?

Healthcare IT So Insecure It's 'Alarming'

Part of the problem is that healthcare information security gets no respect; at most healthcare organizations, security programs are immature at best, thanks to scarce funding and expertise. As a result, the majority of reported data breaches are, in fact, avoidable events.

[Related: Healthcare IT Security Is Difficult, But Not Impossible]

The recent SANS Health Care Cyber Threat Report underscores this point all too well. Threat intelligence vendor Norse, through its global network of honeypots and sensors, discovered almost 50,000 unique malicious events between September 2012 and October 2013, according to the SANS Institute, which analyzed Norse's data and released its report on Feb. 19. The vast majority of affected institutions were healthcare providers (72 percent), followed by healthcare business associates (10 percent) and payers (6 percent).

SANS uses the words "alarming" and "troubling" often in its analysis. "The sheer volume of IPs detected in this targeted sample can be extrapolated to assume that there are, in fact, millions of compromised health care organizations, applications, devices and systems sending malicious packets from around the globe," writes Senior SANS Analyst and Healthcare Specialist Barbara Filkins.

[ Tips: How to Prevent (and Respond to) a Healthcare Data Breach ]

More than half of that malevolent traffic came from network-edge devices such as VPNs (a whopping 33 percent), firewalls (16 percent) and routers (7 percent), suggesting "that the security devices and applications themselves were either compromised ... or that these 'protection' systems are not detecting malicious traffic coming from the network endpoints inside the protected perimeter," Filkins writes, noting that many vulnerabilities went unnoticed for months. Connected endpoints such as radiology imaging software and digital video systems also accounted for 17 percent of malicious traffic.

Norse executives say this stems from a disconnect between compliance and regulation. Simply put, says Norse CEO Sam Glines, "There is no best practice applied." Many firewall devices with a public-facing interface, for example, still use the factory username and password. The same is true of many surveillance cameras and network-attached devices such as printers - the default passwords for which can be obtained not through hacking but through a simple Internet search. "It's just not good enough in today's market."
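A basic audit for the factory-credential problem Glines describes can be automated by checking an inventory against published default-credential lists. This is a hedged sketch; the model names and defaults below are invented, not a real vulnerability database:

```python
# Hypothetical known factory defaults; a real audit would use published
# default-credential lists for each vendor and model.
KNOWN_DEFAULTS = {
    ("acme-firewall", "admin", "admin"),
    ("acme-camera", "root", "12345"),
}

def audit_devices(inventory):
    """Flag devices still configured with a known factory credential."""
    return [
        d["name"] for d in inventory
        if (d["model"], d["user"], d["password"]) in KNOWN_DEFAULTS
    ]
```

Running a check like this against every IP-addressable device, firewalls, cameras, printers, is exactly the "basic blocking and tackling" work described later in the article.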

The United States would do well to heed the European Union's data breach laws, Glines says, as they take a "categorically different" approach and include specific language about what is and isn't compliant. This could include, for example, specific policies for managing anything connected to an IP address or basic password and access control management measures, he says.

Mobile Health Security Especially Suspect

In the absence of such regulation, though, patient privacy is a myth. Data is shared freely in a hospital setting, for starters, and clinical systems favor functionality over privacy, so much so that privacy and security are often an afterthought in the development lifecycle. This is especially true in the growing mobile health marketplace, which largely places innovation before security.

[Related: Healthcare.gov Still has Major Security Problems, Experts Say]

Harold Smith discovered this all too quickly in December. After Happtique, a mobile health application certification group, released its first list of approved applications, Smith, the CEO of software firm Monkton Health, decided to check out a couple apps.

It wasn't pretty. He installed one on a jailbroken iPhone and, in less than a minute, pulled electronic protected health information (ePHI) from a plain text, unencrypted HTML5 file. He also found that this data - specifically, blood glucose levels - was being sent over HTTP, not HTTPS. "That was the first hint that something was wrong," he says. "That's a pretty big 'Security 101' thing to miss."

A second app, which Smith tested a few days later, also stored ePHI in unencrypted, plain text files. Though this app uses HTTPS, Smith notes in his blog that it sends usernames and passwords in plain text. "That was another big problem," he says.

[Slideshow: 12 Tips to Prevent a Healthcare Data Breach]

Happtique suspended its application certification program in light of Smith's findings, but the application developers themselves (as well as healthcare IT news sites and blogs) glossed over the issue. This irked Smith: "As someone who develops mobile health software, if someone tells me they've found a vulnerability, I get to them right away."

At the very least, Smith says, mobile health applications need a pin screen and data encryption. The bigger issue, though, is the tendency for developers to treat mobile apps like desktop apps. That doesn't work in the "whole new Wild West" of mobile development, Smith says, where databases aren't encrypted and passwords need to be hashed. Five years after the release of the Apple SDK, he adds, people are still trying to figure it all out.
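The password-hashing gap Smith points to has a standard remedy available in the Python standard library. A minimal sketch using salted PBKDF2 (the iteration count is illustrative; production systems should follow current key-derivation guidance):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=200_000):
    """Derive a salted PBKDF2 hash; store the salt and digest, never the password."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, digest, iterations=200_000):
    """Re-derive the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)
```

Sending or storing the plain-text password, as the apps Smith tested did, means a single leaked file or captured request exposes every account.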

Is Red Tape to Blame for Poor Healthcare IT Security?

The same is true of healthcare's privacy and security regulations - which, to put it mildly, are conflicting. Sharon Klein, head of the privacy, security and data protection practice at the law firm Pepper Hamilton, notes that, in the United States, there are 47 different sets of (inconsistent) data breach regulations and multiple regulatory frameworks.

[ Study: Healthcare Industry CIOs, CSOs Must Improve Security ]

If there are overarching standards, they come from the National Institute of Standards and Technology, Klein says, noting the Office for Civil Rights and Department of Health and Human Services have "consistently" used NIST standards. At the same time, other agencies are getting involved:

  • The Federal Trade Commission emphasizes privacy by design in the collection, transmission, use, retention and destruction of data;
  • The Food and Drug Administration's guidance on cybersecurity in medical devices and hospital networks pinpoints data confidentiality, integrity and availability, and
  • The Federal Communications Commission, in the wake of weak 802.11 wireless security, has issued disclaimers regarding text messaging and geolocation with implications for clinical communications.

[ Related: Solving Healthcare's Big Data Analytics Security Conundrum ]

Given the regulatory inconsistencies, Klein says it's best to document everything you're doing and conduct vigorous training and awareness programs for all staff. "Minimum necessary" policies, which limit who gets to see which data and, critically, which change as an individual employee's role evolves, can eliminate unnecessary security holes, as can the appropriate de-identification of data.
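At its simplest, a "minimum necessary" policy like the one Klein describes reduces to filtering records by role before they are displayed. A toy sketch, in which the roles and field names are hypothetical:

```python
# Hypothetical role-to-field mapping; a real policy would come from the
# organization's documented access-control rules and change with roles.
ALLOWED_FIELDS = {
    "billing": {"name", "insurance_id"},
    "physician": {"name", "insurance_id", "diagnosis", "medications"},
}

def minimum_necessary(record, role):
    """Return only the fields the given role is permitted to see."""
    allowed = ALLOWED_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}
```

The key design point is the default: an unknown role sees nothing, rather than everything.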

Software developers have additional priorities. If anything is regulated, isolate it, Klein says, and make sure you disclose to consumers what data you are obtaining, what you intend to do with it, what third parties will have access to it and whom to contact if there is an issue. Startups want to be first to market, she admits, but in the process - as Smith found - they can put security on the back burner, only to scramble to fill the gaps once vulnerabilities are discovered.

Balancing Healthcare IT Security and Accessibility

Experts largely agree that a cogent approach to health data security must balance security and accessibility, whether it's patients, physicians or third parties who want the data. This is especially important as the healthcare industry emphasizes more widespread health information exchange as part of a larger goal to provide more coordinated care.

"Security has for a long time been an afterthought. Now it has to be part of the build," says Glines - adding that, if it isn't, an app simply shouldn't be released.

Smith suggests that developers and security professionals hack iOS apps, as he did, and see for themselves how easy it is. Then, he says, they should ask, "If it's not that difficult, and [I'm] storing all that data on the phone, what can I do beyond what the OS offers?"

As it turns out, there are a "whole litany of things" that application developers can do, even in an ever-changing field. Specifically, Smith points to viaForensics' secure mobile development best practices, which apply to iOS and Android.

Given the findings of the Norse and SANS Institute study, Glines says it's worth having two conversations. One is with network administrators about the "basic blocking and tackling" work, such as actually changing default device passwords, which can bring about "simple, powerful change." The other is with executive staff about the implications of lax security - a conversation unfortunately made easier in the wake of the Target breach, which it turns out stemmed from systemic failures and not a single point of attack.

Regulators won't cut you any slack if a breach occurs, Klein says, especially if you knew vulnerabilities existed and didn't fix them. Under the new HIPAA Omnibus Rule, which went into effect in September 2013, firms face fines of up to $1.5 million in the event of the "willful neglect" of security issues.

Glines says boardrooms are beginning to shift their security mentality, but this will take time to trickle down. "In the next eight to 12 months, we will continue to see more front-page news" about data breaches, he says.


Secure vs. HIPAA Compliant: What’s the Difference for Text Messaging?


The need for physicians and other healthcare team members to be in constant communication with each other has never been higher. Secure texting applications seek to provide healthcare professionals a quick and convenient way to connect while complying with the Health Insurance Portability and Accountability Act (HIPAA) and other privacy regulations.

Text messages are, in principle, an excellent way to transfer information on the go. They are useful in communication between doctors, nurses, office staff and even patients. Text messaging is a viable replacement for older, less efficient technologies such as the pager. Texting is real-time communication in a way email can't match. Physicians have shown an affinity for the method. In a study published in 2014, well over half of physicians at pediatric hospitals reported sending and receiving work-related text messages, and 12 percent said they sent more than 10 messages per shift [1].

Unfortunately, despite being used frequently in healthcare, standard text messages and most “secure” applications lack the encryption and other features needed to avoid potentially costly and embarrassing HIPAA infractions. Such violations, if due to “willful neglect,” can lead to fines of $50,000 per violation, to a maximum of $1.5 million a year [2]. The right physician messaging solution keeps PHI private while making healthcare professionals’ lives easier and improving quality of care. Choosing an app that will truly keep your patients’ data safe, however, can be a challenge because “secure” does not always mean “HIPAA-compliant.” HIPAA-compliant is a much more stringent standard, and unfortunately most applications just aren’t.
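One concrete difference between a "secure-looking" app and a compliance-minded one is refusing outright to transmit PHI over a plaintext channel. A minimal sketch of an endpoint guard a messaging client might apply before sending anything (the URLs are illustrative):

```python
from urllib.parse import urlparse

def require_tls(endpoint):
    """Reject any messaging endpoint that would send PHI in the clear."""
    scheme = urlparse(endpoint).scheme.lower()
    if scheme != "https":
        raise ValueError(f"PHI must not be sent over {scheme or 'an unknown scheme'}")
    return endpoint
```

Transport encryption alone is not HIPAA compliance, encryption at rest, access controls and audit logging matter too, but failing this check is an immediate disqualifier.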



HIPAA Blog

This article has popped up several places in my morning reading.  They are probably right; in fact, some big health data hacks have probably already occurred, but we just don't know about them yet because we don't yet know how the data is being used and aren't able to see it.  There are probably millions of individual instances of medical identity theft occurring every day, from the voluntary "sharing" of insurance by cooperative parties (your brother has insurance through his job but you don't so you go to a doctor and pretend to be him so that his insurance will pay for your care) to identity theft facilitated by insiders (a nurse or receptionist issues multiple Oxycontin prescriptions to a legitimate pain patient, but sends the extras to a friend who fills them and resells the pills) to pure identity theft (a hacker gains medical identities and sells them to people who use the unwitting victim's insurance to pay for their care). 

Medical identity theft can be much more lucrative than stealing credit card info, since medical information is more persistent and credit card info is more transitory (you can get a new credit card number, not a new medical history). That said, you need a purchaser who needs healthcare to complete a medical identity theft, whereas credit card info can always be used immediately.


What HIPAA doesn't cover | Healthcare IT News


Sure, HIPAA adds a layer of privacy protection for certain health data -- if organizations actually comply with it -- but there remain myriad avenues for mining health data and selling it to the highest bidder that do not fall under the purview of HIPAA's privacy and security rules. And they may surprise you.

Anything from what health data one Googles to what medical products you purchase through online retailers is fair game for data brokers. What's more, these companies are not liable under HIPAA and are able, without an individual's consent, to track and collect health data for various purposes, says a new July report from the California Healthcare Foundation.

[See also: FTC calls out data brokers on privacy.]

Often unknown to consumers, data elements including Googling for health information; using medical-related social networks; purchasing health products through online retailers; entering retail store preferences and locations into smartphones; or even buying any item related to health, like fast food and cigarettes, can all be tracked.

"Even consumer footprints that are not expressly about health can be used to help determine a person's physical or mental health. How we shop, the magazines we subscribe to, where we hang out on the weekend -- this information is relatively easy to purchase by third parties," wrote Jane Sarasohn-Kahn, health economist and author of the report.

Sarasohn-Kahn pointed to a 2014 report from 60 Minutes in which Tim Sparapani, former director of public policy for Facebook, said, "You can buy from any number of data brokers, by malady, the list of individuals in America who are afflicted with a particular disease or condition."

Sure, oftentimes these data elements are collected and tracked not for malevolent purposes but rather for improving clinical outcomes and reducing costs. The report cites data mining as integral in bettering clinical trials and managing chronic disease, for instance. One particular instance included designing a recruitment strategy for a Hepatitis C vaccine trial, where researchers located patient influencers on Twitter, contacted them and asked them to publicize the vaccine trial.

However, even with these seemingly positive end goals, many individuals and stakeholders have expressed concern over privacy rights and the current lack of transparency. Even the Federal Trade Commission has expressed concern over the unfettered access these data brokers have to consumer health information, without the consumer's consent.

In a May report, the FTC underscored the practices of nine data brokers and revealed that most consumers are unaware these brokers are collecting data. Just one of the data brokers in the report, Acxiom, had more than 3,000 data segments for nearly every U.S. consumer.

"To close these gaps, I urge Congress to consider legislative provisions -- in addition to the provisions recommended by the Commission -- that would create greater accountability for data suppliers, data brokers and data broker clients," wrote FTC Commissioner Julie Brill in a May 27 statement to Congress.

Sarasohn-Kahn underlined several recommendations put forth by stakeholders on how to properly balance data sharing with consumers' privacy rights:

  • Help people gain control. For some stakeholders, this means getting consent from consumers. And for others, consent fails to offer "meaningful protections."
  • Simplify the fragmented regulatory environment.
  • Consider personal health data lockers and clouds.


Big Data in Health Care: Using Analytics to Identify and Manage High-Risk and High-Cost Patients - CHCF.org


As a result of greater adoption of electronic health records, health care organizations have increased opportunities to analyze and interpret large quantities of patient information, known as big data, to better manage high-risk and high-cost patients.

The July 2014 issue of the journal Health Affairs explores the promise of big data to improve health care. In one article, supported by CHCF, the authors examine six examples in which mining big data can improve care and reduce expenses in hospital settings:

  1. Identifying high-cost patients can help determine which patients are most likely to benefit from interventions and which care plans can best improve their care.
  2. Using predictive algorithms to foresee potential readmissions can enable more precise interventions and care coordination after discharge.
  3. Integrating triage algorithms into the clinical workflow can help manage staffing, patient transfers, and beds.
  4. Some ICUs are using analytics to evaluate multiple data streams from patient monitors to predict whether a patient's condition is likely to worsen.
  5. By uncovering unique data patterns, such as prescription drug use and vital sign changes, other systems can help prevent renal failure, infections, and adverse drug events.
  6. Data from multisite disease registries and clinical networks will help manage patients with chronic conditions that span more than one organ system.
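The predictive approaches in items 2 and 4 can be pictured as a weighted risk score squeezed through a logistic function. Below is a minimal sketch with entirely illustrative weights and features -- an assumption for teaching purposes, not a validated clinical model:

```python
import math

# Hypothetical feature weights -- illustrative only, not a validated model.
WEIGHTS = {
    "prior_admissions": 0.45,    # admissions in the last 12 months
    "chronic_conditions": 0.30,  # count of chronic diagnoses
    "lives_alone": 0.6,          # 1 if no caregiver at home, else 0
}
BIAS = -3.0

def readmission_risk(patient: dict) -> float:
    """Return a 0-1 probability-like readmission risk score."""
    z = BIAS + sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))  # logistic squashing

high_risk = readmission_risk(
    {"prior_admissions": 4, "chronic_conditions": 3, "lives_alone": 1}
)
low_risk = readmission_risk(
    {"prior_admissions": 0, "chronic_conditions": 0, "lives_alone": 0}
)
```

A real system would learn the weights from historical admissions data; the point here is only that the output is a ranked score care coordinators can act on.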

While big data and analytics are powerful tools, the authors say more systematic evaluation is needed to move from potential to realization in many areas. And questions remain on how to regulate analytics and provide adequate patient privacy.

Medable promises an easy way to make health apps comply with health data laws

Many health-related apps and devices will be hitting the market in the next year or two. And the data that these apps and devices collect could help your doctor provide a more holistic picture of your health.

But, as I wrote a few weeks ago, when that health data crosses the line from consumer health cloud into the healthcare delivery system, HIPAA privacy rules will come into play.

One company, started by a Stanford physician, has foreseen this challenge to device and app developers, and is offering a way to easily comply with HIPAA’s often stringent rules. These “medical grade” apps can then safely share data with clinical systems.

“With Medable, mobile apps can make it easy for users to communicate with their doctors, nurses, and caregivers, and also to provide them with any kind of data originating from their mobile devices,” company co-founder Dr. Michelle Longmire tells VentureBeat. “That lets everyone receive the data, visualize it, and then communicate about it in a very natural way.”

Health app developers can use the platform to build new applications or to integrate Medable features into existing applications, Longmire says. Medable also offers numerous application features like patient and provider profiles, two-factor authentication, and “push” messaging. These features are delivered through a software development kit (SDK) and an application programming interface (API).

“If push messages are sent to care providers, they contain only the metadata, not any identifiable information,” Longmire explains. “So a physician might receive a message saying ‘an image is available for you,’ but the doctor would need to log in to get the image.”
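Longmire's metadata-only push pattern can be sketched as a payload builder that strips PHI before anything leaves the platform. The field names and payload shape below are assumptions for illustration, not Medable's actual API:

```python
# Fields treated as PHI in this sketch -- hypothetical names for illustration.
PHI_FIELDS = {"patient_name", "mrn", "image_bytes", "diagnosis"}

def build_push_payload(record: dict) -> dict:
    """Keep only non-identifying metadata; PHI stays behind the login."""
    payload = {k: v for k, v in record.items() if k not in PHI_FIELDS}
    payload["message"] = "An image is available for you. Log in to view it."
    return payload

record = {
    "record_id": "img-1042",
    "resource_type": "image",
    "patient_name": "Jane Doe",  # PHI: must not appear in the push
    "mrn": "000123",             # PHI
    "image_bytes": b"...",       # PHI
}
payload = build_push_payload(record)
```

The doctor's device receives only the pointer and the generic message; fetching the image itself requires an authenticated session.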

Longmire says Medable uses the HL7 clinical data format, so it can integrate with, and exchange data with, any electronic health record system that uses HL7 format, and the majority of them do.
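For context, HL7 v2.x messages are pipe-delimited text: segments separated by carriage returns, fields by `|`. A hand-assembled toy example follows -- all values are placeholders, and real integrations would use a proper HL7 library rather than string formatting:

```python
def build_hl7_adt(msg_id: str, patient_id: str, family: str, given: str) -> str:
    """Assemble a minimal HL7 v2.5 ADT^A01 message (illustrative only)."""
    segments = [
        # MSH: message header (field separator |, encoding characters ^~\&)
        f"MSH|^~\\&|MOBILE_APP|CLINIC|EHR|HOSPITAL|20140701120000||ADT^A01|{msg_id}|P|2.5",
        # PID: patient identification (ID, then family^given name)
        f"PID|1||{patient_id}||{family}^{given}",
    ]
    return "\r".join(segments)  # HL7 v2 separates segments with carriage returns

msg = build_hl7_adt("MSG0001", "12345", "DOE", "JANE")
```

Because the format is plain delimited text, any EHR that speaks HL7 v2 can parse such a message, which is what makes it a practical lowest common denominator for exchange.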

The main concern of HIPAA rules is guarding “protected health information” or “PHI” from the eyes of those who don’t need to see it for clinical purposes.

Longmire says the Medable platform encrypts all PHI in several ways — on the device, in transit and then on the Medable platform.
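The encrypt-everywhere principle is that PHI becomes ciphertext before it leaves the device, and only a key holder can reverse it. The XOR one-time pad below is a teaching stand-in, not Medable's implementation and not production cryptography -- real systems use vetted ciphers (e.g. AES via an audited library):

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each data byte with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

phi = b'{"patient": "Jane Doe", "glucose_mg_dl": 104}'
key = secrets.token_bytes(len(phi))  # one-time key, never sent with the data

ciphertext = xor_bytes(phi, key)        # what gets transmitted or stored
recovered = xor_bytes(ciphertext, key)  # decryption with the same key
```

On device, in transit, and at rest then simply mean applying (and key-managing) this transformation at each of the three stages.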

The Medable platform can also anonymize large amounts of clinical data so that researchers can study it. Additionally, Medable provides all of the capability needed for HIPAA auditing and clinical data reporting.
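Anonymization for research typically follows HIPAA's Safe Harbor pattern: strip direct identifiers and generalize quasi-identifiers. The sketch below covers a small, hypothetical subset of Safe Harbor's 18 identifier categories, purely for illustration:

```python
# A few direct-identifier fields -- a small illustrative subset, not the
# full list of 18 Safe Harbor identifier categories.
DIRECT_IDENTIFIERS = {"name", "mrn", "ssn", "email", "phone", "address"}

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and generalize ages over 89."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if isinstance(clean.get("age"), int) and clean["age"] > 89:
        clean["age"] = "90+"  # Safe Harbor groups ages over 89
    return clean

record = {"name": "Jane Doe", "mrn": "000123", "age": 93, "hba1c": 7.2}
research_record = deidentify(record)
```

The clinical values survive for researchers while the fields that could re-identify the patient do not.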

The bottom line is that Longmire’s platform gets app developers out of the privacy and compliance business, at least where it concerns sharing data with hospitals or medical groups.

“Medable allows developers to focus on the content of their apps, instead of on data security, which is not their specialty,” Longmire says.

The global mobile health market was worth $6 billion in 2013 and is projected to grow to $26 billion by 2017.

Chinese Hackers Pursue Key Data on U.S. Workers - NYTimes.com

WASHINGTON — Chinese hackers in March broke into the computer networks of the United States government agency that houses the personal information of all federal employees, according to senior American officials. They appeared to be targeting the files on tens of thousands of employees who have applied for top-secret security clearances.

The hackers gained access to some of the databases of the Office of Personnel Management before the federal authorities detected the threat and blocked them from the network, according to the officials. It is not yet clear how far the hackers penetrated the agency’s systems, in which applicants for security clearances list their foreign contacts, previous jobs and personal information like past drug use.

In response to questions about the matter, a senior Department of Homeland Security official confirmed that the attack had occurred but said that “at this time,” neither the personnel agency nor Homeland Security had “identified any loss of personally identifiable information.” The official said an emergency response team was assigned “to assess and mitigate any risks identified.”

One senior American official said that the attack was traced to China, though it was not clear if the hackers were part of the government. Its disclosure comes as a delegation of senior American officials, led by Secretary of State John Kerry, is in Beijing for the annual Strategic and Economic Dialogue, the leading forum for discussion between the United States and China on their commercial relationships and their wary efforts to work together on economic and defense issues.

Computer intrusions have been a major source of discussion and disagreement between the two countries, and the Chinese can point to evidence, revealed by Edward J. Snowden, that the National Security Agency went deep into the computer systems of Huawei, a major maker of computer network equipment, and ran many programs to intercept the conversations of Chinese leaders and the military.

American officials say the attack on the Office of Personnel Management was notable because while hackers try to breach United States government servers nearly every day, they rarely succeed. One of the most recent attacks the government has acknowledged occurred last year at the Department of Energy. In that case, hackers successfully made off with employees' and contractors' personal data. The agency was forced to reveal the attack because state disclosure laws require entities to report breaches in which personally identifiable information is compromised. Government agencies do not have to disclose breaches in which sensitive government secrets, but no personally identifiable information, have been stolen.

Just a month ago, the Justice Department indicted a group of Chinese hackers who work for the People’s Liberation Army Unit 61398, and charged them with stealing corporate secrets. The same unit, and others linked to the P.L.A., have been accused in the past of intrusions into United States government computer systems, including in the office of the secretary of defense.

But private security researchers say the indictments have hardly deterred the People’s Liberation Army from hacking foreign targets, and American officials are increasingly concerned that they have failed in their effort to deter computer attacks from China or elsewhere. “There’s no price to pay for the Chinese,” one senior intelligence official said recently, “and nothing will change until that changes.”

The indictments have been criticized as long on symbolism and short on real punishment: There is very little chance that the Chinese military members would ever see the inside of an American courtroom, even if the F.B.I. has put their pictures on wanted posters.

“I think that it was speaking loudly and carrying a small stick,” said Dennis Blair, the former director of national intelligence during President Obama’s first term, who was a co-author of a report last year urging that the United States create a series of financial disincentives for computer theft and attacks, including halting some forms of imports and blocking access to American financial markets.

Not long after several members of Unit 61398 were indicted, security researchers were able to pin hundreds more cyberattacks against American and European space and satellite technology companies and research groups on a second Shanghai-based Chinese military unit, known as Unit 61486. Researchers say that even after the Americans indicted their counterparts in Unit 61398, members of Unit 61486 have shown no signs of scaling back.

The same has proved true for the dozen other Chinese military and naval units that American officials have been tracking as they break into an ever more worrisome list of corporate targets, including makers of drone, missile and nuclear propulsion technology.

The intrusion at the Office of Personnel Management was particularly disturbing because it oversees a system called e-QIP, in which federal employees applying for security clearances enter their most personal information, including financial data. Federal employees who have had security clearances for some time are often required to update their personal information through the website.

The agencies and the contractors use the information from e-QIP to investigate the employees and ultimately determine whether they should be granted security clearances, or have them updated.

A representative of the Office of Personnel Management said that monitoring systems at the Department of Homeland Security and the agency office allowed them to be “alerted to a potential intrusion of our network in mid-March.”

In the past, the Obama administration has urged American companies to share intrusion information with the government and reveal breaches to consumers in cases where their personal information was compromised and could be used without authorization.

But in this case there was no announcement about the attack. “The administration has never advocated that all intrusions be made public,” said Caitlin Hayden, a spokeswoman for the Obama administration. “We have advocated that businesses that have suffered an intrusion notify customers if the intruder had access to consumers’ personal information. We have also advocated that companies and agencies voluntarily share information about intrusions.”

Ms. Hayden noted that the agency had intrusion-detection systems in place and notified other federal agencies, state and local governments about the attack, then shared relevant threat information with some in the security industry. Four months after the attack, Ms. Hayden said the Obama administration had no reason to believe personally identifiable information for employees was compromised.

“None of this differs from our normal response to similar threats,” Ms. Hayden said.

Technical Dr. Inc.'s insight:

Contact Details :
inquiry@technicaldr.com or 877-910-0004
- The Technical Doctor Team

HIT vendors rely on security standards that don't meet HIPAA requirements

Health IT vendors often fail to protect electronic patient information in accordance with HIPAA, even when they and their provider clients believe they're in compliance with the law, according to a new article by Dan Schroeder, an attorney with Habif, Arogeti & Wynne in Atlanta.

Writing for the Health Law eSource, the monthly e-zine of the American Bar Association's Health Law Section, Schroeder points out that while the potential security risks of health IT companies are "very high," many of them are falling short on HIPAA compliance. For example, they're not conducting a risk analysis of potential threats and vulnerabilities regarding the data, a fundamental HIPAA requirement.
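The risk analysis Schroeder calls fundamental boils down to enumerating threats, rating each for likelihood and impact, and ranking the results. A minimal sketch follows; the 1-5 scales and the example entries are illustrative assumptions, not a prescribed methodology:

```python
# Illustrative risk register: (threat, likelihood 1-5, impact 1-5).
threats = [
    ("Lost unencrypted laptop", 4, 5),
    ("Phishing of staff credentials", 4, 4),
    ("Data-center power failure", 2, 3),
]

def ranked_risks(entries):
    """Score each threat as likelihood x impact and sort highest first."""
    scored = [(threat, likelihood * impact) for threat, likelihood, impact in entries]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

for threat, score in ranked_risks(threats):
    print(f"{score:>2}  {threat}")
```

The ranked output is what drives remediation priorities -- and it is the kind of artifact OCR expects a covered entity or business associate to be able to produce on request.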

Health IT vendors and the providers who use them are expected to come under increased scrutiny, particularly over the next year, according to one attorney with the U.S. Department of Health and Human Services Office for Civil Rights. Both the Office of Inspector General and OCR have announced their intention to target cloud vendors and other business associates to ensure that patient data is adequately protected pursuant to HIPAA requirements.

Some vendors erroneously rely on alternative security standards as evidence that they adequately protect patient information. For instance, many health IT companies believe that obtaining a Service Organization Control (SOC) 1 Report--also known as an SSAE 16--is sufficient to comply with HIPAA. SOC 1 Reports, which are prepared by a certified public accountant in accordance with guidelines from the American Institute of Certified Public Accountants (AICPA), attest to a company's internal controls. However, they apply only to financial reporting, such as debits and credits.

"A basic Internet search uncovers numerous HIT companies that offer up SOC 1 reports as evidence that they have fulfilled their HIPAA responsibilities, even though AICPA standards explicitly restrict the report from being used to address operational and compliance risks [e.g., security, privacy, integrity and availability risks]," he warns.

Thousands of hospitals making simple cyber security error, exposing devices

Drug infusion pumps that can be manipulated from afar, defibrillators that can be programmed to deliver random shocks, refrigerators whose temperature settings can be reset: these are some of the cybersecurity problems uncovered by Scott Erven, the head of information security for healthcare facility operator Essentia Health.

It took Erven's team only half an hour to find another healthcare organization that was exposing information about 68,000 systems, including at least 488 cardiology systems, 332 radiology systems and 32 pacemakers, according to Wired Magazine.

"Now we know all the targeted info and we know that systems that are publicly connected to the internet are vulnerable to the exploit," Erven told Wired. "We can exploit them with no user interaction… [then] pivot directly at the medical devices that you want to attack."

The problem stems from poorly configured settings on the Server Message Block protocol that allows information like computer IDs to be shared publicly instead of just with select staff. And Erven said thousands of other healthcare organizations around the globe are making the same mistake.
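The misconfiguration Erven describes starts with SMB being reachable from outside at all. A defensive first check is simply whether a host accepts connections on TCP 445, the SMB port -- a sketch of an internal audit loop, to be run only against hosts you administer:

```python
import socket

def smb_port_open(host: str, timeout: float = 2.0) -> bool:
    """Return True if TCP 445 (SMB) accepts a connection on the given host."""
    try:
        with socket.create_connection((host, 445), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False

# Example audit loop over your own address list (addresses are placeholders):
# for host in ["192.0.2.10", "192.0.2.11"]:
#     print(host, "EXPOSED" if smb_port_open(host) else "closed")
```

Actually auditing what the service shares (computer IDs, system names) takes SMB-aware tooling, but an externally reachable port 445 is already a finding on its own.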

Computer viruses exploiting the information can then be sent to hospitals via spam emails. Worst of all, if the computer ID contains a doctor's name, as it sometimes does, that information can be used to target individual patients, the article says. 

While shocking, news of poor cybersecurity in the med tech and healthcare industries shouldn't be "news" anymore. On June 23, Medtronic ($MDT) said that it, along with two other large medical device manufacturers, discovered an "unauthorized intrusion" to its systems last year that could be traced back to hackers in Asia. The company also disclosed that it lost an unnamed number of patient records from its diabetes unit in a separate incident, but does not know what type of information was included in the records.

The FDA has taken notice and experts say it will soon start rejecting devices that aren't secure. In addition, growing concerns from patients could jolt companies and hospitals into action. A fictional cyber attack on the TV show Homeland and increased media attention have brought the issue to life.

Security tips from the health IT pros | Healthcare IT News

As anyone who's ever worked in IT security can attest, the job is no walk in the park. New threats, compliance mandates, vulnerabilities and updates are constant. But with strong leadership, and a culture of compliance and responsibility to match, many healthcare organizations have shown it can be done right -- and well.

Beth Israel Deaconess Medical Center's Chief Information Officer John Halamka, MD, said this kind of career is a matter of first understanding that "a CIO has limited authority but infinite accountability." You have to ask, "How do you reduce risk to the point where government regulators and, more importantly, patients will say, 'What you have done is reasonable?'" he said.

[See also: Hacker calls health security 'Wild West'.]

This involves thinking about how to encrypt every device and how to protect the data center from both internal and external attacks.

"Much of what I have to do is meet with my business owners and ask, 'What are the risks? Reputational risks? Patient privacy breach risks? Data integrity risks?' We're never going to be perfect," he added. "But we can put in place what I call a 'multilayer defense.'"

Another fundamental piece of doing privacy and security right? No surprise here: Get your risk analysis done – and done properly.

"This is the single most important document as part of the OCR investigation," said Lynn Sessions, partner at BakerHostetler, who focuses on healthcare privacy. "(OCR is) asking for the current one; they are asking for two, three, five years back. They want to see the evolution of what was going on from a risk analysis standpoint at your institution to see if you were appreciating the risk."

This includes showing the safeguards your organization has put in place from technical, physical and administrative standpoints, explained Sessions. Things such as staff training and education, penetration tests, and cable locks or trackers for unencrypted devices all matter.

Time to encrypt

"Encrypt; encrypt; encrypt," said Sessions. Encryption is a safe harbor for the HIPAA breach notification requirements, but that still fails to motivate some.

"(Physical theft and loss) is the biggest hands-down problem in healthcare that we are seeing," said Suzanne Widup, senior analyst on the Verizon RISK team, discussing the 2014 annual Verizon breach report released in April. "It really surprises me that this is still such a big problem ... other industries seem to have gotten this fairly clearly."

According to OCR data, theft and loss of unencrypted laptops and devices account for the lion's share of HIPAA privacy and security breaches, nearing 60 percent. (Hacking accounts for some 7 percent, and unauthorized disclosure for 16 percent.)

"Pay attention to encryption, for any devices that can leave the office," said former OCR deputy director for health information privacy Susan McAndrew at HIMSS14 this past February.
Of course, the healthcare breach numbers are going to be slightly higher because the federal government has mandated specific HIPAA privacy and security breach notification requirements for organizations, but that has no bearing on the reality that these organizations still fail to implement basic encryption practices, Widup pointed out.

Sessions conceded that it is a pricing concern. "At a time where reimbursements are going down and technology costs are going up with the advent of the electronic health record, there are competing priorities within a healthcare organization of where they can spend their money."

A 2011 Ponemon Institute report estimated full-disk encryption costs at around $232 per user, per year, on average, a number representing the total cost of ownership. And that number could go as high as $399 per user, per year, the data suggest.

Kaiser Permanente Chief Security Officer and Technology Risk Officer Jim Doggett, however, said encryption presents a challenge not only because of costs but also because of the data itself. "The quantity of data is huge," he told Healthcare IT News.

The 38-hospital health system encrypts data on endpoint devices in addition to sensitive data in transit, said Doggett, who currently leads a 300-person technology risk management team in charge of 273,000 desktop computers, 65,000 laptops, 21,700 smartphones and 21,000 servers. And don't forget the health data of some 9 million Kaiser members that Doggett and his team are responsible for.

"This kind of scale presents unique challenges, and calls for the rigor and vigilance of not only the technology teams but of every staff member across Kaiser Permanente," he added. 
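Sessions' cost concern can be made concrete with the Ponemon figures cited above ($232 to $399 per user, per year, total cost of ownership). A back-of-envelope sketch; the 500-clinician practice is a hypothetical example:

```python
# Ponemon Institute (2011) full-disk encryption TCO range, USD/user/year.
PER_USER_LOW, PER_USER_HIGH = 232, 399

def annual_fde_cost(users: int) -> tuple[int, int]:
    """Return the (low, high) estimated annual encryption cost for a user count."""
    return users * PER_USER_LOW, users * PER_USER_HIGH

# A hypothetical 500-clinician practice:
low, high = annual_fde_cost(500)
```

At 500 users the range is roughly $116,000 to $199,500 per year, which is exactly the kind of line item that competes with EHR spending in a shrinking-reimbursement budget.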

The HHS/OCR Hit List for HIPAA Audits

As the HHS Office for Civil Rights analyzes breach reports for vulnerabilities, it has learned lessons on areas where covered entities should pay particular attention to their HIPAA compliance efforts. With OCR hoping soon to launch a permanent random HIPAA Audit program, the agency has reiterated six core ways to avoid common types of breaches, which will be among the targeted focus areas of audits.

Privacy and security experts: mHealth requires a new approach | mHealthNews

The proliferation of mobile devices in healthcare, from smartphones and tablets to the clinical devices themselves, is forcing healthcare executives to take a new approach to privacy and security.

Gone is the "security cop" approach, in which staff and employees are simply told what they can and can't use and do. Instead, we're seeing a "business enablement" approach, in which privacy and security concerns are woven into the workflow.

The reasoning behind this, says Jim Doggett, Kaiser Permanente's senior vice president, chief security officer and chief technology risk officer, is that cybercrime is an industry now, and the old method of "do it my way or else" won't work any more. With new ways of delivering healthcare must come new ways of protecting it.

"We're a bit out of alignment," Doggett said during a recent presentation at the HIMSS Media Privacy and Security Forum. "We're still solving yesterday's problems when we need to be solving today's and tomorrow's problems."

To wit: Doggett said he wanted to determine how to best implement a new policy on privacy and security. He tailed a physician during a normal workday, and watched the man log on and off and back onto various systems "maybe 50 times." Doggett said he realized the doctor wasn't going to adopt any new privacy and security rule that added to his workload, and would in fact welcome something that improved it.

The answer: Don't just establish a policy and enforce it; work with doctors, nurses and other staff members to see how it can best be implemented.

That was the thinking prevalent during the first day of the two-day forum, being held in San Diego. Healthcare is changing so much as it is, so privacy and security methods have to be woven into those changes. If mHealth and telemedicine are going to improve healthcare delivery over the coming years, develop privacy and security platforms that enhance those methods, rather than pushing people away or hindering adoption.

The takeaway for mHealth enthusiasts during the first day of the conference is that privacy and security have to become more fluid – rigid rules just won't work any more – and mindful of the fact that sensitive data is moving in and out of the enterprise in more ways and on more devices.

Mobile devices and social media "are really big areas of compliance concern," said Iliana L. Peters, senior advisor for HIPAA compliance and enforcement with the U.S. Health and Human Services Department's Office for Civil Rights. She said too many healthcare providers aren't taking this seriously. "They neglect to acknowledge where their data is or the risk to that data."

Encryption of data has to become the norm, rather than a suggested policy.

"If your entity is not encrypting, it should be," she said.

And doctors and nurses have to be made to understand that protection of sensitive data is "a part of efficient healthcare." Michael Allred, Intermountain Healthcare's information security consultant and identity and access team manager, said clinicians are the toughest to educate and may be frustrated with privacy and security efforts, but one breach could cost them and their institution much in terms of reputation and money.

Key privacy rule could fall to accountable care push | Vital Signs | The healthcare business blog from Modern Healthcare

History may look back on last week as an inflection point for privacy and technology in the healthcare industry.

That's because what happened makes it possible that a bulwark federal privacy rule will become a casualty of the push to accountable care, patient-centered medical homes and other population-health oriented care plans.

If the considered rule change happens, proponents of these care plans could have broader access to the medical records of patients in drug and alcohol abuse programs without those patients' consent. That would help healthcare providers give those patients better coordinated, higher-quality and more cost-efficient care, these proponents say.

Opponents of the rule change warn, however, that without the law's current stringent consent requirements, drug and alcohol abuse patients will avoid seeking treatment out of concern that their stigmatizing and/or illegal activity will be exposed – a situation the rule, created in the 1970s, sought to avoid.

“I think what will happen is you'll see some people who will be in substance abuse treatment either won't get it or will stop confiding as much as they do in their therapist,” said Jim Pyles, a Washington privacy lawyer who testified last week on behalf of the American Psychoanalytic Association and in favor of maintaining a stringent federal privacy rule covering these patient records.

Here are three actions last week around which federal technology privacy policy may turn and why:

A federal regulatory advisory panel last Tuesday accepted recommendations from its privacy workgroup that would put off until 2017 the introduction of some narrow and largely voluntary privacy protection criteria under the electronic health-record incentive payment program of the American Recovery and Reinvestment Act of 2009. The policy recommendations are for technology to protect the privacy of behavioral health patient information. The federal privacy workgroup, the Privacy and Security Tiger Team, has been looking at certain privacy protection technology since 2010. But it has not recommended that the feds put a regulatory stake in the ground, telling developers of electronic health-record systems and information exchange systems that they should add this technology to their own systems, or encouraging healthcare providers to incorporate the technology in their workflows.

On Wednesday, the Substance Abuse and Mental Health Services Administration, in a day-long listening session, heard conflicting testimony on whether it should consider modifying the federal privacy rule, 42 CFR Part 2, covering the transmission and sharing of medical records of many drug and alcohol abuse patients.

Many “general” healthcare providers aren't using substance-abuse treatment data because they don't have the technology to help them handle it efficiently and in compliance with the law.

But if SAMHSA continues its unflagging support for the special rule for handling substance abuse information, it may force technology developers to incorporate privacy-protecting technology into their systems, and induce providers to use it, affording better protection for all healthcare data, privacy advocates say. If the rule is weakened, however, that technology may never be rolled out.

Finally, on Thursday came the news that privacy advocate Joy Pritts, the first chief privacy officer at the Office of the National Coordinator for Health Information Technology at HHS, would be stepping down after 4 ½ years on the job.

Pritts praised an early implementation of data segmentation technology for behavioral health demonstrated by EHR developer Cerner Corp. at this year's Healthcare Information and Management Systems Society conference, adding her hope that "other vendors follow that lead." The concern, expressed by several privacy advocates bemoaning her July departure, is that her successor – unknown at this pivotal moment – might not be as stalwart an advocate for patients' rights and data segmentation technology as Pritts has been.

Development and adoption of the technical capabilities to affix so-called "meta-data tags" to patient records—which also would aid in interoperability and research as well as privacy protection—was urged by the President's Council of Advisors on Science and Technology in 2010 and by JASON, an independent group of top scientists, in a report sponsored by the Agency for Healthcare Research and Quality this April.
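The idea behind meta-data tagging can be sketched simply: each record carries a sensitivity tag, and disclosure filters on the patient's consent directives. The tag names and the consent model below are illustrative assumptions, not the DS4P specification:

```python
def disclosable(records, consented_tags):
    """Return only records whose sensitivity tag the patient consented to share."""
    return [r for r in records if r["sensitivity"] in consented_tags]

# A hypothetical chart with one general record and one 42 CFR Part 2 record.
chart = [
    {"note": "Annual physical", "sensitivity": "general"},
    {"note": "Substance-abuse counseling", "sensitivity": "42CFR2"},
]

# The patient consented to share general data only, so the Part 2
# record is withheld from the disclosure.
shared = disclosable(chart, consented_tags={"general"})
```

This is why segmentation is seen as the technical path to honoring Part 2's consent provisions without blocking the flow of the rest of the chart.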

SAMHSA, itself, sponsored one of six ONC “Data Segmentation for Privacy” pilot projects to test the technology. But the agency is under considerable pressure to ease regulatory restrictions on the flow of this data, rather than hold firm and press the industry to adopt technology that will help providers comply with the consent provisions of 42 CFR Part 2.

Either way, SAMHSA's decision will likely have an impact on privacy protections reaching well beyond the current scope of that rule.
