HIPAA Compliance for Medical Practices
HIPAA Compliance and HIPAA Risk management Articles, Tips and Updates for Medical Practices and Physicians
HIPAA Compliance and EHR Access


In light of the recent massive security breaches at UCLA Medical Center and Anthem Blue Cross, keeping your EHR secure has become all the more important. However, as organizations work to prevent data breaches, it can be difficult to find a balance between improving security and maintaining accessibility. To that end, HIPAA Chat host Steve Spearman addresses digital access controls, common authentication problems, and how authentication meets HIPAA compliance and helps ensure the integrity of your EHR, even after multiple revisions.


Q: What are access controls?


A: Access controls are mechanisms that appropriately limit access to resources. This includes both physical controls in a building, such as security guards, and digital controls in information systems, such as firewalls. Having and maintaining access controls is a critical and required aspect of HIPAA compliance; access control is the first technical standard in the HIPAA Security Rule.


Q: What’s the most common form of digital access control we see in healthcare?


A: The username and password is the most common form of access control by far. The Access Control Standard requires covered entities to give each user a distinct and unique user ID and password in order to access protected information. These unique credentials for each employee enable covered entities to confirm (“authenticate”) the identity of users and to track and audit information access.


Q: What are the most common problems with access controls and use of passwords in healthcare?


A: The most common problem is that covered entities often use multiple systems, each of which may require its own username and password along with varying credential requirements, such as minimum length or use of capital letters. Memorizing multiple sets of usernames and passwords is difficult for most people. This creates a conundrum between password complexity and memorization: complex passwords (longer, with multiple required character types) are better for security but much harder to memorize.


Q: Are stricter password policies always more secure?


A: No. If password requirements are too strict, users resort to coping mechanisms such as writing passwords down or re-using the same password across multiple systems, which compromises security rather than enhancing it. For example, a policy that required 14-character passwords containing lower-case letters, upper-case letters, numbers, and symbols, and that expired every 30 days, would create huge problems for most organizations. Under such a policy, staff would simply write down their passwords. But this compromises security: if a bad actor gets hold of a written list of passwords, they have the "keys to the kingdom," the ability to access every account on that list. So passwords should not be written down.

In addition, overly strict password policies tend to overwhelm technical support staff with password reset requests.

So passwords need to be complex enough to be hard to crack, yet that same complexity makes them hard to memorize.


Q: This sounds like a big problem. Do you have any suggestions to make things better?


A: At a minimum, organizations need to train staff on straightforward techniques for creating memorable but complex passwords. I have an exquisitely terrible memory, but I have great passwords thanks to one particular technique. Just Google "create good memorable passwords" and you can find dozens of videos demonstrating how to do it. But, of course, our favorite is the video featuring our very own Gypsy, the InfoSec Wonderdog.
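As one illustration of this kind of technique (not necessarily the one shown in the video), the sketch below builds a passphrase from randomly chosen words. The word list here is deliberately tiny and purely hypothetical; a real implementation would draw from a large dictionary, such as the EFF diceware list, so the passphrase has enough entropy.

```python
import secrets

# A deliberately tiny word list for illustration; a real implementation would
# use a large dictionary (for example, the EFF diceware list).
WORDS = ["copper", "maple", "orbit", "sturdy", "lantern", "pixel",
         "harbor", "velvet", "canyon", "tundra", "ember", "quartz"]

def make_passphrase(num_words: int = 4, separator: str = "-") -> str:
    """Build a memorable passphrase from randomly chosen words."""
    chosen = [secrets.choice(WORDS) for _ in range(num_words)]
    # Capitalize one word and append a digit to satisfy typical complexity rules.
    chosen[0] = chosen[0].capitalize()
    return separator.join(chosen) + str(secrets.randbelow(10))

print(make_passphrase())  # e.g. "Maple-orbit-ember-quartz7"
```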


Enterprises should seriously consider additional technical solutions such as two factor authentication with single sign on (2FA/SSO).


Q: What is a good, reasonable password policy?


A: I recommend a policy that:


  • Requires a minimum of 8 characters;
  • Requires two or three of the four character types (lower-case, upper-case, numbers, and symbols);
  • Expires passwords every 3 to 6 months; and
  • Limits reuse of historical passwords so that the previous two cannot be used (a simple check of these rules is sketched below).
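To make the policy concrete, here is a minimal, hypothetical sketch of how those rules could be checked in code. The constants and function names are illustrative only, not part of any particular product, and a real system would compare against stored password hashes rather than a plaintext history.

```python
import re
from datetime import date, timedelta

MIN_LENGTH = 8
MAX_AGE = timedelta(days=180)   # expire every 3 to 6 months; 6 months shown here
HISTORY_DEPTH = 2               # the previous two passwords cannot be reused

def character_classes(password: str) -> int:
    """Count how many of the four character classes the password uses."""
    patterns = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^a-zA-Z0-9]"]
    return sum(bool(re.search(p, password)) for p in patterns)

def password_acceptable(password: str, previous_passwords: list[str]) -> bool:
    """Check a candidate password against the policy above."""
    if len(password) < MIN_LENGTH:
        return False
    if character_classes(password) < 2:                 # two or three classes required
        return False
    if password in previous_passwords[-HISTORY_DEPTH:]:  # block recent re-use
        return False
    return True

def password_expired(last_changed: date) -> bool:
    return date.today() - last_changed > MAX_AGE

print(password_acceptable("Maple-orbit7", ["OldPass1!", "Winter2014"]))  # True
```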


Q: You mentioned authentication before. What is that? What is two-factor or multi-factor authentication?


A: Authentication is the process of confirming the identity of a person before granting access to a resource. Computer geeks refer to the three factors of authentication:


  • What a user has (an ID badge or phone);
  • What a user knows (a PIN or password); and
  • Who a user is (biometrics).


For example, ATMs use two-factor authentication:

  1. What the user has: an ATM card and
  2. What they know: a PIN.


One of my favorite tools for two-factor authentication is Google Authenticator, which runs as an app on my mobile phone. Another common form of two-factor authentication is texted codes: after you enter a correct username and password, the website or app sends a numeric code to your phone that expires after a few minutes, and you enter that code into a second field on the website before access is granted.
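For readers curious about the mechanics, time-based one-time passwords (TOTP), the scheme Google Authenticator implements, can be demonstrated with the open-source pyotp library. This is a sketch of the protocol only, not a description of any vendor's product; in practice the user's authenticator app, not the server, produces the code.

```python
import pyotp  # third-party library: pip install pyotp

# Enrollment: the server generates a shared secret, and the user loads it into
# an authenticator app (usually by scanning a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Login: after the username/password check succeeds, the user types the
# 6-digit code currently shown by the app; it changes every 30 seconds.
code_from_user = totp.now()          # simulated here for illustration
print(totp.verify(code_from_user))   # True only while the code is current
```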


Everyone should enable two-factor authentication on their most essential systems, such as online banking and email accounts like Gmail.


In healthcare, there is a growing trend toward biometric authentication, the use of fingerprint readers or palm readers, etc. to authenticate into systems. Biometric authentication is generally very secure and is also very easy to use since there is nothing to memorize.


Q: What is SSO?


A: Single sign-on (SSO) lets users access multiple applications through one authentication event. In other words, one password allows access to multiple systems. It enhances security because users only have to remember one password, and because there is only one, it can be a genuinely long, complex password. Once entered, it allows access to all the core systems (if enabled) without having to re-authenticate.
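Conceptually, many SSO schemes boil down to an identity provider authenticating the user once and issuing a signed token that each participating application trusts. The sketch below uses a shared-secret JWT purely for illustration; the names are hypothetical, and production SSO in healthcare typically relies on SAML or OpenID Connect with asymmetric keys.

```python
import jwt  # third-party library: PyJWT
from datetime import datetime, timedelta, timezone

IDP_SECRET = "shared-secret-known-to-trusted-apps"  # illustrative only

def issue_sso_token(user_id: str) -> str:
    """The identity provider authenticates the user once and issues a signed token."""
    claims = {
        "sub": user_id,
        "exp": datetime.now(timezone.utc) + timedelta(minutes=15),
    }
    return jwt.encode(claims, IDP_SECRET, algorithm="HS256")

def verify_sso_token(token: str) -> str:
    """Each application verifies the token instead of asking for another password."""
    claims = jwt.decode(token, IDP_SECRET, algorithms=["HS256"])
    return claims["sub"]

token = issue_sso_token("dr.smith")
print(verify_sso_token(token))  # "dr.smith"
```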


Single sign-on combined with two-factor authentication or biometrics works well, and the two are often sold together by vendors. The leading SSO/2FA vendor in healthcare is Imprivata, but other vendors, such as Duo Security, 2FA.com, and Secureauth.com, are making inroads into healthcare.


Q: What do you mean by “integrity” and what does it have to do with access control and authentication?


A: In the Security Rule's Integrity standard, integrity refers to the practices used to track and verify all changes made to a health record. It is the condition that lets us prevent records from being edited or deleted without proper authorization.


Authentication and access controls are the primary means we use to preserve the integrity of a record. If the information system is programmed to track its users' activity, then it is possible to track who made changes to a record and how they changed it.


This is why users should never share usernames and passwords with other users. Integrity becomes impossible if a username does not signify the same user every time it appears.
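As a rough illustration of how an information system can tie every change to a unique user ID and make silent tampering detectable, here is a minimal hash-chained audit-log sketch. The field names and in-memory storage are hypothetical; real EHRs implement this kind of logging internally.

```python
import hashlib
import json
from datetime import datetime, timezone

audit_log = []  # in practice an append-only, tamper-evident store, not a Python list

def record_change(user_id: str, record_id: str, field: str, new_value: str) -> None:
    """Append an audit entry tying every change to one unique user ID."""
    entry = {
        "user": user_id,          # meaningless if credentials are shared
        "record": record_id,
        "field": field,
        "new_value_hash": hashlib.sha256(new_value.encode()).hexdigest(),
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    # Chain each entry to the previous one so silent edits to the log are detectable.
    prev_hash = audit_log[-1]["entry_hash"] if audit_log else ""
    payload = prev_hash + json.dumps(entry, sort_keys=True)
    entry["entry_hash"] = hashlib.sha256(payload.encode()).hexdigest()
    audit_log.append(entry)

record_change("jdoe01", "MRN-1043", "allergies", "penicillin")
print(audit_log[-1]["user"], audit_log[-1]["entry_hash"][:12])
```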


Q: Any final thoughts?


A: Finding that balance between HIPAA compliance, security and accessibility can be tricky. We recommend reducing digital access controls to a single multi-factor authentication or biometrics event. This single, secure method of authentication could be the balance between security and efficiency needed to keep your EHR secure and yet accessible. In addition to improving accessibility to your system, an MFA or biometrics sign-in method could help improve your organization’s EHR integrity.


Cybersecurity: Things Are Getting Worse, But Need to Get Better


In his opening keynote address at the CHIME Lead Forum at iHT2-Denver, held at the Sheraton Downtown Denver and sponsored by the Ann Arbor, Mich.-based College of Healthcare Information Management Executives (CHIME) and the Institute for Health Technology Transformation (iHT2, a sister organization of Healthcare Informatics through our parent company, the Vendome Group LLC), Mac McMillan laid out in the clearest possible terms for his audience of IT executives the growing cybersecurity dangers threatening patient care organizations these days.


Under the heading, “What Is Cyber Security and Why Is It Crucial to Your Organization?” McMillan, the CEO of the Austin, Tex.-based CynergisTek consulting firm, used his opening keynote address to challenge his audience to think strategically and proactively about the growing cyber-threats hitting patient care organizations across the U.S.

McMillan elaborated on what he sees as 11 key areas of concern for healthcare IT leaders going forward: “increased reliance”; “insider abuse”; “questionable supply chains”; “device-facilitated threats”; “malware”; “mobility”; “identity theft and fraud”; “theft and losses”; “hacking and cyber-criminality”; “challenges emerging out of intensified compliance demands”; and a shortage of chief information security officers, or CISOs.


In fact, McMillan said, cybersecurity threats are accelerating and intensifying, and they are coming through a broad range of threat vehicles: hacking by criminal organizations and foreign governments, deliberate infiltration of information networks via medical devices, and a crazed proliferation of all types of malware across the cyber universe. The leaders of patient care organizations must take action, and take it now, he urged.


As for “increased reliance,” the reality, McMillan noted, is that “We live in a world today that is hyper-connected. When I left the government and came back into healthcare in 2000,” he noted, “probably the total number of people who looked at any patient record, was about 50, and all were hospital employees. Today, that average is more like 150, and half of those individuals are not hospital employees. And our systems are interconnected. Digitizing the patient record, under meaningful use, coincided with the rise in breaches. Not that any of that is bad,” he emphasized. “But it did become easier for bad people to do bad things; it also increased the number of mistakes that could be made. If I wanted to carry out paper medical records” in the paper-based world, he noted, “I was limited to the number I could put into a basket. Now, I can download thousands at a time onto a flash drive.”


With regard to “insider abuse,” McMillan made a big pitch for the use of behavior pattern recognition strategies and tools. “We have to actively monitor what’s going on,” he urged. “It doesn’t mean running random audits. You have to actively monitor activity, and you can’t do that manually, and we have to recognize that. Also, a lot of activity, particularly identity theft, is not captured by monitoring compliance rules, but rather by capturing activity patterns. The fact that an individual is looking at four times as many records as their neighbor is absolutely a flag. They’re either working four times as hard/fast, or are snooping, or are engaged in nefarious activities. But fewer than 10 percent of hospitals are actively monitoring behavior patterns.”
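A very simple version of the pattern-based monitoring McMillan describes could compare each user's record-access volume against the organization's typical volume. The sketch below is illustrative only, with made-up event data; real behavior-analytics products use far richer models than a single frequency threshold.

```python
from collections import Counter
from statistics import median

def flag_heavy_accessors(access_events, multiple=4.0):
    """access_events: (user_id, record_id) pairs pulled from the audit log.
    Flags users who open records at several times the typical rate."""
    counts = Counter(user for user, _ in access_events)
    typical = median(counts.values())
    return [user for user, n in counts.items() if n >= multiple * typical]

# Hypothetical audit-log extract: two nurses with normal volume, one outlier.
events = ([("nurse_a", f"MRN-{i}") for i in range(20)]
          + [("nurse_b", f"MRN-{i}") for i in range(21)]
          + [("clerk_x", f"MRN-{i}") for i in range(95)])
print(flag_heavy_accessors(events))  # ['clerk_x']
```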


McMillan was totally blunt when it came to discussing “questionable supply chains.” “I’ll just come out and say it: vendors are a threat,” he told his audience. “We’ve had cases where vendors have been hacked or have had incidents, and the vendor didn’t have a good procedure for restoration or what have you. We need to do a better job of vetting our vendors, of holding them to a higher standard for performance. And this industry needs to create a better baseline—basic requirements—if you connect to my network, this is how you have to connect, this is the basic level of encryption required, that kind of thing. This is about creating and adhering to minimal requirements, not creating a new framework,” he said. “We’ve already got a million frameworks out there.”


What about medical devices? The threats there are absolutely exploding, McMillan said. He noted that successful hacks have now been documented via such devices as insulin pumps and blood pumps, all of which are relatively recent, as most medical devices weren’t networkable until at least 2006.


Meanwhile, the malware explosion dwarfs just about all other issues, at least in terms of volume. At the beginning of last year, McMillan reported, there were 100 million instances of malware floating around; by the end of the year, there were 370 million. Importantly, he noted, “Malware is no longer produced by smart people in dark rooms writing code. It’s now being produced by bots morphing old malware. And this is putting more pressure on people in terms of the integrity of the environment.” He warned his audience that “The anti-virus products we have today are antiquated products. Less than half of the malware out there is recognized by anti-virus anymore; if you’re relying on antivirus, you’ve already lost the battle. In the next decade,” he predicted, “we’ll move from a speed of computing of 10 to the 8th power, to one of 10 to the 26th power—that’s how fast we’ll be computing. That’s phenomenal. So decisions will be made by computers so fast that any technology relying on signatures to be looked up, will be blown by. It will never keep up. So our security vendors have got to get ahead of this curve, have got to recognize that this whole paradigm we’re dealing with is changing, and we’ve got to change the way we act around this.”


With regard to the rest of the 11 key areas he cited, McMillan made a number of important comments. Among them, with regard to mobility and data, he said, “We’ve got to quit chasing the device. I’ve said this for the better part of five years now. If we chase the device, we’ll never catch up. We’ve got to focus on how the devices connect to the environment and how we register and protect those devices.” Meanwhile, he emphasized that while hacking and cyber-criminality represented just 10 percent of data breaches two years ago, breaches created by hacking and cyber-criminality are now surging.


A lot of these challenges really require a level of IT security management and governance that remains lacking in U.S. healthcare, McMillan said. “I absolutely believe that we need more CISOs in healthcare. I think we need to improve the education of our CISOs and need to help professionalize them. We need to find ways for CIOs to collaborate. That’s the way we help everyone benefit and get ahead.”


Hospital with repeat security failures hit with $218K HIPAA fine


Does your hospital permit employees to use a file-sharing app to store patients' protected health information? Better think again. A Massachusetts hospital is paying up and reevaluating its privacy and security policies after a file-sharing complaint and a subsequent HIPAA breach.


St. Elizabeth's Medical Center in Brighton, Mass. – a member hospital of the Steward Health Care system – will pay $218,400 to the Office for Civil Rights for alleged HIPAA violations. The settlement resulted from a 2012 complaint filed by hospital employees stating that the medical center was using a Web-based document-sharing application to store data containing protected health information. Because the hospital had not adequately analyzed the security risks of this application, the PHI of nearly 500 patients was put at risk.


"Organizations must pay particular attention to HIPAA's requirements when using Internet-based document sharing applications," said Jocelyn Samuels, OCR director, in a July 10 statement announcing the settlement. "In order to reduce potential risks and vulnerabilities, all workforce members must follow all policies and procedures, and entities must ensure that incidents are reported and mitigated in a timely manner."


It wasn't just the complaint that got St. Elizabeth's in hot water, however. A HIPAA breach reported by the medical center in 2014 also called attention to the lack of adequate security policies. The hospital notified OCR in August of last year of a breach involving unsecured PHI stored on the personal laptop and USB drive of a former hospital employee. The breach ultimately impacted 595 patients, according to a July 10 OCR bulletin.


As part of the settlement, St. Elizabeth's will also be required to "cure the gaps in the organization's HIPAA compliance program," OCR officials wrote in the bulletin. More specifically, this includes conducting a self-assessment of its employees' awareness and compliance with hospital privacy and security policies. Part of this assessment will involve "unannounced visits" to various hospital departments to assess policy implementations. Officials will also interview a total of 15 "randomly selected" employees with access to PHI. Additionally, at least three portable devices across each department with access to PHI will be inspected.


Then there's the policies and training piece of the settlement: based on the assessment, St. Elizabeth's will submit revised policies and training materials to HHS for approval.


In addition to the filed complaint and the 2014 breach, the medical center also reported an earlier HIPAA breach in 2012, when paper records containing billing data, credit card numbers and security codes of nearly 7,000 patients were not properly shredded by the hospital. Some of the files containing the data were reportedly found blowing in a nearby field.


To date, OCR has collected nearly $26.4 million in settlements and fines from covered entities and business associates found to have violated HIPAA privacy, security and breach notification rules.


The largest settlement to date was the whopping $4.8 million paid by New York Presbyterian Hospital and Columbia University Medical Center after a single physician accidentally deactivated an entire computer server, resulting in ePHI becoming accessible via Internet search engines.


Florida Hospital faces two data breach lawsuits


Florida Hospital is facing two possible class action lawsuits regarding two separate data breaches of patient information over the past four years.


The hospital is battling both suits, and has recently submitted motions to toss them both out.


The first data breach, revealed in August 2011, involved Florida Hospital employees Dale Munroe and Katrina Munroe combing through thousands of patient records and selling data to lawyers and chiropractors. Both employees were fired and charged criminally.



The second breach, discovered in May 2014, involved two employees printing portions of medical records for at least 9,000 patients for over two years. Those employees were also fired but not named in the lawsuits. That breach was allegedly discovered by state investigators of a criminal case.



The first lawsuit is being handled by the Chicago-based law firm Edelson and local attorney Edmund Normand. The named plaintiffs in that case are Richard Faircloth, who was a patient at Florida Hospital's Apopka campus, and Consuelo Armesto, a former patient at Florida Hospital's Altamonte campus. A hearing is coming up soon on Florida Hospital's motion to dismiss the Faircloth case.


Attorney John Yanchunis of Orlando law firm Morgan & Morgan is handling a case tied to the May 2014 breach. The named plaintiffs in that case are Heather and Sebastian Peralta of Altamonte, and their daughter Janson Peralta.


The Peralta case, filed more recently, cites the previous case as evidence that the hospital has known about data breaches for a while now.


“Hospitals are good about delivering medical services. Other kinds of things, like this, they are not so good at, because it’s not their business,” Yanchunis said. “But that must change now, and there’s a movement now to install systems to better detect access to information.”


Florida Hospital and its attorneys did not immediately respond to phone calls and emails about the lawsuits, which are both pending in Orange County Circuit Court.


But the hospital has argued that the lawsuits are missing an important fact: the plaintiffs haven’t suffered any identity theft, at least not yet.

Both lawsuits rely on allegations that the patients involved had “expected and paid for" data security at the hospital.


But Florida Hospital’s attorneys argue that no Florida court has recognized a fiduciary duty between a hospital and a patient. The hospital also argues that the plaintiffs can’t enforce federal HIPAA law through private civil action, and that they can’t sue based on an “increased risk of identity theft.”


The hospital also argues that its employees were willfully violating its policies on HIPAA compliance and patient data security.

Data stolen from medical records is a common method used by identity thieves, especially for filing fake tax returns seeking bogus tax refunds.


There’s an additional wrinkle in the Peralta case. Yanchunis noted that the Peraltas’ daughter isn’t even eligible for credit protection services yet, but that her data could be used for identity theft years from now.

According to the court record, the Munroes were paid $10,000 by local chiropractor Sergei Kusyakov to pull out information on victims of motor vehicle accidents – some of whom then received calls from Kusyakov’s office with offers of chiropractic care. The Munroes and Kusyakov all pleaded guilty to the crimes.


Orlando Health reports data breach for 3,200 patients


Orlando Health said Thursday about 3,200 patients’ records were accessed illegally by one of its employees, who was fired during an investigation.



The hospital system said it discovered the data breach on May 27. A news release on Thursday, July 2, said it began notifying patients “today,” which would be more than 30 days after the breach was discovered.



According to the release, there was no evidence that the data was copied or used illegally, but Orlando Health reported the incident in accordance with its data breach policies.


Under Florida law, notice to victims of a data breach is required within 30 days, unless the custodian of records has determined that nobody suffered identity theft or any other financial harm.


The records belonged to certain patients at Winnie Palmer Hospital for Women & Babies, Dr. P. Phillips Hospital and a limited number of patients treated at Orlando Regional Medical Center from January 2014 to May 2015.


Theft of patient information at health-related companies is one of the primary ways that tax refund fraud has been occurring in Florida, according to federal authorities. Thieves can use the information to submit a fake tax return in your name, claiming refunds that could prevent or delay a legitimate refund.


In the Orlando Health incident, stolen data may have included names, dates of birth, addresses, medications, medical tests and results, the last four digits of Social Security numbers, and other clinical information. The former employee may also have accessed insurance information in approximately 100 of those patient records.


Steve Stallard, corporate director for compliance and information security, said in a statement that Orlando Health “deeply regrets any concern or inconvenience this may cause our patients or their family members.”


The organization is providing affected patients with call center and other support, the news release said.


Orlando Health has reported other data breaches, such as a March 2014 incident where over 500 child patient records were misplaced.


Unencrypted Device Breaches Persist


Although hacker attacks have dominated headlines in recent months, a snapshot of the federal tally of major health data breaches shows that stolen unencrypted devices continue to be a common breach cause, although these incidents usually affect far fewer patients.


As of June 23, the Department of Health and Human Services' Office for Civil Rights' "wall of shame" website of health data breaches affecting 500 or more individuals showed 1,251 incidents affecting nearly 134.9 million individuals.


Those totals have grown from 1,213 breaches affecting 133.2 million individuals in an April 29 snapshot prepared by Information Security Media Group.


The federal tally lists all major breaches involving protected health information since September 2009, when the HIPAA Breach Notification rule went into effect. As of June 23, about 52 percent of breaches on the tally listed "theft" as the cause.


Among the breaches added to the tally in recent weeks are about a dozen involving stolen unencrypted computers. Lately, those types of incidents have been overshadowed by massive hacking attacks, such as those that hit Anthem Inc. and Premera Blue Cross.


"Although we've seen some large hacking attacks, they are aimed at higher-profile organizations than the more typical provider organization," says privacy and security expert Kate Borten, founder of the consulting firm, The Marblehead Group. "Attackers know that these organizations have a very high volume of valuable data. But I continue to believe that unencrypted PHI on devices and media that are lost or stolen is 'the' most common breach scenario affecting organizations of any size."


Borten predicts that many incidents involving unencrypted devices will continue to be added to the wall of shame. "Getting those devices encrypted is an ongoing challenge when we expand the requirement to tablets and smartphones, particularly when owned by the users, not the organization," she says. "We also shouldn't overlook encryption of media, including tapes, disks and USB storage drives."

Unencrypted Device Breaches

The largest breach involving unencrypted devices that was recently added to the tally was an incident reported to HHS on June 1 by Oregon Health Co-Op, an insurer.


That incident, which impacted 14,000 individuals, involved a laptop stolen on April 3. In a statement, the insurer says the device contained member and dependent names, addresses, health plan and identification numbers, dates of birth and Social Security numbers. "There is no indication this personal information has been accessed or inappropriately used by unauthorized individuals," the statement says.

Also recently added to the federal tally was a breach affecting 12,000 individuals reported on June 10 by Nevada healthcare provider Implants, Dentures & Dental, which is listed on the federal tally as "doing business as Half Dental." The incident is listed as a theft involving electronic medical records, a laptop, a network server and other portable electronic devices.


In addition to the recent incidents involving stolen or lost unencrypted devices, several breaches added to the wall of shame involve loss or stolen paper records or film.


"Breaches of non-electronic film and paper will never end, but at least these breaches are typically limited to one or a small number of affected individuals," Borten says. Because many of the breaches involving paper or film are often due to human error, "effective, repeated training is essential" to help prevention of such incidents, she says.

Hacking Incidents Added

The largest breach added to the tally in recent weeks, however, is the hacker attack on CareFirst BlueCross BlueShield, which was reported to HHS on May 20 and affected 1.1 million individuals. Baltimore-based CareFirst has said that an "unauthorized intrusion" into a database, dating back to June 2014, was discovered in April by Mandiant, a cyberforensics unit of security vendor FireEye. CareFirst had asked Mandiant to conduct a proactive examination of its environment following the hacker attacks on Anthem and Premera.


Another hacker incident added to the tally affected South Bend, Ind.-based Beacon Health System. That incident, reported to HHS on May 20, is listed as affecting about 307,000 individuals. The organization has said patients' protected health information, including patient name, doctor's name, internal patient ID number, and in some cases, Social Security numbers and treatment information, was exposed as a result of phishing attacks on some employees that started in November 2013. The attacks led to hackers accessing "email boxes" that contained patient information.

Addressing Multiple Threats

Healthcare organizations need to continue their efforts to protect data from the threats posed by cyber-attackers, insiders or street thieves, says Borten, the consultant.


"There's no simple answer, but security is complex, and so the solutions, or mitigating controls, must be numerous and varied."


Four Common HIPAA Misconceptions


While practices must work hard to comply with HIPAA, some are taking HIPAA compliance efforts a bit too far. That's according to risk management experts, who say there are some common compliance misconceptions that are costing practices unnecessary time and resources.

Here's what they say many practices are avoiding that they don't necessarily need to avoid, and some extra steps they say practices are taking that they don't necessarily need to take.


1. Avoiding leaving phone messages

While it's true that a phone message from your practice to a patient could be overheard by the wrong party, phone messages that contain protected health information (PHI) don't need to be strictly off limits at your practice, says Jim Hook, director of consulting services at healthcare consulting firm The Fox Group, LLC. "Many offices adopt a blanket policy of, well, 'We can't leave you any phone messages because HIPAA says we can't,' and that's really not true," he says. "You can always get consent from a patient on how they want to be communicated with."


Hook recommends asking all of your patients to sign a form indicating in what manner you are permitted to communicate with them, such as by mail, e-mail, text, and phone message. "If the patient says, 'Yes, you can call and leave me phone messages at this phone number I'm giving you,' then it's not a HIPAA violation to use that method of communication," he says.


2. Avoiding discussing PHI

It's important to safeguard PHI as much as possible, but some practices are taking unnecessary precautions, says Michelle Caswell, senior director, legal and compliance, at healthcare risk-management consulting firm Clearwater Compliance, LLC.


"I think there's still a fear among small providers ... that they can't discuss protected health information anywhere in the [practice]," she says. "They feel that they have to almost build soundproof walls and put up bulletproof glass or soundproof glass to prevent any sort of disclosure of protected health information, and that's not what HIPAA requires at all. HIPAA allows for incidental disclosures, [which] are disclosures that happen [incidentally] around your job. So if you've got a nurse and a doctor talking, maybe at the nurses' station, and someone overhears that Mr. Smith has blood work today, that probably wouldn't be a violation because it's incidental to the job. Where else are the doctors and nurses going to talk?"


As long as you are applying "reasonable and appropriate" safeguards, Caswell says you should be in the clear.


3. Requiring unnecessary business associate agreements

HIPAA requires practices to have written agreements, often referred to as business associate agreements (BAAs), with other entities that receive or work with their PHI. Essentially, the agreements state that the business associates will appropriately safeguard the PHI they receive or create on behalf of the practice.


Still, some practices take unnecessary precautions when it comes to BAAs, says Robert Tennant, senior policy adviser of government affairs for the Medical Group Management Association. "A lot of practices are very concerned about people like janitorial services [and] plant maintenance folks, and they have them sign business associate agreements, but those folks are not business associates for the most part," says Tennant. "You may want to have them sign confidentiality agreements basically saying, 'If you do come across any information of a medical nature, protected health information, you are not permitted to look at it, copy it, keep it ...,' But, you do not need to sign a business associate agreement with anybody other than those folks that you actually give PHI to for a specific reason, like if you've got a law office or accounting office or a shredding company that is coming in to pick up PHI to destroy it."


4. Requiring unnecessary patient authorizations

While it's critical to comply with HIPAA's requirement that only those who have a valid reason to access a patient's medical record, such as treatment, payment, or healthcare operations, have access to it, some practices are misconstruing that rule, says Tennant. "They demand patient authorization before they transfer data to another provider for treatment purposes," he says. "I understand why they do it, but it's one of those things that … can cause delays and confusion, and even some acrimony between the patient and the provider. If it's for treatment purposes specifically, you do not need a patient authorization."


Think Your Practice is HIPAA Compliant? Think Again.


You may think you know HIPAA inside and out, but experts say many practices and physicians are making mistakes regarding protected health information (PHI) that could get them into big trouble with the law. Here are nine of the most common compliance missteps they say practices and physicians are making.

1. TEXTING UNENCRYPTED PHI

For most physicians, texting is an easy, convenient, and efficient way to communicate with patients and colleagues. But if a text contains unencrypted PHI, that could raise serious HIPAA problems.


"One of the big things people are doing these days is texting PHI, and people seem to be ignoring the fact that text messages can be read by anyone, they can be forwarded to anyone, [and] they're not encrypted in any fashion when they reside on a telecommunications provider's server," says Jim Hook, director of consulting services at healthcare consulting firm The Fox Group, LLC. "People really need to understand that [short message service (SMS)] text messaging is inherently nonsecure, and it's noncompliant with HIPAA."


That's not to say that texting PHI is never appropriate, it just means that physicians must find a way to do so securely. While the privacy and security rules don't provide explicit text messaging guidelines, they do state that covered entities must have "reasonable and appropriate safeguards to protect the confidentiality, availability, and integrity of protected health information," says Michelle Caswell, senior director, legal and compliance, at healthcare risk-management consulting firm Clearwater Compliance, LLC. As a result, Caswell, who formerly worked for HHS' Office for Civil Rights, says physicians must consider, "What would I put on my [smart] phone to reasonably and appropriately safeguard that information?" Most likely, the answer will be a secure messaging service with encryption, she says, adding that many inexpensive solutions are available to providers.


2. E-MAILING UNENCRYPTED PHI

Similar to text messaging, many physicians are e-mailing unencrypted PHI to patients and colleagues. As Robert Tennant, senior policy adviser of government affairs for the Medical Group Management Association says, e-mailing is becoming ubiquitous in our society, and healthcare is no exception.


If your providers are e-mailing PHI, consider implementing a secure e-mail application; for instance, one that recognizes when content included in the e-mail contains sensitive information and therefore automatically encrypts the e-mail. Your practice could use the application to specify certain circumstances in which e-mails should be encrypted, such as the inclusion of Social Security numbers or credit card numbers. The application would then filter e-mails for that specified content and, when it finds that content, encrypt those e-mails automatically, says Caswell.


Another option is to use a secure e-mail application to set up filters to automatically encrypt e-mails sent with attachments, or encrypt e-mails when senders include a word like "sensitive" or "encrypt" in the subject line, she says. An added benefit of encrypting e-mail is if a potential breach occurs, like the theft of a laptop containing e-mails with PHI, that is not considered a reportable breach if the e-mails stored on the laptop are encrypted, says Tennant. "You don't need to go through all of the rigmarole in terms of reporting the breach to the affected individual, and ultimately, to the government," he says. "So it's sort of a get out of jail free card in that sense."
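As a rough sketch of how the content- and rule-based filtering described above might work: the patterns and policy choices here are hypothetical, and commercial secure e-mail gateways use far more sophisticated detection, but the decision logic follows the same shape.

```python
import re

# Simple patterns that often indicate sensitive content; real products use
# richer detection (dictionaries, document fingerprints, machine learning).
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def should_encrypt(subject: str, body: str, has_attachment: bool) -> bool:
    """Decide whether an outgoing message should be routed through encryption."""
    text = subject + "\n" + body
    if SSN_PATTERN.search(text) or CARD_PATTERN.search(text):
        return True
    if "encrypt" in subject.lower() or "sensitive" in subject.lower():
        return True
    return has_attachment  # example policy: encrypt anything with an attachment

print(should_encrypt("Lab results", "Patient SSN 123-45-6789", False))  # True
```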


If your practice would rather prohibit the use of e-mail altogether, a great alternative might be a patient portal that enables secure messaging.


Finally, if patients insist on having PHI e-mailed to them despite the risks, get their permission in writing for you to send and receive their e-mails, says Tennant.


3. FAILING TO CONDUCT A RISK ANALYSIS

If your practice has not conducted a security risk analysis — and about 31 percent of you have not, according to our 2014 Technology Survey, Sponsored by Kareo — it is violating HIPAA. The security rule requires any covered entity creating or storing PHI electronically to perform one. Essentially, this means practices must go through a series of steps to assess potential risks and vulnerabilities to the confidentiality, integrity, and availability of their electronic protected health information (ePHI).


Though the security risk analysis requirement has been in place since the security rule was formally adopted in 2003, it's been pretty widely ignored by practices, says Hook. Part of the reason, he says, is lack of enforcement of the requirement until recently. Since conducting a security risk analysis is now an attestation requirement in the EHR incentive program, auditors are increasingly noting whether practices are in compliance.


4. FAILING TO UPDATE THE NPP

If your practice has not updated its Notice of Privacy Practices (NPP) recently, it could be violating HIPAA. The HIPAA Omnibus Rule requires practices to update these policies and take additional steps to ensure patients are aware of them, says Tennant.

Some of the required updates to the NPP include:


• Information regarding uses and disclosures that require authorization;

• Information about an individual's right to restrict certain disclosures of PHI to a health plan; and

• Information regarding an affected individual's right to be notified following a privacy or security breach.


In addition to updating the NPP, a practice must post it prominently in its facility and on the website, and have new patients sign it and offer a copy to them, says Tennant. "I'd say of every 10 practices, hospitals, dental offices I go into, nine of them don't have their privacy notice in the waiting room," he says.


5. IGNORING RECORD AMENDMENT REQUESTS

Don't hesitate to take action when patients request an amendment to information in their medical records, cautions Cindy Winn, deputy director of consulting services at The Fox Group, LLC. Under the HIPAA Privacy Rule, patients have the right to request a change to their records, and providers must act on those requests within 60 days, she says.


If you disagree with a patient's requested change, you must explain, in writing, why you are not making the requested change, says Hook. Then, share that reasoning with the patient and store a copy of it in the patient's medical record, as well as a copy of the patient's written request for the amendment.


6. NOT PROVIDING ENOUGH TRAINING

The privacy and security rules require formal HIPAA education and training of staff. Though the rules don't provide detailed guidance regarding what training is required, Hook recommends training all the members of your workforce on policies and procedures that address privacy and security at the time of hire, and at least annually thereafter.


The HIPAA Security Rule also requires practices to provide "periodic security reminders" to staff, says Caswell, adding that many practices are unaware of this. Actions that might satisfy this requirement include sending e-mails to staff when privacy and security issues come up in the news, such as information about a recent malware outbreak; or inserting a regular "security awareness" column in staff e-newsletters.

Finally, be sure to document any HIPAA training provided to staff.


7. OVERCHARGING FOR RECORD COPIES

With few exceptions, the privacy rule requires practices to provide patients with copies of their medical records when requested. It also requires you to provide access to the record in the form requested by the individual, if it is readily producible in that manner.


While practices can charge for copies of records, some practices may be getting into trouble due to the fee they are charging, says Tennant. "HIPAA is pretty clear that you can only charge a cost-based fee and most of those are set by the state, so most states have [limits such as] 50 cents a page up to maybe $1 a page ... but you can't charge a $50 handling fee or processing fee; at least it's highly discouraged," says Tennant.


To ensure you are following the appropriate guidelines when dealing with record copy requests, review your state's regulations and consult an attorney. Also, keep in mind that though the privacy rule requires practices to provide copies within 30 days of the request, some states require even shorter timeframes.


8. BEING TOO OPEN WITH ACCESS

If your practice does not have security controls in place regarding who can access what medical records and in what situations, it's setting itself up for a HIPAA violation. The privacy rule requires that only those who have a valid reason to access a patient's record — treatment purposes, payment purposes, or healthcare operations — should do so, says Caswell. "If none of those things exist, then a person shouldn't [access] an individual's chart."


Caswell says practices need to take steps to ensure that staff members do not participate in "record snooping" — inappropriately accessing a neighbor's record, a family member's record, or even their own record.


She recommends practices take the following precautions:


• Train staff on appropriate record access;

• Implement policies related to appropriate record access; and

• Run EHR audits regularly to determine whether inappropriate access is occurring (a minimal example of such a check is sketched after this list).
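As one illustration of the kind of audit Caswell describes, the sketch below flags employees who open their own charts, assuming a hypothetical mapping between employee IDs and their own patient record numbers. Detecting access to relatives' or neighbors' records would require additional demographic matching, which is omitted here.

```python
def find_self_access(access_events, employee_to_mrn):
    """access_events: (employee_id, accessed_mrn) pairs from the EHR audit trail.
    employee_to_mrn: hypothetical mapping from staff members to their own patient IDs."""
    hits = []
    for employee, mrn in access_events:
        if employee_to_mrn.get(employee) == mrn:
            hits.append((employee, mrn))  # employee opened their own chart; review for cause
    return hits

# Hypothetical audit extract and staff-to-patient mapping.
events = [("emp_17", "MRN-2001"), ("emp_17", "MRN-8844"), ("emp_22", "MRN-3090")]
print(find_self_access(events, {"emp_17": "MRN-8844"}))  # [('emp_17', 'MRN-8844')]
```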


9. RELEASING TOO MUCH INFORMATION

Similar to providing too much access to staff, some practices provide too much access to outside entities, says Caswell. For instance, they release too much PHI when responding to requests such as subpoenas for medical records, requests for immunization information from schools, or requests for information from a payer.


"If there's, say, for instance, litigation going on and an attorney says, 'I need the record from December 2012 to February 2014,' it is your responsibility to only send that amount of information and not send anything else, so sort of applying what's called the minimum necessary standard," says Caswell. "When you receive outside requests for PHI, pay close attention to the dates for which information is requested, as well as the specific information requested."


Is your cloud provider HIPAA compliant? An 11 point checklist


Healthcare organisations frequently turn to managed service providers (MSPs) to deploy and manage private, hybrid or public cloud solutions. MSPs play a crucial role in ensuring that healthcare organisations maintain secure and HIPAA compliant infrastructure.


Although most MSPs offer the same basic services – cloud design, migration, and maintenance – the MSP’s security expertise and their ability to build compliant solutions on both private and public clouds can vary widely.


Hospitals, healthcare ISVs and SaaS providers need an MSP that meets and exceeds the administrative, technical, and physical safeguards established in the HIPAA Security Rule. The following criteria either must or should be met by an MSP:


1. Must offer business associate agreements


An MSP must offer a Business Associate Agreement (BAA) if it hopes to attract healthcare business. When a Business Associate is under a BAA, they are subject to audits by the Office for Civil Rights (OCR) and could be accountable for a data breach and fined for noncompliance.

According to HHS, covered entities are not required to monitor or oversee how their Business Associates carry out privacy safeguards, or in what ways MSPs abide by the privacy requirements of the contract. Furthermore, HHS has stated that a healthcare organisation is not liable for the actions of an MSP under BAA unless otherwise specified.


An MSP should be able to provide a detailed responsibility matrix that outlines which aspects of compliance are the responsibility of whom. Overall, while an MSP allows healthcare organisations to outsource a significant amount of both the technical effort and the risk of HIPAA compliance, organisations should still play an active role in monitoring MSPs. After all, an OCR fine is often the least of an organisation’s worries in the event of a security breach; negative publicity is potentially even more damaging.


2. Should maintain credentials


There is no “seal of approval” for HIPAA compliance that an MSP can earn. The OCR grants no such qualifications. However, any hosting provider offering HIPAA compliant hosting should have had their offering audited by a reputable auditor against the HIPAA requirements as defined by HHS.


In addition, the presence of other certifications can assist healthcare organisations in choosing an MSP that takes security and compliance concerns very seriously. A well-qualified MSP will maintain the following certifications:

  •      SSAE-16
  •      SAS70 Type II
  •      SOX Compliance
  •      PCI DSS Compliance


While these certifications are by no means required for HIPAA compliance, the ability to earn such qualifications indicates a high level of security and compliance expertise. They require extensive (and expensive) investigations by 3rd party auditors of physical infrastructure and team practices.


3. Should offer guaranteed response times


Providers should indicate guaranteed response times within their Service Level Agreement. While 24/7/365 NOC support is crucial, the mere existence of a NOC team is not sufficient for mission-critical applications; healthcare organisations need a guarantee that the MSP’s NOC and security teams will respond to routine changes and to security threats in a timely manner.  Every enterprise should have guaranteed response times for non-critical additions and changes, as well.


How such changes and threats are prioritized and what response is appropriate for each should be the subject of intense scrutiny by healthcare organisations, who also have HIPAA-regulated obligations in notifying authorities of security breaches.


4. Must meet data encryption standards


The right MSP will create infrastructure that is highly secure by default, meaning that the highest security measures should be applied to any component where such measures do not interfere with the function of the application. In the case of data encryption, although the HIPAA Security Rule treats encryption as an addressable specification rather than an absolute requirement, data should reasonably be encrypted everywhere by default, both at rest and in transit.


When MSPs and healthcare organisations encrypt PHI, they are within the “encryption safe harbor.” Unauthorised disclosure will not be considered a breach and will not necessitate a breach notification if the disclosed PHI is encrypted.


Strong encryption policies are particularly important in public cloud deployments. The MSP should be familiar with best practices for encrypting data both within the AWS environment and in transit between AWS and on-site back-ups or co-location facilities. We discuss data encryption best practices for HIPAA compliant hosting on AWS here.


It is important to note that not all encryption is created equal; look for an MSP that guarantees at least AES-256 encryption, the level enforced by federal agencies. AWS' check-box encryption of EBS volumes meets this standard.
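For illustration, AES-256 in an authenticated mode (GCM) looks like this with the widely used Python cryptography library. This is a minimal sketch only: key management is omitted, and in practice keys belong in a KMS or HSM, never in application code.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

key = AESGCM.generate_key(bit_length=256)  # AES-256 key; store in a KMS/HSM, never in code
aesgcm = AESGCM(key)

def encrypt_phi(plaintext: bytes, associated_data: bytes = b"") -> bytes:
    nonce = os.urandom(12)  # must be unique per message for GCM
    return nonce + aesgcm.encrypt(nonce, plaintext, associated_data)

def decrypt_phi(blob: bytes, associated_data: bytes = b"") -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, associated_data)

blob = encrypt_phi(b"patient: Jane Doe, dx: ...")
assert decrypt_phi(blob) == b"patient: Jane Doe, dx: ..."
```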


5. Should have “traditional IT” and cloud expertise


Major healthcare organisations have begun to explore public cloud solutions. However, maintaining security in public clouds and in hybrid environments across on-premises and cloud infrastructure is a specialty few MSPs have learned. “Born in the Cloud” providers, whose businesses started recently and are made up exclusively of cloud experts, are quite simply lacking the necessary experience in complex, traditional database and networking that would enable them to migrate legacy healthcare applications and aging EHR systems onto the public cloud without either a) over-provisioning or b) exposing not-fully-understood components to security threats.


No matter the marketing hype around “Born in the Cloud” providers, it certainly is possible to have best-in-class DevOps and cloud security expertise and a strong background in traditional database and networking. In fact, this is what any enterprise with legacy applications should expect.


Hiring an MSP that provides private cloud, bare metal hosting, database migrations, legacy application hosting, and also has a dedicated senior cloud team is optimal. This ensures that the team is aware of the unique features of the custom hardware that currently supports the infrastructure, and will not expose the application to security risks by running the application using their “standard” instance configuration.


6. Must provide ongoing auditing and reporting


The HIPAA Security Rule requires that covered entities “regularly” audit their own environment for security threats. It does not, however, define “regularly,” so healthcare organisations should request the following from their MSPs:


  • Monthly or quarterly engineering reviews, both for security concerns and cost effectiveness
  • Annual 3rd party audits
  • Regular IAM reports. A credential report can be generated every four hours; it lists all of the organisation's users and access keys (a minimal scripted example follows this list).
  • Monthly re-certification of staff’s IAM roles
  • Weekly or daily reports from 3rd party security providers, like Alert Logic or New Relic
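As an example of the kind of report automation an organisation might request or run itself, the sketch below pulls the AWS IAM credential report with boto3. It assumes AWS credentials and the iam:GenerateCredentialReport and iam:GetCredentialReport permissions are already configured; the reporting logic shown is illustrative, not a prescribed compliance procedure.

```python
import csv
import io
import time

import boto3  # assumes AWS credentials are already configured

iam = boto3.client("iam")

def fetch_credential_report():
    """Request and download the account's IAM credential report as a list of rows."""
    # Report generation is asynchronous; poll until it is ready.
    while iam.generate_credential_report()["State"] != "COMPLETE":
        time.sleep(2)
    report = iam.get_credential_report()
    return list(csv.DictReader(io.StringIO(report["Content"].decode("utf-8"))))

for row in fetch_credential_report():
    if row.get("access_key_1_active") == "true":
        print(row["user"], "has an active access key")
```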


7. Must maintain compliant staffers and staffing procedures


HIPAA requires organisations to provide training for new workforce members as well as periodic reminder training. As a business associate, the MSP has certain obligations for training their own technical and non-technical staff in HIPAA compliance. There are also certain staff controls and procedures that must be in place and others that are strongly advisable. A covered entity should ask the MSP the following questions:


  • What formal sanctions exist against employees who fail to comply with security procedures?
  • What supervision exists of employees who deal with PHI?
  • What is the approval process for internal collaboration software or cloud technologies?
  • How do employees gain access to your office? Is a FOB required?
  • What is your email encryption policy?
  • How will your staff inform our internal IT staff of newly deployed instances/servers? How will keys be communicated, if necessary?
  • Is there a central authorisation hub such as Active Directory for the rapid decommissioning of employees?
  • Can you provide us with your staff’s HIPAA training documents?
  • Do you provide security threat updates to staff?
  • What are internal policies for password rotation?
  • (For Public Cloud) How are root account keys stored?
  • (For Public Cloud) How many staff members have Administrative access to our account?
  • (For Public Cloud) What logging is in place for employee access to the account? Is it distinct by employee, and if federated access is employed, where is this information logged?


While the answers to certain of these questions do not confirm or deny an MSP's degree of HIPAA compliance, they may help distinguish a new company that just wants to attract lucrative healthcare business from a company already well versed in such procedures.


8. Must secure physical access to servers


In the case of a public cloud MSP, the MSP should be able to communicate why its cloud platform of choice maintains physical data centres that meet HIPAA standards. To review AWS's physical data centre security measures, see its white paper on the subject. If a hybrid or private cloud is also maintained with the MSP, the MSP should provide a list of global security standards for its data centres, including ISO 27001, SOC, FIPS 140-2, FISMA, and DoD CSM Levels 1-5, among others. The specific best practices for physical data centre security that healthcare organisations should look out for are well covered in ISO 27001 documentation.


9. Should conduct risk analysis in accordance with NIST guidelines


The National Institute of Standards and Technology, or NIST, is a non-regulatory federal agency under the Department of Commerce. NIST develops information security standards that set the minimum requirements for any information technology system used by the federal government.


NIST publishes Special Publications that outline recommended security practices, and its most recent Guide for Conducting Risk Assessments provides guidance on how to prepare for, conduct, communicate, and maintain a risk assessment, as well as how to identify and monitor specific risk factors. The NIST 800 series has become a foundational set of documents for service providers and organisations in the information systems industry.


An MSP should be able to provide a report that communicates the results of the most recent risk assessment, as well as the procedure by which the assessment was accomplished and the frequency of risk assessments.


Organisations can also obtain NIST 800-53 Certification from NIST as a further qualification of security procedures. While again this is not required of HIPAA Business Associates, it indicates a sophisticated risk management procedure — and is a much more powerful piece of evidence than standard marketing material around disaster recovery and security auditing.


10. Must develop a disaster recovery plan and business continuity plan


The HIPAA Contingency Plan standard requires the implementation of a disaster recovery plan. This plan must anticipate how natural disasters, security attacks, and other events could impact systems that contain PHI, and it must establish policies and procedures for responding to such situations.

An MSP must be able to provide their disaster recovery plan to a healthcare organisation, which should include answers to questions like these:

  • Where is backup data hosted? What procedure maintains retrievable copies of ePHI?
  • What procedures identify suspected security incidents?
  • Who must be notified in the event of a security incident? How are such incidents documented?
  • What procedure documents and restores the loss of ePHI?
  • What is the business continuity plan for maintaining operations during a security incident?
  • How often is the disaster recovery plan tested?


11. Should already provide service to large, complex healthcare clients


Although the qualifications listed above are more valuable evidence of HIPAA compliance, a roster of clients with large, complex, HIPAA-compliant deployments provides extra assurance. This pedigree is particularly useful in vendor-selection discussions with non-technical business executives. The MSP's ability to retain healthcare clients over the long term (two to three years or more) is also important to consider.

Scoop.it!

Two Sentenced in HIPAA Criminal Case


Two individuals - a former hospital worker and a convicted drug trafficker - have been sentenced to serve time in federal prison for HIPAA privacy violations.


But the May 18 sentencing for HIPAA violations of drug "kingpin" Stuart Seugasala is the least of his problems. He'll be serving his 10-year HIPAA-related prison sentence concurrently with three life sentences for his January convictions on drug trafficking conspiracy and two kidnapping charges. In addition to that, he'll serve a consecutive seven-year sentence on firearm violations.


In a statement, the U.S. Department of Justice notes that because there is no parole in the federal penal system, Seugasala will spend the rest of his life in custody.


Meanwhile, as part of the same criminal case, Stacy Laulu, a former financial worker at Providence Alaska Medical Center in Anchorage, was sentenced on May 29 to two years in federal prison for each of her two counts of unauthorized disclosure of health information, for which she was convicted in January. She will serve the two-year sentences concurrently. Federal prosecutors say Seugasala in March 2013 contacted Laulu, a friend, to find out whether two victims of his crimes, who were both admitted to Providence Alaska Medical Center with injuries inflicted by Seugasala and two other accomplices, had reported him to police.


"Laulu accessed the private electronic medical files of the victims and reported back to Seugasala," according to the Justice Department statement. Laulu went to trial with Seugasala in January and was convicted of violating the privacy rights of the victims.


Unlike Laulu, Seugasala received the maximum 10-year sentence on his HIPAA conviction. The HIPAA case is the first in the history of Alaska "and one of few such cases prosecuted in the country," federal prosecutors note. Judge Ralph Beistline, who presided over the Seugasala and Laulu cases, said that in committing these HIPAA violations, Seugasala "disrespected the victims again."


While imposing the life sentences on Seugasala, Beistline told him, "You enjoyed being a drug kingpin, you seemed to enjoy the misery that you created, and you enjoyed your criminal posse," according to the Justice Department statement.


Three other individuals involved with the criminal case were also convicted and sentenced on a variety of charges that included drug conspiracy, drug trafficking and kidnapping - but not HIPAA violations.

Relatively Rare Cases

While HIPAA criminal convictions are themselves unusual, it's even rarer in cases involving individuals who are not employed by a covered entity or business associate, says privacy attorney Adam Greene of the law firm Davis Wright Tremaine. In those rare cases, the individuals have usually been convicted of HIPAA violations related to other crimes, he notes.


"Before the HITECH Act in 2009, the criminal conviction could be based on an aiding and abetting or conspiracy charge, where the non-employee causes the HIPAA covered entity to violate HIPAA. We have seen this in identity theft cases," he says. "The HITECH Act amended the criminal provision to more explicitly permit prosecutors to go after anyone who improperly obtains or discloses health information, even if not part of a covered entity."


Criminal prosecutions tied to HIPAA violations often are tied to cases "involving criminal conduct, such as identity theft, other fraud, or in this case drug trafficking crimes," Greene notes. "So far, prosecutors have seemed to be more interested in using HIPAA as a secondary charge in other criminal matters rather than seeking to prosecute matters that only involve inappropriate access or use of health information."

Other Cases

There have been only a handful of other federal criminal HIPAA cases elsewhere in the U.S. The 10-year sentence for Seugasala's HIPAA crimes is apparently the most substantial so far.

Among other recent cases was the sentencing in February of Texas hospital worker Joshua Hippler to 18 months in federal prison for criminal HIPAA violations.


Hippler, 30, formerly of Longview, Texas, was sentenced after pleading guilty on Aug. 28, 2014, to wrongful disclosure of individually identifiable health information, according to federal prosecutors.

Prosecutors say that from December 2012 through January 2013, Hippler was an employee of an unidentified East Texas hospital. During this time, he obtained protected health information with the intent to use it for personal gain, they say.


In another HIPAA prosecution, Denetria Barnes, a former nursing assistant at a Florida assisted living facility, was sentenced in October 2013 to 37 months in prison after pleading guilty to several federal offenses, including conspiracy to defraud the U.S. government and wrongful disclosure of HIPAA protected information.


And in April 2013, Helene Michel, the former owner of a Long Island, N.Y., medical supply company, was sentenced to 12 years in prison in a case that involved $10.7 million in Medicare fraud as well as criminal HIPAA violations.

Aside from those cases, defendants sentenced for criminal HIPAA violations have generally received much lighter sentences.

For example, last November, Christopher R. Lykes Jr., a former South Carolina state employee, was sentenced to three years of probation, plus community service, after he sent personal information about more than 228,000 Medicaid recipients to his personal e-mail account. Lykes pleaded guilty to four counts of willful examination of private records by a public employee and one count of criminal conspiracy.


And in 2010, former UCLA Healthcare System surgeon Huping Zhou was sentenced to four months in prison after admitting he illegally read private electronic medical records of celebrities and others. Zhou was the first defendant in the nation to receive a prison sentence for a HIPAA privacy violation, according to the U.S. attorney's office for the central district of California.

Scoop.it!

Beacon Health Is Latest Hacker Victim


Yet another large hacker attack has been revealed in the healthcare sector. But unlike three recent cyber-attacks, which targeted health insurers, this latest breach, which affected nearly a quarter-million individuals, involved a healthcare provider organization.


South Bend, Ind.-based Beacon Health System recently began notifying 220,000 patients that their protected health information was exposed as a result of phishing attacks on some employees that started in November 2013, leading to hackers accessing "email boxes" that contained patient data.


The Beacon Health incident is a reminder that healthcare organizations should step up staff training about phishing threats as well as consider adopting multi-factor authentication, shifting to encrypted email and avoiding the use of email to share PHI.

"Email - or at least any confidential email - going outside the organization's local network should be encrypted. And increasingly, healthcare organizations are doing just that," says security and privacy expert Kate Borten.


Unfortunately, in cases where phishing attacks fool employees into giving up their email logon credentials, encryption is moot, she says. "Although encryption is an essential protection when PHI is sent over public networks, and stored somewhere other than within IT control, it is only one of many, many security controls. There's no silver bullet."

At the University of Vermont Medical Center, which has seen an uptick in phishing scams in recent months, the organization has taken a number of steps to bolster security, including implementing two-factor authentication "for anything facing the Web, because that can pretty much render phishing attacks that are designed to steal credentials useless," says CISO Heather Roszkowski.
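To make the two-factor point concrete, here is a minimal, hypothetical sketch of TOTP verification using the third-party pyotp library. It is not the University of Vermont Medical Center's implementation; the account and issuer names are placeholders, and a production rollout would integrate this with the existing password check and enrollment workflow.

```python
# Minimal TOTP sketch using the third-party pyotp library (assumed installed);
# the account and issuer names below are placeholders.
import pyotp

# Enrollment: generate a per-user secret and hand it to the user's authenticator app.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print("Provisioning URI:", totp.provisioning_uri(name="clinician@example.org",
                                                 issuer_name="Example Health"))

# Login: even a phished password is useless without the current six-digit code.
submitted_code = totp.now()          # stand-in for the code the user types in
print("Second factor accepted:", totp.verify(submitted_code))
```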

The Latest Hacker Attack

On March 26, Beacon Health's forensic team discovered the unauthorized access to the employees' email accounts while investigating a cyber-attack. On May 1, the team determined that the affected email accounts contained PHI. The last unauthorized access to any employee email account was on Jan. 26, the health system says.


"While there is no evidence that any sensitive information was actually viewed or removed from the email boxes, Beacon confirmed that patient information was located within certain email boxes," Beacon Health says in a statement posted on its website. "The majority of accessible information related only to patient name, doctor's name, internal patient ID number, and patient status (either active or inactive). The accessible information, which was different for different individuals, included: Social Security number, date of birth, driver's license number, diagnosis, date of service, and treatment and other medical record information."


The provider organization says it has reported the incident to the U.S. Department of Health and Human Services, various state regulators, and the FBI.

Hospital Patients Affected

A Beacon Health spokeswoman tells Information Security Media Group that the majority of those affected by the breach were patients of Memorial Hospital of South Bend or Elkhart General Hospital, which combined have more than 1,000 beds. The two facilities merged in 2012 to form the health system. Individuals who became patients of Beacon Health after Jan. 26 were not affected by the breach, she says.


The breach investigation is being conducted by the organization's own forensics team, the spokeswoman says.

Affected individuals are being offered one year of identity and credit monitoring.


The news about similar hacker attacks earlier this year that targeted health insurers Anthem Inc. and Premera Blue Cross prompted Beacon's forensics investigation team to "closely review" the organization's systems after discovering it was the target of a cyber-attack, the Beacon spokeswoman says.


In the wake of the incident, the organization has been bolstering its security, including making employees better aware of "the sophisticated tactics that are used by attackers," she says. That includes instructing employees to change passwords and warning staff to be careful about the websites and email attachments they click on.

The Phishing Threat

Security experts say other healthcare entities are also vulnerable to phishing.


"The important takeaway is that criminals are using fake email messages - phishing - to trick recipients into clicking links taking them to fake websites where they are prompted to provide their computer account information," says Keith Fricke, principle consultant at consulting firm tw-Security. "Consequently, the fake website captures those credentials for intended unauthorized use. Or they are tricked into opening attachments of these fake emails and the attachment infects their computer with a virus that steals their login credentials."

As for having PHI in email, that's something that, while common, is not recommended, Fricke notes. "Generally speaking, most employees of healthcare organizations do not have PHI in email. In fact, many healthcare organizations do not provide an email account to all of their clinical staff; usually managers and directors of clinical departments have email," he says. "However, for those workers that have a company-issued email account, some may choose to send and receive PHI depending on business process and business need."

Recent Hacker Attacks

As of May 28, the Beacon Health incident was not yet posted on the HHS Office for Civil Rights' "wall of shame" of health data breaches affecting 500 or more individuals.


OCR did not immediately respond to an ISMG request to comment on the recent string of hacker attacks in the healthcare sector.

Other recent hacker attacks, which targeted health insurers, include:


  • An attack on Anthem Inc., which affected 78.8 million individuals and is the largest breach listed on OCR's tally.
  • A cyber-assault on Premera Blue Cross, announced on March 17, that resulted in a breach affecting 11 million individuals.
  • An "unauthorized intrusion" into a CareFirst BlueCross BlueShield database disclosed on May 20. The Baltimore-based insurer says the attack dated back to June 2014 but wasn't discovered until April 2015. The incident resulted in a breach affecting 1.1 million individuals.


But the recent attack on Beacon Health is yet another important reminder to healthcare provider organizations that it's not just insurers that are targets. Last year, a hacking assault on healthcare provider Community Health Systems affected 4.5 million individuals.

Smaller hacker attacks have also been disclosed recently by other healthcare providers, including Partners HealthCare. And a number of other healthcare organizations in recent months have also reported breaches involving phishing attacks. That includes a breach affecting nearly 760 patients at St. Vincent Medical Group.


"Healthcare provider organizations are also big targets - [they have] more complex environments, and so have more vulnerabilities that the hackers can exploit," says security and privacy expert Rebecca Herold, CEO of The Privacy Professor. "Another contributing factor is insufficient funding for security within most healthcare organizations, resulting in insufficient safeguards for PHI in all locations where it can be stored and accessed."

Delayed Detection

A delay in detecting hacker attacks seems to be a common theme in the healthcare sector. Security experts say several factors contribute to the delayed detection.


"Attacks that compromise an organization's network and systems are harder to detect these days for a few reasons," says Fricke, the consultant. "Criminals wait longer periods of time before taking action once they successfully penetrate an organization's security defenses. In addition, the attack trend is to compromise the accounts of legitimate users rather than gaining unauthorized access to a system via a brute force attack."


When criminals access a system with an authorized account, it's more difficult to detect the intrusion, Fricke notes. "Network security devices and computer systems generate huge volumes of audit log events daily. Proactively searching for indicators of compromise in that volume of log information challenges all organizations today."
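As a rough illustration of what "searching for indicators of compromise" can look like when legitimate accounts are being abused, the hypothetical sketch below scans a login log for two simple signals consistent with Fricke's comments: activity from previously unseen addresses and off-hours access. The log format, file name, and thresholds are assumptions, not any vendor's product.

```python
# Hypothetical indicator hunt over an authentication log
# (assumed CSV columns: timestamp,user,src_ip; hour thresholds are arbitrary).
import csv
from collections import defaultdict
from datetime import datetime

known_ips = defaultdict(set)   # user -> source addresses seen so far
alerts = []

with open("auth_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        ts = datetime.fromisoformat(row["timestamp"])
        user, ip = row["user"], row["src_ip"]
        if known_ips[user] and ip not in known_ips[user]:
            alerts.append(f"{user}: login from previously unseen address {ip} at {ts}")
        if ts.hour < 6 or ts.hour >= 22:
            alerts.append(f"{user}: off-hours login at {ts}")
        known_ips[user].add(ip)

print("\n".join(alerts) if alerts else "no indicators flagged")
```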

As organizations step up their security efforts in the wake of other healthcare breaches, it's likely more incidents will be discovered and revealed, says privacy attorney Adam Greene of the law firm Davis Wright Tremaine.


"The challenge that many healthcare entities face is that oftentimes, the better they do at information security, the more likely it is they find potential problems. Implementing new information security tools sometimes can detect problems that may be years old," he says. "But the alternative - keeping your head in the sand - can lead to far worst results for patients and the organization."


However, as more of these delayed-detection incidents are discovered, "regulators and plaintiffs may question why any particular security issue was not identified and corrected earlier," he warns.

Accordingly, organizations should consider if there were reasonable issues that led to any delays in identifying or correcting any security lapses and maintain any related documentation supporting the cause of any delays, he suggests.


"Hindsight is 20-20, and it is always easy for regulators to question why more wasn't done sooner, and it could be challenging for the organization if it is asked to justify why it spent resources on other projects," Greene says.

Scoop.it!

Medical Staff Resistance to HIPAA Compliance


Recently, while reading a 2013 article in Information Week, "Doctors Push Back Against Health IT's Workflow Demands," I thought about various scenarios individuals have brought to my attention. It is indisputable that both the healthcare industry and physicians have been dealing with a dramatic shift in the landscape and, in turn, having to adapt to and implement a variety of new processes. In the article, the authors say, "There's a powerful force working against the spread of health IT: physician anger, as doctors resist adopting workflows that can feel to them more like manufacturing than traditional treatment." There are several reasons for this: uncertainty in reimbursement, the transition to ICD-10, and compliance requirements related to HIPAA and the Affordable Care Act.

Some of the situations that have been brought to my attention include: entities refusing to sign a Business Associate Agreement (BAA); refusing to adopt a vendor's product because it requires a password that must be used and periodically changed in order to send text messages; and sharing a username/password with other members of the care team so they can change or augment the electronic health record. Needless to say, all of these scenarios are problematic for several reasons. First and foremost, they violate the legal standards set forth in HIPAA, the HITECH Act, and the 2013 Final Omnibus Rule. Second, engaging in these practices makes the individual and the organization more vulnerable to breaches and liability. Lastly, refusing to utilize a password that would strengthen both IT security and compliance is simply foolish.


At its core, a Business Associate Agreement is required between parties who create, receive, maintain, or transmit protected health information (PHI) on behalf of or for a covered entity. The phrase "on behalf of or for" is crucial because it extends beyond the relationship between the covered entity and a single business associate. This is the requirement of federal HIPAA. States may, and in fact do, have more stringent requirements.


One of the greatest areas of vulnerability is texting sensitive data using smartphones. Hence, it is crucial to make sure that the messaging app encrypts data and requires a password (ideally backed by two-factor authentication). Yet I have heard stories of physicians belligerently refusing to adopt a technology because of that requirement.

Lastly, providing a nurse or PA with access to a medical record using the physician's username and password is absurd. Think of the Ebola case in Dallas, Texas, where the nurses left notes in one section of the record that the physicians did not read. What if both individuals had been using the same user ID and password? How easy would it be to look at the audit log and determine who made the entry? The legal liability associated with this practice is enormous.
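The audit-log point is easier to see with a toy example. The sketch below is illustrative only, not any particular EHR's API: it records each chart action under the individual's own unique user ID, which is exactly the attribution that becomes impossible to reconstruct when credentials are shared.

```python
# Toy append-only access log; the file name and field names are assumptions.
import json
from datetime import datetime, timezone

AUDIT_LOG = "ehr_audit.jsonl"

def record_access(user_id: str, patient_id: str, action: str) -> None:
    """Append one attributable event; user_id must be the individual's own unique login."""
    event = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,       # a distinct ID per person is what lets the audit log answer "who?"
        "patient_id": patient_id,
        "action": action,         # e.g. "view" or "amend"
    }
    with open(AUDIT_LOG, "a") as f:
        f.write(json.dumps(event) + "\n")

# The nurse documents under her own account, not the physician's shared credentials.
record_access("rn_jsmith", "MRN-0042", "amend")
```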


Given that these scenarios really do happen, what steps can be taken by physicians and other entities? Here are a few suggestions:


• Adopt a "no tolerance" policy and sanctions for non-compliance from the medical staff in relation to HIPAA compliance. Many organizations have these in place.

• Get your Business Associate Agreements in order and keep a log of all the vendors, business associates, and other entities that need one, along with the date each was executed.

• Never give your user ID/password to anyone; a system administrator who legitimately needs access already has it.


Scoop.it!

Data breach at White Plains Hospital involving emergency room patients


A security breach has been disclosed at a hospital in Westchester County.

Personal information about hundreds of emergency room patients over a two-year period was leaked to someone or some entity that shouldn't have it.

So what if you're one of those patients? And who gave away the information?

White Plains Hospital is the latest target of a data breach.

An employee working for a billing company called Medical Management LLC allegedly copied personal information, including names, dates of birth, and Social Security numbers, and then gave it to a third party.

MML handles the billing and coding for White Plains Hospital's emergency room.

"It should be held securely. Its information you should not give to certain people. I don't like giving my information out at all to anybody," said Jeffry Jones, a former patient.

The employee was fired and other hospitals in the state are affected.

Now patients at White Plains Hospital are waiting to find out if their personal information was compromised.

"We're going to have to catch the company that's doing it. Wipe them out. The hospital is great. They're making them look bad. It's not right for them to mess up our lives," said Diana Bennett, a patient.

The breach was from February 2013 to March 2015.

Now the hospital is offering identity theft protection services for anyone who may have been impacted.

Credit protection expert Adam Levine has offered advice for the 1,100 people affected.

Anyone who may have fallen victim will be notified by mail.

Those affected by the breach are also being offered identity theft protection services at no cost.

There was no indication that any medical history or treatment information was disclosed.

Victims are being advised to place a fraud alert or a security freeze on their accounts through a national credit bureau and to review all bills and account statements.


Scoop.it!

Hospital Slammed With $218,000 HIPAA Fine


Federal regulators have slapped a Boston area hospital with a $218,000 HIPAA penalty after an investigation following two security incidents. One involved staff members using an Internet site to share documents containing patient data without first assessing risks. The other involved the theft of a worker's personally owned unencrypted laptop and storage device.


The Department of Health and Human Services' Office for Civil Rights says it has entered a resolution agreement with St. Elizabeth's Medical Center that also includes a "robust" corrective action plan to correct deficiencies in the hospital's HIPAA compliance program.

The Brighton, Mass.-based medical center is part of Steward Health Care System.


Privacy and security experts say the OCR settlement offers a number of valuable lessons, including the importance of the workforce knowing how to report security issues internally, as well as the need to have strong policies and procedures for safeguarding PHI in the cloud.

Complaint Filed

On Nov. 16, 2012, OCR received a complaint alleging noncompliance with HIPAA by medical center workforce members. "Specifically, the complaint alleged that workforce members used an Internet-based document sharing application to store documents containing electronic protected health information of at least 498 individuals without having analyzed the risks associated with such a practice," the OCR statement says.


OCR's subsequent investigation determined that the medical center "failed to timely identify and respond to the known security incident, mitigate the harmful effects of the security incident and document the security incident and its outcome."


"Organizations must pay particular attention to HIPAA's requirements when using internet-based document sharing applications," says Jocelyn Samuels, OCR director in the statement. "In order to reduce potential risks and vulnerabilities, all workforce members must follow all policies and procedures, and entities must ensure that incidents are reported and mitigated in a timely manner."


Separately, on Aug. 25, 2014, St. Elizabeth's Medical Center submitted notification to OCR regarding a breach involving unencrypted ePHI stored on a former hospital workforce member's personal laptop and USB flash drive, affecting 595 individuals. The OCR "wall of shame" website of health data breaches impacting 500 or more individuals says the incident involved a theft.

Corrective Action Plan

In addition to the financial penalty - which OCR says takes into consideration the circumstances of the complaint and breach, the size of the entity, and the type of PHI disclosed - the agreement includes a corrective action plan "to cure gaps in the organization's HIPAA compliance program raised by both the complaint and the breach."

The plan calls for the medical center to:


  • Conduct a "self-assessment" of workforce members' familiarity and compliance with the hospital's policies and procedures that address issues including transmission and storage of ePHI;
  • Review and revise policies and procedures related to ePHI; and
  • Revise workforce training related to HIPAA and protection of PHI.

Lessons Learned

Other healthcare organizations and their business associates need to heed some lessons from OCR's latest HIPAA enforcement action, two compliance experts say.


Privacy attorney Adam Greene of the law firm Davis Wright Tremaine notes: "The settlement indicates that OCR first learned of alleged noncompliance through complaints by the covered entity's workforce members. Entities should consider whether their employees know how to report HIPAA issues internally to the privacy and security officers and ensure that any concerns are adequately addressed. Otherwise, the employees' next stop may be complaining to the government."

The settlement also highlights the importance of having a cloud computing strategy, Greene points out. That strategy, he says, should include "policies, training and potential technical safeguards to keep PHI off of unauthorized online file-sharing services."
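One hypothetical example of the kind of technical safeguard Greene describes is a simple outbound-content check that blocks a document from reaching an unsanctioned file-sharing service when it contains identifier patterns. The patterns and file name below are assumptions, and a real deployment would rely on a full data loss prevention product rather than two regular expressions.

```python
# Illustrative pre-upload check; patterns, file name, and blocking policy are assumptions.
import re

SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
MRN_PATTERN = re.compile(r"\bMRN[- ]?\d{6,}\b", re.IGNORECASE)   # assumed local record-number format

def looks_like_phi(text: str) -> bool:
    """True if the text contains identifier patterns that should block an external upload."""
    return bool(SSN_PATTERN.search(text) or MRN_PATTERN.search(text))

with open("discharge_summary.txt") as f:     # placeholder document name
    document = f.read()

if looks_like_phi(document):
    raise PermissionError("Possible PHI detected - upload to external file-sharing service blocked")
```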


The enforcement action spotlights the continuing challenge of preventing unencrypted PHI from ending up on personal devices, where it may become the subject of a breach, he notes.

The case also sheds light on how OCR evaluates compliance issues, he says. "The settlement highlights that OCR will look at multiple HIPAA incidents together, as it is not clear that OCR would have entered into a settlement agreement if there had only been the incident involving online file sharing software, but took action after an unrelated second incident involving PHI ending up on personal devices."


Privacy attorney David Holtzman, vice president of compliance at security consulting firm CynergisTek, says the settlement "serves as an important reminder that a covered entity or a business associate must make sure that the organization's risk assessment takes into account any relationship where PHI has been disclosed to a contractor or vendor so as to ensure that appropriate safeguards to protect the data are in place."


The alleged violations involving the document sharing vendor, he says, "involve failure to have a BA agreement in place prior to disclosing PHI to the vendor, as well as failing to have appropriate security management processes in place to evaluate when a BA agreement is needed when bringing on a new contractor that will handle PHI."

St. Elizabeth's Medical Center did not immediately respond to an Information Security Media Group request for comment.

Previous Settlements

The settlement with the Boston-area medical center is the second HIPAA resolution agreement signed by OCR so far this year. In April, the agency OK'd an agreement with Cornell Prescription Pharmacy for an incident related to the unsecure disposal of paper records containing PHI. In that agreement, Cornell was fined $125,000 and also adopted a corrective action plan to correct deficiencies in its HIPAA compliance program.


The settlement with St. Elizabeth's is the 25th HIPAA enforcement action involving a financial penalty and/or resolution agreement that OCR has taken since 2008.


But privacy advocate Deborah Peel, M.D., founder of Patient Privacy Rights, says OCR isn't doing enough to crack down on organizations involved in HIPAA privacy breaches.


"Assessing penalties that low - St. Elizabeth will pay $218,400 - guarantees that virtually no organizations will fix their destructive practices," she says. "Industry views low fines as simply a cost of doing business. They'll take their chances and see if they're caught."


The largest HIPAA financial penalty to date issued by OCR was a $4.8 million settlement with New York-Presbyterian Hospital and Columbia University for incidents tied to the same 2010 breach that affected about 6,800 patients. The incidents involved unsecured patient data on a network.

Scoop.it!

Hospital to pay $218,400 for HIPAA violations


St. Elizabeth's Medical Center must pay $218,400 for HIPAA violations through an agreement with the Department of Health and Human Services' Office for Civil Rights.


In 2012, the OCR received a complaint alleging that the Brighton, Massachusetts-based health center did not analyze the risks of an Internet-based document sharing app, which stored protected health information for almost 500 individuals, according to an announcement from OCR.


During its investigation, OCR found that the health center "failed to timely identify and respond to the known security incident, mitigate the harmful effects of the security incident, and document the security incident and its outcome." In addition, St. Elizabeth's in 2014 submitted notification to OCR that a laptop and USB drive had been breached, putting unsecured protected health information for 595 consumers at risk.

OCR also is requiring that St. Elizabeth's adopt a corrective action plan to correct deficiencies in its HIPAA compliance program.


"Organizations must pay particular attention to HIPAA's requirements when using Internet-based document sharing applications," OCR Director Jocelyn Samuels said in an announcement. "In order to reduce potential risks and vulnerabilities, all workforce members must follow all policies and procedures, and entities must ensure that incidents are reported and mitigated in a timely manner."


A recent report from application security vendor Veracode found that the healthcare industry fares poorly compared to other industries in reducing application security risk.


Healthcare also is near the bottom of the pack when it comes to addressing remediation, with only 43 percent of known vulnerabilities being remediated.


While Phase II of the federal HIPAA audit program remains "under development," Samuels reiterated in March that OCR is "committed to implementing a robust audit program," FierceHealthIT previously reported.

Scoop.it!

Premera Blue Cross Data Breach Results in Several Lawsuits, Class Actions


Premera, the third-largest health insurer in Washington State, was hit with a cyber attack initiated on May 5 of last year. The Premera attack exposed the personal information of as many as 11 million current and former clients of Premera across the US. While Premera noted on January 29 of this year - the day the data breach was discovered - that, to the best of its knowledge, none of the personal data had been used surreptitiously, the fact remains that the data mined by cyber attackers is exactly the kind of information useful for perpetrating identity theft.

To that end, it has been reported that the cyber attackers targeted sensitive personal information such as names, dates of birth, Social Security numbers, mailing addresses, e-mail addresses, phone numbers, member identification numbers, bank account information, and claims and clinical information.

As for why the attack was not discovered for some eight months, Premera has said little. However, the breadth of the attack - affecting some 11 million people - and the delay in discovering the breach (initiated May 5, 2014 and revealed January 29, 2015) will likely provide much fodder for Premera cyber attack lawsuits.

According to the Puget Sound Business Journal, the New York Times had suggested the Premera cyber attack may have been perpetrated by the same China-based hackers who are suspected of breaching the federal Office of Personnel Management (OPM) last month. However, the VP for communications at Premera, Eric Earling, notes there is no certainty the attack originated in China.

“We don’t have definitive evidence on the source of the attack and have not commented on that,” he said. “It continues to be under investigation by the FBI [Federal Bureau of Investigation] and we would leave the speculation to others.”

That said, it has been reported that the US government has traced all of these attacks to China.

Recent data breach attacks, including the Vivacity data breach and Connexion data breach, are reflective of a shift in targets, according to cyber attack experts. The attacks to the data systems of the federal OPM notwithstanding, it seems apparent that hackers are increasingly shifting their targets to health insurers in part due to the breadth of information available from the health records of clients.

The goal of cyber attackers in recent months, according to claims appearing in the New York Times, is to amass a huge trove of data on Americans.

Given such a headline as "Premera Blue Cross Reports Data Breach of 11 Million Accounts," it appears they have a good start. While it might be a "win" for the hackers to acquire such data surreptitiously and illegally, it remains a huge loss in both privacy and peace of mind for millions of Americans who entrust their personal information to insurance providers, who, in turn, require such information in order to provide service. Consumers and clients have also historically assumed that such providers have taken steps to ensure their personal information is secure.

When it isn’t - and it takes eight months for a cyber attack to be identified - consumers have little recourse than to launch a Premera cyber attack lawsuit in order to achieve compensation for the breach, and as a hedge for the possibility of ample frustration down the road were the breach to evolve in a full-blown identity theft.

To that end, five class-action data breach lawsuits have been filed in US District Court for the District of Seattle. According to reports, two of the five lawsuits allege that Premera was warned in an April 2014 draft audit by the OPM that its IT systems "were vulnerable to attack because of inadequate security precautions," according to the text of the lawsuits.

Tennielle Cossey et al. vs. Premera asserts that the audit in question, “identified… vulnerabilities related to Premera’s failure to implement critical security patches and software updates, and warned that ‘failure to promptly install important updates increases the risk that vulnerabilities will not be.’

“If the [OPM] audit were not enough, the events of 2014 alone should have placed Premera on notice of the need to improve its cyber security systems.”

Moving forward, Premera Blue Cross data breach lawsuits are being consolidated into multidistrict litigation, given the number of Americans affected and their various locations across the country. An initial case management conference has been scheduled for August 7.

Scoop.it!

Website Error Leads to Data Breach


An error in a coding upgrade for a Blue Shield of California website resulted in a breach affecting 843 individuals. The incident is a reminder to all organizations about the importance of sound systems development life cycle practices.


In a notification letter being mailed by Blue Shield of California to affected members, the insurer says the breach involved a secure website that group health benefit plan administrators and brokers use to manage information about their own plans' members. "As the unintended result of a computer code update Blue Shield made to the website on May 9," the letter states, three users who logged into their own website accounts simultaneously were able to view member information associated with the other users' accounts. The problem was reported to Blue Shield's privacy office on May 18.
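Blue Shield has not published the defective code, but the behavior described fits a familiar bug class: user-specific data cached in state shared across sessions. The hypothetical sketch below shows how such a defect can arise and how scoping the cache to the authenticated user removes it; it is a generic illustration, not the insurer's actual implementation.

```python
# Generic illustration of the bug class, not Blue Shield's actual code.
member_cache = {}          # shared state keyed by a constant (the flaw)
member_cache_by_user = {}  # shared state keyed by the authenticated user (the fix)

def load_members_for(user: str):
    return [f"member record visible to {user}"]   # stand-in for a per-account database query

def handle_request_buggy(session_user: str):
    # BUG: cached under a constant key, so whoever populates it first serves everyone else.
    if "members" not in member_cache:
        member_cache["members"] = load_members_for(session_user)
    return member_cache["members"]

def handle_request_fixed(session_user: str):
    # Fix: scope cached data to the authenticated user (or, better, to the request/session).
    if session_user not in member_cache_by_user:
        member_cache_by_user[session_user] = load_members_for(session_user)
    return member_cache_by_user[session_user]

print(handle_request_buggy("broker_a"))   # broker_a's records
print(handle_request_buggy("broker_b"))   # still broker_a's records: the cross-account leak
```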


Blue Shield of California tells Information Security Media Group that the site affected was the company's Blue Shield Employer Portal. "This issue did not impact Blue Shield's public/member website," the company says. When the issue was discovered, the website was promptly taken offline to identify and fix the problem, according to the insurer.


"The website was returned to service on May 19, 2015," according to the notification letter. The insurer is offering all impacted individuals free credit monitoring and identity theft resolution services for one year.


Exposed information included names, Social Security numbers, Blue Shield identification numbers, dates of birth and home addresses. "None of your financial information was made available as a result of this incident," the notification letter says. "The users who had unauthorized access to PHI as a result of this incident have confirmed that they did not retain copies, they did not use or further disclose your PHI, and that they have deleted, returned to Blue Shield, and/or securely destroyed all records of the PHI they accessed without authorization."


The Blue Shield of California notification letter also notes that the company's investigation revealed that the breach "was the result of human error on the part of Blue Shield staff members, and the matter was not reported to law enforcement authorities for further investigation."

Similar Incidents

The coding error at Blue Shield of California that led to the users being able to view other individuals' information isn't a first in terms of programming mistakes on a healthcare-sector website leading to privacy concerns.


For example, in the early weeks of the launch of HealthCare.gov in the fall of 2013, a software glitch allowed a North Carolina consumer to access personal information of a South Carolina man. The Department of Health and Human Services' Centers for Medicare and Medicaid Services said at the time that the mistake was "immediately" fixed once the problem was reported. Still, the incident raised more concerns about the overall security of the Affordable Care Act health information exchange site.


Software design and coding mistakes that leave PHI viewable on websites have led to at least one healthcare entity paying a financial penalty to HHS' Office for Civil Rights.


An OCR investigation of Phoenix Cardiac Surgery P.C., with offices in Phoenix and Prescott, began in February 2009, following a report that the practice was posting clinical and surgical appointments for its patients on an Internet-based calendar that was publicly accessible.

The investigation determined the practice had implemented few policies and procedures to comply with the HIPAA privacy and security rules and had limited safeguards in place to protect patients' information, according to an HHS statement. The investigation led to the healthcare practice signing an OCR resolution agreement, which included a corrective action plan and a $100,000 financial penalty.


The corrective action plan required the physicians practice, among other measures, to conduct a risk assessment and implement appropriate policies and procedures.

Measures to Take

Security and privacy expert Andrew Hicks, director and healthcare practice lead at the risk management consulting firm Coalfire, says that to avoid website-related mistakes that can lead to privacy breaches, it's important that entities implement appropriate controls as well as follow the right systems development steps.


"Organizations should have a sound systems development life cycle - SDLC - in place to assess all systems in a production environment, especially those that are externally facing," he says. "Components of a mature SDLC would include code reviews, user acceptance testing, change management, systems analysis, penetration testing, and application validation testing."


Healthcare entities and business associates need to strive for more than just HIPAA compliance to avoid similar mishaps, he notes.

"Organizations that are solely seeking HIPAA compliance - rather than a comprehensive information security program - will never have the assurance that website vulnerabilities have been mitigated through the implementation of appropriate controls," he says. "In other words, HIPAA does not explicitly require penetration testing, secure code reviews, change management, and patch management, to name a few. These concepts are fundamental to IT security, but absent from any OCR regulation, including HIPAA."

Earlier Blue Shield Breach

About a year ago, Blue Shield of California reported a data breach involving several spreadsheet reports that inadvertently contained the Social Security numbers of 18,000 physicians and other healthcare providers.


The spreadsheets submitted by the plan were released 10 times by the state's Department of Managed Health Care. In California, health plans electronically submit monthly to the state agency a roster of all physicians and other medical providers who have contracts with the insurers. Those rosters are supposed to contain the healthcare providers' names, business addresses, business phones, medical groups and practice areas - but not Social Security numbers. DMHC makes those rosters available to the public, upon request.

Scoop.it!

Hospital ID Theft Leads to Fraud


Eight alleged members of an identity theft ring, including a former assistant clerk at Montefiore Medical Center in New York, have been indicted on a variety of charges stemming from using stolen information on nearly 13,000 patients to make purchases at retailers.


Ann Patterson, senior vice president and program director of the Medical Identity Fraud Alliance, says that the incident points to the need for ongoing vigilance by healthcare organizations to prevent and detect ID theft and other related crimes.


Manhattan District Attorney Cyrus Vance Jr. alleges in a statement that members of the ID theft ring made up to $50,000 in purchases at retailers in Manhattan by opening up store credit card accounts using patient information stolen by former hospital worker, Monique Walker, 32.


Walker was an assistant clerk at Montefiore Medical Center, where her position gave her access to patients' names, dates of birth, Social Security numbers, and other personal information, Vance says.

Between 2012 and 2013, Walker allegedly printed thousands of patients' records on a near daily basis and supplied them to a co-defendant, Fernando Salazar, 28, according to Vance's statement.

Salazar is accused of acting as the ringleader of the operation. He allegedly purchased at least 250 items of personal identifying information from Walker for as little as $3 per record, Vance says.


The stolen information was then allegedly provided to other defendants to open credit card accounts that were used for purchasing gift cards and merchandise at retailers, including Barneys New York, Macy's, Victoria's Secret, Zales, Bergdorf Goodman and Lord & Taylor.

Walker is charged with one count of felony grand larceny and one count of felony unlawful possession of personal identification information. The other defendants are charged with varying counts of grand larceny, identity theft and criminal possession of a forged instrument, among other charges.


All of the defendants have been arrested and arraigned in criminal court, and have various dates pending for their next court appearances.


"Case after case, we've seen how theft by a single company insider, who is often working with identity thieves on the outside, can rapidly victimize a business and thousands of its customers," Vance says. "Motivated by greed, profit and a complete disregard for their victims, identity thieves often feed stolen information to larger criminal operations, which then go on to defraud additional businesses and victims. In this case, a hospital employee privy to confidential patient records allegedly sold financial information for as little $3 per record."

Hospital Fires Worker

A Montefiore spokeswoman tells Information Security Media Group that the medical center was informed by law enforcement on May 15 of Walker's alleged crimes dating back to 2012 and 2013. As a result, Walker, who worked for the hospital for about three years, was fired, the spokeswoman says. "Montefiore is fully cooperating with law enforcement, including the Manhattan District Attorney's office," a hospital statement says.


Law enforcement discovered the connection to Montefiore patient information while investigators were working on the ID theft case, the Montefiore spokeswoman says.


Of the 12,000-plus patient records that were compromised, it's uncertain how many individuals are victims of ID theft crimes, she says. But as a precaution, Montefiore is offering all impacted patients free identity recovery services, 12 months of free credit monitoring and a $1 million insurance policy to protect against identity theft-related costs.


Montefiore has reported the breach to the Department of Health and Human Services Office for Civil Rights, the spokeswoman says. While that incident as of June 22 was not yet listed on HHS' "wall of shame" tally of health data breaches affecting 500 or more individuals, three other breaches at Montefiore Medical Center appear on the federal website.


Those incidents, all reported in 2010, involved the theft of unencrypted computers. That includes the theft of a laptop in March 2010, which resulted in a breach impacting 625 individuals, and two July 2010 thefts of desktop computers that impacted 16,820 and 23,753 individuals, respectively.

Breach Prevention Steps

In a statement, Montefiore says that following the alleged crimes committed by Walker that were discovered in May, the hospital has expanded both its technology monitoring capabilities and employee training on safeguarding and accessing patient records to further bolster its privacy safeguards.


"The employee involved in this case received significant privacy and security training and despite that training, chose to violate our policies," the statement notes. "In response to this incident, Montefiore is also adding additional technical safeguards to protect patient information from theft or similar criminal activity in the future."


A hospital spokeswoman says the hospital has rolled out "sophisticated technology" to monitor for improper access by employees to the hospital's electronic patient records.
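As a purely illustrative example of what such monitoring can involve, the sketch below flags any employee whose daily volume of record accesses exceeds a threshold, the pattern a clerk printing thousands of records a day would trip. The log format and threshold are assumptions, not Montefiore's actual tooling.

```python
# Hypothetical volume check over an access log (assumed CSV columns: date,user,patient_id).
import csv
from collections import Counter

daily_counts = Counter()
with open("record_access_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        daily_counts[(row["date"], row["user"])] += 1

THRESHOLD = 200   # assumed ceiling for a clerical role; tune per job function
for (date, user), count in sorted(daily_counts.items()):
    if count > THRESHOLD:
        print(f"ALERT {date}: {user} accessed {count} patient records")
```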


The hospital also says it performs criminal background checks on all employees and "has comprehensive policies and procedures, as well as a code of conduct, which prohibits employees from looking at patient records when there is not a work-related reason to do so."

Steps to Take

Dan Berger, CEO of security consulting firm RedSpin, says it's not surprising the breach went undetected for so long because insider attacks are difficult to uncover. It's unclear if the Montefiore hospital clerk had "good reason to access so many records" as part of her job, he notes.


Patterson of the Medical Identity Fraud Alliance notes: "In addition to proper vetting of employees, the continued evaluation of employee education and awareness training programs and of your internal fraud detection programs is necessary. It's not something you do once and are done. Employees who are properly vetted upon initial hire may have changing circumstances that change their work integrity later on in their employ."


Additionally, security measures often need tweaking as circumstances within an organization change, she says.


"Fraud detection processes that worked when a specific type of workflow procedure was in place may need to be adjusted as that workflow process changes. An emphasis on continued evaluation of all components - people, process, technologies - for fraud detection is good practice."


Workforce training is important not only for preventing breaches, including those involving ID crimes, but also to help detect those incidents, she says. "Each employee must understand their role in protecting PHI. Equally important is regular and continued evaluation of the training programs to make sure that employees are adhering to the policies put in place, and that the 'red flags' detection systems are keeping pace with changing technologies and workplace practices."

Scoop.it!

Complying with the HIPAA Nondisclosure Rule


Under the HIPAA Omnibus Rule, patients can request a restriction on a disclosure of PHI to a health plan if they pay out of pocket, in full for the service. Practices must agree to such a request unless they are required by law to bill that health plan (as is the case with some Medicaid plans).

During a session at the Medical Group Management Association 2014 Annual Conference, Loretta Duncan, senior medical practice consultant with malpractice insurer the State Volunteer Mutual Insurance Company in Brentwood, Tenn., shared some of her compliance tips:


• If the service the patient does not want disclosed is bundled with something else, explain that the patient may need to pay more out-of-pocket costs than expected.

• Make sure that communication is tight between all staff and departments regarding nondisclosure.

• Document your new nondisclosure policies and procedures.

• Be careful when e-prescribing, as pharmacies may bill to the insurance plan before the patient has a chance to let the pharmacy know that the information should not be disclosed.


What Happens in HIPAA Audits: Breaking Down HIPAA Rules


HIPAA audits are something that covered entities of all sizes must be prepared to potentially go through. As technology continues to evolve, facilities need to ensure that they are maintaining PHI security and understand how best to keep sensitive information secure.


The Department of Health & Human Services (HHS) Office for Civil Rights (OCR) had originally scheduled its second round of HIPAA audits for the fall of 2014, yet as of this publication, round two is still waiting to be scheduled. Regardless, HIPAA audits are an essential aspect of the HIPAA Privacy and Security Rules.


We’ll break down the finer points of the audit process and why it is important, while also highlighting tips for facilities in case they are selected for an OCR HIPAA audit.


What are the HIPAA audits?


The OCR HIPAA audit program was designed to analyze the processes, controls, and policies that selected covered entities have in place in relation to the HITECH Act audit mandate, according to the HHS website.

OCR established a comprehensive audit protocol that contains the requirements to be assessed through these performance audits. The entire audit protocol is organized around modules, representing separate elements of privacy, security, and breach notification. The combination of these multiple requirements may vary based on the type of covered entity selected for review.

The HIPAA audits also are designed to cover HIPAA Privacy Rule requirements in seven areas:

  • Notice of privacy practices for PHI
  • Rights to request privacy protection for PHI
  • Access of individuals to PHI
  • Administrative requirements
  • Uses and disclosures of PHI
  • Amendment of PHI
  • Accounting of disclosures.


Why are the HIPAA audits important?


HIPAA audits are not just a way for OCR to ensure that covered entities are keeping themselves HIPAA compliant. Periodic reviews of audit logs can help healthcare facilities not only detect unauthorized access to patient information but also provide forensic evidence during security investigations. Auditing also helps organizations track PHI disclosures, learn about new threats and intrusion attempts, and even gauge the overall effectiveness of the organization’s policies and user education.


In FY 2014 alone, the OCR resolved more than 15,000 complaints of alleged HIPAA violations, according to the national FY 2016 budget request proposal report.


“OCR conducted a pilot program to ensure that its audit functions could be performed in the most efficient and effective way, and in FY 2015 will continue designing, testing, and implementing its audit function to measure compliance with privacy, security, and breach notification requirements,” the report authors explained. “Audits are a proactive approach to evaluating and ensuring HIPAA privacy and security compliance.”


The HIPAA audits are important because they help incentivize covered entities to remain HIPAA compliant, but they are also an opportunity for organizations to strengthen their security measures and find any weak spots in their approach to security.


What if I am selected for the HIPAA audit program?


As previously mentioned, there is not yet an exact date for when the next round of HIPAA audits will take place, but there have been several reports that preliminary surveys have been sent to covered entities that may be selected for audits.


According to a report in The National Law Review, OCR will audit approximately 150 of the 350 selected covered entities and 50 of the selected business associates for compliance with the Security Standards. Furthermore, OCR will audit 100 covered entities for compliance with the Privacy Standards and 100 covered entities for Breach Notification Standards compliance.


Whether your organization received one of those surveys or not, it’s important to have at least a basic plan in place for potential audits. Healthcare organizations should not rely on a false sense of security; when their data systems and safeguards are reviewed, they should keep in mind what OCR will be looking for so that no areas are missed.


Current physical safeguards, administrative safeguards, and technical safeguards are not only required by the Security Rule; they also work together to protect health information. In addition to those areas, here are a few key things for covered entities to maintain, as they may play a role in the HIPAA audit process:


  • Perform comprehensive and periodic risk analyses.
  • Keep thorough inventories of business associates and their contracts or BAAs.
  • Maintain a thorough account of where ePHI is stored, including but not limited to internal databases, mobile devices, and paper documents (a minimal inventory sketch follows this list).
  • Keep thorough records of all security training that has taken place.
  • Document evidence of the facility’s encryption capabilities.
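
As a simple illustration of the inventory items above, the hypothetical Python sketch below records where ePHI lives, whether it is encrypted, and which business associate (if any) holds it, then flags entries that need follow-up. The structure and field names are assumptions, not a format required by OCR.

    # Minimal sketch of an ePHI asset inventory; fields are illustrative only.
    from dataclasses import dataclass
    from datetime import date
    from typing import List, Optional

    @dataclass
    class EphiAsset:
        name: str                                  # e.g., "Billing database"
        location: str                              # internal database, mobile device, paper
        encrypted: bool
        business_associate: Optional[str] = None   # BA holding the data, if any
        baa_signed: Optional[date] = None          # date the BAA was executed

    def needs_attention(assets: List[EphiAsset]) -> List[EphiAsset]:
        # Flag unencrypted assets and BA-held data without a signed BAA.
        return [a for a in assets
                if not a.encrypted
                or (a.business_associate and a.baa_signed is None)]

    inventory = [
        EphiAsset("Billing database", "internal database", encrypted=True,
                  business_associate="ExampleClaimsCo", baa_signed=date(2015, 1, 15)),
        EphiAsset("Clinic laptop #12", "mobile device", encrypted=False),
    ]
    for asset in needs_attention(inventory):
        print(f"Follow up: {asset.name} ({asset.location})")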


If covered entities have performed a proper risk assessment, preparing for the HIPAA audits will not be as daunting.


Maintain compliance and stay prepared


Perhaps one of the best ways to prepare for a potential OCR HIPAA audit is to keep all three safeguards current, ensuring to adjust them as necessary as technology evolves.


It is also essential for covered entities to know their BAs, and have all appropriate contracts and business associate agreements in place and up to date.


Conducting periodic risk analyses will also be beneficial, and covered entities should be able to provide evidence of compliance, such as documentation showing that policies and procedures are in place. For example, records of instances where a facility has sanctioned employees, and whether those sanctions were consistent with its sanctions policy, will be useful if an audit examines the sanction process.


Without a risk analysis, it is much more difficult for healthcare organizations to know where they are in terms of security. This can be detrimental not only for HIPAA audits, but also in maintaining comprehensive data security. Periodic reviews will help facilities continue to work toward maintaining HIPAA compliance and keeping sensitive data as secure as possible.


Unencrypted Laptop Leads To US HealthWorks Data Breach


U.S. HealthWorks, a California-based health care service provider specializing in urgent care and occupational medicine, recently alerted employees to a data breach after a password protected (but unencrypted) laptop was stolen in April.


According to its website, the company operates over 200 locations in 20 states and has 3,600 employees, but the breach notification did not specify exactly how many people may be affected.


The letter explains how an internal investigation began shortly after the company was notified on April 22, 2015, that a laptop issued to an employee was stolen from their vehicle overnight.


“On May 5, 2015, we determined that the employee’s laptop was password protected, but it was not encrypted. After conducting a thorough review, we determined that the laptop may have contained files that included your name, address, date of birth, job title, and Social Security number. Although we continue to work with law enforcement, at this time, the computer has not been located,” U.S. HealthWorks said in its notice letter to employees.


The company did not confirm whether any personal information has been accessed or used inappropriately, but it said it will offer employees free enrollment in identity protection services for one year as a precautionary measure. U.S. HealthWorks also reported efforts to ensure compliance with its laptop encryption policy going forward, including enhanced laptop deployment procedures and full disk encryption.
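
Full disk encryption status can be verified across a fleet rather than taken on faith. The Python sketch below is a hedged example that shells out to the operating system's own tools (manage-bde on Windows, fdesetup on macOS, lsblk on Linux); the output parsing is deliberately simplified and would need to be adapted to a real environment.

    # Minimal sketch: report whether the local machine appears to have
    # full disk encryption enabled, using built-in OS tools.
    import platform
    import subprocess

    def disk_encryption_enabled() -> bool:
        system = platform.system()
        if system == "Windows":
            # BitLocker status for the system drive.
            out = subprocess.run(["manage-bde", "-status", "C:"],
                                 capture_output=True, text=True).stdout
            return "Fully Encrypted" in out
        if system == "Darwin":
            # FileVault status on macOS.
            out = subprocess.run(["fdesetup", "status"],
                                 capture_output=True, text=True).stdout
            return "FileVault is On" in out
        if system == "Linux":
            # Look for dm-crypt/LUKS volumes.
            out = subprocess.run(["lsblk", "-o", "TYPE"],
                                 capture_output=True, text=True).stdout
            return "crypt" in out
        return False

    if __name__ == "__main__":
        print("Disk encryption:", "enabled" if disk_encryption_enabled() else "NOT enabled")

A check like this, run at login or by an endpoint management agent, turns an encryption policy into something auditable rather than assumed.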


With the number of security breaches on the rise, the importance of organizations controlling and protecting data is critical.

“If you have laptops in your enterprise environment, and let’s face it who doesn’t, you need to address this issue. In this day and age there really isn’t a good reason to not encrypt the hard drives on your laptops,” wrote Forbes contributor Dave Lewis in a post Monday (June 1).


While the scope and effects of this particular breach are unclear, U.S. HealthWorks does not need to look far to see that data breaches can wreak havoc. Anthem Inc., Target, Home Depot and many others have learned the hard way about the ongoing financial impacts associated with data breaches. A recent study by the Ponemon Institute found that the average cost of a data breach is now more than $3.8 million, a 23 percent increase from the levels seen two years ago.


Here's how healthcare can guard against data breaches in the "year of the hack"


Protected Health Information, or PHI, is increasingly attractive to cybercriminals. According to PhishLabs, health records can fetch as much as 10 times the value of credit card data on the black market.

Stolen healthcare records can be used for fraudulent billing which, unlike financial fraud, can go undetected for long periods of time. The rising price of healthcare records on the market is attracting more cybercriminals, who are exploiting any vulnerability they can find, be it an unpatched system or an insecure endpoint device.


We’ve all heard about several devastating data breaches in the healthcare industry this year – Anthem’s breach of more than 78 million records and the Premera Blue Cross breach of 11 million records. In the first quarter of 2015 alone, there were 87 reported data breaches affecting 500 or more individuals, according to data from the U.S. Department of Health and Human Services Office for Civil Rights. These breaches affected a combined total of 92.3 million individuals, up 3,709 percent from Q1 2014.


Given the mega breaches experienced by Anthem and Premera, one could consider them outliers. Even excluding those two breaches, however, the number of individuals affected in the first quarter of 2015 still rose 4.9 percent versus the same quarter in 2014. Although the first three months of 2014 saw three more data breaches than the same period in 2015, it is clear that the number of individuals affected per breach is on the rise.

2015 is the year of the “hack”, but people are still the root cause.

In the first quarter of the year, 33 percent of data breaches were attributed to hacking or an “IT incident,” but the methods by which cybercriminals have successfully penetrated corporate networks are quite telling. These breaches have originated from unencrypted data, unpatched systems, or compromised passwords. In 2015, several hacking incidents have been tracked back to the compromise of a single set of credentials.


The Verizon 2015 Data Breach Investigations Report analyzed nearly 80,000 security incidents including 2,122 confirmed data breaches. Its findings reveal that despite the rise in cyberattacks, 90 percent of security incidents are tied back to people and their mistakes including phishing, bad behavior, or lost devices. The report notes that, even with a detailed technical report of a security incident, the “actual root cause typically boils down to process and human decision-making.” This is frightening but also good news, as there are measures that can be taken to reduce these risks by improving upon process and education, complemented by the right data security solutions.


It’s not all about the network


Healthcare organizations reacting to data breach headlines may focus efforts on protecting the network, leaving data vulnerable to other attack vectors and overlooking the people and process risks that ultimately result in most data breaches.


Cyberattacks come from many different vector points. It only takes one missing device, one use of unsecured WiFi, one compromised password, or one click on a phishing email to compromise the entire corporate network. Many of these risks originate on the endpoint and put the corporate network at risk. Current data security strategies in healthcare cannot be a matter of network versus endpoint, nor can they ignore the “people” risk that is only amplified by such trends as BYOD, mobile work, the cloud, and the Internet of Things.


A holistic approach to healthcare security


If we don’t adopt a different approach – one that addresses the multitude of options available to cybercriminals – breaches will continue to occur. Healthcare organizations that want to get ahead of cybercriminals need to create a holistic approach to data security that incorporates threat prevention, incident detection, and efficient response.


Reduce “the attack surface”


Every point of interaction with PHI puts that data at risk. Reducing the sum total of these points of interaction – the attack surface – can reduce the risk to the data. I suggest a layered approach to data security which decreases the attack surface across endpoints as well as the network, including:

  • A foundation of tight controls and processes;
  • Encryption, which is a must but is often circumvented on its own;
  • A persistent technology that supplements encryption by maintaining a connection with a device, regardless of user or location, while defeating attempts to remove it;
  • Network segmentation, with granular access controls and tools for continuous monitoring that offer real-time intelligence about the devices on the network and their security status;
  • Automated security remediation, such as setting new firewall rules or locking down a device when suspicious activity is detected (see the sketch after this list).
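
To make the last bullet concrete, here is a hypothetical Python sketch of an automated lock-down step on a Linux gateway: it inserts an iptables rule that drops traffic from a device flagged as suspicious and removes the rule once the device is cleared. The decision logic that calls it, and the addresses used, are assumptions for illustration.

    # Minimal sketch: quarantine a suspicious device by blocking its IP at a
    # Linux gateway with iptables (requires root privileges).
    import logging
    import subprocess

    logging.basicConfig(level=logging.INFO)

    def quarantine_ip(ip_address: str) -> None:
        # Insert a DROP rule at the top of the FORWARD chain for this source IP.
        subprocess.run(["iptables", "-I", "FORWARD", "1",
                        "-s", ip_address, "-j", "DROP"], check=True)
        logging.info("Quarantined %s pending investigation", ip_address)

    def release_ip(ip_address: str) -> None:
        # Remove the matching rule once the device has been cleared.
        subprocess.run(["iptables", "-D", "FORWARD",
                        "-s", ip_address, "-j", "DROP"], check=True)
        logging.info("Released %s", ip_address)

    if __name__ == "__main__":
        quarantine_ip("10.20.30.40")   # example address, not from the article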


Minimize the “people” risk


You can have the best firewalls, encryption, and network access controls, but your employees are still your weakest link. Use a combination of process (education and interactive, ongoing training) and technology (such as mobile device management) to make employees aware of their part in protecting corporate data on endpoints.


Know how to detect anomalies


Conduct regular security audits on the network and endpoints. Know where your sensitive data resides and how it’s being used (or misused, in the case of employees) with the aid of a data loss prevention (DLP) tool. Most DLP and endpoint security tools can create automated alerts for suspicious activity.
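
Commercial DLP products do far more, but the core idea of knowing where sensitive data resides can be shown in a few lines. The Python sketch below is an assumption-laden toy that scans a file share for U.S. Social Security number patterns; the path, file types, and pattern are illustrative and false positives are expected.

    # Minimal sketch: scan text files on a share for SSN-like patterns as a
    # toy stand-in for a data loss prevention (DLP) discovery scan.
    import re
    from pathlib import Path

    SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

    def scan_for_ssns(root: str):
        findings = []
        for path in Path(root).rglob("*.txt"):
            try:
                text = path.read_text(errors="ignore")
            except OSError:
                continue
            matches = SSN_PATTERN.findall(text)
            if matches:
                findings.append((str(path), len(matches)))
        return findings

    if __name__ == "__main__":
        for filename, count in scan_for_ssns("/srv/file_share"):
            print(f"ALERT: {filename} contains {count} SSN-like values")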


Develop and maintain an incident response plan


With clear procedures in place to pursue anomalies and to escalate breach situations, potential risks can be addressed promptly and effectively. With many false positives, skilled IT personnel need to connect the dots (such as a user name change, unauthorized physical changes to the device or the device location, software vulnerabilities, registry changes or unusual system processes) and spot a true security incident quickly. Ensure your endpoint security supports remote actions such as data delete and device freeze.
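
Connecting the dots can also be expressed as a simple correlation rule: escalate only when several independent anomaly signals coincide on the same device within the review window. The Python sketch below is a hypothetical illustration; the signal names and the two-signal threshold are assumptions, not part of any particular product.

    # Minimal sketch: escalate an incident only when multiple independent
    # anomaly signals are observed for the same device.
    from collections import defaultdict

    SIGNALS_TO_WATCH = {
        "user_name_changed",
        "unexpected_location",
        "registry_modified",
        "unusual_process",
    }

    def devices_to_escalate(events, min_signals=2):
        # events: iterable of (device_id, signal_name) pairs
        seen = defaultdict(set)
        for device_id, signal in events:
            if signal in SIGNALS_TO_WATCH:
                seen[device_id].add(signal)
        return {device: signals for device, signals in seen.items()
                if len(signals) >= min_signals}

    if __name__ == "__main__":
        sample = [
            ("laptop-042", "unexpected_location"),
            ("laptop-042", "registry_modified"),
            ("tablet-007", "unusual_process"),
        ]
        for device, signals in devices_to_escalate(sample).items():
            print(f"Escalate {device}: {sorted(signals)}")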


With data regulations tightening, and healthcare data breaches escalating, don’t give cybercriminals an easy “in” to your organization. Trim the sails and batten the hatches to weather the oncoming storm of cyberattacks with a holistic approach to data security.


The more layers of protection you have in place, the better chance you have of avoiding a breach. Just as sailors can make or break a ship’s success in a storm, your employees are your first line of defense in preventing and detecting a data breach incident. If an incident is discovered, an efficient response plan can help your organization stay afloat in the muddy and complex waters of compliance.


Your Cyber-Risk Policy: What it Covers and What it Doesn't


In healthcare, we deal with highly sensitive and very private electronic information, so of course our ears perk up every time we see headlines about the latest cyber threat or breach. The natural question is whether this could happen to us. This is constructive if it leads to cyber risk-prevention. But all too often, folks are responding with, "it could not happen to me," or "my insurance policy covers this so I'm prepared." These folks are ignoring the growing cyber threat around all of us. They are whistling past the "cyber" graveyard.

We live in a digital age where almost everything is accessible — even more now with the evolution of EHRs — so we have to run our businesses as though we are all at risk. To be prepared, we must first understand the common sources of cyber risk. Second, we must understand the basics of cyber insurance policies we may or may not have in place.


There are several ways breaches at small healthcare organizations may occur:


1. Disgruntled employees are one of the leading causes of cyber attacks. They know your systems — likely better than you do — so keep a close watch on them and the type of data they have access to. Pay especially close attention to new staff and to those who may be on their way out, and make sure all employees know they are monitored.

2. Cyber criminals are looking for remote Internet access services with weak passwords. Require and enforce more complex passwords, and have employees change their passwords regularly (a minimal complexity check is sketched below).
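
A basic server-side complexity check is easy to sketch. The Python example below is illustrative only; the specific rules (at least twelve characters and three of four character classes) are assumptions, not a recommendation tailored to any particular system, and any such check works best alongside screening against known-breached passwords.

    # Minimal sketch: a server-side password complexity check. The rules
    # (length 12+, at least three character classes) are illustrative only.
    import string

    def password_is_acceptable(password: str, min_length: int = 12) -> bool:
        if len(password) < min_length:
            return False
        classes = [
            any(c.islower() for c in password),
            any(c.isupper() for c in password),
            any(c.isdigit() for c in password),
            any(c in string.punctuation for c in password),
        ]
        return sum(classes) >= 3

    if __name__ == "__main__":
        for candidate in ("summer2015", "Correct-Horse-Battery-9"):
            verdict = "accepted" if password_is_acceptable(candidate) else "rejected"
            print(f"{candidate!r}: {verdict}")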


A smart form of cyber protection is a cyber-risk insurance policy. These policies provide bundled services designed to help you respond quickly to a data breach. However, there are many cyber insurance product options to consider, ranging from standalone policies with high limits and comprehensive services to policy add-on coverages that typically offer less coverage.


Rather than stumbling through a maze of complicated cyber-related insurance rhetoric, do yourself a favor and review your options with an experienced broker:


• Carefully scrutinize "free" cyber coverage or riders added onto your base coverage. While not totally worthless, the majority come nowhere near covering the exposure of a potential cyber breach (which explains why they are typically thrown in at no additional cost). In reviewing your insurance coverages with your broker, it's easy to brush by this one and mentally check off the fact that you have cyber coverage. Drill into the details of what's covered, as outlined below.

• Find out how much you are covered for and what out-of-pocket expenses you could expect. A data breach at a small physician practice could run into the hundreds of thousands of dollars or even higher. This type of uncovered damage could put a small practice out of business. Some expenses physicians can expect to incur when a breach occurs include legal fees, IT forensic costs, notification costs, credit monitoring costs, and public relations and advertising expenses to reclaim patient goodwill as well as making the public aware of the steps taken to address the breach.


Cyber risk is not just a technology issue. It affects all elements of the healthcare business and needs to be well-planned and mitigated through ongoing education and risk-management programs.


Doctors Going the Distance (In Education)


We need more doctors.


Between older care providers retiring and the general population shift that is the aging of the Baby Boomers, we are facing a massive demographic of more, older patients living longer and managing more chronic conditions. This puts incredible pressure not just on the remaining doctors and nurses to make up the gap, but also strains the capacity of schools to recruit, train, and produce competent medical professionals.


So how can schools do more to reach students and empower them to enter the healthcare field?


The increasing popularity of online programs (particularly at the master’s level, among working professionals looking for a boost to their careers) has called forth a litany of studies and commentaries questioning everything from their technology to their academics, compared to traditional, on-campus programs. More productive would be questioning the structure and measuring the outcomes of degree programs in general, rather than judging the value of a new delivery mechanism against an alternative more rooted in tradition than science.


In terms of sheer practicality, though, a distance education—yes, even for doctors and surgeons—makes a certain amount of sense. One of the hottest topics in the medical community right now is Electronic Health Records (EHRs) and the ongoing struggle to fully implement and realize the utility of such technology.


Rolling out in October 2015 comes the sidecar for the EHR vehicle: ICD-10, the international medical coding language that the U.S. has long postponed adopting. While the digital nature of modern records platforms at least makes ICD-10 viable, it still represents a steep learning curve for current care providers.


Then there is the intriguing promise of pharmacogenetics, whereby medication is developed, tested, and prescribed, all on the basis of a patient’s individual genetic profile. With an EHR combined with a personal genetic profile, a patient could be observed, screened, diagnosed, referred to a pharmacist, and able to order and receive a prescription, all without leaving home. Taking into consideration the growing need for medication therapy management—driven by the Baby Boomers living longer with more conditions under care—the value of such a high-tech system is clear.


This draws on what is perhaps the most lucrative (in terms of health outcomes and large-scale care delivery) set of possibilities enabled by the shift to digital: telemedicine. From consultations to check-ups, telehealth in the digital age no longer necessitates sacrificing face-to-face interaction; streaming video chat means patients and doctors can still look one another in the eye, albeit through the aid of cameras.


Proponents of the technology take it further, claiming that world-class surgeons will no longer be anchored to a single facility—human-guided robotic surgery (telesurgery) will bring expertise to even the most remote locations.


If industry leaders anticipate so much being done remotely, why then are others squeamish about delivering an education online? It would seem that the medical skillset of the future requires greater comfort and competence in dealing with virtual settings, online interaction, and digital record-keeping.


The problem many have is not with online med school in particular so much as online degree programs in general. How can a virtual setting possibly hope to compete with the unique, collaborative, community-oriented environment of the college campus—whatever the area of study?


Forward-thinking professors like Sharon Stoerger at Rutgers have pioneered at least one possible answer to this question. Adopting the online immersive social platform known as Second Life, Stoerger and her like-minded peers have constructed virtual classrooms with accompanying courses, and successfully guided several cohorts (of students as well as instructors) through the experience.


For the aspects of learning that simply require hands-on practice, of course, there are limits to the promise of such virtual environments. Then again, synthetic patient models, known as Human Patient Simulators (HPS), are already proving their merits as an efficient, effective way to let students gain practical experience in a controlled environment. While Ohio University instructors have pioneered the use of HPS in the school’s nursing programs, advancing technology continues to push the functional limits of such systems.


In order to realize the potential of modern delivery of patient care, we first need to realize the potential of modern instructional delivery. The technology is already showing that the real limits of online learning are not practical considerations; they are attitudes and assumptions about what learning ought to look like.

