PHI Tokenization

Protected Health Information

What is PHI?

Protected Health Information (PHI) is a term broadly used to describe information related to healthcare diagnoses, patient records, and payments. What makes this data type unique is that it is created or collected by "Covered Entities," a legal term under HIPAA for organizations such as health plans, health care clearinghouses, and health care providers. PHI obligations can also apply to organizations that support Covered Entities, which are contractually bound through a Business Associate Agreement (BAA). If you are not a Covered Entity and have not signed a BAA, chances are you don't accept, store, or transmit PHI. However, you may still handle Personally Identifiable Information (PII). While similar, the two carry different regulatory implications.

What Are the Compliance Obligations?

The Final Rule on Security Standards under the Health Insurance Portability and Accountability Act (HIPAA) defines fifty-six administrative, physical, and technical standards for protecting electronic PHI (ePHI). These standards are scoped to all ePHI within your environment. By tokenizing ePHI, you reduce the amount of ePHI in your environment and, with it, your overall compliance burden. It is easier to ensure security and audit control effectiveness in an environment where ePHI is limited in scope.

When ePHI is tokenized, it is effectively de-identified, rendering it anonymized and preventing the loss of personal information if systems are breached and the data exposed. This has the added benefit of making it easier to satisfy the Safe Harbor de-identification provision of the HIPAA Privacy Rule, helping you avoid costly fines and breach notifications if only tokenized data is exposed. Tokenized, de-identified datasets can also be used for big data analytics without risking the exposure of sensitive personal data during processing.

HIPAA also requires that Covered Entities and Business Associates ensure the data within their systems has not been changed or erased in an unauthorized manner. While there are many ways an organization can demonstrate adherence to this requirement, using TokenEx to securely tokenize, de-identify, and vault ePHI automatically creates an audit trail for every ePHI dataset, so that any changes or deletions are authorized and tracked.
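
To illustrate what such an audit trail can capture, the sketch below shows an append-only record of vault operations. This is a hypothetical illustration, not the TokenEx implementation; the field names and the record() helper are invented for this example.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import uuid

# Hypothetical append-only audit record for tokenization, detokenization,
# and deletion events. Only the token is logged, never the raw ePHI.
@dataclass(frozen=True)
class AuditEvent:
    token: str        # token that was acted on
    action: str       # "tokenize", "detokenize", or "delete"
    actor: str        # authenticated identity that requested the action
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

audit_log: list[AuditEvent] = []

def record(token: str, action: str, actor: str) -> AuditEvent:
    """Append an immutable audit entry so changes and deletions stay traceable."""
    event = AuditEvent(token=token, action=action, actor=actor)
    audit_log.append(event)
    return event

record("tok_4f9a1c", "detokenize", "reporting-service")
```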

How TokenEx Secures PHI

As with other data sets, controls concerning ePHI can be addressed strategically by removing the sensitive components of the data. HIPAA has a specific term for this process: de-identification, which strips the personally identifying characteristics from the data, leaving an anonymized dataset and thereby lifting the protective requirements that apply to ePHI.

Organizations can tokenize ePHI in accordance with the Safe Harbor provision of the HIPAA Privacy Rule to avoid costly fines and breach notifications if the data is compromised. The Safe Harbor provision lists 18 identifiers that must be removed for data to qualify as de-identified, including names, addresses, dates of birth, telephone numbers, Social Security numbers, and email addresses. If data thieves gain unauthorized access to ePHI that has been de-identified using tokenization, you save your reputation and the monetary costs associated with declaring a breach. More detail is available in the HHS guidance on the de-identification standard. TokenEx can help you meet the Safe Harbor criteria by tokenizing and vaulting sensitive data elements to remove them from your business environment while retaining their usefulness for business processing and analytics.
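
To make the idea concrete, the minimal sketch below replaces Safe Harbor identifier fields in a patient record with tokens while leaving non-identifying clinical values intact. The field names and the tokenize() stand-in are assumptions for illustration, not a TokenEx API.

```python
import secrets

# Subset of the 18 Safe Harbor identifier categories, expressed as
# illustrative field names for this sketch only.
SAFE_HARBOR_FIELDS = {"name", "address", "date_of_birth", "phone", "ssn", "email"}

def tokenize(value: str) -> str:
    """Stand-in for a vault call: returns an opaque token; the real value
    would be stored in a secure vault keyed by this token."""
    return "tok_" + secrets.token_hex(8)

def de_identify(record: dict) -> dict:
    """Return a copy of the record with Safe Harbor identifiers tokenized."""
    return {
        key: tokenize(value) if key in SAFE_HARBOR_FIELDS else value
        for key, value in record.items()
    }

patient = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "date_of_birth": "1984-02-29",
    "diagnosis_code": "E11.9",   # non-identifying clinical data stays usable
}
print(de_identify(patient))
```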

With TokenEx's tokenization service, an organization can demonstrate that ePHI is de-identified and properly protected, reducing the risk associated with storing the information and minimizing the organization's regulatory responsibilities for protecting it.

Examples of ePHI Tokenization

TokenEx can tokenize, and thus de-identify, individually identifiable health information, a subset of protected health information that includes demographic information collected from individuals, such as Social Security numbers, patient numbers, and medical images, as well as non-health data such as IP addresses and postal codes. In some cases, tokenization can be used to de-identify the entire data set. Unlike payment card and financial information, ePHI is highly complex and varies widely across disparate systems. TokenEx solves this dilemma by tokenizing any data set using consistent schemas to protect data integrity, as sketched below.
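
As a rough sketch of what a consistent, format-preserving token schema can look like (illustrative only; not TokenEx's actual token formats), a tokenizer can keep the length and layout of each data type so downstream systems continue to validate and process records unchanged.

```python
import random

def format_preserving_token(value: str) -> str:
    """Illustrative token generator that keeps the original layout:
    digits map to random digits, letters to random letters, and
    punctuation (dashes, dots, '@') is left in place.
    A production scheme would also guarantee uniqueness and map
    each token back to its original value through a secure vault."""
    out = []
    for ch in value:
        if ch.isdigit():
            out.append(str(random.randint(0, 9)))
        elif ch.isalpha():
            out.append(random.choice("ABCDEFGHJKLMNPQRSTUVWXYZ"))
        else:
            out.append(ch)
    return "".join(out)

print(format_preserving_token("123-45-6789"))   # same layout as an SSN
print(format_preserving_token("MRN-0042-77"))   # patient number keeps its shape
```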

In today's diverse healthcare environments, it is common to encounter systems that cannot be adequately protected with preventative security controls, whether legacy medical equipment that requires Windows XP to function or a vendor-managed system that, for technical reasons, must be configured insecurely. Lacking the ability to apply basic security hardening to these systems, healthcare organizations rely increasingly on system isolation controls to limit the damage of a breach. Tokenizing the data stored on these highly vulnerable systems can be a Covered Entity's saving grace, protecting the organization from the risk these systems pose if breached: a compromise exposes only tokenized data to the attacker, not the underlying ePHI.

Tokenization can also limit the risk posed by rogue system administrators. System administrators are the most highly privileged users in IT organizations, yet they usually have no need to view protected health information. A rogue system administrator, whether malicious or simply curious, can abuse that privilege to view sensitive PHI about a targeted individual. Tokenizing the data prevents unauthorized access to ePHI by rogue system administrators; a database administrator, for example, sees only tokenized data instead of the patient's personal health record.
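
For example, here is a minimal sketch (the table, column names, and detokenize() call are hypothetical) of what a database administrator sees when querying a tokenized patient table, compared with what an authorized application can retrieve through the vault.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE patients (patient_id TEXT, name_token TEXT, ssn_token TEXT, diagnosis TEXT)"
)
conn.execute(
    "INSERT INTO patients VALUES (?, ?, ?, ?)",
    ("tok_9b21e4", "tok_a77c02", "tok_5d90f1", "E11.9"),
)

# A DBA with direct table access sees only opaque tokens, no ePHI.
print(conn.execute("SELECT * FROM patients").fetchall())

# Only an authorized application, presenting its own credentials to the vault,
# could exchange a token for the original value (detokenize() is hypothetical):
# name = detokenize("tok_a77c02", credentials=app_credentials)
```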

TokenEx has successfully de-identified ePHI for organizations using tokenization. These organizations securely send identifiable data to the TokenEx Data Security Platform, where the identifiable elements of the sensitive data are tokenized and returned to the organization for research, analysis, storage, and other uses.
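
At a high level, that round trip looks like the hedged sketch below. The endpoint URL, request fields, and response shape are placeholders for illustration and are not the documented TokenEx API; consult the platform documentation for the actual contract and authentication scheme.

```python
import requests  # third-party: pip install requests

# Placeholder endpoint; not the real TokenEx API.
TOKENIZE_URL = "https://api.example-tokenization-platform.com/tokenize"

def tokenize_field(value: str, api_key: str) -> str:
    """Send one identifiable value over TLS and get back a token to store locally."""
    response = requests.post(
        TOKENIZE_URL,
        json={"data": value},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["token"]

# The raw value never persists in the caller's environment; only the token does.
# ssn_token = tokenize_field("123-45-6789", api_key="...")
```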

Secure Your PHI Today!

Contact us to learn how TokenEx can help tokenize your sensitive business data.