12 Apr

Transparently Tokenizing Data While Maintaining Its Functionality

Although the primary goals of tokenization are to secure data, desensitize it, and remove it from your environment, it’s just as important to preserve that data’s business utility. If an organization’s tokenized data can’t still be used for analytics or other business intelligence purposes, then it is just as worthless…
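The idea of preserving business utility can be illustrated with a minimal sketch of format-preserving tokenization. This is a hypothetical example, not TokenEx's actual API or algorithm: the token keeps the length, digit format, and last four digits of a card number, so reports and analytics can still group and display records, while the first twelve digits are replaced with random values. (A real tokenization service would also store an irreversible token-to-value mapping in a secure vault, which is omitted here.)

```python
import secrets
import string

def tokenize_pan(pan: str) -> str:
    """Replace a 16-digit card number (PAN) with a random token that
    preserves its format and last four digits. Hypothetical sketch only:
    a production system would persist the mapping in a secure token vault."""
    digits = [c for c in pan if c.isdigit()]
    if len(digits) != 16:
        raise ValueError("expected a 16-digit PAN")
    last_four = "".join(digits[-4:])
    # Cryptographically random replacement for the first 12 digits.
    random_part = "".join(secrets.choice(string.digits) for _ in range(12))
    return random_part + last_four

token = tokenize_pan("4111111111111111")
# token is a 16-digit string ending in "1111"; the first 12 digits are random
```

Because the token is still a 16-digit value ending in the original last four digits, downstream systems that validate field formats or report on card suffixes continue to work without ever handling the real PAN.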

25 Feb

Segment and Secure Sensitive Data with the Data Security Island

The enterprise today is a complex technological environment that has adapted over time to meet ever-changing business requirements. This is evidenced by the coexistence of new technologies such as Guidewire PolicyCenter with legacy integrations to back-end mainframe applications. With more cloud-based applications being added to these evolving technological environments, business data…

15 Jan

TokenEx Partners with Leading Governance, Risk, and Compliance Platform SureCloud

Data security companies look forward to “complementary” combination of security solutions

TokenEx and SureCloud announced today that they are partnering to further streamline compliance for both entities’ customers. “I am very excited about our partnership with SureCloud,” said TokenEx Head of Global Privacy and Compliance Solutions John Noltensmeyer. “Our clients…

26 Dec

De-scoping PCI Compliance

Utilizing Tokenization to Reduce Scope and Achieve PCI Compliance

Descoping a data environment to achieve PCI compliance is often the primary goal of entities subject to the PCI DSS. Several methods exist for achieving this, but one of the easiest and most common is to outsource the management of sensitive…

25 Oct

5 Takeaways from PCI Europe Community Meeting

1. QSAs underestimate the PCI DSS scope reduction provided by cloud-based tokenization. While familiar with tokenization as an on-premises solution or as a service provided by payment processors, many QSAs we spoke with during the PCI Europe Community Meeting were unaware of…