Vormetric Vaultless Tokenization with Dynamic Data Masking dramatically reduces the cost and effort required to comply with security policies and regulatory mandates like PCI DSS. The solution delivers capabilities for database tokenization and dynamic display security. Now you can efficiently address your objectives for securing and anonymizing sensitive assets—whether they reside in data center, big data, container or cloud environments.
Efficiently Reduce PCI DSS Compliance Scope. Remove cardholder data from PCI DSS scope with minimal cost and effort, significantly reducing the cost of complying with the industry standard.
Foster Innovation Without Introducing Risk. Tokenize data and maintain control and compliance when moving to the cloud, big data, and outsourced environments.
Scale Globally. Deploy the solution globally without concerns about token synchronization, performance, or uncontrolled costs. The vaultless tokenization approach and pricing model enable scaling that is both easy to manage and affordable.
Dynamic Data Masking. Administrators can establish policies to return an entire field tokenized or dynamically mask parts of a field. For example, a security team could establish policies so that a user with customer service representative credentials would only receive a credit card number with the last four digits visible, while a customer service supervisor could access the full credit card number in the clear.
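To make the policy model concrete, the following minimal sketch shows role-based masking in Python. The role names, masking character, and policy shape are illustrative assumptions, not the product's actual policy syntax.

    # Illustrative role-based dynamic masking (hypothetical roles and rules,
    # not the product's policy syntax). Each role maps to a rule applied on read.
    def mask_pan(pan: str, role: str) -> str:
        """Return a credit card number masked according to the caller's role."""
        if role == "cs_supervisor":        # supervisor: full value in the clear
            return pan
        if role == "cs_representative":    # representative: last four digits only
            return "X" * (len(pan) - 4) + pan[-4:]
        return "X" * len(pan)              # default: fully masked

    print(mask_pan("4111111111111111", "cs_representative"))  # XXXXXXXXXXXX1111
    print(mask_pan("4111111111111111", "cs_supervisor"))      # 4111111111111111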
Non-Disruptive Implementation. With the solution’s format-preserving tokenization capabilities, you can restrict access to sensitive assets without changing the existing database schema. The solution’s REST API implementation makes it fast, simple, and efficient for application developers to add sophisticated tokenization capabilities.
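As a rough illustration of the REST integration pattern, here is a sketch of a tokenize request in Python. The endpoint path, JSON field names, and authentication header are placeholder assumptions, not the documented API.

    # Hypothetical tokenize request; URL, fields, and credential are placeholders.
    import requests

    resp = requests.post(
        "https://token-server.example.com/tokenize",
        json={"data": "4111111111111111", "tokengroup": "cards"},
        headers={"Authorization": "Bearer <access-token>"},  # placeholder credential
        timeout=10,
    )
    resp.raise_for_status()
    print(resp.json()["token"])  # a format-preserving token replaces the PAN

Because the token preserves the original format, the application can store the response in the same column without schema changes.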
Optional Batch Data Transformation. Thales e-Security Tokenization customers can also order the Batch Data Transformation utility, which tokenizes high volumes of sensitive records without lengthy maintenance windows or downtime. You can also tokenize or mask sensitive columns in production databases and in copies of databases before they are shared with third-party developers and big data environments.
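The sketch below shows the general shape of such a batch transformation, using an in-memory SQLite table and a stand-in tokenize() function. The table and column names are hypothetical, and a real token server would return unique, format-preserving tokens.

    import sqlite3

    def tokenize(value: str) -> str:
        # Stand-in for a call to the token server; real tokens are unique
        # and format-preserving, unlike this placeholder.
        return "tok_" + value[-4:]

    conn = sqlite3.connect(":memory:")  # hypothetical copy of a production table
    conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, card_number TEXT)")
    conn.execute("INSERT INTO customers VALUES (1, '4111111111111111')")

    # Replace every cardholder number with a token before sharing the copy.
    rows = conn.execute("SELECT id, card_number FROM customers").fetchall()
    conn.executemany(
        "UPDATE customers SET card_number = ? WHERE id = ?",
        [(tokenize(pan), row_id) for row_id, pan in rows],
    )
    conn.commit()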
Tokenization capabilities: Format-preserving tokenization, crypto-tokens (alpha/numeric), random tokens (numeric), single- and multi-use tokens, date support
Dynamic data masking capabilities: Alpha/numeric, customize mask character
Validation support: Luhn check (see the sketch after this list)
Virtual appliance: Open Virtualization Format (.ovf), ISO image (.iso), Microsoft Azure Marketplace, Amazon Machine Image (.ami)
Application integration: REST APIs
Authentication integration: Lightweight Directory Access Protocol (LDAP), Active Directory (AD)
Performance: Over 1 million tokenization transactions per token server
No cost for redundant, geographically dispersed, or scale-up token servers
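To illustrate the Luhn validation and format-preserving behavior listed above, the following sketch generates a random 16-digit numeric token whose final digit is chosen to satisfy the Luhn checksum. This shows the idea only; it is not the product's token-generation algorithm.

    import random

    def luhn_valid(number: str) -> bool:
        """Luhn checksum: double every second digit from the right."""
        total = 0
        for i, ch in enumerate(reversed(number)):
            d = int(ch)
            if i % 2 == 1:
                d *= 2
                if d > 9:
                    d -= 9
            total += d
        return total % 10 == 0

    def random_numeric_token(length: int = 16) -> str:
        """Random numeric token, same length as a PAN, passing the Luhn check."""
        partial = "".join(str(random.randint(0, 9)) for _ in range(length - 1))
        # Pick the check digit that makes the whole token Luhn-valid.
        check = next(d for d in range(10) if luhn_valid(partial + str(d)))
        return partial + str(check)

    token = random_numeric_token()
    assert luhn_valid(token) and len(token) == 16
    print(token)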