More and more organizations are banning the internal use of publicly available GenAI tools like ChatGPT due to security risks. These include leading financial services organizations like JP Morgan Chase, Bank of America, Deutsche Bank, Goldman Sachs and Citigroup, as well as technology companies like Apple, Amazon and Samsung.
But enforcing that policy is not a simple task. According to the Q2 2024 AI Adoption and Risk Report [source], 74% of ChatGPT use at work, and 94% of Gemini and Bard use at work, comes from non-corporate accounts. By using personal accounts to conduct work, knowledge workers are gaining the productivity benefits of GenAI while potentially skirting corporate policies.
This type of shadow AI creates real risks for organizations. And the risk is increasing as the amount of sensitive data put into these unguarded AI tools grows. In March 2024, 27.4% of the data employees put into AI tools was sensitive, up from 10.7% a year earlier.
Shadow AI presents an uncontrolled risk for businesses and underscores the need for solid security and privacy practices when working with AI.
At the heart of the challenge is data. Data is the fuel that feeds AI. Data is also a valuable corporate asset that, if leaked, stolen, or corrupted, can cause real harm to your business. This harm can take the form of regulatory fines for leaking PII, or the costs associated with leaking proprietary information such as source code.
The only way to mitigate these risks is to secure your data at-rest, in-transit, and in-use, and to control what data gets fed into LLMs.
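The "control over what data gets fed into LLMs" part can be illustrated with a minimal sketch. The Python example below is a simplified, stdlib-only illustration (the patterns and function names are assumptions for this post, not a Fortanix API): it redacts a few common PII formats from a prompt before it leaves the corporate network. Real deployments need far more robust detection than a handful of regexes.

```python
import re

# Illustrative patterns only; production systems need broader PII detection.
PII_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CARD": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def redact(prompt: str) -> str:
    """Replace sensitive matches with placeholders before the prompt is sent out."""
    for label, pattern in PII_PATTERNS.items():
        prompt = pattern.sub(f"[{label} REDACTED]", prompt)
    return prompt

if __name__ == "__main__":
    raw = "Contact jane.doe@example.com, SSN 123-45-6789, card 4111 1111 1111 1111."
    print(redact(raw))
```

A gateway like this sits between employees and external GenAI tools, so the productivity benefit is preserved while the sensitive values never reach the model.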
Data Security is at the Heart of Securing GenAI and the Fortanix DNA
To mitigate the risk, organizations need to take a multi-faceted approach to limit the use of shadow AI and to prevent the leakage of sensitive data into GenAI models. Data security is as fundamental as ever, and here are some basic recommendations to preserve data security and privacy:
- Data Encryption: Ensure encryption throughout data’s full lifecycle—at-rest, in-transit, and in-use.
- Data Obfuscation: Anonymize sensitive or PII data in any dataset that could make it into the AI pipeline. Data tokenization using Format-Preserving Encryption (FPE) is an effective technique that lets you monetize your data without compromising security and privacy.
- Data Access: Only authorized users should be able to see or use data in plain text, so apply granular role-based access controls (RBAC).
- Data Governance: Stay current on data privacy regulations, ensure data privacy is embedded in operations, and commit to ethical business practices.
- Manage Encryption Keys: Manage keys through their full lifecycle, store them securely and separately from the data they protect, and limit access to them.
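The format-preserving tokenization mentioned above can be sketched in a few lines. The example below is a deliberately simplified, stdlib-only illustration (not NIST FF1/FF3 FPE and not a Fortanix API): it deterministically maps each digit of a value to another digit using an HMAC keystream, so a 16-digit card number tokenizes to another 16-digit string that downstream systems and AI pipelines can handle without schema changes.

```python
import hmac
import hashlib

# Assumption for the sketch: in practice this key would live in an HSM or key manager.
SECRET = b"demo-key"

def tokenize_digits(value: str, key: bytes = SECRET) -> str:
    """Map a digit string to another digit string of the same length.

    Simplified illustration of format-preserving tokenization. It is one-way
    (not reversible), unlike real FPE, which decrypts back to the original.
    """
    stream = hmac.new(key, value.encode(), hashlib.sha256).digest()
    # Extend the keystream if the input is longer than one digest.
    while len(stream) < len(value):
        stream += hmac.new(key, stream, hashlib.sha256).digest()
    return "".join(str((int(d) + b) % 10) for d, b in zip(value, stream))

if __name__ == "__main__":
    card = "4111111111111111"
    print(tokenize_digits(card))  # same length, all digits, deterministic per key
```

Because the token keeps the original format, analytics and AI workloads keep working on the tokenized data while the real values stay protected; only systems holding the key (or, with real FPE, a decryption path) can recover them.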
About Fortanix
Fortanix is a leader in data-first security and a pioneer of Confidential Computing. We secure data across hybrid multicloud environments, enabling customers to maintain greater control, security, and compliance over sensitive data.
We also prioritize data exposure management: our unified data security platform makes it simple to discover, assess, and remediate data exposure risks, whether it’s to enable a Zero Trust enterprise or to prepare for the post-quantum computing era.