Rumors of the death of US privacy enforcement were premature. ⬇️ Recap of the 5 myths busted on privacy in the US for non-US companies, from my presentation at the Naschitz, Brandes, Amir & Co. - AYR - Amar Reiter Jeanne Shochatovitch & Co. Privacy Day event:

Myth 1: "I'm under the radar, so I'm safe from enforcement" -> No, you're not.
🟣Regulators are enforcing against non-US entities with zero boots on the ground (the FTC in Avast, and in non-public FTC investigations).
🟣US state privacy laws are also extraterritorial in application.
🟣Your competitors are telling on you to regulators.
🟣Your clients are less inclined to engage you because they are (more) worried about their own compliance.
🟣Private plaintiffs and classes are filing thousands of lawsuits.

Myth 2: "I can deal with it later" -> Later may be too late.
🟣Buyers (in M&A deals) are increasingly looking into privacy earlier and in more depth.
🟣Customers prefer vendors who make it easy for them to manage their own, higher privacy risk (if they need a DPIA (risk assessment) or a third-party bias audit (for AI use) and you don't have one, they will look for someone who does).
🟣Sometimes too late is too late, and without the right setup in place the defect is not curable (the DoorDash case).

Myth 3: "US privacy is 'inadequate' and the Trump Admin doesn't care" -> No. US enforcement is complex, and active.
🟣The FTC is enforcing seriously, with a focus on sensitive data, children's data, auto-renewal, pricing, fake reviews, "AI washing," and more.
🟣Sectoral laws like HIPAA (for health data) are being enforced, including for cookies.
🟣50 states have consumer protection laws that apply to privacy violations.
🟣20 states have privacy laws, issuing $3 billion in fines in 2025.
🟣California has new CCPA regulations with requirements on cyber audits, DPIAs, and transparency. Texas, Connecticut, Oregon, and others are enforcing.
🟣Lots of class action lawsuits on all types of violations, including websites, chatbots, health data, and biometrics.

Myth 4: "US AI regulation is 'inadequate' and the Trump Admin doesn't care" -> No. There are A LOT of different state laws on AI, and a Federal one may be coming.
🟣There are several dedicated AI laws, like the Colorado AI Act and Texas TRAIGA.
🟣There are AI disclosure, deepfake, AI companion, and training-data transparency laws.
🟣The Executive Order on AI envisions a preemptive Federal AI law and encourages Congress to get there.

Myth 5: "I did GDPR, so I'm fine" -> No. US laws are different, and GDPR doesn't cover you.
🟣Different privacy notice, DPA, and privacy rights requirements.
🟣You need more DPIAs than under GDPR; they need to be more robust, and, in California, your C-Suite needs to submit confirmation, under penalty of perjury, that your DPIAs are in order.
🟣You need to deal with biometrics laws and children's data laws.
🟣US AI enforcement is strict and already in play.

Thank you Dalit and Eyal for inviting me!
Common Data Privacy Myths Among Professionals
Summary
Many professionals misunderstand data privacy laws, believing myths such as that a privacy policy alone ensures compliance, that only analyzing data counts as processing, or that new technologies like AI are exempt from regulations. Data privacy is the practice of safeguarding individuals’ personal information throughout its lifecycle, from collection to deletion, by following laws and best practices.
- Go beyond paperwork: Make sure your organization backs up privacy policies with real-world enforcement, employee training, and ongoing monitoring of data practices.
- Recognize all processing: Understand that merely collecting personal data—even without using or analyzing it—is legally considered processing and triggers privacy obligations.
- Stay transparent with AI: If your company uses personal data for AI development, ensure transparency and respect individuals’ rights regardless of your intentions or technological innovations.
-
Stop saying “trust is the cornerstone” of a privacy program.

This is the first in a short series I am writing about the myths we keep repeating in privacy and what I’ve seen actually work inside organizations.

Talking about privacy and customer trust may sound impressive in a keynote. It may look good on a slide deck. But inside an organization, “trust” is not a control. You cannot assign it to an owner. You cannot improve it in a sprint.

What executives actually listen to are metrics tied to business risk and velocity:
→ How much rework are we avoiding by embedding privacy into product cycles?
→ How much vendor exposure (and liability) are we reducing through contracts?
→ How much cleaner and more useful is our data thanks to sane tag/data-flow control?

These are operational signals. They move decisions in the boardroom because they tie directly to risk reduction, cost avoidance, and speed to market.

Now, here is where it gets more nuanced. People absolutely engage more with brands they trust. But in most cases, that trust comes from product quality, service, and price, not privacy. Privacy plays a role, but it is usually a supporting role rather than the lead actor.

I have seen companies receive a flood of deletion requests. At first glance, it looked like customers had lost faith in the company’s privacy practices. In reality, it had nothing to do with privacy. A merger had disrupted the customer experience, frustration was high, and people left. The deletion requests were a symptom of lost confidence in the business, not the privacy program.

Are there rare exceptions where privacy truly differentiates? There are some companies with products architected around privacy. But that is not the reality for most organizations.

So let’s stop overselling. Privacy should not be pitched as “building trust” or as the next competitive differentiator. That framing sets professionals up to disappoint.
Privacy earns its place at the table when it demonstrates something much more practical:
→ Lower liability
→ Faster launches
→ Cleaner operations
→ Defensible evidence when regulators or plaintiffs come calling

That has been my experience across decades of building and advising programs. But I know others may have seen it differently. If you have examples where privacy truly functioned as a differentiator, or where “trust” was more than a slogan, I would welcome the evidence.

This is also the conversation I take on in my upcoming book: So You Got the Privacy Officer Title—Now What? It is time we move past slogans and focus on the operational mastery that actually makes a difference.
-
"I DON’T PROCESS PERSONAL DATA, I ONLY COLLECT IT"

This is a common misconception we still hear, even in professional environments. Many believe that processing only takes place when data is being analyzed, shared, or sold. In reality, the scope of “processing” under data protection laws, including Indonesia’s Personal Data Protection (PDP) Law and the GDPR, is much broader and begins the moment personal data is collected.

Processing includes every action involving personal data, whether automated or manual. It covers collection, recording, organization, storage, alteration, updating, retrieval, use, transfer, disclosure, and even deletion or destruction. So even if you only collect personal data and keep it on file without using it further, you are still processing that data and are therefore subject to data protection obligations.

This broad interpretation serves an important purpose. It ensures that individuals’ personal data is protected throughout its entire lifecycle. Every stage must comply with core principles such as fairness, transparency, purpose limitation, accuracy, and security. Failing to recognize your role in processing can lead to unintended non-compliance and regulatory risk.

That is why it is essential not to misinterpret what constitutes data processing. Many organizations unknowingly collect and store documents such as ID cards or family cards, assuming it’s harmless, especially when only one or two data points from them are actually needed. What they often overlook is that collecting the full document is itself already processing, and it increases exposure to the risks associated with excessive data collection.
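The data-minimisation point above can be sketched in code: instead of retaining the full scanned document, keep only the fields the stated purpose actually requires. A minimal Python sketch; the field names and the example record are illustrative assumptions, not drawn from any specific law or system:

```python
# Minimal data-minimisation sketch: retain only the fields you need,
# rather than the entire identity document. Field names are hypothetical.

REQUIRED_FIELDS = {"full_name", "date_of_birth"}

def minimise(id_document: dict) -> dict:
    """Return only the fields the stated purpose requires.

    Storing the whole document (ID number, address, photo, ...) is
    itself 'processing' and widens exposure if the data leaks.
    """
    return {k: v for k, v in id_document.items() if k in REQUIRED_FIELDS}

# Example: a scanned ID card yields many fields, but only two are needed.
scanned = {
    "full_name": "A. Example",
    "date_of_birth": "1990-01-01",
    "id_number": "1234567890",   # not needed for this purpose
    "address": "123 Main St",    # not needed for this purpose
}
stored = minimise(scanned)
# 'stored' now contains only full_name and date_of_birth
```

The same idea applies at collection time: if the form or scanner only captures the required fields in the first place, the excess data never enters the system at all.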
-
Information Commissioner's Office Busts Data Protection Myths On AI Use 👩‍💻

Building on yesterday's post about the AI Safety Report, and in the aftermath of last week's Data Protection Week, I wanted to share a summary of a fascinating article from the ICO that came across my feed over the weekend. The piece debunks common misconceptions about AI and data protection, offering clear insights into the actual facts. The article discusses the transformative potential of AI across various sectors in the UK and emphasises the importance of responsible AI development through strong data protection practices. Here are the myths, and the facts, summarised!

❓ People have no control over their personal data being used to train AI
✅ People’s rights over their personal data remain unchanged despite evolving technology, meaning individuals can object to its use, especially in training or deploying AI, if they are not comfortable with how it’s processed. The ICO can intervene when organisations appear to misuse data.

❓ AI developers can use people’s data without being transparent about it
✅ Organisations using personal data to train AI models must be transparent about their activities from the start and provide clear information to ensure people can easily understand and object if they wish. The ICO's consultation around this revealed a widespread lack of transparency, emphasising the need for tech firms to respect individuals' rights.

❓ Data protection doesn’t apply if AI developers did not intend to process people’s personal data
✅ Some AI developers claim that unintended or incidental processing of personal data excuses them from legal obligations, but an organisation’s intention does not affect its duty to protect personal data. In practice, if a company wants to train an AI model using personal data, it must do so lawfully, regardless of its goals.

❓ AI models themselves do not come with data protection risks
✅ Some developers argue that AI models do not “store” personal data and therefore fall outside data protection rules, but the ICO's 2020 guidance makes it clear that some models can retain identifiable personal data.

❓ AI development should be exempt from the law
✅ Data protection law applies to AI without exception, ensuring that firms must address people’s rights and freedoms before using AI, and this legal obligation supports the responsible development of innovative products and services.

❓ Existing regulation is not fit for cutting-edge tech like AI
✅ The ICO is one of the few UK regulators that can oversee AI from the design stage, ensuring that systems are developed as well as used responsibly, while consistently applying data protection rules regardless of the technology used.

Photo by Airam Dato on Pexels

#DataProtection #ResponsibleAI #AIMyths #ICOGuidance #DigitalRights #TechTransparency #PrivacyMatters #UKRegulation #AIInnovation #DataPrivacy
-
MYTH: “A Privacy Policy Means We Are Covered”
TRUTH: A privacy policy is just words on paper unless it’s backed by actual enforcement. Many organizations publish legally compliant privacy policies but fail to train employees or monitor real-world application. For example, a tech company states in its policy that it anonymizes user data, but its developers store identifiable user logs for debugging, violating the company's own policy. Privacy compliance isn’t just about having policies—it’s about governance, training, and operational accountability. https://lnkd.in/eaBraFCD #GRC #PrivacyAwareness
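The debugging-logs failure described above is avoidable at the engineering level: pseudonymise identifiers before they ever reach the log files. A minimal Python sketch using only the standard library; the salt handling and field names are assumptions for illustration, and salted hashing is pseudonymisation rather than full anonymisation:

```python
import hashlib
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("app")

# Per-deployment salt; in practice this would come from secret storage,
# not a constant in source code.
SALT = b"example-salt"

def pseudonym(user_id: str) -> str:
    """Replace a raw user identifier with a salted hash before logging.

    Note: this is pseudonymisation, not anonymisation -- anyone holding
    the salt can still link the hash back to the user.
    """
    return hashlib.sha256(SALT + user_id.encode()).hexdigest()[:12]

def debug_event(user_id: str, event: str) -> None:
    # Log the pseudonym, never the raw identifier, so debug logs
    # stay consistent with an "we anonymize user data" policy claim.
    log.info("event=%s user=%s", event, pseudonym(user_id))

debug_event("alice@example.com", "checkout_failed")
```

Because the hash is deterministic, developers can still correlate all log lines for one user while debugging, without the raw email address ever being written to disk.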