Most business owners building chatbots in Saudi Arabia have no idea what PDPL is.
Some have heard of it. Very few have checked whether their bot actually complies with it.
If your WhatsApp automation collects names, phone numbers, or any customer data — and it does — you need to know this.
PDPL is the Personal Data Protection Law, Saudi Arabia's data privacy regulation. It came into full enforcement in 2024, and fines for non-compliance reach up to 5 million SAR. It's not theoretical. It's active.
I'm Mohammed, founder of Buraq AI. We've deployed compliant WhatsApp automation for 500+ Gulf businesses. Here's what every business owner needs to know — in plain language, no legal jargon.
What PDPL Actually Says (The 4 Rules That Matter for Chatbots)
Rule 1: You need explicit consent before collecting data.
When a customer messages your WhatsApp bot and it asks for their name, email, or phone number — that collection requires their informed consent. They need to know what you're collecting, why, and how you'll use it.
A simple fix: your bot's opening message includes one line — "By chatting with us, you agree to our privacy policy [link]." That's not perfect legal compliance on its own, but it's the minimum visible signal.
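To make that concrete, here's a rough sketch of what "one line in the opening message" can look like in code, assuming the official WhatsApp Business Cloud API message format. The greeting, policy URL, and phone number are placeholders, not Buraq AI defaults.

```python
import json

# Hypothetical helper: builds the bot's first message with the privacy
# acknowledgment appended. The policy URL and number are placeholders.
PRIVACY_LINE = "By chatting with us, you agree to our privacy policy: https://example.com/privacy"

def opening_message(to_number: str, greeting: str) -> dict:
    """Build a Cloud API text-message payload for the bot's opening turn."""
    return {
        "messaging_product": "whatsapp",
        "to": to_number,
        "type": "text",
        "text": {"body": f"{greeting}\n\n{PRIVACY_LINE}"},
    }

payload = opening_message("9665XXXXXXXX", "Welcome! How can we help you today?")
print(json.dumps(payload, indent=2, ensure_ascii=False))
```

The point is that the privacy line is baked into the payload builder, so no conversation can start without it.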
Rule 2: Collect only what you need.
PDPL calls this "data minimization." If your chatbot is booking a clinic appointment, it needs a name, phone number, and preferred time. It doesn't need their ID number, their employer, or their income bracket. The moment you collect data you don't need for the stated purpose, you're in grey territory.
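One practical way to enforce data minimization is to make each bot task declare the only fields it may collect, and reject everything else before storage. A minimal sketch (the task names and field sets are illustrative assumptions):

```python
# Each bot task declares the only fields it is allowed to collect.
# Task names and field sets here are illustrative, not a real schema.
ALLOWED_FIELDS = {
    "book_appointment": {"name", "phone", "preferred_time"},
    "track_order": {"order_number", "phone"},
}

def validate_collection(task: str, collected: dict) -> dict:
    """Reject any field the stated task does not need, before it is stored."""
    extra = set(collected) - ALLOWED_FIELDS[task]
    if extra:
        raise ValueError(f"fields not needed for '{task}': {sorted(extra)}")
    return collected
```

With a whitelist like this, "grey territory" data never makes it into your database in the first place.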
Rule 3: The customer has the right to delete their data.
If a customer asks "delete all my information," your system needs to be able to do that. This means your chatbot data can't just sit in a spreadsheet somewhere nobody can edit. It needs to be in a system with actual data management capabilities.
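For a sense of what "a system with actual data management capabilities" means, here's a minimal right-to-erasure sketch. It assumes chatbot data sits in a relational store; the table and column names are illustrative, not any specific platform's schema.

```python
import sqlite3

# Sketch of a right-to-erasure handler: remove every record tied to
# one customer in a single transaction. Table names are illustrative.
def delete_customer(conn: sqlite3.Connection, phone: str) -> None:
    with conn:  # commits on success, rolls back on error
        conn.execute("DELETE FROM messages WHERE phone = ?", (phone,))
        conn.execute("DELETE FROM leads WHERE phone = ?", (phone,))
        conn.execute("DELETE FROM contacts WHERE phone = ?", (phone,))

# Demo with an in-memory database and a placeholder number.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE contacts(phone TEXT, name TEXT);
    CREATE TABLE messages(phone TEXT, body TEXT);
    CREATE TABLE leads(phone TEXT, source TEXT);
    INSERT INTO contacts VALUES ('9665XXXXXXXX', 'Test Contact');
    INSERT INTO messages VALUES ('9665XXXXXXXX', 'I want to book Tuesday');
""")
delete_customer(conn, "9665XXXXXXXX")
```

If deletion is one function call, honoring a "delete my information" request takes minutes, not a developer project.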
Rule 4: No sharing without permission.
If your chatbot collects a lead and you pass that lead to a third-party marketing agency — without the customer's explicit permission — that's a PDPL violation. Full stop.
The Meta Policy Change That Made Compliance Even More Important
As of January 2026, Meta officially restricted general-purpose AI chatbots on WhatsApp Business API. Only task-specific bots are permitted — bots built for a defined purpose like booking appointments, tracking orders, or answering FAQs.
This isn't just a Meta rule. It's actually aligned with how PDPL works. Task-specific bots collect less data. They're more transparent about their purpose. They're easier to audit.
What this means practically: if your bot is trying to be everything — a sales agent, a customer service rep, a recommendation engine, a conversation partner — you're probably violating both Meta's policies and PDPL's principles at the same time.
Build it to do one thing. Do it well.
5 Things to Check Right Now
Go through this list for your current WhatsApp bot setup:
1. Do you have a visible privacy notice?
Your bot should reference your privacy policy at the start of every new conversation. If it doesn't, add one line. Today.
2. Does your bot collect only what it needs?
List every piece of data your bot asks for. Now cross off everything that isn't directly needed for the service the customer is requesting. Everything you crossed off is data you're collecting without a reason — and that's a compliance risk.
3. Can you delete a customer's data if they ask?
Test this right now. Find a test contact in your system. Can you delete all their data completely? If the answer involves a developer and three weeks, fix the system.
4. Do you have an audit log?
PDPL requires that you can account for how personal data was collected, used, and shared. Most basic chatbot setups have zero logs. Your platform should keep conversation logs and data access records that you can produce if SDAIA (the enforcing authority) ever asks.
5. Are you using a Meta-approved WhatsApp Business API provider?
If your WhatsApp automation runs through an unofficial channel or a third-party app that isn't an official Meta partner, your account is at risk of being banned and your data handling is unaudited. Buraq AI runs on the official API. That matters.
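Checklist item 4 — the audit log — can be as simple as an append-only file with one JSON line per data event. The field names and file-based layout below are my assumptions for illustration, not any platform's actual format:

```python
import json, os, tempfile, time

# Append-only audit log sketch: one JSON line per data event, so you
# can later show what happened to a record. Fields are illustrative.
def log_event(path: str, actor: str, action: str, subject_phone: str) -> dict:
    entry = {
        "ts": time.time(),        # when it happened
        "actor": actor,           # who touched the data (bot, agent, admin)
        "action": action,         # e.g. "collected", "exported", "deleted"
        "subject": subject_phone, # whose data it was
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
    return entry

# Demo with a temporary file and a placeholder number.
log_path = os.path.join(tempfile.mkdtemp(), "audit.jsonl")
entry = log_event(log_path, "bot", "collected", "9665XXXXXXXX")
```

Append-only lines are deliberately boring: nothing gets overwritten, so the log itself is evidence of what happened and when.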
What "Compliant" Actually Looks Like on Buraq AI
When we set up a client's WhatsApp bot, these are the non-negotiable defaults:
- Opening message includes a privacy acknowledgment
- Conversation logs are stored and accessible
- Data collection is mapped to a specific service purpose
- Human handoff is available (customers can always reach a person)
- Bot is task-specific — not an open-ended AI conversation
These aren't just bureaucratic checkboxes. They're good business practice. A customer who knows their data is handled properly trusts you more. Trust closes deals.
The Practical Bottom Line
You don't need to become a data privacy lawyer. You need to ask three questions about your chatbot:
1. Do customers know what data you're collecting and why?
2. Can they ask you to delete it and know it'll actually happen?
3. Are you only collecting data that's necessary for what you're doing?
If you can answer yes to all three, you're in good shape.
If not — the fix is simpler than you think. It starts with picking a platform that handles this correctly by default.
Buraq AI is PDPL-aware by design. Plans start at 399 SAR/month. buraq.ai
Ignorance of the law is not a defense. But compliance doesn't have to be complicated.
Mohammed — Founder, Buraq AI | buraq.ai