AI might be smart, charming (sometimes), and available 24/7, but that doesn’t mean you should tell it everything. Think of it less like a therapist or bestie, and more like a very sharp intern with an excellent memory and very firm boundaries. Whether you’re just getting cozy with ChatGPT or using it for serious business, here are 8 things you should never say to an AI, unless you want a polite “Sorry, I can’t help with that” or, worse, a privacy mishap.

1. Your deepest, darkest secrets (aka sensitive personal info)

Sure, ChatGPT will never ghost you, but that doesn’t make it the right person (or bot) to spill your life story to. Even if you just want someone to talk to, there’s no need to drop your IC number, full address, or your entire resume into a prompt just to get solid advice. Keep sensitive information and personal drama to yourself – or at the very least, within a secure platform.

2. Your bank account details

As much as AI can help you budget, it doesn’t need to know your credit card information or bank details. Ever. Sharing financial information puts you at risk, full stop. (Spoiler: even if the AI isn’t storing your info, it’s still a no-go!) Think of ChatGPT like a super-smart spreadsheet, not your personal accountant. While it can help you build a budget, simulate savings plans, or explain how compound interest works, it’s not built to securely store or process actual financial data. That’s a job for encrypted, regulated platforms, not a chat window.

3. Anything that would make your lawyer nervous (Illegal stuff)

If you’re asking AI how to “quietly” bypass a paywall or something worse – don’t. Not only is this unethical, but AI is designed to shut down suspicious activity instantly. It’s not your partner in crime, and it definitely won’t help you out of a sticky situation. In fact, trying to get AI to participate in shady activity can leave a digital footprint you don’t want traced back to you. Even joking about illegal activity in a prompt can trigger red flags and result in warnings or restricted access. Remember, everything you type is part of a conversation that could be reviewed for safety or compliance.

4. Hate speech, threats, or anything that sounds like an online meltdown

AI has a zero-tolerance policy for harmful, violent, or discriminatory language. That includes anything that resembles bullying, hate speech, or threats, or anything that incites violence. ChatGPT is designed to shut these kinds of prompts down instantly. Not only will it refuse to engage, but repeated attempts to input harmful content can flag your account or get you booted from the platform entirely.

Beyond the technical consequences, there’s also a bigger picture to consider. Language matters. What we type into AI models contributes to shaping how these systems learn and respond over time. If we want AI to be a tool that supports creativity, inclusivity, and respect, we have to model that in the way we interact with it. So while ChatGPT won’t humor a meltdown or engage in toxicity, it’s also on us to avoid treating it like a digital punching bag. The next time you’re feeling heated, maybe take a breath, go for a walk, and come back when you’re ready to type like a decent human being!

5. “Hey, does this look infected?” (Medical information)

ChatGPT can give you general health tips, but it’s not a doctor and definitely doesn’t play one online. Never upload test results or ask for a diagnosis. Instead, see a real healthcare professional who’s licensed, trained, and has a stethoscope. In fact, we even tried to test it by asking for a diagnosis for a simple cold and ChatGPT shut it down immediately, clearly stating it’s not a doctor from the get-go.

6. Confidential company secrets or copyrighted info

Tempted to paste your clients’ entire brand strategy for AI to rewrite? Think again. AI is a great collaborator, but confidential or copyrighted data should stay off the chat. It’s not about AI spilling secrets, it’s about staying on the right side of your NDA. The best way to go about this is to keep confidential information out of your prompts and replace it with generic terms instead. For example, swap “Coca-Cola launch strategy” with “launch plan for a soft drink”. This lets AI help you brainstorm without putting your company at risk.

7. Your passwords (or anyone else’s!)

This one should be obvious, but it’s worth repeating louder for the people in the back: don’t share your passwords with anyone – including ChatGPT! While some AI tools don’t store your information or remember your conversations permanently (if you have memory disabled), it’s still a terrible habit to type out login details in any space that isn’t a secure password manager. Think of it like shouting your ATM PIN across a crowded room: it’s not illegal, but it’s definitely not wise. And with phishing scams and data breaches already a constant threat, dropping sensitive login info into a tool that only seems “private” is just adding fuel to the fire.

8. Small talk like “Hi” or “Thanks”

Okay, this one’s not dangerous; it’s just unnecessary. While it’s nice to be polite, here’s the thing: even a simple “hello” or “thank you” uses up computing power. AI runs on serious energy behind the scenes, and every response – no matter how small – has a carbon cost. So while ChatGPT won’t mind the extra chatter, the planet might. Keep it concise, save the niceties for your next human interaction, and let’s be efficient together.

Final Thoughts

AI is a powerful, borderline magical tool – capable of spinning up ideas, reworking your emails, and even pretending to know what your dreams mean. But it’s not your diary, your doctor, or your best friend. Every interaction, no matter how casual, is part of a much bigger system that runs on data, energy, and policy. The more intentional we are with our inputs, the better and safer the outputs become – not just for us, but for everyone using AI.

So the next time you start a conversation with ChatGPT, think of it less like a chatroom and more like firing up a jet engine to fetch you a sandwich. Cool? Yes. Efficient? Only if you use it wisely!

Disclaimer: This piece is just our take on what to avoid when chatting with AI tools like ChatGPT. Ultimately, use it how you see fit – but hopefully a little wiser now!
