The Real Risks of Free AI Advice in HR
“Just ask ChatGPT.”
It’s become the default advice for everything—from writing a recipe to managing staff issues.
And while it might work for dinner ideas, when it comes to HR—especially in Australia—it can cost you thousands, damage your reputation, and land you in legal hot water.
Why? Because AI isn’t trained on your policies. Or our Fair Work system. Or your obligations under the SCHADS, Retail, Clerks, or Manufacturing Awards. It’s trained on patterns. Not on legal precedent. Not on nuance.
But more and more businesses are using free AI tools to:
- Get advice on pay rates or Award coverage
- Write termination letters
- Draft employment contracts
- Handle conflict and underperformance
And they’re making expensive mistakes as a result.
I recently had a potential client plug the information I was providing into AI—just to “check” whether I knew what I was talking about.
Then came the awkward (and somewhat heated) conversation when the AI tool gave them a different answer than I did.
The heated part wasn’t on my end—it was the client who chose to trust good old Charlie Chat over my 20+ years of real-world HR and compliance experience.
Let’s just say, Charlie Chat won’t be showing up beside them at a Fair Work hearing.
I know tools like ChatGPT are incredibly convincing, but that’s because they’re designed to be. They mimic confidence. They use professional language. But they don’t understand your legal risk, your workplace culture, or the impact of a poorly handled conversation. They don’t know how to read a tense pause in a meeting or recognise that an employee is struggling with burnout masked as underperformance.
Real Mistakes, Real Consequences
🔻 Incorrect Pay Rates: A small business owner relied on ChatGPT to calculate casual pay for a retail worker. The AI didn’t factor in the minimum engagement period or weekend penalty rates. The business ended up owing nine months of backpay and faced a Fair Work audit after a complaint.
🔻 Unenforceable Contracts: A client copied and pasted a contract clause from an AI draft, thinking it sounded good. But the non-compete clause had no geographical scope and wasn’t relevant to NSW employment law. The employee left and immediately started a competing business. Legally.
🔻 Flawed Termination Advice: An SME used AI to guide a dismissal. It missed key steps like formal warnings, offering a support person, and procedural fairness. The business owner was shocked to receive a general protections claim and had no proper records to support their version of events.
Why This Happens
Free AI tools like Claude, ChatGPT, and others sound confident. That’s by design. But they don’t:
❌ Know current Award rates
❌ Verify Australian employment laws
❌ Understand Fair Work procedures
❌ Read emotional dynamics or workplace context
And more importantly, they aren’t accountable if things go wrong. You are.
In small businesses, it’s easy to think AI is the smarter, faster, cheaper option. You’re time-poor. You’re trying to get it right. And maybe you didn’t realise HR is a specialised skill set you need help with. After all, isn’t it just about chatting to your people every now and then?
Unfortunately, it’s not. HR is layered with nuance. It’s legal, yes, but it’s also emotional, relational, and strategic. And AI can’t lead with empathy, adapt to cultural dynamics, assess risk, or understand what truly keeps people engaged, safe, and supported.
That doesn’t mean AI has no place in HR. Used with care, it can help you draft a job ad, map out onboarding steps, or summarise a policy.
But when it comes to decision-making involving real people, risk, or conflict, human judgment is non-negotiable.
HR isn’t just about policies; it’s about people. And when AI gets it wrong, it’s not just a legal risk; it’s a leadership one.
Trust gets shaken. Communication suffers. And people start to feel like they’re working for a machine, not a business that values them.
5 Questions to Ask Before You Use Free AI Advice:
- Would I trust this advice in front of a Fair Work representative, a SafeWork inspector, or another third party? If not, don’t act on it.
- Have I checked this advice against current Australian legislation or Awards? If not, you’re flying blind.
- Does this situation require judgment, empathy, or leadership? AI can’t offer any of those.
- Am I clear on my legal obligations—not just “common sense”? AI doesn’t know your compliance context.
- If this goes badly, who’s responsible? You are. Not Charlie Chat. AI is a powerful tool, but it’s not your legal team, your HR manager, or your conscience.
Final Thoughts
AI will give you fast answers. But HR isn’t about speed. It’s about risk, relationships, and results. Use AI for admin and structure.
Use a human for anything that carries weight, risk, or impact.
You don’t need to fear AI; you just need to know where it belongs. Use it to make your systems smarter, not to replace your leadership. Because at the end of the day, AI doesn’t lead people. You do.
READY TO GET THINGS DONE?
Revolution Consulting Group is your Dedicated HR Partner