Why ChatGPT and Other AI Tools Aren’t a Shortcut for the Clueless
The High Street Journal
published: Jul 10, 2025

By: Albert Biga (Technology evangelist & Entrepreneur)
This ancient wisdom has never felt more relevant than it does in today’s rapidly changing technological landscape, particularly with the explosive growth of artificial intelligence. The rise of AI tools like ChatGPT, DeepSeek, Claude, and others has sparked global excitement. With just a few typed instructions, anyone can now produce detailed reports, generate business plans, write code, brainstorm creative content, and summarize complex texts.
These tools are powerful and widely accessible, and they are rightly hailed as democratizing technology and access to information. But beneath the surface lies a less comfortable truth: AI is not a shortcut to intelligence or insight. In fact, it may be quietly deepening the divide between those who know and those who don’t.
To an outsider, using ChatGPT seems as easy as typing a question. But the difference between a mediocre response and a powerful, tailored output lies in how well the user understands the three prompt commandments: who you are, what information you want, and who the output is intended for. All three, properly delivered in a good prompt, guide the AI to get you to the proverbial “promised land”.
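The three commandments above can be illustrated with a minimal sketch. The helper function, field names, and prompt wording here are illustrative assumptions, not part of any particular chatbot’s API:

```python
def build_prompt(role: str, request: str, audience: str) -> str:
    """Assemble a structured prompt covering the three elements:
    who you are, what information you want, and who the output is for."""
    return (
        f"You are assisting {role}. "
        f"Task: {request} "
        f"Write the output for {audience}."
    )

# A vague prompt versus a structured one built from the three elements.
vague = "explain cash flow"
structured = build_prompt(
    role="a small-business accountant",
    request="explain cash flow forecasting in plain terms.",
    audience="a first-time shop owner with no finance background",
)
print(structured)
```

The vague one-liner leaves the tool guessing at context and audience; the structured version hands it all three, which is what separates a generic answer from a tailored one.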
This is where the gap begins. People with domain expertise and communication skills can extract tremendous value from AI. They know how to frame a prompt, iterate intelligently, challenge an answer, and guide the tool like a co-pilot. But those without this foundation often get stuck at the surface, asking vague questions and getting generic or even misleading responses in return. Put differently, prompting is not just about asking; it is about knowing what to ask and how to use the answer. In other words, knowledge is required to use AI properly.
There’s a popular belief that AI can replace thinking: that you can ask it to “explain this strategy” or “write a business plan” and walk away with something ready to use. But AI tools don’t have real-world experience. They rely on patterns in data, not judgment. Without a user who understands the topic, the results can sound confident but be entirely wrong.

For instance, a lawyer using AI can draft a clause faster, but only because they know what makes it valid in law; a data analyst can structure a meaningful report with ChatGPT only because they understand what the data should show; and a business leader can ideate faster, but only because they can sense what will work and what’s off.
To sum it all up: if you don’t know what “good” looks like, AI won’t help you get there. In fact, it may mislead you faster. Another challenge is that generative AI tools are trained entirely on human-created data through machine learning, which means they reflect not only our knowledge but also our errors, biases, and blind spots.
AI doesn’t “know” the truth; it mimics what it has seen. For users who lack subject-matter understanding, this poses a serious risk: they may accept flawed, outdated, or inaccurate information without realizing it. The ability to detect these missteps is yet another advantage of an informed user and another trap for the clueless.
This means that for professionals, creators, and thinkers who bring judgment, structure, and experience to AI, the technology becomes a multiplier. Users who treat AI like Google or Grammarly, by contrast, may get surface-level help, an edited email or a blog summary, but the real depth is missing.
It’s not a new phenomenon. When computers first entered the workplace, they held the promise of revolutionizing productivity. But in many organizations, people simply used them as glorified typewriters: typing memos and letters, printing documents, maybe checking email, without ever tapping into the true power of spreadsheets, databases, or analytics.
The tool was there, but the thinking hadn’t changed. We often talk about digital inequality in terms of access: who has internet, devices, or apps. But a deeper divide is emerging, one of AI aptitude. It’s not about having the tools but about knowing how to leverage them.

This is particularly concerning in under-resourced communities, developing economies, and smaller institutions where formal training is lacking. In such settings, the availability of AI won’t lead to transformation unless users are taught how to think with it and not just use it. It’s like giving everyone a piano: only those who understand music will make something beautiful.
The rest may just bang the keys. To ensure AI becomes a true equalizer and not a divider, we must invest in capacity and competency, not just access to the tools. This means teaching AI literacy in schools, workplaces, and public institutions; promoting critical thinking and domain knowledge as core competencies in the AI age; and encouraging users to treat AI as a collaborator, not as a crutch or an oracle.
We don’t all need to be programmers. But we do need to become more intentional, reflective, and structured in how we engage with these tools. To conclude, ChatGPT and its peers aren’t magic bullets; they’re mirrors reflecting back what the user brings to the table. So no, AI isn’t a shortcut for the clueless. It’s the fulfilment of the parable, revealing who will sow wisely and who will bury their gifts in the ground.