
Hallucination traps and knowledge gaps in policy

10th October 2025

By: Saliem Fakir


Policy-wonking can at times be a crude endeavour: reality is not reached directly but through untested internal beliefs. Without direct personal experience, one worldview tries to grasp the lived experience and worldview of others, and in between, hallucinations can creep into the mix.

Then something is written and presented as a kind of knowing without really knowing. There is always epistemic treachery when you have not done the hard yards of working at something and gained tacit knowledge – prior experience is gold. Even where we recover some modicum of authenticity, policy elites can take leaps of faith: a view expressed is assumed to be universally shared because someone with purported influence has said so.

We should not lose sight of a further dimension of the word ‘evidence’: it can suggest an objective exercise while masking political affiliations and belief systems that have percolated, knowingly and unknowingly, into the process of evidence gathering. The sway of belief systems over evidence should not be underestimated; they lend a hand to political choices that are preset, locked in and immovable.

And let us not fall into the trap of the ‘halo effect’: exceptional accomplishment in one domain of knowledge may make someone profound, even celebrity-like, in the policy world, but it does not always make them the best equipped to provide solutions in other domains.

Here, listening is the key art, and then wisdom, when placed well, can be exercised.

Policy advocacy can live in its own cave, watching shadows, mistaking them for real knowledge.

What lessons can we draw from the problem of hallucinations in another domain of knowledge?

Large language models (LLMs) are known to suffer from hallucinations – they can create the illusion of giving you a reasonable-sounding string of words that is nonetheless entirely false or inaccurate.

Depending on the weighting and ranking built into the algorithms of AI machines such as ChatGPT, Grok and others, some recently written papers show that hallucinations cannot be removed – owing in part to what one paper described as ‘perplexingness’, defined as “the degree to which new knowledge conflicts with an LLM’s learned conceptual hierarchies and categorical relationships”.

AI machines can have programmed flaws or develop biases, as recent news reports have highlighted, and these later prove hard to correct through editing once the algorithm has been trained on data through reinforcement and learning mechanisms.

The point about AI hallucinations and perplexingness is that bias is inherent in the categories and hierarchies by which an algorithm evaluates data.

In any case, it is still early days with AI and sceptics remain: Emily Bender (FT interview, 20 June 2025) referred to LLMs as large plagiarising machines, or “stochastic parrots”. This is not to dismiss the technology or to confine the concern to AI machines alone; Bender’s is a serious point – AI machines are changing the human condition, turning what we may call authentic intellectualism into synthetic laziness.

To address the problem of algorithmic bias in shaping public discourse and the information that informs political choices, some AI companies have, curiously, come up with a solution: the “Habermas Machine”, inspired by the philosopher Jurgen Habermas’s theory of communicative action, in which a free society is one of reason, tolerance and intellectual maturity, where the process of deliberation leads to understanding and consensus.

DeepMind, the inventor of this Habermas Machine, is giving its LLM a go at mediating political debates and disputes to help achieve the Habermasian ideal of communicative action. As far as culture wars and partisan debates go, it does not seem that machines are anywhere near solving the deep fractures and polarisation hammering away at our civilisation.

These concerns should not be reserved for young pupils or university graduates; they apply also to the very business of policy research and knowledge. How much real research is still done, versus consulting the “stochastic machines” for policy answers, is now an open question.

While LLMs speed up the production of policy knowledge and answers, they also risk turning policy-wonking into something inauthentic and in danger of automated plagiarism if no ethical guardrails are applied.

In the allegory of the cave that Plato introduces in the Republic, people who have lived all their lives in a cave have become accustomed to seeing only one thing: the shadows of objects cast on the walls of the cave. Then one person takes the brave step of venturing out of the cave and discovers that the shadows are cast by actual objects that exist outside it.

The story goes further: if that brave person who ventured beyond the cave came back to tell the other cave dwellers that everything they had seen was an illusion, they would not believe him.

There is much in LLM hallucinations, AI bias and the story of Plato’s cave that has relevance for policy-wonking. We all suffer from one or other form of hallucination if we do not take corrective measures – hopefully we do not find ourselves in the position some AI programmers have discovered: that once a hierarchical bias sets in, no amount of editing can change the weighting of the bias.

Edited by Martin Zhuwakinyu
Creamer Media Senior Deputy Editor
