The AI revolution and the talent recruitment question
By Mike Mackenzie, talent acquisition, BIS Services

The AI Revolution is here. In fact, even writing these words already feels archaic, with the use of this amazing technology now embedded irrevocably into every industry across the globe.
From marketing to healthcare, from the stock exchange to social media algorithms, the outsourcing of menial tasks and the speeding up of uninspiring administrative responsibilities has revolutionised workflows, education and data analysis at a speed many of us have found breathtaking. And it will only get faster as our reliance deepens.
This article may already be obsolete if revisited in a year's time. In fact, you'd be forgiven for suspecting that this too was AI generated, such is its prevalence now in every aspect of our lives. (It wasn't!)
We could all argue back and forth over the pros and cons of this technological wonder, and where it is taking us as a civilisation. What must be debated and considered in our industry, however, is the role of AI in neurorehabilitation, and its impact on the critical thinking, analysis and capacity of rehabilitation professionals to best serve their clients in the real world.
As the Talent Acquisition Manager at BIS Services, my recruitment role focuses entirely on identifying competent Rehabilitation Assistants who support sensitive clients living with acquired brain injuries, and who act as the eyes and ears on the ground for the wider MDT in their ongoing rehabilitation and assessments.
The role therefore requires not just a background education in psychology or neuroscience, our minimum standard, but also candidates who possess inherently compassionate personality traits and problem-solving ability, combined with skills in critical analysis, observation and a thirst for supervision and development.
As a fast-growing service, we vet thousands of CVs each year, and as part of that recruitment process we include a detailed application form that not only assesses candidates' theoretical knowledge of cognitive deficits and presentations following a TBI, but also poses several real-world support scenarios designed to gain insight into their ability to problem-solve and put their knowledge into practice.
It is here we have recently seen a marked shift in university applicants' use of AI generators in their answers, with an increasing number trying to game the system to secure an interview. And who can blame them? This is a notoriously difficult industry to break into, such are the numbers of graduates entering the system each year, across its multiple pathways.
Our processes have therefore evolved to adapt: we too now employ AI detectors when we suspect an answer is inauthentic or too precise, or where, sadly, candidates have copied and pasted answers without a thought for editing.
Of course, as AI use develops further, we have seen social media advertising aimed directly at students, promoting AI filters that "humanise" AI-generated text to avoid detection. AI detecting AI, using AI to avoid AI. You can see the issue.
To take things one step further, I have personally conducted virtual interviews with candidates who, to my astonishment, set up ChatGPT next to their laptop and recited verbatim the generated answers to my questions. Needless to say, these candidates were not successful, but you get the point.
To lecturers, of course, this is nothing new, and the horse has long since bolted when it comes to generated content in dissertations, research and assessments.
Through our close relationships with multiple faculties across the UK, we hear the exasperation first-hand. Many universities now publish AI rules and guidelines on their own application pages, tacitly endorsing its use, and appear resolved to move with the times rather than issue an outright ban.
In February 2025, the Higher Education Policy Institute (HEPI) published a Savanta survey reporting that the proportion of students using generative AI tools such as ChatGPT for assessments had jumped from 53 per cent last year to 88 per cent this year. The most common uses are explaining concepts, summarising articles and suggesting research ideas. The trend is clearly here to stay.
The issue for the industry to grapple with, therefore, is the obvious loss of that uniquely human, in-person problem-solving ability and talent for building trust and empathy, which is what really helps to improve outcomes.
An AI, as yet, cannot engender compassion, or read a room and its inherent non-verbal cues. Trust is irrevocably broken between client and rehabilitation assistant if a computer or smartphone is dictating the course of a session, with each party going through dehumanised motions.
This might be an extreme example, yes. But what hope does a candidate have, if their entire training has been generated for them, when they are then thrown into a real case scenario that requires independent thought?
What talents can be developed by a rehabilitation assistant, when 90 per cent of the applicants we see aspire to progress to clinical roles and doctorates, if they have lost that human ability to truly connect with their subject and guide clients with their insights appropriately?
At the risk of being too critical of the current cohort of graduates, this may of course read like the rant of the little Dutch boy with his finger in the dam.
This is not my world any longer; the future is already here, and technology has shaped the lives of this new generation of aspiring psychologists at a speed beyond anything this writer has experienced in his lifetime so far. I might think it's like turkeys voting for Christmas, to use another analogy, but the next generation simply do not see it that way.
Education and employment trends have further exacerbated the situation. We are all still adapting to a post-COVID world, in which society was asked to retreat to the home and conduct life virtually, sacrificing human interaction. The generation graduating now is the one whose core development years were impacted in the thick of that crisis; coupled with remote working, a social media world and a swipe-left mentality, it is entirely understandable why we are where we are today.
So what is to be done? How do we adapt to make sure we are preparing talent and serving our clients to the best of our abilities?
Indeed, opinions and articles have already been written about the benefits of AI as it becomes increasingly sophisticated; TBI solicitors and barristers are already leveraging these technologies to build stronger cases, analyse complex medical data and neuroimaging advancements, and ultimately secure better outcomes for their clients, as reported in another article in the NR Times.
This will undoubtedly revolutionise the industry, but the flip side is that the underlying data and observational reports will themselves be scrutinised by AI, and that the technology is being harnessed correctly by professionals with years of care and litigation experience under their belts.
That is an advantage not afforded to inexperienced assistants trying to make a name for themselves, and it leaves them in a potentially precarious position.
It is true we're going through a dramatic sea change in society, in which technology will all but wipe out entire job industries through automation. We're in a world where billionaires are openly setting their sights on space, Neuralink interfaces and automatons in the home. The impact on our industry is therefore clear.
At BIS, we acknowledge these changes and work with our rehabilitation assistants to help them prepare for the new world to come, and to carve out their USP, their point of difference: a talent.
We closely monitor session MACH forms and provide guidance and development through senior line managers.
Shadowing, multi-phase recruitment, in-person role-playing and regular scenario case studies have all become essential to the goal of developing talent and finding the best fit for our clients, right down to personality matches.
We do this so that both sides, clients foremost but also the assistants themselves, are looked after during this delicate process of recovery, and so that outcomes on both sides prosper.
The BIS Services motto is "Raising the Standard in Cognitive Rehabilitation and Brain Injury Support". That means training, development and a client-centred, tailored approach on both sides.
If AI is to be the future, which it surely is, then that future must include adapting our approach and safely integrating its uses and benefits, without sacrificing that essential human touch or closing the vital gateway through which psychology graduates progress as clinicians.
Find out more about BIS Services at thebiss.co.uk
