From shadow IT to shadow AI: why leaders must catch up

Organisations that fail to monitor shadow AI usage today will face difficulties when regulatory bodies start enforcing new rules in the future.

Technology is developing faster than organisations can establish effective governance for it.

The workplace saw a similar phenomenon in the last decade, when staff brought Dropbox, Google Docs and Slack into their work environments before organisational approval. The official tools were too slow, clunky or nonexistent, so workers found their own.

Shadow AI mirrors that earlier wave of shadow IT: it is the unsanctioned use of AI tools or applications by employees, without the employer's approval or oversight.

There are several reasons why employees turn to shadow AI.

The underlying drivers are familiar, and they surface wherever official tools fall short. One is productivity pressure, as when a marketing associate uses AI to generate campaign ideas on a tight deadline. Another is complexity gaps, as when a financial analyst uses AI to verify formulas rather than wait for peer review.

These examples demonstrate that staff members use AI tools to address genuine operational challenges rather than seeking new technology for its own sake.

The challenge is that these tools are being used without proper oversight. Shadow AI creates three distinct categories of risk that organisations must address.

First, data exposure: sensitive information and client data become vulnerable to unauthorised disclosure when fed into unprotected AI systems.

Second, flawed output: AI systems can generate biased or inaccurate results, which creates legal exposure when organisations rely on them for hiring or other decision-making.

Third, eroded trust: when staff members start concealing productivity tools from their superiors, the breakdown in trust damages organisational culture.

Regulators are watching these developments closely. The EU AI Act has partially taken effect; it requires organisations to keep records of their AI system use and to implement proper governance.

Organisations that fail to monitor shadow AI usage today will struggle once regulators begin enforcing these rules.

History shows the right path. Once employees embraced shadow IT, organisations moved from banning cloud services to building structured programmes for cloud adoption. The same approach is needed for AI.

Leaders must first validate employee needs, recognising that shadow AI reflects a desire to work more efficiently. They should then set clear rules defining authorised tools, data-handling procedures and prohibited practices.

Organisations should also deliver training on proper AI use, covering bias detection, privacy protection and system security, and invest in enterprise-grade AI solutions that give employees secure platforms to work with, rather than forcing them to hide their tools.

Further, organisations need periodic assessments that monitor AI usage for safety and alignment with business objectives.

Shadow AI is a signal, not an act of defiance. It shows that employees are ready to work in new ways before their leaders are.

Executives who see shadow AI only as a threat will overlook the substantial business potential it presents. Those who treat it as a strategic indicator will convert risk into advantage by building organisations that excel at AI.

The choice is straightforward: let shadow AI run unchecked, or harness it through purposeful leadership.

The writer is an experienced tech leader, accelerating Africa's digital future with AI, strategic innovation and resilient growth.
