
The realities AI is forcing finance leaders to confront about influence, trust

Technology
27 February 2026

AI is not removing the need for judgement in finance. It is making judgement more visible, writes Hakan Ozyon.

I returned from the 1 Billion Summit in Dubai this January with a familiar mix of optimism and unease. The technology on display was impressive. The ambition was undeniable. I gained valuable insights, particularly on how AI is reshaping finance in ways that extend far beyond efficiency, automation, or cost reduction. Beneath the excitement sat a quieter truth: artificial intelligence is transforming the very foundations of financial services.

Long before a customer speaks to an adviser, opens an account, or signs a document, AI systems are already at work shaping what they see, what they trust, and what feels credible. Influence now sits upstream of decision-making. Personalised feeds, recommendation engines, sentiment analysis, and automated messaging are no longer peripheral marketing tools. They form the environment in which financial choices are made.

At the summit, much of the focus was on scale. Scale of reach, scale of engagement, scale of monetisation. Yet scale without governance is not progress. It is exposure.


The most visible edge of this shift is the rise of financial creators. In Australia, there are platforms that demonstrate how financial influence now operates at scale across podcasts, newsletters, and social channels long before consumers engage directly with an institution. But creators are not the core issue. They are a signal of something deeper: AI-driven persuasion has become systemic.

The question for finance leaders is no longer whether AI-driven influence exists. It does. The real question is whether it is being governed with the same discipline we apply to capital, compliance, and risk.

For decades, influence sat outside the balance sheet. Marketing was treated as discretionary, and trust was assumed to be slow-moving and resilient. AI has dismantled that assumption. Distribution now behaves like infrastructure. Algorithms determine reach. Personalisation determines relevance. Velocity determines impact. Attention can be mobilised faster than most institutions can respond.

Creators illustrate this shift clearly. They do not simply promote products. They control direct relationships, repeatable demand, and community trust. AI amplifies this by optimising content, scaling reach, and accelerating monetisation. What appears to be marketing increasingly functions as economic leverage. Finance can no longer afford to treat influence as an afterthought when it behaves like infrastructure.

One of the most underestimated risks in this environment is symmetry. AI does not distinguish between good intent and poor outcomes. It optimises for engagement, not suitability. The same systems that scale credibility also scale misalignment. When trust is earned slowly but deployed quickly, small distortions compound into large failures.

This is where regulation often lags reality. Influence can scale long before guardrails appear. Disclosure becomes blurred. Incentives are obscured. Audiences discover monetisation only after behaviour has shifted. Trust is eroded not by malice, but by opacity. AI does not introduce new ethical dilemmas in finance; it accelerates existing ones and removes the margin for ambiguity.

Personalisation is often framed as customer centricity. In practice, it can blur the line between guidance and persuasion. AI systems increasingly tailor messages based on behavioural signals, emotional cues, and inferred preferences. Used responsibly, this can improve understanding. Used carelessly, it becomes silent persuasion, influencing outcomes without customers fully realising how or why certain messages were prioritised.

Finance operates under a higher obligation. Fiduciary duty does not stop at the advice itself. It extends to the environment in which decisions are shaped. When influence is personalised, customers must be able to understand what they are seeing, why they are seeing it and whose interests are being served. Any system capable of influencing a financial decision must be governable, explainable, and interruptible by a human. Transparency is not optional. It is foundational to trust.

A common misconception I heard repeatedly in Dubai is that partnering with creators or third-party platforms transfers risk. In reality, it often concentrates it. Creators move fast. Institutions move carefully. When these speeds collide without structure, problems follow. Improvised messaging becomes compliance exposure. Personal controversy becomes brand damage. Missing approvals become audit gaps.

AI intensifies this risk. Content can be generated, adapted, and distributed at scale, sometimes without direct human oversight. When something goes wrong, regulators and customers do not ask which algorithm or partner was responsible. They look to the institution. If an organisation cannot clearly explain how its AI systems influence customer perception, then it does not control those systems – regardless of who built them.

Governance cannot be bolted on after the fact. Vetting, contracts, disclosures, approvals, training, and measurement are not bureaucratic obstacles. They are the cost of operating in an AI-influenced environment.

The most important insight from the summit was not about technology at all. It was about discipline. Institutions that succeed will not be those that chase every new channel or tool. They will be those who build trust infrastructure that scales alongside influence. That means treating creators as partners within a regulated system, not as promotional shortcuts. It means designing governance into workflows, not retrofitting it after incidents. It means aligning influence strategies with values, not short-term engagement metrics.

In an AI-shaped influence economy, trust is no longer brand equity. It is operational infrastructure. Heading into 2026, the gap will widen between institutions that invest in trust as a system and those that continue to buy impressions and hope for the best. The former will build portfolios of influence that compound. The latter will experience volatility, scrutiny, and surprise.

AI is not removing the need for judgement in finance. It is making judgement more visible. How institutions choose to govern today will determine whether trust becomes an appreciating asset or a hidden liability tomorrow.

Hakan Ozyon is the founder and chief executive of Hejaz.