Banks are no longer asking whether artificial intelligence belongs in financial services. The more pressing question is how to move AI out of isolated experiments and into day-to-day banking operations without weakening governance, security, or regulatory compliance. That challenge is at the heart of a new offering from digital banking platform Plumery AI, and it reflects a broader industry shift: banks are moving toward operational adoption as standardised integration frameworks reach the market.
Turning AI experiments into everyday banking tools
Plumery AI has introduced what it calls an “AI Fabric,” designed to act as a standardised framework that connects generative AI models with core banking data and services. Rather than relying on one-off, custom-built integrations for every new use case, the company positions the framework as a reusable, event-driven, API-first architecture that can scale alongside a bank’s digital ecosystem.
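To make the pattern more concrete, the sketch below shows in simplified Python what an event-driven, API-first integration layer can look like in principle. The names used here (BankingEvent, AIFabricGateway, summarise_transaction) are illustrative assumptions rather than Plumery's actual interfaces, and the in-process dispatcher stands in for the message broker a real deployment would use.

```python
# Hypothetical sketch of an event-driven integration layer; names are illustrative,
# not Plumery's actual API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class BankingEvent:
    """A domain event emitted by a core banking system of record."""
    topic: str          # e.g. "payments.transaction.created"
    customer_id: str
    payload: dict

class AIFabricGateway:
    """Routes domain events to registered AI-backed consumers."""
    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[BankingEvent], None]]] = {}

    def subscribe(self, topic: str, handler: Callable[[BankingEvent], None]) -> None:
        self._handlers.setdefault(topic, []).append(handler)

    def publish(self, event: BankingEvent) -> None:
        # A production system would publish to a broker such as Kafka;
        # in-process dispatch keeps this sketch self-contained.
        for handler in self._handlers.get(event.topic, []):
            handler(event)

def summarise_transaction(event: BankingEvent) -> None:
    # Placeholder for a call to a generative model behind the fabric's API layer.
    print(f"AI summary requested for customer {event.customer_id}: {event.payload}")

gateway = AIFabricGateway()
gateway.subscribe("payments.transaction.created", summarise_transaction)
gateway.publish(BankingEvent("payments.transaction.created", "cust-42",
                             {"amount": 120.0, "currency": "EUR"}))
```

The point of the pattern is reuse: once the gateway and event contracts exist, a new AI use case subscribes to existing streams instead of triggering another bespoke integration.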
The aim is to tackle a familiar problem in financial services. While banks have invested heavily in AI proofs of concept over the past decade, many initiatives stall before reaching production. According to industry research, including analysis from McKinsey, the issue is rarely a lack of ambition. Instead, fragmented data environments and legacy operating models make it difficult to deploy AI consistently across the enterprise. Shared infrastructure, clear governance, and reusable data products are increasingly seen as prerequisites for meaningful AI adoption.
Governance and control remain non-negotiable
Speaking at the product launch, Plumery founder and chief executive Ben Goldin emphasised that banks’ expectations around AI are well defined.
Financial institutions, he noted, want AI systems that deliver tangible improvements in customer experience and operational efficiency, but not at the expense of oversight or control. Plumery’s event-driven data mesh approach is intended to reshape how banking data is produced and consumed, rather than layering AI on top of already fragmented systems.
This focus reflects a wider industry reality: in regulated environments like banking, innovation only succeeds when it aligns with strict governance requirements.
Data fragmentation: the persistent bottleneck
Data fragmentation continues to be one of the biggest barriers to operational AI in banking. Many institutions still depend on legacy core systems that sit beneath newer digital channels. The result is a patchwork of data silos across products and customer journeys.
Each new AI initiative typically triggers another round of bespoke integrations, security assessments, and compliance reviews. That raises costs and slows delivery, making it harder for banks to scale AI use cases beyond the pilot stage.
Academic and industry studies reinforce this concern. Research into explainable AI highlights how fragmented data pipelines complicate auditability and increase regulatory risk—especially in sensitive areas such as credit decisions and anti-money-laundering. Regulators have been clear that banks must be able to explain and trace AI-driven outcomes, regardless of where or how the models are built.
Plumery argues that its AI Fabric addresses these issues by exposing domain-specific banking data as governed, reusable streams. By clearly separating systems of record from systems of engagement and intelligence, the company says banks can innovate more safely and with greater confidence.
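What a “governed, reusable stream” might look like can be sketched as a data product descriptor with ownership and access metadata. The field names below are assumptions for illustration, not Plumery’s actual schema; the idea is that a system of record publishes the stream and only approved systems of engagement or intelligence may consume it.

```python
# Illustrative data product descriptor; field names are assumptions, not a vendor schema.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class DataProduct:
    """Describes a domain-owned stream exposed by a system of record."""
    name: str                      # e.g. "accounts.balance.updated"
    owner: str                     # accountable domain team
    classification: str            # e.g. "PII", "internal", "public"
    retention_days: int
    consumers: tuple[str, ...] = field(default_factory=tuple)  # approved consumers

    def authorise(self, consumer: str) -> bool:
        # Central policy check: only registered consumers may read the stream.
        return consumer in self.consumers

balances = DataProduct(
    name="accounts.balance.updated",
    owner="deposits-domain",
    classification="PII",
    retention_days=365,
    consumers=("mobile-app", "ai-assistant"),
)
assert balances.authorise("ai-assistant")
assert not balances.authorise("unregistered-bot")
```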
AI is already delivering value—selectively
Despite the challenges, AI is far from theoretical in financial services. Many banks already use machine learning and natural language processing in production, particularly in customer service, risk management, and compliance.
Large institutions have deployed AI-driven chatbots to handle routine customer queries, easing pressure on call centres and improving response times. Predictive analytics models are widely used to monitor loan portfolios and anticipate defaults, while banks such as Santander have spoken publicly about using machine learning to strengthen credit risk assessment.
Fraud detection is one of the most mature applications. AI systems now analyse transaction patterns in real time, identifying anomalies more effectively than traditional rule-based approaches. However, analysts note that these models depend heavily on high-quality, well-integrated data—something that remains harder for smaller or less modernised institutions to achieve.
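As a minimal illustration of the technique, the sketch below trains an unsupervised anomaly detector on synthetic transaction features using scikit-learn’s IsolationForest. The features, thresholds, and data are invented for the example; production fraud systems combine far more signals and run against live streams.

```python
# Minimal anomaly-detection sketch on synthetic card transactions (illustrative only).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Synthetic features: [amount, seconds since previous transaction]
normal = np.column_stack([rng.normal(50, 15, 1000), rng.normal(3600, 600, 1000)])
suspicious = np.array([[4000, 5], [2500, 12]])     # large amounts in rapid succession
transactions = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)
flags = model.predict(transactions)                # -1 marks an anomaly
print(f"{int((flags == -1).sum())} transactions flagged for review")
```

The quality of such a model rests entirely on the breadth and freshness of the features it sees, which is why data integration, not modelling, is usually the limiting factor.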
More advanced uses, including conversational AI for transactional or advisory support, are emerging cautiously. Academic research suggests such applications may be viable under strict governance, but they remain closely scrutinised due to their regulatory implications.
Platforms, partnerships, and composable banking
Plumery operates in a crowded market of digital banking platforms that position themselves as orchestration layers rather than replacements for core systems. Its strategy includes partnerships designed to fit into wider fintech ecosystems. An example is its integration with open banking infrastructure provider Ozone API, which is pitched as a way for banks to deliver standards-compliant services faster and with less custom development.
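For a rough sense of what “standards-compliant” means in practice, the snippet below sketches a call shaped like the UK Open Banking account-information endpoint that sandbox providers in this space typically expose. The base URL, token, and version are placeholders, and the consent and mutual-TLS steps are omitted; this is not a description of the Plumery–Ozone API integration itself.

```python
# Hedged sketch of an Open Banking (UK AISP-style) accounts request.
# SANDBOX_BASE_URL and ACCESS_TOKEN are placeholders; real calls require a consent
# flow and mutual TLS that this sketch omits.
import uuid
import requests

SANDBOX_BASE_URL = "https://sandbox.example-aspsp.com"   # placeholder host
ACCESS_TOKEN = "<access-token-from-consent-flow>"        # placeholder credential

response = requests.get(
    f"{SANDBOX_BASE_URL}/open-banking/v3.1/aisp/accounts",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "x-fapi-interaction-id": str(uuid.uuid4()),      # tracing header from the standard
        "Accept": "application/json",
    },
    timeout=10,
)
response.raise_for_status()
for account in response.json().get("Data", {}).get("Account", []):
    print(account.get("AccountId"), account.get("Currency"))
```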

This approach mirrors a broader industry trend toward composable architectures. Vendors increasingly promote API-centric platforms that allow banks to plug in AI, analytics, and third-party services on top of existing cores. Analysts generally agree that such models are better suited to incremental innovation than large-scale system overhauls.
Uneven readiness across the sector
Even so, readiness for large-scale AI adoption remains inconsistent. Research from Boston Consulting Group suggests that fewer than one in four banks believe they are truly prepared. The main gaps lie in data foundations, governance frameworks, and operational discipline.
Regulators have responded by encouraging controlled experimentation. In markets like the UK, regulatory sandboxes allow banks to test new technologies, including AI, in supervised environments. These initiatives are designed to balance innovation with accountability and risk management.
The road ahead for operational AI
For vendors such as Plumery, the opportunity lies in bridging the gap between technological ambition and regulatory reality. AI Fabric enters a market where demand for production-ready AI is clear, but where success depends on proving that new tools can be transparent, secure, and compliant.
Whether Plumery’s framework becomes a widely adopted standard remains to be seen. What is clear is that as banks move decisively from experimentation to execution, the focus is shifting toward the underlying architectures that make AI sustainable. Platforms that combine flexibility with strong governance are likely to play a central role in shaping the next phase of digital banking.