SG hospitals struggle to scale AI amidst data and legacy system gaps
Data sharing between public and private providers remains limited, an expert said.
Healthcare providers in Singapore are stepping up efforts to deploy artificial intelligence (AI), but gaps in data and systems, seen across parts of Southeast Asia, are limiting their ability to scale.
A panel of experts at the Healthcare Asia Summit 2026 in Singapore on 25 March said that many providers still lack the foundational infrastructure needed to support AI at scale, despite advances in adoption in areas such as diagnostics and administrative functions.
In some markets, 30% to 40% of prescriptions and invoices remain handwritten, constraining the use of data-driven tools.
“Number one is having digital processes and data in the first place,” said Uli Braun, group chief technology officer at Fullerton Health. “You can’t do any AI if you don’t have that.”
Even where digital systems exist, fragmentation persists: hospitals, clinics, and insurers often operate in silos, with minimal data exchange and no standardised interfaces.
Specifically in Singapore, data sharing between public and private providers remains limited, and a universal patient record is not in place, according to Braun.
Asst Prof. Aung Myint Oo, assistant chief clinical informatics officer at Tan Tock Seng Hospital, said data quality remains a persistent challenge as busy clinicians often struggle with the time-consuming nature of manual entry into electronic medical record systems.
He said organisations are looking at ambient and generative AI to automate clinical documentation and reduce administrative workload.
He also highlighted the need to translate patient-generated data, such as from wearables, into clinical platforms.
“It becomes ‘garbage in, garbage out’,” he said, emphasising that a robust data architecture is essential to ensure AI-driven analytics are fed with clean, reliable information.
“Legacy systems are one part of the challenge, because the data is often not ready for AI to consume. But just as important is choosing the right platform and vendor,” said Aslyn Koh, chief information officer at Thomson Medical Group.
She added that the success of AI pilots in the healthcare sector depends on adopting solutions that can be governed effectively and work in practice.
Clinicians see value in AI tools that reduce administrative workload but remain cautious about their use in clinical decision-making, citing concerns over reliability and accountability.
“When you introduce AI into hospitals, it can mean one of two things: helping reduce workload and burdens or helping with diagnoses,” said Dr Melvyn Chin, associate director of product and solutions management at ECRI. “The second part is where clinicians have a lot of worry.”
Hospitals are also grappling with legacy systems that are difficult to replace, with some providers shifting towards modular approaches that allow new AI tools to integrate with existing infrastructure.
“We have decided to make everything we develop composable,” Braun said, describing efforts to build modular systems that can plug into different processes across markets.
Despite growing interest, most AI initiatives have yet to move beyond the pilot stage. A survey cited during the discussion found that 95% of AI pilots in healthcare fail to scale up, reflecting challenges in data readiness, governance, and user adoption.
Hospitals are responding by introducing policies, monitoring tools and training programmes to guide AI use. Organisations are also placing greater emphasis on involving clinicians and patients early in the development process to improve adoption.
Asst Prof. Oo said clinicians and patients need to be involved early in the development of AI systems, adding that change management is required as many clinicians are used to existing workflows.
He said organisations should also account for future users, noting that whilst current practitioners may resist new tools, the next generation will adopt them. “We need to build the ecosystem now so that it is acceptable when launched,” he added.
Accountability remains a key issue, with providers retaining responsibility for outcomes even when AI tools are used.
“You cannot outsource accountability; it always sits with you,” Braun said, adding that Fullerton Health applies a “human in the loop” approach to its AI systems.
Panellists said governance frameworks must keep pace with rapid advances in AI whilst ensuring systems remain safe and reliable for clinical use.