A turning point for healthcare compliance

At the historic Union League of Philadelphia, senior compliance and risk leaders gathered for the NAVEX + Granite GRC Executive Forum, “Shaping the Future: AI, Risk, and Ethical Leadership in Healthcare Compliance.” 

Unlike traditional conferences, this peer-driven forum was built around facilitated dialogue, exploring how innovation is outpacing oversight and how ethics can anchor organizations during AI-driven transformation. 

The day’s central question: What does effective compliance look like when the rules are still being written?

When regulation lags, leadership leads

In the opening session, facilitators invited attendees to share moments where leadership filled the gap between unclear rules and ethical responsibility. 

Themes emerged quickly: healthcare organizations are facing “tides of change” – from shifting CMS models and AI adoption to workforce reduction and financial pressure. Each sector, from pharmacy to HIM, is being reshaped by rapid innovation and ambiguous oversight. 

Participants discussed reframing conversations around compliance from “are we too conservative or too loose?” to “how much risk are we willing to take – and why?” 

Polls confirmed this tension: 

  • 50% of attendees said their greatest challenge was changing laws and industry expectations 
  • 33% cited AI adoption and technology pace as the next biggest disruptor 

The consensus was that ethical consistency, not regulatory certainty, defines credibility.

From buzzword to oversight imperative

The second session underscored the reality that AI is already embedded across healthcare operations – from billing and diagnostics to HR, vendor management, and training analytics. 

While most organizations are still building policy frameworks, nearly every participant reported that AI is influencing compliance decisions today. The question isn’t whether AI is present – it’s whether its use is explainable, transparent, and governed. 

In one breakout session, leaders discussed how they introduced “acceptable use” policies that define which AI tools can be used, who owns the output, and how that information is documented. Another leader shared how their organization identified every AI system in use and required staff to label any AI-assisted input in patient files – a proactive transparency step others planned to replicate. 

According to live polling: 

  • 73% said patient data protection was their top AI-related risk 
  • 9% each cited vendor transparency and workforce training

Turning awareness into action: The Risk Assessment Workshop

By mid-morning, participants rolled up their sleeves for a hands-on exercise on risk assessment – transforming theory into strategy. Facilitators Melissa Hamlett and Crystal Stalter guided attendees through mapping a realistic use case (such as a new AI monitoring tool or a vendor platform handling PHI) to identify blind spots and draft immediate, medium-term, and long-term fixes. 

Table discussions revealed a shared challenge: many teams conduct risk assessments but fail to operationalize the findings. As one facilitator put it, “You can’t just check the box – you have to act on what the assessment tells you.” 

Participants noted practical steps: 

  • Making AI training part of onboarding 
  • Ensuring acceptable use policies are easy to find and acknowledged 
  • Equating noncompliance risk to tangible financial impact to secure leadership buy-in

The WISeR Model and accelerating regulation

The session on CMS’s WISeR (Wasteful and Inappropriate Service Reduction) Model offered a preview of how regulatory cadence is shifting. Facilitators Cheryl Fahrenholz and Brett Smith walked through how AI and machine learning are now part of official CMS innovation models, focusing on documentation integrity and shared cost savings. 

Attendees discussed how staying compliant increasingly means staying proactive – monitoring proposed regulations, not waiting for final rules. As one participant observed, “We can’t afford to start our response after enforcement begins.” 

Regulators, the group agreed, are signaling a new era: governance over mere compliance.

Vendor and third-party risk emerged as one of the day’s most urgent themes. Nearly 40% of participants identified vendor oversight and third-party risk as their most vulnerable area, with another 40% flagging data security and privacy as their top vendor concern. 

Discussion groups shared practical oversight steps: 

  • Embedding compliance expectations directly into contracts and onboarding processes 
  • Treating vendors as extensions of culture, not just service providers 
  • Performing site visits to verify vendor integrity – one participant described finding a background-check company operating from a repurposed grocery store, illustrating the need for firsthand verification

Cybersecurity: From IT problem to leadership imperative

The final afternoon session framed cybersecurity as a shared responsibility. Facilitators Scott Ashton and Melissa Hamlett emphasized that regulators now view cyber incidents as both compliance and patient safety risks. 

Breakout tables compared gaps: incident response plans rarely tested, vendor questionnaires unchecked, staff awareness declining after onboarding. Participants agreed that cyber governance must evolve from static checklists to a continuous improvement approach. 

The top three readiness priorities included: 

  • Assigning clear leadership ownership of cyber risk 
  • Conducting annual cross-functional tabletop exercises 
  • Monitoring vendor and employee readiness year-round

Looking ahead: Compliance leadership in a time of acceleration

By day’s end, facilitators asked participants to share what they saw as the biggest shift coming for 2026. The poll results were decisive: 

  • 78% pointed to technology and AI governance as the defining challenge 
  • Only 11% chose data privacy and security – suggesting that the field now sees AI as the broader governance lens through which all risks are refracted

In closing, Cheryl Fahrenholz reflected that the most resilient compliance cultures will be those that “measure what matters – not just what’s easy to count.” 

The 2025 Executive Forum made one message clear: in healthcare’s AI-driven future, ethical leadership is the ultimate compliance control.