Health Plans Want More AI. But What Do Members Really Need?

Why automation can’t replace empathy in the housing stability journey

Artificial intelligence is everywhere—redefining workflows, reducing administrative burdens, and powering efficiencies in healthcare at a staggering pace. For health plans, the promise of AI is particularly compelling: faster intake, better data management, predictive modeling, automated outreach.

But when the challenge at hand is housing instability, we have to ask a more grounded question: What do members actually need?

Because while AI can sort, triage, and streamline, it can’t replicate what so many members in crisis truly respond to: empathy, trust, and lived experience.

The Push Toward AI and the Reality on the Ground

Across the industry, there's a race to embed AI into every touchpoint of the member experience. Health plan leaders are being asked to do more with less. AI promises solutions at scale, but without the right guardrails, it risks reducing sensitive, human experiences to transactions.

When a member is facing eviction, couch-surfing with two children, or navigating shelter referrals, their needs extend far beyond what any chatbot or algorithm can parse.

Take one real-world example: a woman quietly navigating cultural pressures as a first-generation immigrant, hesitant to share her full story with her care guide. It wasn’t until that care guide, who also came from an immigrant family, shared a personal detail that the floodgates opened. From there, the support could be matched to the full scope of her situation.

AI can’t improvise that. It can’t connect lived experience to lived experience. And in housing-related care, that type of cultural fluency makes the difference between surface-level help and real stability.

When Technology Helps and When It Hurts

Let’s be clear: AI has real potential to support the housing navigation process. Used wisely, it can:

  • Surface real-time bed availability, like a smart parking system alleviating the stress of unanswered calls or outdated shelter lists.
  • Auto-populate applications using previously gathered client information, saving members from repeating their trauma during every intake.
  • Streamline backend documentation, freeing up care guides to be fully present, not buried in screens.

But when AI crosses the line from supporting to replacing the human experience, the results can be counterproductive or even dangerous.

A care guide shared the story of a member struggling with schizophrenia, terrified of surveillance and wary of phones. The presence of AI, or even a smartphone, escalated fear. In that moment, no chatbot could help.

Members in crisis don't need automation. They need emotional regulation, human grounding, and cultural resonance. And those only come from people.

Why “Efficiency” Isn’t Always the Right Metric

Members experiencing housing instability often come from backgrounds of systemic disadvantage. Many haven’t had access to technology or even the chance to develop basic digital literacy.

Some don’t know how to use a smartphone. Others have never touched a laptop. And more than a few have thrown their phones across the room when met with a chatbot.

Efficiency is important, but context matters more. When someone is in survival mode, facing eviction, untreated mental illness, or past system failures, a machine-led interface can feel cold, alienating, or even threatening.

In these scenarios, the wrong tool, even if well-intentioned, doesn’t just fall short. It actively drives members away.

The Role Health Plans Can Play

The future isn't a choice between AI and humans. It's knowing when and where each is most effective.

Health plans are uniquely positioned to design systems that reflect that nuance. The path forward includes:

  • Using AI behind the scenes to reduce wait times, optimize logistics, and surface follow-ups.
  • Keeping human connection at the front when emotional regulation, cultural nuance, or rapport-building is essential.
  • Investing in direct field experience for leadership teams. One day spent shadowing a care guide could reshape how AI is deployed across the entire organization.

The Bottom Line for Health Plans

Before integrating AI into sensitive care journeys, ask:

  • Does this tool enhance or erode emotional safety?
  • Is it making our care guides more effective or replacing what only they can offer?
  • Are we designing with real-life member experiences in mind or prioritizing operational ease?

There's no doubt that AI has a meaningful role to play. But in the fight for housing stability, empathy isn't a bonus feature; it's the core deliverable. Technology should lift the human work, not replace it.

What Members Deserve and What Plans Can Build

Members deserve a system that meets the full complexity of their lives. Housing instability isn’t solved by faster forms or better dashboards alone. It requires continuity, emotional intelligence, and grounded presence.

Technology can help us get there, but only if it's built with human experience at the center. Because in this space, the wrong automation doesn't just frustrate. It fractures safety. And that damage is difficult to repair.

Health plans have the resources to design systems that reflect these realities. When they choose to invest in both infrastructure and human capacity, they build trust, drive better outcomes, and meet the real moment members are living in.
