Findings and Next Steps for AI Adoption in Utah

April 2026 | AI Across America

Based on the December 2025 AI Across America Workshop, conducted in partnership with the State of Utah.

What the Workshop Revealed

The December 1 workshop brought together 120 participants across 15 tables, mixing domain experts, technical mentors, policy specialists, and facilitators to solve real AI implementation problems. The workshop produced 8 concrete implementation plans, with 4 already built or deployed within weeks of the event. Three tables did not produce implementation plans but provided equally valuable data about where adoption efforts encounter barriers and what conditions are necessary for success.

Beyond the individual outcomes, the workshop surfaced broader findings about Utah's AI ecosystem:

Peer learning outperforms traditional training. Participants reported learning more in six hours of structured problem-solving with mixed-expertise tables than in months of self-study, webinars, or conferences. Real-world examples from practitioners resonated far more than vendor presentations or theoretical benefits.

The ecosystem has significant assets worth replicating nationally. Utah has a permissive policy environment, substantial state government deployment experience, pockets of deep practitioner expertise, and strong university R&D infrastructure. The state's #1 ranking in the 2025 AI Readiness Index reflects real capacity.

What's needed is systematic connection. The institutions represented (county libraries, federal agencies, credit unions, independent pharmacies, healthcare systems, legal practices, universities) don't naturally intersect. The workshop forced productive collisions, but further infrastructure is needed to make these connections repeatable rather than serendipitous.

Adoption barriers are primarily human, not technical. Participants consistently emphasized that AI adoption is "80% psychological, 20% technical." Lack of executive clarity, weak organizational buy-in, and compliance uncertainty outweigh technology limitations.

Building on What Worked

The workshop demonstrated a model that works. The question now is how Utah augments its existing assets to institutionalize what worked and address the gaps that surfaced.

1. Establish ongoing community infrastructure for AI adopters

The workshop created valuable connections that will dissipate without structural support. Sustaining this community requires an ecosystem approach in which multiple institutions contribute according to their strengths. The needs identified through the workshop are outlined below, with recommendations for how existing organizations could fill each role.


  • Convener and policy connector: The workshop surfaced regulatory uncertainty as a significant barrier to adoption, particularly in healthcare and financial services. Utah businesses need a clear channel to state resources, updated compliance guidance, and confidence that their implementation efforts align with policy direction. The Office of AI Policy (OAIP) is well positioned to serve as convener and policy connector: setting standards, connecting participants to state resources, tracking outcomes, and ensuring the community's work informs regulatory and policy development.


  • Operational coordination: The workshop created a network of participants across sectors who now have shared context, relationships, and in many cases, active implementation projects. Maintaining this network requires ongoing work: regular check-ins with participants, communication channels that keep people connected, programming that brings cohorts back together, and coordination across the other ecosystem partners. Without a dedicated operational home, these connections will fade, and new ones will fail to develop.

    The Nucleus Institute could serve as the primary operational coordinator. Nucleus explicitly focuses on connecting universities, industry, entrepreneurs, and policymakers, with a "Solutions" pillar emphasizing policy development and strategic partnerships. As a neutral body with credibility across sectors, Nucleus could host regular cohorts, maintain the alumni network, and manage communications.


  • Anchor expertise: Workshop tables succeeded when they had access to practitioners who had already integrated AI into their businesses and workflows and could advise on feasibility, tools, and pitfalls. This expertise exists in Utah but is fragmented across institutions. Utah's research universities (University of Utah, Utah State, BYU, UVU) should serve as expertise anchors, providing deep AI implementation and technology transfer knowledge, student talent pipelines, and existing convening capacity.

    University partners could provide or source technical mentors, host sector-specific programming, and connect academic research to practitioner needs. Universities should view their role as a key part of a workforce development feedback loop, adapting curricula and training to meet industry and government workforce needs. Private-sector expertise surfaced by Nucleus, OAIP, or business associations would augment this well.


  • Regional extension: The workshop drew primarily from Salt Lake City and the surrounding communities, but the need extends throughout the state. Reaching businesses in rural Utah and smaller metro areas requires institutions with existing relationships in those communities. Regional chambers of commerce, cooperative extension services, or other regional support institutions could be equipped to extend reach into local business communities that other institutions don't naturally access, particularly outside the Salt Lake metro area.

    Chambers were underrepresented in this workshop despite outreach efforts; a future effort should invest more heavily in chamber partnerships to extend the community's geographic reach. Other regional institutions could also fulfill this role with proper training and support, ranging from social organizations to libraries.

    This model leverages each entity's strengths: OAIP provides policy legitimacy and state connection, Nucleus coordinates operations, universities supply expertise, and chambers extend reach into communities statewide.

2. Track workshop participants as an ongoing cohort

Eight implementation plans emerged from the workshop, with several already progressing. A federal trade agency deployed an automated market intelligence newsletter within days. A county library system has leadership approval and is integrating AI recommendations with its catalog system. An independent pharmacy owner is actively building custom operations tools. A major healthcare system is working through a compliance review for AI-assisted cancer screening outreach.

These participants represent a ready-made peer network. Maintaining contact to track their progress (what's working, what's stalled, what new challenges have emerged) would generate ongoing case study material, surface common barriers worth policy attention, and demonstrate ROI from the workshop model. This tracking function could be lightweight (quarterly check-ins) and could inform OAIP's Learning Lab research agenda and the operations of the business support structures.

3. Bridge the state government's AI experience to the broader business community

The Utah state government has deployed Google Gemini to 22,000+ employees, with approximately half reporting regular use and an estimated 12,000 hours saved weekly. This represents one of the largest public-sector AI deployments in the country and a substantial base of institutional knowledge about what works.

The workshop didn't explicitly connect state employees' implementation experience to Utah business participants, but the opportunity exists. State technical mentors from the Utah Division of Technology Services (DTS) and OAIP brought valuable expertise to their tables. OAIP, in coordination with DTS and the state Chief Information Officer, should formalize this: documenting the state government's implementation learnings in formats useful to local businesses, creating "office hours" or mentorship pathways that connect state AI practitioners to business adopters, and using state deployment data to develop ROI frameworks that help businesses quantify potential returns.

This positions the Utah government not just as a regulator but as a practitioner with lessons to share, reinforcing OAIP's collaborative approach to AI governance.

4. Develop sector-specific compliance guidance through the Learning Lab

Workshop participants in regulated industries (healthcare, legal, financial services) expressed significant uncertainty about which AI tools and use cases meet compliance requirements. This uncertainty functions as a de facto adoption barrier.

The Learning Lab has already produced valuable guidance for mental health therapists. The workshop surfaced additional sectors where similar guidance would accelerate adoption. Clinical applications like cancer screening outreach need clarity on HIPAA-compliant AI use. Credit union participants need guidance on audit trail requirements, data governance, and regulatory approval pathways.

Each of these could become a Learning Lab study area, with workshop participants serving as practitioner advisors.

5. Replicate the workshop model for other Utah contexts

The workshop format (mixed-expertise tables, trained facilitators, structured problem-solving, real implementation plans) produced outcomes that traditional training doesn't achieve. Utah should deploy this model again in several ways: extending beyond Salt Lake City to rural communities where AI adoption barriers may differ, running sector-specific deep dives where a healthcare-only or education-only workshop could address challenges in greater depth, or applying the peer-learning model to state employee cohorts to accelerate adoption among the 50% not yet regularly using available AI tools.

SeedAI is taking this methodology to other states (Louisiana, Tennessee, Georgia, Florida, and Kentucky are scheduled for 2026), but the model isn't exclusive. Utah should run additional cohorts independently or in continued partnership.

Lessons from the Outliers

Three of the fifteen tables did not produce implementation plans. These outcomes represent valuable data about where adoption efforts encounter barriers, both within the workshop format and beyond it.

Some barriers were workshop-specific: tables whose balance of experts and adopters produced philosophical discussion rather than concrete plans, or problem statements that weren't scoped tightly enough for the time available. These are addressable through better table composition and facilitation design.

Other barriers were structural and would persist regardless of format: problems that current AI technology can't yet solve well, organizational constraints (approval chains, budget authority, data access) that require longer timelines to navigate, and compliance uncertainty in regulated industries. These patterns will surface in any adoption effort, not just workshops.

Documenting both types of barriers and distinguishing between them would strengthen future efforts and set realistic expectations for what can be achieved in workshops versus ongoing programming.

Every effort will have components that don't reach desired outcomes. These examples are critical for understanding how adoption works, and they often surface insights that successful outcomes don't. They are successes in their own right and should be cataloged and analyzed.

Conclusion

The AI adoption workshop revealed a Utah AI ecosystem with significant assets: a permissive policy environment, substantial public-sector deployment experience, strong university infrastructure, and pockets of deep practitioner expertise. This picture emerged through both the lead-up and execution of the workshop. The workshop showed that these assets become significantly more valuable when systematically connected, and that the connections require intentional infrastructure rather than serendipity.

Peer learning in mixed-expertise settings produced implementation plans that traditional training formats don't achieve. The barriers that stalled adoption were overwhelmingly human, not technical: gaps in organizational buy-in, compliance uncertainty, and unclear executive direction. And the tables that didn't produce plans were as instructive as those that did, surfacing the boundary conditions and structural constraints that any adoption effort will face.

The path forward should build on what already exists, formalizing the roles and functions identified in the workshop: OAIP as convener and policy connector, Nucleus as operational coordinator, universities as expertise anchors, and regional institutions as statewide extension. Tracking the current cohort's 120 participants and 8 implementation plans will generate ongoing case studies, surface common barriers worth policy attention, and demonstrate ROI from the model. Should the State continue to run workshops like these, that body of case studies and adoption know-how will keep growing. Expanding the Learning Lab's sector-specific compliance guidance, particularly for healthcare and financial services, addresses the uncertainty that currently functions as a de facto adoption barrier across regulated industries.

The workshop model itself is a scalable, replicable output. The underlying insight it validates is that technology diffusion is fundamentally a human coordination problem, and that solving it requires structured spaces where expertise meets real problems. This applies well beyond a single instance and can serve the interests of any state or community seeking to accelerate AI adoption.