
From First Draft to Final Cut: Why Analytics Still Owns the Last Mile
The AI revolution is happening now, but much of its promise remains incomplete. As generative and predictive technologies deliver rapid gains, they also introduce new forms of hidden complexity. For example, AI can create cognitive debt, where quick wins today create more work later for analysts and decision-makers. In this thought-provoking keynote, Donald Farmer explores how analytics professionals and platforms must evolve to meet this challenge. Far from being eclipsed by AI, analytics holds the key to making AI outputs credible, actionable, and aligned with strategic goals. Attendees will learn why analytics vendors are best positioned to “own the last mile” of AI value delivery, bridging the gap between automation and understanding, from rough draft to real impact.
Presented by: Donald Farmer, TreeHive Strategy
Donald Farmer is a seasoned data and analytics strategist, author, and speaker with over 30 years of experience designing data and analytics products. He has led teams at Microsoft and Qlik Technologies and is now the Principal of TreeHive Strategy, where he advises software vendors, enterprises, and investors on data and advanced analytics strategy. In addition to his work at TreeHive, he is a Research Fellow at TDWI and the on-board Futurist for Tranquilla, developing empathetic AI for emotional care.
Donald has a passion for innovation and has worked on some of the leading data technologies in the market, as well as in award-winning startups. He has a diverse background, having worked in fish farming, archaeology, and forestry, but data has always been at the heart of his work.

Empowering Data Stewards: Innovating with Existing Tools for Effective Data Governance
In today’s data-driven world, the role of data stewards is crucial for maintaining data quality and governance. This presentation will explore how to facilitate a successful data stewardship program by innovating with existing tools and resources. Attendees will learn strategies for providing structure and support to data stewards, ensuring they can effectively manage and govern data assets. Key insights will be shared on how to creatively use available tools to solve data governance challenges and drive positive outcomes for the organization, even when working with a small team.
Benefits of the Talk:
• Structured Data Stewardship: Discover methods to establish a clear framework and support system for data stewards.
• Innovative Problem-Solving: Gain insights on how to leverage existing tools and technologies creatively.
• Creating Data Governance Standards: Learn techniques to involve various departments in your organization to create effective data governance standards, especially when working with a small team.
• Program Facilitation: Understand best practices for facilitating and managing a data stewardship program.
Presented by: Frances Stoor, Jackson Financial
Frances Stoor is a seasoned data governance professional with a rich background in software development. Over her impressive 20-year tenure at Jackson Financial, Frances has gained extensive experience in supporting both legacy systems and cloud-based data architecture and management. Currently, Frances serves as the Manager of Data & AI Governance at Jackson, where she plays a pivotal role in setting and executing the organization's data governance roadmap. Her expertise extends to implementing AI governance compliance across the enterprise, ensuring that Jackson maintains the highest standards of data integrity and ethical AI practices. Frances's leadership and innovative approach have been instrumental in bridging the gap between business leaders and IT development teams, fostering a culture of collaboration and excellence in data management.

Data Fluency and Advanced Analysis Skills: How to Actually Develop Them in Your Teams and Yourself
We’ve grown adept at plotting charts and trusting algorithms—but what if the story those tools tell isn’t true?
In this session, Jim Todd shares a real-world case in which a ‘signal’ emerged from the data, and AI—along with compelling graphics, trusted control charts, and advanced statistical models—helped confirm a narrative that was ultimately false. Only critical thinking and business fluency uncovered the truth behind the numbers. Participants will explore how well-intended actions, misleading assumptions, and poor operational definitions can quietly distort even the most polished dashboards. This cautionary tale reveals how common tools can amplify errors when context is ignored—and why cultivating curiosity, skepticism, and domain awareness is more vital than ever. Attendees will leave with strategies for challenging assumptions, elevating data fluency, and approaching analytics as a practice of thinking, not just calculation.
Presented by: Jim Todd, Lake Michigan College
Jim Todd is a dynamic leader with over 20 years of expertise in unlocking the power of data through analytical and critical thinking skills. At Lake Michigan College, he designs innovative training programs and apprenticeships in Data Analysis, Critical Thinking, Lean, and Project Management, empowering professionals to question assumptions and master complex problem-solving. Previously at Whirlpool Corporation, Jim designed and led the Business OpEx program, driving more than $100M in direct, quantifiable benefits over his tenure through rigorous analysis and sequential experimentation. Renowned for his engaging, humor-infused approach, he has trained global teams, certifying Black Belts and Master Black Belts. Jim’s mastery of Lean Six Sigma and the application of the Scientific Method to all processes equips organizations to cut through data noise, ensuring decisions are grounded in clarity, skepticism, and strategic insight.

Move from DIY to Ready for AI
AI and machine learning efforts need access to centralized, high-quality data. However, managing data pipelines in-house often leads to complexity, inefficiencies, and delays that stall AI innovation. This session explores how automation transforms data movement—streamlining ingestion, normalization, and preparation—to fuel AI/ML applications effectively. Discover how automated pipelines, managed data lake services, and rapid deployment models empower teams to move from experimentation to production faster, unlocking real business value.
Join this session to learn:
• Strategies to centralize and normalize data from diverse sources
• How to get reliable, high-quality data from automated pipelines
• Ways to accelerate AI from experimentation to production with scalable data movement
Presented by: James Render, Fivetran
James Render is a Senior Sales Engineer at Fivetran. He has been in the data space for over 12 years, working in analytics, data management, and data integration, and has spent 10 of those years working with enterprise organizations on data infrastructure and integration. During his time at Fivetran, James has partnered with various Fortune 100, Fortune 500, and Global 2000 organizations to onboard and modernize their data movement infrastructure for advanced analytics and generative AI initiatives.



