Roundtable discussions aimed at building long-lasting relationships with other influential leaders across the US

Research-driven thought leadership on people experience design and strategy, tailored to the workforce of 2022

Best-practice collaboration to equip your organization with the people-focused capabilities needed to succeed this year
We'll keep things fairly informal, but will make sure to cover the following:

Introductions and a brief overview of the roundtable format to kick off the conversation.

Group roundtable discussion on when to lean on broad LLMs versus smaller, specialized models to power high-volume AI use cases that genuinely move financial and operational performance while keeping cost and risk under control.

Q&A and best-practice collaboration on turning those insights into an enterprise AI approach that favors right-sized, private models close to your data, so technology and finance teams can track ROI clearly and refine their investments over time.
Wayne Webb is the GM of North America for Dharma-AI, where he leads US market expansion for the company's custom small language model platform serving Fortune 1000 clients in regulated industries. A global technology executive with more than 25 years of leadership across banking, media, and insurance, Wayne has built and led teams of up to 500 people, managed initiatives exceeding $200M, and delivered platforms operating across 80+ countries at organizations including HSBC and Dow Jones.
He is also the Founder and CEO of AIMguide, an enterprise technology consultancy. An MIT-certified CTO, his focus today is helping organizations move beyond one-size-fits-all LLMs toward purpose-built SLMs that deliver higher accuracy, lower cost, and full data sovereignty.

Gabriel is the CEO and founder of Dharma-AI, where he is focused on building cutting-edge Small Language Model products for real-world business applications. An Industrial Engineer from UFRJ with a postgraduate degree from MIT, he brings over 20 years of experience in AI, advanced analytics, technology, and decision-making. Previously, he founded MAiS in 2015 and later sold it to EloGroup, where he served as Chief Data & AI Officer for nearly six years, helping clients operationalize data and AI at scale. Gabriel combines entrepreneurial vision with deep technical leadership to help organizations design, deploy, and govern AI systems with rigor and impact.

Hunter Menton is the Founder & CEO of Executive Insight, where he partners with Fortune 5000 and high-growth technology companies to shape the strategy, adoption, and revenue impact of next-generation technologies. With more than 25 years of leadership across fintech, payments, and enterprise SaaS, Hunter has guided organizations through major inflection points in digital transformation, building revenue engines for global processors, API-first platforms, and category-defining startups.
Today, his work sits at the forefront of enterprise AI evolution, focusing on how small language models and intelligent automation can redefine decision-making, accelerate adoption cycles, and create durable competitive advantage. Hunter is recognized for translating complex innovation into strategic clarity, helping executive teams move from experimentation to scalable, enterprise-wide impact. He also advises founders and CROs on designing revenue architectures that transform AI initiatives into predictable, future-ready growth platforms.

DHARMA is an AI product company with global ambition, focused on building more ethical, economical, and energy-efficient AI solutions. We train Small Language Models (SLMs) for high-impact verticals, delivering reliable performance comparable to or better than the best AI systems in the world.
We design, build, and deploy specialized small language models (SLMs) and industry-specific AI agents that are more efficient, accurate, and secure than one-size-fits-all models. Our platform delivers private, tailored AI trained on your data, with a focus on real-world performance, lower costs, and dramatically lower energy usage, bringing smarter AI into production faster.
