AI Con USA 2025 - MLOps & AIOps
Sunday, June 8
Fundamentals of AI—ICAgile Certification (ICP-FAI)
Certified Data & Analytics Tester (DAU-CDAT)
Monday, June 9
AI Code Generation Lifecycle
This workshop explores the transformative potential of generative AI across the software development lifecycle, giving practitioners hands-on insight into using AI technologies to boost productivity. Participants will learn strategic approaches to translating requirements into code with large language models and see techniques for generating contextually accurate implementations with minimal manual intervention. The session will showcase a case study of an AI-assisted application that intelligently generates both backend services and user-facing displays. Further...
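The abstract does not name a particular model or SDK, so the following is only a rough sketch of the requirements-to-code pattern it describes; it assumes the OpenAI Python client, and the model name and requirement string are placeholders rather than session material.

```python
# Minimal sketch of translating a requirement into code with an LLM.
# Assumes the OpenAI Python SDK (>=1.0); model name and REQUIREMENT are illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

REQUIREMENT = (
    "Expose a REST endpoint /health that returns JSON {'status': 'ok'} "
    "using FastAPI."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "You are a code generator. Return only runnable Python code."},
        {"role": "user",
         "content": f"Implement this requirement:\n{REQUIREMENT}"},
    ],
)

generated_code = response.choices[0].message.content
print(generated_code)  # review before executing or committing the output
```

In practice the generated code would still go through review, tests, and iteration; the point of the sketch is only the requirement-in, implementation-out loop.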
Wednesday, June 11
Using GenAI to Advance Azure AIOps
In today’s rapidly evolving digital landscape, maintaining high service quality is crucial. To achieve this, Microsoft is leveraging cutting-edge technologies such as Generative AI and Large Language Models (LLMs) to enhance service reliability and developer productivity. LLMs excel at understanding and reasoning over large volumes of data. They generalize across diverse tasks and domains, making them invaluable for generating models and insights and for automating intricate tasks. By integrating LLMs and Generative AI into the complex domains of incident response and root cause analysis,...
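As a loose illustration of that idea (not Microsoft's internal tooling), the sketch below assumes an Azure OpenAI deployment and feeds a few placeholder log lines to an LLM to draft a ranked root-cause hypothesis; the endpoint, deployment name, and logs are all hypothetical.

```python
# Illustrative only: summarizing incident telemetry into root-cause hypotheses
# with an LLM. Endpoint, deployment name, and log lines are placeholders.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)

incident_logs = """\
14:02:11 gateway  5xx rate climbed from 0.1% to 7.4%
14:02:40 authsvc  connection pool exhausted (max=200)
14:03:05 authsvc  deployment 'authsvc-v2.3.1' rolled out at 14:00
"""

completion = client.chat.completions.create(
    model="gpt-4o",  # the Azure deployment name; placeholder here
    messages=[
        {"role": "system",
         "content": "You are an SRE assistant. Propose a ranked list of likely "
                    "root causes and the next diagnostic step for each."},
        {"role": "user", "content": incident_logs},
    ],
)

print(completion.choices[0].message.content)
```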
Securing the Foundations of AI: Addressing the Past to Safeguard the Future
AI’s future hinges on an ecosystem built on decades of technical debt, fragmented tools, and opaque processes, creating vulnerabilities that threaten the reliability and security of modern applications. In this talk, Peter will examine how the legacy of open-source numerical computing and software supply chains is influencing AI’s trajectory. Drawing from over a decade of leadership in the Python and scientific computing communities, Peter will share strategies for tackling these challenges: improving transparency in data and dependencies, building curated software stacks, addressing...
Customer Churn Prediction Using MLFlow and Streamlit
This session will guide you through every stage of the machine learning process, from data preprocessing and feature engineering to model training, pipeline construction, and deployment. It begins by preparing and transforming data to ensure it’s ready for model training. Next, you’ll dive into building scalable ML pipelines with MLFlow, where you’ll learn how to track experiments, monitor model performance, and manage version control. These features enable a streamlined and reproducible workflow, empowering both beginner and experienced practitioners to understand and implement...
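A minimal sketch of the experiment-tracking step, assuming the open-source mlflow and scikit-learn packages and a synthetic stand-in for churn data; the experiment name and hyperparameter values are illustrative, not from the session.

```python
# Track one training run with MLflow: parameters, a metric, and the model artifact.
# Synthetic data stands in for real churn records.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

mlflow.set_experiment("churn-prediction-demo")  # illustrative experiment name

with mlflow.start_run():
    C = 0.5
    model = LogisticRegression(C=C, max_iter=1000).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    mlflow.log_param("C", C)                  # hyperparameter for this run
    mlflow.log_metric("accuracy", acc)        # tracked across experiments
    mlflow.sklearn.log_model(model, "model")  # versioned artifact for deployment
```

Runs logged this way can be compared side by side in the MLflow UI (`mlflow ui`), which is what makes the workflow reproducible across experiments.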
Decoding the Black Box: Unlocking LLM Observability
LLMs are revolutionizing AI, but their complexity and black-box nature can make it challenging to understand their behavior and ensure optimal performance. This session will demystify LLM observability, providing practical insights and best practices for gaining visibility into LLM interactions, performance, and security. Learn how to: discover the essential metrics to monitor for LLM health and performance, implement effective logging and tracing strategies to track LLM requests and identify bottlenecks, optimize LLM inference performance using profiling techniques, proactively...
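One way to make the logging-and-tracing idea concrete, independent of any particular observability vendor, is to wrap each LLM call and record a request ID, latency, and payload sizes. The decorator and metric names below are illustrative, and the fake_llm stub stands in for a real model client so the sketch runs without credentials.

```python
# Framework-agnostic sketch: trace each LLM call with an ID, latency, and sizes.
import functools
import logging
import time
import uuid

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("llm.observability")

def traced_llm_call(fn):
    """Log request ID, latency, and prompt/response sizes for an LLM call."""
    @functools.wraps(fn)
    def wrapper(prompt: str, **kwargs):
        request_id = uuid.uuid4().hex[:8]
        start = time.perf_counter()
        response = fn(prompt, **kwargs)
        latency_ms = (time.perf_counter() - start) * 1000
        log.info(
            "request_id=%s latency_ms=%.1f prompt_chars=%d response_chars=%d",
            request_id, latency_ms, len(prompt), len(response),
        )
        return response
    return wrapper

@traced_llm_call
def fake_llm(prompt: str) -> str:
    # Stand-in for a real model client.
    time.sleep(0.05)
    return f"echo: {prompt}"

print(fake_llm("Summarize today's error logs."))
```

The same wrapper pattern extends naturally to logging token counts, error rates, or trace spans once a real client and tracing backend are in place.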