
Confluent Confab
Co-Hosted by Google Detroit


Apache Kafka & AI/ML: Connecting the Dots
Building fully managed AI/ML use cases in the Cloud, from inception to production

Apache Kafka® has become the de facto standard for reliable and scalable streaming infrastructures. AI/Machine Learning and the Apache Kafka ecosystem are a great combination for training and deploying analytic models at scale. AI, Machine Learning, and Deep Learning show up in more and more projects, yet often still feel like buzzwords and hype confined to science projects. See how to connect the dots! How are the two related? How can they be combined to productionize Machine Learning models in mission-critical, scalable, real-time applications?

Wednesday, November 20th, 2019
11:30 am - 3:30 pm

Google Detroit
52 Henry St
Detroit, MI 48201

11:30 - 12:15 PM – Lunch & Networking, Hosted by Google
12:15 - 1:00 PM – Welcome: Multi/Hybrid Cloud Strategy for AI/ML: Confluent & Google
1:00 - 3:10 PM – Apache Kafka and AI/Machine Learning in the Cloud – Let’s Connect the Dots
3:10 - 3:30 PM – Wrap Up: White Board Ideas, Q&A

Agenda Details

  • See how to converge the best-of-breed tools used by your Data Science teams, living in silos across your enterprise, into a central nervous system running in the Cloud that can be fully managed and automated to deliver your data sources, enabling real-time AI/ML/DL use cases with Apache Kafka.
  • Deep dive into the process, with hands-on guidance for Citizen Data Roles in your organization to train and deploy AI/ML models using notebooks, Python, Machine Learning / Deep Learning frameworks such as TensorFlow, Kubeflow, DeepLearning4J, and H2O, and the Apache Kafka data pipeline ecosystem for Cloud / Kubernetes.
  • A live demo in GCP that shows how to build a mission-critical Machine Learning environment leveraging different Kafka components:
    • Kafka messaging and Kafka Connect for data movement from and into different sources and sinks
    • Kafka Streams for model deployment, pre-processing and inference in real time
    • KSQL for real time predictions and alerts
  • Showcase of production use case examples in Automotive, FinServ/Insurance, Retail & Healthcare
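The demo's per-record flow (consume an event, pre-process it, run model inference, emit an alert) can be sketched in plain Python. This is only an illustrative stand-in: the field names, threshold, and scoring function are hypothetical, and a real deployment would run this logic inside a Kafka Streams processor or a Kafka consumer with a trained TensorFlow/H2O model rather than the toy scorer below.

```python
import json
from typing import Optional

# Hypothetical feature fields and alert threshold (not from the demo itself).
FEATURES = ["speed", "temperature"]
ALERT_THRESHOLD = 0.8

def preprocess(event: dict) -> list:
    """Extract the numeric feature vector from a raw event."""
    return [float(event[f]) for f in FEATURES]

def predict(features: list) -> float:
    """Stand-in for model inference (e.g. a TensorFlow model embedded in a
    Kafka Streams processor); returns an anomaly score between 0 and 1."""
    return min(1.0, sum(features) / 300.0)

def handle_record(value: bytes) -> Optional[dict]:
    """Per-record logic: deserialize, preprocess, infer, and build an alert
    for the output topic when the score crosses the threshold."""
    event = json.loads(value)
    score = predict(preprocess(event))
    if score >= ALERT_THRESHOLD:
        return {"device": event["device"], "score": round(score, 2)}
    return None  # below threshold: no alert is produced

# Example record as it might arrive on an input Kafka topic:
record = json.dumps({"device": "car-42", "speed": 180, "temperature": 95}).encode()
alert = handle_record(record)  # → {"device": "car-42", "score": 0.92}
```

In the live demo, the same shape of logic is expressed with Kafka Streams for the stateful processing and KSQL for declarative real-time predictions and alerts.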

Hosted by:


Kai Waehner, Sr. Technology Evangelist Confluent Platform, AI/Machine Learning, Confluent

Kai Waehner works as Technology Evangelist at Confluent. Kai’s main areas of expertise lie within the fields of Big Data Analytics, Machine Learning / Deep Learning, Cloud / Hybrid Architectures, Messaging, Integration, Microservices, Stream Processing, Internet of Things, and Blockchain. He is a regular speaker at international conferences such as Kafka Summit, O’Reilly Software Architecture, and ApacheCon, writes articles for professional journals, and shares his experiences with new technologies on his blog. Contact and references: @KaiWaehner / LinkedIn.

Steve Howard, Sr. Systems Engineer Confluent Platform, Confluent

Sign Up Now



By clicking “sign up” above you understand we will process your personal information in accordance with our Privacy Policy.

By clicking “Sign Up” above, you agree to the Terms of Service, consent to receiving occasional marketing emails from Confluent, and understand that we will process your personal information in accordance with our Privacy Policy.

