Main Conference (September 5th, 2023)
The Future of Application Development
In the ever-evolving landscape of the software industry, the future of application development holds remarkable potential — reshaping the way applications are built and deployed. My talk will take us through the story of the Kappa architecture, stream processing, and how integrating Large Language Models (LLMs) lets you build knowledge into your systems, opening up unprecedented possibilities.
The Kappa architecture, built on top of Apache Kafka, simplifies the data pipeline by treating all data as a continuous stream. Applications built on the Kappa architecture can seamlessly process, analyse, and act upon streams of data in real time. We will explore this along with the other key tenets and principles of the Kappa architecture, uncovering how it enables developers to build highly scalable, fault-tolerant, and low-latency applications that excel at handling rapidly changing data.
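The core idea — the log is the source of truth, and any state can be rebuilt by replaying it — can be sketched without any infrastructure. This is a minimal, library-free illustration (the `Event` type and counter state are invented for the example; a real system would consume a Kafka topic):

```python
from dataclasses import dataclass
from typing import Iterable


@dataclass(frozen=True)
class Event:
    """One immutable record in the append-only log (stands in for a Kafka message)."""
    key: str
    value: int


def materialise(stream: Iterable[Event]) -> dict[str, int]:
    """Derive current state by folding over the event log.

    In a Kappa-style system there is no separate batch layer: every view
    is a function of the stream, and reprocessing means replaying the log
    from offset 0 with new logic.
    """
    state: dict[str, int] = {}
    for event in stream:
        state[event.key] = state.get(event.key, 0) + event.value
    return state


log = [Event("clicks", 1), Event("views", 2), Event("clicks", 4)]
print(materialise(log))  # {'clicks': 5, 'views': 2}
```

Replaying the same log always yields the same state, which is what makes reprocessing and recovery in Kappa systems straightforward.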
With a solid understanding of Kappa and stream processing, we'll quickly get up to speed on game-changing LLMs. They've revolutionised the way we interact with and extract insights from the world around us. AI systems such as ChatGPT, Bard, and Bing possess the remarkable ability to understand language, generate contextual responses, and provide meaningful information by leveraging a vast amount of pre-existing knowledge.
The last part of my talk will focus on how to integrate a Kappa architecture with LLMs, using real-world examples. By the end of the talk, you'll be able to build entirely new systems that treat knowledge and human language as first-class citizens.
In this talk, we will:
- Explore the core principles of the Kappa architecture built on Apache Kafka
- Learn about the benefits of stream processing, such as stateful data processing and simplified architectures with Flink
- Quickly get up-to-speed on what an LLM is and how developers can leverage prompt engineering to "program" the AI
- Learn how to incorporate private and up-to-date data sources into an LLM
- Discover practical use cases for incorporating LLMs into application development, such as natural language understanding, context-aware content generation, and personalised recommendations
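Incorporating private, up-to-date data into an LLM is commonly done by retrieving relevant records and injecting them into the prompt (retrieval augmentation). This is a minimal sketch of that pattern, assuming a naive keyword-overlap retriever; the document list and the prompt template are invented for illustration, and the assembled prompt would be sent to whichever LLM API you use:

```python
def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query.

    A production system would use vector embeddings and a similarity
    search index; plain set intersection keeps the sketch dependency-free.
    """
    terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]


def build_prompt(query: str, context: list[str]) -> str:
    """Inject the retrieved private data into the prompt for the LLM."""
    joined = "\n".join(f"- {c}" for c in context)
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{joined}\n\n"
        f"Question: {query}"
    )


# Hypothetical private records the base model has never seen:
docs = [
    "Invoice 42 was paid on 2023-08-01",
    "The deploy pipeline runs nightly",
    "Invoice 42 totals 199 EUR",
]
query = "What is the total of invoice 42?"
prompt = build_prompt(query, retrieve("total of invoice 42", docs))
print(prompt)
```

Because the relevant records are supplied at query time, the model can answer from data that is both private and newer than its training cut-off, without any fine-tuning.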