Event-driven architecture (EDA) is a design principle in which the flow of a system’s operations is driven by the occurrence of events rather than by direct communication between services or components. There are many reasons why EDA is a standard architecture at moderate-to-large companies. It offers a history of events with the ability to rewind, and it enables real-time data processing in a scalable, fault-tolerant way. It provides real-time extract-transform-load (ETL) capabilities for near-instantaneous processing. EDA can serve as the communication channel in a microservice architecture, or in any other architecture.
In this workshop, we will discuss the prevalent principles of EDA, and you will gain hands-on experience applying and running its standard techniques.
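To make the idea concrete, here is a minimal sketch, assuming Apache Kafka as the event broker (the workshop may use a different technology): a service publishes an OrderPlaced event to a topic instead of calling downstream services directly, and consumers react to it independently.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class OrderPlacedPublisher {
    public static void main(String[] args) {
        // Broker address and topic name are illustrative assumptions.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Downstream services (billing, shipping, analytics) subscribe to the
            // topic and react on their own schedule; the publisher never calls them.
            producer.send(new ProducerRecord<>("orders", "order-42",
                    "{\"event\":\"OrderPlaced\",\"orderId\":\"42\"}"));
        }
    }
}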
Java’s evolution is remarkable, and the leap from JDK 17 to JDK 23 brings a wealth of powerful features to elevate your projects. Join us for an exciting session exploring select JEPs (JDK Enhancement Proposals) introduced along the way, diving into their use cases and practical benefits for your work or open-source initiatives.
What You’ll Learn:
How to enable and utilize advanced Java features introduced in JDK 23.
Real-world demonstrations of cutting-edge updates, including:
Flexible constructor bodies: run statements before super() to test invariants without ever constructing an invalid object.
switch expressions and pattern matching: where pattern matching stands today, including how it deals with primitives (see the sketch below).
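As a taste of what we will demonstrate, here is a minimal sketch of both features; the class names are illustrative, and since flexible constructor bodies (JEP 482) and primitive types in patterns (JEP 455) are preview features in JDK 23, it must be compiled and run with --enable-preview.

class Account {
    private final long balanceCents;

    Account(long balanceCents) {
        this.balanceCents = balanceCents;
    }
}

class SavingsAccount extends Account {
    SavingsAccount(long balanceCents) {
        // Flexible constructor bodies: test the invariant before super() runs,
        // so an invalid object is never constructed at all.
        if (balanceCents < 0) {
            throw new IllegalArgumentException("balance must be non-negative");
        }
        super(balanceCents);
    }
}

class PatternDemo {
    // A switch expression with pattern matching and a guarded pattern;
    // JEP 455 extends the same idea to primitive types.
    static String describe(Object value) {
        return switch (value) {
            case String s               -> "text: " + s;
            case Integer i when i > 100 -> "a large number";
            case Integer i              -> "a small number: " + i;
            case null                   -> "nothing";
            default                     -> "something else";
        };
    }
}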
Why Attend?
Learn how to advocate for and implement the latest Java tools and practices in your organization. Gain the knowledge you need to sell the value of next-generation Java and stay at the forefront of software development.
Join us for an in-depth exploration of cutting-edge messaging styles in a large domain. We will discuss the messaging styles you can use in your business.
In this session, we will discuss architectural concerns around security. How do microservices communicate with one another securely? What items belong on your security checklist?
This session will focus on data governance and making data available within your enterprise. Who owns the data, how do we obtain the data, and what does governance look like?
Apache Iceberg is quickly becoming the foundation of the modern Data Lakehouse, offering ACID guarantees, schema evolution, time travel, and multi-engine compatibility over cheap object storage. We’ll work with Iceberg hands-on and show how to build durable, versioned, trustworthy datasets directly from streaming pipelines.
You’ll see Flink writing to Iceberg, Kafka events flowing into governed tables, and how snapshots let you query “what the data looked like yesterday.” We’ll compact, rewind, evolve schemas, roll back mistakes, and even handle CDC-style updates — all in real time and all powered by open source.
Whether you’re building for Data Mesh, Lakehouse, or stream-batch unification, this talk will show you how to use Iceberg to defend your data and enable self-serve, analytical infrastructure at scale.
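As a rough illustration of that flow (not the talk’s actual code), here is a hedged sketch using Flink’s Table API with the Iceberg Flink connector on the classpath; the catalog name, warehouse location, table names, and the pre-declared Kafka-backed events source are all assumptions.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class IcebergStreamingSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Register a Hadoop-backed Iceberg catalog over object storage.
        tEnv.executeSql(
                "CREATE CATALOG lakehouse WITH ("
                + " 'type'='iceberg',"
                + " 'catalog-type'='hadoop',"
                + " 'warehouse'='s3://my-bucket/warehouse')");

        // Continuously write Kafka-sourced events into a governed Iceberg table
        // (assumes an `events` source table has already been declared).
        tEnv.executeSql("INSERT INTO lakehouse.db.orders SELECT * FROM events");

        // Time travel: read the table as of an earlier snapshot timestamp (millis).
        tEnv.executeSql(
                "SELECT * FROM lakehouse.db.orders "
                + "/*+ OPTIONS('as-of-timestamp'='1735689600000') */").print();
    }
}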
Join us for a hands-on workshop, GitOps: From Commit to Deploy, where you’ll explore the entire lifecycle of modern application deployment using GitOps principles.
We’ll begin by committing an application to GitHub and watching as your code is automatically built through Continuous Integration (CI) and undergoes rigorous unit and integration tests. Once your application passes these tests, we’ll build container images that encapsulate your work, making it portable, secure, and deployment-ready. Next, we’ll push these images to a container registry, preparing them for deployment.
Next, you will learn how to sync your application to a staging Kubernetes cluster using ArgoCD, a powerful continuous delivery (CD) tool that automates and streamlines the deployment process. Finally, we’ll demonstrate a canary deployment in a production environment with ArgoCD, allowing for safe, gradual rollouts that minimize risk.
By the end of this workshop, you’ll have practical experience with the tools and techniques behind GitOps deployments, so you can take this knowledge back and set up your own deployments at work.
This workshop will explore the principles of the Ports and Adapters pattern (also called the Hexagonal Architecture) and demonstrate how to refactor legacy code or design new systems using this approach. You’ll learn how to organize your domain logic and move UI and infrastructure code into appropriate places within the architecture. The session will also cover practical refactoring techniques using IntelliJ and how to apply Domain-Driven Design (DDD) principles to ensure your system is scalable, maintainable, and well-structured.
What is Hexagonal Architecture?
Understand the fundamental principles of Hexagonal Architecture, which isolates the core business logic (the domain) from external systems like databases, message queues, and user interfaces. The architecture is designed so that external components can be modified easily without affecting the domain.
What are Ports and Adapters?
Learn the key concepts of Ports and Adapters, the core elements of Hexagonal Architecture. Ports define the interface through which the domain interacts with the outside world, while Adapters implement these interfaces and communicate with external systems.
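For a sense of how this looks in code, here is a minimal sketch with illustrative names (each type would normally live in its own package): the domain owns the port interface, and an infrastructure adapter implements it.

// Domain layer: a driven (outbound) port plus the business logic that uses it.
record Order(String id, long totalCents) {}

interface OrderRepository {
    void save(Order order);
}

class PlaceOrderService {
    private final OrderRepository orders;

    PlaceOrderService(OrderRepository orders) {
        this.orders = orders;
    }

    void placeOrder(Order order) {
        // Business rules go here; the domain never sees JDBC, HTTP, or messaging APIs.
        orders.save(order);
    }
}

// Infrastructure layer: an adapter that implements the port with a concrete technology.
class JdbcOrderRepository implements OrderRepository {
    @Override
    public void save(Order order) {
        // Persist via JDBC/JPA; swapping this adapter never touches the domain.
    }
}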
Moving Domain Code to Its Appropriate Location:
Refactor your domain code to ensure it is correctly placed in the core domain layer. You will learn how to separate domain logic from external dependencies, ensuring that business rules are isolated and unaffected by user interface or infrastructure changes.
Moving UI Code to Its Appropriate Location:
Discover how to refactor UI code by decoupling it from the domain logic and placing it in the appropriate layers. You’ll learn how to use the Ports and Adapters pattern to allow the user interface to communicate with the domain without violating architectural boundaries.
Using Refactoring Tools in IntelliJ:
Learn how to use IntelliJ’s powerful refactoring tools to streamline code movement. Techniques such as Extract Method, Move Method, Extract Delegate, and Extract Interface will be applied to refactor your codebase.
Applying DDD Software Principles:
We’ll cover essential Domain-Driven Design principles, such as Value Objects, Entities, Aggregates, and Domain Events.
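As a quick taste (the names are illustrative, not from the workshop materials): a Value Object is immutable and compares by its values, while an Entity is identified by its id and may change over time.

import java.math.BigDecimal;

// Value Object: immutable, validated at construction, equal when its values are equal.
record Money(BigDecimal amount, String currency) {
    Money {
        if (amount.signum() < 0) {
            throw new IllegalArgumentException("amount must be non-negative");
        }
    }
}

// Entity: identity matters; two Customers are the same only if their ids match.
class Customer {
    private final String id;
    private String name;

    Customer(String id, String name) {
        this.id = id;
        this.name = name;
    }

    String id() { return id; }

    void rename(String newName) { this.name = newName; }
}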
Refactoring Techniques:
Learn various refactoring strategies to improve code structure: Extract Method, Move Method, Extract Delegate, Extract Interface, Sprout Method, and Sprout Class.
Verifying Code with ArchUnit:
Enforce consistency and package rules using ArchUnit, a library for verifying the architecture of your codebase. You will learn how to write tests confirming that your project adheres to the desired architectural guidelines, including layer separation and boundaries.
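A minimal sketch of such a test is shown below; the package names are assumptions and should be adjusted to your project layout.

import com.tngtech.archunit.core.domain.JavaClasses;
import com.tngtech.archunit.core.importer.ClassFileImporter;
import com.tngtech.archunit.lang.ArchRule;
import org.junit.jupiter.api.Test;

import static com.tngtech.archunit.lang.syntax.ArchRuleDefinition.noClasses;

class ArchitectureTest {

    @Test
    void domainMustNotDependOnAdapters() {
        JavaClasses classes = new ClassFileImporter().importPackages("com.example");

        // The domain may not reach out to adapter or infrastructure packages.
        ArchRule rule = noClasses()
                .that().resideInAPackage("..domain..")
                .should().dependOnClassesThat()
                .resideInAnyPackage("..adapter..", "..infrastructure..");

        rule.check(classes);
    }
}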
This workshop is perfect for developers who want to improve their understanding of Ports and Adapters Architecture, apply effective refactoring techniques, and leverage DDD principles for designing scalable and maintainable systems.
We take a look at another facet of architectural design: how we develop and maintain transactions within an architecture. Here we will discuss some common patterns for handling transactions.
MLOps is a mix of Machine Learning and Operations. It is the new frontier for those interested in or knowledgeable about both of these disciplines. MLOps supports the operationalization of machine learning models developed by data scientists and delivers the model for processing via streaming or batch operations. Operationalizing machine learning models means nurturing your models and data from notebook to deployment through pipelines.
In this workshop, we will describe the processes:
Some of the technologies we will discover include:
Our exercises will include running and understanding MLflow.
If you build your Scala application through Test-Driven Development, you’ll quickly see the advantages of testing before you write production code. This hands-on book shows you how to create tests with ScalaTest and Specs2, two of the best testing frameworks available, and how to run your tests in the Simple Build Tool (SBT), which is designed specifically for Scala projects.
By building a sample digital jukebox application, you’ll discover how to isolate your tests from large subsystems and networks with mocking code, and how to use the ScalaCheck library for automated specification-based testing. If you’re familiar with Scala, Ruby, or Python, this book is for you.