Daniel Hinojosa

Independent Consultant

Daniel is a programmer, consultant, instructor, speaker, and recent author. With over 20 years of experience, he does work for private, educational, and government institutions. He is also currently a speaker on the No Fluff Just Stuff tour. Daniel loves JVM languages like Java, Groovy, and Scala, but he also dabbles with non-JVM languages like Haskell, Ruby, Python, Lisp, C, and C++. He is an avid Pomodoro Technique practitioner and makes every attempt to learn a new programming language every year. For downtime, he enjoys reading, swimming, Legos, football, and barbecuing.

Presentations

Event Driven Architecture

9:00 AM MDT

Event-driven architecture (EDA) is a design principle in which the flow of a system’s operations is driven by the occurrence of events instead of direct communication between services or components. There are many reasons why EDA is a standard architecture for many moderate to large companies. It offers a history of events with the ability to rewind, as well as the ability to perform real-time data processing in a scalable and fault-tolerant way. It provides real-time extract-transform-load (ETL) capabilities for near-instantaneous processing. EDA can serve as the communication channel for microservice architectures or for any other architecture.

In this workshop, we will discuss the prevalent principles of EDA, and you will gain hands-on experience applying and running the standard techniques.

  • Key Concepts of Event-Driven Architecture
  • Event Sourcing
  • Event Streaming
  • Multi-tenant Event-Driven Systems
  • Producers, Consumers
  • Microservice Boundaries
  • Stream vs. Table
  • Event Notification
  • Event Carried State Transfer
  • Domain Events
  • Tying EDA to Domain Driven Design
  • Materialized Views
  • Outbox Pattern
  • CQRS (Command Query Responsibility Segregation)
  • Saga Pattern (Choreography and Orchestrator)
  • Avoiding Coupling
  • Monitoring Systems
  • Cloud-Based EDA
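
To ground the producer and consumer roles ahead of the hands-on portion, here is a minimal sketch using the Apache Kafka Java client: one service publishes an OrderPlaced event and a separate service reacts to it. The broker address, topic name, and JSON payload are illustrative assumptions rather than workshop requirements.

    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    // Sketch: an order service emits an event; a billing service reacts to it later.
    // Broker address, topic name, and payload are assumptions for illustration only.
    public class OrderEventsSketch {
        public static void main(String[] args) {
            Properties producerProps = new Properties();
            producerProps.put("bootstrap.servers", "localhost:9092");
            producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            // Producer: record that something happened; no direct call to any consumer
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
                producer.send(new ProducerRecord<>("orders", "order-42",
                        "{\"event\":\"OrderPlaced\",\"orderId\":42}"));
            }

            Properties consumerProps = new Properties();
            consumerProps.put("bootstrap.servers", "localhost:9092");
            consumerProps.put("group.id", "billing-service");
            consumerProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            consumerProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            consumerProps.put("auto.offset.reset", "earliest");

            // Consumer: a separate service subscribes and reacts on its own schedule
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
                consumer.subscribe(List.of("orders"));
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                records.forEach(r -> System.out.printf("billing saw %s -> %s%n", r.key(), r.value()));
            }
        }
    }

The decoupling is the point: the producer records only that something happened, and any number of consumers can react on their own schedule, including consumers added long after the event was written.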

Stay Ahead with Java’s Latest Features!

You can’t rest one minute. Java is always advancing!

8:30 AM MDT

Java’s evolution is remarkable, and the leap from JDK 17 to the current version brings a wealth of powerful features to elevate your projects. Join us for an exciting session to explore select JEPs (JDK Enhancement Proposals) introduced along the way, diving into their use cases and practical benefits for your work or open-source initiatives.

What You’ll Learn:
How to enable and utilize advanced Java features introduced in JDK 23.
Real-world demonstrations of cutting-edge updates, including:

  • Stream Gatherers: Handle complex data streams with ease.
  • Statements Before super(): Validate arguments and check invariants before the superclass constructor runs.
  • Stable Values: Defer initialization while letting the JVM treat the value as a constant.
  • Unnamed Variables and Parameters: Enhance code readability and maintainability.
  • Launch Multi-File Source-Code Programs: Rapidly prototype with multiple source files.
  • Implicitly Declared Classes & Enhanced Main Methods: Streamline application development.
  • Updates on switch Expressions: We will discuss where we are with pattern matching, as well as with primitive types in patterns.
  • I may also sneak in something about my never-ending love for SimpleWebServer.
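
As a preview of two items above, here is a small, hedged sketch of Stream Gatherers and unnamed variables. Gatherers ship as a preview feature in JDK 23 (they are finalized in JDK 24), so running this exact code may require --enable-preview depending on your JDK; the data values are placeholders.

    import java.util.List;
    import java.util.stream.Gatherers;
    import java.util.stream.Stream;

    public class NewFeaturesSketch {
        public static void main(String[] args) {
            // Stream Gatherers: group a stream into fixed-size windows
            // (preview in JDK 23 via --enable-preview; finalized in JDK 24)
            List<List<Integer>> windows = Stream.of(1, 2, 3, 4, 5, 6, 7)
                    .gather(Gatherers.windowFixed(3))
                    .toList();
            System.out.println(windows);   // [[1, 2, 3], [4, 5, 6], [7]]

            // Unnamed variables: use _ when a binding is required but never read
            var entries = List.of("a:1", "b:2");
            for (var _ : entries) {
                System.out.println("saw an entry");
            }
        }
    }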

Why Attend?
Learn how to advocate for and implement the latest Java tools and practices at your organization. Gain the knowledge you need to sell the value of next-generation Java and stay at the forefront of software development.

Architectural Patterns Focus: Event-Driven Architecture & Messaging

10:30 AM MDT

Join us for an in-depth exploration of cutting-edge messaging styles in your large domain.

Here, we will discuss the messaging styles you can use in your business.

  • Event Sourcing
  • Event-Driven Architecture
  • Claim Check
  • Event Notification
  • Event Carried State Transfer
  • Domain Events
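
To show how the last three styles in the list differ in practice, here is a hedged sketch using plain Java records; the record and field names are illustrative, not a prescribed schema.

    // Illustrative only: record and field names are assumptions, not a prescribed schema.

    // Event Notification: a thin event; consumers call back to the owning service for details.
    record CustomerChangedNotification(String customerId) { }

    // Event-Carried State Transfer: the event carries enough state that consumers need not
    // call back, at the cost of a larger payload and more coupling to its shape.
    record CustomerChangedEvent(String customerId,
                                String fullName,
                                String email,
                                String shippingAddress) { }

    // Domain Event: named in the language of the domain, describing something that happened.
    record InvoicePastDue(String invoiceId, java.time.LocalDate dueDate) { }

Event sourcing goes further still: the events themselves become the system of record from which current state is rebuilt.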

Architectural Patterns Focus: Security

1:00 PM MDT

In this session, we will discuss architectural concerns regarding security. How do microservices communicate with one another securely? What are some of the checklist items that you need?

  • Valet Key
  • mTLS and Sidecars
  • Public Key Infrastructure (PKI)
  • SASL
  • HashiCorp Vault Keys
  • SBOMs
  • JSON Web Tokens
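
As one concrete checklist item, verifying a JSON Web Token’s signature can be sketched with nothing beyond the JDK. This assumes an HS256-signed token and a shared secret; a production system would use a vetted JWT library and also validate claims such as expiry, issuer, and audience.

    import javax.crypto.Mac;
    import javax.crypto.spec.SecretKeySpec;
    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.util.Base64;

    // Sketch: verify the signature of an HS256-signed JWT using only the JDK.
    // In practice, prefer a maintained JWT library and also validate exp, iss, and aud claims.
    public class JwtSignatureCheck {
        static boolean signatureMatches(String jwt, byte[] sharedSecret) throws Exception {
            String[] parts = jwt.split("\\.");          // header.payload.signature
            if (parts.length != 3) return false;

            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(sharedSecret, "HmacSHA256"));
            byte[] expected = mac.doFinal((parts[0] + "." + parts[1]).getBytes(StandardCharsets.US_ASCII));
            byte[] provided = Base64.getUrlDecoder().decode(parts[2]);

            // Constant-time comparison avoids leaking information through timing
            return MessageDigest.isEqual(expected, provided);
        }
    }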

Architectural Patterns Focus: Data

3:00 PM MDT

This session will focus on data governance and making data available within your enterprise. Who owns the data, how do we obtain the data, and what does governance look like?

  • CQRS
  • Materialized Views
  • Warehousing vs Data Mesh
  • OLAP vs OLTP
  • Pinot, Kafka, and Spark
  • Business Intelligence
  • Making Data Available for ML/AI
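
To make the CQRS and materialized-view discussion concrete, here is a minimal in-memory sketch separating the write model from a read-side view; the class, record, and method names are illustrative assumptions.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Illustrative CQRS split: commands change the write model and emit events;
    // the read side keeps a denormalized, query-friendly view built from those events.
    public class OrderCqrsSketch {

        // Command side (write model)
        record PlaceOrder(String orderId, String customerId, long totalCents) { }
        record OrderPlaced(String orderId, String customerId, long totalCents) { }

        static class OrderCommandHandler {
            OrderPlaced handle(PlaceOrder command) {
                // validate invariants and persist here, then publish the resulting event
                return new OrderPlaced(command.orderId(), command.customerId(), command.totalCents());
            }
        }

        // Query side (a materialized view kept up to date by consuming events)
        static class CustomerSpendView {
            private final Map<String, Long> spendByCustomer = new ConcurrentHashMap<>();

            void on(OrderPlaced event) {
                spendByCustomer.merge(event.customerId(), event.totalCents(), Long::sum);
            }

            long totalSpendCents(String customerId) {
                return spendByCustomer.getOrDefault(customerId, 0L);
            }
        }

        public static void main(String[] args) {
            var handler = new OrderCommandHandler();
            var view = new CustomerSpendView();
            view.on(handler.handle(new PlaceOrder("o-1", "c-7", 2_500)));
            view.on(handler.handle(new PlaceOrder("o-2", "c-7", 1_500)));
            System.out.println(view.totalSpendCents("c-7"));   // 4000
        }
    }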

Gonna Go Back in Time with Apache Iceberg

Data Flux Capacitor Not Included

5:00 PM MDT

Apache Iceberg is quickly becoming the foundation of the modern Data Lakehouse, offering ACID guarantees, schema evolution, time travel, and multi-engine compatibility over cheap object storage. We’ll work with Iceberg hands-on and show how to build durable, versioned, trustworthy datasets directly from streaming pipelines.

You’ll see Flink writing to Iceberg, Kafka events flowing into governed tables, and how snapshots let you query “what the data looked like yesterday.” We’ll compact, rewind, evolve schemas, roll back mistakes, and even handle CDC-style updates — all in real time and all powered by open source.

Whether you’re building for Data Mesh, Lakehouse, or stream-batch unification, this talk will show you how to use Iceberg to defend your data and enable self-serve, analytical infrastructure at scale.

  • Set up a full Lakehouse pipeline with Kafka, Flink, Iceberg, and MinIO (a local, S3-compatible object store)
  • Use Time Travel to query historical snapshots of your data
  • Run Compaction to optimize small files into efficient Parquet chunks
  • Perform Schema Evolution safely with zero downtime
  • Ensure Conflict-Free Streaming Writes with exactly-once guarantees
  • Expire and Roll Back Snapshots to recover from mistakes or manage retention
  • Apply CDC-style Merge/Upserts into Iceberg tables from change logs
  • If time remains, explore engines other than Flink, such as Dremio or Trino
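
As a small taste of the time-travel exercise, here is a hedged sketch of reading an earlier snapshot of an Iceberg table through Flink’s Table API in Java. The catalog name, warehouse location, table, and snapshot id are placeholders, and the exact catalog settings and hint support depend on your Flink and Iceberg versions and the connector jars on your classpath.

    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;

    // Sketch only: catalog name, warehouse location, table, and snapshot id are placeholders.
    public class IcebergTimeTravelSketch {
        public static void main(String[] args) {
            TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inBatchMode());

            // Register an Iceberg catalog backed by object storage (MinIO/S3 in the workshop)
            tEnv.executeSql(
                "CREATE CATALOG lake WITH (" +
                " 'type'='iceberg'," +
                " 'catalog-type'='hadoop'," +
                " 'warehouse'='s3a://warehouse/')");
            tEnv.executeSql("USE CATALOG lake");

            // Current state of the table
            tEnv.executeSql("SELECT count(*) FROM db.events").print();

            // Time travel: read the same table as of an earlier snapshot id
            tEnv.executeSql(
                "SELECT count(*) FROM db.events /*+ OPTIONS('snapshot-id'='5793678112345') */").print();
        }
    }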

GitOps: From Commit to Deploy

9:00 AM MDT

Join us for a hands-on workshop, GitOps: From Commit to Deploy, where you’ll explore the entire lifecycle of modern application deployment using GitOps principles.

We’ll begin by committing an application to GitHub and watching as your code is automatically built through Continuous Integration (CI) and undergoes rigorous unit and integration tests. Once your application passes these tests, we’ll build container images that encapsulate your work, making it portable, secure, and deployment-ready. Then we’ll push these images to a container registry in preparation for deployment.

Next, you will learn how to sync your application in a staging Kubernetes cluster using ArgoCD (CD), a powerful tool that automates and streamlines the deployment process. Finally, we’ll demonstrate a canary deployment in a production environment with ArgoCD, allowing for safe, gradual rollouts that minimize risk.

By the end of this workshop, you’ll have practical experience with the tools and techniques behind GitOps deployments, so you can take this information and set up your own deployments at work.

  • Creating Your Application
  • Running Locally
  • Proper Commits
  • Security Scans
  • Safe Image Creation
  • Image Publishing
  • ArgoCD and Syncing
  • Canary Deployments

GitOps: From Commit to Deploy

11:00 AM MDT

Join us for a hands-on workshop, GitOps: From Commit to Deploy, where you’ll explore the entire lifecycle of modern application deployment using GitOps principles.

We’ll begin by committing an application to GitHub and watching as your code is automatically built through Continuous Integration (CI) and undergoes rigorous unit and integration tests. Once your application passes these tests, we’ll build container images that encapsulate your work, making it portable, secure, and deployment-ready. Then we’ll push these images to a container registry in preparation for deployment.

Next, you will learn how to sync your application in a staging Kubernetes cluster using ArgoCD (CD), a powerful tool that automates and streamlines the deployment process. Finally, we’ll demonstrate a canary deployment in a production environment with ArgoCD, allowing for safe, gradual rollouts that minimize risk.

By the end of this workshop, you’ll have practical experience with the tools and techniques behind GitOps deployments, so you can take this information and set up your own deployments at work.

  • Creating Your Application
  • Running Locally
  • Proper Commits
  • Security Scans
  • Safe Image Creation
  • Image Publishing
  • ArgoCD and Syncing
  • Canary Deployments

Hexagonal Architecture

Putting Code in the Correct Place

1:30 PM MDT

This workshop will explore the principles of the Ports and Adapters pattern (also called Hexagonal Architecture) and demonstrate how to refactor legacy code or design new systems using this approach. You’ll learn how to organize your domain logic and move UI and infrastructure code into appropriate places within the architecture. The session will also cover practical refactoring techniques using IntelliJ and how to apply Domain-Driven Design (DDD) principles to ensure your system is scalable, maintainable, and well-structured.

What You’ll Learn:

  1. What is Hexagonal Architecture?
    Understand the fundamental principles of Hexagonal Architecture, which isolates the core business logic (the domain) from external systems like databases, message queues, or user interfaces. The architecture is designed so that external components can be modified easily without affecting the domain.

  2. What are Ports and Adapters?
    Learn the key concepts of Ports and Adapters, the core elements of Hexagonal Architecture. Ports define the interface through which the domain interacts with the outside world, while Adapters implement these interfaces and communicate with external systems.

  3. Moving Domain Code to Its Appropriate Location:
    Refactor your domain code to ensure it is correctly placed in the core domain layer. You will learn how to separate domain logic from external dependencies, ensuring that business rules are isolated and unaffected by user interface or infrastructure changes.

  4. Moving UI Code to Its Appropriate Location:
    Discover how to refactor UI code by decoupling it from the domain logic and placing it in the appropriate layers. You’ll learn how to use the Ports and Adapters pattern to allow the user interface to communicate with the domain without violating architectural boundaries.

  5. Using Refactoring Tools in IntelliJ:
    Learn how to use IntelliJ’s powerful refactoring tools to streamline code movement. Techniques such as Extract Method, Move Method, Extract Delegate, and Extract Interface will be applied to refactor your codebase.

  6. Applying DDD Software Principles:
    We’ll cover essential Domain-Driven Design principles, such as Value Objects, Entities, Aggregates, and Domain Events.

  7. Refactoring Techniques:
    Learn various refactoring strategies to improve code structure, including Extract Method, Move Method, Extract Delegate, Extract Interface, and Sprout Method and Sprout Class.

  8. Verifying Code with Arch Unit:
    Ensure consistency and enforce package rules using ArchUnit, a tool for verifying the architecture of your codebase. You will learn how to write tests confirming your project adheres to the desired architectural guidelines, including the separation of layers and boundaries (see the sketch following this list).
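
To make items 2 and 8 concrete, the sketch below shows a port owned by the domain, an adapter that implements it, and an ArchUnit rule that keeps the domain from depending on adapters. The package and type names are illustrative, not a prescribed layout.

    import com.tngtech.archunit.core.domain.JavaClasses;
    import com.tngtech.archunit.core.importer.ClassFileImporter;

    import static com.tngtech.archunit.lang.syntax.ArchRuleDefinition.noClasses;

    // Package names and types below are illustrative, not a prescribed layout.
    public class HexagonalSketch {

        // Port: an interface owned by the domain, describing what the domain needs.
        public interface SongCatalog {                          // lives in ..domain..
            java.util.Optional<String> titleFor(String songId);
        }

        // Adapter: lives outside the domain and implements the port against real infrastructure.
        public static class InMemorySongCatalog implements SongCatalog {   // lives in ..adapter..
            private final java.util.Map<String, String> titles = java.util.Map.of("s-1", "Blue Monday");

            @Override
            public java.util.Optional<String> titleFor(String songId) {
                return java.util.Optional.ofNullable(titles.get(songId));
            }
        }

        // ArchUnit check: the domain must not depend on adapters or infrastructure.
        public static void main(String[] args) {
            JavaClasses classes = new ClassFileImporter().importPackages("com.example.jukebox");
            noClasses().that().resideInAPackage("..domain..")
                    .should().dependOnClassesThat()
                    .resideInAnyPackage("..adapter..", "..infrastructure..")
                    .check(classes);
        }
    }

The same rule can run as a unit test, so architectural drift fails the build instead of surfacing in code review.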

Who Should Attend:

This workshop is perfect for developers who want to improve their understanding of Ports and Adapters Architecture, apply effective refactoring techniques, and leverage DDD principles for designing scalable and maintainable systems.

Hexagonal Architecture

Putting Code in the Correct Place

3:15 PM MDT

This workshop will explore the principles of the Ports and Adapters pattern (also called Hexagonal Architecture) and demonstrate how to refactor legacy code or design new systems using this approach. You’ll learn how to organize your domain logic and move UI and infrastructure code into appropriate places within the architecture. The session will also cover practical refactoring techniques using IntelliJ and how to apply Domain-Driven Design (DDD) principles to ensure your system is scalable, maintainable, and well-structured.

What You’ll Learn:

  1. What is Hexagonal Architecture?
    Understand the fundamental principles of Hexagonal Architecture, which isolates the core business logic (the domain) from external systems like databases, message queues, or user interfaces. The architecture is designed so that external components can be modified easily without affecting the domain.

  2. What are Ports and Adapters?
    Learn the key concepts of Ports and Adapters, the core elements of Hexagonal Architecture. Ports define the interface through which the domain interacts with the outside world, while Adapters implement these interfaces and communicate with external systems.

  3. Moving Domain Code to Its Appropriate Location:
    Refactor your domain code to ensure it is correctly placed in the core domain layer. You will learn how to separate domain logic from external dependencies, ensuring that business rules are isolated and unaffected by user interface or infrastructure changes.

  4. Moving UI Code to Its Appropriate Location:
    Discover how to refactor UI code by decoupling it from the domain logic and placing it in the appropriate layers. You’ll learn how to use the Ports and Adapters pattern to allow the user interface to communicate with the domain without violating architectural boundaries.

  5. Using Refactoring Tools in IntelliJ:
    Learn how to use IntelliJ’s powerful refactoring tools to streamline code movement. Techniques such as Extract Method, Move Method, Extract Delegate, and Extract Interface will be applied to refactor your codebase.

  6. Applying DDD Software Principles:
    We’ll cover essential Domain-Driven Design principles, such as Value Objects, Entities, Aggregates, and Domain Events.

  7. Refactoring Techniques:
    Learn various refactoring strategies to improve code structure, including Extract Method, Move Method, Extract Delegate, Extract Interface, and Sprout Method and Sprout Class.

  8. Verifying Code with Arch Unit:
    Ensure consistency and enforce package rules using ArchUnit, a tool for verifying the architecture of your codebase. You will learn how to write tests confirming your project adheres to the desired architectural guidelines, including the separation of layers and boundaries.

Who Should Attend:

This workshop is perfect for developers who want to improve their understanding of Ports and Adapters Architecture, apply effective refactoring techniques, and leverage DDD principles for designing scalable and maintainable systems.

Architectural Patterns Focus: Transactions

5:00 PM MDT

We take a look at another facet of architectural design: how we develop and maintain transactions within an architecture. Here we will discuss some common patterns for transactions.

  • Two-Phase Commit
  • The Problem with 2PC
  • Using Event-Driven Architecture to manage transactions
  • Transactional Outbox
  • Compensating Transaction
  • Optimistic vs Pessimistic Locking
  • TCC (Try-Confirm/Cancel)
  • Saga Orchestrator
  • Saga Choreography
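
The Transactional Outbox is often easiest to see in code: the business row and the outbox row are written in the same local database transaction, and a separate relay later publishes the outbox rows to the broker. The sketch below uses plain JDBC; the table layout, columns, and SQL are illustrative assumptions.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    // Sketch of the Transactional Outbox pattern with plain JDBC. Table names, columns,
    // and SQL are illustrative; a separate relay (a poller or CDC connector) reads the
    // outbox table and publishes its rows to the message broker.
    public class PlaceOrderWithOutbox {

        void placeOrder(Connection connection, String orderId, long totalCents) throws SQLException {
            connection.setAutoCommit(false);
            try (PreparedStatement insertOrder = connection.prepareStatement(
                     "INSERT INTO orders (id, total_cents) VALUES (?, ?)");
                 PreparedStatement insertOutbox = connection.prepareStatement(
                     "INSERT INTO outbox (aggregate_id, event_type, payload) VALUES (?, ?, ?)")) {

                // 1. The business state change
                insertOrder.setString(1, orderId);
                insertOrder.setLong(2, totalCents);
                insertOrder.executeUpdate();

                // 2. The event, written in the same transaction so the two can never diverge
                insertOutbox.setString(1, orderId);
                insertOutbox.setString(2, "OrderPlaced");
                insertOutbox.setString(3, "{\"orderId\":\"" + orderId + "\",\"totalCents\":" + totalCents + "}");
                insertOutbox.executeUpdate();

                connection.commit();   // both rows are committed, or neither is
            } catch (SQLException e) {
                connection.rollback();
                throw e;
            }
        }
    }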

MLOps Half Day Workshop

Delivering Machine Learning at Scale: A Practical Introduction to MLOps

8:30 AM MDT

MLOps is a mix of Machine Learning and Operations. It is the new frontier for those interested in or knowledgeable about both disciplines. MLOps supports the operationalization of machine learning models developed by data scientists and delivers those models for processing via streaming or batch operations. Operationalizing machine learning models means nurturing your data from notebook to deployment through pipelines.

In this workshop, we will describe the processes:

  • Model Development
  • Model Packaging
  • Model Deployment
  • Model Cataloging
  • Model Monitoring
  • Model Maintenance

Some of the technologies we will explore include:

  • Airflow, Kubeflow, MLflow
  • Prometheus & Grafana
  • TensorFlow, XGBoost
  • Serving
  • Hyperparameter Tuning

Our exercises will include running and understanding MLflow.
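
For JVM developers, MLflow tracking can also be exercised from Java. The hedged sketch below assumes the org.mlflow:mlflow-client artifact and a tracking server at http://localhost:5000; the experiment name, parameter, and metric are placeholders, and the exact client surface can vary between MLflow versions (the Python API is the more common path).

    import org.mlflow.api.proto.Service.RunInfo;
    import org.mlflow.api.proto.Service.RunStatus;
    import org.mlflow.tracking.MlflowClient;

    // Hedged sketch: log a run to a local MLflow tracking server from the JVM.
    // Assumes the org.mlflow:mlflow-client artifact and a server at http://localhost:5000;
    // experiment name, parameter, and metric are placeholders, and the client API may
    // differ between MLflow versions.
    public class TrackingSketch {
        public static void main(String[] args) {
            MlflowClient client = new MlflowClient("http://localhost:5000");

            String experimentId = client.createExperiment("workshop-demo");
            RunInfo run = client.createRun(experimentId);

            client.logParam(run.getRunId(), "max_depth", "6");   // a hyperparameter we tried
            client.logMetric(run.getRunId(), "rmse", 0.42);      // how the model scored

            client.setTerminated(run.getRunId(), RunStatus.FINISHED);
        }
    }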

MLOps Half Day Workshop

Delivering Machine Learning at Scale: A Practical Introduction to MLOps

10:30 AM MDT

MLOps is a mix of Machine Learning and Operations. It is the new frontier for those interested in or knowledgeable about both disciplines. MLOps supports the operationalization of machine learning models developed by data scientists and delivers those models for processing via streaming or batch operations. Operationalizing machine learning models means nurturing your data from notebook to deployment through pipelines.

In this workshop, we will describe the processes:

  • Model Development
  • Model Packaging
  • Model Deployment
  • Model Cataloging
  • Model Monitoring
  • Model Maintenance

Some of the technologies we will explore include:

  • Airflow, Kubeflow, MLflow
  • Prometheus & Grafana
  • TensorFlow, XGBoost
  • Serving
  • Hyperparameter Tuning

Our exercises will include running and understanding MLflow.

Books

Testing in Scala

by Daniel Hinojosa

If you build your Scala application through Test-Driven Development, you’ll quickly see the advantages of testing before you write production code. This hands-on book shows you how to create tests with ScalaTest and Specs2, two of the best testing frameworks available, and how to run your tests in the Simple Build Tool (SBT), which is designed specifically for Scala projects.

By building a sample digital jukebox application, you’ll discover how to isolate your tests from large subsystems and networks with mocking code, and how to use the ScalaCheck library for automated specification-based testing. If you’re familiar with Scala, Ruby, or Python, this book is for you.

  • Get an overview of Test-Driven Development
  • Start a simple project with SBT and create tests before you write code
  • Dive into SBT’s basic commands, interactive mode, packaging, and history
  • Use ScalaTest both on the command line and with SBT, and learn how to incorporate JUnit and TestNG
  • Work with the Specs2 framework, including Specification styles, matcher DSLs, and Data Tables
  • Understand mocking by using the Java frameworks EasyMock and Mockito, and the Scala-only framework ScalaMock
  • Automate testing by using ScalaCheck to generate fake data