As a trusted advisor, leader, and collaborator, Rohit applies problem resolution, analytical, and operational skills to all initiatives and develops strategic requirements and solution analysis through all stages of the project life cycle and product readiness to execution.
Rohit excels in designing scalable cloud microservice architectures using Spring Boot and Netflix OSS technologies using AWS and Google clouds. As a Security Ninja, Rohit looks for ways to resolve application security vulnerabilities using ethical hacking and threat modeling. Rohit is excited about architecting cloud technologies using Dockers, REDIS, NGINX, RightScale, RabbitMQ, Apigee, Azul Zing, Actuate BIRT reporting, Chef, Splunk, Rest-Assured, SoapUI, Dynatrace, and EnterpriseDB. In addition, Rohit has developed lambda architecture solutions using Apache Spark, Cassandra, and Camel for real-time analytics and integration projects.
Rohit has done MBA from Babson College in Corporate Entrepreneurship, Masters in Computer Science from Boston University and Harvard University. Rohit is a regular speaker at No Fluff Just Stuff, UberConf, RichWeb, GIDS, and other international conferences.
Rohit loves to connect on http://www.productivecloudinnovation.com.
http://linkedin.com/in/rohit-bhardwaj-cloud or using Twitter at rbhardwaj1.
Over the next three years, the enterprise technology stack will be reshaped by Agentic AI, AI governance platforms, confidential computing, and post-quantum cryptography (PQC)—while sustainability and cost optimization become architectural imperatives.
This keynote gives architects a concrete operating model to turn emerging technologies into trusted, scalable platforms that CIOs and CISOs will approve.
You’ll learn how to design an AI-native enterprise architecture: agentic workflows orchestrated with MCP/LangGraph, retrieval grounded in GraphRAG, governed under ISO/IEC 42001 and the NIST AI RMF, secured with OWASP LLM guardrails and confidential compute, and optimized for both FinOps and GreenOps.
We’ll explore how to measure cost and carbon per request using Software Carbon Intensity (SCI), and how to prepare for a PQC future using FIPS 203/204/205.
The session closes with a 90-day activation plan and a 3-year roadmap template to modernize your EA practice for the intelligent enterprise era.
Agenda
Key Takeaways:
This is a dynamic session exploring the integration of cutting-edge AI technologies into software architecture. This talk provides senior developers and architects with actionable insights on leveraging large language models like ChatGPT to enhance design processes, manage architectural tradeoffs, and achieve scalable, innovative solutions.
Overview of the session
Importance of large language models (LLMs) in software architecture
Introduction to ChatGPT and its relevance for software architects
Part 1:
The Role of Large Language Models in Software Architecture
Understanding the capabilities of LLMs like ChatGPT
Benefits of integrating LLMs in modern software development
Real-world examples of AI-enhanced software architecture
Part 2: Prompt Engineering for Architectural Tasks
Crafting effective prompts for ChatGPT
Strategies for creating precise and effective prompts
Examples of architectural prompts and their impact
Interactive Exercise: Participants craft and test their own prompts
Feedback and discussion on prompt effectiveness
Part 3: Optimizing Requirement Analysis with ChatGPT
Leveraging ChatGPT for requirement analysis and design
Integration of AI in empathizing with client needs and journey mapping
Cost estimations, compliance, security, and performance
Case Study: Using empathy map and customer journey map tools in conjunction with AI
Hands-On Exercise: Requirement analysis and design
Part 4: Managing Architectural Tradeoffs
Defining and understanding architectural tradeoffs
Exploring real-world tradeoff scenarios
Case Study 1: Scalability vs. Flexibility
Case Study 2: Time-to-Market vs. Maintainability
Leveraging AI insights to analyze tradeoffs
Group Discussion and Q&A
Part 5: Best Practices for Integrating AI in Software Architecture
Techniques for gathering and prioritizing project requirements
Aligning architectural decisions with business objectives
Evaluating risks and potential outcomes of tradeoffs
Assessing tools, technologies, and architectural patterns
AI-powered decision support with ChatGPT
Collaborative decision-making and involving stakeholders
Part 6: Achieving Sustainable Innovation
Leveraging tradeoffs to drive innovation and creativity
Recap of key points and takeaways
Panel Discussion with Industry Experts
AI in architectural innovation: ChatGPT in action
Q&A and Open Discussion with the Audience
Conclusion
Recapitulation of key takeaways
Addressing final questions and facilitating discussions with the audience
Highlighting the future of AI and big data with technologies like ChatGPT
In this dynamic talk, we explore the fusion of AI, particularly ChatGPT, with data-intensive architectures. The discussion covers the enhancement of big data processing and storage, the integration of AI in distributed data systems like Hadoop and Spark, and the impact of AI on data privacy and security. Emphasizing AI's role in optimizing big data pipelines, the talk includes real-world case studies, culminating in a forward-looking Q&A session on the future of AI in big data.
This talk delves into the innovative integration of advanced AI models like ChatGPT into data-intensive architectures. It begins with an introduction to the significance of big data in modern business and the role of AI in scaling data solutions. The talk then discusses the challenges and strategies in architecting big data processing and storage systems, highlighting how AI models can enhance data processing efficiency.
A significant portion of the talk is dedicated to exploring distributed data systems and frameworks, such as Apache Hadoop and Spark, and how ChatGPT can be utilized within these frameworks for improved parallel data processing and analysis. The discussion also covers the critical aspects of data privacy and security in big data architectures, especially considering the implications of integrating AI technologies like ChatGPT.
The talk further delves into best practices for managing and optimizing big data pipelines, emphasizing the role of AI in automating data workflow, managing data lineage, and optimizing data partitioning techniques. Real-world case studies are presented to illustrate the successful implementation of AI-enhanced data-intensive architectures in various industries.
Introduction (10 mins)
Part 1: Architecting for Big Data Processing and Storage (25 mins)
Part 2: Distributed Data Systems and Frameworks (25 mins)
Part 3: Handling Data Privacy and Security in Big Data Architectures (20 mins)
Part 4: Best Practices for Managing and Optimizing Big Data Pipelines (20 mins)
Case Studies and Real-World Applications (10 mins)
Conclusion and Q&A (10 mins)
Overall, this talk aims to provide a comprehensive understanding of how AI, especially ChatGPT, can be integrated into data-intensive architectures to enhance big data processing, analysis, and management, preparing attendees to harness AI's potential in their big data endeavors.
Key Takeaways:
“By 2030, 80 percent of heritage financial services firms will go out of business, become commoditized, or exist only formally but not competing effectively”, predicts Gartner.
This session explores the integration of AI, specifically ChatGPT, into cloud adoption frameworks to modernize legacy systems. Learn how to leverage AWS Cloud Adoption Framework (CAF) 3.0, Microsoft Cloud Adoption Framework for Azure, and Google Cloud Adoption Framework to build cloud-native architectures that maximize scalability, flexibility, and security. Designed for architects, technical leads, and senior IT professionals, this talk provides actionable insights and strategies for successful digital transformation.
Attendees will learn how to:
Integrate AI assistants into cloud readiness, migration, and optimization phases.
Use AI to analyze legacy code, auto-generate documentation, and map dependencies.
Employ the AWS CAF 3.0, Microsoft CAF, and Google CAF to guide large-scale migration while balancing security, compliance, and cost.
Design cloud-native architectures powered by continuous learning, resilience, and automation.
Packed with case studies, modernization blueprints, and AI-assisted workflows, this session equips architects and technical leaders to bridge the gap between heritage systems and future-ready enterprises.
Agenda (60–90 minutes)
1 Introduction: Why Legacy Modernization Now (10 min)
The Gartner 2030 prediction and what it means for enterprises.
The rise of AI-augmented modernization.
2 Understanding Cloud Adoption Frameworks (15 min)
Overview of AWS CAF 3.0, Microsoft CAF for Azure, Google CAF.
Common pillars: strategy, governance, people, platform, security, and operations.
Strengths and trade-offs across frameworks.
3 Strategic Role of AI in Legacy Modernization (15 min)
How LLMs augment discovery, documentation, and refactoring.
ChatGPT as a legacy analysis assistant: reading COBOL, PL/SQL, Java monoliths.
AI-driven dependency mapping, test case generation, and modernization playbooks.
4 Steps for Moving Legacy Systems to the Cloud (20 min)
Assessment → Migration Planning → Modernization Execution → Optimization.
Incremental vs. Full Rewrite: decision matrix and hybrid models.
Ensuring compliance, resilience, and audit readiness throughout migration.
5 Designing AI-Ready Cloud-Native Architectures (15 min)
Embedding RAG, microservices, and event-driven architecture.
Leveraging container orchestration (EKS, AKS, GKE) and serverless compute.
Implementing AI observability, MLOps, and data pipelines on cloud.
6 Case Studies & Real-World Transformations (10 min)
BFSI: Mainframe-to-Microservices using AWS CAF + GenAI refactoring.
Manufacturing: SAP modernization using Azure CAF + AI code summarization.
Retail: Omnichannel API modernization with GCP CAF + Copilot GPTs.
7 Best Practices & Roadmap (5 min)
Align modernization with business capability models.
Embed AI governance into CAF workflows.
Build continuous improvement loops through feedback and metrics.
8 Q&A / Wrap-Up (5 min)
Recap core insights.
The future of AI-enhanced cloud adoption and autonomous modernization.
Join us for an immersive journey into the heart of modern cybersecurity challenges. In this groundbreaking talk, we delve into the intricacies of securing your digital assets with a focus on three critical domains: applications, APIs, and Large Language Models (LLMs).
As developers and architects, you understand the paramount importance of safeguarding your systems against evolving threats. Our session offers an exclusive opportunity to explore the industry-standard OWASP Top 10 vulnerabilities tailored specifically to your domain.
Uncover the vulnerabilities lurking within your applications, APIs, and LLMs, and gain invaluable insights into mitigating risks and fortifying your defenses. Through live demonstrations and real-world examples, you'll witness firsthand the impact of security breaches and learn proactive strategies to combat them.
Whether you're a seasoned architect seeking to fortify your organization's security posture or a developer striving to build resilient systems, this talk equips you with the knowledge and tools essential for navigating the complex landscape of cybersecurity.
Agenda
OWASP Top 10 Overview
OWASP Top 10 for Application Security
OWASP Top 10 for API Security
OWASP Top 10 for LLM Applications (Large Language Models)
Q&A and Discussion
Conclusion