Through my decades in programming and management, I've always tried to make time to learn and develop both technical and leadership skills and to share them with others. Regardless of the topic or technology, I believe there is no substitute for the excitement and sense of potential that come from giving others the knowledge they need to accomplish their goals.
In my spare time, I hang out with my wife Anne-Marie, 4 children and 2 small dogs in Cary, North Carolina where I design and conduct trainings and write books. You can find me on LinkedIn (linkedin.com/in/brentlaster), Twitter (@brentclaster) or through my company's website at www.getskillsnow.com.
GitHub Copilot is a generative AI coding tool that helps developers write code faster and more efficiently. This full-day course will give you a comprehensive understanding of the tool's capabilities and how to use it effectively in your day-to-day coding.
In this full-day class, we'll cover the basics of Copilot and provide you with hands-on experience through labs. You'll learn the what, why, and how of Copilot and see how to leverage its generative AI functionality in daily coding tasks across multiple languages. You'll also learn key techniques and best practices for working with Copilot.
IMPORTANT NOTE: To do the labs for this course, you must have a GitHub Copilot subscription. If you do not, you can log into GitHub, go to https://github.com/settings/copilot, and sign up (start a free trial) before the course.
Security is a fundamental concern and requirement in all aspects of software development today, and GitHub is the industry-leading collaboration platform for software development. So it's crucial that anyone working with or in GitHub understands how to use it securely. In this session, author and trainer Brent Laster will provide a brief overview of the key GitHub security features available to you for free and through GitHub Advanced Security.
This session will cover authentication via keys and tokens, guarding your branches and tags with protection rules and rulesets, code scanning, and secret scanning. We’ll also touch on security logging, creating security policies, and security alerts.
In this presentation, trainer and author Brent Laster will discuss the good, the bad, and the ugly of both GitHub Copilot and Codeium as AI coding assistants.
We'll look at the functionality offered, how they integrate in IDEs, the quality of results, cost factors, and other key aspects. The presentation will include demos of both tools in similar contexts.
In this presentation, we'll cover the options, tips, and tricks for using GitHub Copilot to help us identify how to test code, generate tests for existing code, and generate tests before the code is written.
Join global trainer, speaker, and author of the upcoming book Learning GitHub Copilot, Brent Laster, as he presents material on multiple ways to leverage Copilot for testing your code on any platform and framework.
Have you wondered what options GitHub Copilot can provide for helping to not only write your code, but test your code? In this session, we'll examine some key ways that Copilot can support you in ensuring you have the basic testing needs covered. In particular, we'll cover:
Updated! LM Studio is an easy-to-use desktop app for experimenting with local and open-source Large Language Models (LLMs) by hosting them on your own system.
Here are some of the features it provides (quoted from its homepage):
Run LLMs on your laptop, entirely offline
Use models through the in-app Chat UI or an OpenAI compatible local server
Download any compatible model files from HuggingFace repositories
Discover new & noteworthy LLMs in the app's home page
Hugging Face is a community hub focused on creating and sharing AI models. It provides many free and pre-trained models as well as datasets and tools to use with them.
Ollama is a command line tool for downloading, exploring, and using LLMs on your local system.
In this hands-on workshop, we'll cover the basics of getting up and running with LM Studio and Ollama, with hands-on labs where you can use them along with Hugging Face to find, load, and run LLMs, interact with them via chat and Python code, and more!
Join author, trainer and speaker Brent Laster to learn about LM Studio, Hugging Face, Ollama, and Streamlit and how to use them to find and use Large Language Models hosted and running in your own environment. Get hands-on experience with the applications and learn how to DIY your own Gen AI!
Agenda:
Section 1: Introduction
In this section, we'll talk about what LLMs are, learn about basic use of LM Studio to find models and also start to look at huggingface.co.
Lab 1 - Getting familiar with LM Studio and models
Purpose: In this lab, we’ll start to learn about models through working with one in LM Studio.
Section 2: Chatting with LLMs and using their APIs
In this section, we'll learn about how we can chat with an LLM, the different roles involved in chatting, and how to also use API calls from the command line to interact with models.
Lab 2 - Chatting with our model
Purpose: In this lab, we'll see how to load and interact with the model through chat and the terminal.
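As a small preview of the roles idea above: a chat conversation is just an ordered list of messages, each tagged with a role. A minimal sketch in plain Python (the function names here are illustrative, not part of any library):

```python
# A chat is an ordered list of messages. The "system" message steers the
# model's behavior, "user" messages are your prompts, and "assistant"
# messages are the model's prior replies (kept so it remembers context).

def start_chat(system_prompt):
    """Begin a conversation with a system message that steers the model."""
    return [{"role": "system", "content": system_prompt}]

def add_turn(messages, user_text, assistant_text):
    """Record one user/assistant exchange so the model keeps context."""
    messages.append({"role": "user", "content": user_text})
    messages.append({"role": "assistant", "content": assistant_text})
    return messages

chat = start_chat("You are a concise coding assistant.")
add_turn(chat, "What is an LLM?", "A large language model trained on text.")
print([m["role"] for m in chat])  # → ['system', 'user', 'assistant']
```

This same message-list shape is what the chat and API labs send to the model on every request.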
Section 3 - Programming for local models
In this section, we'll look at how to create Python code to interact with LM Studio, along with its lms command-line interface and the lmstudio.js library for JavaScript and TypeScript.
Lab 3 - Coding to LM Studio
Purpose: In this lab, we'll see how to do some simple Python and JavaScript code to interact with the model.
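As the feature list above notes, LM Studio can expose an OpenAI-compatible local server. Here's a minimal stdlib-only Python sketch of calling it; the default port 1234 and the placeholder model name are assumptions, so check the Server settings in your copy of LM Studio:

```python
import json
import urllib.request

# Default LM Studio server address; confirm yours in the app's Server tab.
BASE_URL = "http://localhost:1234/v1"

def build_request(prompt, model="local-model"):
    """Build an OpenAI-style chat-completions payload for the local server."""
    return {
        "model": model,  # LM Studio serves whichever model you've loaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask(prompt):
    """POST the prompt to the local server and return the reply text.
    Requires LM Studio to be running with its server started."""
    data = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions", data=data,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# ask("Explain recursion in one sentence.")  # uncomment with the server running
```

Because the server speaks the OpenAI wire format, the official `openai` client library can also be pointed at it by overriding the base URL.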
Section 4 - Leveraging HuggingFace.co
In this section, we'll look more into the model details and tools for using models that Hugging Face offers, including its transformers library and pipelines.
Lab 4 - Working with models in Hugging Face
Purpose: In this lab, we’ll see how to get more information about, and work directly with, models in Hugging Face.
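To give a feel for the pipelines mentioned above, here's a short sketch of the transformers pipelines API. It assumes `pip install transformers`, and the pipeline call downloads a default pre-trained model on first use, so it's kept inside a function rather than run at import time:

```python
def classify(texts):
    """Run a default sentiment-analysis pipeline over a list of strings.
    Downloads a small pre-trained model from Hugging Face on first use."""
    from transformers import pipeline  # pip install transformers
    clf = pipeline("sentiment-analysis")
    return clf(texts)

def summarize_results(results):
    """Format pipeline output ({'label': ..., 'score': ...} dicts) for display."""
    return [f"{r['label']} ({r['score']:.2f})" for r in results]

# Example (requires transformers installed):
# print(summarize_results(classify(["I love this workshop!"])))
```

A pipeline bundles the tokenizer, model, and post-processing for a task into one call, which is why it's a convenient first step before working with models directly.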
Section 5 - Using Ollama
In this section, we'll learn about how we can use the standalone tool Ollama to get and run LLMs. We'll also talk about multimodal models.
Lab 5 - Using Ollama to run models locally
Purpose: In this lab, we’ll start getting familiar with Ollama, another way to run models locally.
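Besides its command line, Ollama also exposes a small REST API on your machine. A stdlib-only Python sketch follows; the default port 11434 is Ollama's documented default, and the model name "llama3" is an assumption (pull whatever model you prefer first, e.g. `ollama pull llama3`):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

def build_request(prompt, model="llama3"):
    """Payload for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt, model="llama3"):
    """Send a prompt to a locally running Ollama and return the response text."""
    data = json.dumps(build_request(prompt, model)).encode()
    req = urllib.request.Request(OLLAMA_URL, data=data,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

# generate("Why is the sky blue?")  # uncomment with Ollama running
```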
Section 6 - Creating simple UIs for GenAI with Streamlit
In this section, we'll work with Streamlit, a graphical Python library, to see how to quickly and easily create interactive interfaces like chatbots to use with our local LLMs.
Lab 6 - Building a chatbot with Streamlit
Purpose: In this lab, we'll see how to use the Streamlit library to create a simple chatbot with Ollama.
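As a preview of what this lab builds: a Streamlit chat UI is just a short script. This sketch pairs it with Ollama's local REST API; the port and the "llama3" model name are assumptions, and the UI function only does anything when launched via `streamlit run`:

```python
# chatbot.py - minimal sketch; save and launch with: streamlit run chatbot.py
# Assumes Ollama is running locally with the "llama3" model pulled.
import json
import urllib.request

def ask_ollama(messages):
    """Send the chat history to Ollama's /api/chat endpoint; return reply text."""
    payload = {"model": "llama3", "messages": messages, "stream": False}
    req = urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]

def run_app():
    """The Streamlit UI, re-executed top to bottom on every interaction."""
    import streamlit as st  # pip install streamlit
    st.title("Local LLM chatbot")
    if "messages" not in st.session_state:
        st.session_state.messages = []          # chat history across reruns
    for m in st.session_state.messages:         # replay prior turns
        st.chat_message(m["role"]).write(m["content"])
    if prompt := st.chat_input("Ask something..."):
        st.session_state.messages.append({"role": "user", "content": prompt})
        st.chat_message("user").write(prompt)
        reply = ask_ollama(st.session_state.messages)
        st.session_state.messages.append({"role": "assistant", "content": reply})
        st.chat_message("assistant").write(reply)

# In the saved script, end with a bare call so `streamlit run` executes the UI:
# run_app()
```

The key Streamlit idea is that the whole script reruns on each user action, so the conversation lives in `st.session_state` rather than in ordinary variables.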
Professional Git takes a professional approach to learning this massively popular software development tool and provides an up-to-date guide for new users. More than just a development manual, this book helps you get into the Git mindset: extensive discussion of corollaries to traditional systems, as well as considerations unique to Git, helps you draw on existing skills while looking out for, and planning for, the differences. Connected labs and exercises are interspersed at key points to reinforce important concepts and deepen your understanding, and a focus on the practical goes beyond technical tutorials to help you integrate the Git model into your real-world workflow.
Git greatly simplifies the software development cycle, enabling you to create, use, and switch between versions as easily as you switch between files. This book shows you how to harness that power and flexibility to streamline your development cycle.
Git works with the most popular software development tools and is used by almost all of the major technology companies. More than 40 percent of software developers use it as their primary source control tool, and that number continues to grow; the ability to work effectively with Git is rapidly approaching must-have status, and Professional Git is the comprehensive guide you need to get up to speed quickly.