

Llama for Python Programmers

Description

Llama for Python Programmers is designed for programmers who want to leverage the Llama 2 large language model (LLM) and take advantage of the generative artificial intelligence (AI) revolution. In this course, you'll learn how open-source LLMs can run on self-hosted hardware, made possible by techniques such as quantization, using the llama.cpp package. You'll explore how Meta's Llama 2 fits into the larger AI ecosystem and how you can use it to develop Python-based LLM applications. You'll gain hands-on experience with methods such as few-shot prompting and grammars to improve and constrain Llama 2's output, enabling more robust data interchange between Python application code and LLM inference. Lastly, you'll gain insight into the different Llama 2 model variants, how they were trained, and how to interact with these models in Python.

This course does not require a data science or statistics background. It is developed specifically for Python application developers who are interested in integrating generative AI, such as Llama 2, into their work.
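As a taste of the kind of integration the course covers, here is a minimal sketch using the llama-cpp-python bindings to run a quantized Llama 2 model locally, combining a few-shot prompt with a GBNF grammar that constrains the output. The model filename, grammar, and example reviews are illustrative assumptions, not course materials.

```python
# Minimal sketch: run a quantized Llama 2 model locally with llama-cpp-python,
# using a few-shot prompt plus a GBNF grammar to constrain the output.
# Assumes `pip install llama-cpp-python` and a quantized GGUF model file
# downloaded locally (the path below is a placeholder).
from llama_cpp import Llama, LlamaGrammar

# Load the quantized model from disk (path is an assumption for illustration).
llm = Llama(model_path="./llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048, verbose=False)

# A GBNF grammar that forces the completion to be exactly "positive" or "negative".
grammar = LlamaGrammar.from_string('root ::= "positive" | "negative"')

# Few-shot prompt: two labeled examples steer the model toward the desired format.
prompt = (
    "Classify the sentiment of each review.\n"
    "Review: The battery died after a week. Sentiment: negative\n"
    "Review: Setup took two minutes and it just works. Sentiment: positive\n"
    "Review: The screen cracked the first time I dropped it. Sentiment:"
)

# Constrained generation: the grammar guarantees the completion is one of the
# two labels, so Python code can consume it without extra parsing or validation.
result = llm(prompt, max_tokens=8, grammar=grammar)
print(result["choices"][0]["text"].strip())
```

Because the grammar limits what the model is allowed to emit, the Python side can treat the response as structured data rather than free-form text, which is the kind of robust interchange between application code and LLM inference the course emphasizes.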

Language: English

Duration: 1 week

Status: Available

U-M Credit Eligible: No