July 11th, 2024, 11AM ET
[Sponsored Post] This webinar provides a guide to installing Hugging Face transformers, Meta’s Llama 3 weights, and the necessary dependencies for running Llama locally on AMD systems with ROCm™ 6.0.1.
Webinar Topics
- Walk through a basic understanding of transformer models. Introduce LLMs, Hugging Face, and some of the recent work done between AMD and Hugging Face
- How to install the torch and transformers frameworks and their dependencies on AMD Instinct and Radeon GPUs with ROCm 6.1
- Walk through a set of publicly available scripts for running and serving llama2-7b on AMD Instinct™ MI210 and AMD Radeon™ W7800 GPUs on a system with ROCm 6.1 installed
- Share where to find documentation and blog posts from AMD
Following the live coding section, there will be a brief wrap-up to share ROCm resources and a Q&A session with ROCm experts.
Register here.
1 For a full list of Radeon parts supported by ROCm™ software as of 5/1/2024, visit https://rocm.docs.amd.com/en/latest/reference/gpu-arch-specs.html. GD-241