The Architect's Playbook.pdf
anthropics/claude-cookbooks: A collection of notebooks/recipes showcasing some fun and effective ways of using Claude.
How to Become a Claude Architect in 6 Months (Full course)
Anthropic Courses
Learn to build with Claude AI through Anthropic's comprehensive courses and training programs.
Claude Certified Architect – Foundations Certification Exam Guide
I want to become a Claude architect (full course).
Databricks Fine-Tuning: MLflow Sweep Comparison & Fast Model Serving Demo (Llama / Unsloth)
Join Ryan Cicak, Solutions Engineer at Databricks, as he explores the art of fine-tuning models using serverless GPU compute. Discover how to pull models from Hugging Face, fine-tune them with ease, and serve them via API.
What are Large Language Models? | NVIDIA Glossary
Large language models (LLMs) are deep learning algorithms that can recognize, summarize, translate, predict, and generate content using very large datasets. Explore LLM solutions and use cases.
Mastering LLM Techniques: Inference Optimization | NVIDIA Technical Blog
Stacking transformer layers to create large models yields better accuracy, few-shot learning capabilities, and even near-human emergent abilities on a wide range of language tasks.
Mastering LLM Techniques: Customization | NVIDIA Technical Blog
Large language models (LLMs) are becoming an integral tool for businesses to improve their operations, customer interactions, and decision-making processes. However, off-the-shelf LLMs often fall…
The Attention Mechanism in Large Language Models
Check out the latest (and most visual) video on this topic! The Celestial Mechanics of Attention Mechanisms: https://www.youtube.com/watch?v=RFdb2rKAqFw
Introduction - Hugging Face LLM Course
We’re on a journey to advance and democratize artificial intelligence through open source and open science.
How to become a NVIDIA-Certified Associate: Generative AI LLMs (NCA-GENL) | LinkedIn
When NVIDIA announced its Generative AI certification tracks at GTC in March 2024 — the LLM-focused NCA-GENL, as well as the multimodal NCA-GENM — it was clear to me that I wanted to give it a shot, for various reasons: my Google Cloud-certified knowledge from 2020 and 2021 felt outdated (see summar…)
Advanced Generative AI Engineering Pathway (Beta) - Databricks Learning
Create custom model serving endpoints | Databricks Documentation
Learn how to create and configure model serving endpoints that serve custom models.
From Zero to GenAI Hero: Building Your GenAI App with HuggingFace and Databricks | Databricks Blog
A comprehensive guide to building a GenAI app using a HuggingFace model, MLflow, Unity Catalog and Databricks Apps, covering setup, development, and deployment.
AWS Flash - AWS Partner: Generative AI on AWS for Financial Services Industries (Technical) - AWS Skill Builder
Your learning center to build in-demand cloud skills.
AWS Certified AI Practitioner (AIF-C01) – Full Course to PASS the Certification Exam
Prepare for the AWS Certified AI Practitioner Certification and pass! AWS Certified AI Practitioner validates in-demand knowledge of artificial intelligence ...
AWS Certified AI Practitioner
What is Databricks Feature Serving? - Azure Databricks
Feature Serving provides structured data for RAG applications and makes data in the Databricks platform available to applications deployed outside of Databricks.
With Databricks Feature Serving, you can serve structured data for retrieval augmented generation (RAG) applications, as well as features that are required for other applications, such as models served outside of Databricks or any other application that requires features based on data in Unity Catalog.
Databricks Foundation Model APIs - Azure Databricks
This article provides an overview of the Foundation Model APIs in Databricks. It includes requirements for use, supported models, and limitations.
Using the Foundation Model APIs you can:
- Query a generalized LLM to verify a project's validity before investing more resources.
- Query a generalized LLM to create a quick proof of concept for an LLM-based application before investing in training and deploying a custom model.
- Use a foundation model, along with a vector database, to build a chatbot using retrieval augmented generation (RAG).
- Replace proprietary models with open alternatives to optimize for cost and performance.
- Efficiently compare LLMs to find the best candidate for your use case, or swap a production model for a better-performing one.
- Build an LLM application for development or production on top of a scalable, SLA-backed LLM serving solution that can support your production traffic spikes.
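The use cases above all start with the same step: sending a chat request to a serving endpoint. A minimal sketch of how that might look, assuming the OpenAI-compatible interface Databricks exposes for Foundation Model APIs; the `DATABRICKS_TOKEN`/`DATABRICKS_HOST` environment variables and the `databricks-dbrx-instruct` endpoint name are illustrative assumptions, not verified against any particular workspace.

```python
# Sketch: query a Foundation Model API serving endpoint through the
# OpenAI-compatible client. Endpoint name and env vars are hypothetical.
import os
from typing import Any


def build_chat_request(prompt: str,
                       endpoint: str = "databricks-dbrx-instruct") -> dict[str, Any]:
    """Assemble the chat-completions payload for a serving endpoint.

    The serving endpoint name is passed where a model id would normally go.
    """
    return {
        "model": endpoint,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }


def query_endpoint(prompt: str) -> str:
    """Send the request; assumes DATABRICKS_TOKEN and DATABRICKS_HOST are set."""
    from openai import OpenAI  # OpenAI-compatible client works against Databricks
    client = OpenAI(
        api_key=os.environ["DATABRICKS_TOKEN"],
        base_url=f"{os.environ['DATABRICKS_HOST']}/serving-endpoints",
    )
    response = client.chat.completions.create(**build_chat_request(prompt))
    return response.choices[0].message.content


# Build (but do not send) a request, to show the payload shape.
payload = build_chat_request("Summarize RAG in one sentence.")
print(payload["model"])
```

Because the endpoint is OpenAI-compatible, swapping the proof-of-concept model for a fine-tuned or open alternative (use cases above) is just a change of endpoint name.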
Databricks Certified Generative AI Engineer Associate exam guide
Text classification
Batch inference using Foundation Model APIs - Azure Databricks
Learn how to do batch inference using a provisioned throughput endpoint.
Tutorial: Deploy and query a custom model
Learn the overview and basic steps for performing model serving on Databricks.
Fine Tune your LLMs with Mosaic AI Model Training
dbdemos - Databricks Lakehouse demos : Fine Tune your LLMs with Mosaic AI Model Training
LLM Chatbot With Retrieval Augmented Generation (RAG) and DBRX
dbdemos - Databricks Lakehouse demos : LLM Chatbot With Retrieval Augmented Generation (RAG) and DBRX
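The RAG pattern behind this demo reduces to two steps: retrieve the documents most similar to the query, then assemble a prompt grounded in them. A toy sketch of that loop, using bag-of-words vectors and cosine similarity as a stand-in for a real embedding model and a vector database (such as Databricks Vector Search); all document strings here are illustrative.

```python
# Toy RAG sketch: rank documents by cosine similarity to the query,
# then build a context-grounded prompt for an LLM.
import math
import re
from collections import Counter


def embed(text: str) -> Counter:
    """Toy 'embedding': a word-count vector (stand-in for a learned model)."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]


def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble a prompt that grounds the answer in retrieved context."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"


docs = [
    "DBRX is an open LLM developed by Databricks.",
    "Unity Catalog governs data and AI assets.",
    "Feature Serving exposes structured data to RAG applications.",
]
print(build_prompt("What is DBRX?", docs))
```

In the demo above, the toy `retrieve` step is replaced by a vector index query and `build_prompt`'s output is sent to a served model endpoint; the shape of the loop stays the same.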
Tutorials | Databricks
Discover the power of Lakehouse. Install demos in your workspace to quickly access best practices for data ingestion, governance, security, data science and data warehousing.
Databricks Generative AI Cookbook
databricks/genai-cookbook
My Journey towards “Databricks Certified Generative AI Engineer Associate”
My experiences from preparing for and successfully passing the (beta) exam