Certifications

30 bookmarks
Anthropic Courses
Learn to build with Claude AI through Anthropic's comprehensive courses and training programs.
·anthropic.skilljar.com·
What are Large Language Models? | NVIDIA Glossary
Large language models (LLMs) are deep learning algorithms that can recognize, summarize, translate, predict, and generate content using very large datasets. Explore all about LLMs solutions.
·nvidia.com·
Mastering LLM Techniques: Customization | NVIDIA Technical Blog
Large language models (LLMs) are becoming an integral tool for businesses to improve their operations, customer interactions, and decision-making processes. However, off-the-shelf LLMs often fall…
·developer.nvidia.com·
The Attention Mechanism in Large Language Models
Check out the latest (and most visual) video on this topic! The Celestial Mechanics of Attention Mechanisms: https://www.youtube.com/watch?v=RFdb2rKAqFw
·youtube.com·
How to become a NVIDIA-Certified Associate: Generative AI LLMs (NCA-GENL) | LinkedIn
When Nvidia announced its Generative AI certification tracks at GTC in March 2024 — the LLM-focused NCA-GENL, as well as the multimodal NCA-GENM — it was clear to me that I wanted to give it a shot, for various reasons: My Google Cloud-certified knowledge from 2020 and 2021 felt outdated (see summar
·linkedin.com·
What is Databricks Feature Serving? - Azure Databricks
Feature Serving provides structured data for RAG applications and makes data in the Databricks platform available to applications deployed outside of Databricks.
With Databricks Feature Serving, you can serve structured data for retrieval augmented generation (RAG) applications, as well as features that are required for other applications, such as models served outside of Databricks or any other application that requires features based on data in Unity Catalog.
·learn.microsoft.com·
Databricks Foundation Model APIs - Azure Databricks
This article provides an overview of the Foundation Model APIs in Databricks. It includes requirements for use, supported models, and limitations.
Using the Foundation Model APIs you can:
- Query a generalized LLM to verify a project's validity before investing more resources.
- Query a generalized LLM to create a quick proof of concept for an LLM-based application before investing in training and deploying a custom model.
- Use a foundation model, along with a vector database, to build a chatbot using retrieval-augmented generation (RAG).
- Replace proprietary models with open alternatives to optimize for cost and performance.
- Efficiently compare LLMs to find the best candidate for your use case, or swap a production model for a better-performing one.
- Build an LLM application for development or production on top of a scalable, SLA-backed LLM serving solution that can handle production traffic spikes.
·learn.microsoft.com·
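The entry above lists what the Foundation Model APIs can do; as a minimal sketch of the first use case (querying a served LLM), here is how a request might be assembled, assuming Databricks' standard `/serving-endpoints/<name>/invocations` REST route and an OpenAI-style chat payload. The workspace URL, token, and model name are placeholders, not values taken from the bookmarked article.

```python
import json
import urllib.request


def build_chat_request(workspace_url: str, token: str, model: str, prompt: str):
    """Build an HTTP request for a Databricks model serving endpoint.

    Uses the chat-completions payload shape; the caller supplies a
    workspace URL and personal access token.
    """
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }
    return urllib.request.Request(
        f"{workspace_url}/serving-endpoints/{model}/invocations",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )


# Sending the request requires real credentials, so it is shown but not run:
# req = build_chat_request(
#     "https://<workspace>.azuredatabricks.net",  # placeholder workspace
#     "<personal-access-token>",                  # placeholder token
#     "<served-model-name>",                      # placeholder endpoint name
#     "Is RAG a good fit for internal document search?",
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

This only prepares the call; the commented-out lines show how the response would be read back, mirroring the "quick proof of concept before investing in a custom model" workflow the description mentions.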