
AI Glossary

100 essential AI & machine learning terms across 14 categories, explained clearly, from neural networks to RAG, LLMs, and beyond.

C

Context Window

Inference & Generation

The maximum amount of text (measured in tokens) that an AI model can process in a single interaction.

CLIP

Models & Architecture
Contrastive Language-Image Pretraining

A vision-language model that learns shared representations of images and text so they can be compared in the same embedding space.

Computer Vision

Core Concepts

The field of AI focused on enabling machines to interpret and understand images and video.

Convolutional Neural Network

Models & Architecture
CNN, ConvNet

A neural network architecture specialized for processing grid-like data such as images using convolutional filters.

Chain of Thought

Techniques & Methods
CoT

A prompting technique that asks an LLM to reason step by step before giving a final answer, improving complex reasoning.

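As a minimal sketch, zero-shot chain-of-thought prompting just appends a reasoning cue to the question so the model emits intermediate steps before its answer; `cot_prompt` is an illustrative helper, not part of any particular API:

```python
def cot_prompt(question: str) -> str:
    # Appending a reasoning cue elicits intermediate steps
    # before the final answer ("zero-shot" chain of thought).
    return f"{question}\nLet's think step by step."
```

Few-shot variants instead prepend worked examples whose answers include the reasoning.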

Chunking

RAG & Retrieval

The process of splitting long documents into smaller pieces that fit into a language model's context window.

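A common baseline is fixed-size chunking with overlap, so context that straddles a boundary appears in two chunks; `chunk_text` below is a hypothetical helper, sketched at the character level for simplicity:

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    # Slide a fixed-size window over the text; consecutive chunks
    # share `overlap` characters so boundary context is not lost.
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]
```

Production chunkers usually split on tokens, sentences, or document structure rather than raw characters.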

Cosine Similarity

RAG & Retrieval

A metric that measures similarity between two vectors based on the cosine of the angle between them, commonly used for embeddings.

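The metric follows directly from its definition and needs no vector library; this standalone sketch assumes dense, non-zero vectors:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # cos(theta) = (a . b) / (|a| * |b|): 1.0 means same direction,
    # 0.0 means orthogonal, -1.0 means opposite.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

Because it ignores magnitude, two embeddings of different lengths but the same direction score 1.0.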

Cross-Attention

Models & Architecture

An attention mechanism where queries from one sequence attend to keys and values from a different sequence.

Cross-Validation

Evaluation & Metrics

A technique for evaluating model performance by splitting data into multiple folds and testing on each fold in turn.

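The fold construction can be sketched with plain index arithmetic; `k_fold_indices` is an illustrative helper (libraries such as scikit-learn provide equivalents):

```python
def k_fold_indices(n: int, k: int) -> list[tuple[list[int], list[int]]]:
    # Partition indices 0..n-1 into k test folds; each fold's complement
    # is the training set, so every example is tested exactly once.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    indices = list(range(n))
    folds, start = [], 0
    for size in fold_sizes:
        test = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        folds.append((train, test))
        start += size
    return folds
```

In practice the indices are shuffled (or stratified by label) before folding.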

Constitutional AI

Safety & Alignment
CAI

An alignment technique developed by Anthropic where an AI model critiques and revises its own outputs using a set of principles.

T

Transformer

Models & Architecture

The neural network architecture behind most modern AI; it uses attention mechanisms to process sequences in parallel.

Tokenization

Language & Text

The process of splitting text into smaller units (tokens) that a language model can process.

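Real models use learned subword vocabularies (e.g. byte-pair encoding), but a toy word-and-punctuation splitter still shows the idea; `simple_tokenize` is purely illustrative:

```python
import re

def simple_tokenize(text: str) -> list[str]:
    # Toy tokenizer: word runs and individual punctuation marks.
    # Production LLMs instead map text to subword IDs from a learned vocabulary.
    return re.findall(r"\w+|[^\w\s]", text)
```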

Temperature

Inference & Generation

A parameter that controls the randomness of an AI model's outputs: lower values are more deterministic, higher values are more creative.

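Mechanically, temperature divides the logits before the softmax; this standalone sketch shows how the setting reshapes the output distribution:

```python
import math

def softmax_with_temperature(logits: list[float], temperature: float = 1.0) -> list[float]:
    # Dividing logits by T reshapes the distribution:
    # T < 1 sharpens it (more deterministic), T > 1 flattens it (more varied).
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]
```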

Token

Language & Text

The basic unit of text processed by a language model, often representing a word, subword, punctuation mark, or symbol.

Text-to-Image

Applications

AI generation that creates images from natural language prompts.

Text-to-Video

Applications

AI generation that creates video clips from natural language prompts.

Transfer Learning

Training & Learning

The practice of reusing knowledge from a model trained on one task to accelerate learning on a different but related task.

Training Data

Training & Learning
Training Set

The dataset used to teach a machine learning model the patterns it needs to make predictions or generate outputs.

Tree of Thoughts

Techniques & Methods
ToT

An advanced reasoning technique where the model explores multiple reasoning paths in a tree structure before choosing the best.

Top-K Sampling

Inference & Optimization

A text generation strategy that restricts sampling to the K most likely next tokens at each step.

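The strategy can be sketched in a few lines: keep the K highest-logit tokens, renormalize, and sample; `top_k_sample` is an illustrative helper, not a library function:

```python
import math, random

def top_k_sample(logits: list[float], k: int, rng=random) -> int:
    # Keep only the k highest-logit tokens, then sample among them
    # in proportion to their (renormalized) probabilities.
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    m = max(logits[i] for i in top)  # subtract the max for numerical stability
    exps = [math.exp(logits[i] - m) for i in top]
    r = rng.random() * sum(exps)
    running = 0.0
    for i, e in zip(top, exps):
        running += e
        if r <= running:
            return i
    return top[-1]
```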

Top-P Sampling

Inference & Optimization
Nucleus Sampling

A text generation strategy that samples from the smallest set of tokens whose cumulative probability exceeds P.

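Unlike top-K's fixed cutoff, the nucleus grows or shrinks with the shape of the distribution; `top_p_sample` below is a hypothetical sketch of the procedure:

```python
import math, random

def top_p_sample(logits: list[float], p: float, rng=random) -> int:
    # Convert logits to probabilities, then keep the smallest set of
    # highest-probability tokens whose cumulative mass reaches p.
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    ranked = sorted(((e / total, i) for i, e in enumerate(exps)), reverse=True)
    nucleus, cum = [], 0.0
    for prob, i in ranked:
        nucleus.append((prob, i))
        cum += prob
        if cum >= p:
            break
    r = rng.random() * cum  # sample within the nucleus mass (renormalization)
    running = 0.0
    for prob, i in nucleus:
        running += prob
        if r <= running:
            return i
    return nucleus[-1][1]
```

With a sharply peaked distribution the nucleus may contain a single token; with a flat one it can span most of the vocabulary.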