Prompting

Few-Shot Learning

Providing a small number of input-output examples in the prompt to teach the model a task pattern.

Full Definition

Few-shot learning places two to ten demonstrations of a task directly inside the prompt, allowing the model to infer the pattern and apply it to a new input — all without updating any weights. The quality of examples matters enormously: diverse, representative demonstrations outperform redundant ones. Few-shot prompting is powerful because it sidesteps the need for expensive fine-tuning while still adapting a general model to a niche format or domain. It sits between zero-shot (no examples) and fine-tuning (many examples used to update weights) on the effort-versus-performance spectrum.
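The mechanics described above can be sketched as a simple prompt builder: demonstrations are concatenated ahead of the new input, and the model is left to complete the final line. This is an illustrative sketch, not any particular library's API; the field labels and the translation demos are invented for the example.

```python
def few_shot_prompt(instruction, demos, new_input,
                    in_label="Input", out_label="Output"):
    """Build a prompt from a handful of (input, output) demonstration pairs."""
    lines = [instruction, ""]
    for x, y in demos:
        # Each demonstration is rendered in the same format the model
        # should reproduce for the new input.
        lines += [f"{in_label}: {x}", f"{out_label}: {y}", ""]
    # End with the new input and a dangling output label for the model
    # to complete.
    lines += [f"{in_label}: {new_input}", f"{out_label}:"]
    return "\n".join(lines)

demos = [
    ("cat", "chat"),
    ("dog", "chien"),
    ("bird", "oiseau"),
]
print(few_shot_prompt("Translate English to French.", demos, "horse"))
```

The resulting string is what gets sent to the model; no weights change, and swapping in a different set of demonstrations retargets the same model to a different task.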

Examples

1. Showing three (tweet, sentiment) pairs before asking the model to classify a fourth tweet as positive, negative, or neutral.

2. Providing two example SQL queries with their natural-language descriptions before asking the model to write a third query from a new description.
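The second example might look like the following sketch. The table names, column names, and prompt wording are invented for illustration; any real schema would work the same way.

```python
# Two (description, query) demonstrations, then a new description for
# the model to translate. Schema details here are hypothetical.
shots = [
    ("List all customers in France.",
     "SELECT * FROM customers WHERE country = 'France';"),
    ("Count orders placed in 2024.",
     "SELECT COUNT(*) FROM orders WHERE YEAR(order_date) = 2024;"),
]

def build_sql_prompt(description: str) -> str:
    """Assemble a few-shot NL-to-SQL prompt from the demonstrations."""
    parts = ["Translate each description into a SQL query.", ""]
    for desc, query in shots:
        parts += [f"Description: {desc}", f"SQL: {query}", ""]
    # The dangling "SQL:" label invites the model to emit only the query.
    parts += [f"Description: {description}", "SQL:"]
    return "\n".join(parts)

print(build_sql_prompt("Show the ten most recent orders."))
```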

Apply this in your prompts

PromptITIN automatically uses techniques like Few-Shot Learning to build better prompts for you.


Related Terms

Zero-Shot Learning

Asking a model to perform a task it has never seen demonstrated, relying purely on instructions and its pre-trained knowledge.


Instruction Prompting

Directly telling the model what to do using clear imperative commands.


Prompt Template

A reusable prompt structure with placeholders that are filled in at runtime.
