
Zero-Shot Learning

Asking a model to perform a task it has never seen demonstrated, relying purely on instructions.

Full Definition

Zero-shot prompting gives the model only a task description, with no examples. The model must generalise from its pre-training knowledge and instruction tuning to perform the task correctly. Modern large language models are surprisingly capable zero-shot performers because instruction tuning teaches them to follow natural-language task descriptions across a wide variety of domains. Zero-shot prompting is the lowest-effort approach and works well for common NLP tasks (summarisation, translation, classification), but it tends to underperform few-shot approaches on niche formats or unusual output schemas where the model has no prior reference point.
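In practice, a zero-shot prompt is just an instruction concatenated with the input, with no demonstrations. A minimal sketch in Python (the template and function name here are illustrative, not any specific library's API):

```python
def zero_shot_prompt(instruction: str, text: str) -> str:
    """Build a zero-shot prompt: a task description plus the input,
    with no worked examples (contrast with few-shot prompting,
    which would insert input-output demonstrations before the input)."""
    # Hypothetical template; real applications vary the layout freely.
    return f"{instruction}\n\nInput: {text}\nAnswer:"


prompt = zero_shot_prompt(
    "Classify the sentiment of this tweet as positive, negative, or neutral.",
    "I can't believe how slow this app is today.",
)
```

The resulting string is sent to the model as-is; because no examples are included, the model must infer the expected output format from the instruction alone.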

Examples

1. 'Classify the sentiment of this tweet as positive, negative, or neutral: I can't believe how slow this app is today.'

2. 'Translate the following sentence into Swahili: The meeting starts at 9 AM.'

Apply this in your prompts

PromptITIN automatically uses techniques like Zero-Shot Learning to build better prompts for you.


Related Terms

Few-Shot Learning

Providing a small number of input-output examples in the prompt to teach the model the task.


Instruction Prompting

Directly telling the model what to do using clear imperative commands.


Instruction-Tuned Model

A model fine-tuned on instruction-response pairs to follow natural-language directions.
