Prompting

Self-Consistency

Sampling multiple reasoning paths and selecting the most common answer to improve reliability.

Full Definition

Self-consistency is a decoding strategy that generates multiple independent chain-of-thought responses to the same question, then marginalises over the reasoning paths by taking a majority vote on the final answers. Because language model sampling is stochastic, different runs can produce different reasoning chains — some correct, some not. The most frequently occurring answer across N samples is statistically more likely to be correct than any single sample. The technique improves accuracy on arithmetic, commonsense, and symbolic reasoning tasks at the cost of N× the inference budget. It is particularly powerful when combined with temperature sampling.
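The vote over sampled answers can be sketched as follows. This is a minimal illustration, not a model integration: `noisy_solver` is a hypothetical stand-in for one temperature-sampled chain-of-thought run, and `self_consistency` simply tallies the final answers from N independent samples.

```python
import random
from collections import Counter

def self_consistency(sample_answer, n=100, seed=0):
    """Draw n independent samples and return (majority answer, agreement rate)."""
    rng = random.Random(seed)
    answers = [sample_answer(rng) for _ in range(n)]
    majority, count = Counter(answers).most_common(1)[0]
    return majority, count / n

# Hypothetical stand-in for a sampled reasoning chain: a noisy solver that
# reaches the correct answer (18) about 60% of the time and otherwise
# lands on a plausible wrong answer.
def noisy_solver(rng):
    return 18 if rng.random() < 0.6 else rng.choice([16, 17, 20])

answer, agreement = self_consistency(noisy_solver, n=100)
```

Even though any single sample is wrong roughly 40% of the time here, the majority vote across 100 samples recovers the correct answer far more reliably, which is the core statistical effect the technique exploits.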

Examples

1. Generating 20 solutions to a maths word problem at temperature 0.7, then returning the answer that appears most often across solutions.

2. Running a medical diagnosis prompt 10 times and surfacing the diagnosis agreed upon by at least 7 of 10 runs.
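The second example adds an agreement threshold on top of the plain majority vote: the answer is surfaced only if enough runs concur, and otherwise the system abstains. A minimal sketch of that thresholded vote, using made-up diagnosis strings in place of real model outputs:

```python
from collections import Counter

def vote_with_threshold(answers, min_agree=0.7):
    """Return the majority answer only if at least min_agree of runs concur, else None."""
    top, count = Counter(answers).most_common(1)[0]
    return top if count / len(answers) >= min_agree else None

# Hypothetical final answers from 10 independent runs of the same prompt.
runs = ["flu", "flu", "flu", "cold", "flu", "flu", "flu", "cold", "flu", "flu"]

confident = vote_with_threshold(runs)        # 8/10 agree, meets the 70% bar
uncertain = vote_with_threshold(runs, 0.9)   # 8/10 falls short of a 90% bar
```

Returning `None` below the threshold is one way to route low-agreement cases to a fallback (e.g. human review) rather than reporting a weakly supported answer.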


Related Terms

Chain-of-Thought

A prompting technique that asks the model to reason step-by-step before giving a final answer.


Temperature

A sampling parameter that controls the randomness and creativity of model output.


Tree of Thoughts

A framework that explores multiple reasoning branches in parallel and selects the most promising one.
