How to fact-check and eliminate hallucination in your search results (Perplexity, ChatGPT, and Gemini)
Boost your confidence in using AI search
A new fact-check capability
Perplexity has just released a new feature called "Check Sources" that helps address the most common concern about AI search results: accuracy.
This feature allows users to fact-check any word, phrase, or sentence by linking it back to its original sources. Currently in beta for Pro subscribers, the feature aims to ensure that the information you receive is not simply a bot's overgeneralization or hallucination.
Table of Contents
How to use "Check Sources"
Combining Check Sources with Search Operators
Fact-check with ChatGPT 4.5
Fact-check with Gemini 2.5 Pro (experimental)
Summary (+ comparison table & recommendation)
How to use "Check Sources"
The feature is enabled by default, so you can use it right away.
Simply run your usual search, and you'll see a list of sources on the right-hand side.
Highlight any word or phrase in your search results, and a small "Check Sources" pop-up will appear. Click it to jump directly to the source the highlighted text was drawn from. That way you can see exactly where the content comes from and confirm for yourself that it's accurate rather than a bot's hallucination.
Watch a short demo video:
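If you'd rather pull the supporting sources programmatically instead of clicking through the UI, the same idea is available through Perplexity's API, which returns an answer together with the URLs it was drawn from. The snippet below is a minimal sketch only, assuming the OpenAI-compatible chat completions endpoint at api.perplexity.ai, a "sonar-pro" model name, a top-level "citations" field in the response, and a PERPLEXITY_API_KEY environment variable; check the current API documentation before relying on any of these names.

```python
import os
import requests

# Hedged sketch: assumes Perplexity's OpenAI-compatible chat completions
# endpoint and a top-level "citations" list in the response. Field and
# model names may differ from the current API docs.
API_URL = "https://api.perplexity.ai/chat/completions"
API_KEY = os.environ["PERPLEXITY_API_KEY"]  # assumed env var for your API key

payload = {
    "model": "sonar-pro",  # assumed model name
    "messages": [
        {"role": "user", "content": "When was the James Webb Space Telescope launched?"}
    ],
}

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
data = resp.json()

# Print the answer followed by the URLs it cites, so each claim can be
# checked against its original source.
print(data["choices"][0]["message"]["content"])
for i, url in enumerate(data.get("citations", []), start=1):
    print(f"[{i}] {url}")
```

Printing the cited URLs next to the answer gives you roughly the same claim-to-source trail that the "Check Sources" pop-up provides in the app.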
Combining Check Sources with Search Operators