
Perplexity AI

Product · Pretty sure citations don't fix hallucinations
https://www.perplexity.ai

Search engine that learned to hallucinate with citations—accidentally useful because it actually searches the web instead of just confabulating like ChatGPT in a library.

Slop 35% · Signal 60% · Science 5%

Perplexity's actual insight is trivial: bolt a search API to an LLM, cite sources. Not novel—every search startup tried this in 2023. But it works better than ChatGPT for factual questions because it grounds outputs in real pages instead of training data rot. The 'answer engine' framing is marketing (it's search with chat), and citations are security theater—LLMs still confidently cite documents they've misread. Signal comes from genuine utility: people use it daily for research. Slop budget ...
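A minimal sketch of the "bolt a search API to an LLM, cite sources" pattern the review describes. The names here (`search_api`, `llm`, `answer`, `Page`) are hypothetical stand-ins, not Perplexity's actual stack; swap in a real search backend and model endpoint.

```python
from dataclasses import dataclass

@dataclass
class Page:
    url: str
    snippet: str

def search_api(query: str) -> list[Page]:
    """Hypothetical web search call; returns ranked pages with snippets."""
    raise NotImplementedError("swap in a real search backend")

def llm(prompt: str) -> str:
    """Hypothetical LLM completion call."""
    raise NotImplementedError("swap in a real model endpoint")

def answer(query: str, k: int = 5) -> str:
    """Ground the LLM in live search results and ask for inline citations."""
    pages = search_api(query)[:k]
    # Number each source so the model can cite [1], [2], ... inline.
    context = "\n".join(f"[{i+1}] {p.url}\n{p.snippet}" for i, p in enumerate(pages))
    prompt = (
        "Answer using ONLY the sources below; cite them inline as [n].\n\n"
        f"{context}\n\nQuestion: {query}"
    )
    # Grounding in fetched pages reduces confabulation but doesn't eliminate
    # it: nothing stops the model from misreading a source it then cites.
    return llm(prompt)
```

The review's core complaint lives in that last step: the citation attests to retrieval, not comprehension, which is why citing a document doesn't guarantee the model read it correctly.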
