
BigAI is the Book of Sand

How hallucinations are the real deal

Some of you may have read Borges’ Book of Sand short story.

(Image: borges-livredesable.webp)

See the page that will be lost forever!

In this text, Borges tells the story of a man who collects Bibles and comes to buy a strange book named the Book of Sand.

This book has a peculiarity: it has an infinite number of pages. When you look at a page, says the vendor, look at it carefully, because you will never see it again. The man closes the book and tries to find the page again, but he fails. Intrigued, he buys the book.

About BigAI responses

In a certain way, BigAI is similar to the Book of Sand. Ask a question and you’ll get a reply. Ask almost the same question and you’ll get another reply. Ask variations of the original question and you’ll get very different replies, maybe even hallucinations that will bring creativity to the answers. The answers will never be the same and may be very surprising if you make the slightest change in your input.

Suppose you want to retrieve the exact answer from your third attempt without remembering the prompt; suppose this answer was illuminating and creative: unless you have the right model, the right system prompt and the exact prompt you typed, you will never find the same answer again.
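The reason is easy to sketch. Below is a minimal toy illustration, not a real model or API: "replies" are sampled, with a temperature, from a distribution derived from the exact prompt text, so the slightest change to the prompt, or the absence of a fixed random seed, yields a different answer. The names `toy_logits`, `toy_reply` and the little word list are made up for the illustration.

```python
# A minimal toy sketch (no real model, no real API): "replies" are sampled
# from a distribution that depends on the exact prompt text, with temperature,
# so both the slightest prompt change and the sampling randomness alter the
# output, and only the exact prompt plus a fixed seed reproduce it.
import hashlib
import math
import random

VOCAB = ["sand", "page", "book", "infinite", "mirror", "lost", "forever"]

def toy_logits(prompt: str) -> list[float]:
    """Derive made-up per-word scores from the exact prompt text."""
    digest = hashlib.sha256(prompt.encode()).digest()
    return [b / 32.0 for b in digest[: len(VOCAB)]]

def toy_reply(prompt: str, temperature: float = 0.8, seed: int | None = None) -> str:
    """Sample a short 'reply'; reproducible only with the exact prompt and seed."""
    rng = random.Random(seed)
    scaled = [score / temperature for score in toy_logits(prompt)]
    m = max(scaled)
    weights = [math.exp(s - m) for s in scaled]
    return " ".join(rng.choices(VOCAB, weights=weights, k=1)[0] for _ in range(6))

print(toy_reply("What is the Book of Sand?"))           # changes on every run
print(toy_reply("What is the Book of Sand ?"))          # one extra space: another distribution
print(toy_reply("What is the Book of Sand?", seed=42))  # identical only with the same prompt and seed
```

Run it twice with the same prompt and no seed, and the outputs already differ; change one character in the prompt, and the whole distribution shifts.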

Swimming in the neighborhood

Borges probably wanted to suggest that, when trying to find the page you just saw in the Book of Sand, you would end up searching in the (mathematical) neighborhood of that very page, very close to it, yet separated from it by an infinity of pages between the page you are looking at and the one you are searching for.

(Image: potential-wells.webp)

(Image taken from here)

For BigAI, it is the same. You ask questions to a system that has the remarkable property of responding "near" the correct answer when you ask something "close" to the training data (the rather flat areas in the image above).

But sometimes the smallest step in the input makes you land far away from the expected response (the "potential wells" in the image above that you may fall into), as the small sketch below illustrates.
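To make the picture concrete, here is a toy response landscape, purely illustrative and not derived from any real model: it is flat almost everywhere, with one narrow, deep well, so two inputs a hundredth apart can produce wildly different outputs.

```python
# A purely illustrative "response landscape" (not any real model): flat
# plateaus almost everywhere, with one narrow, deep well around x = 0.50.
def response(x: float) -> float:
    well_center, well_width, well_depth = 0.50, 0.005, 10.0
    if abs(x - well_center) < well_width:
        return 1.0 - well_depth   # inside the well: far from the plateau
    return 1.0                    # on the plateau: close to the expected answer

for x in (0.47, 0.48, 0.49, 0.50, 0.51, 0.52):
    print(f"input {x:.2f} -> output {response(x):6.2f}")
# Inputs 0.49 and 0.50 differ by one hundredth, yet their outputs differ by 10:
# the smallest step in the input has dropped you into the well.
```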

And since every response is so tightly linked to your input, and since in a year the model you are querying will no longer be there, you will never be able to reproduce a desired output. Your output will be lost forever.

What if hallucinations are the real deal?

In a way, BigAI, with its hallucinations, is a kind of Book of Sand: a book with all the answers to all the variations of the same question, an infinite book in which some pages are astonishing, unexpected, seminal.

And so, because those hallucinations are transient and we'll never see them again, they should be something to cherish: first because they are rare and will be lost forever, and second because they tear down the veil of correctness and open onto other strange realms, where impossible connections occur or where the plausible creates surprising new facts.

Maybe all users of today's AI are completely misguided in using BigAI for its accuracy: maybe BigAI brings us novelty, surprise and freshness through hallucinations. Maybe those hallucinations are pages of a hidden book that is trying to send us incredible messages!

Maybe hallucinations are pure creativity, and maybe they are BigAI's real purpose, its soul, its jewels, its very reason to exist!

Then why are we, humans, trying so hard to eradicate them?

(September 9, 2025)

