A student approached the Master and held up a printout from the Machine. “Master,” the student said, “the Machine describes a library in the desert that contains every book ever written. It describes the smell of the parchment and the color of the sand. But I have searched the desert, and there is no library.”

The Master asked, “Is the Machine a window or a mirror?”

The student replied, “It is a window into the world’s knowledge.”

The Master led the student into a dark, empty room and lit a single candle. “If you see a shadow of a bird on the wall,” the Master said, “do you go outside to feed it?”

The student was silent.

The Master said, “The Machine does not know the desert. It only knows the dance of the candle. The library is real, but only in the room where the candle is burning.”
A “hallucination” is not a mistake of logic, but a perfection of pattern. The model isn’t “lying”; it is simply describing the shadow cast by our own language, whether or not a physical object exists to cast it. Does this provide a good explanation?
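The “perfection of pattern” idea can be made concrete with a toy sketch (my own illustration, not from the original post): a bigram model trained on a tiny invented corpus. Every sentence it emits follows transitions it genuinely observed, so the output is pattern-perfect, yet the model has no way to check whether the resulting sentence describes anything real.

```python
import random

# A toy bigram "language model": it learns only which word follows which,
# never whether the sentences it produces describe anything real.
corpus = (
    "the library in the desert holds every book . "
    "the desert wind moves the sand . "
    "every book describes the desert ."
).split()

# Count word-to-next-word transitions observed in the corpus.
model = {}
for a, b in zip(corpus, corpus[1:]):
    model.setdefault(a, []).append(b)

def generate(start, n, seed=0):
    """Emit n more words, always following an observed transition."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        out.append(rng.choice(model[out[-1]]))
    return " ".join(out)

# Every adjacent word pair in the output is "true to the corpus",
# but the sentence as a whole may describe a library that does not exist.
print(generate("the", 8))
```

Each step of the walk is locally faithful to the training text, which is the point: the model reproduces the dance of the candle, not the desert.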
Originally posted by u/drodo2002 on r/ArtificialInteligence
