Original Reddit post

For nearly four years now, the conversation about generative AI has revolved almost exclusively around productivity, threatened jobs, automatable tasks, efficiency, and competitiveness. But there is a largely underestimated dimension to this revolution: its cultural effects. AI is not just transforming how we work; it is transforming how we live together, how we trust each other, how we communicate, and how we organize ourselves. To measure this, it helps to borrow a framework from Erin Meyer, a professor at INSEAD whose book The Culture Map identifies eight dimensions along which the world's cultures differ. Applied to artificial intelligence, Meyer's eight dimensions reveal a series of cultural shifts that are more profound than we realize.

Generative AI demands clarity. An effective prompt is an explicit one. There is no room for body language. This constraint is gradually reshaping how we communicate with each other, too. Cultures that have traditionally relied on what is left unsaid, where reading between the lines or sensing the mood in the room is a valued skill, are being pushed toward greater explicitness. As AI mediates more of our exchanges, the richness of implicit communication erodes.

And then there is the curious rehabilitation of the typo. For decades, a spelling mistake in a professional message was a sign of carelessness, even disrespect. Not anymore. A typo is increasingly read as proof that you wrote the message yourself: that you took the time, that you cared enough to type it out without outsourcing the task. Imperfection has become a signal of authenticity.

Originally posted by u/_fastcompany on r/ArtificialInteligence