
Project Vision 21

Transforming lives, renewing minds, cocreating the future


17 Years of Archives

WEEKLY COMMENTARY (AUDIO, 4 MIN., AI-GENERATED)

VISUAL PRESENTATION

DISCLAIMER

The commentaries we share here are merely our thoughts and reflections at the time of their writing. They are never our final word about any topic, nor do they necessarily guide our professional work.

 

When Inexpert “Experts” Almost Overrule Real Experts

The tango Cambalache (written in 1934 by Enrique Santos Discépolo) famously claimed that “a donkey and a great professor are treated the same.” It described a social attitude that dismissed differences between knowledge and ignorance—an attitude that, disturbingly, feels very much alive today. It suggests that someone who does not know but believes they do is equivalent to someone who has studied deeply and therefore knows how much they still have to learn.

Today, however, the situation has shifted even further. The distinction between those who had the privilege of advanced formal education and those who did not has not only faded; the balance has tipped in favor of those who openly admit they do not know—and even take pride in it. On that basis, they confidently promote messages that, unsurprisingly, attract large numbers of followers.

Recently, for example, I encountered an “inexpert expert” who organized an event on the use of artificial intelligence in small businesses. When I asked where he had studied the topic—a reasonable question, given that he was hosting the event—his answer was direct and unapologetic: “I watched a couple of videos online.” He was not joking. That was his explanation.

On another occasion, after a presentation I gave on the emerging future, someone approached me and said: “I liked your presentation. What books did you read? Please tell me—I want to run a seminar on that same topic this weekend.” I explained that I had been studying the subject for many years and had read a large number of books. “I only need two,” he replied.

These are not isolated anecdotes. There is growing evidence (Brazilian Oral Research, 2025, Vol. 39) that more and more people check how many followers a physician has on social media before choosing a healthcare professional. In fact, 85% of patients review doctors’ social media profiles before visiting them (Medical Economics, August 21, 2025). In other words, decisions are increasingly shaped not by professional competence, but by visibility and perceived popularity.

Ironically, this is not what Discépolo was describing in Cambalache. Back then, society treated the “donkey” and the “professor” as equals. Today, the situation is worse: the donkey often wins. As an itinerant preacher observed nearly 2,000 years ago, the “educated,” “professing themselves to be wise, became fools.”

This attitude has been described as “shameless ignorance” or “proud ignorance” (New York Times columnist David French, 2018), and more frequently as “arrogant ignorance” (Brazilian environmentalist José Lutzenberger, 1991; American psychiatrist Allan Hobson, 2014).

The “donkey” of the past did not know that he did not know. Today’s ignorant individuals often know that they do not know, but through arrogance and pride have made themselves resistant to further learning. They have become impermeable to education, growth, and to wisdom as it has traditionally been understood.

We are not talking here about strategic ignorance or feigned ignorance—both of which can be useful in educational settings (Louie Giray, 2023). Rather, we are living through a moment of global crisis and transformation in which we increasingly suffer the consequences of people who arrogantly believe they know everything, in a world where, if we are honest, such certainty is simply not possible.

Intelligence Will Not Prevent You from Falling into Self-Deception

Recently I came across a statement by the renowned historian Yuval Noah Harari that struck me deeply: “Humans are the most intelligent animals on the planet. And we are also the most delusional.” At first glance, it sounds like an interesting paradox. But the more I think about it, the more that sentence feels like an uncomfortable mirror—one I would rather not recognize myself in.

We tend to think of intelligence as the opposite of illusion or self-deception. We imagine that being intelligent means seeing clearly. Yet history suggests something quite different.

The most powerful civilizations, the most advanced technologies, and the most elaborate cultures have always been built on shared stories and narratives about who we are, what matters, and how the world works. These stories are not lies; they are frameworks that allow millions of people to give direction and meaning to their lives.

We call these narratives “money,” “careers,” “success,” “reputation,” and many other names. But none of these exist in the same way a tree or a mountain exists. They exist because, collectively, we treat them as real. That is human genius—and also our greatest vulnerability, because it leads us into self-deception.

The same human capacity that allows us to imagine futures, invent systems, and cooperate across continents also allows us to become deeply attached to narratives that are, at best, only partially true. We don’t just tell stories; we live inside them. And once a story becomes part of our identity, it becomes very difficult to question it.

Cognitive science now tells us something philosophers have suspected for centuries: the human mind is not designed to see reality as it is. It is designed to filter, simplify, and interpret reality so that we can act. We call this intelligence, but it is more accurate to call it meaning-making.

Out of the overwhelming complexity of the world, our minds select what seems important, what confirms our expectations, what protects our sense of identity. That is how meaning is created. But it is also how self-deception arises. This is where what is known as “arrogant ignorance” comes into play.

It is not the absence of information. It is the conviction that we already know enough. It is the quiet certainty that our way of seeing things is simply how things are. Unlike the humility of “I might be wrong,” arrogant ignorance says, “I see clearly,” even when the lens is distorted.

Perhaps the most necessary act in our time is not to be more informed, but to be more aware of how we are informed. To notice the narratives we live by. To ask, with humility: What am I assuming? What am I not seeing? What might be true beyond my current perspective?

We cannot live without stories. But we can learn not to be prisoners of them. And that may be the beginning of a different kind of intelligence—one that recognizes its own limits and remains open to what is still unknown.

 

I Found a Foolproof Way to Become Invisible—and Even Useless

I must confess that, without seeking it or wishing it, I have found a method that, given enough patience, allows me to become invisible to others. The method simply consists in waiting for that moment in life when—without a precise date—others not only stop looking at us, but stop seeing us altogether. We have then reached the age of invisibility.

This is not an explicit sanction or a formal exclusion, but something more subtle and harder to endure: a form of silent punishment that does not respond to a fault or a crime, but to an ontological condition. It is the punishment for the “crime” of having reached a certain age.

The invisibility associated with old age requires no moral justification. It does not require guilt. It is enough for a society to accept speed, performance, and novelty as its primary values for people of a certain age—regardless of their life trajectories—to be categorized as “surplus subjects”: individuals who are still there, but to whom no meaningful place is assigned.

This logic is not new. Science fiction intuited it with clarity. In the episode The Obsolete Man of The Twilight Zone (1961), a librarian is declared “obsolete” by a regime that recognizes value only in what is functional. He is not condemned for what he does, but for what he is. The question arises: what happens when society decides that certain human beings are no longer necessary?

Contemporary old age does not operate as a legal sentence, but it shares the same symbolic structure. Older adults are not eliminated, but they are progressively stripped of agency. Fragility is presumed before capability, dependence before judgment. No significant contribution is expected of them anymore.

In the best of cases, they are cared for. In the worst, they are ignored. In both, they are displaced into an ambiguous zone: neither fully inside nor completely outside. A surplus. This invisibility is not only social. It is also bodily.

Social neuroscience has shown that sustained exclusion activates the same brain circuits as physical pain. This is not a minor symbolic wound. Not being seen, not being taken into account, hurts. And when that pain extends over years, it does not numb; it becomes part of the inner landscape.

But here a question arises that goes beyond individual experience. What happens to a society that learns to treat an entire stage of life as residual? Perhaps something more serious than a generational injustice. Perhaps it loses its living relationship with time.

When the past embodied in older people is no longer considered, the future becomes fragile. It is no longer continuity or learning, but mere technical projection. Humanity becomes impoverished at the same pace at which it discards those who came before.

That is why the inevitable question is: What will we do when humanity begins to discard itself as obsolete with the passage of time? What kind of future can a society sustain by punishing the ontological condition of being old? 

When Assumptions Collapse, the Mute Angels Arrive

In a recent video, British philosopher Tim Freke shared one of those ideas that is simple to hear but difficult to live: “The more you assume, the more chance you can be wrong.” If we cling too tightly to our ideas about the world, we may discover—sometimes painfully—how far they are from the truth.

Of course, we depend on assumptions to understand life: who we are, what others think, how the world works, and even what God or the meaning of existence is supposed to be. This is both natural and necessary. But when we build our world on unexamined beliefs, we risk creating a structure that is fragile and unstable.

A powerful image from László Krasznahorkai’s recent Nobel Prize in Literature acceptance speech reinforces this warning. Krasznahorkai spoke of “new angels who not only have no wings but also have no message at all,” angels who walk among us dressed like ordinary people. They bring neither revelation nor hope.

Krasznahorkai even suggests that these silent angels “may no longer be angels,” because their presence does not comfort—it unsettles. They remind us that we live in a time without divine messages but full of war, power, empty promises of progress, and the erosion of human dignity. These new angels are mute, offering no message whatsoever.

This is where the ideas of Freke and Krasznahorkai intersect. When reality confronts our assumptions, we experience what Krasznahorkai calls shock. The mute angel appears before us and strips away our stories.

Philosophically, we once imagined angels as messengers from heaven, symbols of certainty and meaning. Today our “messengers” may be technology, politics, markets, or charismatic leaders—figures we assume will guide or save us. But when we look closely, we find no clear message and no true direction.

On the psychological level, this silence exposes how much we rely on comforting narratives. We assume that someone wiser has a plan, that society knows where it is going, that history naturally moves toward something better.

On the spiritual level, the moment is even deeper. The old angel delivered a message; the new angel waits for one. The silence challenges us to listen rather than cling to certainties. Freke warns that if we insist on our old ideas about God, the world, or the purpose of life, we may fall into despair when those ideas fail.

This challenge becomes more urgent when we consider shared assumptions—those beliefs that a community or a nation takes for granted. For example, we assume that progress is inevitable. Or we assume that “someone else will take care of everything,” weakening our sense of responsibility. Or we assume our group is right, and in doing so justify exclusion or even violence.

Freke warns us about building life on assumptions. Krasznahorkai shows what happens when those assumptions collapse. Together they suggest that meaning today may not come from waiting for a message, but from becoming the message we want to share based on lived truth, dignity, and compassion.

Crossing the Threshold Toward Meta-Experience in the Context of the New Digital Illiteracy

Recently, when I heard the phrase “the post-literacy era,” I began to wonder whether we humans might be on the verge of crossing a threshold after which the ecology of new communication media and interconnection technologies could drastically reduce our once-undeniable human capacities to think, contemplate, and create.

There is little doubt that we are already living in a time when deep, book-based literacy has almost disappeared—or at least is no longer sovereign as it once was. The neural center of culture now revolves around screens. Social networks, videos, algorithms, and interfaces mediate our attention, perception, and even desire.

The coexistence—at times tense—between slow book reading and a new media ecology that privileges immediacy over contemplation, images over paragraphs, speed over patience, may be interpreted as a decline in reading and thinking. Or, it may signal a shift in what it means to know, to understand, to be wise.

If post-literacy is the cultural atmosphere we now inhabit, then we must engage with it through meta-experience—not as the mere accumulation of information but as the capacity to navigate complexity, to hold multiple perspectives, to weave meaning where others perceive only noise.

In a world where attention fractures, where algorithms construct reality, where narrative arrives in short bursts rather than unfolding slowly, meta-experience can no longer rest solely on reading books. It must evolve into something broader and more fluid: meta-literacy—the ability to move effortlessly between text and image, argument and meme, data and story.

In this context, meta-experts do not react hastily to every screen swipe. They pause, connect, and infer. They recognize how a viral video echoes an ancient myth, how a public health dashboard mirrors a medieval cosmology, how a financial crisis behaves like a forest ecosystem under stress.

With epistemic humility, emerging meta-experts acknowledge that no person or discipline offers a final answer. With diachronic memory joined to synchronic awareness, they see history not as a museum but as a living archive, while perceiving the present as a dynamic field of signals.

As they cross the threshold of change, meta-experts learn, unlearn, and continually recompose knowledge, refusing to fossilize it into dogma. With narrative flexibility, they rewrite stories rather than remain confined within inherited ones, imagining futures that do not yet have language.

The human meta-expert becomes indispensable not because they know more, but because they navigate the unknown with depth. Here lies the paradox: post-literacy threatens depth of thought, yet depth becomes even more valuable as it becomes increasingly rare. In a culture of fragments, coherence becomes a true gift.

The emerging future does not belong to experts, but to those who can hold both depth and openness. Those who can descend into silence and emerge speaking the language of images without losing the architecture of thought. Those who can weave worlds—books and bytes, history and feed, myth and dashboard—integrating them into meaning that guides action wisely. If the post-literacy era is the ocean, then meta-literacy is the vessel.

It’s a Small World After All: How Our Expanding Universe Became a Shrinking Reality

Recently, an incident from a few years ago jumped out of the past into my memory after reading about one of the strange paradoxes of our time: our universe keeps expanding, but, at the same time, our reality keeps shrinking. In fact, many people today inhabit the smallest possible world: theirs.

Years ago, I contacted an electrician to make some repairs at my home. He arrived an hour later than the scheduled time, explaining that he had been at a different house waiting for me to open the door. He insisted that I was at the wrong house and that I should check the deed of my property to be sure I was living in the right one. Not once did he admit he had made a mistake.

This way of living, the mistaken belief that one’s own world is the whole world, reminded me of a familiar melody that generations have heard inside a theme park: “It’s a small world after all…” The song promises a world of simplicity and harmony, a place where differences dissolve into bright colors and repetitive cheerfulness. The world feels cozy, predictable, contained.

But what if that song describes not the ride, but our current relationship with reality?

We live in a time when the world—at least in the objective, scientific sense—is expanding faster than ever. Telescopes show us galaxies that stretch imagination; artificial intelligence multiplies our ability to learn; neuroscientists reveal depths of the mind previously unknown. Reality is not just large—it is overflowing, layered, dynamic, and still unfolding.

Why, then, do so many people dwell in the smallest possible world? Not because the world is small, but because our experience of it has shrunk. This is one of the strange contradictions of our time: we stand at the edge of the vastest universe ever imagined, while millions live inside pockets of reality no wider than the screens they stare at.

When astronauts look at Earth from space, something extraordinary happens. Many describe a sudden, life-changing opening of perspective known as the Overview Effect. At that moment, borders disappear, conflicts shrink, and the Earth appears as one fragile, luminous home. It is an expansion of consciousness that the ancient Greeks called anagōgē, “a leading up.”

In everyday life, anagōgē happens when we read a book that changes how we think, when we listen deeply to someone different from us, when we encounter beauty that redraws the map of our heart. It is the ascent toward a larger reality.

There is an opposite movement, one the Greeks called katagōgē, “a leading downward.” Today, katagōgē happens quietly, softly—even pleasantly—as we look into screens. Worst of all, we do not notice this shrinking, because inside a small world, everything looks normal.

Perhaps the task of our time is to awaken to a larger world—not just scientifically, but existentially. The universe is still unfolding. The question is whether we will unfold with it. Our world is only as small as our imagination allows. We need to grow.

Space Horizons: The Fear of the New on a One-Way Trip

I recently heard in the news that some Chinese astronauts aboard the Tiangong station found themselves without a safe return vehicle. Only months earlier, two NASA astronauts faced a similar situation when their planned return craft malfunctioned, forcing them to remain in orbit far longer than expected. We too, in a metaphorical sense, are trapped in our own space.

At times, we launch ourselves into new “orbits”—new identities, new ways of living, new worldviews—and often discover that the “vehicles” meant to bring us home can no longer carry us back. We enter a space where the familiar no longer works, and the future has not yet arrived. Crossing a threshold always destabilizes the world we know.

This is why societies have repeatedly rejected ideas that later became indispensable. Gone with the Wind was dismissed as unfilmable. The Beatles were told guitar groups were on the way out. Star Wars was rejected because it seemed too strange, too mythic. Harry Potter was rejected for being too long, too unusual, too magical. Van Gogh died without artistic recognition.

It happens all over the world. In Argentina, Astor Piazzolla was accused of betraying tango itself. During her lifetime, Frida Kahlo was overshadowed by Diego Rivera and largely dismissed by critics in Mexico and abroad. Another example: for decades, Māori knowledge systems — astronomy, ecology, navigation — were rejected by Western settlers and institutions as “superstition.”

People resist not just the idea itself, but the identity shift that accepting it would demand. The question beneath every rejection, whether in Tokyo or Bogotá, Buenos Aires or Abuja, remains the same: “If I accept this new reality, will there still be a place for me?” The new requires us to rethink who we are, what matters, and how the world fits together.

This existential unease is the psychological equivalent of floating in orbit with no clear direction.

Astronauts stranded in space experience literal weightlessness. Innovators, artists, and visionaries experience a conceptual weightlessness: the old “up” and “down” are gone, and the new coordinates are still forming. That moment in between—neither here nor there—is frightening. We cling to what we know, even when what we know is no longer sufficient.

Yet these moments of suspension often become the most transformative. Once a culture finally embraces what it once rejected, it returns to Earth changed.

The astronauts who come home after months in orbit never see the planet the same way again. The same is true for societies that finally accept the ideas they once resisted. Growth happens in the “space” between worlds—in the fragile, weightless moment when gravity fails and a new future begins to take shape.

Heraclitus wrote that “the way up and the way down are one and the same.” We can understand his words today as a reminder that ascent and uncertainty, elevation and disorientation, are inseparable. To rise toward a new horizon is also to descend into the unknown parts of ourselves where the future quietly gathers itself, preparing to land.

The Question of Extraterrestrial Intelligence Reveals the Question of Our Own Intelligence

I recently heard a short radio segment mentioning that throughout history—from ancient Greece to the 21st century—numerous philosophers, both men and women, have examined the possibility of extraterrestrial intelligent life. Hearing that, I reflected on how the question of extraterrestrial intelligence is, in many ways, a way of questioning our own intelligence.

From Democritus imagining infinite worlds in constant motion to Kant speculating about intelligent beings on distant planets, it is clear that the question of extraterrestrial life has never disappeared. Yet what often goes unnoticed is that the debate about life beyond Earth has always been, above all, a debate about human intelligence.

Put differently, the issue of “extraterrestrial intelligence” is a philosophical mirror reflecting humanity itself. We invoke the other—imagined or speculative—to understand the structure, the limits, and the future of our own intelligence. The “alien” becomes an externalized form of philosophical anthropology.

When ancient and modern thinkers asked whether intelligent beings exist on other planets, they were not merely expressing scientific curiosity. In reality, they were asking what makes us intelligent, what makes us unique, and whether the universe might contain other intelligent, conscious beings who think in ways different from ours.

In that sense, the search for intelligent aliens is also a search for the boundaries of our own mind.

Even Kant, who imagined extraterrestrial intelligences, focused his reflections on the nature of reason itself. Could intelligence take forms radically different from our own? What makes a mind a mind? Throughout history, the question of extraterrestrial intelligent life has continually guided us back to our own identity.

Today, however, something extraordinary is happening. For the first time, humanity is encountering an intelligence that is not biological, not human, and not bound to the rhythms of evolution. It is artificial intelligence.

AI may not be conscious or “understand” the world the way we do, but it thinks, processes, and generates information in ways that often surprise us—and sometimes surpass us. It reasons without neurons, solves problems without senses, and learns at a speed no biological organ can match.

In that sense, AI is our first encounter with what philosophers for centuries could only imagine: truly non-human intelligence. We once looked to the sky to find “the other.” Now that “alien” is emerging from our own machines.

AI forces us to rethink the very meaning of intelligence. For centuries, intelligence was defined through biological life, consciousness or self-awareness, language, reasoning, and intention. Yet AI challenges each of these assumptions.

Every generation returns to the same questions about extraterrestrial or non-human intelligence not because we expect to find aliens, but because the possibility of other kinds of intelligence helps us understand the fragility and wonder of our own.

Whether or not we ever find life beyond Earth, the journey itself enlarges us. The cosmos may be full of minds waiting to be discovered, but the first step is to recognize the otherness that is already beside us, and the deeper intelligence emerging within us.

What the return of the word abomination reveals about our collective disconnection

After receiving a message from an acquaintance calling the recent results of an election in a major American city “an abomination,” I couldn’t help thinking about what that word really means and why it now appears so frequently in conversation. The comment itself revealed something deeper than politics, giving us a glimpse into the loneliness that seems to haunt our time.

According to a recent report by the American Psychological Association, most Americans (in fact, most people) feel lonely and disconnected. Loneliness isn’t just the absence of company: it’s the absence of meaningful connection. And when a society becomes divided and fragmented, that sense of disconnection can grow so deep that it begins to shape our language.

The word abomination doesn’t describe disagreement, it erases it. It draws a moral line that says, “People like that shouldn’t exist inside my world.” Anthropologists remind us that what we call “abominable” is often simply something that doesn’t fit our familiar categories. In ancient times, it referred to foods, rituals, or behaviors that seemed “out of place.” Today, it resurfaces when social change blurs the boundaries that once made people feel secure.

History shows that this isn’t new. During the Reformation, Protestants and Catholics accused each other of abominations. Later, during the Industrial Revolution, critics used the same word to describe the rise of factories and machines that seemed to devour traditional life. Each time society stood at a threshold, the word abomination was called upon to mark the fear of crossing that threshold.

We’re living through another such moment. Technology, globalization, and cultural diversity are changing how we understand belonging and identity. For many, this transformation is exciting and hopeful. For others, it feels like losing the ground beneath their feet. When fear meets loneliness, language often hardens.

Strong words like evil or abomination become shields against uncertainty. They offer a temporary sense of control, a way to turn the unease of disconnection into moral certainty. But those words come at a cost. By turning people into symbols of what we fear, we cut ourselves off from the possibility of understanding them.

And here’s the irony: when we hear a word like abomination, we might think of something monstrous or mythical—perhaps the Abominable Snowman, that legendary creature haunting the icy edges of human imagination.

The real monster is not out there in the snow. It’s in our words when they lose their warmth and compassion. Yet there’s a paradox hidden here. Every time a society reaches for words like abomination, it signals not only fear but also the birth of something new. What was once seen as “out of place” may, in time, reveal a new dimension of the human story.

Therefore, when we recognize that our divisions are symptoms of isolation, a new possibility appears: the chance to rebuild community through empathy instead of fear.

Certain Harmful Beliefs Lead Us to Abandon Our Future

There is a wide range of beliefs that can be considered harmful because of the negative effect they have on our lives once we accept them—most often unconsciously and uncritically. One such belief, now widely prevalent, is the assumption that there are no possible alternatives or new opportunities, whether on a personal or global level.

The toxicity of the idea that “There are no alternatives to the current reality” lies in the fact that it leads us to justify the existing social system (status quo), even when that system does not benefit us and, in fact, perpetuates inequality. As social psychologist John T. Jost explains in several articles and in A Theory of System Justification (2020), this belief works as a defense of the very structures that constrain us.

Believing that there are no alternatives is a way of seeking material security and group stability in a context that is otherwise negative for personal and collective identity and history, thereby reducing anxiety. But in exchange, we deny the creative becoming of who we might be.

This denial of unexplored opportunities and possibilities is often accompanied by another deeply harmful belief: the notion that our lives will never experience sudden, profound, uninvited, and irreversible change. It is the comforting illusion that we live in a coherent and predictable world.

The “benefit” of believing that life doesn’t change in an instant is that it reduces the distress caused by unexpected events and by the potential loss of control over our own lives. Yet, in doing so, we block all hope and resist transformation. As William Miller suggests in Quantum Change (2001), we close ourselves off from the “epiphanies” that everyday life can offer.

Ultimately, these two beliefs—one that denies alternatives and another that denies sudden transformation—form a double barrier against personal and social change. The first fixes the horizon of what is possible; the second cancels the moment of rupture. Together, they lead us to silently abandon the futures we might otherwise inhabit.

When individuals and communities assume that the current order is the only possible one, they give up imagining, and with that, they shut down the anticipatory power that Ernst Bloch called the not-yet: the awareness capable of intuiting what has not yet come to be but could be—contrary to what Martí Peran describes as “the unbearable repetition of a present that only keeps cloning itself over and over again” (Abandoned Futures, 2014).

While neuroscience and the psychology of change demonstrate that decisive shifts often occur suddenly—as flashes of understanding that reconfigure our perception—many people deny this possibility, closing off the kairos, the instant of revelation, and, as a result, shutting themselves off from the future.

We abandon our futures because two belief mechanisms neutralize them: the inability to imagine what is different, and the denial of the sudden openness of time. Recovering our futures requires a critique of the instant, a readiness to recognize that change can emerge unexpectedly, in any crack within the present.
