
Project Vision 21

Transforming lives, renewing minds, co-creating the future



DISCLAIMER

The commentaries we share here are merely our thoughts and reflections at the time of their writing. They are never our final word on any topic, nor do they necessarily guide our professional work.

 

Living in harmony with our own becoming

Change—whether superficial or a profound transformation—is one of the most perplexing paradoxes of human existence because it means both being and not being at the same time, ceasing to be in order to become, living in the "in-between" of the "no longer" and the "not yet."

We often see with great clarity that others should change, yet we struggle to recognize the need for change within ourselves. This phenomenon is so ancient that it was already noted two thousand years ago: in the Sermon on the Mount, Jesus pointed out that we tend to notice the speck in our neighbor’s eye while ignoring the beam in our own (Matthew 7:3-5). This is not about religion but about human nature—it is easier to analyze from a distance than from within.
 

Some 2,500 years ago, Heraclitus had already intuited this tension when he declared that "everything flows," reminding us that change is inevitable but not necessarily easy or conscious. Human beings tend to cling to the familiar, even when doing so means remaining trapped in patterns that limit or harm us.
 

When someone does not change, is it because they cannot, do not want to, or do not recognize that they should?
 

There are cases in which change requires more resources, support, or skills than a person currently possesses. Factors such as environment, education, past experiences, and even biology can limit one’s capacity for transformation. Asking someone to overcome trauma is not the same as asking them to reorganize their schedule.
 

At times, resistance to change stems from a conscious or unconscious decision to remain in the comfort zone. Change means facing uncertainty, and the fear of losing what one has—however imperfect—can be stronger than the desire to improve. There may also be hidden benefits to not changing: maintaining a sense of identity, avoiding responsibilities, or preserving dynamics that feel convenient.
 

And sometimes, people simply do not recognize that they need to change. It is not that they are stubborn or negligent, but rather that their frame of reference prevents them from seeing what others see so clearly. This is where cognitive biases, the dissonance between self-image and reality, and our reluctance to question our own narratives come into play.
 

We live in a time when pointing out others' mistakes has become a social norm, yet deep introspection remains uncomfortable and largely discouraged.
 

However, change does not have to be an individual struggle or a solitary process. John Vervaeke proposes the idea of an ecology of practices—a set of interconnected methods (including meditation, Socratic dialogue, philosophical contemplation, or participation in learning communities) that can help us refine our perception, reduce biases, and expand our understanding of ourselves and the world.
 

Change requires more than simply recognizing a need. It demands a combination of self-awareness, will, and favorable conditions. As Heraclitus said, reality flows—and only those who learn to be transformed along with it can truly live in harmony with their own becoming.

 

What Can We Do When We Can’t Do Anything?

Over the past few weeks, an increasing number of people have been asking me—more frequently than usual, though not unexpectedly—what we can do in a world and a society undergoing changes so deep and constant, and so far beyond our consent, that they literally disorient and even paralyze us. And they ask me as if I had an answer.


For many, the need to reflect arises—perhaps for the first time—because we find ourselves living in a world that is volatile (any conflict can erupt anywhere, at any moment), unpredictable (we cannot anticipate what will happen), complex (what happens in one place affects everything else), and ambiguous (nothing is as it seems).
 

Perhaps reflection should begin with the fact that at every moment of our lives, we are situated within a historical, cultural, social, and linguistic context. When that context remains stable—and therefore manageable—for a long time, we take it so much for granted that we lose awareness of its presence. (It’s similar to how, after listening to the sound of rain for a long time, we stop noticing it.)
 

The prominent and trailblazing Spanish philosopher José Ortega y Gasset captured this idea in a strikingly simple yet profound way: “I am I and my circumstance” (Prologue to Meditations on Quixote, 1914). He immediately added: “If I do not save it, I do not save myself.” Yet in daily life, we often ignore both the doubling of the self and the circumstances surrounding it.
 

This is neither the time nor the place to fully unpack Ortega’s famous statement, so for now, we will simply say that only by becoming aware of our own awareness can we recognize our circumstances. At the same time, it is only through our circumstances that we become aware of ourselves. It is an inseparable and continuous interaction.
 

Reread by Humberto Maturana and Francisco Varela, Ortega's philosophical ideas—reframed from a biological and scientific perspective as embodied cognition—have had a significant influence on the neuroscientific and phenomenological research of John Vervaeke at the University of Toronto.
 

In fact, Ortega’s concept of circumstance seems to expand into Vervaeke’s idea of agent-arena (or agent-situatedness). Put simply, in order to know how to act, we must first recognize the arena in which we are acting. We cannot step onto a soccer field dressed as ballet dancers. The arena defines—though not entirely—the ways in which we can act.
 

So, what can we do in today’s world? First, become aware of our own awareness and our circumstances. Second, creatively adapt our actions to the arena we find ourselves in. But there may be a further step: preparing for the future, rather than just a future.
 

The future is not merely the day after today. That’s just chronological time. The existential, living future is an expansion of awareness that allows us to access possibilities and opportunities previously unseen or unimagined. This awareness of possibility enables us to be ready for any future—whether pleasant or not.

 

Pervasive Deceit: A Paradox of Our Time

Pervasive deceit is nothing new. Forty-five years ago, in 1980, cryptographer Gustavus Simmons introduced the concept to describe scenarios in which secure communication had to be established within a context of intentional, continuous, and widespread deception. In other words, our present reality.
 

Originally, this was a technical concern, focused on how to protect information when “adversaries” sought to intercept it. At the time, the challenge of pervasive deceit was mostly confined to military and intelligence contexts, where confidentiality and authentication were paramount.
 

Today, however, this concept has expanded far beyond cryptography to describe the world we all inhabit: a digital ecosystem where misinformation is not an anomaly but the norm, and where trust is growing ever more fragile. In fact, the very technologies designed to connect us are the same ones that now facilitate deception on an unprecedented scale.
 

Information is no longer simply transmitted from reliable sources to recipients. Instead, it is filtered, reshaped, and often weaponized by algorithms that prioritize engagement over truth. Social media platforms, once envisioned as tools for democratizing knowledge, have instead amplified deception by rewarding sensationalism and controversy.

Misinformation spreads faster than facts. The systems that govern online interactions have made deception profitable, viral, and difficult to correct.
 

In such an environment, trust in institutions, experts, and even truth itself has eroded. And here we encounter a paradox that defines our time: if new technologies are disconnecting people from one another, and if they allow—indeed, even promote—pervasive deceit, why do the proposed solutions almost always involve even more technology?
 

For some, the answer is purely pragmatic. The scale of digital misinformation and deception surpasses human capacity for verification. In an era where artificial intelligence can generate hyper-realistic fake news, falsified videos, and fabricated documents in minutes, it is no longer feasible for individuals—or even institutions—to manually verify every piece of information they encounter. In theory, new technologies should restore trust.
 

However, this reliance on technological solutions introduces its own set of challenges. Trust is not merely a function of verification; it is relational, cultural, and deeply human. The more we delegate to technology, the further we drift from the organic, community-based ways in which trust has historically been built. Are we trapped in a cycle where every solution only deepens the problem?
 

The real challenge is not just technological but philosophical. For centuries, societies relied on shared frameworks of meaning—traditions, institutions, and personal relationships—to establish trust.

The digital revolution, while bringing unprecedented access to information, has fragmented those frameworks, replacing them with algorithmic mediation. In this context, we are left with a critical dilemma: we cannot abandon technology, but we also cannot rely solely on it to rebuild the very thing it has destabilized.
 

If we continue to frame every crisis of trust as a problem that more technology can solve, we will only deepen our disconnection—from ourselves, from others, from the universe, and from the divine—in a perpetual cycle of self-deception.

 

There is little doubt that our ability to think and communicate is deteriorating

Exactly 170 years ago, American thinker and writer Henry David Thoreau complained about countries that focus on solving pressing problems with no intention of changing the way of thinking that led to the creation and perpetuation of those problems. Thoreau then coined an unpleasant but descriptive expression: brain rot.
 

In the conclusion of Walden, published in 1854, Thoreau wrote: “While England endeavours to cure the potato-rot, will not any endeavour to cure the brain-rot, which prevails so much more widely and fatally?”
 

The influential transcendentalist was referring to the “rotten potatoes” caused by the spread across Europe of a fungus-like pathogen that infected and destroyed potato crops for eight years (1845-1852), the event behind the Irish Famine.
 

For Thoreau, it was (and still is) nonsense to worry about curing rotten potatoes without worrying at all about the intellectual and moral decay that he characterized as “brain-rot.” More than a century and a half later, that phrase acquired new relevance in the context of the impact that artificial intelligence has on the human brain.
 

The phrase “brain rot” has seen so much recent use that, at the end of 2024, Oxford University Press selected it as its word of the year. In these first weeks of 2025, three alarming studies confirm that Thoreau was right: due to AI and other technologies, we are losing our ability to communicate properly and think critically.
 

For example, an article published in PLOS ONE by Lucas G. Gago-Galvagno and collaborators analyzed the relationship between screen use (smartphones, tablets) in early childhood and the development of children's cognitive skills. The researchers concluded that there is a “negative association” between the two.
 

Specifically, the more time young children spend in front of a screen, the less vocabulary they acquire and the longer it takes them to progress with their language development. Obviously, the issue does not end there because the same thing is happening among adults.
 

In another study, experts from Carnegie Mellon University (in Pennsylvania) and Microsoft published an article examining the impact of generative AI on critical thinking.
 

The experts concluded that the more a person trusts AI, the less critically that person thinks, because they trust their own ability to think for themselves less and less. Even worse, the study says, the uncritical use of AI “changes the nature of critical thinking,” transforming it into mere “information verification.”
 

Recently, Shelly Palmer, a renowned expert in new technologies, argued in an article that AI and large language models will lead to “homogenizing human thought” by delegating all “searching, writing and decision-making” to AI, eliminating the “intellectual friction” that leads to innovation. “We will forget how to think differently,” Palmer said.
 

“Brain rot” is not just an unpleasant phrase. We are very close to being unable to think critically and speak clearly. What kind of world will it be when children can’t talk and adults can’t think? Our world, of course.

 

The Limits of My Library Are the Limits of My World

Beyond what you know, an immense world exists, but you cannot see it.

Throughout history, our understanding of the world has expanded in remarkable ways, but it has always been limited by what we have chosen to explore. Just as libraries house the collected knowledge of humanity, our personal “libraries”—whether physical books, digital resources, or ideas we engage with—shape how we perceive reality. If we stop seeking knowledge, we risk shrinking our world to only what we already know.

Time, Knowledge, and an Expanding Universe

Philosopher J.L. Schellenberg introduces the idea of profound time—the recognition that humanity is in its intellectual infancy. Just as the Earth is billions of years old, and human civilization is only a fleeting moment in cosmic history, our understanding of reality is still in its earliest stages.

We might assume that we already grasp the fundamental truths of the universe, but Schellenberg urges us to think long-term: if humanity survives for millions more years, what new ways of thinking will emerge? What truths remain undiscovered simply because we haven’t had enough time to find them?

This idea reminds us that our current knowledge is not the final word. It encourages humility and curiosity—two traits essential for lifelong learning. If we accept that there is more to know than what is in our personal “library” today, we open ourselves to growth, discovery, and a richer understanding of the world.

Hyperobjects and the Challenge of Comprehension

Expanding our understanding is not always easy. Philosopher Timothy Morton introduces the concept of hyperobjects—things so vast in scale that they are difficult, if not impossible, to fully grasp from a human perspective. Climate change, the internet, and deep time itself are examples of hyperobjects. They exist on scales so large that they stretch beyond our ordinary ways of thinking.

Just because something is difficult to grasp, however, does not mean we should avoid learning about it. Hyperobjects challenge us to expand our mental libraries—to think beyond our immediate experiences and consider the broader, interconnected nature of reality. The more we learn, the better we can navigate a world filled with complexity and uncertainty.

A Lifelong Commitment to Learning

If our libraries define the boundaries of our world, then expanding our knowledge means expanding our world. This doesn’t mean we must read every book or master every subject, but it does mean we should remain open to new ideas, seek out different perspectives, and challenge our own assumptions.

The universe is constantly expanding, both in a literal sense and in terms of human understanding. Just as astronomers discover new galaxies, scientists uncover new truths, and philosophers refine their theories, we too can commit to ongoing learning.

In a world that is always changing, the best way to keep pace is to keep learning. The more we explore, the brighter our lantern becomes, and the more of the world we can see. Let’s not let the limits of our libraries define the limits of our world.

The digital tree hides the forest of life and wisdom

In the not-so-distant past, it was said that planting a tree (along with writing a book and having a child) was a clear sign of a person’s stability and maturity, because it was a long-term action undertaken without expecting anything in return. For this reason, numerous spiritual and philosophical traditions have used the tree as a metaphor for existence.

For example, Seneca (1st century CE), in his Letter 12 to Lucilius, describes the passage of time by noting that, when visiting his old home, the trees he had planted are already “old,” with large branches. And in On Providence, he compares resilient people to trees capable of withstanding harsh conditions.

Even more remarkably, in Letter 33, Seneca compares philosophy to a “forest of ideas” in which each thinker is a tree. Other philosophers and writers (Aristotle, Augustine of Hippo, Dante, Descartes, Spinoza, Goethe, Nietzsche, Heidegger, Deleuze, Coccia, among others) have also spoken about trees.

In a recent column, the Spanish philosopher Ariadna Romans emphasized that trees are (paraphrasing) symbols of the wisdom inherent in nature and of an autonomous organization of life. The act of cutting down trees is thus both an environmental and a philosophical phenomenon: it marks a change in our relationship with the world, with time, and with transcendence.

In that context, the tree is a symbol of time in its complexity. Unlike the straight line of progress or the fleeting instant of our time, the tree is cumulative growth and simultaneous unfolding: it unites its past (roots) with the present (trunk) and its future (branches).

By reducing the tree to a mere resource, we are reducing time to a mere consumable instant, nullifying historical continuity and the projection of the future. And we ourselves become “human resources.”

South Korean philosopher Byung-Chul Han adds another element when he speaks of the disappearance of the “symbolic forest” in our era of “hypertransparency” and overexposure. Without “trees,” time becomes flat, without roots or shadows where memory can be protected and regenerated. And without memory, identity does not survive either.

From a similar perspective, Professor Shelly Palmer, an expert in new technologies, states in a recent article that we have “stopped seeing the forest of artificial intelligence by seeing only synthetic trees.”

In other words, synthetic trees do not let us see the forest of AI. But the forest of AI does not let us see its trees either. In our society, the obsession with the immediate prevents us from seeing the totality of the change that we are causing in our way of inhabiting the world.

Hence the need for an integral philosophical vision that does not forget either the tree or the forest but rather understands them in their interconnection.

I am not suggesting a nostalgic return to nature, but rather a reconsideration of what it means to be human in a world where nature disappears from human consciousness, leading to oblivion and the felling of a way of thinking and existing.

AI continues to expand. But what about our intelligence?

Artificial intelligence can now clone itself (or at least large language models can), according to a recent study from Fudan University in China. Since this cloning requires no human intervention, the researchers say that, if the cycle is repeated, AI could surpass human intelligence in a short time.

“Successful self-replication under no human assistance is the essential step for AI to outsmart [humans], and is an early signal for rogue AI,” the researchers said in a study published last December on the preprint database arXiv, according to Live Science.

For my part, I believe that we humans are striving so that AI does not need much effort to surpass us. Just by looking at what is happening in the world today (“world” in both the material and biological sense, as well as the spiritual and cultural sense) we already have enough evidence that our natural intelligence, far from expanding, is shrinking.

These two phenomena (presented in an extremely simplified way in the two paragraphs above) seem to be two sides of the same coin; that is, they are so interconnected that one cannot happen without the other, and each feeds back into the other.

In a recent interview, Argentine actor Fabián Vena, who focuses on one-man shows about philosophy, stated with great certainty that “one’s thought is not one’s own thought, but that of everyone who has influenced us,” adding that “ultimately, what one is thinking is not one’s own.”

Or, as one of my philosophy professors at the University of Buenos Aires often repeated a few decades ago: “My best ideas are in the heads of others.”

In other words, as neuroscience has categorically demonstrated during the first quarter of the 21st century, our thoughts and our cognitive processes in general are not confined to our individual minds, but include personal networks, technologies, and everything we use to learn and communicate, now and in the past.

This process of assimilation, integration, and appropriation of the thoughts of others, generally unconscious, not only makes us think what we think, but also creates the illusion that those thoughts are our thoughts, when in reality we are only regurgitating what others thought.

A century ago, Alfred North Whitehead argued that “Western” thought is just a series of footnotes to Plato. And Werner Heisenberg claimed that quantum physics is “extremely close” to what Heraclitus had already thought 2,500 years ago. Now we recycle influencer memes, and we are satisfied.

Algorithms have colonized our minds, and we no longer think, we only calculate. And that is exactly what AI does (at least now): calculate. But AI lacks the experiential, existential and imaginative dimension of humans.

If AI can duplicate itself and is on its way to surpassing human intelligence, that is because we have delegated the calculating aspect of our intelligence to AI and we have done so in such a way that, after delegating it, we have adopted it not only as our own, but as the only way of thinking.

We have learned how to navigate without thinking, but we have lost our way

Two and a half millennia ago, the Greek philosopher Heraclitus stated that “The way up and the way down are one and the same” (Fragment B60). In doing so, he simultaneously spoke of the unity of opposites, cosmic change, and the ascent to wisdom versus the descent into ignorance—for the very same mechanism that allows us to understand can also lead us into self-deception.

This early philosophical lesson came to mind as I read recent studies (for example, in Nature and Scientific American) on the negative impact of GPS navigation on humans, particularly how it diminishes our spatial memory and cognitive abilities, thereby reducing our capacity for navigation and orientation.

According to experts, excessive GPS use weakens—and, in extreme cases, erases—our cognitive maps: the mental representations and spatial images we create of our surroundings to navigate and understand them. These mental structures are, in fact, essential for our sense of direction.

While the undeniable benefits of GPS should not be overlooked, one of its critical downsides is that, as we stop using our cognitive maps, we are left with only one alternative: following the GPS’s instructions without question—“In 200 meters, turn right.” And some people obey these commands so blindly that they find themselves in uncomfortable or even dangerous situations.

Furthermore, we lose connection with our environment because we no longer need to pay attention to buildings, monuments, or other landmarks that would otherwise help us recognize where we are and how to proceed toward our destination. In other words, we have become passive navigators, both in the streets and in life itself.

Though this reflection may seem theoretical or even exaggerated, a recent conversation with an acquaintance illustrated this reality in a striking way. For the past two years, she has visited her daughter—who lives in another city—almost daily. Despite making this trip countless times, she told me that she could not make the journey without GPS.

"I know how to get there, but I don’t know the way," she said. Respectfully reinterpreting her words, one might say: "I know how to reach my destination by following the GPS’s guidance, but I do not know how to get there on my own." Put another way, the path traveled remains unknown. And the same thing happens in our lives.

No matter how much the way up and the way down, inward and outward, toward the atom and toward the universe, are one and the same (Heraclitus), and no matter how much “a journey of a thousand miles begins with a single step” (Lao Tzu), failing to know the path we have traveled means disconnecting from ourselves, from universal processes, from the unity of opposites, and from the path of life itself.

A little over a century ago, Antonio Machado taught us that “We make the path by walking.” Machado suggested that life is not a predetermined road. Instead, we create our own path through our actions. Now, we have forgotten even how to walk.

Guardians of Transitions: Janus, Hermes, and Hecate in Times of Change

In ancient times, as in our own, transitions held profound significance in both personal and societal life. Indeed, they were of such importance that the Greco-Roman pantheon included three gods who guarded them: Janus, Hermes, and Hecate.

Each of these traditional deities embodies unique aspects of change, liminality, and passage, offering timeless insights into how to navigate uncertain times, both millennia ago and today, in the 21st century. In an era of rapid global transitions, revisiting these ancient figures can inspire us to find new ways of understanding and approaching constant and unavoidable change.

Janus, a distinctly Roman god, is perhaps the most emblematic figure of transitions. Depicted with two faces, Janus looks simultaneously to the past and the future, symbolizing the threshold between what has been and what is yet to come. He presided over the start of new ventures, the beginning of the year, the founding of cities, and the setting out on new journeys.

His role as the god of doorways, gates, and passageways made him a central figure in Roman religious and civic life, emphasizing the importance of mindful reflection when crossing from one phase to another.

Today, Janus serves as a reminder of the dual perspective needed during times of transformation. As we navigate a world of technological upheaval and environmental challenges, his dual gaze encourages us to honor the wisdom of the past while embracing the possibilities of the future with courage and clarity.

Hermes, known as Mercury to the Romans, was a multifaceted god associated with travel, communication, and commerce. However, one of his most significant roles was as a psychopomp—a guide for souls transitioning from life to the afterlife. Hermes was a boundary-crosser, moving effortlessly between realms, whether physical or metaphysical.

This fluidity made him a symbol of adaptability and ingenuity, qualities essential in transitional moments. He also served as a mediator, ensuring smooth exchanges between gods and mortals. In today’s interconnected and often polarized world, Hermes invites us to act as bridges between disparate groups, cultures, and ideas. His adaptability inspires us to navigate the complexities of modern life with grace, facilitating meaningful communication and collaboration in a global landscape increasingly defined by rapid and unpredictable change.

Hecate, the ancient Greek goddess of crossroads, night, and magic, completes this trio of transitionary figures. Often depicted holding torches, she illuminated paths for those at crossroads, both literal and metaphorical. Hecate was deeply associated with the liminal, those spaces and moments that exist between defined states or identities.

As a guide through the unknown, she was particularly venerated during times of uncertainty and decision-making. In a 21st-century context, Hecate’s role as a protector of those navigating uncharted territory feels especially resonant.

As we face global crises and transitions—whether environmental, technological, or societal—these gods teach us to honor transitions as sacred, to act with mindfulness and adaptability, and to embrace uncertainty as a space of potential. We would do well to carry their timeless lessons with us.

We surround the planet and ourselves with garbage and debris

Recent studies seem to confirm that the amount of waste orbiting the Earth has reached such a level that this “space debris” is already multiplying through collisions between the objects themselves, threatening not only satellites and space travel but also, if the problem worsens, leaving humans “trapped” on this planet.

The so-called “Kessler Syndrome” is a theoretical scenario presented in 1978 by Donald J. Kessler, a NASA scientist: at some point, the density of objects in low Earth orbit (LEO) would become so high that collisions between satellites and space debris would be self-sustaining, producing a cascading chain reaction of further collisions.

Forty-six years later, that theory is now a reality, or very close to one, according to Holger Krag, head of Space Safety at the European Space Agency (ESA), who said in a statement that current technology is “insufficient to prevent the risks” generated by the rapid accumulation of debris in space.

The Earth’s orbital environment has become fragile, endangering the long-term sustainability of space exploration and threatening the interruption on Earth of services such as the Internet, television, telephony, and satellite navigation.

The Kessler Syndrome seems to connect naturally with the Dunning-Kruger Effect, named after the two psychologists who described how those with limited knowledge or skills often overestimate what they know or can do, while the opposite happens to those with greater knowledge or skills.

In other words, the Kessler Syndrome appears to be (metaphorically or literally) the space-based expression of the Dunning-Kruger Effect, as both focus on the human tendency to underestimate the complexity of a situation and overestimate our ability to control the situation, whether at a personal level or in terms of the sustainability of the space environment.

One might assume, for example, that since space is vast, the probability of satellite collisions is negligible, while ignoring the growing orbital clutter and debris cascade.

The same individual cognitive bias (the Dunning-Kruger Effect) that perpetuates errors in personal or professional settings appears to have contributed to the failure to adequately recognize the growing problem of space debris (the Kessler Syndrome).

In other words, the Kessler Syndrome can be seen as a metaphor for uncontrolled cognitive errors. That is, in the same way that a satellite collision can trigger a cascade of debris, a mistaken belief, fueled by overconfidence, can lead to larger systemic failures.

To continue the comparison, just as we surround our planet with waste, we also constantly surround ourselves with “trash” presented as “entertainment,” “information,” “education,” and “expert opinion,” among many other names.

One day, perhaps soon, there will be so much space junk and existential trash around our planet and around ourselves that the accumulation of “cognitive waste” will lead us into a spiral of self-reinforcing bad decisions, preventing us from seeing the “space” (literal and metaphorical) for personal growth and co-creation.

The solution lies in encouraging self-awareness and fostering continuous learning, humility, and a commitment to evolving competence.
