
Project Vision 21

Transforming lives, renewing minds, cocreating the future



DISCLAIMER

The commentaries we share here are merely our thoughts and reflections at the time of their writing. They are never our final word on any topic, nor do they necessarily guide our professional work.

 

The Limits of My Library Are the Limits of My World

Beyond what you know, an immense world exists, but you cannot see it.

Throughout history, our understanding of the world has expanded in remarkable ways, but it has always been limited by what we have chosen to explore. Just as libraries house the collected knowledge of humanity, our personal “libraries”—whether physical books, digital resources, or ideas we engage with—shape how we perceive reality. If we stop seeking knowledge, we risk shrinking our world to only what we already know.

Time, Knowledge, and an Expanding Universe

Philosopher J.L. Schellenberg introduces the idea of profound time—the recognition that humanity is in its intellectual infancy. The Earth is billions of years old, human civilization occupies only a fleeting moment of cosmic history, and our understanding of reality is still in its earliest stages.

We might assume that we already grasp the fundamental truths of the universe, but Schellenberg urges us to think long-term: if humanity survives for millions more years, what new ways of thinking will emerge? What truths remain undiscovered simply because we haven’t had enough time to find them?

This idea reminds us that our current knowledge is not the final word. It encourages humility and curiosity—two traits essential for lifelong learning. If we accept that there is more to know than what is in our personal “library” today, we open ourselves to growth, discovery, and a richer understanding of the world.

Hyperobjects and the Challenge of Comprehension

Expanding our understanding is not always easy. Philosopher Timothy Morton introduces the concept of hyperobjects—things so vast in scale that they are difficult, if not impossible, to fully grasp from a human perspective. Climate change, the internet, and deep time itself are examples of hyperobjects. They exist on scales so large that they stretch beyond our ordinary ways of thinking.

Just because something is difficult to grasp, however, does not mean we should avoid learning about it. Hyperobjects challenge us to expand our mental libraries—to think beyond our immediate experiences and consider the broader, interconnected nature of reality. The more we learn, the better we can navigate a world filled with complexity and uncertainty.

A Lifelong Commitment to Learning

If our libraries define the boundaries of our world, then expanding our knowledge means expanding our world. This doesn’t mean we must read every book or master every subject, but it does mean we should remain open to new ideas, seek out different perspectives, and challenge our own assumptions.

The universe is constantly expanding, both in a literal sense and in terms of human understanding. Just as astronomers discover new galaxies, scientists uncover new truths, and philosophers refine their theories, we too can commit to ongoing learning.

In a world that is always changing, the best way to keep pace is to keep learning. The more we explore, the brighter our lantern becomes, and the more of the world we can see. Let’s not let the limits of our libraries define the limits of our world.

The digital tree hides the forest of life and wisdom

In the not-so-distant past, it was said that planting a tree (along with writing a book and having a child) was a clear sign of a person’s stability and maturity, because it was a long-term action undertaken without expecting anything in return. For this reason, numerous spiritual and philosophical traditions have used the tree as a metaphor for existence.

For example, Seneca (1st century CE), in his Letter 12 to Lucilius, describes the passage of time by noting that, when visiting his old home, the trees he had planted are now “old” and heavy with branches. And in On Providence, he compares resilient people to trees capable of withstanding harsh conditions.

Even more remarkably, in Letter 33, Seneca compares philosophy to a “forest of ideas” in which each thinker is a tree. Other philosophers and writers (Aristotle, Augustine of Hippo, Dante, Descartes, Spinoza, Goethe, Nietzsche, Heidegger, Deleuze, Coccia, among others) have also written about trees.

In a recent column, the Spanish philosopher Ariadna Romans emphasized that trees are (to paraphrase) symbols of the wisdom inherent in nature and of an autonomous organization of life. The act of cutting down trees is therefore both an environmental and a philosophical phenomenon: it marks a change in our relationship with the world, with time, and with transcendence.

In that context, the tree is a symbol of time in its complexity. Unlike the straight line of progress or the fleeting instant of our time, the tree is cumulative growth and simultaneous unfolding: it unites its past (roots) with the present (trunk) and its future (branches).

By reducing the tree to a mere resource, we are reducing time to a mere consumable instant, nullifying historical continuity and the projection of the future. And we ourselves become “human resources.”

South Korean philosopher Byung-Chul Han adds another element when he speaks of the disappearance of the “symbolic forest” in our era of “hypertransparency” and overexposure. Without “trees,” time becomes flat, without roots or shadows where memory can be protected and regenerated. Without memory, identity does not remain either.

From a similar perspective, Professor Shelly Palmer, an expert in new technologies, states in a recent article that we have “stopped seeing the forest of artificial intelligence by seeing only synthetic trees.”

In other words, synthetic trees do not let us see the forest of AI. But the forest of AI does not let us see its trees either. In our society, the obsession with the immediate prevents us from seeing the totality of the change that we are causing in our way of inhabiting the world.

Hence the need for an integral philosophical vision that does not forget either the tree or the forest but rather understands them in their interconnection.

I am not suggesting a nostalgic return to nature, but rather a reconsideration of what it means to be human in a world where nature disappears from human consciousness, leading to oblivion and the felling of a way of thinking and existing.

AI continues to expand. But what about our intelligence?

Artificial intelligence can now clone itself, or at least large language models can, according to a recent study from Fudan University in China. Since this self-replication requires no human intervention, the researchers warn that, if the cycle is repeated, AI could surpass human intelligence in a short time.

“Successful self-replication under no human assistance is the essential step for AI to outsmart [humans], and is an early signal for rogue AI,” the researchers wrote in a study published last December in the preprint database arXiv, according to Live Science.

For my part, I believe that we humans are striving so that AI does not need much effort to surpass us. Just by looking at what is happening in the world today (“world” in both the material and biological sense, as well as the spiritual and cultural sense) we already have enough evidence that our natural intelligence, far from expanding, is shrinking.

These two phenomena (presented in an extremely simplified way in the two paragraphs above) seem to be two sides of the same coin; that is, they seem so interconnected that one cannot happen without the other, each feeding back into the other.

In a recent interview, Argentine actor Fabian Vena, who focuses on one-man shows about philosophy, stated with great certainty that “One’s thought is not one’s thought, but of everyone who has influenced us,” adding that “ultimately, what one is thinking is not one’s own.”

Or, as one of my philosophy professors at the University of Buenos Aires often repeated a few decades ago: “My best ideas are in the heads of others.”

In other words, as neuroscience has categorically demonstrated during the first quarter of the 21st century, our thoughts and our cognitive processes in general are not confined to our individual minds, but include personal networks, technologies, and everything we use to learn and communicate, now and in the past.

This process of assimilation, integration, and appropriation of the thoughts of others, generally unconscious, not only makes us think what we think, but also creates the illusion that those thoughts are our thoughts, when in reality we are only regurgitating what others thought.

A century ago, Alfred Whitehead argued that “Western” thought is just a footnote to Plato’s thought. And Werner Heisenberg claimed that quantum physics is “extremely close” to what Heraclitus had already thought 2,500 years ago. Now, we recycle influencer memes, and we are satisfied.

Algorithms have colonized our minds, and we no longer think; we only calculate. And that is exactly what AI does (at least for now): calculate. But AI lacks the experiential, existential, and imaginative dimension of humans.

If AI can duplicate itself and is on its way to surpassing human intelligence, that is because we have delegated the calculating aspect of our intelligence to AI and we have done so in such a way that, after delegating it, we have adopted it not only as our own, but as the only way of thinking.

We have learned how to navigate without thinking, but we have lost our way

Two and a half millennia ago, the Greek philosopher Heraclitus stated that “The way up and the way down are one and the same” (Fragment B60). In doing so, he simultaneously spoke of the unity of opposites, cosmic change, and the ascent to wisdom versus the descent into ignorance—for the very same mechanism that allows us to understand can also lead us into self-deception.

This early philosophical lesson came to mind as I read recent studies (for example, in Nature and Scientific American) on the negative impact of GPS navigation on humans, particularly how it diminishes our spatial memory and cognitive abilities, thereby reducing our capacity for navigation and orientation.

According to experts, excessive GPS use weakens—and, in extreme cases, erases—our cognitive maps: the mental representations and spatial images we create of our surroundings to navigate and understand them. These mental structures are, in fact, essential for our sense of direction.

While the undeniable benefits of GPS should not be overlooked, one of its critical downsides is that, as we stop using our cognitive maps, we are left with only one alternative: following the GPS’s instructions without question—“In 200 meters, turn right.” And some people obey these commands so blindly that they find themselves in uncomfortable or even dangerous situations.

Furthermore, we lose connection with our environment because we no longer need to pay attention to buildings, monuments, or other landmarks that would otherwise help us recognize where we are and how to proceed toward our destination. In other words, we have become passive navigators, both in the streets and in life itself.

Though this reflection may seem theoretical or even exaggerated, a recent conversation with an acquaintance illustrated this reality in a striking way. For the past two years, she has visited her daughter—who lives in another city—almost daily. Despite making this trip countless times, she told me that she could not make the journey without GPS.

"I know how to get there, but I don’t know the way," she said. Respectfully reinterpreting her words, one might say: "I know how to reach my destination by following the GPS’s guidance, but I do not know how to get there on my own." Put another way, the path traveled remains unknown. And the same thing happens in our lives.

No matter how much the way up and the way down, inward and outward, toward the atom and toward the universe, are one and the same (Heraclitus), and no matter how much “a journey of a thousand miles begins with a single step” (Lao Tzu), failing to know the path we have traveled means disconnecting from ourselves, from universal processes, from the unity of opposites, and from the path of life itself.

A little over a century ago, Antonio Machado taught us that “We make the path by walking.” Machado suggested that life is not a predetermined road. Instead, we create our own path through our actions. Now, we have forgotten even how to walk.

Guardians of Transitions: Janus, Hermes, and Hecate in Times of Change

In ancient times, as in our own, transitions held profound significance in both personal and societal life. They were so important that the Greco-Roman pantheon included three deities, Janus, Hermes, and Hecate, who guarded them.

Each of these traditional deities embodies unique aspects of change, liminality, and passage, offering timeless insights into how to navigate uncertain times, both millennia ago and today, in the 21st century. In an era of rapid global transitions, revisiting these ancient figures can inspire us to find new ways of understanding and approaching constant and unavoidable change.

Janus, a distinctly Roman god, is perhaps the most emblematic figure of transitions. Depicted with two faces, Janus looks simultaneously to the past and the future, symbolizing the threshold between what has been and what is yet to come. He was present at the start of new ventures, the beginning of the year, the founding of cities, and the setting out on new journeys.

His role as the god of doorways, gates, and passageways made him a central figure in Roman religious and civic life, emphasizing the importance of mindful reflection when crossing from one phase to another.

Today, Janus serves as a reminder of the dual perspective needed during times of transformation. As we navigate a world of technological upheaval and environmental challenges, his dual gaze encourages us to honor the wisdom of the past while embracing the possibilities of the future with courage and clarity.

Hermes, known as Mercury to the Romans, was a multifaceted god associated with travel, communication, and commerce. However, one of his most significant roles was as a psychopomp—a guide for souls transitioning from life to the afterlife. Hermes was a boundary-crosser, moving effortlessly between realms, whether physical or metaphysical.

This fluidity made him a symbol of adaptability and ingenuity, qualities essential in transitional moments. He also served as a mediator, ensuring smooth exchanges between gods and mortals.

In today’s interconnected and often polarized world, Hermes invites us to act as bridges between disparate groups, cultures, and ideas. His adaptability inspires us to navigate the complexities of modern life with grace, facilitating meaningful communication and collaboration in a global landscape increasingly defined by rapid and unpredictable change.

Hecate, the ancient Greek goddess of crossroads, night, and magic, completes this trio of transitionary figures. Often depicted holding torches, she illuminated paths for those at crossroads, both literal and metaphorical. Hecate was deeply associated with the liminal, those spaces and moments that exist between defined states or identities.

As a guide through the unknown, she was particularly venerated during times of uncertainty and decision-making. In a 21st-century context, Hecate’s role as a protector of those navigating uncharted territory feels especially resonant.

As we face global crises and transitions—whether environmental, technological, or societal—these gods teach us to honor transitions as sacred, to act with mindfulness and adaptability, and to embrace uncertainty as a space of potential. We would do well to carry their timeless lessons with us.

We surround the planet and ourselves with garbage and debris

Recent studies seem to confirm that the amount of waste orbiting the Earth has reached such a level that this “space debris” is already multiplying due to collisions between these objects, threatening not only satellites and space travel but, if the problem worsens, even leaving humans “trapped” on this planet.

The so-called “Kessler Syndrome” is a theoretical scenario presented in 1978 by Donald J. Kessler, a NASA scientist: at some point, the density of objects in low Earth orbit (LEO) would become so high that collisions between satellites and space debris would be self-sustaining, triggering a cascading chain reaction of further collisions.

Forty-six years later, that theory is now a reality, or very close to one, according to Holger Krag, head of Space Safety at the European Space Agency (ESA), who said in a statement that current technology is “insufficient to prevent the risks” generated by the rapid accumulation of debris in space.

The Earth's orbital environment has become fragile, endangering the long-term sustainability of space exploration and threatening the interruption on Earth of services such as the Internet, television, telephony, communications, and satellite navigation.

The Kessler Syndrome seems to connect naturally with the Dunning-Kruger Effect, named after the two psychologists who described how people with limited knowledge or skills tend to overestimate what they know or can do, while the opposite happens to those with greater knowledge or skills.

In other words, the Kessler Syndrome appears to be (metaphorically or literally) the space-based expression of the Dunning-Kruger Effect, as both focus on the human tendency to underestimate the complexity of a situation and overestimate our ability to control the situation, whether at a personal level or in terms of the sustainability of the space environment.

One might assume, for example, that since space is vast, the probability of satellite collisions is negligible, while ignoring the growing orbital clutter and debris cascade.

The same individual cognitive bias (the Dunning-Kruger Effect) that perpetuates errors in personal or professional settings appears to have contributed to the failure to adequately recognize the growing problem of space debris (the Kessler Syndrome).

In other words, the Kessler Syndrome can be seen as a metaphor for uncontrolled cognitive errors. That is, in the same way that a satellite collision can trigger a cascade of debris, a mistaken belief, fueled by overconfidence, can lead to larger systemic failures.

To continue the comparison, just as we surround our planet with waste, we also constantly surround ourselves with “trash,” presented as “entertainment,” “information,” “education,” and “expert opinion,” among many other names.

One day, perhaps soon, there will be so much space junk and existential trash around our planet and around ourselves that the accumulation of “cognitive waste” will lead us into a spiral of self-reinforcing bad decisions, preventing us from seeing the “space” (literal and metaphorical) for personal growth and co-creation.

The solution lies in encouraging self-awareness and fostering continuous learning, humility, and a commitment to evolving competence.

Every heartbeat represents the heart thinking

Every heartbeat is an expression of the heart’s thinking, suggests a new study revealing that the heart has its own “mini-brain”: a complex nervous system independent of the brain. Suddenly, expressions like “broken heart” or “hardened heart” take on a new dimension.

The study, published in the journal Nature Communications, was coordinated by the Karolinska Institutet (Sweden) and Columbia University (United States). The researchers found that the heart performs “advanced functions” because it contains different types of neurons, each carrying out a distinct role.

From another perspective, metaphorical expressions have long contrasted the heart and its emotions with the logical thinking of the brain. Be that as it may, we repeat: every heartbeat is really a thought. And that, to me, is moving.

In the 17th century, the French thinker and mathematician Blaise Pascal said “the heart has reasons that reason does not understand” (or ignores), indicating that there are certain “truths” or “deep experiences” that are irreducible to strictly logical, deductive and calculating thought.

Love, faith, the experience of the divine, personal values, intuitions, in any of their many expressions, represent both the multidimensionality and the richness and complexity of human reality. That does not mean discarding reason or declaring those emotions or experiences “irrational.” But we must recognize that not everything valuable, true or authentic can be explained or found in calculating thought.

Closer to home, Antoine de Saint-Exupéry wrote in The Little Prince that “One only sees with the heart. What is essential is invisible to the eyes.” I don't know if Saint-Exupéry knew that the eyes are part of the brain, but, leaving that subject aside, the famous French writer knew that we see with the heart, not with the eyes.

“Seeing with the heart” is a form of knowledge and perception that is not based solely on the physical senses, but on intuition, empathy and affection. It’s remembering (from the Latin recordari, “to pass through the heart again”) that there are certain levels of reality that cannot be bought, measured, quantified or used.

“Seeing with the heart” is a call to live an authentic life, that is, to accept and love others for who they really are, not for what they seem to be or for what we believe them to be. Seeing with the heart implies consciously and constantly overcoming our superficial prejudices and recognizing the unique and valuable essence of each person.

Finally, in our time, Dr. Otto Scharmer speaks, in his Theory U, of an “open heart,” that is, a state of conscious presence that makes us more receptive to the inner and outer world, thus allowing us to access a deeper level of understanding.

With each heartbeat we enter a new future. It is time to live it, not plan it or calculate it. As Rumi said, “When the heart speaks, the mind finds its way.” Another teacher said 2000 years ago, “For where your treasure is, there your heart will be also”.

When we stop playing seriously, we become trapped in the anguish of the present

The rapid, inevitable, profound, irreversible, and unsolicited social and technological changes of today’s world seem to indicate that we have disconnected from the future. As a result, we forget the past and cling to a present as distressing as the reality surrounding us—or so says an influential Spanish writer and thinker.

"Never before has the present been so disconnected from the future. In the past, the future offered more guarantees of continuity, but today the question of the future is agonizing," Francisco Jarauta recently stated.

I believe one of the reasons for this is that we are not preparing for a future that is no longer a continuation of the past nor a repetition of the present. And it’s not because we lack the ability to prepare for ongoing disruptions, but because we’ve simply stopped playing seriously. For this very reason, we have abandoned thinking.

In his early childhood, my son and his peers would spend entire afternoons playing. I often observed that, regardless of the game, someone would propose changing the rules, leading to three basic responses: accepting the changes and continuing to play, rejecting the changes and creating conflict, or proposing modifications to the new rules.

It took me many years to understand that this rule-changing in children’s games was a way of preparing for a future in which, for whatever reasons, any of those children—by then adults—might face unforeseen circumstances, forcing them to accept or reject new realities, or to promote new changes.

In other words, those childhood games, far from being frivolous ways to pass the time, were meaningful activities that combined budding interpersonal relationships with moments of exploration, creativity, and experimentation. This fostered personal growth and even cognitive transformation.

This situation is what Dr. John Vervaeke from the University of Toronto calls "serious play." It refers to a process in which a dynamic relationship emerges between the current structure and spontaneous innovation, allowing one to explore the new while simultaneously standing on the firm ground of past experience.

In childhood, we engage in children’s games. In adulthood, serious play includes practices from wisdom traditions, sciences, and states of consciousness beyond the surface-level awareness of daily life. One such serious play that integrates imagination, critical thinking, and intuition is philosophy.

When we stop playing seriously, philosophy (and everything in life) becomes a matter of tradition and dogma. That’s why Nietzsche spoke (in Beyond Good and Evil, 211) of those "philosophers of the future" who possess the ability to break free from the frameworks and values of their own time.

According to Nietzsche, all philosophy is a philosophy of the future: it is forward-looking, shaping and influencing what is to come through its critique of contemporary values, systems, and thought.

Nietzsche often portrayed philosophers as those who pave the way for new ways of thinking, challenging stagnant traditions to create frameworks for humanity's progress. Thus, disconnected from the future and from thinking, we are left with nothing but anguish or escapism.

A conversation with a new philosophy student changed my perspective as a professor

One time, while I was teaching an introductory philosophy course at a local university, a student taught me a valuable lesson. Although I no longer remember many details of that interaction, the lesson itself has stayed with me, and, in fact, I apply it every day. What is more, it is a lesson we should all learn and practice.

The class began at 9 a.m. on a Monday, an inconvenient time to teach philosophy. One day, the student in question arrived after the class had already started, looking half-asleep and dressed as though she had just come from a party. My impression was that this young woman had been out all night and had come directly to class from the party.

I suggested she leave the classroom, promising that I would personally provide her with the necessary information and assignments later. But she asked to stay, emphasizing that she wouldn’t disturb her classmates. As soon as the class ended, she came to my desk to talk.

By then, I had already prepared my entire speech in my mind. My strategy was to emphasize that, while attending parties was fine, she had a responsibility to moderate such activities so they wouldn’t interfere with her other commitments. I also planned to remind her of her responsibility to those (presumably her parents) who were paying for her education.

But before I could say anything, the student looked at me and said, “Professor, you don’t understand.” And she was right.

Many years ago, I learned that if someone tells me I don’t understand something, the chances that they are correct are exceedingly high. So, when I heard her words, I asked her to explain what I wasn’t understanding and why. What she shared took me completely by surprise; it was something I could never have imagined.

The student explained that she came from a certain Asian country and was the first in her family—and her entire small town—to travel abroad and attend university. However, her parents had imposed one condition: on the day of her hometown’s annual festival, she had to participate, dressed for the occasion and eating her country’s traditional foods.

To fulfill that promise, and due to the time difference (about 14 hours), she had stayed up all night, even preparing traditional dishes she needed to show her parents and relatives. When the celebration finally ended, she came to class.

Once again, I learned the lesson of not believing I have all the answers or assuming that my frame of reference is always the correct one for understanding a situation. Knowledge and wisdom emerge precisely from the ability to change perspectives.

When Socrates declared that he knew nothing, he was not admitting to mere ignorance but emphasizing the importance of remaining open to multiple perspectives. This profound insight, rooted in humility and intellectual flexibility, is a teaching that has sadly faded from relevance in our era when, as Father Richard Rohr once suggested, we have become addicted to our own ideas.

Now More Than Ever, We Need to Enhance Our Cognitive Abilities

Less than a century ago, Swiss psychologist Jean Piaget was among the first researchers to determine that children are not less intelligent than adults; rather, they are in a distinct stage of cognitive development. As they progress through this development, children acquire greater abilities to understand reality, the world, and society.

Based on what little I’ve read about advancements in cognitive sciences and recent philosophical and theological proposals, I believe that we, as adults, are stuck in our own stage of cognitive development—almost like young children who confuse size with quantity or weight with volume. However, in our case, the issue is both tragic and dangerous.

According to Piaget, early thinking is dominated by a focus on a single aspect of a situation, by perceptual dominance, and by perspectives limited by the egocentrism inherent in an age where the child depends on adults and, therefore, seeks to capture their attention.

Reviewing the elements of the so-called preoperational stage of cognitive development, it is saddening to see that these same elements now dominate the psychological state of much of humanity. This is a humanity in which certain aspects of cognitive development either never fully emerge or do so inconsistently or inadequately, constrained by cultural or environmental factors.

For instance, according to Piaget, children under the age of seven are generally unable to recognize that rearranging objects in a set does not change their quantity. This is because they lack the ability to mentally reverse the action (reversibility) or to consider multiple aspects of a situation simultaneously (decentration).

Similarly, many adults struggle with complex systems thinking—the ability to perceive and integrate multiple layers of complexity without resorting to oversimplification. This may represent a latent cognitive ability that remains underdeveloped.

Moreover, accepting and navigating ambiguous or contradictory information without needing immediate resolution may be another cognitive capacity that is still nascent for most people.

Just as the “conservation of number” in early childhood focuses on the spatial arrangement of objects, the “conservation of time” in adults might involve understanding that certain processes or truths are not bound to immediate perceptions or results. This understanding is essential for long-term thinking and foresight.

Many philosophers and thinkers have long explored the idea that humanity is still in development. Friedrich Nietzsche described humans as a “bridge” toward greater potential, while Teilhard de Chardin envisioned a collective consciousness (noosphere) as the next stage of human evolution. Even Carl Jung suggested that humanity’s struggles reflect “unexamined shadows.”

Our technological achievements also highlight our immaturity: we wield powerful tools, like artificial intelligence, to amplify existing problems rather than solve them.

Seeing humanity as immature, however, is not a condemnation—it’s an opportunity. Recognizing our immaturity is the first step toward creating a future where humanity reaches its full potential and highest aspirations.

As individuals and as a collective, we are becoming. Growth is not just possible—it is inevitable when we recognize our challenges as opportunities to grow, to learn, and to coevolve.
