If you want to know what human beings really want, consider how Alice reacts to Wonderland. Lewis Carroll’s inventions are so entertaining that we tend to smile at how upset, vexed, and unsettled Alice actually is. Wonderland overturns the regular, orderly, predictable life that people actually want.
The inhabitants of Wonderland aren’t just fantastic. They are unhappy. The White Rabbit is anxious enough to be a study in stress over a deadline. The Duchess’s cook throws dishes in a state of rage, and the Duchess herself hands Alice her squalling baby, which turns into a pig. Alice is very glad to get out of there. But Wonderland haunts us, and for good reason.
Even though teams of scientists around the world are working on great mysteries, from the origin of the universe to the origin of life, the greatest mystery remains personal, the mystery of the self. So far as we know, human beings are unique in pondering our own existence as selves and also our place in the universe. You would think that this trait is enough to solve the mystery. After all, if I am aware of myself, I should be an expert on how the self operates.
But exactly the opposite is true. No one can say, with any hope of reaching a consensus, even the most basic things about the self. For example, “self” is both a word and a concept, yet no one knows how or when human speech came about, and concepts, which imply thinking, confront us with our ignorance about what a thought actually is. Take the simplest statement about the individual self, “I am.” When you say these two words to yourself, is it really possible that your brain cells know English and possess a voice?
The world’s spiritual traditions can be reframed as explorations into “I am.” Jehovah uses the phrase when he speaks to Moses out of the burning bush, as well as in Psalm 46: “Be still and know that I am God.” Jesus tells his disciples, “I am the way and the truth and the life.” In the ancient Vedas, supreme knowledge is conveyed, mysteriously enough, in the declaration, “I am That.”
The upshot, if we gather these statements together, is that “I am” reaches beyond what we ordinarily think of ourselves as individuals and holds the key to truth, life, existence, and a higher power known as God. The gist of the Upanishads is that all things are done by, for, and because of the self, the foundation of reality. However you parse our different scriptural heritages, as a species we have been fascinated and baffled by our own self-awareness.
How do we know that anything is real? This isn’t a question that usually bothers most people, because we’ve all been brought up to look upon the physical world “out there” as a given. But let’s say that someone actually asks you the question, “How do you know the physical world is real?” What would you answer?
If you pause for a second, there are only two kinds of answers to this question: Either you tell a story or you refer to your own experience. Stories used to be collective myths, generally based on religion, about how God or the gods created the world. But any story, including the most advanced scientific models, depends on belief. If you believe in the Book of Genesis, you will see reality very differently from someone who believes in the Big Bang. To sort out which story is actually true, the second kind of answer arose, defining reality according to our experience. A rock is hard because two people who kick it agree from their experience that it is, in fact, hard.
The power of seeing is well known, and examples abound: love at first sight, or Alexander Fleming noticing that Penicillium mold kills bacteria. Galileo, as a youth in church, was the first to notice that a pendulum swings in a regular rhythm, laying the basis for pendulum clocks. Isaac Newton famously discovered gravity by watching an apple fall, although this tale was told secondhand and is probably a romantic fiction.
But what if a mere glance has untold power, literally the power to create reality? The opening for this idea came a hundred years ago from what is known in quantum physics as the measurement problem. A quantum is a tiny unit of energy, and if a specific quantum such as an electron or a photon is considered a thing, it should be measurable. You should be able to know where it is at a given instant, for example, or how fast it is moving, how much it weighs, and the other properties that we assign to things in the everyday world.
When you take the popular phrase “Follow your bliss” and trace it back to its source, something more powerful was intended. In a late interview, the famous expert on mythology Joseph Campbell first used the phrase, saying, “If you follow your bliss, you put yourself on a kind of track that has been there all the while, waiting for you.”
This implication that bliss is a personal path, and that the path is pre-determined, is much more than “do what you really like to do,” which is how most people interpret “Follow your bliss.” Let me expand on this point by showing that “bliss” is much more fundamental than almost anyone realizes. It holds the key to transforming the mind.
Doing what you really like to do is certainly a good idea; it is much better than the opposite, doing what you have to do even if you don’t particularly like it. But no one can engage in pleasurable activity all the time. The human mind brings us experiences of pleasure and pain, and since the two are paired as inescapable opposites, mental tension and conflict are inevitable no matter how positive and pleasant you try to make your life. (For deeper background, please see my most recent post, “Can You Make Your Mind Your Friend?”)
Although we don’t often put it this way, the most important relationship in everyone’s life is with the mind. The late Stephen Hawking drew the world’s attention by leading a life totally of the mind, his physical activity reduced to eye motion and blinking. The body without the mind is inconceivable, however. We cannot exist without thought. So it’s important to ask how best to relate to our minds.
I’m thinking of the most basic issue: Is the mind friend or enemy? Leave aside for the moment the traits that make it fascinating to be human: love, creativity, intelligence, evolution, and self-awareness. These traits make the human mind unique among all life forms on Earth, but we also suffer uniquely. Our minds are the source of anger, fear, envy, depression, grief, and hopelessness. If a friend brought suffering into our lives, it wouldn’t matter how happy he made us at other times—suffering trumps friendship, especially when you consider that the mind is capable of confusing us so deeply. The last thing the mind seems to understand is itself.
If someone invited you to live in a world where every physical thing—granite, stars, trees, the bones in your body—lost its thingness, would you accept? The fact that things exist is very reassuring, so reassuring that we can hardly do without it. Unfortunately, this reassurance is false. We live in a world where things aren’t really things, whether we choose to or not.
Matter, the physical side of matter and energy, is one half of a double act. We are told matter is what the universe is made of, and energy is what puts matter in motion. The dance between them constitutes the reality we inhabit, a fact so obvious that modern science relies upon it as the unquestioned basis for doing science, not to mention for leading our everyday lives.
If matter and energy are not what they seem, science could be rocked to its core—but great care is taken to keep this from happening. Strangely, a nursery rhyme tells the tale. Like Humpty-Dumpty in the English nursery rhyme, physical matter (solid, tangible, inert matter composed of atoms and molecules) took a great fall over a hundred years ago, when quantum mechanics demolished every one of those qualities. It is entirely inaccurate to envision the universe being built up from bits of solid matter—or bits of anything.
The ancient Greek notion that reality can be reduced to a minuscule speck of matter (the atom) was a delusion of logic, and therefore a mental construct only. In reality, the elementary particles that compose the atom have a mysterious existence. They have no measurable weight, position, or any other characteristic until they are observed. Before that, they exist as waves that extend infinitely in all directions. These waves have no properties you can assign to a solid object. They arise as ripples in the quantum field, and the entire structure of the universe is mathematically described as interference patterns among these ripples, like the pattern formed on the surface of a pond if you throw two rocks in at the same time.
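The pond analogy can be made concrete with a few lines of arithmetic. The sketch below is an illustration only, not a quantum calculation: it superposes two idealized circular waves and shows how their sum reinforces at some points and cancels at others, the same interference logic that descriptions of the quantum field rely on.

```python
import math

def ripple_height(x, y, sources, wavelength=1.0):
    """Superpose idealized circular waves radiating from each source.

    Each source contributes sin(2*pi*r / wavelength), where r is the
    distance from that source, the simplest model of pond ripples.
    """
    k = 2 * math.pi / wavelength
    return sum(math.sin(k * math.hypot(x - sx, y - sy))
               for sx, sy in sources)

# Two "rocks" dropped half a wavelength apart.
rocks = [(0.0, 0.0), (0.5, 0.0)]

# Equidistant from both rocks: the ripples arrive in phase and their
# heights add (constructive interference).
peak = ripple_height(0.25, math.sqrt(5), rocks)   # ≈ 2.0

# Half a wavelength farther from one rock than the other: the ripples
# arrive out of phase and cancel (destructive interference).
node = ripple_height(-2.25, 0.0, rocks)           # ≈ 0.0
```

The interference pattern is nothing but this sum evaluated everywhere at once: bright bands where ripples agree, dead zones where they cancel.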
The dissolution of physical matter isn’t controversial—quantum mechanics is the bedrock of modern physics—but it turned out to be intolerable for working scientists. They rely upon the reassuring nature of thingness just as much as ordinary people do. Theoretically, doing away with thingness should have been the end of the story. As every child knows, all the king’s horses and all the king’s men couldn’t put Humpty-Dumpty together again. Physics, however, managed something more mysterious. It ignored the fact that matter fell and broke in the first place.
The billions of dollars spent on high-energy particle accelerators show how much jobs, budgets, and complex projects rest on an ability to ignore what quantum physics actually means. There are now eighteen basic particles, with the hope that more will be discovered in the future, a hope that depends on building even more mega-accelerators. But when these vast machines, using huge amounts of energy, cause a new particle to bounce out of the quantum field for a fraction of a millisecond, where is that particle really coming from?
The story of life on Earth owes a great deal to Charles Darwin, and even though few people today read his epoch-making 1859 book, On the Origin of Species, without a doubt we live in a Darwinian world. Revolutionary ideas are subject to change, and when they go viral, as Darwinism did with a vengeance, many unexpected consequences result.
The crudest misuses of Darwin’s theory of evolution are contained, ironically enough, in phrases Darwin did not coin: “survival of the fittest,” “the law of the jungle,” and “Nature red in tooth and claw.” These notions have been enormously influential. They turn evolution into a winner-take-all competition ruled by the violent opposition of predator and prey.
Survival of the fittest, when applied to human society, celebrated the rich and powerful as evolutionarily superior. It justified the prejudice that the poor deserve to be poor because they are unfit (i.e., weak, stupid, genetically inferior). Racism and genocide have looked to Darwinism as an excuse to “purify” whole populations through means ranging from forced sterilization to mass murder. Those who oppressed workers in the worst periods of the Industrial Revolution also looked to Darwin for (false) justification.
It seems perverse that the easier life becomes, the worse our problems grow. Technology has created life-changing innovations like the Internet that are directly linked with terrorist attacks, because they give like-minded fanatics instant global communication. Computers gave rise to social media, which has led to cruel bullying at school, fake news, conspiracy plots, and the anonymity to mount vicious personal attacks—all of which seem as endemic as hacking, another insoluble problem created by technology.
One could go on practically forever, and it wouldn’t be necessary to blame current technology either—the internal combustion engine is directly connected to climate change, and nuclear fission led to the horrors of atomic warfare. But my point isn’t to bash technology; we owe every advance in the modern world to it—except one.
Technology is based on higher education, and whatever its benefits, higher education has almost totally lost interest in wisdom. Wisdom isn’t the same as knowledge. You can collect facts that lead to the understanding of things, but wisdom is different. I’d define it as a shift in allegiance, away from objective knowledge toward self-awareness.
The Greek dictum “Know thyself” doesn’t make sense if the self you mean is the ego-personality, with its selfish demands, unending desires, and lack of happiness. Another self is meant, which isn’t a person’s ego but a state of consciousness. “Self” might not even be a helpful term, despite the age-old references to a higher self identified with enlightenment. It is more helpful to say that the pursuit of wisdom is about waking up.
For centuries a quality has existed that is referred to as wisdom. A phrase like “wiser heads prevailed” implies that wisdom can save us from stupid or foolish actions. Elders were once considered wise, and so were philosophers. But once you bring up these references, wisdom feels antiquated and irrelevant. Who are the wiser heads in our day? Aside from a revered figure like the Dalai Lama, it’s hard to name one, and he is really a spiritual figurehead more than the classic wise man.
Whatever wisdom might be, the average person doesn’t think about it very much, if at all, and when you consider the problems that feel the most disturbing—climate change, terrorism, racism, poverty, and international tensions, for example—nobody is clamoring to call on wisdom to solve them.
But maybe only wisdom can. Let me explain what I mean.
Every problem, not just the big global ones but problems in everyday life, gets solved by using a mental model. This model explains what has gone wrong, which is the first step in making things right again. Consider a common problem like feeling depressed. In our time we apply a medical model and send the depressed person to get help from a doctor, who will prescribe an antidepressant, or to a psychologist, who will apply some kind of therapy.
In the past, other models would have offered a very different explanation of why someone is depressed. Instead of calling depression a mental disorder or a psychological malady, which leads to trying to understand the person’s brain, depression would have been considered a lack of personal discipline or a moral failing. A depressed person in another model would be considered possessed by evil spirits or punished by God for some hidden sin. It’s strange to think that depression might be treated using everything from bleeding to exorcism, but such is the power of mental models.
Models fool people into believing that they are true. In modern society, the general belief that depression is an illness like catching a cold or contracting cancer feels so certain that few would disagree. But in fact, the disease model is not always workable in depression. The action of popular antidepressants on the brain isn’t certain and may be totally misunderstood. You cannot reliably predict who will get depressed, and quite often depression comes and goes on its own for no reason anyone can explain.
If your model doesn’t predict things correctly, leads to haphazard solutions, and depends on unproven assumptions (in this case, the assumption that depression is a brain problem), it’s not a model that matches reality. In modern life, we rely on a model of reality that has three components or levels.
The first level is data, which we collect and assemble into facts. Facts are supposed to match reality, but thanks to the human gift for rationalizing we now have so-called “alternative facts,” which are really just stubborn opinions that refuse to be rational.
The second level is information. Information consists of the conclusions that the data reveals. If your blood test comes back with an abnormal blood-sugar reading (fact), your doctor might inform you that you are diabetic (conclusion). But with many other disorders, doctors and other experts frequently disagree. The same information can often lead to opposite conclusions.
The third level is knowledge, which consists of understanding. You are a knowledgeable doctor if you went to medical school and acquired the knowledge of diseases and how to treat them. Knowledge is the summit of the scientific or rational model. Data gives us the facts; facts assemble into correct information; information, when absorbed as knowledge, allows any problem to be solved, any question to be answered.
The human potential movement deals in self-improvement, encouraging people to realize that they are not as limited as they think they are. This approach of overcoming limitations has benefitted many, but from a wider perspective, there should be an “infinite potential” movement. Let’s say that the proposition of infinite potential is viable. How would you prove that it exists?
The proof is much simpler—and far more surprising—than you might suppose. Consider going to the supermarket to buy a dozen organic brown eggs. This everyday task is enough to open the door to infinity. “Dozen” is a mathematical concept. Not only are numbers infinite, but so are the equations that grow out of them. From equations grow scientific formulas, and science stands for the human capacity to experiment, measure, and rationally understand the world, a capacity that may not be infinite but shows no signs of doing anything but growing.
Modern machines are assembled from separate moving parts, a fact that seems so obvious that we usually don’t notice its vast influence over us. But the image of a machine extends to the human body, which is an assemblage of trillions of separate cells, and ultimately to the universe, which is considered an assemblage of atoms and molecules beyond numbering.
So ingrained is the machine metaphor that it has taken centuries to realize that it has a fatal flaw. The human body and the universe operate as a single wholeness that cannot be explained mechanically or even logically. The general public has a vague awareness that quantum physics changed how science views space, time, matter, and energy. What escapes general notice, however, is the revolution that followed the quantum revolution.
For many decades it was assumed that the human brain must be special, as superior to the brains of other mammals as our minds are. This specialness was never seriously questioned, and even basic facts, like the assertion that the human brain contains 100 billion neurons, were arrived at with surprising casualness.
In an interesting 2013 TED talk, the articulate Brazilian neuroscientist Suzana Herculano-Houzel offers clarity for the first time on several of the basic issues. After devising a way to dissolve brain cell membranes so that only the nuclei remained, and isolating them to be counted, she determined that the human brain contains 86 billion neurons, the most of any primate. Even though the human brain is a small fraction of our total weight, it uses 25% of a person’s daily calorie consumption.
That may seem like an incidental fact, but Herculano-Houzel makes it the cornerstone of her argument, which declares that the human brain isn’t special. We have primate brains, she says, in proportion to those of our primate relatives like chimpanzees and gorillas. But in an odd evolutionary twist, chimps and gorillas cannot sustain the calorie load of an immense brain by eating raw food. Typically, a great ape feeds for eight hours a day to sustain its large body, and over time the evolutionary trade-off favored a very large body with a smaller number of neurons.
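The disproportion behind the calorie argument is easy to check with rough numbers. The figures below are generic textbook estimates, my own assumptions rather than Herculano-Houzel’s exact data: a brain of about 1.4 kg in a 70 kg body, burning roughly 500 of a 2,000-kilocalorie daily intake.

```python
# Rough illustrative figures (assumptions, not exact study data):
brain_kg, body_kg = 1.4, 70.0           # brain is ~2% of body weight
brain_kcal, daily_kcal = 500.0, 2000.0  # brain burns ~25% of daily calories

weight_share = brain_kg / body_kg        # 0.02
energy_share = brain_kcal / daily_kcal   # 0.25

# The brain's energy appetite outstrips its share of body weight by
# more than tenfold, the load that a raw diet cannot sustain.
disproportion = energy_share / weight_share   # 12.5
```

An organ that claims a quarter of the energy budget while making up a fiftieth of the body is exactly the kind of expense that, on a raw diet, forces the trade-off she describes.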
More people than ever have undertaken a spiritual path of their own, independently of organized religion. “I’m not religious, but I’m spiritual” has become a common expression, and I count myself among those who struck out on their own as a seeker. My search has covered a lot of ground over the years, from mind-body medicine to quantum physics, higher consciousness, the future of God, and personal transformation.
What all of these disparate topics have in common is reality, in the sense that everyday reality is hiding from view the “real” reality that needs to be unveiled. (Readers might want to look at last week’s post, “Unveiling Reality,” which details what it means to unveil reality.) There’s no question that the five senses detect the world in a very limited way, since they give no clue that molecules, atoms, and subatomic particles exist, not to mention genes and DNA. But unveiling a deeper physical reality is far from the whole story.
I’d like to explain one of the great mysteries faced by spiritual seekers. On the surface this mystery sounds simple. The most basic statement of it is this: You don’t have to go anywhere to reach higher consciousness. At some level you are already enlightened. All you have to do is to uncover this level within yourself.
There are countless versions of the same teaching. “Be still and know that I am God” is a religious version. So is “The kingdom of Heaven is within.” Outside religion a version from India is called “the pathless path.” However different, all these teachings imply the same thing: The seeker’s goal is here and now. There is nowhere to go, no journey to take, no distance between the beginning and the end of the seeker’s path.
The near-death experience (NDE) entered popular culture in the 1970s, and "going into the light" is considered by the average person to be what happens after you die, assuming that anything happens. But the largest study of NDEs, which examined 2,060 patients who died under emergency or intensive care, arrived at the conclusion that death isn't a single event; it is a process. During this process, there are ways to reverse death. When resuscitation succeeds in getting the heart, lungs, and brain back to normal functioning, about 40% of those who died and came back remember that "something happened" while they were flatlined.
This part of the study, which was titled AWARE and was led by intensive-care doctor Sam Parnia, seems irrefutable. But very quickly the details of "something happened" become controversial. We have to dive into a few details to see what the issues are. Out of the 2,060 patients who died (the study ran from 2008 to 2012 and included 33 researchers in 15 hospitals), 104 were resuscitated. The first point to note is that all had actually died. They were not "near death." Their hearts and lungs had stopped functioning, and within 20-30 seconds their brains showed no activity. The decomposition of cells throughout the body takes several hours to commence after that. It was during the interval between dying and being brought back that 39% reported the memory of being conscious even though their brains had stopped.
The field of genetics is so complex that the story is simplified for popular consumption. The simplified story is that DNA contains the “code of life,” a master blueprint that jumps into action the instant an egg is fertilized in the mother’s womb. From that point on, a human being develops from a single cell into 37 trillion cells as the blueprint unfolds. The traditional view is that we are the sole products of our genes. Yet, increasingly, evidence shows that “nurture” plays a much bigger role than “nature,” bigger than even professional geneticists have envisaged. When it comes to genetics, “nurture” exerts its effects on “nature” via epigenetics, as we laid out in our book Super Genes.
As powerful as the “code of life” story is, behind the scenes a growing number of geneticists don’t buy into it; in fact, they think we’ve gotten a lot about genes wrong. At the same time, a new, improved picture of human development, based on the interplay of genes and lifestyle, is emerging. This revolution is outlined beautifully in an online article at Nautilus.com titled “It’s the End of Genes as We Know It.” The author, Ken Richardson, is an expert in human development, and he is worried that wildly exaggerated assumptions about the deterministic effects of DNA could lead to social policy that echoes the racism that fueled the eugenics movement decades ago, most notoriously with the Nazi ideology of a master race. As a case in point, Nobel laureate James Watson, who co-discovered the structure of DNA in 1953, was recently stripped of all his honors at Cold Spring Harbor Laboratory, where he spent much of his scientific career, after he continually expressed his bigoted opinion that black people and women are genetically less intelligent than others.
There’s an old joke about a man who falls off the Empire State Building. As he passes an office window on the way down, someone shouts, “How are you doing?” and the man answers, “I’m okay so far.” I don’t know anyone who doesn’t laugh at the punchline the first time they hear the joke, but there’s also a wince thinking about the thud that awaits the man at the end.
Science has been okay—so far—in explaining how nature works, riding the crest of success for several centuries now. But the thud is near at hand, as outlined in a very readable, perceptive online article titled “The Blind Spot,” jointly written by two physicists, Adam Frank and Marcelo Gleiser, and a philosopher, Evan Thompson. It’s well worth your time to read it, because the blind spot referred to in the title has been of tremendous but hidden importance in your life.
People have become convinced that there is a spiritual benefit to living in the present. This is a surprising phenomenon, because nothing seems more mundane than the here and now. You wouldn't expect anything special to emerge from the constant flow of seconds, minutes, and hours that fill everyone's life from the moment of birth. There must be a deeper reason for giving the present moment a special value. (As an introduction to the significance of now, please see my recent post, "What Does It Mean to Live in the Present?")
"Now" is a concept that runs deeper than you might suppose. First of all, it cannot be measured by the clock. Before the tick of the clock is over, it has vanished into the past. Likewise, the experience of now as a subjective event is ungraspable by the mind. A thought is gone the instant you think it, and there's an argument from neuroscience that says the words you perceive as a thought are after-effects of the brain activity that created them, since the electrical impulses and chemical reactions inside neurons take fractions of a second, while the words in your head take much longer.
In recent decades the concept of living in the present moment has been widely discussed, prompted by the surprising success of Eckhart Tolle’s 1997 book, The Power of Now. For millions of readers Tolle’s basic thesis, that there is something special about the here and now, came as a spiritual message they could seize upon in daily life.
The power that the present moment possesses, as many people now believe, is its reality. To be in the now means that you are not distracted by memories of the past or expectations about the future. You dwell instead on whatever is right in front of you, applying mental clarity, alertness, and your full attention. Simple enough—until one looks deeper. Young children live in the now. Are they better off for it, considering the years of maturation that lie ahead to bring about full-fledged adulthood? The elderly suffering from dementia typically have severe memory loss, forcing them to live only in the passing moment, and this condition becomes confusing and blank, not to mention a source of distress.