The story of life on Earth owes a great deal to Charles Darwin, and even though few people today read his epoch-making 1859 book, On the Origin of Species, without a doubt we live in a Darwinian world. Revolutionary ideas are subject to change, and when they go viral, as Darwinism did with a vengeance, many unexpected consequences result.
The crudest misuses of Darwin’s theory of evolution are contained, ironically enough, in phrases Darwin never uttered: “survival of the fittest,” “the law of the jungle,” and “Nature red in tooth and claw.” These notions have been enormously influential. They turn evolution into a winner-take-all competition ruled by the violent opposition of predator and prey.
Survival of the fittest, when applied to human society, celebrated the rich and powerful as evolutionarily superior. It justified the prejudice that the poor deserve to be poor because they are unfit (i.e., weak, stupid, genetically inferior). Racism and genocide have looked to Darwinism as an excuse to “purify” whole populations through means ranging from forced sterilization to mass murder. Those who oppressed workers during the worst periods of the Industrial Revolution also looked to Darwin for (false) justification.
It seems perverse that the easier life becomes, the worse our problems grow. Technology has created life-changing innovations like the Internet, which is directly linked with terrorist attacks because it gives like-minded fanatics instant global communication. Computers gave rise to social media, which has led to cruel bullying at school, fake news, conspiracy plots, and the anonymity to mount vicious personal attacks—all of these seem as endemic as hacking, another insoluble problem created by technology.
One could go on practically forever, and it wouldn’t be necessary to blame current technology either—the internal combustion engine is directly connected to climate change, and nuclear fission led to the horrors of atomic warfare. But my point isn’t to bash technology; we owe every advance in the modern world to it—except one.
Technology is based on higher education, and whatever its benefits, higher education has almost totally lost interest in wisdom. Wisdom isn’t the same as knowledge. You can collect facts that lead to the understanding of things, but wisdom is different. I’d define it as a shift in allegiance, away from objective knowledge toward self-awareness.
The Greek dictum “Know thyself” doesn’t make sense if the self you mean is the ego-personality, with its selfish demands, unending desires, and lack of happiness. Another self is meant, which isn’t a person’s ego but a state of consciousness. “Self” might not even be a helpful term, despite the age-old references to a higher self identified with enlightenment. It is more helpful to say that the pursuit of wisdom is about waking up.
For centuries a quality has existed that is referred to as wisdom. A phrase like “wiser heads prevailed” implies that wisdom can save us from stupid or foolish actions. Elders were once considered wise, and so were philosophers. But once you bring up these references, wisdom feels antiquated and irrelevant. Who are the wiser heads in our day? Aside from a revered figure like the Dalai Lama, it’s hard to name one, and he is really a spiritual figurehead more than the classic wise man.
Whatever wisdom might be, the average person doesn’t think about it very much, if at all, and when you consider the problems that feel the most disturbing—climate change, terrorism, racism, poverty, and international tensions, for example—nobody is clamoring to call on wisdom to solve them.
But maybe only wisdom can. Let me explain what I mean.
Every problem, not just the big global ones but problems in everyday life, gets solved by using a mental model. This model explains what has gone wrong, which is the first step in making things right again. Consider a common problem like feeling depressed. In our time we apply a medical model and send the depressed person to get help from a doctor, who will prescribe an antidepressant, or to a psychologist, who will apply some kind of therapy.
In the past, other models would have offered a very different explanation of why someone is depressed. Instead of calling depression a mental disorder or a psychological malady, which leads to trying to understand the person’s brain, depression would have been considered a lack of personal discipline or a moral failing. A depressed person in another model would be considered possessed by evil spirits or punished by God for some hidden sin. It’s strange to think that depression might be treated using everything from bleeding to exorcism, but such is the power of mental models.
Models fool people into believing that they are true. In modern society, the general belief that depression is an illness like catching a cold or contracting cancer feels so certain that few would disagree. But in fact, the disease model is not always workable in depression. The action of popular antidepressants on the brain isn’t certain and may be totally misunderstood. You cannot reliably predict who will get depressed, and quite often depression comes and goes on its own for no reason anyone can explain.
If your model doesn’t predict things correctly, leads to haphazard solutions, and depends on unproven assumptions (in this case, the assumption that depression is a brain problem), it’s not a model that matches reality. In modern life, we rely on a model of reality that has three components or levels.
The first level is data, which we collect and assemble into facts. Facts are supposed to match reality, but thanks to the human gift of rationalization we now have so-called “alternative” facts, which are really just stubborn opinions that refuse to be rational.
The second level is information. Information consists of the conclusion that the data reveals. If your blood test comes back with an abnormal blood sugar reading (fact), your doctor might inform you that you are diabetic (conclusion). But with many other disorders, doctors and other experts frequently disagree: the same information can often lead to opposite conclusions.
The third level is knowledge, which consists of understanding. You are a knowledgeable doctor if you went to medical school and acquired the knowledge of diseases and how to treat them. Knowledge is the summit of the scientific or rational model. Data gives us the facts; facts assemble into correct information; information, when absorbed as knowledge, allows any problem to be solved, any question to be answered.
The human potential movement deals in self-improvement, encouraging people to realize that they are not as limited as they think they are. This approach of overcoming limitations has benefitted many, but from a wider perspective, there should be an “infinite potential” movement. Let’s say that the proposition of infinite potential is viable. How would you prove that it exists?
The proof is much simpler—and far more surprising—than you might suppose. Consider yourself going to the supermarket to buy a dozen organic brown eggs. This everyday task is enough to open the door to infinity. “Dozen” is a mathematical concept. Not only are numbers infinite, but so are the equations that grow out of numbers. From equations grow scientific formulas, and science stands for the human capacity to experiment, measure, and rationally understand the world, a capacity that may not be infinite but shows no signs of doing anything but growing.
Modern machines are assembled from separate moving parts, a fact that seems so obvious that we usually don’t notice its vast influence over us. But the image of a machine extends to the human body, which is an assemblage of trillions of separate cells, and ultimately to the universe, which is considered an assemblage of atoms and molecules beyond numbering.
So ingrained is the machine metaphor that it has taken centuries to realize that it has a fatal flaw. The human body and the universe operate as a single wholeness that cannot be explained mechanically or even logically. The general public has a vague awareness that quantum physics changed how science views space, time, matter, and energy. What escapes general notice, however, is the revolution that followed the quantum revolution.
For many decades it was assumed that the human brain must be special, as superior to the brains of other mammals as our minds are. This specialness was never seriously questioned, and even basic facts, like the assertion that the human brain contains 100 billion neurons, were arrived at with surprising casualness.
In an interesting 2013 TED talk, the articulate Brazilian neuroscientist Suzana Herculano-Houzel offers clarity for the first time on several of the basic issues. After devising a way to dissolve brain cell membranes so that only the nuclei remained, and isolating them to be counted, she determined that the human brain contains 86 billion neurons, the most of any primate. Even though the human brain is a small fraction of our total weight, it uses 25% of a person’s daily calorie consumption.
That may seem like an incidental fact, but Herculano-Houzel makes it the cornerstone of her argument, which declares that the human brain isn’t special. We have primate brains, she says, that are in proportion to those of our primate relatives like chimpanzees and gorillas. But in an odd evolutionary twist, chimps and gorillas cannot sustain the calorie load of an immense brain by eating raw food. Typically, a great ape feeds for eight hours a day to sustain its large body, and over time evolution favored a very large body with a smaller number of neurons.
More people than ever have undertaken a spiritual path of their own, independently of organized religion. “I’m not religious, but I’m spiritual” has become a common expression, and I count myself among those who struck out on their own as a seeker. My search has covered a lot of ground over the years, from mind-body medicine to quantum physics, higher consciousness, the future of God, and personal transformation.
What all of these disparate topics have in common is reality, in the sense that everyday reality is hiding from view the “real” reality that needs to be unveiled. (Readers might want to look at last week’s post, “Unveiling Reality,” which details what it means to unveil reality.) There’s no question that the five senses detect the world in a very limited way, since they give no clue that molecules, atoms, and subatomic particles exist, not to mention genes and DNA. But unveiling a deeper physical reality is far from the whole story.
I’d like to explain one of the great mysteries faced by spiritual seekers. On the surface this mystery sounds simple. The most basic statement of it is this: You don’t have to go anywhere to reach higher consciousness. At some level you are already enlightened. All you have to do is to uncover this level within yourself.
There are countless versions of the same teaching. “Be still and know that I am God” is a religious version. So is “The kingdom of Heaven is within.” Outside religion a version from India is called “the pathless path.” However different, all these teachings imply the same thing: The seeker’s goal is here and now. There is nowhere to go, no journey to take, no distance between the beginning and the end of the seeker’s path.
The near-death experience (NDE) entered popular culture in the 1970s, and "going into the light" is considered by the average person to be what happens after you die, assuming that anything happens. But the largest study of NDEs, which examined 2,060 patients who died under emergency or intensive care, arrived at the conclusion that death isn't a single event--it is a process. During this process, there are ways to reverse death. When doctors succeed at getting the heart, lungs, and brain to come back to normal functioning, about 40% of those who died and came back remember that "something happened" while they were flatlined.
This part of the study, which was titled AWARE and was led by intensive-care doctor Sam Parnia, seems irrefutable. But very quickly the details of "something happened" become controversial. We have to dive into a few details to see what the issues are. Out of the 2,060 patients who died (the study ran from 2008 to 2012 and included 33 researchers in 15 hospitals), 104 were resuscitated. The first point to note is that all had actually died. They were not "near death." Their hearts and lungs had stopped functioning, and within 20-30 seconds their brains showed no activity. The decomposition of cells throughout the body actually takes several hours to commence afterwards. It was during the interval between dying and being brought back that 39% reported the memory of being conscious even though their brains had stopped.
The field of genetics is so complex that the story is simplified for popular consumption. The simplified story is that DNA contains the “code of life,” a master blueprint that jumps into action the instant an egg is fertilized in the mother’s womb. From that point on, a human being develops from a single cell to 37 trillion cells as the blueprint unfolds. The traditional view is that we are then the sole products of our genes. Yet, increasingly, evidence shows that “nurture” plays a much bigger role relative to “nature” than even professional geneticists have envisaged. When it comes to genetics, “nurture” exerts its effects on “nature” via epigenetics, as we laid out in our book Super Genes.
As powerful as the “code of life” story is, behind the scenes a growing number of geneticists don’t buy into it; in fact, they think we’ve gotten a lot about genes wrong. At the same time, a new, improved picture of human development, based on the interplay of genes and lifestyle, is emerging. This revolution is outlined beautifully in an online article at Nautilus.com titled “It’s the End of Genes as We Know It.” The author, Ken Richardson, is an expert in human development, and he is worried that wildly exaggerated assumptions about the deterministic effects of DNA could lead to social policy that echoes the racism that fueled the eugenics movement decades ago, most notoriously with the Nazi ideology of a master race. As a case in point, Nobel laureate James Watson, who co-discovered the structure of DNA in 1953, was recently stripped of his honors at Cold Spring Harbor Laboratory, where he spent much of his scientific career, after he repeatedly expressed the bigoted opinion that black people and women are genetically less intelligent than others.
There’s an old joke about a man who falls off the Empire State Building. As he passes an office window on the way down, someone shouts, “How are you doing?” and the man answers, “I’m okay so far.” I don’t know anyone who doesn’t laugh at the punchline the first time they hear the joke, but there’s also a wince thinking about the thud that awaits the man at the end.
Science has been okay—so far—in explaining how nature works, riding the crest of success for several centuries now. But the thud is near at hand, as outlined in a very readable, perceptive online article titled “The Blind Spot,” jointly written by two physicists, Adam Frank and Marcelo Gleiser, and a philosopher, Evan Thompson. It’s well worth your time to read it, because the blind spot referred to in the title has been of tremendous but hidden importance in your life.
People have become convinced that there is a spiritual benefit to living in the present. This is a surprising phenomenon, because nothing seems more mundane than the here and now. You wouldn't expect anything special to emerge from the constant flow of seconds, minutes, and hours that fill everyone's life from the moment of birth. There must be a deeper reason for giving the present moment a special value. (As an introduction to the significance of now, please see my recent post, "What Does It Mean to Live in the Present?")
"Now" is a concept that runs deeper than you might suppose. First of all, it cannot be measured by the clock. Before the tick of the clock is over, it has vanished into the past. Likewise, the experience of now as a subjective event is ungraspable by the mind. A thought is gone the instant you think it, and there's an argument from neuroscience that says the words you perceive as a thought are after-effects of the brain activity that created them, since the electrical impulses and chemical reactions inside neurons take fractions of a second, while the words in your head take much longer.
In recent decades the concept of living in the present moment has been widely discussed, prompted by the surprising success of Eckhart Tolle’s 1997 book, The Power of Now. For millions of readers Tolle’s basic thesis, that there is something special about the here and now, came as a spiritual message they could seize upon in daily life.
The power that the present moment possesses, as many people now believe, is its reality. To be in the now means that you are not distracted by memories of the past or expectations about the future. You dwell instead on whatever is right in front of you, applying mental clarity, alertness, and your full attention. Simple enough—until one looks deeper. Young children live in the now. Are they better off for it, considering the years of maturation that lie ahead to bring about full-fledged adulthood? The elderly suffering from dementia typically have severe memory loss, forcing them to live only in the passing moment, and this condition becomes confusing and blank, not to mention a source of distress.
The human potential movement has become a roaring success over the past few decades. Yoga, meditation, the evolution of consciousness, even human potential itself are terms almost everyone knows. But the aura of spirituality hovers around them, which leads scientists to ignore human potential or to relegate it to psychology, considered the softest of soft sciences.
So it is quite startling, and a major leap forward, to find out that human potential deserves its place among the hard sciences. In fact, the five senses, instead of being grossly inferior to modern scientific apparatus, turn out to have abilities ten times greater than anyone ever supposed. In a nutshell, we are quantum detectors, meaning that simply by sight, touch, taste, hearing, and smell we are participating in the finest fabric of Nature, and possibly can cause the quantum field to move at will.
There needs to be a clear rebuff of this notion that human beings are mechanisms, and the fact that science has a wealth of findings about both genes and the brain doesn't make the notion any more valid. The general public isn't aware, for example, that only 5% of disease-related genetic mutations are fully penetrant, which means that having the mutation will definitely cause a given problem. The other 95% of genes raise risk factors and in complex ways interact with other genes.
In a news-driven society more attention is paid to events that will soon fade away than to ideas that could alter civilization. Modern secular society needs the impetus of great ideas to add meaning and purpose to our lives, as religion once did when it was the dominant force around the world. In turbulent times the prospect of a single idea that can transform humanity seems remote.
But just such an idea has arrived. It travels under various tags, the most common being “the one mind.” It’s the notion that there is only one mind in the universe despite the appearance here on Earth of seven billion minds. On the surface the radical possibilities stemming from “the one mind” aren’t obvious. In fact, the last thing anyone would want to give up is the claim to be a unique individual. That’s not what the one mind is about—it’s about expanding into higher consciousness as a practical reality. If humanity shares one mind, and this mind has a cosmic dimension, the very idea begins to cause one’s consciousness to expand.
Fad diets come and go, but officially the subject of nutrition is guided by science. The public stubbornly thinks in terms of "good" foods and "bad" foods, so when the government's nutritional experts issue scientifically based advice, any attempt at a nuanced picture generally gets lost. Recently there were headlines when the highest board for dietary protocols, the Dietary Guidelines Advisory Committee, reversed a government warning about avoiding foods high in cholesterol, which had been in place for nearly 40 years.
The public is likely to shrug off this about-face, or else decide that eggs, the most common food high in cholesterol, are no longer a "bad" food but have moved into the "good" column. This ignores the committee's message, which weighed one thing against another. For people in a normal state of health, saturated fats from animal products pose a higher risk than high cholesterol. This finding is more a shift in focus than an about-face. It's still unhealthy, the majority of nutritionists agree, to eat too much red meat as opposed to eggs, but eggs are high in saturated fat, too, so you shouldn't overdo them, either.
Neuroscience is based on the assumption that the brain produces the mind. After all, without a brain, most of us would be much poorer thinkers. If the brain produces the mind, then it’s important to know how it does it. A team at the University of California Santa Barbara has come up with a new theory—actually, an ancient theory now couched in modern scientific terms—basing mental activity on vibrations.
For a long time it’s been known that various waves of electrical activity are present in the brain, and these waves relate in distinct ways to having a mind. As explained by one of the team members from UC Santa Barbara, “Gamma waves are associated with large-scale coordinated activities like perception, meditation or focused consciousness; beta with maximum brain activity or arousal; and theta with relaxation or daydreaming. These three wave types work together to produce, or at least facilitate, various types of human consciousness…”
More than six decades after Einstein's death in 1955, his prestige is both enormous and worrisome. It is enormous because relativity remains tremendously important, and to this day both the special and the general theories of relativity remain valid. It is worrisome because Einstein harbored a deep skepticism about quantum mechanics, even though quantum mechanics has been validated time after time experimentally, and despite the fact that Einstein himself was one of its founders, receiving the Nobel Prize for explaining the photoelectric effect in quantum terms. The embarrassing fact is that quantum mechanics, which explains the behavior of Nature at the smallest level, cannot be reconciled with general relativity, which explains the behavior of the universe at the largest level. Both are right, but they have not yet been merged.
By now there's widespread acceptance and abundant research to show that the placebo effect is real. In fact, every drug gains some of its effect with many patients by dint of placebo--expecting to get better makes the drug work better. Subtract the placebo effect, and many drugs have little efficacy.
This fact has stared medicine in the face since 1962, when the Food and Drug Administration demanded that every new drug prove its clinical benefits. To subtract the placebo effect, a typical drug test involves giving the control group a sugar pill while the other half of the trial takes the new drug. In the area of painkillers--placebos are at their most powerful with pain--more than 90% of new drugs cannot pass the test of working better than a sugar pill. Among those drugs that do pass, the gap that separates them from sugar pills, which was once 27%, has narrowed to an average of 9%.