Today you can study business psychology: techniques for crowd control and for implanting ideas into the narrow, linear mind. These techniques turn people into willing followers who buy things without reason and believe in artificial narratives. This is the story of the relationship between crowd control and the limited, linear thinking mind, which must be kept suppressed for the trickery to work. It brings to mind Descartes' dictum, updated: I think; therefore I am 'programmable'.
Today's environment overwhelms the boundaries of that limited mind, so most incoming information must be discarded simply to get through the day. Programmed patterns replace real evidence, saving time for the mind's routines. The real bug in the system is the failure to use the holistic, pictorial mind with its 'akashic' access: the ability to experience, in a single moment, the outcomes seen by many beings asking the same question.
Limiting experience in an experience-rich world is not the solution; awakening the coherent, holistic brain is. That capacity cannot be measured by IQ; it is what allows humans to operate beneficially within their environment.
The Science Behind Why People Are So Easily Fooled
The world has changed tremendously since the time of our ancestors. Today, we develop most of our beliefs based on external forces, with very little first-hand experience. Whereas early humans relied on direct sensory experience to shape their beliefs, we now rely on language and our own ability to discern falsehood from truth.
With language, we inevitably receive a plethora of opinions and biases rooted in the speaker's own belief system. Yet we are willing to believe much of it without taking the time to investigate new ideas or seeking to experience them first-hand. What is the reason for this eager credulity, and can we control it?
Society’s Intrusion of the Gullible Brain
The 17th-century philosopher René Descartes formalized the idea that, "if one wishes to know the truth, then one should not believe an assertion until one finds evidence to justify doing so."
This sounds like a reasonable approach to integrating new beliefs. Most of us think ourselves capable of evaluating ideas and making up our own mind. Yet, think about it. When was the last time that you actually took the time to seek out new evidence to help you prove or disprove new ideas?
I’m not just talking about some random fact you saw on a news website. I’m talking about ideas that you receive from everywhere. All of your media outlets, social networks and personal interactions. Truthfully, there’s so much information coming at us all the time, who has time to fact check and research all of it?
Furthermore, how many beliefs has society embedded in our brains since early childhood? These make up most of what we accept as fact, even though we never really took the time to reflect on them.
It is during the formative years of our lives that we establish associations and strong beliefs about key aspects of life. We form our religious beliefs and associations. We establish a foundation of our political views and civil roles. Even more importantly, we adapt to perpetuated ideas of authority and conformity. Finally, we buy into society’s view of what it means to be a human.
All of this happens without any first-hand investigation into whether these societal norms and beliefs are true. Yet, for most of us, these programmed beliefs are the first resource for fact-checking and assessing new ideas and assertions.
Skepticism is Quite Rare, Especially When We’re Distracted
Another philosopher, Benedict Spinoza, questioned Descartes' idea. Spinoza realized that the brain does not process ideas the way Descartes proposed. He suggested that "people believe every assertion they understand but quickly 'unbelieve' those assertions that are found to be at odds with other established facts."
Confirming this theory, new research has shown that our brains are actually naturally willing to believe whatever we feed them. Researchers Daniel T. Gilbert et al. from The University of Texas at Austin conducted an experiment where they presented a set of true and false statements about a crime to study subjects.
The researchers asked one group of participants to read the statements and concurrently find and count the digit 5 as it appeared in the text. The other group was allowed to read the statements uninterrupted.
Afterwards, the researchers asked the participants to recollect which statements were false, and which were true. They also requested that the subjects decide on the jail time for the perpetrator of the crime. The study outcome showed that the group that was also counting recalled more false assertions as true, but not the other way around. They also gave the fictitious perpetrator more jail time.
Thus, Gilbert et al. showed that people are more likely to believe that false assertions are true, especially when they are interrupted. Therefore, this reinforces Spinoza’s theory that people are quick to believe an idea. Yet, the findings introduce the argument that interruption prevents us from “unbelieving” new assertions. Thus, are we really capable of the skepticism that the modern world requires?
The implication of these findings is that the world is distracting. It is fast-moving, flashy, loud and overwhelming. We are connected to people's lives well outside our home and community, ingesting massive amounts of information. Never mind the incessant attempts of advertisers to snag your attention and the constant beckoning of your smartphone. How can we expect our brains to make an uninterrupted assessment of new ideas?
It Takes Cognitive Work to Disbelieve
Spinoza and Gilbert et al. suggest that “belief is first, easy, and inexorable and that doubt is retroactive, difficult, and only occasionally successful.”
Gilbert et al write:
Acceptance, then, may be a passive and inevitable act, whereas rejection may be an active operation that undoes the initial passive acceptance. The most basic prediction of this model is that when some event prevents a person from “undoing” his or her initial acceptance, then he or she should continue to believe the assertion, even when it is patently false. For example, if a person is told that lead pencils are a health hazard, he or she must immediately believe that assertion and only then may take active measures to unbelieve it. These active measures require cognitive work (i.e., the search for or generation of contravening evidence), and if some event impairs the person’s ability to perform such work, then the person should continue to believe in the danger of lead pencils until such time as the cognitive work can be done.
Therefore, in a significant departure from Descartes' theory, Gilbert et al. propose that each event and encounter in your life alters your brain. At times, this change is permanent, unless you have the time and cognitive ability to reflect on the encounter and then decide whether you want to disbelieve the ideas it introduced.
This is why advertising is so effective. Marketers introduce ideas (beliefs) about their products in passing. Typically, you're already distracted by whatever it is that you're doing (driving, watching a show, reading a news piece, etc.). Some might argue that advertisers are creating beliefs in your brain without your permission.
The same can be applied to politics, public schools, and news media. Are all of these entities imposing changes in your belief system against your will? Is the brain really that gullible?
Some would say yes, but let’s go back to Descartes’ idea that one can seek out evidence so s/he can decide to disbelieve an assertion. Gilbert et al write:
People, then, do have the potential for resisting false ideas, but this potential can only be realized when the person has (a) logical ability, (b) correct information, and (c) motivation and cognitive resources.
What I’m getting at here is that we must have the cognitive ability, as well as true information, to help us disbelieve false assertions. Unfortunately, these skills are mostly the function of our education system, as well as organized groups and religious sects. Therefore, society partially controls our ability to distinguish truth from falsehoods. We are told to believe that the information fed to us in schools and churches is truth. But how do we know for sure? And is the system of education teaching children how to think, or just what to think?
Many are concerned that society's influence has stifled our ability to discern falsehoods; that the hive-mind has infected the masses, disparaging skepticism of mainstream beliefs.
Yet, we must not forget our own personal power. Regardless of how gullible the brain really is, we hold the power over our thoughts. This control stems from our willingness (motivation) to reflect on and contemplate ideas.
Our cognitive resources are not bound by our GPA or IQ score. We can expand these abilities through self-education and the rational exchange of ideas with others. Solitude, meditation, reflection, mentorships, first-hand experience – all of this is within our reach. We have access to an endless pool of knowledge. All we need to do is decide. Do we want to ingest only what mainstream society throws at us, or are we willing to seek out our own truth?
About the Author
Anna Hunt is a writer, yoga instructor, mother of three, and lover of healthy food. She's the founder of Awareness Junkie, an online community paving the way for better health and personal transformation. She's also the co-editor at Waking Times, where she writes about optimal health and wellness. Anna spent six years in Costa Rica as a teacher of Hatha and therapeutic yoga. She now teaches at Asheville Yoga Center and is pursuing her Yoga Therapy certification. During her free time, you'll find her on the mat or in the kitchen, creating new kid-friendly superfood recipes.
Source: Gilbert, Daniel T., et al., "You Can't Not Believe Everything You Read," Department of Psychology, The University of Texas at Austin.
This article (The Science Behind Why People Are So Easily Fooled) was originally created and published by Awareness Junkie.
Disclaimer: This article is not intended to provide medical advice, diagnosis or treatment. Views expressed here do not necessarily reflect those of Waking Times or its staff.