
Friday 28 February 2014

Shocking Facts And Myths About Hypnosis

Hypnosis is a controversial subject. The word alone conjures up pictures in many minds: some evil-looking man waving a pocket watch back and forth to put people into trances and make them do things against their will. In the movie Stir of Echoes, starring Kevin Bacon, Bacon's character starts experiencing all sorts of whacked-out problems after being hypnotized. And in nearly every Dracula movie ever produced, the vampires are shown hypnotically staring down beautiful women before moving in for a late-night snack.

But is the real process of hypnosis this evil? Can you get locked into a zombie-like state under hypnosis? Can Satan enter your mind when you are hypnotized? Can you really be forced to do things against your will? Well, stick around and let us examine some of the common misconceptions surrounding the sometimes controversial practice of hypnosis.

Having been a practicing hypnotherapist and stage hypnotist for over 13 years now, and having hypnotized over 15,000 people, I feel I have a pretty firm grasp on hypnotism and its controversies. With that in mind, let us first explore what physically happens to a person experiencing hypnosis.

There are four known brain wave states, or levels of mind: beta, alpha, theta and delta. Beta is the alert, wakeful state of mind, what you are experiencing right now. Then there is alpha, a relaxed physical state paired with a focused mental state. Next is theta: deep physical relaxation with a meditative, dream-like mental awareness. Finally there is delta, or deep sleep.
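If you like to put numbers on things, here is a quick Python sketch of the approximate frequency ranges textbooks usually attach to these four states. The exact cut-offs vary from source to source, so treat them as ballpark figures rather than anything taken from this post.

```python
# Common textbook EEG frequency bands (approximate; boundaries vary by source).
BRAIN_WAVE_BANDS_HZ = {
    "delta": (0.5, 4.0),    # deep sleep
    "theta": (4.0, 8.0),    # deep relaxation, dream-like awareness
    "alpha": (8.0, 12.0),   # relaxed body, focused mind
    "beta":  (12.0, 30.0),  # alert, waking state
}

def classify(dominant_hz: float) -> str:
    """Name the brain wave state a dominant frequency falls into."""
    for state, (low, high) in BRAIN_WAVE_BANDS_HZ.items():
        if low <= dominant_hz < high:
            return state
    return "outside the common bands"

print(classify(6.0))   # theta: the band said to dominate under hypnosis
print(classify(20.0))  # beta: ordinary waking alertness
```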

Under hypnosis a person fluctuates back and forth between the alpha and theta brain wave levels, with theta being the more dominant. It is in this state of mind that intensely focused concentration, imagination, and memory recall are most powerfully experienced. This is the trance experience. A person will sit in a physically relaxed state, even very limply in many cases. To an outside viewer, a person under deep hypnosis can appear limp, loose, and lazy, rather like a human puppet whose strings have been cut, and this is what feeds the belief that hypnosis is a sleep-like state. To the person experiencing hypnotism, however, even though they have allowed themselves to relax physically, the mind is very aware, sharp and alert.

Under hypnosis, the conscious mind is relaxed and the subconscious is triggered and opened up. This can seem kind of spooky to some. Many people fear what can happen to the subconscious when hypnotized. They have this belief that the subconscious is simply a sponge that will absorb all commands from the hypnotist, uncritically. This is simply not true. Let us examine some real instances from my experience as a hypnotist that show this to be false.

Case in point: I was once performing my comedy hypnosis show for a group of high school students and their parents at an after-prom party in Iowa. During the show I gave my hypnotized subjects a command: when I snapped my fingers on the count of three, some wild dance music would start to play, and suddenly they would all think they were the world's greatest dancers, stand up, and start competing against each other in a dance contest. In most shows, when I get to this routine, everyone gets up and dances. In this instance, though, when I snapped my fingers and the music started playing, everyone got up and really started moving, except for one boy.

Now this boy had reacted to every previous hypnotic command I had given him during the show, but when it came to dancing in front of a live audience, he simply sat in his chair and shook his head no. Seeing this reaction, I realized he just really did not want to dance, so I quickly gave him a hypnotic suggestion to be my dance contest judge, which he readily agreed to.

After the show, to confirm my suspicions, I asked him why he had not reacted to the dance routine even though he had responded positively to every other hypnotic command I gave him that night. He simply stated, "I hate to dance in front of others and never will." This proved to me the truth of the common statement that even under hypnosis you cannot be forced to do something against your will.

Let us now address the fearful reaction I have encountered from some hard-line Christians who say hypnosis is "of the Devil." This belief is founded entirely on fear; what has happened to these people is nothing more than a good case of brainwashing from their minister. If it were possible to be possessed by evil spirits under hypnosis, then we would all be dead meat. Did you know that every time you watch TV and slip into one of those TV trances, you are in the hypnotic state? You know what I am talking about: you are so absorbed in the program that you do not even react to the person right next to you, who has to ask you a question three or four times before you become aware of what they are saying and finally respond. Well, guess what, that is because you are hypnotized. So let me ask you this: are we all walking around possessed by the Devil? Certainly not.

The bottom line is this: hypnosis is, and always will be, a somewhat controversial subject. I simply ask you to explore it for yourself. Approach it without fear or prejudice and then decide for yourself what it is really all about.


Wednesday 26 February 2014

5 Reasons Why Television Is So Addictive

It’s okay, we understand. You likely have over five shows on TiVo, and your Sunday nights are devoted to Downton Abbey and Girls. You’re eagerly awaiting Sherlock, which returns on Jan. 19, and we know you’re on top of House of Cards’ release on Netflix on Valentine’s Day (forget restaurant reservations–you’ll be on the couch).
When it comes to today’s top shows, it’s about more than just good TV. We are literally obsessed. But what makes them so hard to turn off? Here’s our list of what is getting you hooked — and keeping you that way.
1. Rapid-fire scenes — you can’t look away 
You may have noticed that many shows jump quickly from one scene to another, or flit between characters in the same scene. That’s all designed to keep you glued to the screen, says Robert Kubey, a psychologist and professor of journalism and media studies at Rutgers University in New Jersey. “We are talking about the rapid cutting or the quick montage,” he says. According to his research, rapid scene changes are especially engaging to watch, and that can lead to zoning out (wait, it’s four hours later already?). Commercials take full advantage of this tactic, flashing multiple images within a few seconds to grab and hold your attention – look away and you miss something. Watching a person drink coffee in a coffee shop is not as effective in drawing us in as, say, switching back and forth between characters in a conversation, or an epic battle scene that cuts quickly from one gory assault to another. Kubey says this reaction is wired into our biology: it’s called the orienting reflex, our ability to react to movements around us, like a fly avoiding the swat of a hand. These scenes trigger the orienting reflex, and we become so engaged with what’s happening that it’s physically hard for us to look away.
2. The controlling director
Research from Princeton University found that the more a director controls a viewer’s focus in a scene, the more engaged his audience becomes. Psychologist Uri Hasson and his colleagues took fMRI images of the brains of viewers who watched clips from The Good, the Bad and the Ugly, Curb Your Enthusiasm, Bang! You’re Dead and an unedited shot of Washington Square Park in New York City. They monitored how similar the brain activity of the viewers was, in order to get a sense of which scenes captured their attention. Only 5% of the participants shared similar reactions to the Washington Square Park clip, 18% had the same activity patterns in response to Curb Your Enthusiasm, and 45% had the same brain patterns in response to The Good, the Bad and the Ugly, but 65% had synced brain activity in response to Alfred Hitchcock’s Bang! You’re Dead. The researchers, who published their findings in the journal Projections, concluded that the more controlling a director is with a scene, the more engaging, and potentially hard to avoid, it is. Every scene in a Hitchcock movie, for example, is intentional and planned out. He points you exactly where he wants you to look, and there is very little variability in what viewers watch. House of Cards employs a similar technique; its directors dictate where they want viewers to look – when Kevin Spacey addresses the audience, everything else fades into the background, and it’s just him and you.
3. What happens next?! The cliffhanger
This is the most obvious, and perhaps the most common, tactic that TV and movie producers use. But it always works. In the final scene of the Sherlock season 2 finale, Sherlock falls to his apparent death, only to be spotted again, watching his partner Dr. Watson lament at Sherlock’s tombstone. Wait, is he really alive? According to Kubey, TV shows even use mini-cliffhangers before each commercial break to make sure the viewer doesn’t change channels. “The cliffhanger makes you want to come back and ask for more,” he says. Even in the age of recording TV shows or streaming on Netflix, there’s nothing like leaving viewers hanging to keep them hooked.
4. Sex — blatant, implied, or anticipated
Humans are hard-wired to respond to sex and violence – but most strongly to sex. Both, after all, are critical to our survival. “It’s dramatic whether two people in a scene are going to consummate the relationship,” says Kubey. “There’s a suspense, and it’s arousing and hard to pull away.” Sex appeals to us on a base level, since it’s our means to procreate, and we get a kick out of watching it too – in fact, it’s one of our favorite things to watch, sometimes for hours (think porn).
5. Violence…we find it oddly appealing
Violence, on the other hand, is hard for some to watch. So what is it about violence and the danger it represents that attracts people to such content? One recent study from researchers at the University of Augsburg in Germany and the University of Wisconsin-Madison found that people are more likely to watch movies with gory, violent scenes if they think there is meaning or purpose behind the violence – say, revenge or justice. “Perhaps depictions of violence that are perceived as meaningful, moving and thought-provoking can foster empathy with victims, admiration for acts of courage and moral beauty in the face of violence, or self-reflection with regard to violent impulses,” said study author Anne Bartsch in a statement. Other research suggests that people are not necessarily attracted to violence itself, but may be drawn to such content because, much as with sex scenes, they enjoy the anticipation, thrill and suspense.
http://www.time.com/time/

Tuesday 25 February 2014

Does True Altruism Exist?

Altruism has been thought of as an ego defense, a form of sublimation in which a person copes with his anxiety by stepping outside himself and helping others. By focusing on the needs of others, people in altruistic vocations such as medicine or teaching may be able to permanently push their needs into the background, and so never have to address or even to acknowledge them. Conversely, people who care for a disabled or elderly person may experience profound anxiety and distress when this role is suddenly removed from them.

Altruism as an ego defence should be distinguished from true altruism—the former being primarily a means to cover up uncomfortable feelings, the latter primarily a means to some external end such as alleviating hunger or poverty. However, many psychologists and philosophers have argued that there is, in fact, no such thing as true altruism. In The Dawn, the 19th-century philosopher Friedrich Nietzsche maintains that that which is erroneously called ‘pity’ is not selfless but variously self-motivated.

Nietzsche is in effect agreeing with Aristotle, who in the Rhetoric defines pity as a feeling of pain caused by a painful or destructive evil that befalls one who does not deserve it, that might well befall us or one of our friends, and, moreover, that might befall us soon. Aristotle surmises that pity cannot be felt by those with absolutely nothing to lose, nor by those who feel that they are beyond all misfortune.

In an interesting and insightful aside, Aristotle adds that a person feels pity for those who are like him and for those with whom he is acquainted, but not for those who are very closely related to him and for whom he feels as he does for himself. Indeed, says Aristotle, the pitiful should not be confounded with the terrible: a man may weep at the sight of his friend begging, but not at that of his son being led to death.

Altruistic acts are self-interested, if not because they relieve anxiety, then perhaps because they lead to pleasant feelings of pride and satisfaction, the expectation of honor or reciprocation, or the greater likelihood of a place in heaven; and even if none of the above, then at least because they relieve unpleasant feelings such as the guilt or shame of not having acted at all.

This argument has been attacked on various grounds, but most gravely on the grounds of circularity—it assumes that altruistic acts are performed for selfish reasons in order to conclude that they must be performed for selfish reasons. The bottom line, I think, is this. There can be no such thing as an ‘altruistic’ act that does not involve some element of self-interest—no such thing, for example, as an altruistic act that does not lead to some degree, no matter how small, of pride or satisfaction. Therefore, an act should not be written off as selfish or self-motivated simply because it includes some inevitable element of self-interest. The act can still be counted as altruistic if the ‘selfish’ element is accidental; or, if not accidental, then secondary; or, if neither accidental nor secondary, then undetermining.

Need this imply that Aristotle is incorrect in holding that pity cannot be felt by those with absolutely nothing to lose, or by those who feel that they are beyond all misfortune? Not necessarily—although an altruistic act is often driven by pity, this need not be the case, and altruism and pity should not be amalgamated and then confounded with one another. Thus, it is perfectly possible for someone lying on his deathbed and at the very brink of death, who is compos mentis and whose reputation is already greatly assured, to gift all or most of his fortune to some deserving cause—not out of pity, which he may or may not be beyond feeling, but simply because he thinks that, all things considered, it is the right thing to do. In fact, this goes to the very heart of ancient virtue, which can be defined as the perfection of our nature through the triumph of reason over passion. The truly altruistic act is the virtuous act, and the virtuous act is, always, the rational act.

Sunday 23 February 2014

Why Generation 'Y' Is Unhappy

Say hi to Lucy.
Lucy is part of Generation Y, the generation born between the late 1970s and the mid 1990s. She's also part of a yuppie culture that makes up a large portion of Gen Y.
I have a term for yuppies in the Gen Y age group -- I call them Gen Y Protagonists & Special Yuppies, or GYPSYs. A GYPSY is a unique brand of yuppie, one who thinks they are the main character of a very special story.
So Lucy's enjoying her GYPSY life, and she's very pleased to be Lucy. Only issue is this one thing:
Lucy's kind of unhappy.
To get to the bottom of why, we need to define what makes someone happy or unhappy in the first place. It comes down to a simple formula:

[image: Happiness = Reality - Expectations]
It's pretty straightforward -- when the reality of someone's life is better than they had expected, they're happy. When reality turns out to be worse than the expectations, they're unhappy.
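Taken literally for a moment, the formula is just a subtraction. Here is a toy sketch in Python, with scores on an invented 0-to-10 scale used purely for illustration:

```python
def happiness(reality: float, expectations: float) -> float:
    """The post's formula: happiness = reality - expectations."""
    return reality - expectations

# Invented scores on an arbitrary 0-10 scale, purely for illustration.
print(happiness(reality=7, expectations=5))   #  2 -> life beat expectations: happy
print(happiness(reality=7, expectations=9))   # -2 -> Lucy's predicament: unhappy
```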
To provide some context, let's start by bringing Lucy's parents into the discussion:
Lucy's parents were born in the '50s -- they're Baby Boomers. They were raised by Lucy's grandparents, members of the G.I. Generation, or "the Greatest Generation," who grew up during the Great Depression and fought in World War II, and were most definitely not GYPSYs.

Lucy's Depression Era grandparents were obsessed with economic security and raised her parents to build practical, secure careers. They wanted her parents' careers to have greener grass than their own, and Lucy's parents were brought up to envision a prosperous and stable career for themselves. Something like this:
[image: a lush, green lawn representing a secure, prosperous career]
They were taught that there was nothing stopping them from getting to that lush, green lawn of a career, but that they'd need to put in years of hard work to make it happen.
After graduating from being insufferable hippies, Lucy's parents embarked on their careers. As the '70s, '80s, and '90s rolled along, the world entered a time of unprecedented economic prosperity. Lucy's parents did even better than they expected to. This left them feeling gratified and optimistic.

With a smoother, more positive life experience than that of their own parents, Lucy's parents raised Lucy with a sense of optimism and unbounded possibility. And they weren't alone. Baby Boomers all around the country and world told their Gen Y kids that they could be whatever they wanted to be, instilling the special protagonist identity deep within their psyches.
This left GYPSYs feeling tremendously hopeful about their careers, to the point where their parents' goals of a green lawn of secure prosperity didn't really do it for them. A GYPSY-worthy lawn has flowers.
This leads to our first fact about GYPSYs:
GYPSYs Are Wildly Ambitious
The GYPSY needs a lot more from a career than a nice green lawn of prosperity and security. The fact is, a green lawn isn't quite exceptional or unique enough for a GYPSY. Where the Baby Boomers wanted to live The American Dream, GYPSYs want to live Their Own Personal Dream.
Cal Newport points out that "follow your passion" is a catchphrase that has only gotten going in the last 20 years, according to Google's Ngram viewer, a tool that shows how prominently a given phrase appears in English print over any period of time. The same Ngram viewer shows that the phrase "a secure career" has gone out of style, just as the phrase "a fulfilling career" has gotten hot.
[images: Google Ngram charts for the phrases discussed above]
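If you want to poke at the same data yourself, the sketch below queries the JSON endpoint that has historically served chart data to the Ngram Viewer web page. Be warned: this endpoint is unofficial and undocumented, so the URL, parameters and response shape used here are assumptions that could break at any time.

```python
# Query the (unofficial, undocumented) endpoint behind the Google Ngram Viewer.
# Assumption: https://books.google.com/ngrams/json returns a list of
# {"ngram": ..., "timeseries": [...]} objects; this may change without notice.
import requests

def ngram_series(phrase: str, start: int = 1940, end: int = 2008) -> list:
    resp = requests.get(
        "https://books.google.com/ngrams/json",
        params={"content": phrase, "year_start": start,
                "year_end": end, "smoothing": 3},
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    return data[0]["timeseries"] if data else []

for phrase in ("follow your passion", "a secure career", "a fulfilling career"):
    series = ngram_series(phrase)
    if series:
        print(phrase, "-> latest smoothed frequency:", series[-1])
    else:
        print(phrase, "-> no data returned")
```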

To be clear, GYPSYs want economic prosperity just like their parents did -- they just also want to be fulfilled by their career in a way their parents didn't think about as much.
But something else is happening too. While the career goals of Gen Y as a whole have become much more particular and ambitious, Lucy has been given a second message throughout her childhood as well:
[image: Lucy being told how special she is]

This would probably be a good time to bring in our second fact about GYPSYs:
GYPSYs Are Delusional
"Sure," Lucy has been taught, "everyone will go and get themselves some fulfilling career, but I am unusually wonderful and as such, my career and life path will stand out amongst the crowd." So on top of the generation as a whole having the bold goal of a flowery career lawn, each individual GYPSY thinks that he or she is destined for something even better --
A shiny unicorn on top of the flowery lawn. 


So why is this delusional? Because this is what all GYPSYs think, which defies the definition of special:

spe·cial | ˈspeSHəl |
adjective
better, greater, or otherwise different from what is usual.
According to this definition, most people are not special -- otherwise "special" wouldn't mean anything.
Even right now, the GYPSYs reading this are thinking, "Good point... but I actually am one of the few special ones" -- and this is the problem.
A second GYPSY delusion comes into play once the GYPSY enters the job market. While Lucy's parents' expectation was that many years of hard work would eventually lead to a great career, Lucy considers a great career an obvious given for someone as exceptional as she, and for her it's just a matter of time and choosing which way to go. Her pre-workforce expectations look something like this:

[image: Lucy's expected career trajectory]
Unfortunately, the funny thing about the world is that it turns out to not be that easy of a place, and the weird thing about careers is that they're actually quite hard. Great careers take years of blood, sweat and tears to build -- even the ones with no flowers or unicorns on them -- and even the most successful people are rarely doing anything that great in their early or mid-20s.
But GYPSYs aren't about to just accept that.
Paul Harvey, a University of New Hampshire professor and GYPSY expert, has researched this, finding that Gen Y has "unrealistic expectations and a strong resistance toward accepting negative feedback," and "an inflated view of oneself." He says that "a great source of frustration for people with a strong sense of entitlement is unmet expectations. They often feel entitled to a level of respect and rewards that aren't in line with their actual ability and effort levels, and so they might not get the level of respect and rewards they are expecting."
For those hiring members of Gen Y, Harvey suggests asking the interview question, "Do you feel you are generally superior to your coworkers/classmates/etc., and if so, why?" He says that "if the candidate answers yes to the first part but struggles with the 'why,' there may be an entitlement issue. This is because entitlement perceptions are often based on an unfounded sense of superiority and deservingness. They've been led to believe, perhaps through overzealous self-esteem building exercises in their youth, that they are somehow special but often lack any real justification for this belief."
And since the real world has the nerve to consider merit a factor, a few years out of college Lucy finds herself here:
[image: Lucy's reality falling far short of her expectations]
Lucy's extreme ambition, coupled with the arrogance that comes along with being a bit deluded about one's own self-worth, has left her with huge expectations for even the early years out of college. And her reality pales in comparison to those expectations, leaving her "reality - expectations" happy score coming out at a negative.
And it gets even worse. On top of all this, GYPSYs have an extra problem that applies to their whole generation:
GYPSYs Are Taunted
Sure, some people from Lucy's parents' high school or college classes ended up more successful than her parents did. And while they may have heard about some of it from time to time through the grapevine, for the most part they didn't really know what was going on in too many other people's careers.
Lucy, on the other hand, finds herself constantly taunted by a modern phenomenon: Facebook Image Crafting.
Social media creates a world for Lucy where A) what everyone else is doing is very out in the open, B) most people present an inflated version of their own existence, and C) the people who chime in the most about their careers are usually those whose careers (or relationships) are going the best, while struggling people tend not to broadcast their situation. This leaves Lucy feeling, incorrectly, like everyone else is doing really well, only adding to her misery:
[image: everyone else's inflated lives making Lucy feel even worse]

So that's why Lucy is unhappy, or at the least, feeling a bit frustrated and inadequate. In fact, she's probably started off her career perfectly well, but to her, it feels very disappointing.
Here's my advice for Lucy:
1) Stay wildly ambitious. The current world is bubbling with opportunity for an ambitious person to find flowery, fulfilling success. The specific direction may be unclear, but it'll work itself out -- just dive in somewhere.
2) Stop thinking that you're special. The fact is, right now, you're not special. You're another completely inexperienced young person who doesn't have all that much to offer yet. You can become special by working really hard for a long time.
3) Ignore everyone else. Other people's grass seeming greener is no new concept, but in today's image crafting world, other people's grass looks like a glorious meadow. The truth is that everyone else is just as indecisive, self-doubting, and frustrated as you are, and if you just do your thing, you'll never have any reason to envy others.

Friday 21 February 2014

The Importance Of Childhood Play

I’m a research bio-psychologist with a PhD, so I’ve done lots of school. I’m a pretty good problem-solver, in my work and in the rest of my life, but that has little to do with the schooling I’ve had. I studied algebra, trig, calculus and various other maths in school, but I can’t recall ever facing a problem – even in my scientific research – that required those skills. What maths I’ve used was highly specialised and, as with most scientists, I learnt it on the job.

The real problems I’ve faced in life include physical ones (such as how to operate a newfangled machine at work or unblock the toilet at home), social ones (how to get that perfect woman to be interested in me), moral ones (whether to give a passing grade to a student, for effort, though he failed all the tests), and emotional ones (coping with grief when my first wife died or keeping my head when I fell through the ice while pond skating). Most problems in life cannot be solved with formulae or memorised answers of the type learnt in school. They require the judgement, wisdom and creative ability that come from life experiences. For children, those experiences are embedded in play.

I’m lucky. I grew up in the United States in the 1950s, at the tail end of what the historian Howard Chudacoff refers to as the “golden age” of children’s free play. The need for child labour had declined greatly, decades earlier, and adults had not yet begun to take away the freedom that children had gained. We went to school, but it wasn’t the big deal it is today. School days were six hours long, but (in primary school) we had half-hour recesses in the morning and afternoon, and an hour at lunch. Teachers may or may not have watched us, from a distance, but if they did, they rarely intervened. We wrestled on the school grounds, climbed trees in the adjacent woods, played with knives and had snowball wars in winter – none of which would be allowed today at any state-run school I know of. Out of school, we had some chores and some of us had part-time jobs such as paper rounds (which gave us a sense of maturity and money of our own); but, for the most part, we were free – free to play for hours each day after school, all day on weekends, and all summer long. Homework was non-existent in primary school and minimal in secondary school. There seemed to be an implicit understanding, then, that children need lots of time and freedom to play.

I’m writing, here, in response to the news that the independent School Teachers Review Body is due to report back this week to Michael Gove on his plan to make school days longer and holidays shorter. The Education Secretary’s hope is that more hours in school will raise test scores in the UK to the level of those in China, Singapore and other East Asian nations. Paradoxically, Gove’s proposal has appeared just a few months after the Chinese ministry of education issued a report – entitled Ten Regulations to Lessen Academic Burden for Primary School Students – calling for less time in school, less homework and less reliance on test scores as a means of evaluating schools.

Educators in East Asian nations have increasingly been acknowledging the massive failure of their educational systems. According to the scholar and author Yong Zhao, who is an expert on schools in China, a common Chinese term used to refer to the products of their schools is gaofen dineng, which essentially means good at tests but bad at everything else. Because students spend nearly all of their time studying, they have little opportunity to be creative, discover or pursue their own passions, or develop physical and social skills. Moreover, as revealed by a recent large-scale survey conducted by British and Chinese researchers, Chinese schoolchildren suffer from extraordinarily high levels of anxiety, depression and psychosomatic stress disorders, which appear to be linked to academic pressures and lack of play.

The main focus of my own recent research is on the value of play for children’s development. All mammals play when they are young and those that have the most to learn play the most. Carnivores play more than herbivores, because hunting is harder to learn than grazing. Primates play more than other mammals, because their way of life depends more on learning and less on fixed instincts than does that of other mammals. Human children, who have the most to learn, play far more than any other primates when they are allowed to do so. Play is the natural means by which children and other young mammals educate themselves. In hunter-gatherer bands, children are allowed to play and explore in their own chosen ways all day long, every day, because the adults understand that this is how they practise the skills that they must acquire to become effective adults.

The most important skills that children everywhere must learn in order to live happy, productive, moral lives are skills that cannot be taught in school. Such skills cannot be taught at all. They are learned and practised by children in play. These include the abilities to think creatively, to get along with other people and cooperate effectively, and to control their own impulses and emotions.

My bet is that Gove would agree that now, even more than in the past, creativity is a key to economic success. We no longer need people to follow directions in robot-like ways (we have robots for that), or to perform routine calculations (we have computers for that), or to answer already-answered questions (we have search engines for that). But we do need people who can ask and seek answers to new questions, solve new problems and anticipate obstacles before they arise. These all require the ability to think creatively. The creative mind is a playful mind.

All young children are creative. In their play and self-directed exploration they create their own mental models of the world around them and also models of imaginary worlds. Adults whom we call geniuses are those who somehow retain and build upon that childlike capacity throughout their lives. Albert Einstein said his schooling almost destroyed his interest in mathematics and physics, but he recovered it when he left school. He referred to his innovative work as “combinatorial play”. He claimed that he developed his concept of relativity by imagining himself chasing a sunbeam and catching up with it, and then thinking about the consequences. We can’t teach creativity, but we can drive it out of people through schooling that centres not on children’s own questions but on questions dictated by an imposed curriculum that operates as if all questions have one right answer and everyone must learn the same things.

Even more important than creativity is the capacity to get along with other people, to care about them and to co-operate effectively with them. Children everywhere are born with a strong drive to play with other children and such play is the means by which they acquire social skills and practise fairness and morality. Play, by definition, is voluntary, which means that players are always free to quit. If you can’t quit, it’s not play. All players know that, and so they know that to keep the game going, they must keep the other players happy. The power to quit is what makes play the most democratic of all activities. When players disagree about how to play, they must negotiate their differences and arrive at compromises. Each player must recognise the capacities and desires of the others, so as not to hurt or offend them in ways that will lead them to quit. Failure to do so would end the game and leave the offender alone, which is powerful punishment for not attending to the others’ wishes and needs. The most fundamental social skill is the ability to get into other people’s minds, to see the world from their point of view. Without that, you can’t have a happy marriage, or good friends, or co-operative work partners. Children practise that skill continuously in their social play.

In play, children also learn how to control their impulses and follow rules. All play – even the wildest-looking varieties – has rules. A play-fight, for example, differs from a real fight in that the former has rules and the latter doesn’t. In the play-fight you cannot kick, bite, scratch, or really hurt the other person; and if you are the larger and stronger of the two, you must take special care to protect the other from harm. While the goal of a real fight is to end it by driving the other into submission, the goal of a play-fight is to prolong it by keeping the other happy. In sociodramatic play – the kind of imaginary play exemplified by young children’s games of “house” or pretending to be superheroes – the primary rule is that you must stay in character. If you are the pet dog, you must bark instead of talk and you move around on all fours no matter how uncomfortable that might be. If you are Wonder Woman and you and your playmates believe that Wonder Woman never cries, you must refrain from crying if you fall and hurt yourself. The art of being a human being is the art of controlling impulses and behaving in accordance with social expectations.

[photo caption: Games over – there seemed to be an implicit understanding in the 1950s, when this photo was taken, that children needed lots of time and freedom to play]

Play is also a means by which children (and other young mammals) learn to control fear. Young mammals of many species play in ways that look dangerous. Goat kids romp along the edges of cliffs; young monkeys chase one another from branch to branch in trees, high enough up that a fall would hurt; and young chimpanzees play a game of dropping from high up and then catching themselves on a lower branch just before they hit the ground. Young humans also play in such ways when free to do so. Why? Apparently, the slight risks involved are outweighed by gains. They are dosing themselves with the maximum levels of fear that they can tolerate without panicking, and they are learning to control their bodies in the face of that fear – an ability that may one day save their lives.

Children also play in ways that elicit anger. One youngster may accidentally hurt another in the rough and tumble, or negotiations about the rules of a game may fail, or teasing that was at first in fun may go too far. But for the fun to continue, the anger must be controlled. To keep the game going in such situations, the players must react assertively, to stop the offending behaviour, without physically attacking or throwing a tantrum, either of which would bring play to an end. In this way, children learn to control their anger.

Researchers have raised young monkeys and rats in ways such that they are allowed other types of social interactions but are deprived of play. When these animals are tested, in young adulthood, they are emotional cripples. When placed in a moderately frightening environment, they overreact with fear. They panic and freeze in a corner and never explore the environment and overcome the fear as a normal monkey or rat would. When placed with an unfamiliar peer, they may alternate between panic and inappropriate, ineffective aggression. They are incapable of making friends.

Some people object, on moral grounds, to experiments in which young animals are deprived of play. What a cruel thing to do. But consider this: over the past 50 to 60 years, we have been continuously decreasing the opportunities for our own children to play. School became more onerous, as breaks were reduced, homework piled up, and pressure for high grades increased. Outside school, adult-directed sports (which are not truly play) began to replace impromptu games (which are play). Children began to take classes out of school, rather than pursue hobbies on their own. “Play dates”, with adults present, replaced unsupervised neighbourhood play, and adults began to feel it was their duty to intervene rather than let children solve their own problems. These changes have been gradual, imperceptible, but over time they have been enormous. They have been caused by a constellation of social factors, including the spread of parents’ fears, the rise of experts who are continuously warning us about dangers, the decline of cohesive neighbourhoods and the rise of a school-centric, or “schoolish”, take on child development – the view that children learn more from teachers and other adult directors than they do from one another.

This dramatic decline in children’s opportunities to play has been accompanied by an equally dramatic increase in childhood mental disorders. It’s not just that we are detecting such disorders where we failed to look before; the increase is real. Clinical assessment questionnaires, which have been administered to normative groups in unchanged form over the years, show that rates of clinically significant depression and anxiety in US schoolchildren are now five to eight times what they were in the 1950s. Other research indicates that empathy has been declining and narcissism increasing, ever since valid measures of these were first developed in the late 1970s. There are even well-validated ways of assessing creative thinking, and research using these tools suggests that such thinking has been decreasing among schoolchildren at all grade levels over the past 30 years. All of these deleterious changes, accompanying the decline of play, are exactly what we would predict from our knowledge of play’s purposes.

No, our children don’t need more school. They need more play. If we care about our children and future generations, we must reverse the horrid trend that has been occurring over the past half century. We must give childhood back to children. Children must be allowed to follow their inborn drives to play and explore, so that they can grow into intellectually, socially, emotionally and physically strong and resilient adults. The Chinese are finally beginning to realise this, and so should we.

Sunday 16 February 2014

VIDEO Buddhism: A Science Of Mind


Friday 14 February 2014

'Trance' Film Star Converts To Hypnosis

Rosario Dawson revealed tonight she is a convert to hypnotherapy after trying it for new film Trance.
The Sin City star looked entrancing on the red carpet in a glittering red gown by British designer Jenny Packham, at the premiere of Danny Boyle's new thriller in London's Leicester Square.

Dawson's co-stars Vincent Cassel and James McAvoy both claimed they had tried hypnosis as research for the film, but it hadn't worked.

And Boyle said he didn't even try it because "directors are control freaks and I don't think they ever relax enough to be a useful subject for a hypnotist".

But Dawson, who plays a hypnotherapist in the film, said: "I think it works actually, quite well. I didn't have anything like cigarettes to quit or anything like that.

"But I remember I went in and gave her an idea of what I wanted to work on in the session. And I laid down and she put a blanket over me because when you go into a trance state it's sort of like being in between sleep and awake, so your body thinks it's falling asleep. So I got cold and then I relaxed and I went along with her voice and it was really comforting. My body did those little spasms that it does when you're falling asleep.

"And then I woke up and she said 'I know you said you wanted to work on this, but you reacted to this, this and this.' And I was like, 'How can you know all that?!' And she said my foot was kicking when she asked me about a certain thing.

"So it's actually quite interesting that your conscience will reveal itself, and if you're trained to read that, people give a lot of tells. It's almost like poker. We think we're hiding things, but we're not.

"So it was really great for me to experience that, because the premise of the film is that hypnotherapy works on such a strong level.

"So it was really necessary for me to believe that, and having been through that I gained a lot of respect for the profession, I have to say."

http://www.independent.co.uk/arts-entertainment/films/news/rosario-dawson-converted-to-hypnotherapy-after-starring-in-trance-8541723.html

Wednesday 12 February 2014

Do Dreams Affect Decision Making?

Need to sleep on that big decision? Your dreams might influence your final choice, suggests new research.

Scientists disagree as to what extent dreams reflect subconscious desires, but new research reported in the Journal of Personality and Social Psychology (Vol. 96, No. 2) concludes that dreams do influence people's decisions and attitudes.

Social psychologists Carey Morewedge, PhD, at Carnegie Mellon University, and Michael Norton, PhD, at Harvard University, conducted studies to find out how people respond to their dreams. Their study of people in the United States, South Korea and India found that 56 percent, 65 percent and 74 percent of respondents, respectively, believe that dreaming reveals hidden truths.

The researchers then wanted to know whether dreams could influence people's decision-making. They asked 182 Boston commuters to consider which of four scenarios would most likely change their flight plans: the government raising the national threat level; consciously imagining a plane crash; learning that an actual flight had crashed along their route; or dreaming about a plane crash. Commuters said the dream would be just as unsettling as a real crash, and more unsettling than consciously imagining a crash or a government warning.

People also seem to selectively find meaning in their dreams based on their biases, Morewedge says. In another study, the researchers asked people of assorted religious beliefs to imagine that God spoke to them in a dream and told them either to travel the world or go work in a leper colony. The very faithful said that either dream would be meaningful to them, while the more agnostic said the travel commandment might be somewhat meaningful, but not the leper colony commandment.

These experiments gauge people's attitudes, not their behaviors, but Morewedge thinks one follows the other. In a different study, he found that 68 percent of people believe their dreams can predict the future. If they believe that, and they have a dream that weighs heavily on them, "it becomes a self-fulfilling prophecy," Morewedge says.

Tuesday 11 February 2014

Winter Olympics Sport Psychology: The Individuals And The Team

The debut of the team figure skating event Thursday at the Olympics didn’t go so well for the Americans. After less-than-stellar performances by both the men’s and pairs entrants, the U.S., a gold medal favorite going into the Games, is in danger of not making the second round of competition.

That’s the danger of a team event – or, in this case, a hybrid team event – in which athletes compete as individuals but their standing depends on the performance of others. In team figure skating, each discipline – ladies, men, pairs, and ice dance – accumulates points for each team’s total. Only those with the top five point totals go on to skate off for a medal in the finals.

Tying their fate to the performance of others isn’t something that figure skaters are accustomed to doing. The same goes for the singles luge athletes, who are used to racing the clock alone on their sleds. “Every elite athlete has to have a certain degree of selfishness,” says Mark Aoyagi, director of sport and performance psychology at the University of Denver. “They have to look out for themselves by definition to become elite, at a level that normal people don’t understand.” Shifting that self-centered focus and accepting that their fate may now be in the hands of others may, he says, “be a hard pill to swallow.”

There’s no data suggesting that people who are more me-focused tend to gravitate toward solo sports and shun team endeavors, but those who pick up tennis or ski jumping or figure skating do tend to be more independent and self-reliant. Participating in a team event requires a shift in that mentality to accept that others may have different ways of training, different ways of preparing, and vastly different ways of handling stress and competing.

“It’s a different kind of pressure,” says Cory Newman, director of the center for cognitive therapy and professor of psychology at the University of Pennsylvania Perelman School of Medicine. “Now it’s not just your own face you risk falling on if you have a hard time, but there are other people invested in you, and depending on you.”

How potentially paralyzing, or helpful, that pressure can be depends on the rest of your teammates. With the right atmosphere, he says, the experience can be inspiring and uplifting, and there is evidence that such collaborative efforts can even lead to better performances. In a 2002 study, researchers from the University of Western Ontario and Brock University found, for example, that strongly cohesive groups of athletes who performed individually but had their scores pooled into a team effort performed better than comparably cohesive teams in traditional team sports such as basketball and football. Aoyagi says it’s possible that because these teams didn’t normally function as a team, they took more time to address ways to bring the individual athletes together to foster a sense of camaraderie and fellowship.

One way to ease the transition is to help athletes see the change as a positive, which many do. For one, in a team event there is less pressure on each individual since each performance becomes part of a group score. There’s also the support of your teammates to celebrate victories and commiserate in defeats. “There must be an inherent loneliness in the life of a figure skater,” says Newman, “since everybody is your competitor. It must be nice to have an opportunity in which not everybody is your competitor, but there to help you.”

But what happens if things don’t work out well, and some athletes bring the entire group down? That’s what the U.S. team faces now. For sports psychologists, that’s an opportunity to ensure that the instinctively independent athletes don’t revert to their me-first mentality, and, as Aoyagi says, “look for the first fire exit to get out and protect themselves and their ego,” but instead remain committed to the team. Agreeing to be part of a team, as any player knows, requires sacrifice, and for self-minded athletes, that may include accepting that there is a risk that the outcomes may not always reflect their own abilities. “You have to be willing to compromise that things might not work out the way you like it,” says Gregory Dale, director of the sports psychology and leadership program at Duke University. “And you need to understand that your teammates are not going to do anything to screw things up on purpose. They’re in it like you are.”

Coming to that understanding can be easier if the team builds cohesiveness before the competition; some go-to techniques respected coaches have used include bringing athletes outside of their training setting to share a meal or participate in an activity that will allow them to collaborate and learn more about each other so they can establish trust and respect. Mike Candrea, who heads up the softball teams at the University of Arizona and brought together players from around the country for the U.S. Olympic women’s softball team, used holiday parties to bring his players together. Others have turned to rock climbing or other activities that force athletes to communicate and rely on each other. “Trust is the foundation of any dynamic in a team,” says Dale.

And that trust, say sports psychologists, will be the key to determining which teams of normally self-minded athletes pull ahead of the others. “If you look at it the right way,” says Aoyagi, “research shows that even if a team is made up of individual athlete performances, they can truly be a team, and that does lead to a better outcome.”

Read more: Olympics 2014: Figure Skating and the Psychology of Team Events | TIME.com http://healthland.time.com/2014/02/08/olympics-2014-team-events-and-when-the-gold-medal-is-out-of-your-hands/

Sunday 9 February 2014

VIDEO 7 Billion People: Are You Typical?


Friday 7 February 2014

What Is Munchausen's Syndrome?

Munchausen's syndrome is a psychological and behavioural condition where someone pretends to be ill or induces symptoms of illness in themselves.

Munchausen's syndrome is also sometimes known as factitious disorder.

In people with Munchausen's syndrome:
  • they intentionally produce or pretend to have physical or psychological symptoms of illness
  • their main intention is to assume the ‘sick role’ – to have people care for them and be the centre of attention
  • there is no practical benefit for them in pretending to be sick – for example, claiming incapacity benefit

Munchausen's syndrome is named after a German aristocrat, Baron Munchausen, who became famous for telling wild, unbelievable tales about his exploits and past.

Types of behaviour
People with Munchausen's syndrome can show different types of behaviour including:
  • pretending to have psychological symptoms – for example, claiming to hear voices or claiming to see things that are not really there
  • pretending to have physical symptoms – for example, claiming to have chest pain or stomach ache
  • actively seeking to make themselves ill – such as deliberately infecting a wound by rubbing dirt into it

Some people with Munchausen's syndrome may spend years travelling from hospital to hospital feigning a wide range of illnesses. When it is discovered they are lying, they may suddenly leave hospital and move to another district.

People with Munchausen's syndrome can be very manipulative and, in the most serious cases, may undergo painful and sometimes life-threatening surgery, even though they know it is unnecessary.

What causes Munchausen's syndrome?
Munchausen’s syndrome is a complex and poorly understood condition and it is still unclear why people with the condition behave in the way they do.

Some experts have argued that Munchausen’s syndrome is a type of personality disorder. Personality disorders are a type of mental health condition where an individual has a distorted pattern of thoughts and beliefs about themselves and others. This leads them to behave in ways most people would regard as disturbed and abnormal.

Another theory is that the condition may be the result of parental neglect and abandonment, leaving unresolved childhood trauma that later drives the person to fake illness.

Treatment
Treating Munchausen’s syndrome can be challenging as most people with the condition refuse to admit they are faking illness.

For those who do admit their behaviour is abnormal, talking therapies such as cognitive behavioural therapy can sometimes be effective.

Who is affected
There appear to be two distinct groups of people affected by Munchausen's syndrome:
  • women who are 20 to 40 years of age, who often have a background in healthcare, such as working as a nurse or a medical technician
  • unmarried white men who are 30 to 50 years of age
It is unclear why this is the case.

It is not known exactly how common Munchausen's syndrome is. Some experts believe it is under-diagnosed because many people with the condition succeed in deceiving medical staff. It is also possible that cases of Munchausen's syndrome may be over-diagnosed because the same person could use different identities.

A large study carried out in a Canadian hospital estimated that out of 1,300 patients there were 10 who were faking symptoms of illness.

Wednesday 5 February 2014

Benefits Of Hypnosis In Pregnancy And Childbirth

Hypnosis has been used during childbirth for approximately 100 years, and many research studies have examined its effects on pregnancy and labor. During labor, hypnosis has been used as a natural analgesic, not only reducing pain but also reducing the use of pain medication. In addition to pain management, self-hypnosis has been used to control breathing during labor. Other studies show that hypnosis has psychological benefits for both mothers and newborns.

One meta-analysis compared studies of hypnosis in pregnant women against non-hypnosis interventions, no treatment, and placebo. Its primary measurements were the analgesia used during labor and pain scores during labor. The meta-analysis included 8,395 women who had used hypnosis during pregnancy or labor, and it concluded that fewer of the women who received hypnosis needed any form of analgesia during labor; they also reported less severe pain than those in the control groups.

In another study, 60 pregnant women participated. The participants were divided into two groups based on their suggestibility; all received childbirth education and tips on pain control. These two groups were then subdivided, with half receiving a hypnotic induction and the other half learning breathing and relaxation exercises. Women in the hypnosis group and in the high-suggestibility group reported less pain, and those who used hypnosis reported using less medication and had a shorter stage-one labor.

Another meta-analysis looking at various studies performed using hypnosis with pregnant women showed that hypnosis reduced the level of medical intervention during labor and reduced risk to women and newborn babies. One study showed that women who were trained to use hypnosis during childbirth very rarely experienced postpartum depression. Hypnosis can help manage both depression and anxiety related to pregnancy, labor, and becoming a new mom. This shows that hypnosis can have many benefits on both women and their newborn babies.

One study found that women in their second and third trimesters of pregnancy were more suggestible: as they progressed further into pregnancy, their suggestibility increased according to the Harvard Hypnotizability Scale, and they also scored higher on the Creative Imagination Scale. This study assessed the same women at two time points, once while pregnant and once while not pregnant. If women really are more suggestible during pregnancy, there is all the more reason to use hypnosis for pregnancy and childbirth.

Learn more: http://www.naturalnews.com/027494_hypnosis_childbirth.html
http://www.naturalnews.com/index.html

Tuesday 4 February 2014

Psychology Of Conspiracy Theorists

To believe that the U.S. government planned or deliberately allowed the 9/11 attacks, you’d have to posit that President Bush intentionally sacrificed 3,000 Americans. To believe that explosives, not planes, brought down the buildings, you’d have to imagine an operation large enough to plant the devices without anyone getting caught. To insist that the truth remains hidden, you’d have to assume that everyone who has reviewed the attacks and the events leading up to them—the CIA, the Justice Department, the Federal Aviation Administration, the North American Aerospace Defense Command, the Federal Emergency Management Agency, scientific organizations, peer-reviewed journals, news organizations, the airlines, and local law enforcement agencies in three states—was incompetent, deceived, or part of the cover-up.

And yet, as Slate’s Jeremy Stahl points out, millions of Americans hold these beliefs. In a Zogby poll taken six years ago, only 64 percent of U.S. adults agreed that the attacks “caught US intelligence and military forces off guard.” More than 30 percent chose a different conclusion: that “certain elements in the US government knew the attacks were coming but consciously let them proceed for various political, military, and economic motives,” or that these government elements “actively planned or assisted some aspects of the attacks.”

How can this be? How can so many people, in the name of skepticism, promote so many absurdities?

The answer is that people who suspect conspiracies aren’t really skeptics. Like the rest of us, they’re selective doubters. They favor a worldview, which they uncritically defend. But their worldview isn’t about God, values, freedom, or equality. It’s about the omnipotence of elites.

Conspiracy chatter was once dismissed as mental illness. But the prevalence of such belief, documented in surveys, has forced scholars to take it more seriously. Conspiracy theory psychology is becoming an empirical field with a broader mission: to understand why so many people embrace this way of interpreting history. As you’d expect, distrust turns out to be an important factor. But it’s not the kind of distrust that cultivates critical thinking.

In 1999 a research team headed by Marina Abalakina-Paap, a psychologist at New Mexico State University, published a study of U.S. college students. The students were asked whether they agreed with statements such as “Underground movements threaten the stability of American society” and “People who see conspiracies behind everything are simply imagining things.” The strongest predictor of general belief in conspiracies, the authors found, was “lack of trust.”

But the survey instrument used in the experiment to measure “trust” was more social than intellectual. It asked the students, in various ways, whether they believed that most human beings treat others generously, fairly, and sincerely. It measured faith in people, not in propositions. “People low in trust of others are likely to believe that others are colluding against them,” the authors proposed. This sort of distrust, in other words, favors a certain kind of belief. It makes you more susceptible, not less, to claims of conspiracy.

A decade later, a study of British adults yielded similar results. Viren Swami of the University of Westminster, working with two colleagues, found that beliefs in a 9/11 conspiracy were associated with “political cynicism.” He and his collaborators concluded that “conspiracist ideas are predicted by an alienation from mainstream politics and a questioning of received truths.” But the cynicism scale used in the experiment, drawn from a 1975 survey instrument, featured propositions such as “Most politicians are really willing to be truthful to the voters,” and “Almost all politicians will sell out their ideals or break their promises if it will increase their power.” It didn’t measure general wariness. It measured negative beliefs about the establishment.

The common thread between distrust and cynicism, as defined in these experiments, is a perception of bad character. More broadly, it’s a tendency to focus on intention and agency, rather than randomness or causal complexity. In extreme form, it can become paranoia. In mild form, it’s a common weakness known as the fundamental attribution error—ascribing others’ behavior to personality traits and objectives, forgetting the importance of situational factors and chance. Suspicion, imagination, and fantasy are closely related.

The more you see the world this way—full of malice and planning instead of circumstance and coincidence—the more likely you are to accept conspiracy theories of all kinds. Once you buy into the first theory, with its premises of coordination, efficacy, and secrecy, the next seems that much more plausible.

Many studies and surveys have documented this pattern. Several months ago, Public Policy Polling asked 1,200 registered U.S. voters about various popular theories. Fifty-one percent said a larger conspiracy was behind President Kennedy’s assassination; only 25 percent said Lee Harvey Oswald acted alone. Compared with respondents who said Oswald acted alone, those who believed in a larger conspiracy were more likely to embrace other conspiracy theories tested in the poll. They were twice as likely to say that a UFO had crashed in Roswell, N.M., in 1947 (32 to 16 percent) and that the CIA had deliberately spread crack cocaine in U.S. cities (22 to 9 percent). Conversely, compared with respondents who didn’t believe in the Roswell incident, those who did were far more likely to say that a conspiracy had killed JFK (74 to 41 percent), that the CIA had distributed crack (27 to 10 percent), that the government “knowingly allowed” the 9/11 attacks (23 to 7 percent), and that the government adds fluoride to our water for sinister reasons (23 to 2 percent).

The appeal of these theories—the simplification of complex events to human agency and evil—overrides not just their cumulative implausibility (which, perversely, becomes cumulative plausibility as you buy into the premise) but also, in many cases, their incompatibility. Consider the 2003 survey in which Gallup asked 471 Americans about JFK’s death. Thirty-seven percent said the Mafia was involved, 34 percent said the CIA was involved, 18 percent blamed Vice President Johnson, 15 percent blamed the Soviets, and 15 percent blamed the Cubans. If you’re doing the math, you’ve figured out by now that many respondents named more than one culprit. In fact, 21 percent blamed two conspiring groups or individuals, and 12 percent blamed three. The CIA, the Mafia, the Cubans—somehow, they were all in on the plot.
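For anyone who wants that math spelled out, here is a minimal back-of-the-envelope sketch in Python. The labels are just shorthand for the Gallup categories quoted above, not the poll's exact wording:

```python
# Back-of-the-envelope check on the 2003 Gallup JFK figures quoted above.
blame = {
    "Mafia": 37,            # percent of respondents implicating each party
    "CIA": 34,
    "Lyndon Johnson": 18,
    "Soviets": 15,
    "Cubans": 15,
}

total = sum(blame.values())
print(f"The listed percentages sum to {total}%")  # prints 119%

# A sum above 100% is only possible if some respondents named more than
# one culprit, which is consistent with the poll's finding that 21%
# blamed two parties and 12% blamed three.
```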

Two years ago, psychologists at the University of Kent led by Michael Wood (who blogs at a delightful website on conspiracy psychology) escalated the challenge. They offered U.K. college students five conspiracy theories about Princess Diana: four in which she was deliberately killed, and one in which she faked her death. In a second experiment, they brought up two more theories: that Osama Bin Laden was still alive (contrary to reports of his death in a U.S. raid earlier that year) and that, alternatively, he was already dead before the raid. Sure enough, “The more participants believed that Princess Diana faked her own death, the more they believed that she was murdered.” And “the more participants believed that Osama Bin Laden was already dead when U.S. special forces raided his compound in Pakistan, the more they believed he is still alive.”

Another research group, led by Swami, fabricated conspiracy theories about Red Bull, the energy drink, and showed them to 281 Austrian and German adults. One statement said that a 23-year-old man had died of cerebral hemorrhage caused by the product. Another said the drink’s inventor “pays 10 million Euros each year to keep food controllers quiet.” A third claimed, “The extract ‘testiculus taurus’ found in Red Bull has unknown side effects.” Participants were asked to quantify their level of agreement with each theory, ranging from 1 (completely false) to 9 (completely true). The average score across all the theories was 3.5 among men and 3.9 among women. According to the authors, “the strongest predictor of belief in the entirely fictitious conspiracy theory was belief in other real-world conspiracy theories.”

Clearly, susceptibility to conspiracy theories isn’t a matter of objectively evaluating evidence. It’s more about alienation. People who fall for such theories don’t trust the government or the media. They aim their scrutiny at the official narrative, not at the alternative explanations. In this respect, they’re not so different from the rest of us. Psychologists and political scientists have repeatedly demonstrated that “when processing pro and con information on an issue, people actively denigrate the information with which they disagree while accepting compatible information almost at face value.” Scholars call this pervasive tendency “motivated skepticism.”

Conspiracy believers are the ultimate motivated skeptics. Their curse is that they apply this selective scrutiny not to the left or right, but to the mainstream. They tell themselves that they’re the ones who see the lies, and the rest of us are sheep. But believing that everybody’s lying is just another kind of gullibility.

Sunday 2 February 2014

VIDEO Mind Of Plants: Are They More Intelligent Than We Think?