Mind and Body

A Personality For Every Language

Your sense of humor, moral compass and attitude may seem like characteristics that eventually become a defining part of your personality—qualities and principles developed over the years that make you “you.” But studies suggest these traits actually shift when you switch languages.

Multilinguals tend to have multiple personalities depending on what language they’re speaking, research shows. In one study from 2001-2003, more than a thousand bilinguals were asked if they “feel like a different person” when speaking a different language; two-thirds said yes.

In 1964, Susan Ervin—a sociolinguist at the University of California, Berkeley—recruited 64 French adults who had lived in the U.S. for an average of 12 years and spoke both French and English fluently (40 of whom were married to an American). Ervin gave them the “Thematic Apperception Test” on two occasions six weeks apart: the subjects were shown a series of illustrations and asked to create a three-minute story describing each scene. One session was conducted only in English, the other only in French. Ervin found a consistency in themes that depended on the language the subjects were speaking: stories told in English emphasized female achievement, physical aggression, verbal aggression toward parents, and attempts to escape blame, while stories told in French featured more domination by elders, guilt, and verbal aggression toward peers.

She continued this line of research in 1968 with a study of Japanese women living in San Francisco who, while bilingual, were largely isolated from other Japanese in America; most were married to American men and spoke Japanese only when visiting Japan or with bilingual friends. Ervin had a bilingual interviewer administer various verbal tasks in both Japanese and English. In one exercise, the women were asked to complete sentences presented sometimes in English and sometimes in Japanese. When asked to finish the statement, “When my wishes conflict with my family …” in Japanese, a typical response was “… it is a time of great unhappiness.” In English, however, the women generally completed the sentence with “… I do what I want.”

Another study analyzed how language proficiency affects moral judgment. Researchers used a model that divides moral judgment into two forces: intuitive processes, driven by the emotional aspects of a dilemma and favoring individual rights, and rational processes, which entail a conscious, utilitarian-leaning evaluation. The study showed that individuals tend to make utilitarian decisions when speaking a foreign language. Participants from the U.S., Korea, France, Israel, and Spain were given the same scenario: a train is about to kill five people, and the only way to save them is to push one man in front of the train. Half were asked this question in their native tongue, the other half in a foreign language.

When using a foreign language, participants from all countries were 50 percent more likely to say they’d push the man (the utilitarian answer: sacrifice one to save five), a pattern researchers correlated with a reduced emotional response. Participants feel greater psychological distance when the scenario is presented in a foreign language, trading emotional concerns for rationality. Researchers hypothesized that responses depended heavily on proficiency, as increased fluency helps people become more emotionally grounded in a foreign language.

Image by Sue Clark, licensed under Creative Commons.

A Woman’s View From the Top Might Be a Sad One

To say that women can’t have it all may ring as anti-feminist. “If he can do it, we can do it,” women encouragingly tell one another. And they can … although it may come at a price, a recent study says.

Data from 1993-2004 showed that mid-career women in leadership roles are more likely than men in similar positions to suffer from chronic stress and symptoms of depression. Sociologists Tetyana Pudrovska and Amelia Karraker reported that this may be due to conflicting expectations, a “double bind.” The report said that “On the one hand, [women] are expected to be nurturing, caring, and agreeable, consistent with the normative cultural constructions of femininity. On the other hand, they are also expected to be assertive and authoritative, consistent with the expectations of the leadership role.” The results surprised the researchers, as previous studies show that higher education and income (both typical of management roles) usually result in better mental health. “There are these cultural and societal forces that may undermine women’s leadership,” Pudrovska said.

The stress of a high-powered career, which perhaps weighs more heavily on women but certainly doesn’t spare men, isn’t likely to fade anytime soon. But things are looking up for younger women: a decades-long survey of Wisconsin high school graduates, begun in 1957, found a shift in societal expectations among younger generations. “Now women’s leadership is more respected,” Pudrovska reported. “If we want to understand what’s happening in the workplace, we need to look at younger cohorts.”

Younger generations leave us with hope for a brighter future, one free of double standards for working women. Addressing these issues is vital; however, one must also be wary of over-emphasizing the emotional value of professional pursuits.

Cue Amy Poehler, whose book Yes, Please ended with a counterargument to Sheryl Sandberg’s Lean In—the working woman’s latest manifesto. “Career is something that fools you into thinking you are in control and then takes pleasure in reminding you that you aren’t,” Poehler wrote, with advice directed at women but just as applicable to men. “Try to care less. Practice ambivalence. Learn to let go of wanting it,” because “career is the thing that will not fill you up and will never make you truly whole. Depending on your career is like eating cake for breakfast and wondering why you start crying an hour later.”

Image by tai nkm, licensed under Creative Commons.

A Generation of No Religion

“Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions,” wrote Karl Marx in Critique of Hegel’s Philosophy of Right. “It is the opium of the people.”

Since the 1800s, we’ve been told that religion faces an imminent death. Lawrence Krauss argued at the Victorian Skeptics Café 2014 that it’s time to “plant the seeds of doubt in our children” so that religion can disappear within a mere generation, “or at least largely go away.” He insisted that “change is always one generation away.”

That’s a fair assessment when it comes to social prejudices. Krauss mentioned that 13-year-olds today don’t even see gay marriage as an issue, and that slavery became a reviled concept once generations no longer remembered it as a cultural norm. But one cognitive scientist told The Daily Beast that Krauss neglected to consider one of the major findings from the cognitive science of religion in the last few decades: that “people’s natural receptivity to religious ideas may be borne out of certain ordinary habits of the human mind that are hard to extinguish,” such as the tendency to see actions as intentional, objects as products of design, or the mind as distinct from the body.

One study showed that even kids who have no religious exposure in the home still believe that events happen for intentional reasons, such as to teach a lesson or send a sign. Another British study found that children unexposed to religion still invoked God to explain the unexplainable, a sign that this kind of thinking is an ingrained mental habit. Furthermore, half of nonbelievers still believe in fate, suggesting a tendency to see intention and agency in life even without religious ties.

The need for a higher power may arguably be an inherent human tendency, but millennials are more critical of organized religion than previous generations were at this age. In the 1970s, only 13 percent of baby boomers considered themselves religiously unaffiliated; in 2012, a third of 18- to 29-year-olds did. It’s likely that millennials’ tendency to marry and enter the workforce later in life, extending their college years, contributes largely to that gap: 86 percent of Americans without a college education believe in Jesus Christ’s resurrection, whereas only 64 percent of those with postgraduate degrees do.

But millennials aren’t outright rejecting Christianity, either: three quarters of them believe it still preaches good values and principles. They do, however, find modern Christianity to be rather hypocritical (58 percent), judgmental (62 percent), and anti-gay (64 percent). Even before moving out of their childhood homes, many millennials have already moved away from the religion of their upbringing: only 11 percent are raised without a religious affiliation, yet a quarter of millennials now identify as religiously unaffiliated. Much of this is due to the modern politicization of churches: 69 percent of those between ages 18 and 29 find that religious groups alienate young people by being too judgmental about gay and lesbian issues. And because the disaffiliation rate is higher than that of previous generations at the same point in the life cycle, it’s likely that fewer millennials will return to their childhood congregations even after settling down and raising children.

This trend, however, does not confirm Krauss’ claim that religion as a whole is on its way out, let alone that society is a generation away from total agnosticism. Instead, the statistics lend themselves to the notion that modern oppression within organized religion must be addressed if a younger congregation is to fill the pews. Regarding Krauss’ argument, The Daily Beast notes that it’s one thing to see a social shift in bigotry, and quite another to anticipate sophisticated traditions disappearing in a lifetime: “Specific brands of religion might disappear quickly (as one hopes bigoted and violent fundamentalism will), but it’s a stretch to think that something like religion—a complex and dynamic set of rituals, social practices, and cognitively ingrained ways of viewing the world—could be quite so fragile.”

Below is a clip of Krauss arguing how religion should be approached in schools, as well as how organized efforts could allow it to disappear within a generation:


Image by Waiting For The Word, licensed under Creative Commons.

The Science Behind Smiling


You smile when you’re happy. You smile when you’re satisfied. Now try a smile when you’re bummed—see if good feelings start to kick in 3…2…1…

Turns out, actions affect emotions: you’re not smiling because you’re happy; rather, you realize you’re happy because of your instinctive reaction to smile. Darwin suggested that our emotions are influenced by, not the cause of, our facial reactions; only after having a physical reaction can you deduce how you feel about something. By that logic, faking a smile actually increases positive vibes, and studies show that smiling reduces stress-enhancing hormones while boosting mood-enhancing ones.

Scientific American, however, cited studies showing that knowledge of this effect could reverse its results. “At first the brain says ‘I’m smiling; I must be happy!’ But upon learning that smiling can be a proactive strategy, this turns into ‘I’m smiling; I must be trying to make myself happy—I must be sad!’… In the same way you can’t set your alarm clock forward 10 minutes to trick yourself into punctuality, artificially forcing a smile isn’t going to do much for your happiness. Too much knowledge and the jig is up.” (Quick! Forget you read this!)

But it still doesn’t hurt to fake it: one study from Penn State notes that smiling makes you appear more likable, courteous, and competent. Plus, it’s a global communicator, translatable in all languages; smiling is one of the few biologically uniform reactions across the world. Babies smile in the womb. Blind babies smile at the sound of a human voice. And researchers found that members of a tribe in Papua New Guinea, one that’s completely disconnected from Western civilization, also smile when feeling joyous or satisfied.

In his TED talk, “The Hidden Power of Smiling,” Ron Gutman said that a third of Americans smile at least 20 times a day, and roughly 14 percent smile less than five times a day. Children, on the other hand, smile around 400 times a day. One British study managed to put a price tag on it: One smile can generate the same level of brain stimulation as receiving up to $25,000 in cash or 2,000 bars of chocolate.

Gutman cites an amusing 30-year study that looked at students’ yearbook photos and, based on their smiles, accurately predicted their sense of fulfillment in marriage, their standardized well-being scores, and how inspiring they would be to others. Research on old baseball cards also found that the span of a player’s smile correlated directly with his life span: players who didn’t smile lived an average of 73 years, while those with beaming smiles lived to be 80 on average.


This video by BrainCraft explores the origins of studies in smiling:

Image by joyuousjoym Blessings, licensed under Creative Commons.

Chin Up, Liberal Arts Students: History's On Your Side


Death of Socrates by Jacques-Louis David (1787)

The dreaded question for all those majoring in philosophy, art history, or English literature: “So … what can you do with that?”

The 21st century—an age when proficiency in science, engineering, and technology reigns supreme on job applications—is slowly witnessing a dying breed: the liberal arts student. In 1996, 14 percent of U.S. graduates majored in one of the liberal arts; by 2010 that number had been cut in half. And while nearly half of Stanford’s faculty teaches subjects in the humanities, fewer than 20 percent of applicants are interested in those courses.

On the surface, there seems to be little measurable reward in these subjects, especially when colleges are willing to lower tuition for students pursuing “job-friendly” majors. Semesters spent poring over Walt Whitman’s writings may seem self-indulgent next to courses teaching technical skills for white-collar professions. And the defense that these subjects teach critical thinking and creativity, while true, falls flat when the same could be said of, say, engineering. Perhaps there’s comfort in knowing Socrates faced similar scrutiny, as Callicles told him in Gorgias: “It’s not shameful to practice philosophy while you’re a boy, but when you still do it after you’ve grown older and become a man, the thing gets to be ridiculous, Socrates!”

But there’s a fault in believing that a specific major means a specific job, especially when we live in a time when job descriptions are constantly evolving: being a journalist now, for example, requires entirely different skills than it did 15 years ago. Paul Hanstedt, an English professor at Roanoke College, wrote, “It’s surprising how much we oversimplify the conversations we have about education and how it should work, reducing everything to a simple X=X formula: if I study accounting, I will be an accountant. If I study biology, I will be a biologist … As a result of this simplification, we make some pretty peculiar decisions.” Forbes makes a similar argument: if students enter college narrowly focused on the fields that currently have the most opportunities, every other type of job could end up with too few candidates as the job market improves and other skills become necessary. Furthermore, skills directly applicable to a position shouldn’t be the be-all and end-all of a candidate’s profile: “We, as a culture of competition, are so hyper-focused on career success that we lose sight of all the other things that make a person interesting, well-rounded and, ultimately, a good hire.”

Numbers tend to favor the “interesting” hire. A study by the Association of American Colleges and Universities found that 74 percent of employers would recommend a liberal education to a young person they know as the best way to prepare for success in today’s global economy. And nearly all employers surveyed (93 percent) say that the ability to think critically, communicate clearly, and solve complex problems is more important than a candidate’s undergraduate major.

In defense of a liberal arts education, Time columnist Joel Stein wrote, “We live in a time when smart people want to discuss only politics, technology and economics—a time when we believe that all progress is technological. But lots of history’s greatest leaps came from humanities: the Renaissance, the Enlightenment, Romanticism, civil rights, getting girls to go wild.” The sciences, he noted, have not provided a pope or president.

We mustn’t leave the humanities to history. While it’s tempting to imagine that the future will be one scientific breakthrough after another, we must hold the same level of anticipation for creative progress. Stein aptly concludes in his column:

This great moment for technology is only a moment. Just as the transportation revolution ended 50 years ago, when planes stopped going any faster, the information revolution will sputter as well. But the revolutionary ideas that will once again change everything will come from our humanities departments. Though I have no idea who will give them money in the meantime.

Here’s an iconic defense of romantic pursuits, from the 1989 movie Dead Poets Society:

Image by Rodney, licensed under Creative Commons.

The Ancient Art of Lying

We’ve come a long way since the Garden of Eden, the allegorical birthplace of deception: these days you hear anywhere from 10 to 200 lies a day.

Lying—a trick that 90 percent of 4-year-olds have figured out—may range from a harmless white lie to grand White House lies, but most are told with the same intention: to make oneself look better.

A UMass study found that 60 percent of participants told two to three lies within a 10-minute conversation, from merely faking agreement with the other person to pretending to be the star of a rock band. While men and women lie at approximately the same rate, women tend to do so to make the other person feel good, whereas men lie to make themselves look better.

And online dating profiles are the Mecca of image-boosting tweaks. One survey revealed that 53 percent of people admit to lying on their profiles. Dropping a few pounds (32 percent), using an old photo (18 percent), or glamorizing a job (40 percent of men; a third of women) are among the most common fibs on dating sites, though none are particularly hard to uncover within seconds of meeting in person.

Résumés have also proved to be rich with deceit. One recruitment firm estimates that about 40 percent of résumés aren’t totally truthful, which, over the past three years, has had human resources departments dramatically increasing the time they spend checking references. The most common résumé lie involves fudging job dates, whether to mask job-hopping, a firing, prison time, or (common among women) time recently taken off for family. Applicants also embellish accomplishments, inflate previous salaries, swell GPAs and honors, fake language fluency (a bold move, no doubt), and blur “familiarity” into “proficiency” when it comes to computer skills. Fear of ageism tends to promote lying by omission, such as leaving off graduation dates. The easiest lies to check? Diplomas and degrees.

Catching a lie in person, however, requires detection more acute than a 30-second first-date impression or a brief follow-up with an employer. The conscious mind controls only 5 percent of communication; the rest is beyond awareness, allowing signs of dishonesty to slip into body language and word choice. The four most common qualities of a lie are minimal self-referencing (such as speaking in the third person or using pronouns to create distance from the lie), a tendency to employ negative words out of guilt, overly simplistic explanations of a situation, and convoluted sentence structures that help pad the lie.
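For a rough sense of how linguistic text analysis might operationalize those four qualities, here is a minimal sketch in Python. The function name, word lists, and proxies are illustrative assumptions for this sketch, not the method used by researchers; a real analysis would rely on validated lexicons and trained models.

```python
import re

# Illustrative word lists; these are assumptions for the sketch, not validated lexicons.
FIRST_PERSON = {"i", "me", "my", "mine", "myself"}
NEGATIVE_WORDS = {"hate", "never", "awful", "worthless", "guilty", "sad", "blame"}

def deception_markers(text: str) -> dict:
    """Score a statement on crude proxies for the four markers described above."""
    words = re.findall(r"[a-z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    total_words = max(len(words), 1)
    return {
        # 1. Minimal self-referencing: a low rate of first-person pronouns.
        "self_reference_rate": sum(w in FIRST_PERSON for w in words) / total_words,
        # 2. Negative emotion words, which can leak guilt.
        "negative_word_rate": sum(w in NEGATIVE_WORDS for w in words) / total_words,
        # 3. Overly simplistic explanations: short average sentence length.
        "avg_sentence_length": total_words / max(len(sentences), 1),
        # 4. Convoluted, padded structure: long words as a crude proxy.
        "long_word_rate": sum(len(w) > 7 for w in words) / total_words,
    }

# A distancing, third-person denial versus a direct first-person statement.
print(deception_markers("That woman never took the money. It was awful."))
print(deception_markers("I took the money and I am sorry."))
```

In practice, raw scores like these mean little in isolation; researchers compare them against a baseline of statements known to be truthful from the same speaker.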

This video breaks down these giveaways through “linguistic text analysis”:

Pope Francis Popularizes Church’s Stance on Science

Speaking at the Pontifical Academy of Sciences, Pope Francis reminded Catholics that the theories of evolution and the Big Bang are indeed consistent with the notion of a Creator, saying that Genesis misleads us into believing God is a “magician with a magic wand,” according to The Independent.

While Francis is not the first pope to welcome scientific theories, his statement was exciting to me as a Catholic because it puts to rest, for confused Catholics, the pseudo-theories of creationism and intelligent design—the latter of which Pope Benedict XVI has been mistakenly thought to endorse. In reality, Francis’ declaration is not breaking news for the Church but a refreshing reminder:

“When we read about Creation in Genesis, we run the risk of imagining God was a magician, with a magic wand able to do everything. But that is not so,” he said, while unveiling a statue of his predecessor, Benedict. “He created human beings and let them develop according to internal laws that he gave to each one so they would reach their fulfillment.”

Pope Pius XII welcomed the Big Bang theory and evolution in 1950, and Pope John Paul II went even further in 1996, calling evolution “more than a hypothesis” and “effectively a proven fact.” Yet with other Christian denominations fighting these theories, the Catholic Church has somehow been lumped into the “anti-science” category despite its very clear position, with media pointing to rather ancient episodes when investigating the Church—such as Galileo’s condemnation for suggesting the Earth revolved around the sun. John Paul II, however, put this matter to rest when he declared in 1992 that the 17th-century theologians who condemned Galileo did not recognize the distinction between the Bible and its interpretation. They were merely working with the knowledge they had.

Despite the scientific evidence available to us and the Catholic Church’s expressed support for these theories, the topic has remained divisive among Catholics, primarily in the U.S.: some prefer these theories to creationism, while others cling to the literal words of Genesis. So why is it different this time around with Francis, who’s essentially repeating decades-old ideas?

So far in his papacy, Francis has been regarded as “progressive,” despite not having changed any Catholic doctrine or public teaching on social issues. What is different is his incredible outreach to youth and media: publications that once happily criticized the Church now find themselves quoting Francis’ works under smiling headlines, and issues that were never fuzzy within Catholicism yet seemed fuzzy to onlooking non-Catholics (such as the notion of the Church “condemning” homosexuals) are finally gaining attention without spin. For whatever reason, his words are more celebrated, which is good news for the Church.

As Francis put it: “The Big Bang, which today we hold to be the origin of the world, does not contradict the intervention of the divine creator but, rather, requires it. Evolution in nature is not inconsistent with the notion of creation, because evolution requires the creation of beings that evolve.”

This statement does not blanket all Christian beliefs. But the Catholic Church, Christianity’s largest denomination, is (and has been) spoken for, only this time with a more effective megaphone.

Image by Emil Nolde, licensed under Creative Commons.