
12/11/2014

Forget the recommended 7 to 8 hours of regular sleep. History has shown that inspiration and energy don’t come from the undisturbed hours we try to sandwich between bedtimes and alarms, but from their disruption.

Like animals and insects, humans originally slept in increments, with a few hours of wakefulness interrupting the “first sleep” and “second sleep,” wrote A. Roger Ekirch, historian and author of At Day’s Close: Night in Times Past. That time was filled with praying, thinking, writing, sex, and discussion—activities our ancestors were too tired for right before their initial sleep, but during this “night-waking” they would feel recharged and inspired, eventually drifting back into a second sleep after a spell of peaceful stimulation.

Ekirch wrote that it’s likely people were deep in dreams right before waking up from the first sleep, “thereby affording fresh visions to absorb before returning to unconsciousness … their concentration complete.” Thomas Jefferson, for example, would read books on moral philosophy before bed so that he could “ruminate” on the subject between sleeps.

Those hours contain a sense of tranquility and natural rhythm that dispel distractions and inspire an optimal state of mind. “In the dead of night, drowsy brains can conjure up new ideas from the debris of dreams and apply them to our creative pursuits,” wrote Aeon magazine.

Psychiatrist Thomas Wehr of the U.S. National Institute of Mental Health found that at night, the brain experiences hormonal changes that alter our state of mind and enhance creativity. The hormone prolactin, which contributes to the hallucinations we have in our sleep, continues to be produced during the “quiet wakefulness” between sleeps.

But by the early 20th century, sleeping changed dramatically. Cue Thomas Edison, whose invention of the light bulb revolutionized and regularized sleeping schedules, eliminating segmented sleeping: The longer a house had lighting, the later a household would go to bed. Electricity had elbowed out night-waking, along with its valuable qualities. “By turning night into day,” wrote Ekirch, “modern technology has obstructed our oldest avenue to the human psyche, making us, to invoke the words of the 17th-century English playwright Thomas Middleton, ‘disannulled of our first sleep, and cheated of our dreams and fantasies.’” Wehr agreed, suggesting that current routines have not only changed our sleeping patterns, but also “might provide a physiological explanation for the observation that modern humans seem to have lost touch with the wellspring of myths and fantasies.”

Through a month-long experiment in the 1990s, Wehr discovered that segmented sleeping returns when we extinguish artificial light. His subjects had access to light 10 hours a day rather than the current, artificially extended 16. Wehr reported that in this cycle, “sleep episodes expanded and usually divided into two symmetrical bouts, several hours in duration, with a one- to three-hour interval between them.”

While experiments and history suggest that segmented sleep is natural, it can be quite difficult to master in the schedule-obsessed, appointment-filled modern lifestyle. But while the Industrial Revolution put an end to the creative spells that once punctuated slumber, Aeon wrote, the Digital Revolution has the potential to accommodate the segmented sleeper:

“If we can make time to wake in the night and ruminate with our prolactin-sloshed brains, we may also reconnect to the creativity and fantasies our forebears enjoyed when, as Ekirch notes, they ‘stirred from their first sleep to ponder a kaleidoscope of partially crystallized images, slightly blurred but otherwise vivid tableaus born of their dreams.’”

Image by Alex, licensed under Creative Commons.



12/3/2014

Attention, twentysomethings: Now, if ever, is the time to change yourself for the better. Because if you’re older than 30, there’s a good chance your personality will stay more or less the same for the remainder of your life, studies say. That might seem like terribly uninspiring news, but it does have its loopholes.

Brian Little, a University of Cambridge psychology lecturer and author of Me, Myself, and Us: The Science of Personality and the Art of Well-Being, told Melissa Dahl at Science of Us that by the age of 30, a personality is “half-plastered.” The five prominent personality traits psychologists consider are openness, conscientiousness, extroversion, agreeableness, and neuroticism—traits that don’t fluctuate with moods but are most likely core characteristics, with about 40 to 50 percent of our personality coming from genes. In fact, certain traits can show up within days of birth, though they evolve quite rapidly throughout childhood.

Yes, there’s a strong genetic component to our personality that’s generally stable throughout our lives. That doesn’t mean it’s fixed, exactly, but that the rate of change greatly slows compared to childhood, adolescence, and your early twenties. It becomes more consistent. Dahl also quotes Paul T. Costa at the National Institutes of Health, who said, “What you see at 35, 40 is what you’re going to see at 85, 90.” Yikes.

Still, it’s not a rigid timeline: “There’s nothing magical about age 30,” Costa said. Rather, it’s about maturity, with the 30s being a developmental arc: It’s likely college is over, most of the “firsts” are over, and life generally becomes a little more settled—so, too, does your personality.

But it’s nothing a little faking can’t fix. Little argues that we can still choose to act against our natures. Our basic personality traits may not change, but our behavior and how we choose to express ourselves can decidedly be at odds with our true selves if the situation calls for it. An example, he said, would be an introvert playing the part at a social event, or a disagreeable person just trying to be nice.

“Acting out of character,” as it’s known, comes more naturally the more it’s practiced. Little warns, however, that too much of this can cause anxiety—heart starts pounding, muscles get tense—a typical stress reaction. Consider how an introvert might need some “alone time” to recharge after playing the part of a social butterfly for a while, he said.

From 30 to 90, milestones will occur, no doubt, and be the catalysts that could soften your heart, instill cynicism, or dramatically change your outlook on life: marriage, children, heartbreak, deaths, success, failure. Certain personality traits may eventually just be a part of who you are, but it’s how you express them and the choices you make that unlock the genetic shackles.

Image by Torrie, licensed under Creative Commons.



12/3/2014

Nearly half of the time you’re awake is likely spent daydreaming. Sometimes it’s a pleasant trip down memory lane; other times, just a weird thought that feels like wasted time once you snap out of it. But much of the time, removing yourself from the present is just what you need—maybe for a jolt of ingenuity, an amnesiac remedy, or a Walter Mitty-esque escape. The Huffington Post broke down five upsides to these wistful distractions:

1. They’ll help connect subconscious dots. When asked to come up with multiple uses for a brick within two minutes, volunteers in one study performed 41 percent better when given time to daydream before refocusing on the problem. In your own little world, you’re subconsciously making connections and using information to form unexpected solutions. Try the study’s drill by contemplating a problem and then spacing out for 12 minutes before refocusing.

2. They’ll remind you of the bigger picture. When trivialities tend to occupy your mental energy (traffic jams, checkout lines, chores), one study found that daydreaming can prompt your deeper ambitions. The brain is divided into a default network (associated with creativity and self-reflection) and an executive network (for planning and problem-solving). Rarely do the two work simultaneously—unless you’re daydreaming. The study found that the dual activity was at its strongest when an individual was so completely “zoned out” that they didn’t even realize it. One theory is that when the present moment is out of the picture, the two networks can mix freely, and abstract ideas, what-if scenarios, memories, and insights can surface more naturally.

3. They can help partially erase bad memories. Psychologists recommend taking your mind to a totally different emotional experience when you’re trying to forget about something for a while—say, an unfortunate run-in or a looming deadline. One study showed that when volunteers were asked to memorize a set of words, those who daydreamed right after learning them were far less likely to remember them than those who stayed present. The effect is even stronger when your thoughts take you to a distant place, such as a vacation abroad, or a distant time (at least two weeks ago). While it won’t wipe your memory entirely, some unwanted details could escape you.

4. They’re relatively manageable. One study suggests that if you want to spend your daydreams planning for the future, move your body forward; to spend them reminiscing, move backward. Whether it’s physical (such as sitting in a moving bus) or an illusion (think old-school galactic screensavers), there’s a subconscious correlation. One study found that when participants watched certain animations, there was a parallel between the direction of the graphics and whether they thought toward the future or of the past.

5. It’s a youthful thing to do. College students spend roughly half their time daydreaming, compared to one-third for older adults, one study found. It showed that the two age groups were rather comparable overall, but older people—whether due to weaker memory, more discipline, or fewer distractions—were still more mentally present.

Image by Nichole Renee, licensed under Creative Commons.



12/3/2014

Darwin’s term “survival of the fittest” typically connotes ruthlessness, greed, and self-interest in the name of competition. But that’s actually quite far from what he meant: “Communities, which included the greatest number of sympathetic members, would flourish best,” he wrote.

Studies today show that our minds are indeed wired for sympathy and empathy—qualities that also activate other mammals’ brains, suggesting they’re an innate part of our neurological system. In this animated video, University of California, Berkeley psychologist Dacher Keltner discusses the various aspects of kindness: its evolution, biology, and circumstantial influences. Because humans produce vulnerable offspring, which requires us to be cooperative caregivers, he said, our physiology has evolved accordingly.

Studies have revealed that class structure does affect one’s level of empathy, with a “compassion deficit” found in wealthier communities. And counterintuitively, those with less money are also more likely to give to charity, suggesting greater generosity and empathy within lower classes.

While 60 percent of what we do is motivated by self-interest and competition, there’s a great “secondary delight” in acting for others, accounting for 40 percent of our actions: We still find personal fulfillment when sacrificing and risking exploitation through these selfless deeds. “It really requires that we have to redefine human self-interest,” Keltner said. “The great ethical traditions have always been encouraging this, and now we’re seeing the science, that, yeah, the brain really cares about other people.”



12/1/2014

Your sense of humor, moral compass and attitude may seem like characteristics that eventually become a defining part of your personality—qualities and principles developed over the years that make you “you.” But studies suggest these traits can actually shift when you switch languages.

Multilinguals tend to have multiple personalities depending on what language they’re speaking, research shows. In one study from 2001-2003, more than a thousand bilinguals were asked if they “feel like a different person” when speaking a different language; two-thirds said yes.

In 1964, Susan Ervin—a sociolinguist at the University of California, Berkeley—recruited 64 French adults who had lived in the U.S. for an average of 12 years and spoke both French and English fluently (40 of whom were married to an American). Ervin gave them the “Thematic Apperception Test” on two occasions six weeks apart, in which the subjects were shown a series of illustrations and asked to create a three-minute story describing each scene. One session was conducted only in English, the other in French. Ervin found a consistency in themes that depended on the language the subjects were speaking: Stories told in English emphasized female achievement, physical aggression, verbal aggression toward parents, and attempts to escape blame, while stories told in French featured more domination by elders, guilt, and verbal aggression toward peers.

She continued with similar studies in 1968, analyzing Japanese women living in San Francisco who, while bilingual, were largely isolated from other Japanese in America, mostly married to American men, and spoke Japanese only when visiting Japan or with bilingual friends. Ervin had a bilingual interviewer give various verbal tasks in both Japanese and English. In one exercise, the women were asked to complete sentences presented sometimes in English and other times in Japanese. When asked to finish the statement “When my wishes conflict with my family …” in Japanese, the typical response was “… it is a time of great unhappiness.” In English, however, the women generally completed the sentence with “… I do what I want.”

Another study analyzed how levels of proficiency affect morality. Researchers utilized the model that divides moral judgment into two forces: intuitive processes, driven by the emotional aspects of a dilemma and favoring individual rights, and rational processes, which entail a conscious evaluation with utilitarian tendencies. The study showed that individuals tend to make utilitarian decisions when speaking in a foreign language. Participants from the U.S., Korea, France, Israel, and Spain were given the same scenario: A train is about to kill five people, and the only way to save them is by pushing one man in front of the train. Half were asked this question in their native tongue, the other half in a foreign language.

When using a foreign language, participants from all countries were 50 percent more likely to say they’d push the man (the utilitarian answer: sacrifice one to save five)—a pattern researchers correlated with a reduced emotional response. Participants feel an increased psychological distance when the scenario is presented in an alien language, trading emotional concerns for rationality. Researchers hypothesized that responses were highly dependent on levels of proficiency, as increased fluency helped people become more emotionally grounded in a foreign language.

Image by Sue Clark, licensed under Creative Commons.



12/1/2014

To say that women can’t have it all may sound anti-feminist. “If he can do it, we can do it,” women encouragingly tell one another. And they can … although it may come at a price, a recent study says.

Data from 1993-2004 showed that mid-career women in leadership roles are more likely than men in similar positions to suffer from chronic stress and symptoms of depression. Sociologists Tetyana Pudrovska and Amelia Karraker reported that this may be due to conflicting expectations, a “double bind.” The report said that “On the one hand, [women] are expected to be nurturing, caring, and agreeable, consistent with the normative cultural constructions of femininity. On the other hand, they are also expected to be assertive and authoritative, consistent with the expectations of the leadership role.” The results surprised the researchers, as previous studies show that higher education and income—both typical of management roles—usually lead to better mental health. “There are these cultural and societal forces that may undermine women’s leadership,” Pudrovska said.

The stress of a high-powered career, felt perhaps more acutely by women but certainly not unknown to men, isn’t likely to fade anytime soon. But things are looking up for younger women: A decades-long survey begun in 1957, focusing on Wisconsin high school graduates, found a shift in societal expectations among younger generations. “Now women’s leadership is more respected,” Pudrovska reported. “If we want to understand what’s happening in the workplace, we need to look at younger cohorts.”

Younger generations leave us with hope for a brighter future, one free of double standards for working women. Addressing these issues is vital; however, one must also be wary of over-emphasizing the emotional value of professional pursuits.

Cue Amy Poehler, whose book Yes, Please ended with a counterargument to Sheryl Sandberg’s Lean In—the working woman’s latest manifesto. “Career is something that fools you into thinking you are in control and then takes pleasure in reminding you that you aren’t,” Poehler wrote, with advice directed at women but just as applicable to men. “Try to care less. Practice ambivalence. Learn to let go of wanting it,” because “career is the thing that will not fill you up and will never make you truly whole. Depending on your career is like eating cake for breakfast and wondering why you start crying an hour later.”

Image by tai nkm, licensed under Creative Commons.



11/19/2014

“Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions,” wrote Karl Marx in Critique of Hegel’s Philosophy of Right. “It is the opium of the people.”

Since the 1800s, we’ve long been told that religion faces an imminent death. Lawrence Krauss argued at the Victorian Skeptics Café 2014 that it’s time to “plant the seeds of doubt in our children” so that religion can disappear within a mere generation, “or at least largely go away.” He insisted that “change is always one generation away.”

That’s a fair assessment when it comes to social prejudices. He mentioned that 13-year-olds today don’t even see gay marriage as an issue, and that slavery became a reviled concept once generations no longer remembered it as a cultural norm. But one cognitive scientist told The Daily Beast that Krauss neglected to consider one of the major findings from the cognitive science of religion in the last few decades—that “people’s natural receptivity to religious ideas may be borne out of certain ordinary habits of the human mind that are hard to extinguish,” such as the tendency to see actions as intentional, objects as products of design, or the mind as distinct from the body.

One study showed that even kids with no religious exposure at home still believe that events happen for intentional reasons, such as to teach a lesson or send a sign. Another British study found that children unexposed to religion still invoked God to explain the unexplainable, suggesting that this kind of thinking is an ingrained mental habit. Furthermore, half of nonbelievers still believe in fate, pointing to a tendency to see intention and agency in our lives even without religious ties.

The need for a higher power may arguably be an inherent human tendency, but millennials are more critical of organized religion than previous generations were at the same age. In the 1970s, only 13 percent of baby boomers considered themselves religiously unaffiliated; in 2012, a third of 18- to 29-year-olds did. The fact that millennials marry and enter the workforce later in life, extending their college years, likely accounts for much of that gap: 86 percent of Americans without a college education believe in Jesus Christ’s resurrection, whereas 64 percent of those with postgraduate degrees do.

But millennials aren’t outright rejecting Christianity, either, as three quarters of them believe it still preaches good values and principles. They do, however, find modern Christianity to be rather hypocritical (58 percent), judgmental (62 percent), and anti-gay (64 percent). Even before moving out of their childhood homes, many millennials have already moved away from the religion of their upbringing. Only 11 percent are raised without a religious affiliation, and yet a quarter of millennials now identify themselves as religiously unaffiliated. Much of this is due to modern politicization from churches: 69 percent of those between ages 18 and 29 find that religious groups alienate young people by being too judgmental about gay or lesbian issues. And because the disaffiliated rate is higher than previous generations in this life cycle, it’s likely that fewer millennials will return to their childhood congregations even after settling down and raising children.

This trend, however, does not support Krauss’ claim that religion as a whole is on its way out, let alone that society is a generation away from total agnosticism. Instead, the statistics lend themselves to the notion that the alienating tendencies within organized religion must be addressed if a younger congregation is to fill the pews. Regarding Krauss’ argument, The Daily Beast notes that it’s one thing to see a social shift in bigotry, and quite another to anticipate sophisticated traditions disappearing in a lifetime: “Specific brands of religion might disappear quickly (as one hopes bigoted and violent fundamentalism will), but it’s a stretch to think that something like religion—a complex and dynamic set of rituals, social practices, and cognitively ingrained ways of viewing the world—could be quite so fragile.”

Below is a clip of Krauss arguing how religion should be approached in schools, as well as how organized efforts could allow it to disappear within a generation:


Image by Waiting For The Word, licensed under Creative Commons.




