Our Avatars, Ourselves

Lara Croft

After a little hiatus from blogging, this article about how digital avatars influence our beliefs got me back in the saddle. In particular, it got me thinking about the amazing, good old-fashioned power of storytelling – the idea that the stories we tell shape our beliefs about who we are, what we can become, and what is possible or impossible. This idea is an old one, but its very familiarity lulls us into thinking that the power of stories is an abstraction, not a reality.

This article highlights the very real power of stories – in the form of digital avatars. The word avatar comes from Sanskrit, where it means an incarnation. More commonly, we think of avatars as representations of ourselves in virtual environments. When we represent ourselves digitally, we are expressing some aspect of ourselves. That is, we are telling a self-story, real or imagined, that we want to explore. This psychological experience of an embodiment or “incarnation” of self goes a long way toward explaining the research findings described in the article.

The research shows that using a “sexy avatar” in a video game influences women – and not for the better. For example, women who played a game using sexualized avatars – especially avatars that looked like them – were more accepting of the rape myth (the belief that rape is a woman’s fault) and more likely to sexually objectify themselves in an essay. Other studies document the “Proteus effect,” in which embodying a character in a virtual environment, such as a game, influences behaviors in the real world, such as eating patterns, brand preferences, and physiological arousal. This effect is strongest when people actively engage with an avatar rather than passively watch the character. While many of these studies have flaws (e.g., small sample sizes that make it hard to generalize the findings to people at large), they also have strengths, such as rigorous experimental methods. So, these studies should be given serious consideration.

This article might lead some to demonize video games, but I think that is a mistake. We can bash video games all we want, but this black-and-white view misses the point that one can tell stories that sexualize women to the exclusion of individuality, intelligence, or competence in all sorts of media: books, movies, cosplay, the news we follow, and the conversations we have. It also misses the point that if avatars are so powerful, they can be used in positive ways.

So, is there something special about video games besides the fact that a single game can make billions of dollars in two weeks? Is actively engaging in a story rather than passively watching it the key to the effects that avatars can have on us? As a society, we need to have this conversation. But it will be crucial for science to weigh in and help interpret whether and how the stories we tell in virtual worlds transform what we do, believe, and become.

Laying down a social marker

Another thought-provoking piece by Gareth Price about how the pressure to share via social media may be influencing the quality and quantity of our ideas.

DisCoverage

BrainJuicer’s Tom Ewing wrote a blog post today about how the way we listen to music could change.

He envisioned that people will soon have “attention regimes, in the way they follow dietary regimes and exercise regimes, and will have them in public: a proclamation of one’s listening regime will become a kind of social marker,” adding:

“Demonstrating you can pay attention in a world of instant clicks will be a mark of presumed character (and bragging rights) in the same way demonstrating you keep fit in a world of chairs and screens is among white-collar workers now.”

Ephemerality is built into the internet.

If you don’t update your website Google will punish you by pushing you down its search rankings.

Fail to tweet for any extended period and people will unfollow you.

Don’t update your status and friends will accuse you of being a ‘Facebook…


The Happiest iPhone on the Block: Why Managing Your Digital Life is Like Good Parenting

When I started blogging a little over a year ago, I was a true social media skeptic. I drew more inspiration from thinkers like Sherry Turkle than from Anil Dash. But my experiences with social media have turned this on its head. I’m still a skeptic in the sense that, as a scientist, I believe we need to know a lot more about how social media affect our lives, for better and for worse. But I don’t feel the kind of concern I used to feel. Perhaps I’ve been tempted by the siren song of technology, lulled by a false sense of security engendered by the all-consuming digital embrace… but I don’t think so. I actually feel more in control and less overwhelmed by social media and other digital forms of communication than ever before. I feel they are tools that I can selectively choose among and harness. I believe that a sense of well-being and balance in social media use is possible if we adopt some simple practices. The best metaphor I can think of for these practices is that they are the kinds of things an effective and sensitive parent does. Here are the top five “parenting strategies” I’ve used to manage my social media burden:

naughty child

  1. Establish rules and set limits. Children thrive when there are consistent limits and structure. In the same way, our technology use needs rules and limits. If I don’t set limits on when and how I use social media, I’m more likely to get sucked into the black hole of keeping up with every tweet/text/email/post/newsfeed. I’m more easily distracted by social media, less present with others, and more likely to waste time and be less efficient because of it. Like all good parents, I try to create structure that is firm but fair. Harsh discipline might work in the short term, but the child usually rebels. So, I try not to be unreasonable or unrealistic about the rules (e.g., “I can only check email once a day, and for no more than 10 minutes” doesn’t work). I’ve tried to find a set of guidelines that work with my life and make me happy.
  2. Monitor communication technology use. It’s 10 o’clock. Do you know how much social media you’ve used today? This is really about being mindful about how we’re using our technology. I prioritize my time – I only have so much time and attention in a day, and so I try to spend my mental and social capital wisely. I keep track and schedule times that I will use these tools, and know the times that they need to be put to bed.
  3. Reinforce good behavior. It’s not only about the amount of time we spend on social media or communication technology. It’s about how we use it and what it brings to our lives. I try to select digital communities that bring something positive to my life and that cultivate a positive peer network.
  4. Selectively ignore. In parenting, the idea here is that if a child is showing a troublesome behavior, as long as it’s not destructive, it can be “extinguished” simply by ignoring it. If there is no reaction and no reward, there ceases to be a reason for the child to act that way, and the child stops being a nuisance. In a similar vein, when I start to feel that my communication technology use is becoming burdensome and bossy – when the pressure to respond to every message or push notification is too much – I start ignoring it. Most of us like the feeling of being connected, and hope that the dings and rings on our devices will bring something good into our lives or that stressful things can be averted and dealt with quickly. So, we start to check obsessively and end up spending dinner time with our family on a device, or walking into traffic with our eyes glued to our iPhones. When I begin to move in this direction, I reverse course and consciously, selectively ignore my devices in order to break the cycle.
  5. Adapt technology use to fit my life. One key to being a good parent, I believe, is structuring your life so that it can accommodate children in support of their well-being and happiness. Some (in my opinion) not-so-great parents do the opposite: they expect not to change their lives at all and assume children should just fit in. In contrast to my list of strategies thus far, when it comes to mobile technology and social media I try to follow the inspiration of the questionable parent: I fit technology into my life so that I remain able to do what I want and need to do without being sidetracked. If my life is becoming more stressful and less organized because of social media burden, then I’m probably doing the opposite.

So remember, when that naughty stream of Facebook status updates is just too much to handle, you’re a week behind on your Twitter feed, the pesky email inbox just won’t empty out, and those 10 texts – which are going to go unanswered for another few days – won’t stop bugging you, ask yourself: what would mom do?

The Medium is the Message: On Mindfulness and Digital Mirrors

I recently had the pleasure of doing a talk-back with Congressman Tim Ryan on the role of mindfulness – focusing your awareness on the present moment – in education, as part of the Rubin Museum’s Brainwave Festival in NYC. The film we discussed, “Changing Minds at Concord High School,” followed an entire school as it took part in a mindfulness training program. This school is unique in that it is a transfer school, a last stop for many kids with a history of school failure and discipline problems. The twist here is that the students both filmed the experience and conducted a study – of their classmates! – comparing the effects of mindfulness training with those of a placebo. We also included a science curriculum on the neuroscience of mindfulness – how it can change our brains for the better. I was the lead scientist on this project, so the kids were my “research assistants.” The project was spearheaded and directed by the amazing Susan Finley and filmed by the equally inspiring Peter Barton (with the help of the students). Our outstanding scientific advisors were David Vago and Robert Roeser. There is a lot that was amazing about this project, these kids, and this film. I want to focus on just one aspect, which hinges on the phrase “the medium is the message.”

lake yoga

The medium is the message. This phrase was coined by Marshall McLuhan who put forward the idea that the “form of a medium embeds itself in the message.” That is, the medium in which we experience something influences how we perceive the take-home message. Using movies as an example, he argued that the way in which this medium presents time has transformed our view of time from something that is linear and sequential into something that reflects patterns of connection across people and places. I am obviously no film theorist, but I apply this notion to the idea that different media provide us with an array of tools that can help us create a narrative of ourselves and the world that is unique to that medium.

Film and self-identity. In the case of our film “Changing Minds at Concord High School,” I believe that one way that the medium was the message for our students was that film is able to portray individual identities as being truly flexible and changeable. I think that the teens at Concord High, many of whom have experienced tremendous challenges, stress, and obstacles in life, didn’t believe as a group that change for them was really possible. But what our program strove to do, using converging media – film, scientific readings, mind/body experiences of mindfulness – was to convince these young adults that they really could change their brains, change counterproductive habits of thinking, and find the tools to focus more and let negative feelings go. As we move on to Phase 2 of the project by refining and developing our program, we are asking the fundamental question: How can we best use these tools to teach teens to view themselves and the world differently, creating a narrative in which personal change is possible?

Our digital mirrors. I think these issues are especially important to consider now, in this era of social media and reality television in which we crave to see ourselves reflected back to ourselves. We can criticize this, and analyze this, but the fact of it borders on the irrefutable. We know that it’s easier than ever before to document our lives via pictures and videos on our mobile devices, and share them with our digital networks. And we love to do so. Social media, through which we share images of ourselves and our lives, are an immeasurably huge and complex array of mirrors in which we can gaze at ourselves. There may be costs and benefits to this, but it simply is. The power of this, however, is that we now have a new set of tools to curate our beliefs about who we are – hopefully for the better. And perhaps we believe this evidence of who we are more strongly because it is concrete, it is documented, it receives “likes” and is seen by others and thus is real. I’m liked, therefore I am.

This digital infrastructure also provides a profound opportunity for those trying to support growth and positive change in youth. If we help youth document the possibility of change – as we did in “Changing Minds at Concord High School” – they may start to believe it applies to their own lives. This is particularly important for those of us who aren’t used to feeling that the world is full of possibilities. In this way, social networking may be a medium that gives the message that change is possible and that our limitations are as fluid as the flow of information.

Appily Ever After?

I was very interested to read this funny take on psychology smartphone apps in the New York Times (by Judith Newman) – or, more accurately, on how NOT to build a psychology app. I just blogged about this general topic in my last post, and what struck me most about this article was the notion of time.


Art by Emily Flake (published in the New York Times 4/5/2013)

This article seems to suggest that mental health apps should quickly and effortlessly facilitate our relationships, efficiency, and well-being. As Newman writes in the article: “All of these apps require thought. Lots and lots of thought. Thinking is what I do all day long. I needed something that would turn my mind off, not on.”

Great point. Maybe we don’t want the app to be our shrink – because when we go to a therapist, we tend to have a set of expectations that involve spending a good deal of time and energy (unless we’re just looking for a medication fix). Apps, by their nature, are fast, easy, and mobile. So, most of us expect that a psychology app will be a shortcut to mental health. We shouldn’t have to spend time learning how to use the app or being on it too much – at least not so much that it’s taking away from “having a life.”

This view tells me that there is a potentially deep disconnect here: between what many of us in the mental health field think of as the promise of mobile health technologies and what everyone else thinks. Many psychologists see a future in which apps and computerized therapeutic tools break down barriers to treatment, which can be too expensive and intensive for many. For example, for the most common class of psychiatric disorders, the anxiety disorders, only about 20% of anxious people receive treatment! So, the psychologists are thinking: jeez, mobile technologies offer so many amazing possibilities for integrating mental health treatment into the daily lives of people who are suffering. Let’s create an app for that!

But we need to think through our approach carefully. If we just put the same old (frankly boring) computerized interventions on smartphones, will that actually help us reach more people? How many will choose to use these tools? Maybe some, but perhaps not many. Perhaps what most of us want from an app is a digital, interactive version of the self-help book – something you can take or leave, pick up and put down after a few minutes and still get something from, and that doesn’t feel like just another source of techno-burden.

So, what is the take-home message for the mental health professionals? Make it fun, make it fast, and make it effective or get back to work on making traditional treatments better.

Gamifying Mental Health or: Mental Health – We Got Game

I just attended the second annual Entertainment Software and Cognitive Neurotherapeutics Society (ESCoNS) conference. Say that five times fast. This conference brought together people in the gaming world with cognitive neuroscientists. I went because I’m developing (and testing) an app that I believe can help people reduce stress, worry, and anxiety in their lives. In addition to exploring more deeply how to make mental health truly fun, I felt that I was seeing the future of mental health unfolding before my eyes.

Gamifying mental health

Here are four ideas I think will change how the field of mental health will look in a decade (or less):

1. Mental health care WILL BE gamified. The mobile revolution and app zeitgeist have changed how we get things done. We want an app for everything because we want our lives mobile and streamlined, and the minute we think we want to do something, we want a device to help us do it. We are also trusting ourselves (and our networks) more and professionals less. This is the self-help movement taken to a new level. If we can seek mental health support on our devices rather than through a professional, more of us will do so. This plays into our growing tendency to feel more comfortable with devices than with other people – which may be good or bad, or somewhere in between, but it is how it is. I believe the question is not whether mental health care will be gamified, only how and when.

2. Fun will motivate mental health treatment seeking. Scientists who study human beings understand how to break something down into its component parts (whether an idea, a behavior, or a biological response) in order to study it, but they are not trained to construct something that is fun and that motivates people to come back again and again. That is art and intuition, combined with a lot of experience and good old-fashioned luck. If we want to reach the greatest number of people, and help them integrate mental health interventions into their lives, we need to make mental health fun.

3. Training your brain… with video games? The idea that you could train your brain with video games is still perceived by many to be in the realm of science fiction. But consider that every experience we have, particularly repeated experience, changes our brains – why wouldn’t a video game? This reflects the important concept of neural plasticity: the structure and function of the brain are malleable and changeable not just in childhood, but throughout the lifespan. In addition to games that can train different abilities (e.g., attention in kids with ADHD), technologies like virtual reality are being used as safe and effective ways to treat everything from addiction to post-traumatic stress disorder.

4. The emotional brain is a “buzzing” target for intervention. In the 20th century, psychology was dominated by cognitive theories of how the brain works and what causes mental illness. Emotion was a little blip on the screen, an irrational irritant to the otherwise rational, predictable, and orderly domain of the thinking mind. Now, that irritant is an increasingly important focus of research. For example, not much more than a decade ago, economic decision making was understood as a “rational” process. Now it’s assumed that emotions influence our decisions, for better and for worse, and the task is to figure out how. The effect of emotion is not “irrational.” Rather, it reflects the fundamental integration between our ability to feel and our ability to think. Without one, the other is deeply impoverished. As emotion researchers, my colleagues and I are happy everyone has caught up – it’s about time! Emotions are the engines of our lives – and of psychopathology. No real living happens in an emotional vacuum.

It was clear to me from the conference that there is an emerging field in which the gaps between clinical psychology, cognitive neuroscience, and entertainment are being bridged. This field is fundamentally interested in the emotional and social brain, and “healthy emotional brain architecture” will be the goal of many computerized, gamified interventions. Increasingly, people predict a (near) future in which games will routinely be prescribed in the doctor’s office, and may eventually replace the office visit. If we can change our emotional brains, we can change ourselves. At least, that’s what many are counting on.

 

Rebel Without a Status Update

I am fascinated by the psychology of Facebook status updates. There are many reasons to make a status update. One reason, of course, is obvious – let others know what you’re up to or share something that’s cool. For example, if I did frequent status updates, I might decide to post “Buying a fantastic ½ pound of Australian feta at Bedford Cheese Shop on Irving Place – should I up it to a pound?!” (and seriously, it is incredible). This may be an interesting snapshot of a day in my life, but these types of status updates are exactly the ones that tend to annoy me for some reason. Even the most benign version of this feels like TMI.

Why? Status updates are, for many, an instinctive way to reach out. A recent study even showed that increasing the number of status updates you post every week makes you feel more connected to others and less lonely. Seems like a good thing! Moreover, it’s consistent with what seems to be our new cultural comfort zone – being virtually seen and heard by a loosely connected group of people we know (or sort of know) as our “social network.” This virtual network is the social status quo for many of us, and certainly for many children growing up today.

I believe one consequence of this is that no one wants to be James Dean anymore. Put another way, maintaining privacy and being the strong, silent type, like Dean, are no longer alluring ideas to us. And when I thought of this, I realized why I don’t feel fully comfortable with the status update culture – I am a proponent of the James Dean School of Sharing Personal Information in Public, whose motto is: the less, the better. I like understatement, privacy, and the choice to share with a few and retain privacy with most.


It’s no coincidence that, as a culture, we don’t fetishize James Dean anymore. Many of today’s icons (some of them “anti-icons,” because we love to feel superior) are people who humiliate themselves, who will tweet that they’re on the toilet and what they’re doing there, who end up in compromised positions and happen to have pictures and videos of those positions, which then promptly go viral (funny how that happens). James Dean would have disapproved.

James Dean himself would have been very bad at social media… or perhaps very, very good. Very bad, because he would have had little to say, and would have hated the constant spotlight and the social media culture of ubiquitous commentary and chit-chat. On the other hand, he might have been very good at it, because he would have been the Zen master of the status update, expounding with haiku-like pithiness. An imaginary James Dean status update:

James Dean…

Old factory town

Full moon, snow shines on asphalt

#Porsche alive with speed

But seriously, while he probably wouldn’t have written haiku, perhaps he somehow would have figured out how to use sharing to create a sense of privacy, because a sense of mystery would have remained intact.

Yes, the status update is a beautiful thing. We have an efficient and fun tool which allows us to reach out to others, curate our self-image, and think out loud to a community. But I wonder if we’re starting to lose the simple pleasures of privacy, of knowing less and wondering more.

Downton Abbey: Television for the Internet Age?

I love Downton Abbey. It hits a sweet spot of mindless pleasure for me. Yes, it’s really just a British-accented Days of Our Lives, but it’s wonderfully acted, soothingly English, and has a few nice, clever twists. In honor of the US premiere of the third season last night, I thought I’d bring my interest in things digital to bear on Downton Abbey. “How?” you might ask. It all starts with Shirley MacLaine.

Shirley MacLaine, who joined the cast for the third season (already aired in the UK but just now airing in the US), was recently interviewed by the New York Times about why she thinks the show has captured the devotion of so many. As most of you probably know, it’s been a huge, surprise international hit. If I have my stats right, it’s one of the biggest British shows ever.

She made a comment that caught my attention. From the interview (verbatim):

Q. What about the show hooked you in?

A. I realized that Julian [Fellowes, the “Downton Abbey” creator and producer] had either purposely or inadvertently stumbled on a formula for quality television in the Internet age. Which means there are, what, 15 or so lives and subplots, with which not too much time is spent so you don’t get bored, but enough time is spent so you are vitally interested.

Photo: Carnival Film

This comment alludes to an idea that we’re all familiar with – because we’re constantly multitasking and skimming huge amounts of information in a superficial way in order to wade through our daily information overload, we have developed a preference for short snippets of entertainment rather than more in-depth (read intelligent/complex) material. We just don’t have the patience or bandwidth anymore for anything longer or more involved.

I think linking the popularity of Downton Abbey to this notion is an interesting idea. Of course, I have no basis on which to say whether Ms. MacLaine is right or wrong, but my instinct is that she is not quite right. Although it’s hard to avoid the conclusion that much of our entertainment has evolved towards less depth and more superficiality over the past decades, this drift towards the superficial precedes the internet age. Soap operas have been popular for a long time. Reality television was a well-entrenched phenomenon before the dominance of mobile devices made multitasking a daily reality. And come on now: look at the TV shows from the 1950s or 1960s – not exactly sophisticated material across the board. How much have we actually drifted towards the superficial? Maybe we’ve just always been here.

So, for me, this explanation doesn’t hit the mark. However, another way to interpret Ms. MacLaine’s comment is that we love having “15 or so” subplots (to avoid getting bored) simply because we enjoy multitasking. It’s not that we CAN’T focus our attention for a long period of time. We just don’t like to. Perhaps we prefer shifting our attention because it feels better/easier/more familiar to divide our attention among several things. Perhaps we just want to have it all.

As an illustration: yesterday I showed my four-year-old son Kavi a picture (on my iPad) of some of his friends. He liked it a lot, but there was something about it he didn’t totally understand (it was a joke picture). Whatever the case, he thought “it was cool.” His dad, Vivek Tiwary, and he were having a little boys’ time watching Tintin, so I started to walk away with the picture. He completely balked at that, claiming he wanted to look at the picture and watch Tintin at the same time. I asked him how he’d do that, why he would want to do that, and so on. No coherent answers were forthcoming except his claim that “it’s better this way.” And indeed, he proceeded to watch the movie, glance down at the picture on my iPad, watch the movie, glance down… for the next several minutes. He seemed to be enjoying himself. He seemed to feel it was better this way.

For me, the take-home message here was that for my little guy, more was just better. Maybe that’s the secret of Downton Abbey as well: it’s just a whole lot of whatever it is that makes it special.

 

Islands in the Stream: A Meditation on How Time Passes on Facebook

Shortly after the terrible tragedy in Newtown, I received email notifications that my (designated) close friends on Facebook had made status updates. Scrolling through my news feed, I saw friends expressing the range of emotions that we all felt – horror, sadness, distress, anger, and confusion. Later that day, I popped onto Facebook again and was jarred, and a little upset, to read that friends who seemed to have just expressed horror and heartbreak were now posting about everyday, silly, and flippant things.

Now, why should I be jarred or upset? Hours had gone by. After three, or six, or ten hours, why wouldn’t we be in a different emotional state, and why wouldn’t it be ok to post about it? I started to think that it was not my friends’ posts that were at issue here. Rather, it was the way I perceive the passage of time and the sequence of events on Facebook. A couple of aspects of this came to mind:

Facebook time is asynchronous with real time. Time is easily condensed on Facebook. Events and updates that might be spread out over the course of a day or several days can be read at a glance, and therefore seem to be happening almost simultaneously. So, our perception of time on Facebook is a combination of how frequently our friends post and how frequently we check in. For example, say I check in twice in two days – at 9am on day 1 and at 9pm on day 2. I know a good bit of time has passed (and the amount of time that has passed is clearly indicated next to friends’ updates), but I still read each status update in the context of the previous ones – especially if I pop onto a friend’s Timeline instead of my news feed.

With this type of infrequent checking, friends’ updates about their varying and changing emotions (which might be reasonably spread out over the course of a day or multiple days) appear to be an emotional roller coaster. If someone has several posts in a row about the same thing, even if they are spaced days apart, the person comes across as preoccupied with the topic. Somehow, I form a view of this individual that brings these little snippets together into one big amorphous NOW. If I were checking more frequently, however, perhaps I wouldn’t lump updates together in this way. I’d “feel” the passage of time and – more accurately – see that the ebb and flow of status updates are like islands in the stream of our lives rather than a direct sequence of events.
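To make this concrete, here is a minimal sketch in Python (the timestamps, check-in times, and post texts are all invented for illustration, not taken from anyone’s actual feed) of how infrequent check-ins collapse updates that were actually spread across two days into a single batch:

    from datetime import datetime

    # Hypothetical status updates, spread out over two days
    posts = [
        (datetime(2012, 12, 14, 10, 0), "heartbroken by today's news"),
        (datetime(2012, 12, 14, 20, 30), "best latte I've had in months"),
        (datetime(2012, 12, 15, 13, 0), "look at this silly cat video"),
    ]

    # I check in only twice: 9am on day 1 and 9pm on day 2
    checkins = [datetime(2012, 12, 14, 9, 0), datetime(2012, 12, 15, 21, 0)]

    last_seen = datetime.min
    for checkin in checkins:
        # Everything posted since my last visit arrives all at once...
        batch = [text for when, text in posts if last_seen < when <= checkin]
        last_seen = checkin
        # ...so updates spaced many hours apart read as a single "now"
        print(checkin, "->", batch)

At the first check-in, nothing new has been posted yet; at the second, all three updates – one anguished, two flippant – arrive together, even though they were actually spaced out over more than a day. That lumping is exactly the emotional roller coaster I was reacting to.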

river islands

Related to this first point, it occurred to me that status updates are not meant to be interpreted in the context of preceding status updates. Our brains are pattern recognition machines. So, if Facebook status updates follow one after the other, our brains may perceive a direct sequence of events. But each status update is a snapshot of a moment, a thought, or a feeling. Intuitively, they are supposed to stand alone, not be interpreted in the context of a previous update, even if they occur close together in actual time. Think how different this is from our face-to-face interactions, in which the sequence of events matters. For example, imagine that you’re at work, and your co-worker tells you she is on pins and needles waiting to hear back about a medical test. When you see her a few hours later, she is joking and laughing. You assume she either (a) got some good news from the doctor, or (b) is trying to distract herself from the worry. You don’t think she’s just having a good time, out of context of what you learned about her earlier in the day. But this contextualization is not the way it works on Facebook. Linkages between updates are tenuous, connections malleable. We can lay out our stream of consciousness in a way that requires no consistency among updates. Maybe the temporal and logical requirements of the off-line world are suspended on social networking sites like Facebook. Maybe our brains need to catch up.

In Love with the Written Word: Reading in the Digital Age

I was interested to see this commentary by five college students about reading in the digital age, posted on Zócalo Public Square. One of the things that struck me most was my own anticipation that I would be out of touch with how college students engage with the written word today – and I’m a college professor who should be in touch! But in fact, I found that the diversity of their approaches mirrors the diversity I see among my peers.

digital reading

Several seemed to express a need for speed and the fast consumption of many (relatively superficial) sources of information in the attempt to swim rather than sink in the ocean of information that needs sorting through every day. Others seemed to feel burdened by this glut of information and nostalgic for the simple, physically satisfying pleasure of holding and reading a book – a virtual luxury in our fast-paced lives, because it’s hard to multitask with a book. Among all the writers, however, I sensed information fatigue combined with enthusiasm for the written word.

My take-home message is that, whatever the future holds, the digital age has put writing, reading, and text at the center of our lives. I think we are becoming more rather than less in love with reading. The question is, what will we be reading and will it be grammatically correct ;-)?