Journalists’ role: Taking the amusement out of news

For anyone who’s been a J201 TA, this comic (thanks to Hans for sending me this link) speaks to what we spend the first few weeks of the course discussing – and what our students write their first paper on (or at least they did when I taught the class). I really enjoyed the distilling of Postman’s work into a dozen comic slides. 🙂

Also, this comic reminds us that while I was as eager as anyone to poke holes in Postman’s argument (and particularly his use of evidence), he made a really interesting point that society should debate. Is the glut of information actually making people less willing or able to act? I think Postman’s concern is amplified by the growth of new sources of information, such as blogs and open-source news. For example, despite the comparison of WikiLeaks’ exposure of information on the war in Afghanistan to the Pentagon Papers, the differences trump the similarities: as Slate’s Anne Applebaum points out, what is any untrained eye to do with over 90,000 pages of information?

Postman is right on one point: too much information can be almost as damaging as not enough, especially when society often has little incentive and no clear pathway to effect change. For all the concerns about journalists’ infusing their reporting with expertise – and don’t get me wrong, this is always a concern – journalists can use that expertise to help distill information and present a clear picture of what’s important (an argument made by Brent Cunningham). This is a role we still need journalists to perform, perhaps even more so when information is abundant.

Media and Search Credibility

This will probably come as no surprise to anyone who’s taught an introductory university course that requires students to do research, but for all the time they spend online, students remain uncertain about how to find credible information. A new study by Northwestern University researchers demonstrates that when performing a wide range of information-seeking activities online, students often relied on the first link that Google provided, suggesting that coming up first on Google confers credibility. The students were able to recognize .edu and .gov as rating higher in credibility than other sites, but falsely included .org in their catalogue of credible sites, with most not realizing it is available for purchase like .com or .net (as I have demonstrated by purchasing vraga.org).

But while we shake our heads at their naiveté, we might be missing the underlying cause. A recent poll shows that teens and adults alike trust technology firms like Google more than traditional media outlets – and even Facebook scored more highly than “the media.” Although this study has flaws – it is unclear how exactly “trust” or “the media” are defined – it demonstrates that students’ belief in the search results provided by Google may not be unreasoned.

Of course, that is not to say that it is rational. Google is known for offering little clarity on how its search rankings are produced. And with Google branching out into new businesses, including its purchase of ITA Software, which provides airline flight information, many are now calling for some kind of regulation to ensure fair competition and to prevent Google from unfairly favoring its own interests. This call seems reasonable, for although Google does not have a clear monopoly over search, especially with the growth of its competitor Bing, it still maintains 65 percent of the search market.

Meanwhile, we are left pondering why our students trust technology firms like Google, Apple, and Microsoft more than the media. But understanding that their use of Google is driven by their trust and faith in the company may provide the key to deepening their understanding of the media environment.

MTV leads the way

It’s unusual for MTV to get praise for its television programming. But while other outlets and channels are being criticized for their lack of positive portrayals of gay and lesbian characters, MTV has received the first-ever “Excellent” rating for its programming from GLAAD, the Gay & Lesbian Alliance Against Defamation. And the primary source of these portrayals? MTV’s host of reality TV shows.

Meanwhile, other American broadcast networks scored much lower in this report. And this phenomenon isn’t limited to the US – the BBC has also come under fire recently for its lack of positive portrayals – and again, it is on reality TV shows that most of the gay characters surface. One point of comparison – while the British study appeared to focus on positive portrayals, there is little indication from the GLAAD study whether any portrayal of gay characters was counted, or whether only positive portrayals were valued.

This isn’t the first time television studios and networks have made a push to portray a discriminated group more fairly and equitably, with the portrayal of black characters perhaps the best example. It also raises questions about how much is “enough” – for example, should programming reflect the true proportions of each group in the population? Would that be fair for very small groups, who might then be seen very little? Cultivation theory and social comparison research suggest that seeing a wide variety of groups on television can shape our understanding of the world, so it seems reasonable that we would encourage these positive portrayals. But how much is enough?

Hot summer without a cause

2010 is looking to overtake 2005 as the hottest year ever recorded on the planet. But the heat isn’t the only thing unusual this summer – for example, the torrential downpour that hit the Midwest last week is also causing dams to collapse, airports to close, and residents to seek alternative housing, as their own homes remain filled with water and debris.

But with these environmental disasters, one thing has not changed: people remained equally unconcerned in May about “global warming” as they did at the beginning of 2010 – and even in July, global warming is not one of the public’s top priorities. The government has responded to this lack of emphasis by the public, with Senate Democrats abandoning – at least temporarily – their efforts to produce legislation to curb greenhouse gas emissions.

What is perhaps most interesting about this current “climate” is that few articles – even in many stories about the heat waves – mention global warming. Research has suggested that hotter local temperatures are linked to more discussion of global warming (Shanahan & Good make this point in their 2000 article, Heat and Hot Air: Influence of Local Temperature on Journalists’ Coverage of Global Warming). And while it is foolish to suggest that any one event is “caused” by global warming, the trend certainly matches scientists’ predictions this summer.

Meanwhile, because “global warming” does not accurately cover the range of outcomes expected from a rise in temperatures, many scientists prefer the term “climate change,” while Thomas Friedman of the New York Times has suggested the term “global weirding.”

While these redefinitions may be more accurate – and may eventually extend people’s concerns about greenhouse gases beyond heat waves – the shifting terminology makes it harder for both the public and news organizations to grasp and define the concept. And while it is not this change in terminology that caused the current lack of coverage – that is bad timing, with so much attention focused on the poor economy and job creation – the lack of a clear name also makes confusion and dismissal more likely. The term we use to describe something is very important in determining attitudes, so scientists, politicians, and journalists alike need to choose and use a single term to describe the phenomenon and focus on helping the public understand the real effects – and not just the heat – that can result from climate change.

In Online Journalism, Burnout Starts Younger – NYTimes.com


In honor of this story, today’s post will be brief (for me at least). This article details changes in the culture of the news environment, with the change from a 24-hour news cycle to the 24-second news cycle. With news coming out constantly, there’s a lot of pressure on reporters to be the first out with the news, to avoid being “scooped.”

There’s been a lot of debate about what this change means for the news output – whether we are losing in-depth analysis and journalists’ expertise, and forsaking investigative journalism – which requires a substantial amount of time – for quick hits on what’s been said. In the introductory journalism class here at UW, we spent time discussing this very possibility – would Watergate have been uncovered in this media environment?

But this article reminds us of another important toll this shift may be taking – on young reporters. The stress of always finding the next big story – and apparently of being judged by your readership – must be enormous. How will this affect those who are looking to enter the business of journalism? And there is, of course, the link back to the product – how will stressed, over-tired, and burned-out journalists produce quality news stories for public consumption?

With that, I’m taking the weekend off. 🙂

Facebook: Growing out-of-control?

With the recent news that Facebook has surpassed 500 million users, or 1 out of every 13 people on the planet, it is worth considering its implications. Facebook remains the most popular social networking site across a host of countries, beating out other sites such as Twitter, MySpace, and Flickr (for a comparison, see here), and continues to grow rapidly. Not only is its user base growing, but US users averaged over 6 hours per month on the site in 2010, up over an hour from just one year ago.

But despite the news that Facebook is a rapidly growing phenomenon, not all the news has been rosy. Facebook and MySpace both scored relatively low in terms of customer satisfaction, behind sites like Wikipedia and YouTube, as well as all news sources – of which Foxnews.com had the highest rating. Privacy woes have plagued Facebook, as public backlash to its new privacy settings forced it to change its policies. Concerns about children on Facebook being exposed to pedophiles also abound (although in this story, a girl used Facebook to alert authorities to sexual assault).

But perhaps scarier than the rest of these stories are concerns about how the new digital world in which we live – and of which Facebook is a large part – affects our ability to construct our own image. This article sums up these fears very well, reminding us of the host of concerns that follow people posting so much information about themselves online: the impact on careers, the inability to live separate lives depending on context, and the potential for past mistakes to haunt us forever, as they remain in the public domain. The article also offered some interesting potential solutions: from image doctors, to a reputation score (much like your credit score), to putting an expiration date on old information, to hoping that society will learn to accept, forgive, and forget others’ past mistakes.

It’s a scary notion, and one that all of us need to consider as we post our pictures, update our status, and maintain our blogs. What trail are we leaving for others to follow years later…and how will our posts of today influence our lives tomorrow?

Refudiating my previous post

In a previous post, I discuss the potential of social networking and its use by a host of Republican candidates. In it, I note that Sarah Palin has been widely praised for her use of these new resources to communicate with the public, from her Facebook page, which boasts nearly 2 million fans, to her new advertisement on YouTube.

Sarah Palin not only employs Facebook and YouTube, but also makes use of Twitter, with 200,000 followers and over 350 tweets. But in the interest of fairness, given the earlier praise of her use of social networking, it’s worth noting that Palin’s been taking some heat this week for her recent tweets, in which she not only makes up a word, “refudiate,” but then defends her use of the word, comparing herself to William Shakespeare, who also “coin[ed] new words.”

Although it makes Palin sound somewhat silly to use the word “refudiate” on multiple occasions, it is truly her defense of the action that makes this a mistake. Palin is right in her defensive tweet – English is a continuously evolving language and people make up new words often (myself included!). Stephen Colbert is famous for this, with his “The Word” segment, and most particularly, “truthiness.”

But Palin was not being satirical or clever; presumably she was trying to make a point about an issue. It is one thing to make up words as a joke or among friends, but it is another to make a professional error and then claim it was intentional. Defending what was, most likely, a mental error or a typo makes her look more foolish than she did originally.

The power of apology

No one could have missed Apple’s recent “woes” with the release of the iPhone 4 – if a company that sold over a million and a half units in the first weekend can have woes. Part of the cell phone debate that I wrote about in previous posts has centered on the problems in reception for the iPhone 4. So what’s Apple to do?

Slate’s Farhad Manjoo suggested that Steve Jobs apologize for the problems with the iPhone, as well as offer a more practical solution – a case to limit reception problems. And that’s just what Steve Jobs did, despite the New York Times’ earlier skepticism. But response to the press conference has been mixed: Manjoo remains unsatisfied with Jobs’ condescension, while BBC technology reporter Maggie Shiels thought Jobs’ performance was “sterling.”

But even as Jobs apologized, he also attacked the media for what he termed unfair criticism of the product. Apple’s problems were certainly exacerbated by Consumer Reports’ recommendation against purchasing the iPhone 4.

So was this an effective response? Despite the mixed reviews, Jobs did offer free iPhone cases to help minimize the reception problems. But at the same time, he limited the effectiveness of his apology through his defensiveness and his attempts to minimize the problem. I agree with Manjoo – with a sincere apology, Apple stood to gain a lot of credibility. But by belittling those having the problem and making it seem like an unfair attack by the media, people will be much less inclined to offer Apple credit – no one likes being told that a flaw in something they are committed to isn’t that big of a problem. Apple has depleted its “reservoir of credibility” in this event, and they might find it difficult to rebuild. But at the same time, Matt at Gizmodo.com is right – it’s time to move on. iPhone users have to make a choice: accept a less-than-ideal solution or take advantage of Jobs’ offer to return their phones, while the rest of us can admire our intelligence in avoiding the iPhone 4.

Social media campaigns: Successes and failures

As we head into the 2010 elections, a lot of candidates, especially Republicans, are attempting to build on Obama’s successes from 2008. And there were some important lessons to be learned: Obama’s campaign forced many of us to rethink how successful political campaigns can be run. In particular, Obama’s use of social networking and video-sharing websites provided a template for future campaigns.

Some candidates have used these social networking ideas successfully, although perhaps none with so much acclaim as Sarah Palin. Slate claims she is “the most successful adopter on Facebook,” but also points out that many Republican heavyweights, including Newt Gingrich and Mitt Romney, are moving to take advantage of these new opportunities to communicate with voters. Meanwhile, Time magazine praises a new Palin advertisement, which she posted both on her Facebook page and on her YouTube channel, as marking her as a real player in both the 2010 elections and the 2012 presidential campaign. And her advertisement does appear to be effective: it has a clear target audience (American moms) and a clear appeal (anti-DC policies).

But that doesn’t mean that all candidates have been equally successful in harnessing the power of social networking. My current favorite isn’t a big-name politician, so maybe it isn’t fair to compare his efforts to Palin’s, who surely benefits from many consultants advising her campaign. However, no one should produce an advertisement this bad and still expect it to benefit his campaign. Perhaps the candidate believed that emulating “Glee” would endear him to younger voters, but he needed to think much more carefully about who he was targeting and what message he was trying to convey.

Ultimately, what makes a good social media campaign is the same thing that makes any good campaign: a clear target and a clear message.

Always available

Although it isn’t a new issue, an article today in Slate reminded me of my love-hate relationship with cell phones, smart phones, and any device that encourages people to be more “accessible.” The Slate article argues that it is being constantly available that is so very draining on adults in today’s society – and it reminds me of the claims made about texting and availability among teens. I think that what she says has a lot of merit. I remember when I was a teen, when people used to “leave messages” when I was out of the house. It was nice not feeling like I always had to respond immediately whenever something came up, especially because so often this “something” was not especially important or could wait. And I can definitely attest to society’s changing expectations – I once had a student complain that four hours had passed and I had not responded to an email asking a question. After this complaint, I always made sure at the beginning of the semester to highlight my “24-hour email response policy,” but I also agreed to notify students when I was away for an extended period of time – such as when I was out of town.

Meanwhile, I can also see the benefits of being so connected. I own an Android phone, and I do love that I can check my email any time and from any location. Just yesterday, I recognized again my reliance on my phone. When I was supposed to meet my husband after he dropped off our car for repairs, I realized I didn’t remember the name of the shop or its location – and more importantly, I didn’t have my phone to look those things up or call my husband to see where he was. We were able to sort it out (I remembered the location well enough to see him walking to Wendy’s in hopes of finding me), but there was an “oh-shit” moment when I realized I didn’t have my phone and was going to have to do things the “old-fashioned” way. I rely on my phone to check my email for information, use my GPS for directions, or call someone when trying to meet up.

My personal solution for reconciling my enjoyment of accessing the Internet anywhere with my reluctance to always be accessible is simple: I am not always accessible. I am the first to admit that I don’t always answer my phone, even if I see it ringing. There are times when I am not in the mood to talk or just want to be away (for example, dates are strictly phone-free, a policy that required some enforcement at the beginning of my relationship). Even when I get emails, if I am out of town, I often won’t answer unless it is urgent. Some may say it is hypocritical – and maybe it is. But it is the solution that has worked best for me, and what is wrong with having your cake and eating it too?

I’m eager to hear how other people have dealt with this issue. Have you avoided getting a smart phone? Only answer at certain times? Or answer whenever you can?