
White paper: A guide to managing social media for news sites and media organisations

September 2011

For more information call Tamara Littleton on +44 (0)20 3178 5050 or visit www.emoderation.com

eModeration Limited :: The Media Village :: 131–151 Gt Titchfield St :: London :: W1W 5BB :: UK


Contents

Approaching social communities: before you start
Set guidelines – for both users and journalists
Escalation processes
The legal issues of user-generated content on news sites
Creating engaging communities
The future for social news
About eModeration


These days, news breaks on Twitter. MSNBC's Breaking News Twitter feed has (at the time of writing) just over three million followers; BBC Breaking has 1.5 million and CNN has 2.2 million. These are in addition to the myriad of newsroom feeds, channel-specific feeds and, of course, journalists' own feeds.

This paper looks at how media organisations use and manage online communities. It's worth stating that it doesn't specifically address how news organisations use social campaigns to market themselves, but focuses on how best to engage and manage users within 'owned' communities (forums, comment sections, Facebook pages and Twitter feeds).

When Hurricane Irene hit the US in 2011, she had her own Twitter account (borrowed for the duration of the hurricane from a woman of the same name). Mobile phones have played a huge part in communicating the uprisings in the Middle East. When a US Airways plane landed on the Hudson River in early 2009, pictures were uploaded to Twitter from mobile phones during the rescue operation, before news teams had reached the scene.

Twitter has published a 'Twitter for newsrooms' guide, to 'help creative professionals in news, TV, sports and entertainment use Twitter effectively', with details of publishing tools, using Twitter to source information, and even examples of journalists using Twitter to engage with audiences (Katie Couric and the Washington Post's Melissa Bell are cited).


As consumers, we don't just receive news passively any more. We expect to comment on the big stories of the day, interact with journalists and share our thoughts with a community of other readers. Journalists are also bloggers, and bloggers journalists. The big publishers all accept user-generated content on their sites, whether on blogs, articles, videos, or forums, and with or without involvement from journalists or community managers. According to a study of Nielsen audience stats by the Pew Research Center's Project for Excellence in Journalism, Facebook is the third biggest referrer of traffic to news sites (interestingly, Twitter still barely registers – it seems we want topline information from Twitter only).

Of course, audience participation in media goes beyond the newsroom. Text-to-screen technologies mean that we can air our opinions on TV via a mobile phone, from the comfort of the sofa. New TV shows regularly get their own Twitter hashtags (The Apprentice has become a multi-channel series) and Facebook discussions grow around breaking stories or news-based events, such as the Royal Wedding, or the Obama Town Hall debate on MTV. In Sweden, one regional newspaper has gone so far as to open up its editorial decisions to consumers, creating an 'online open newsroom' where readers can suggest and discuss stories via a tool called 'eEditor', created by CoverItLive.

Traditional approaches to restricting access to news don't work any more, either, as UK footballer Ryan Giggs found out when his legal team tried to apply conventional privacy injunction laws to Twitter – newspapers were left unable to report what millions of consumers had worked out from Twitter. Super-injunctions become even less effective when you consider that the public could not possibly comply with them since they don't even know they exist, nor is the majority of the global Twitterati bound by UK legal rulings.


Approaching social communities: before you start

As James Hohmann of Politico says in his paper 10 Best Practices for Social Media: Helpful guidelines for news organizations (written for the 2010-11 ASNE Ethics and Values Committee):

"Putting in place overly draconian rules discourages creativity and innovation, but allowing an uncontrolled free-for-all opens the floodgates to problems and leaves news organizations responsible for irresponsible employees."

The paper is well worth reading, and outlines the social media guidelines of all the major US news outlets.

Set guidelines – for both users and journalists

Guidelines for journalists

There are practical reasons for setting a social media policy for journalists, as shown by new guidelines released in July 2011 by the Associated Press, which state:

"Everyone who works for AP must be mindful that opinions he or she expresses may damage the AP's reputation as an unbiased source of news. AP employees must refrain from declaring their views on contentious public issues in any public forum and must not take part in demonstrations in support of causes or movements. This includes liking and following pages and groups that are associated with these causes or movements. Sometimes AP staffers ask if they're free to comment in social media on matters like sports and entertainment. The answer is yes, with a couple of reasonable exceptions. First, trash-talking about anyone (or team or company or celebrity) reflects badly on staffers and the AP. Assume your tweet will be seen by the target of your comment. The person or organization you're deriding may be one that an AP colleague is trying to develop as a source."


The guidelines also warn against the possibility of bias:

"It is acceptable to extend and accept Facebook friend requests from sources, but we should try to avoid situations that may jeopardize AP's reputation by giving the appearance of bias."

Guidelines for users

It's worth spending some time writing clear, readable guidance for users on what is and what isn't acceptable behaviour on the site. Long, jargon-filled terms and conditions may be necessary for legal reasons, but most consumers won't read them. Highlight the most important things you want users to do in plain language, with a link through to the more detailed terms. This will help to set the tone and values of the community from the outset.

Be clear what action will be taken if the rules are broken (the post will be removed, in extreme or repeat offence cases the user will be blocked, and if illegal action is taken, then the appropriate authorities will be notified, etc) and – importantly – take that action. The Guardian's community standards are a good example to follow.

Escalation processes

Set clear escalation processes so that the community managers can respond quickly and appropriately to issues. A subscription problem will have a different escalation process to a security threat – make sure procedures are in place for various levels of issues.
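Where issues are tracked in a ticketing or workflow tool, those escalation routes can be written down as simple data so that moderators never have to guess who to alert. The sketch below is purely illustrative – the issue categories, teams and response times are hypothetical examples, not recommendations – and shows one way such a mapping might look in Python.

```python
# Illustrative sketch only: the issue categories, teams and target response
# times below are hypothetical examples, not recommendations.
from dataclasses import dataclass


@dataclass
class EscalationRoute:
    notify: str            # team or role to alert
    response_minutes: int  # target time to first response


# Different levels of issue get different routes.
ESCALATION_ROUTES = {
    "subscription_problem": EscalationRoute("customer care", 240),
    "abusive_post":         EscalationRoute("senior moderator", 30),
    "legal_risk":           EscalationRoute("duty editor + legal", 15),
    "security_threat":      EscalationRoute("duty editor + authorities", 5),
}


def escalate(issue_type: str) -> EscalationRoute:
    """Return the route for an issue, defaulting to the strictest one."""
    return ESCALATION_ROUTES.get(issue_type, ESCALATION_ROUTES["security_threat"])


if __name__ == "__main__":
    route = escalate("abusive_post")
    print(f"Alert {route.notify} within {route.response_minutes} minutes")
```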


Response policies

Do you want journalists to remain involved with the community after their article is written and posted? We know from our own experience that where there is evidence of an authoritative voice within the community, there are fewer issues with abusive user behaviour. It also cuts down the amount of spam posted to the site. But it can be problematic.

Dealing with angry or abusive responses can be challenging for reporters, particularly when the abuse is personal. If you expect your reporters to engage within the community, set clear response guidelines and offer training in dealing with negative or personal comments.

Reuters' policy for journalists (reported in Hohmann's paper) states:

"Think before you post. One of the secrets to social media's success is how easy it has become to participate. But that also makes it easy to respond or repeat before you have thought through the consequences. Whether we think it is fair or not, other media will use your social media output as your news organization's comment on topical stories. And you will play into the hands of your critics unless you take care: Resist the temptation to respond in anger to those you regard as mistaken or ill-tempered."

There may even be an argument for a third party (the site's community manager, for example) to select posts for journalists to respond to, in order to reduce the chance of posting in anger. AdAge published a great story on the 'chapter missing' from Twitter's newsroom guide, citing a 'throw down' on Twitter between Jeff Jarvis and Jeff Bercovici. It's very easy to respond in anger with the online world watching.


Dealing with reader comments

You don't have to be a social media expert to know that there are certain types of stories that are more likely to generate abusive comments from readers. Anything covering emotive issues – religion, race, sexuality, war, politics – is likely to attract strong feelings from readers. Recognise which stories are likely to generate strong emotions, and deal with them appropriately:

• Make sure you adhere to the terms of the site. Don't allow abusive or threatening posts
• If appropriate, move the discussion elsewhere, for example to a forum that is separate from the main news site. This distances the user comments from the journalists and allows you to treat it slightly differently from a normal article discussion
• If you are employing moderators (which we would strongly advise), you need to scale up and brief them accordingly

Moderation options for reader comments

There are four options:

1. Pre-moderation (all comments are moderated before they go live). This is obviously the safest route to take to protect the reputation of the organisation and is the one most serious news organisations opt for. However, it has a major disadvantage in that there will be a time lag and many comments may never be screened due to high volumes.

2. Post-moderation (all comments are moderated after they go live, and removed if they are abusive). This is slightly more risky, as inappropriate comments will be seen by the community, and associated with the media brand. Again, there is a danger that high volumes may mean that not all comments are screened.


3. A combination of pre- and post-moderation. Reuters has 'approved' commentators – people who have a strong record of behaving well in a community and having their comments approved – who can move away from pre-moderation to post-moderation. There is a cost advantage to this approach, as you don't need round-the-clock moderation (a minimal routing sketch follows this list).

4. Relying on the community to moderate comments by flagging them. This is the most risky strategy, and has the most potential to damage the brand. In the UK, however, unlike the US, there is still some confusion around who is responsible for user-generated content on a news site, and some news organisations take this route to try to cover themselves under the EU Commerce Directive hosting exemption. But in practice, legal rulings on the use of the Hosting Defence have varied: it's an extremely grey legal area. In the main, brands take their duty of care very seriously, and agree that it is important to moderate thoroughly, to protect the brand's own reputation and, of course, its users.
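As a rough illustration of option 3, the sketch below (in Python, with a hypothetical data model and approval threshold – nothing here reflects Reuters' actual system) routes comments from commenters with a clean approval record straight to publication for later review, while everyone else is held for pre-moderation.

```python
# Rough sketch of option 3 (combined pre- and post-moderation).
# The approval threshold and data model are hypothetical illustrations.
APPROVED_THRESHOLD = 50  # comments approved before a user is treated as trusted


def route_comment(comment: dict, user_history: dict) -> str:
    """Decide whether a comment publishes immediately or waits for review.

    user_history maps user IDs to counts of previously approved and
    rejected comments.
    """
    history = user_history.get(comment["user_id"], {"approved": 0, "rejected": 0})
    trusted = history["approved"] >= APPROVED_THRESHOLD and history["rejected"] == 0

    if trusted:
        # Publish now; a moderator reviews it afterwards (post-moderation).
        return "publish_then_review"
    # Hold for a moderator before it appears on the site (pre-moderation).
    return "hold_for_review"


# Example: a long-standing, well-behaved commenter vs. a brand-new account.
history = {"user_42": {"approved": 120, "rejected": 0}}
print(route_comment({"user_id": "user_42", "text": "Great piece"}, history))   # publish_then_review
print(route_comment({"user_id": "new_user", "text": "First post"}, history))   # hold_for_review
```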

Moderating comments on Twitter

Of course, you can't moderate what people say on Twitter – and creating hashtags for particular subjects can be a risk, as there's no way of moderating a live feed. What you can do is to moderate Twitter feeds that are pulled into websites, to avoid being 'brandjacked' on your site or Facebook page.
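As a purely illustrative example of that kind of filtering – the blocklist, tweet structure and function names below are hypothetical, and a real feed would come from Twitter's API with a maintained moderation word list – a pulled-in feed might be screened like this before anything is rendered on the page:

```python
# Illustrative sketch: screen tweets before embedding them on a site page.
# The blocklist and the tweet format are hypothetical; a real feed would
# come from the Twitter API and use a maintained moderation word list.
BLOCKLIST = {"spamword", "slur_example"}


def is_displayable(tweet_text: str) -> bool:
    """Reject tweets containing blocklisted terms before they reach the page."""
    words = {w.strip(".,:;!?#@").lower() for w in tweet_text.split()}
    return BLOCKLIST.isdisjoint(words)


def moderate_feed(tweets: list[str]) -> list[str]:
    """Return only the tweets that pass the blocklist check."""
    return [t for t in tweets if is_displayable(t)]


print(moderate_feed(["Breaking: storm hits coast", "Buy now spamword!!!"]))
# ['Breaking: storm hits coast']
```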

Dealing with text-to-screen comments

The only way to do this is to pre-moderate all comments using technology and human moderators. It's not worth the risk to the organisation's reputation if an inappropriate post gets through to a screen in a live TV environment. And remember that a Twitpic can make the offending tweet viral in seconds...


Blocking comments

Blocking comments completely is an option in some cases, for example those stories most likely to incite a strong reaction from the community, or on coverage of a legal case. The Portland Press Herald blocked user comments entirely in late 2010, in response to 'vile, crude, insensitive and vicious postings'. (That policy has since been reversed and comments are enabled, but with a strict no-bullying policy made clear at the top of each comments page.)

But there is a real difference between negative and abusive comments. Inviting commentary on news articles will generate differences of opinion and healthy debate. Views that disagree with those of the news organisation should be accepted (and actually help to make the site feel authentic, open and honest). But there is no need to put up with abuse, personal comments or spam. Equally, libellous comments should be removed. Although in the US publications aren't legally responsible for libellous postings on their sites (unless altered by the organisation – see our legal section, below), this is more of a grey area in the UK, and it is good practice to remove anything that could drag you or your users into a court of law.

Some news organisations choose not to publish comments that are very similar to ones that have run before, if they don't add anything to the debate. In Reuters' words:

"Some of the guidelines for our moderators are hard to define precisely. Mocking of public people can be fair sport, for example, but a moderator that has just approved 30 comments calling someone an idiot can rightly decide that there's little incremental value in publishing the 31st. When we block comments of this nature, it's because of issues of repetition, taste or legal risk, not political bias."

(Note: at the time of writing, there appears to be no engagement with the commenters under this particular post.)


Allowing users the right to contest a decision?

If a user contests the decision to delete a comment, or block them, don't enter into a discussion within the thread itself, but move the discussion elsewhere (to a contact centre, for example). As one of our community managers, Tom Miller, says:

"It's unwise to respond to the user who is protesting a moderation decision within the thread itself. Any limited discussion tends to snowball into more users asking 'but why X if not Y?'. It's best to direct it through to a contact centre. If the protest starts a new thread, close it with something like 'Hi xx, we don't discuss moderation issues on the board itself. Please get in touch at xxx and the forum support team will be happy to help'. If it's an old thread, the posts should be split to archive and a post put up along the lines of 'A number of posts have been removed from this thread. We don't permit discussion of moderation actions on the board itself but are happy to discuss via xxx'.

"At the contact centre, after you've explained the decision, if the user still isn't happy and you see no reason to alter the decision, you need to close the discussion: an almost-mean 'we're perfectly within our rights to determine who can and can't use our services and if you can't follow the T&Cs then you're out. No further discussion will be entered into' can work for repeat offenders or particularly silly banned users. There is of course a nicer approach for people who've just got it a bit wrong, saying something like: 'we hope you can understand our reasoning and change how you post next time'."



Dealing with misinformation

Often, the community will pitch in to correct misinformation on comment threads. Serious misinformation posted within the community should be corrected (and if it is defamatory, deleted). But it's not always readers who make mistakes. In this Mediashift post, the author Nathan Gibbs gives excellent advice on how to correct (and delete) erroneous posts on social media platforms, including how to publicly acknowledge the error and notify those who have shared it.

Dealing with anonymous comments

Asking users to register on the site – whether that's using a social network ID or a separate registration process – can reduce spam and abusive comments, and in some cases means the news organisation could hand over details in the event of a lawsuit. (It's worth noting that, depending on the comment software used, an IP address is usually logged with the hosting site, even if there is no registration process.) Of course this is not a foolproof system – as with any social media accounts, it's fairly simple to register false information – but it does help to reduce the problem, although there is a genuine fear that it will reduce participation. The counter-argument is, of course, whether this inhibits free speech, as explored in this article on Gigaom. Also, linking comments to a person's real Facebook persona, for example, could be potentially dangerous. As one of our Account Managers put it: "With so many clients, we go out of our way to remove PI [personal information] – including real names. I don't want to be tracked down on FB just because someone doesn't like my pro-West Ham remarks on the FA.com. Abusive remarks hiding behind anonymity seem to me to be a lesser evil, and more easily mitigated against."


Major news sites, including the New York Times, the Huffington Post and the Washington Post, decided not to allow anonymous comments after a review in 2010; the Sun Chronicle has gone a step further, charging users a nominal one-off fee of 99 cents to register and comment. This is payable via credit card, which means users have to register their real name and address.

The legal issues of user-generated content on news sites

Before we get into the legalities of social media and newsrooms, I should say that we are not legal experts. But we asked for advice from our legal partners in the US and the UK on the legal issues facing news organisations in creating communities online, and their guidance is below. Of course, news organisations should seek their own independent legal advice before making decisions about the inclusion of user-generated content.

In the US, the law is clearer than in the UK in protecting news organisations from liability following a user-generated comment on their sites. Deborah Peckham, of Burns & Levinson LLP, says:

"Under U.S. law it is clear that a news organization is not responsible for the potentially defamatory content of third party comments posted online. Section 230 of the Communications Decency Act (47 USC § 230) provides: 'No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.' Accordingly, online service providers, including news organizations that publish online, are treated merely as 'distributors' of online content and not publishers, for purposes of assigning liability.

"Furthermore, online news providers generally will not be accountable even if they edit or remove comments, provided that the editing of content doesn't materially change the post or comment (for instance by adding a word or removing a word that results in defamatory content)."


This principle applies to all social platforms. We asked if there was any difference in legal liability for user-generated content between a platform such as Twitter and a publisher. As Peckham says:

"Under U.S. law there is no difference. The critical element is that the publication/distribution of comments occurs online and that the comments are spoken/posted by a third party that is not an employee of the news organization. Under such circumstances, there is no liability to the news organization (or the communications platform, e.g., Yahoo! or Twitter)."

In the UK, this is a more complex issue. The EU Commerce Directive's hosting exemption removes liability for certain sites from comments posted by users, provided that the content has not been previously seen by the publisher, but it isn't completely clear to which sites that exemption applies (we wrote a more detailed post on this issue, here).

On the subject of defamation, Rachel Boothroyd, of WTS Legal in the UK, says:

"There is a major difference between hate comments and defamation. The first is an unpleasant or negative expression of opinion. There is nothing unlawful about this unless it strays into the realm of the obscenity laws. The second is unlawful and a recourse via the courts on the basis that it is an untrue statement which would lower that person in the eyes of the right-thinking public.

"So let's consider the defamation issue. Display of defamatory comments beneath a news article would attract the same analysis as any potential liability for defamation issue online. The key question is whether the organisation is merely hosting the material and thus may benefit from the exemption under the E-Commerce Directive (subject to notice and take down rules) or whether it is publishing the material in which case it does not benefit from any exemption."

However, no publisher wants their brand to be associated with bad material, and the risk of being sued for defamation is not the only risk that they are running: by only reactively moderating, they are potentially hosting illegal content in the form of obscenity, terrorism, racial abuse, child endangerment... the list goes on.

In the US, it is only changing the sense of a user's post that could result in the publisher being liable for that content. Peckham says:

"For instance, if the news organization changes the sentence 'Mayor Dobbins is not a thief' to 'Mayor Dobbins is a thief', then the news organization may lose its immunity from publisher liability for defamatory content under the Communications Decency Act."

News organisations in the US and the UK have no legal obligation to hand over users' details. As Peckham says of the US:

"Under U.S. law there is no requirement that service providers or publishers release identities. Typically, aggrieved subjects of allegedly defamatory content will seek legal redress in the form of a court order requiring disclosure. While the law is unsettled, most courts that have heard such challenges require that the complaining party submit adequate evidence to sustain a defamation claim before they will issue an order to a publisher requiring disclosure. That said, as a practical matter many service providers, including news organizations, will post terms of use that purport to notify posters that they will (or will not) identify posters with or without a court order. Many simply do not want to be sued or appear in court and so will reserve the right to reveal the names of authors so as to avoid legal proceedings."

Rachel Boothroyd says of the UK:

"In the UK, there is no legal right to access. In fact, it is arguable that under Data Protection laws the news organisation would be in breach of its obligations if it gives away personal information relating to a subscriber or otherwise a person submitting comment. For these and other reasons, many news organisations and ISPs do not as a matter of policy provide such details and a complainant must obtain a court order compelling such disclosure."


Creating engaging communities

Set the tone for each community. Different communities (and channels) require different handling. Meg Pickard, head of digital engagement at the UK's Guardian newspaper, is quoted in an article by Emma Heald on editorsweblog.org as saying Twitter is a more "conversational medium" than Facebook, and has a "real immediate journalistic aim," while Facebook is more about sharing content with friends. In the same article, Heald goes on to say:

"Pickard made an interesting differentiation between the way the paper uses Twitter and Facebook in terms of the type of content it posts to each. With Facebook, it is necessary to remember that this is more social territory, Pickard said, and that the news from media organisations will be mixed in with news from users' friends."

Apply resource. Managing a community takes time and effort, both from moderators and community managers and from the journalists who created the original content. In another article by Heald, she quotes the New York Times social media editor, Jennifer Preston, as saying:

"The journalists have to own the page. I warn them that it's like a puppy... you have to be prepared to commit."

Have conversations, don't just broadcast information. The secret to a valuable community is to listen as well as to feed information out into the community. Asking questions, listening to users and ascertaining the mood of the community works as market research for media organisations and can help them develop stories that their readers want.


Stay involved. A journalist's personal brand can support that of his or her media organisation, and inspire loyalty, according to Brandmeajournalist.com. Although, of course, this also poses a risk for the organisation – if a journalist leaves, his or her followers leave, too. Facebook has a guide for journalists on how to set up their Facebook pages, including the following advice:

"Ask questions, solicit feedback, and use the wisdom of the crowd. Pages enable you to not only have a dialogue with your community around the content you're producing, but also enlist them in the process for crowdsourcing."

Here is a useful guide about the kind of posts that will engage (including posting out of hours).

Dan Gillmor's book, Mediactive, cites the example of Robert Niles:

"Robert Niles, who has created a number of online services including the award-winning ThemeParkInsider.com, says that tomorrow's journalists will need to be community organizers – and that you'll need to understand that the people who pay the bills, not just the audience, comprise one of the communities you'll need to organize and serve. This is true for a one-person effort or a larger one."

"Know what you're doing online," Niles says. "Embrace community organizing; create value for a community… [and] you will find a community that will value you."


The future for social news

There's an excellent article in The Economist which claims that news is going back to its traditional roots before mass media: a time when it was spread through word of mouth in coffee houses and taverns, via leaflets and newsletters. Those coffee house conversations are now held on Facebook, and Twitter has replaced the pamphlet. But the basic human behaviour – sharing information and views on current events – hasn't changed.

While of course social media doesn't cause revolutions, it plays a part in communicating those revolutions to the outside world, faster than we have ever known (and in doing so facilitates them, providing critical mass to nascent movements). A consumer with a mobile phone is as likely to provide that front-page story as a journalist in situ. The traditional news model – one-way communication of news – has shifted for good. But consumers are still seeking out reliable news sources over these new channels, just as journalists are seeking out reliable sources.

Managed properly, the engagement between news media and consumers could benefit both sides immeasurably.


About eModeration

eModeration Limited is an award-winning social media management agency. Based in London, UK, with offices in Los Angeles and New York, eModeration provides multi-lingual moderation and community management services, consultancy and social media crisis management training to clients in the TV, entertainment and digital publishing industries, and to blue-chip clients hosting online communities.

Committed to ethical business practices and to the promotion of child online safety, eModeration's CEO Tamara Littleton recently worked with the UK Government department UKCCIS to produce its guidelines on how to moderate online environments for children.

eModeration contributes to the growth of knowledge in the social media world via its white papers, blogs and seminars, and has a strong roster of returning clients who appreciate the high quality of its services.

For further press information, or to speak to Tamara Littleton, CEO of eModeration, please contact:

Kate Hartley

Carrot Communications

Tel: +44 (0)771 406 5233

E: [email protected]

Twitter: @katehartley

© eModeration Limited 2011. This document is the intellectual property of eModeration Limited and may not be duplicated or disclosed to any third party without the written permission of an authorised officer of the company.