If you thought 2018 was tough for you, imagine being a staffer in Facebook’s public relations department. Facebook CEO Mark Zuckerberg began the year by pledging to fix the company’s problems, but instead 2018 turned into 12 months of mea culpas, self-inflicted scandals, and screwups.
Dami Lee, writing for The Verge:
Facebook is rolling out more music features today, bringing more ways to integrate its licensing partnership with all three major labels into Stories, user profiles, and its Lip Sync Live feature.
Starting today, users will be able to add music stickers to their Facebook Stories. You can search for songs, pick out the part you want to share, and add the sticker with the artist and song name. It works exactly the same way as it does on Instagram Stories, which introduced the feature in June.
Facebook on Friday said an attack on its computer network led to the exposure of information from nearly 50 million of its users.
Parmy Olson, writing at Forbes:
It’s also a story any idealistic entrepreneur can identify with: What happens when you build something incredible and then sell it to someone with far different plans for your baby? “At the end of the day, I sold my company,” Acton says. “I sold my users’ privacy to a larger benefit. I made a choice and a compromise. And I live with that every day.”
Mike Isaac, reporting for The New York Times:
Kevin Systrom and Mike Krieger, the co-founders of the photo-sharing app Instagram, have resigned and plan to leave the company in coming weeks, according to people with direct knowledge of the matter. The exits add to the challenges facing Instagram’s parent company, Facebook.
Mr. Systrom, Instagram’s chief executive, and Mr. Krieger, the chief technical officer, notified Instagram’s leadership team and Facebook on Monday of their decision to leave, said the people, who spoke on condition of anonymity because they were not authorized to discuss the matter publicly.
Whoa. This isn’t a good sign for the future of Instagram.
Todd Spangler, writing for Variety:
Shares of Facebook plunged more than 19% in early trading Thursday, as investors reacted to signs that the social-media giant’s user and revenue growth are significantly slowing down.
The stock drop, to its lowest levels in nearly three months, wiped out more than $110 billion in market capitalization for Facebook and dragged down other internet and tech stocks including Twitter and Snap. Facebook’s market cap was $629.6 billion at the close of market Wednesday, and stood at around $503 billion as of Thursday at 10:30 a.m. ET.
Nick Statt, writing at The Verge:
An investigative journalist who went undercover as a Facebook moderator in the UK says the company lets pages from far-right fringe groups “exceed deletion threshold,” and that those pages are “subject to different treatment in the same category as pages belonging to governments and news organizations.” The accusation is a damning one, undermining Facebook’s claims that it is actively trying to cut down on fake news, propaganda, hate speech, and other harmful content that may have significant real-world impact.
Burn it all down.
Kara Swisher sat down with Mark Zuckerberg of Facebook. It did not go well:
Swisher: Okay. “Sandy Hook didn’t happen” is not a debate. It is false. You can’t just take that down?
Zuckerberg: I agree that it is false.
I also think that going to someone who is a victim of Sandy Hook and telling them, “Hey, no, you’re a liar” — that is harassment, and we actually will take that down. But overall, let’s take this whole closer to home…
I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong, but I think—
Swisher: In the case of the Holocaust deniers, they might be, but go ahead.
Zuckerberg: It’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public figures we respect do too, and I just don’t think that it is the right thing to say, “We’re going to take someone off the platform if they get things wrong, even multiple times.”
You know who isn’t coming into any argument in good faith? Holocaust deniers. Big tech’s inability to grasp its moral obligations, given the power and influence it now wields, is one of the biggest crises facing us today. These companies simply don’t want responsibility for what they’ve built.
Zeynep Tufekci, writing for The New York Times:
In a largely automated platform like Facebook, what matters most is not the political beliefs of the employees but the structures, algorithms and incentives they set up, as well as what oversight, if any, they employ to guard against deception, misinformation and illegitimate meddling. And the unfortunate truth is that by design, business model and algorithm, Facebook has made it easy for it to be weaponized to spread misinformation and fraudulent content. Sadly, this business model is also lucrative, especially during elections. Sheryl Sandberg, Facebook’s chief operating officer, called the 2016 election “a big deal in terms of ad spend” for the company, and it was. No wonder there has been increasing scrutiny of the platform.
Facebook’s “if both sides are mad, we’re doing something right” defense is absolutely absurd.
Facebook Inc. is offering major record labels and music publishers hundreds of millions of dollars so the users of its social network can legally include songs in videos they upload, according to people familiar with the matter.
The posting and viewing of video on Facebook has exploded in recent years, and many of the videos feature music to which Facebook doesn’t have the rights. Under current law, rights holders must ask Facebook to take down videos with infringing material.
Alex Kantrowitz, writing for BuzzFeed:
For well over a year now, the digital advertising and publishing industries have grappled with the growing power of Google and Facebook, which suck up 98% of every new ad dollar spent online, according to some estimates. With so much growth and power concentrated in just two companies, publishers worry about the viability of their ad businesses, while advertisers bemoan their loss of leverage around ad buys.
Deeply unsettled by the idea of a Google-Facebook duopoly, both groups have done what they can to defend against it. But so far, nothing they’ve done seems to have worked.
It should be obvious that two companies controlling this much of the online advertising market is bad.
Sam Levin, writing for The Guardian:
“A bunch of conservative groups grabbed this and said, ‘Hey, they are trying to silence this blog – share, share share,’” said Winthrop, who published the story that falsely claimed hundreds of thousands of Irish people were brought to the US as slaves. “With Facebook trying to throttle it and say, ‘Don’t share it,’ it actually had the opposite effect.”
Mark Zuckerberg has outlined a new mission statement for Facebook:
This is a time when many of us around the world are reflecting on how we can have the most positive impact. I am reminded of my favorite saying about technology: “We always overestimate what we can do in two years, and we underestimate what we can do in ten years.” We may not have the power to create the world we want immediately, but we can all start working on the long term today. In times like these, the most important thing we at Facebook can do is develop the social infrastructure to give people the power to build a global community that works for all of us.
For the past decade, Facebook has focused on connecting friends and families. With that foundation, our next focus will be developing the social infrastructure for community — for supporting us, for keeping us safe, for informing us, for civic engagement, and for inclusion of all.
Kara Swisher spoke with Mark over at Recode about the manifesto:
He is particularly concerned about the use of headlines as the only signal to online users. “In most news consumed on paper, the headline is not separate from the content,” he said. “Online the headline is often the only indicator and that is a problem.”
Zuckerberg seems to get how complex it is and how damaging fake news is to the Facebook platform. “Our community wants good information and no one ever said, ‘I want misinformation,’” he said.
Lucas Shaw, writing for Bloomberg:
The world’s largest social network has redoubled its efforts to reach a broad accord with the industry, according to interviews with negotiators at labels, music publishers and trade associations. A deal would govern user-generated videos that include songs and potentially pave the way for Facebook to obtain more professional videos from the labels themselves.
The Washington Post is reporting that Facebook is rolling out some new tools to help tell users when news stories in their feed may be fake:
“We have a responsibility to reduce the spread of fake news on our platform,” said Facebook’s Adam Mosseri, vice president of product development, in an interview with The Post. Mosseri added that Facebook still wants to be a place where people with all kinds of opinions can express themselves. And Facebook has no interest in being the arbiter of what’s true and what isn’t for its billion users, he said.
Facebook and Google have decided to ban fake news sites from using their advertising networks. Again, this seems like something that would have been useful a few months ago, but it’s a step in the right direction.
Google kicked off the action on Monday afternoon when the Silicon Valley search giant said it would ban websites that peddle fake news from using its online advertising service. Hours later, Facebook, the social network, updated the language in its Facebook Audience Network policy, which already says it will not display ads in sites that show misleading or illegal content, to include fake news sites.
The most obvious way in which Facebook enabled a Trump victory has been its inability (or refusal) to address the problem of hoax or fake news. Fake news is not a problem unique to Facebook, but Facebook’s enormous audience, and the mechanisms of distribution on which the site relies — i.e., the emotionally charged activity of sharing, and the show-me-more-like-this feedback loop of the news feed algorithm — makes it the only site to support a genuinely lucrative market in which shady publishers arbitrage traffic by enticing people off of Facebook and onto ad-festooned websites, using stories that are alternately made up, incorrect, exaggerated beyond all relationship to truth, or all three. (To really hammer home the cyberdystopia aspect of this: A significant number of the sites are run by Macedonian teenagers looking to make some scratch.)
However, during the time period analyzed, we found that right-wing pages were more prone to sharing false or misleading information than left-wing pages. Mainstream pages did not share any completely false information, but did publish a small number of posts that included unverified claims. (More on that below.)
We rated 86 out of a total 666 right-wing Facebook posts as mostly false, for a percentage of 13%. Another 167 posts (25%) were rated as a mixture of true and false. Viewed separately or together (38%), this is an alarmingly high percentage.
Left-wing pages did not earn as many “mostly false” or “mixture of true and false” ratings, but they did share false and misleading content. We identified 22 mostly false posts out of a total of 471 from these pages, which means that just under 5% of left-wing posts were untrue. We rated close to 14% of these posts (68) a mixture of true and false. Taken together, nearly a fifth of all left-wing posts we analyzed were either partially or mostly false.
Facebook, what are you doing? This is where a shocking number of people get their “news,” and right now a huge part of it is literal bullshit.
The Washington Post tracked Facebook’s “Trending Topics” section for three weeks. They found a bunch of fake and inaccurate stories.
Our results shouldn’t be taken as conclusive: Since Facebook personalizes its trends to each user, and we tracked results only during work hours, there’s no guarantee that we caught every hoax. But the observation that Facebook periodically trends fake news still stands — if anything, we’ve underestimated how often it occurs.
Maybe it’s time to re-think this whole thing, yeah?