
LikeWar author P.W. Singer on how Taylor Swift, Facebook, Trump, and others helped turn us into accidental soldiers in the battles of the future.

War is memes. “Don’t be a victim like the Americans”

“And yet, we literally invented the internet,” says P.W. Singer [Photo: Sam Cole, courtesy of P.W. Singer]

By Alex Pasternack · 10 minute read

One weekend in June 2009, Twitter was preparing an upgrade that would take the service offline for a few hours, when the U.S. State Department called. Delay the upgrade, officials requested, or risk jeopardizing an important means of political expression during daytime hours in Iran, just days before a presidential election there. Twitter complied. As a State Department official explained, “We highlighted to [Twitter] that this was an important form of communication . . . This is about giving their voices a chance to be heard.”

You’d be forgiven for forgetting that nearly decade-old bit of internet history, when Twitter and Facebook were still shiny new tools for revolution, or democracy, or something. All that’s been buried in the rubble of an unending social media war zone.

Troll armies targeting activists and journalists, including the murdered Jamal Khashoggi, in an effort to silence them. Militaries using Facebook as a weapon in government-backed ethnic cleansing campaigns, as the UN’s recent report on Myanmar describes it. Spamming WhatsApp users with hard-to-stop fake news ahead of an election (see Brazil). Also: hashtag hijackings by ISIS. Gang beefs on Facebook. Imprisoned bloggers. Deepfakes. Digital badges for making good memes on behalf of Israel. Pretty much every election now.

War zone isn’t a facile metaphor: Real, kinetic battlefields have been bleeding over into the internet for years, and vice versa. This is one idea behind the title of LikeWar, Peter W. Singer and Emerson T. Brooking’s wide-ranging survey of the ways information has been weaponized and even changed war itself. If cyberwar is hacking of the networks, “Likewar,” says Singer, “is the hacking of the people on the networks by driving ideas viral through likes and lies.”

The battles here can ultimately be deadly, but most immediately they are battles for our attention. “Whether it is the actual Israeli army (the Israel Defense Forces versus Hamas), or politics (Donald Trump versus The Resistance), or Kanye West versus Taylor Swift’s online armies, they’re all using the same tactics and techniques, and learning the very same lessons, because it’s the same battlespace.”

Singer, who is a national security strategist at the non-partisan think tank New America, recently spoke with me about this weird, worrisome future of war. Here is an edited, condensed version of our conversation.

All’s fair in likes and war

Fast Company: You’ve got a quote from U.S. Army colonel turned historian Robert Bateman: “Once, every village had an idiot. It took the internet to bring them all together.” How does the internet change the propaganda equation?

Peter W. Singer: What social media allowed is the combination of the revolution of the telegraph or the telephone—that one-on-one connection at a distance—with the revolution of, first, the printing press, then radio, then TV: broadcasting to the world, the simultaneity of it. That’s true whether you’re talking about marketing or propaganda and information warfare, and the fuzzy lines between them.

You have past examples of propaganda: the Germans broadcast radio into Great Britain during the Blitz in World War II. But they weren’t able to make that one-on-one connection, and it wasn’t very effective. In fact, the German radio show was popular mainly because the British liked to laugh at it.

One of the differences with social media—you see these same approaches used by wildly diverse actors—and one of the lessons of winning with it, is inundation through experimentation. Everyone from ISIS to the Donald Trump campaign to BuzzFeed—these are models of, “I can push out not just one message but multiple ones, and, importantly, I’m getting constant feedback about whether they’re working or not.” Who’s clicking, who’s reacting, and then I’m combining that with knowledge about what that person is posting about themselves elsewhere: their profile and, once you get into the Cambridge Analytica stuff, their psychological tells.

FC: And then…

PS: I can then swing back and refine: now I know that this is a man between the ages of 30 and 40 that likes Subarus and reacts to light blue and is a fan of the TV show JAG. Now I can target everyone else who has the same kind of profile. The revolution that this has brought to the news business, to marketing and regular businesses–it’s also brought to politics and war.

And in fact, that’s where you see this blurring of the lines: on one end, Russian military intelligence units using Facebook digital marketing techniques that are available to everyone to influence an election. And at the opposite end, you have teenagers taking selfies and doing live broadcasts, and they’re using those to influence the outcome of a physical battle, like ISIS at the battle of Mosul.

‘They’re getting better’

FC: The State Department office designed to counter propaganda from U.S. adversaries was finally exempted from a hiring freeze in April and said then that it planned to bring on experts on Russia, China and Iran. The Pentagon is now talking about how it’s countering Russian information operators. How capable do you think the U.S. is now, and where is policy in Washington, when it comes to fighting wars through social media?

PS: We’re behind the curve. And there’s a shame in that, in that we literally invented the internet. And yet we’re the example that other nations point to: “The Americans. Don’t be a victim like the Americans; don’t let what happened to the Americans happen to us.” That’s the discourse in Sweden, Estonia, France.

But this is a state of conflict, so there’s been learning. In the book, we open with the first battle of Mosul, where ISIS basically just shocks everyone with its use of social media, where it’s not just recruiting people: it’s weaving it into its physical battlefield operations. The force in Mosul is about 2,500 men, lightly armed, and they’re taking on a defending force of over 20,000 that’s backed by us, so it has Abrams tanks, Black Hawks. And yet, partly through the use of social media, ISIS is in effect able to collapse the Iraqi army.

Now everyone’s learning. And part of that learning is going on in the U.S. military. You could see this in the second battle of Mosul, where we take back Mosul. There are a lot of things we were doing differently than, say, back in 2014: now we are the ones doing the hashtag wars, the live broadcasts, et cetera.

This is one of the areas where [military theorist Carl von] Clausewitz would recognize that, basically, yeah, of course, there are two thinking adversaries. When one side gets an advantage, the other side is going to learn, change tactics, and move on. It’s that attention economy that rewards players for their activity online, and the players who understand it. It’s something that everyone from Donald Trump to Nike understands.

During our research for the book, [reality TV star] Spencer Pratt walked us through all the ways you use narrative to manipulate people, to achieve your goal. His goal was to become famous, and that he did. And he linked that back to psychology studies. A couple of weeks after speaking with Speidi [Spencer and his wife, TV star Heidi Montag], we go to meet with the head of the U.S. counter-ISIS online propaganda campaign. And after the interview, I realized, they don’t get it. Our reality star has a better handle on winning the war online than our people in charge of it do—and that is not good.




FC: There’s ISIS, but there’s also the information campaigns that are happening at home, which have posed their own challenge for U.S. authorities.

PS: We know that there was Russian activity back during the Republican nomination, ads going after Cruz and Rubio. [How we missed it] is a question for the military and intelligence community, but it’s also a question for the platforms. I think the intelligence community was missing it because the social media activity was out in the open, and the cybersecurity teams at the companies were looking at hacks of customer accounts. They weren’t looking at mass ad buys or people posing as customers. It’s a bit of a parallel to what the 9/11 Commission found: that we were looking in the wrong place.

I also think that’s why we are now not as up in arms about this as we ought to be, because it is out in the open. It’s not the secret intercept that reveals it, it was, “Oh, it’s these Facebook ad buys.” And that’s hard for the media and the public to wrap its head around. It’s a little bit of a parallel to some of the Trump scandals. “Well, he just said it.” If we had a secret tape recording of him saying it, it would be different.

Let’s put it this way: We know [disinformation campaigns] are going to continue because all the players in them think they worked. We know Russia thinks it worked because they’re coming back for more. And we also know that while Cambridge Analytica itself may not be with us in the same way, the people, the databases, the funders, et cetera, are still at work. And they’re getting better.

Preparing for the next ones

FC: Facebook, Twitter, Google and other companies say they are now taking serious action to fight disinformation in political campaigns, for example. What do you make of tech platforms’ responses?

PS: I think they could take a page from the military. This is where they actually ought to be doing more of this kind of war gaming of their products. And it would help them avoid a lot of the problems that have hit them. There’s the practice generally known as beta testing: push it out into the world, customers give us feedback, and we revise. That’s fine when it’s some kind of app, like a pizza app or something. It’s not fine when it’s the nervous system of the modern world, when it’s being used to fight everything from elections to physical wars.

One of the new rules of LikeWar, of this space, is that some of the most powerful actors in war and politics today are a handful of geeks who never set out to have this role. Zuckerberg initially built some software to rate whether his classmates were hot or not. After the 2016 election, he called it a “pretty crazy idea” that this could have happened on his platform, that people could have been influenced in this way. Of course there’s the irony that Facebook is literally advertising to political campaigns at that point, because this is the best place to influence people.

So they’re continually surprised: Oh, the terrorists are doing live feeds—my gosh, I didn’t realize terrorists would live broadcast attacks. And it’s not just that bad guys might misuse it; good guys might misuse it too. Oh, gosh, we didn’t anticipate that teenagers would live broadcast their suicides! The military is war gaming that out. It’s not a cybersecurity, find-software-vulnerabilities question. It’s: How might the bad guys misuse this? If they were doing that, we’d be in a much better place.



FC: The 9/11 Commission also identified problems with collaboration and communication. What sort of collaboration do we need between the private sector and the government and the public in order to better cope with meme warfare?

PS: There is a need for information sharing across government. During the Cold War, in the 1980s, there was a program called the Active Measures Working Group, related to KGB activities. It was a space for spies, diplomats, broadcasters, and educators to come together, identify KGB disinformation programs, and then develop responses to them. The battleground they were looking at back in the day was the third world; now it’s everywhere. But we don’t have a version of that today.

The public-private sector collaboration could be like the Centers for Disease Control: a model that brings together the public and private sectors, identifies viral outbreaks, and offers suggestions about what to do. The Baltics [a hotbed of Russian-funded disinformation campaigns] have set up versions of this. There’s a great parallel here for thinking about viral outbreaks and the like.

And then finally there’s collaboration across the private sector, where companies that compete in different ways can share information better. And that’s not just the tech companies; it’s going to, I believe, have to involve the media companies. To give you a parallel of best practice: in Norway, the media compete hard—the different news organizations are competitors—but they do share on things like identifying disinformation campaigns, shared fact-checking, and the like, in a way that we just don’t have in the U.S. right now.

It’s not that you’re going to stop all of this activity, because some of it is the nature of business, the nature of politics, and the nature of war. But as in all these spaces, we need to understand it, set up policies for it, and set up rules. And it goes back to us: if we’re both the combatants and the targets in these wars, we need to understand the weapon that’s in our hands and how it’s being used against us.



ABOUT THE AUTHOR

Alex Pasternack is a contributing editor at Fast Company who covers technology and science, and the founding editor of Vice's Motherboard. Reach him at apasternack@fastcompany.com and on Twitter at @pasternack
