In-faux-mation: John Gray

Fri, 12 Apr 2019

John F. Gray
CEO/co-founder Mentionmapp Analytics Inc.
SFU Alumnus, BASc Communications ‘85 & BA English ‘93

Question: Has misinformation (or disinformation) ever impacted or interfered with your professional work?  

The simple answer is yes. Misinformation and disinformation have impacted my professional work. In late 2016, with my co-founder's support, we embarked on a complete business makeover. Our goal is to be part of the solution to a global problem, and the purpose is simple as well: solutions are desperately needed. This complex problem (no hyperbole intended) is a grave threat to our democratic institutions and values.

Two weeks before the 2017 Presidential inauguration, we started testing a prototype application combining our Twitter network visualization tool with a bot detection algorithm. Curiosity pushed us to better understand how bots were operating in social media.
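Bot detection approaches vary widely. As a rough illustration of the idea, here is a toy heuristic scorer; the profile fields, thresholds, and weights are my assumptions for the sketch, not Mentionmapp's actual algorithm:

```python
def bot_score(profile: dict) -> float:
    """Toy heuristic bot-likeness score in [0, 1].

    The fields and thresholds are illustrative assumptions,
    not Mentionmapp's actual detection method.
    """
    points = 0
    # A very high tweet rate is a common automation signal.
    if profile["tweets_per_day"] > 100:
        points += 40
    # Default-looking profiles (empty bio) are weakly suspicious.
    if not profile["bio"]:
        points += 20
    # Following far more accounts than follow back.
    if profile["following"] > 10 * max(profile["followers"], 1):
        points += 40
    return points / 100
```

Under these assumptions, a profile tweeting 250 times a day with no bio and a lopsided following ratio scores 1.0, while a typical human profile scores near 0.0. Real systems combine many more signals, which is one reason proving a profile is a bot is so difficult.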

It’s important to qualify how we’ve approached our research and reporting effort.

  • Let the data tell the story; never force the data to fit a narrative.
  • Recognize that not all bots are bad bots. Plus, proving a profile is a bot is extremely difficult.
  • Make no claims about attribution. Proving who the operators/owners are is even more difficult.
  • Make no claims about impact. Proving this activity equals people taking action is off the difficulty charts altogether.
  • Report on suspect and ill-intended behavior.

Elections and political events have often been at the center of our research. It was finding bots at work in the #BCpoli conversation before the 2017 BC Provincial election that caught us by surprise. We suspect a number of things, but can only prove that the long-ago-deleted profile @ReverendSM was using commercially available bots to amplify and manipulate the metrics (retweets and likes) of its self-authored (anti-Premier) tweets.

From a collection of 13 tweets, the same pattern emerged: more bots than real people engaged with the content. Overall, we connected 280 bot profiles at work and, more importantly, can suggest the activity did not influence the election's outcome. It was interesting to note that a few days after we spoke about our findings on the John McComb (CKNW98) show, @ReverendSM disappeared from Twitter. Did we cause this? We'll never know, nor will we make that claim, but what we do know is that someone (not the Russians) intended to see what could be learned from their efforts.
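The pattern described above, more bots than real people engaging with a tweet, reduces to a simple ratio once each engaging profile has been labeled. A minimal sketch, using hypothetical sample data rather than our actual dataset:

```python
def bot_share(engagements: list[dict]) -> float:
    """Fraction of engaging profiles flagged as suspected bots."""
    if not engagements:
        return 0.0
    flagged = sum(1 for e in engagements if e["suspected_bot"])
    return flagged / len(engagements)

# Hypothetical engagement records for a single amplified tweet.
sample = [
    {"profile": "@bot_a", "suspected_bot": True},
    {"profile": "@bot_b", "suspected_bot": True},
    {"profile": "@bot_c", "suspected_bot": True},
    {"profile": "@human_1", "suspected_bot": False},
    {"profile": "@human_2", "suspected_bot": False},
]
print(bot_share(sample))  # 0.6: suspected bots outnumber real people
```

A share above 0.5 across a collection of tweets is the kind of signal that flags amplification worth reporting; it proves suspect behavior, not attribution or impact.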

This case was an acknowledgement that this kind of online manipulation is a local problem too. And, most importantly, we have come to see this issue as more than a tale of bots, Russians, and the US Presidency.

Since then, we've researched and reported on a range of topics and issues: in 2017, the anti-vax movement, cryptocurrency, and fake influencer marketing; in 2018, audience engagement around events like the Super Bowl and the World Cup of Soccer, and hashtags such as #YellowVests, #NATO, #OperationOliveBranch (Turkey's incursion into Northern Syria), #HandsOffVenezuela, and many more. The disinformation age is global, and it touches every discourse of social importance.

What has this taught us? I'll answer: "The bad guys are winning. They have a playbook and no rule book, while the good guys are bound to the rule book and don't yet have a playbook."

Today's conversation needs to evolve beyond cybersecurity, protecting our networks and vital infrastructure, and preventing the theft of our data and intellectual property. More than ever, we need to define and implement a framework of Cognitive Security. It's the hacking of our brains and the targeting of our emotions that's eroding trust in democracy and sowing chaos across our information landscape.

While working to confront this dark spectre, I remain optimistic and firmly committed to helping find meaningful information security solutions. There are values and principles still worth fighting for.

“Falsehood flies, and the truth comes limping after it.” - Jonathan Swift.
