Thursday, August 6, 2020

BH20: Keynote: Hacking Public Opinion

Renée DiResta, Research Manager, Stanford Internet Observatory

Vocab background: Misinformation is information the sharer believes is true and shares in an attempt to help people. Disinformation is information the sharer knows is false. Propaganda is information created to make you feel and act a certain way (not always false). Finally, there's the agent of influence - someone acting on behalf of someone else (a nation state, etc.).

Dissemination is an important part of sharing information. In the past, someone would have to physically hand out flyers. This got easier with TV and radio, but was still restricted. Then we got zero-cost publishing with blogging - but attracting an audience was still tricky. Now we have social media - feeds designed for engagement and dissemination.

Now we have a glut of material, no editors, no gatekeepers - just an algorithm that rates, ranks and disseminates.  These algorithms are gameable, and the systems are open to everyone.
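To make the "gameable" point concrete, here is a minimal, purely illustrative Python sketch of engagement-weighted ranking. The fields and weights are my own assumptions for the example, not any platform's actual formula - the point is just that whatever signal gets weighted is the signal that can be inflated.

# Purely illustrative: rank posts by a simple engagement score.
# Field names and weights are assumptions, not a real platform's formula.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Shares and comments weighted above likes because they spread content
    # further - exactly the signal a coordinated bot farm can inflate.
    return post.likes + 3 * post.comments + 5 * post.shares

def rank_feed(posts):
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    Post("calm policy explainer", likes=120, shares=4, comments=10),
    Post("outrage-bait meme", likes=80, shares=60, comments=45),
]
for p in rank_feed(feed):
    print(f"{engagement_score(p):6.0f}  {p.text}")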

We are now going beyond influencing public opinion to hacking public opinion. It's easy and cheap to create fake media companies and personas - that's how the platforms were designed.

We see four tactics: distract, persuade, entrench (highlight and exacerbate existing divisions), and divide.

Now our broadcast media feeds into social media - and it also flows in reverse! Both of these can be easily influenced by bad actors.

Renée then walked through a few examples from China: obvious government propaganda, less obvious propaganda, and then "news" coming from a fabricated news company on Twitter to make China look good. In addition, many Chinese news agencies have Facebook pages - even though Facebook is banned in China. Why? To influence China's image in countries that do have access to Facebook; this was used recently to discredit Hong Kong protestors.

She also did a great breakdown of how Twitter bots are created, how to figure out their purposes, and how effective they were (engagement, number of retweets, etc.).
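As a rough, hypothetical sketch of that kind of effectiveness measurement, the Python below aggregates engagement over a suspected bot account's tweets. The tweet records are invented sample data, not a real Twitter API schema.

# Hypothetical example: summarize the traction a suspected bot's tweets get.
# The records below are invented sample data, not a real Twitter API schema.
from statistics import mean

suspected_bot_tweets = [
    {"text": "divisive meme #1", "retweets": 240, "likes": 90, "replies": 12},
    {"text": "divisive meme #2", "retweets": 5, "likes": 2, "replies": 0},
    {"text": "divisive meme #3", "retweets": 310, "likes": 150, "replies": 40},
]

def engagement_summary(tweets):
    retweets = [t["retweets"] for t in tweets]
    total = sum(t["retweets"] + t["likes"] + t["replies"] for t in tweets)
    return {
        "tweets": len(tweets),
        "total_retweets": sum(retweets),
        "avg_retweets": round(mean(retweets), 1),
        "total_engagement": total,
    }

print(engagement_summary(suspected_bot_tweets))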

Memes are properties created for social media - easily digestible and identity focused. They are often created by state actors to sow more division, on both sides of the political spectrum.

She gave a great deep dive into Russian interference in the 2016 election, with lots of excellent graphics.

Well-researched state agents will exploit divisions in our society using vulnerabilities in our information ecosystem. They will likely target voting machines again and try to infiltrate groups. But most of all, they will aim to reduce trust in US elections.

The more these images and stories spread, the more they start to influence and impact people, though direct measurement of the impact on each individual is difficult and will be part of further research. Researchers can see disinformation jumping from one group to another, which suggests people believe it and feel strongly enough to reshare it.

An excellent talk - I highly recommend you catch it on YouTube when posted!




