or traditionally structured media outlets report on it. In many cases, when one such organization features it, other media outlets will refer to it as well. This can accelerate the rate at which the disinformation piece is shared.27 Sharing itself helps validate the importance and worthiness of a post or video according to the traditional practices of social media platforms in the early twenty-first century.

Unpacking this process, it becomes clear that while the literal cost of engaging in disinformation is low, patience is important, and further investment in related infrastructure can be extremely valuable. For example, setting up entities such as media outlets that resemble traditional journalistic services can help project disinformation. Experience indicates that these outlets can broadcast a combination of factual news, messaging that carries a favorable editorial bent or a biased selection and interpretation of evidence, and outright falsified disinformation. The degree to which the outlet's disinformation is mixed in among factual pieces may help it gain viewership and a semblance of credibility. That, in turn, can increase both the number of people who will watch it (and see the disinformation) and the likelihood that other media outlets will unwittingly parrot the disinformation under the impression that they are reporting on a story that the first (conspiratorial) outlet has "discovered."

Another tool for increasing the footprint of disinformation is to establish artificial social media accounts that exist to repeat it. This rebroadcasting helps deceive social media platforms into concluding that a particular story (in this case, the disinformation) is authentically gaining traction and that it therefore deserves to be brought to the attention of still further platform users.
As social media platforms gradually become alert to this method of propagandizing (and as they are persuaded to counteract it), the entity concocting the disinformation will of course respond. That response can include developing new ways to camouflage the artificial bot accounts more effectively, commandeering or renting legitimate accounts to fulfill the bot functions, or simply establishing ever more accounts so that the platform's monitors must play whack-a-mole to find and close the continually sprouting fakes. All of these methods have already been demonstrated. War, and competitive dynamics broadly, sees adaptation and evolution: as long as illicit activities such as disinformation are undertaken, new methods for curtailing them will be met by changes in how the manipulations are conducted.

Traffic volume pays dividends. Researchers have noted that experiments suggest that "all other things being equal, messages received in greater volume and from more sources will be more persuasive. Quantity does indeed have a quality all its own."28 The value of producing an effect by which the same message appears to come from different sources (implying its validity) goes far toward explaining why the disinformation