By Lauren Fischer
Traditionally, illicit markets operate in a physical sense, with tangible goods or services moving from place to place. As society evolves, however, so do these markets. Today, social media plays a growing role in the gray market for fake followers, or bots. Bots are completely intangible, yet they are in high demand among anyone who wants to appear more popular on social media. Companies such as Devumi buy these bots wholesale and then sell them by the thousands to politicians, celebrities, athletes, influencers, and more. Although many purchase bots solely to boost their public image, others purchase fake followers and likes to spread election propaganda, promote their campaigns, and circulate “fake news.” This spread of misinformation, and its ability to create divisions in society, makes bots dangerous. Despite this, no U.S. laws currently regulate fake accounts, a gap that could allow this gray market to quickly become a thriving illicit one. Although bots operate on their own, social media users created and maintain this market through their increased dependence on social media and their desire to amplify their followings.
Before examining the effects of this new market, it is crucial to define bots and understand how they work. According to an article published by WIRED, an American magazine, a bot is an automated program designed to perform a specific task (Martineau). This fairly broad definition covers a wide variety of programs, and not all bots spread malicious information. Some, such as the Twitter account @big_ben_clock, are completely harmless and simply automate posts that would be tedious for a human. The @big_ben_clock account tweets “BONG” on the hour, every day (@big_ben_clock). Its accuracy would be hard for a human to replicate; a person might forget to tweet on the hour or make spelling errors. In this case, a bot makes humans’ lives easier and gives some Twitter users a good laugh.
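The mechanics behind a simple scheduled bot like @big_ben_clock are straightforward. The sketch below is illustrative only: `post_tweet` is a hypothetical stand-in for a real Twitter API call, which would require developer credentials.

```python
from datetime import datetime, timedelta
import time

def seconds_until_next_hour(now: datetime) -> float:
    """Return how many seconds remain until the top of the next hour."""
    next_hour = now.replace(minute=0, second=0, microsecond=0) + timedelta(hours=1)
    return (next_hour - now).total_seconds()

def post_tweet(text: str) -> None:
    """Hypothetical placeholder; a real bot would call Twitter's API here."""
    print(text)

def run_bong_bot() -> None:
    """Sleep until each hour mark, then post. Never misses, never misspells."""
    while True:
        time.sleep(seconds_until_next_hour(datetime.now()))
        post_tweet("BONG")
```

A program like this never tires or mistypes, which is exactly why @big_ben_clock’s punctuality would be so hard for a human to match.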
The more malicious bots belong to the amplification family. These bots exist solely to like, retweet, or follow clients who pay to use them (Confessore & Dance). Millions of these fake accounts exist across all social media platforms. The Turkish broadcast service TRT reported that bots make up 15 percent of all Twitter users (TRT World), a presence large enough to make them a common sight on users’ feeds. In December 2017, Twitter identified an average of 6.4 million suspicious accounts each week (Confessore & Dance). Although Twitter can identify suspicious accounts, it struggles to find the actual user behind them. When hunting for bots, Twitter looks for “bot-like activity,” which often consists of retweeting hundreds of times per day, spamming the same link, and using multiple accounts to amplify the same message (Martineau). Accounts caught doing this get suspended from the platform. While Twitter does this with a high degree of accuracy, it has made mistakes in the past, including suspending the accounts of real users.
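The “bot-like activity” signals described above can be read as a handful of simple checks. As a minimal sketch, assuming illustrative thresholds (the numbers below are my own, not Twitter’s actual criteria), a crude detector might look like:

```python
def looks_like_bot(retweets_per_day: int,
                   distinct_links_shared: int,
                   total_link_posts: int,
                   accounts_posting_same_text: int) -> bool:
    """Heuristic flag based on the bot-like signals described above.

    All thresholds are illustrative assumptions, not Twitter's real rules.
    """
    heavy_retweeting = retweets_per_day >= 100  # retweeting hundreds of times a day
    link_spamming = total_link_posts >= 50 and distinct_links_shared <= 2  # same link over and over
    coordinated = accounts_posting_same_text >= 10  # many accounts amplifying one message
    return heavy_retweeting or link_spamming or coordinated
```

Real detection is far harder than three thresholds, of course, which is part of why Twitter sometimes suspends genuine users by mistake.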
The gray area between fake bot accounts and real Twitter users only gets more complex when bots steal a real person’s profile. In the New York Times article “The Follower Factory,” the Times talked to people whose profiles were stolen and used by Devumi bots. The amplification bots used the same, or similar, profile picture, bio, and Twitter handle, so the two accounts looked practically identical. People then purchased these bot accounts from Devumi or other bot companies. Devumi promises its customers fake followers that look like the real thing, and these accounts often look real precisely because the bots impersonate actual Twitter users. Devumi also claims to “only use promotion techniques that are Twitter approved so your account is never at risk of getting suspended or penalized” (Devumi). Twitter prohibits users from buying followers, but that did not stop people from using Devumi to gain social media fame. According to “The Follower Factory,” Devumi sold about 200 million Twitter followers to at least 39,000 customers, accounting for a third of its more than $6 million in sales during that period (Confessore & Dance). One Twitter user’s stolen identity was sold to 2,000 customers for a mere two cents per customer; his social identity brought Devumi around $30 (Confessore & Dance). Devumi’s profit quickly adds up, but it comes at a cost for the impersonated users. Many of the bot accounts retweet things the real person never would, ranging from pornography to posts in languages the real users do not speak. Impersonated users said they worried about business professionals or colleagues finding the account and linking it to them. The New York attorney general recently found that Devumi’s sale of fake accounts and online engagement violated New York laws against fraud and false advertising (Confessore). She also found that Devumi violated a statute that prohibits impersonating someone at their expense.
The ruling shut down Devumi, but its influence on social media platforms continues to show how this new market functions.
Social media’s obsession with high levels of interaction, whether many followers, retweets, or likes, helps the bot market thrive. Follower count has been part of social media since the beginning. It signals popularity, how well-liked someone is, and sometimes the relevance of a user’s content. People tend to trust accounts with a high follower count, as it suggests some sort of expertise. “You see a higher follower count, or a higher retweet count, and you assume this person is important, or this tweet was well received,” said Rand Fishkin, the founder of Moz, a company that makes search engine optimization software (Confessore & Dance). Buying bots inflates a user’s follower count, making the user seem more trustworthy or credible, and that apparent popularity then helps attract real followers.
Because of this mindset, some users purchase bots to boost their career prospects. New York Times reporters talked to Marcus Holmlund, a young writer who landed a job at Wilhelmina, an international modeling agency. Holmlund managed Wilhelmina’s social media efforts, but when the agency’s Twitter following did not grow quickly enough, a supervisor told him to buy followers or find another job (Confessore & Dance). He turned to Devumi and paid for followers with his own money. “I felt stuck with the threat of being fired, or worse, never working in fashion again… Since then, I tell anyone and everyone who ever asks that it’s a total scam — it won’t boost their engagement,” Holmlund said (Confessore & Dance). The bot market is built around making it easy to gain followers, and it has changed companies’ social media expectations accordingly. It puts added pressure on social media managers, who ultimately need more followers to boost their company’s engagement and presence. Engagement consists of mutual interactions between a user and their followers. Because building engagement and gaining followers organically takes time, many turn to the quick and easy solution of buying followers, even though doing so violates Twitter’s policy.
Social media users today, however, care less about follower count than they used to. More people see the follower count as toxic to mental health, since users compare their counts to others’, creating more competition. Instagram, which heavily emphasizes likes and follower count, was rated the worst social media network for mental health by nearly 1,500 teens and young adults in the #StatusofMind survey, published by the United Kingdom’s Royal Society for Public Health. Participants said Instagram gave users increased levels of anxiety, FOMO, and depression (MacMillan). This attitude has made its way into professional settings as well. Social media analytics agency Klear found that follower count is not an effective way for marketers to select Instagram influencers, as a high follower count does not always translate to high engagement (Sjoberg). Still, many aspiring influencers use bots to gain followers: in July 2018, Instagram hosted as many as 95 million bot accounts dedicated to inflating the follower counts of users hoping to become influencers (Sjoberg). While bots can boost a user’s numbers, they do little for engagement. “Fake followers and engagement is a problem in the influencer space, and we wanted to highlight that though an influencer may have a high following count, it doesn’t always mean they are real or relevant to the brand’s goals” (Sjoberg). In fact, the Klear study found that accounts with fewer followers occasionally had more impact and engagement than larger accounts (Sjoberg). These changing attitudes toward follower counts and likes may unsettle the bot market, as influencers feel less inclined to purchase fake followers.
This market also benefits from people’s willingness to interact with controversial fake news stories or propaganda. “Fake news” stories appear accurate but are deliberately false (Nagler). Bots may initiate the spread of a fake news story, but real users make it popular. According to an article published by Science News, “Researchers found that in the first few seconds after a viral story appeared on Twitter, at least half of the accounts sharing that article were likely bots; once a story had been around for at least 10 seconds, most accounts spreading it were maintained by real people” (Temming). Bots merely strike the match, encouraging real users to spread the stories. Bots can make a political figure’s account appear more credible with a higher follower count, and they can be programmed to retweet fake news stories, inflating retweet counts and lending the stories false credibility. They give low-credibility stories or propaganda the momentum needed to go viral (Temming). The fake news epidemic on social media has fueled the bot market so much that there are now bots designed specifically to spread false stories. Fake news spread by bots also appears to surge during high-stakes political events and controversies. During the 2018 midterm elections, Twitter bots were far more active than previously known, according to an article published by CNBC: “Between Oct. 6 and Nov. 19, nearly two weeks after Election Day, the researchers… identified more than 200,000 bots posting about the midterm elections, compared to about 750,000 humans” (Higgins). With that many bots posting about the election, thousands of people were quickly affected by potentially misleading information. The increased spread of fake news makes bots more common on social media, as they can disseminate false information quickly and efficiently, especially when a paying customer is behind them.
Some of Devumi’s 39,000 customers were political figures or organizations, who used Devumi to promote themselves and, occasionally, their campaigns. One such organization is China’s state-run news agency, Xinhua News, which purchased hundreds of thousands of followers and retweets from Devumi to promote its political messaging, some of which wound up on the Twitter feeds of Americans (Confessore & Dance). Much of what users see on Twitter is driven by trends and by tweets from accounts with large followings. By buying followers and retweets, political actors gain more power to shape what users see on their feeds, and therefore to shape their opinions. In August 2019, more Americans began seeing promoted tweets from Xinhua News describing the pro-democracy demonstrations in Hong Kong. The tweets painted the August demonstrations as violent, one reading: “All walks of life in Hong Kong called for a break to be put on the blatant violence and for order to be restored” (Shu). The story may seem credible at first because it appears in a news-article format. However, international media described the protests as mostly peaceful at the time the misleading articles were shared (Shu), and footage shot by Amnesty International documented excessive police force toward peaceful protesters (Shu). While some people may already know of these peaceful protests, others may not know what to make of them after seeing Xinhua News’ promoted tweets.
Uninformed people then begin to form opinions around incorrect and untrustworthy information. Although bots make a large difference in the tweets users see on their feeds, the growth of this market ultimately comes back to people’s willingness to believe in and engage with misinformation. As stated in the article “How Fiction Becomes Fact on Social Media,” “skepticism of online ‘news’ serves as a decent filter much of the time, but our innate biases allow it to be bypassed, researchers have found” (Carey). A person’s bias is often so strong that it leads them to accept fake news and lies, deepening their personal echo chamber. Confirmation bias, in particular, leads people to believe things that confirm their existing beliefs and to ignore facts or articles that contradict them, regardless of accuracy. Bots only compound this problem, as they often spread fake news stories that are particularly political or partisan.
The bot market thrives largely because of people’s increased social media consumption. In 2019, the Pew Research Center found that 54 percent of U.S. adults “sometimes” or “often” get their news from social media (Shearer & Grieco), a 10-percentage-point increase over the previous three years. As more Americans lean on social media as a news source, they become almost dependent on these platforms for information. With the increase of fake news circulating via bots, however, this dependency can become dangerous. Fake news stories can sometimes lead to real acts or threats of violence. One example is the Pizzagate incident, in which a man read a fake news story claiming Hillary Clinton ran a sex trafficking ring out of a pizza parlor in Washington, D.C., and decided to investigate it himself while armed with several weapons (Nagler). One shot was fired, and no one was hurt, but the episode shows the potential for violence over a fake story amplified by bots. By learning to spot bots and fake news stories, people become less susceptible to false accounts and stories. Nagler’s article suggests using fact-checking websites, such as Politifact.com, to verify news articles. It also helps to keep in mind the common characteristics of bots, such as retweeting seemingly random content hundreds of times per day or having complex, lengthy Twitter handles.
These skills help users push back against bots and the fake news they spread. However, bots continue to exist on social media platforms because users keep creating demand for fake followers, and because Twitter has been unable to rid its platform of bots; big steps must be taken to fight the bot market. On the federal level, the Honest Ads Act, proposed in October 2017, would expand source-disclosure requirements for political advertisements (Funke & Flamini). Some states are also proposing that media literacy be a required component of public schooling (Funke & Flamini). These proposals are steps in the right direction against this new and developing market. Ultimately, though, this market falls back on social media users, who decide whether to engage with bot accounts. Although this market differs from traditional, tangible black markets, it has the potential to be just as dangerous. With more bots circulating on social media every day, users could eventually lose the ability to discern real from fake.
@big_ben_clock. “BONG.” Twitter, 29 July 2019, 8:00 a.m., https://twitter.com/big_ben_clock/status/1155810210566627330.
Carey, Benedict. “How Fiction Becomes Fact on Social Media.” The New York Times, The New York Times, 20 Oct. 2017, www.nytimes.com/2017/10/20/health/social-media-fake-news.html.
Confessore, Nicholas. “Firm That Sold Social Media Bots Settles With New York Attorney General.” The New York Times, The New York Times, 31 Jan. 2019, www.nytimes.com/2019/01/30/technology/letitia-james-social-media-bots.html.
Confessore, Nicholas, and Gabriel J.X. Dance. “The Follower Factory.” The New York Times, The New York Times, 27 Jan. 2018, www.nytimes.com/interactive/2018/01/27/technology/social-media-bots.html.
Devumi. “Accelerate Your Social Media Growth.” Devumi, 2016, devumi.top/.
Funke, Daniel, and Daniela Flamini. “A Guide to Anti-Misinformation Actions around the World.” Poynter, Poynter Institute, 2018, www.poynter.org/ifcn/anti-misinformation-actions/#us.
Higgins, Tucker. “Twitter Bots Were More Active than Previously Known during the 2018 Midterms, a New Study Suggests.” CNBC, CNBC, 5 Feb. 2019, www.cnbc.com/2019/02/04/twitter-bots-were-more-active-than-previously-known-during-2018-midterms-study.html.
MacMillan, Amanda. “Why Instagram Is the Worst Social Media for Mental Health.” Time, Time, 25 May 2017, time.com/4793331/instagram-social-media-mental-health/.
Martineau, Paris. “Do You Know What a Bot Actually Is?” Wired, Conde Nast, 19 Nov. 2018, www.wired.com/story/the-know-it-alls-what-is-a-bot/.
Nagler, Christina. “4 Tips for Spotting a Fake News Story.” Harvard Summer School, 1 Nov. 2018, www.summer.harvard.edu/inside-summer/4-tips-spotting-fake-news-story.
Shearer, Elisa, and Elizabeth Grieco. “Americans Are Wary of the Role Social Media Sites Play in Delivering the News.” Pew Research Center’s Journalism Project, PEW Research Center for the People and the Press, 2 Oct. 2019, www.journalism.org/2019/10/02/americans-are-wary-of-the-role-social-media-sites-play-in-delivering-the-news/.
Shu, Catherine. “Twitter Is Blocked in China, but Its State News Agency Is Buying Promoted Tweets to Portray Hong Kong Protestors as Violent.” TechCrunch, TechCrunch, 19 Aug. 2019, techcrunch.com/2019/08/19/twitter-is-blocked-in-china-but-its-state-news-agency-is-buying-promoted-tweets-to-portray-hong-kong-protestors-as-violent/.
Sjoberg, Brooke. “Should Instagram Get Rid of the Follower Count?” The Daily Dot, The Daily Dot, 24 Sept. 2019, www.dailydot.com/debug/instagram-get-rid-likes-follower-count-influencers/.
Temming, Maria. “How Twitter Bots Get People to Spread Fake News.” Science News, 8 Aug. 2019, www.sciencenews.org/article/twitter-bots-fake-news-2016-election.
TRT World. Social Media’s Black Market. YouTube, YouTube, 2 Feb. 2018, www.youtube.com/watch?v=T9w0RdVm1o4.