Election Control via Social Influence / Aboueimehrizi, Mohammad. - (2021 May 24).
Election Control via Social Influence
Aboueimehrizi, Mohammad
2021-05-24
Abstract
In the past, the power of news dissemination was in the hands of a few, such as newspaper editors and television channels. Thanks to social networks, this power now belongs to everyone. Social networks became popular as soon as they were launched and have been widely adopted across many societies. They provide an engaging environment in which people can share their moments with relatives, friends, colleagues, and even unseen (so-called virtual) friends who act as their followers. In this virtual world, people can also share their opinions with their followers by broadcasting a message. Information and news diffused among the followers affect them and may slightly change their opinions. When a follower is influenced, she may share, retweet, or forward the message to her own followers and cause further propagation. There is ample evidence of messages shared by only a few people (in some cases a single person) being watched by millions of users and going viral. Hence, social media is an inseparable part of our lives: it offers many opportunities, e.g., teaching, entertainment, and news, and gives us the power to share our experiences. Researchers have shown that many people prefer to get their news from social networks rather than from dedicated news websites, because social networks deliver news from everywhere very quickly.

Therefore, social media is considered one of the most effective tools for manipulating users' opinions, and it is an attractive means of election control for political campaigns, parties, and candidates. As a real example, it has been shown that in the 2016 US presidential election 92% of Americans saw and remembered pro-Trump fake news stories, 23% remembered pro-Clinton false news, and a very high portion of them believed the news. Moreover, campaigns can use social influence to polarize users so that a voter receives only specific messages in support of, or in opposition to, a candidate or party rather than all available messages. These activities impair the integrity of elections and of our democracies, because people should have access to reliable news from different perspectives in order to make a fair judgment. In this thesis, we investigate the computational aspects of this problem and study different manipulation strategies in order to understand how they work. Our goal is to prevent such malicious activities, as they have the potential to cause drastic consequences for any society. We study several aspects of controlling elections through social influence.

First, we consider a multi-winner election control problem in which several parties run in an election and more than one candidate is selected as a winner. There is a social network of voters, and an attacker tries to bribe some users/voters to start a diffusion process and spread a message among them; her goal is to change the voters' opinions regarding a target party. In the constructive model, the attacker tries to maximize the number of winners from the target party, while in the destructive case she wants to minimize it. For this model, we present hardness results, approximation guarantees, and polynomial-time algorithms for different network structures (e.g., general graphs, trees, and arborescences), objective functions, diffusion models (e.g., the linear threshold and independent cascade models), and different configurations of influencing voters.
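To make the diffusion setting concrete, the following is a minimal sketch, under illustrative assumptions, of a single run of the independent cascade model, one of the two diffusion models considered: each newly activated voter gets one independent chance to activate each of her still-inactive neighbours. The graph encoding, function name, and probabilities below are hypothetical and are not the thesis's own implementation.

```python
import random

def independent_cascade(graph, seeds, rng=None):
    """One run of the independent cascade model.

    graph: dict mapping each node u to a list of (v, p_uv) pairs, where
           p_uv is the probability that an active u activates v.
    seeds: the initially activated nodes (e.g., the bribed voters).
    Returns the set of all nodes activated when the diffusion stops.
    """
    rng = rng or random.Random(0)
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        next_frontier = []
        for u in frontier:
            for v, p_uv in graph.get(u, []):
                # a newly active node gets exactly one chance to
                # activate each still-inactive out-neighbour
                if v not in active and rng.random() < p_uv:
                    active.add(v)
                    next_frontier.append(v)
        frontier = next_frontier
    return active

# toy example: a directed network of four voters
toy_graph = {1: [(2, 0.5), (3, 0.2)], 2: [(4, 0.9)], 3: [], 4: []}
print(independent_cascade(toy_graph, seeds={1}))
```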
Second, we investigate a single-winner election control problem in which the attacker does not know the voters' exact preference lists; instead, for each voter she has (or guesses) a probability distribution over all candidates. In this case, we show that the problem is at least as hard to approximate as the Densest-$k$-Subgraph problem, which is hard to approximate within some constant factor under the Exponential Time Hypothesis. We then consider a slightly relaxed version and present hardness results and constant-factor approximation algorithms for several objective functions, in both the constructive and destructive models. We also run experiments on real-world social networks and show that our algorithms work well in practice.

Finally, we present a Stackelberg-game variant of competitive election control with two players, an attacker and a defender. Each player has a budget, and the number of her seed nodes must not exceed that budget. The attacker plays first and selects a set of seed nodes to start a diffusion and change the voters' opinions; she knows that the defender is aware of everything and plays afterward. Once the attacker's diffusion process has finished, the defender selects her own seed nodes to cancel the attacker's influence over the infected voters. The attacker thus tries to maximize the number of infected voters after both diffusion processes, while the defender attempts to minimize it. For simplicity, we first investigate the influence maximization version of this problem and then extend it to election control through social influence for a single-winner election under the plurality scoring rule and the independent cascade model. We show that the attacker's problem is $\Sigma_2^p$-hard when the defender is able to find an optimal strategy, and that the same hardness result holds with respect to any approximation algorithm. Moreover, we show that the defender's problem is NP-hard to approximate within any factor $\alpha \geq 1$. Since these problems are inapproximable, we consider a relaxed version in which the defender selects her seed nodes according to a probability distribution over the nodes, and the attacker is aware of this distribution. In the relaxed model, we give a constant-factor approximation algorithm for the attacker's problem. We also simulate our results and show that the attacker can activate many voters even when the defender is able to find the optimal solution. Moreover, we show that the greedy influence maximization algorithm works very well for the defender.
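As an illustration of the defender's greedy heuristic mentioned above, here is a minimal sketch of greedy influence maximization under the same illustrative assumptions: the expected spread is estimated by repeated Monte Carlo simulation (reusing the `independent_cascade` function and `toy_graph` from the sketch above), and seed nodes are added one at a time by largest estimated gain. The function names and parameters are hypothetical, not the thesis's implementation.

```python
def estimate_spread(graph, seeds, simulations=200):
    """Monte Carlo estimate of the expected number of activated nodes."""
    total = 0
    for i in range(simulations):
        total += len(independent_cascade(graph, seeds, random.Random(i)))
    return total / simulations

def greedy_seeds(graph, budget, simulations=200):
    """Greedily pick up to `budget` seeds, each time adding the node whose
    inclusion yields the largest estimated expected spread."""
    seeds = set()
    base = 0.0
    for _ in range(budget):
        best_node, best_value = None, base
        for v in graph:
            if v in seeds:
                continue
            value = estimate_spread(graph, seeds | {v}, simulations)
            if value > best_value:
                best_node, best_value = v, value
        if best_node is None:
            break  # no candidate improves the estimated spread
        seeds.add(best_node)
        base = best_value
    return seeds

# e.g., a defender with budget 2 on the toy network above
print(greedy_seeds(toy_graph, budget=2))
```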
| File | Description | Type | License | Size | Format |
|---|---|---|---|---|---|
| 2021_PhDThesis_AboueiMehrizi.pdf (Open Access from 01/01/2023) | Main article | Doctoral thesis | Public domain | 3.35 MB | Adobe PDF |