Academy of Marketing Studies Journal (Print ISSN: 1095-6298; Online ISSN: 1528-2678)

Research Article: 2020 Vol: 24 Issue: 4

Propaganda as Communication Strategy: Historic and Contemporary Perspective

Mohit Malhan, FPM Scholar, Indian Institute of Management, Lucknow

Dr. Prem Prakash Dewani, Associate Professor, Indian Institute of Management, Lucknow

Abstract

In a world entrapped in its own homes during the Covid-19 crisis, digital communication has taken centre stage in most people’s lives. Where we faced a barrage of fake news even before the pandemic, the digitally entrenched pandemic world has deeply exacerbated the problem. We chose this topic because it is new and challenging: in today’s context, individuals are bound to face propaganda designed by firms as a communication strategy. The study is exploratory in nature and is based on secondary data from published sources. Through a comprehensive literature review, we study a particular type of communication strategy, propaganda, which employs questionable techniques. We trace the history and use of propaganda and how research on it developed from its nascent stages and intersected with various communication theories. We then examine its contemporary usages and the tools it employs, since it is pertinent to study the impact of propaganda on the individual and society. We explain how individuals, firms and society can use propaganda to build a communication strategy. Further, we theorise and elaborate on the need for further research on this widely prevalent form of communication.

Keywords

Propaganda, Internet, Communication, Persuasion, Politics, Social Network.

Introduction

Propaganda has been in operation in the world for a long time now. A quick glance at history reveals several instances of propaganda being used. The speeches of Demosthenes against the Macedonians were laden with propagandist techniques. The early Christian missionaries in particular, and almost every other religion in general, have relied heavily on propaganda techniques to spread “the word of God”. Pope Gregory XV established the Sacred Congregation for the Propagation of the Faith in 1622 to further the Church’s cause. However, the large-scale application of these techniques, and that too to devastating effect, came during the World Wars, where the “paper bullets” were deemed to be as devastating as the lead bullets.

Propaganda can be thought of as a form of communication with the sole purpose of fulfilling the propagandist’s needs. It tries to generate a response in the audience, befitting the propagandist’s own agenda. Propaganda is a powerful tool that can mould public opinion and affect behavioural change (Lasswell, 1927). Some scholars view propaganda as the intrinsic thought and practice in societal culture. A few recent studies have focused on the role of propaganda as the carrier of ideology, and how it shapes the dominant ideological meanings in mass media (Burnett, 1989).

Alfred McClung Lee states that,

“Through graphic symbols, music, pageantry, and combinations of words the propagandist makes impressions upon masses of people. These impressions are sometimes vivid. They are frequently charged with emotion. They may be wholly or partially "true," confusing, or "false."” (Lee, 1945).

It is therefore important to note that propaganda is not just about transmitting ideas and opinions to the masses to affect their opinions and actions. It is the loaded style of communication, with its use of omnibus symbols and charged words to stir emotions in the masses, that separates propaganda from other forms of communication. The idea is that at the time of decision or action, people often use cognitive shortcuts instead of a more rational approach, and hence are susceptible to manipulation by the propagandist. Modern mass communication, by both political parties and corporates, relies heavily on these techniques. In our research, we try to study propaganda, its use in history and the contemporary world, and the various scientific fields which have contributed to the development of the literature on propaganda.

Propaganda has its roots in Latin, being the gerundive form of ‘propagare’, which means to spread. Thus, propaganda means to disseminate, spread or promote particular ideas. The Vatican established the Sacra Congregatio de Propaganda Fide in 1622 to propagate the faith of the Roman Catholic Church. The aim of this Sacred Congregation was to spread the faith to the world, and hence propaganda lost the neutrality in its meaning. Its usage during the two World Wars in the first half of the 20th century further inflated the pejorative sense of the word. In today’s world, labelling anything as propaganda is akin to suggesting that it is dishonest or deceitful.

Before 1980

The research on propaganda was driven by governments’ need to understand mass persuasion, and was triggered by the First World War. Researchers such as Lasswell (1927) and Creel (1920) were of the view that propaganda could possibly sway public opinion to any point of view. Based on the stimulus-response theory, Lasswell assumed that human responses to media were uniform and immediate. As propaganda started gaining more attention, an effort went underway by the President’s Research Committee (USA) to bring knowledge from different fields together. In 1931, it categorised the fields of propaganda, public opinion, marketing and social psychology as “agencies of mass impression” (Czitrom, 1982). Doob (1948) defined propaganda as

“The attempt to affect the personalities and to control the behaviour of individuals towards ends, considered unscientific or of doubtful value in a society at a particular time”.

However, he later stated in a 1989 essay that “a clear-cut definition of propaganda is neither possible nor desirable”.

In contrast to the purpose theory, Ellul (1965) considered propaganda a sociological phenomenon and not simply something being done by someone to a particular end. He was of the belief that almost all messages in our society are propagandistic to some extent, due to the conscious and subconscious biases of people. Although Ellul contended that propaganda distorts historical recollection and impedes critical reflection, he was of the opinion that the world needs propaganda since we live in a large society. Propaganda helps to bring the population together for important events like elections, celebrations, and memorials. On one hand, propaganda could be used to incite the masses to certain ends; on the other, it could be used to pacify them into a non-challenging slumber (Szanto, 1977).

As World War II drew to its conclusion, researchers studying propaganda stopped addressing their subject as propaganda and began understanding the constructs of persuasion, and behavioural and attitudinal change. The growth of social scientific study and development of subjects like communication and social psychology drove the research on mass persuasion.

1980-90

Burnett (1989) was of the view that propaganda acts as the carrier of ideology and can shape dominant ideological meanings in mass media. The study states that propaganda can be thought of as a form of communication with the sole purpose of fulfilling the propagandist’s needs. It tries to generate a response in the audience befitting the propagandist’s own agenda. Propaganda is considered a powerful tool that can mould public opinion and affect behavioural change. Although some scholars view propaganda as the intrinsic thought and practice in societal culture, others view it as “organised persuasion” and have characterised it as unethical and harmful (DeVito, 1986). Hardt (1989) and Lang (1989) rejected Lasswell’s theory that human responses to media and propaganda were uniform and immediate, and posited that propaganda is a complex sociological phenomenon.

It is also interesting to note the role propaganda plays in educational practices. For instance, Aronson (1980) questioned whether teaching students arithmetic at school through questions that primarily deal with capitalist ideas legitimises those ideas as the right behaviour in society.

1990-2000

Sproule (1994) identified propaganda as orchestrated public persuasion:

“Propaganda represents the work of large organizations or groups to win over the public for special interests through a massive orchestration of attractive conclusions packaged to conceal both their persuasive purpose and lack of sound supporting reasons”.

Rogers (1994) stated that

“Private foundations and the federal government were more eager to support research that was useful to policymakers but did not raise troubling questions about the interests and motives of the persuaders”.

However, Simpson (1994) was of the view that:

“Sponsorship can, however, underwrite the articulation, elaboration, and development of a favoured set of preconceptions, and in that way improve its competitive position in ongoing rivalries with alternative constructions of academic reality”.

A popularly used word for propaganda in the domain of manipulating political information is ‘Spin’, and the public relations officers attempting to manipulate the news are referred to as “spin doctors” (Kurtz, 1998).

Propaganda is not just limited to politics or societal exchanges. In the corporate world it is defined as,

“Communications where the form and content is selected with the single-minded purpose of bringing some target audience to adopt attitudes and beliefs chosen in advance by the sponsors of the communications” (Carey, 1997).

Noam Chomsky, in his introduction to Carey’s collection of essays, said that Carey believed that

“The twentieth century has been characterized by three developments of great political importance: the growth of democracy, the growth of corporate power, and the growth of corporate propaganda as a means of protecting corporate power against democracy”.

Carey said that

“Commercial advertising and public relations are the forms of propaganda activity common to a democracy. . . It is arguable that the success of business propaganda in persuading us, for so long, that we are free from propaganda, is one of the most significant propaganda achievements of the twentieth century”.

Hitler’s propaganda minister, Joseph Goebbels, was of the opinion that ‘extreme and outlandish’ lies would prove more believable to the masses than simply bending the truth (Bogart & Bogart, 1995). O’Shaughnessy (1996) describes the characteristics of propagandist communication as biased and ideological. The research claims that propagandists use tactics like simplification, exaggeration, and high-pressure advocacy to further their agenda.

2000-10

Parry-Giles (2002) defined propaganda as

“Conceived of as strategically devised messages that are disseminated to masses of people by an institution for the purpose of generating action benefiting its source”.

In essence, propaganda aims to change the attitudes and behaviours of the masses, and could potentially act as a tool to spread an ideology (Collison, 2003). Messina (2007) was of the view that the aim of propaganda is to control information flow and deceive recipients by spreading untruthful information. A study involving four authors of management textbooks found that managerial theory

“Would seem to serve the interest of other groups who are also currently most powerful in management education” (Cameron et al., 2003).

In essence, we need to evaluate education practices in terms of their end results to identify use of propaganda in such practices. Researchers in the past have claimed propaganda and PR to be the same (Moloney, 2004), while some have claimed it to be a part of PR’s toolbox (Messina, 2007). Hiebert (2003) argues that the goal of mutual understanding between organisations and audiences distinguishes PR from propaganda. The ethical concerns regarding communication have been highlighted by some authors (Weaver et al., 2006), with the focus being on content, ends, and transparency. Some social scientists contend that ethical persuasive communication would allow the receivers to make ‘voluntary, informed, rational and reflective judgements’ (Messina, 2007). Thus, the characteristics of persuasion differ from propaganda in being truthful, respectful, ethical and authentic.

2010-2020

Jowett and O’Donnell hold similar views on propaganda and persuasion. They state that an informative communicator differs from other kinds of communicators in having the purpose of creating mutual understanding of data that are considered accurate, concepts that are considered indisputable, and ideas that are based on facts (Jowett & O’Donnell, 2018). A propagandist, on the other hand, builds on the audience’s existing beliefs and uses them as anchors to alter or form new beliefs.

“The stronger the belief of a receiver, the more likely it is to influence the formation of a new belief.” (Jowett & O’Donnell, 2018).

Jowett & O’Donnell seek to

“Understand and analyse propaganda by identifying its characteristics and to place it within communication studies to examine the qualities of context, sender, intent, message, channel, audience, and response.” (Jowett & O’Donnell, 2018).

They define Propaganda as the

“Deliberate, systematic attempt to shape perceptions, manipulate cognitions, and direct behaviour to achieve a response that furthers the desired intent of the propagandist.” They view propaganda as a subcategory of persuasive communication (Jowett & O’Donnell, 2018).

Persuasion has been defined as

“A complex, continuing, interactive process in which a sender and a receiver are linked by symbols, verbal and nonverbal, through which the persuader attempts to influence the persuadee to adopt a change in a given attitude or behaviour because the persuadee has had perceptions enlarged or changed” (O’Donnell & Kable, 1982).

“Both persuader and persuadee stand to have their needs fulfilled, persuasion is regarded as more mutually satisfying than propaganda.” (Jowett & O’Donnell, 2018).

Thus, both persuader and persuadee would benefit from persuasion. Persuasion is based on the normative demands of accountability, transparency, and participation (Lock et al., 2016). Taylor & Kent (2014) contend that a persuasive attempt built on constricting freedom and instilling obedience is intrinsic to propaganda, with the sole intent of changing attitudes and behaviours. Another study states that propaganda lies at the unethical end of a spectrum from ethical to unethical, and on the persuasive end of an axis from persuasion to understanding (Lock & Ludolph, 2020).

Thus, the underlying difference between PR and propaganda lies in the intent of the communicator. It is possible for a communicator to perform persuasion ethically (e.g. brand communication) or unethically (e.g. propaganda). Some researchers (Cornelissen & Werner, 2014) state that propaganda is, in essence, one of the tools of a PR manager; it is a specialist form of ‘unethical persuasive communication’ (Jowett & O’Donnell, 2018). Researchers point out that trust in organisations is at an ‘all-time low’ (Auger, 2013) and there is scepticism among audiences regarding organisations’ communications (Chang & Lin, 2014). Thus, it is imperative to differentiate between PR and propaganda.

Propaganda Types

Given that propaganda carries an implicit negative connotation, it is important to differentiate between its different types. This would also help us distinguish between its positive and negative sides. On one hand, propaganda could be used to incite the masses to certain ends; on the other, it could be used to pacify them into a non-challenging slumber (Szanto, 1977). Extant literature defines the following three types of propaganda.

White Propaganda

This is the type of propaganda where the origin of the information is known and the content is considered truthful (Guth, 2009). This, by definition, would encompass most of the advertising done by corporates or governments. An example of white propaganda could be government communication to deter drivers from drinking and driving. Similarly, corporates promoting their products with substantiated claims can be seen as white propaganda. However, it is important to note that both checks need to be cleared: even if the source is known, if the information itself is false, it cannot be termed white propaganda. For instance, if the government publishes data about the performance of a welfare scheme, and the data itself is fraudulent and/or unreliable, this would not be considered white propaganda despite the source being known.

Black Propaganda

In this type of propaganda, the origin of the source is unknown and the information being transmitted is false (Guth, 2009). This type of propaganda is fairly common in war efforts and political marketing, where the truth may not see any daylight. For example, the US-led Iraq War presents multiple instances where the US propaganda machinery ensured that enough false information was fed to the US populace to keep public opinion in favour of the war. At the start of the war, fraudulent documents alleged the Iraqi regime to be in possession of weapons of mass destruction, both biological and nuclear; however, no such claims were corroborated even after the US invasion. Investigative reporter Seymour Hersh writes,

“One member of the U.N. inspection team, who supported the American and British position, arranged for dozens of unverified and unverifiable intelligence reports and tips (data known as inactionable intelligence) to be funnelled to MI6 operatives and quietly passed along to newspapers in London and elsewhere” (Hersh, 2003).

Disinformation is another term used in connection with propaganda. Since it relies on covert sources and incorrect information, it is considered black propaganda. Disinformation means

“False, incomplete, or misleading information that is passed, fed, or confirmed to a targeted individual, group, or country” (Shultz et al., 1984).

Hitler’s propaganda minister, Joseph Goebbels, was of the opinion that ‘extreme and outlandish’ lies would prove more believable to the masses than simply bending the truth (Bogart, 1995). Hence, it can be observed over time that propagandists tend to spread messages that are highly polarising and divisive. Black propaganda is even used by allies on friendly nations: British intelligence tried to manipulate the United States into going to war in the two years before Pearl Harbor was attacked by the Japanese. The extent to which black propaganda works depends on the audience’s acceptance of the source’s credibility and the message content. A propagandist’s efforts might fail if the message and/or source fall outside the accepted socio-cultural and political frameworks of the audience. Here, it is also interesting to note that the failure of black propaganda usually does not have drastic negative effects for the cause, i.e., the public soon forgets that they were being deceived and does not hold long-term grudges about having been manipulated.

Grey Propaganda

This is the type of propaganda that propagandists and public relations experts absolutely embrace. Here, the source of the information is suspect and the information’s truthfulness is also doubtful. This provides two advantages to the propagandist: first, it is really difficult to identify this kind of propaganda as it is generally weaved in with some amount of truth; and second, the perpetrators have full deniability as the source is suspect, and hence they can get away with using such techniques over and over again.

Thus, to no surprise, grey propaganda can be seen everywhere around us. Corporates and governments the world over subscribe to this kind of propaganda. There are corporates that ‘misrepresent’ data in their reports, FMCG companies that make outlandish claims about their products in their commercials, movies that are produced just to promote products, and televangelists who hoard personal wealth in the name of religion. All of these could be characterised as grey propaganda. Table 1 summarises the major propaganda techniques identified in the literature.

Table 1: Propaganda Techniques (basis of typology, technique with definition and example, and reference)

Basis of typology: Applied Procedure

Selecting the Issue: Refers to selecting the issues in the social context of the group, which bears heavily on the ultimate victory or defeat of the propagandist.
Example: At the beginning of the Iraq War, the US put forward “self-defence” (against weapons) as the narrative, and when this was proven false, the narrative was changed in the middle of the war.
Reference: (Lee, 1945)

Case-Making/Card-Stacking: A case is made (via evidence, arguments and illustrations) in a manner to have the highest impact in the propagandist’s favour. It makes the opposition's cause appear dastardly, uncivilized, money-grubbing, unprincipled, or at least unnecessary. Card-stacking is case-making used in a deliberately unfair manner, involving the selection and use of facts or falsehoods, illustrations or distractions, and logical or illogical statements to give the best or the worst possible case for an idea, program, person, or product. The propagandist stacks the cards against the truth.
Example: Parallel news reports in Germany and other nations by the controlled press during WWII.
Reference: (Jowett et al., 2012; Yourman, 1939)

Simplification: This technique reduces the propaganda material to easily understandable small portions rooted in dogmatism, leaving little or no room for logical dialogue.
Example: A bank-sponsored commercial stating “…all your troubles will be over when you take out a loan with us”.
Reference: (Conserva, 2003; Lee, 1945)

Basis of typology: Use of Omnibus Words

Name-Calling: Name-calling attaches a negative label to an idea, thus diverting attention from issues and derailing discussions.
Example: The Nazi propaganda machinery calling the Jews rats was one way of dehumanising them and diverting attention away from the atrocities done to them.
Reference: (Yourman, 1939)

Glittering Generality: Opposite to name-calling, glittering generality associates an idea with a “virtue word” (like freedom, security, tradition, prosperity, etc.) to make the populace accept and approve it without much evidence.
Example: The US’s military aggression in the Middle East and Vietnam has been bred and promoted on nationalistic calls. Terrorist jihad is promoted on religious calls.
Reference: (Stevens, 2012)

Basis of typology: Identification

Transfer: This technique is used to gain identity with the target group, to induce positive or negative feelings, and lends prestige, sanction, or authority to the program. Transfer helps people identify more readily with the program and shifts the loyalties of groups in favour of the program. The technique can be used to make an idea more acceptable or rejection-worthy.
Example: Comparing the September 11, 2001 attacks to the Iraq War.
Reference: (Conway et al., 2007; Fleming, 1995)

Testimonial: Similar to transfer, but testimonials use a respected person to endorse the program. It can also be used to induce positive or negative feelings.
Example: Political strategist and former Clinton adviser Dick Morris calling Bush’s inaugural speech “brilliant”.
Reference: (Collins, 2017; Conway et al., 2007)

Plain Folks: This technique helps the propagandist convince people that both he and his ideas are “of the people” and hence should be deemed ‘good’. The propagandist tries to appeal to values that common people hold dear, like family and patriotism.
Example: Phrases like “as we know”, “we Americans”, “your humble correspondent”.
Reference: (Conway et al., 2007; Lee & Lee, 1995)

Band-Wagon: Here, people are encouraged to follow the ‘crowd’ of people who have already accepted the program. The propagandist tries to engage cognitive shortcuts of decision making, having people accept an idea without weighing the evidence by having them identify with members already in the program.
Example: “It’s what the pioneers did… Millions of English and American parents have done it before you… Teach your child yourself how to read…”
Reference: (Lamkin, 1955; Pierce, 1940; Tilley, 2004)

Basis of typology: Strategy

Hot Potato: The propagandist tries to discredit his opponent by entrapping him/her in situations which would be viewed by most people in a negative light. The event or situation need not necessarily be untrue; rather, it is the use of extraneous events with the right timing and skill that determines the effectiveness of this technique. It blames an individual or group for something that was beyond their control and forces them to answer for it in an attempt to embarrass them.
Example: “Have you stopped beating your wife?”
Reference: (Cooper, 1971; Curnalia, 2005)

Stalling: Stalling is a delaying technique to make the opposition ‘lose steam’. It includes the formation of committees, adherence to ‘proper procedure’ (red tape), memo passing, etc.
Example: “I’m in favour of your objectives but I want to investigate to make certain your methods are the best by which to achieve them.”
Reference: (Cooper, 1971; Curnalia, 2005)

Results

Propaganda and Technology

As a form of communication, then, propaganda is influenced by the technological tools available at any given time. Hence, with the advancement of communication technology, from print and radio to satellite TV and high-speed mobile internet, the reach and speed of propagandists’ messages have increased manifold (Woolley & Howard, 2016).

Al Qaeda, the broad-based militant Islamist organisation, uses the internet to reach its followers in 68 countries (Jowett & O’Donnell, 2018). Similarly, propagandist messages from ISIS have triggered recent violent acts in the world even without their direct involvement (Jowett & O’Donnell, 2018). Technology, especially the internet in its early stages, proved to be a boon for dissenters and activists across the world. The internet disrupted state control over information flow (Stelter & Stone, 2009). Dissenters have used technology to garner support in Ukraine (2004), Moldova (2009) and even during the Arab Spring. However, the democratisation of information was a short-lived phenomenon. Myanmar turned off the country’s internet service to curb dissenters (2009). China has controlled the flow of information available to its public on the internet for years now. It is no secret that Google had to pull its services from China because of the latter’s censorship and information control. The Tiananmen Square protests (1989), which rocked the globe, cannot even be discussed over the internet there, with no records of them found in China’s public domain.

However, governments have moved beyond merely curbing dissent and controlling the flow of information, and have started adopting these new technologies to spread propaganda.

Web 2.0

Web 2.0 describes the phase of internet development in which the web stopped being a set of simple pages to view information and became something users can interact with. It is also called the Participative or Social Web. The focus here is on user-generated content, ease of use even for non-experts, and interoperability with other online resources (O’Reilly, 2009). The term does not refer to an update in technical specifications, but rather to how webpages are designed and used. In the earlier versions of the web, people were simply passive viewers of the content provided by companies. With Web 2.0, however, people can interact with each other, collaborate and even personalise web pages as per their needs.

Social media sites like Facebook and Twitter, content sharing sites like YouTube and 9gag, communication services like Skype, and other web applications (called Apps) are all part of Web 2.0.

From a propagandist’s point of view, Web 2.0 provides immense potential to spread their message. Facebook has billions of users on its platform, who share content with each other on a daily basis. The content created on Facebook ranges from people’s personal updates and pictures to their own opinions and news. In addition, people are also members of groups whose members share common interests. The algorithms used by Facebook and other popular social media sites are based on keeping users engaged all the time (Santini et al., 2018). Simply put, users are exposed to things that they like. For instance, if an individual were to like a kitten video, Facebook would start feeding them more kitten videos. As an extension, it might also start showing them other similar ‘cute’ videos, as they might like those too. Strictly speaking, this is not necessarily a bad thing. However, such an algorithm does not differentiate between kitten videos and hate speeches. In essence, if someone were to like an article with derogatory content towards a particular community, Facebook would end up showing them more such hateful content. In addition, hateful or negative content is more likely to keep a user engaged than any other type of content. Hence, the algorithm is inclined to show more negative content than positive (Milan, 2015; Santini et al., 2018). This spurs a cycle where the user keeps seeing a particular kind of content and Facebook keeps showing him/her similar content. Moreover, since Facebook is a social platform, the content more often than not comes from a source known to the user, and hence the acceptance of the content is also higher. This puts users in a cycle of ever-strengthening beliefs and attitudes (Peterson-Salahuddin & Diakopoulos, 2020). Such cycles could be termed thought silos.
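The feedback loop described above can be illustrated with a minimal simulation. The sketch below is not Facebook’s actual ranking system, which is proprietary; it is a toy, engagement-based recommender written in Python, with hypothetical topic names and an assumed ‘negativity boost’ parameter, showing how ranking purely by predicted engagement narrows a feed towards whatever the user already interacts with.

```python
# Minimal sketch of an engagement-driven feed (illustrative only, not any real
# platform's algorithm). Topics, scores and probabilities are assumptions chosen
# to demonstrate the reinforcement cycle, i.e. the "thought silo" effect.
import random
from collections import Counter

TOPICS = ["kittens", "sports", "politics_outrage", "science", "cooking"]

def rank_feed(interest, candidates, negativity_boost=1.5):
    """Rank candidate posts by a toy engagement score.

    interest: Counter of the user's past interactions per topic.
    negativity_boost: assumption that emotionally charged content holds
    attention longer, so it gets a higher base score.
    """
    def score(topic):
        base = negativity_boost if topic == "politics_outrage" else 1.0
        return base * (1 + interest[topic]) + random.random() * 0.1
    return sorted(candidates, key=score, reverse=True)

def simulate(rounds=50, feed_size=3, seed=42):
    random.seed(seed)
    interest = Counter()   # what the user has engaged with so far
    shown = Counter()      # what the feed has actually displayed
    for _ in range(rounds):
        candidates = [random.choice(TOPICS) for _ in range(10)]
        feed = rank_feed(interest, candidates)[:feed_size]
        for topic in feed:
            shown[topic] += 1
            # The user is more likely to engage with topics they already follow.
            if random.random() < 0.3 + 0.1 * min(interest[topic], 5):
                interest[topic] += 1
    return shown, interest

if __name__ == "__main__":
    shown, interest = simulate()
    print("Topics shown over 50 rounds:", dict(shown))
    print("Accumulated engagement:", dict(interest))
```

Running the simulation for a few dozen rounds typically shows the feed over-representing whatever the user engaged with early on, plus the artificially boosted negative topic, which mirrors the thought-silo cycle described above.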

Weaponizing the Internet

As technology becomes more sophisticated, the use of propaganda and the techniques employed become more sophisticated as well. In the age of social media, the internet can be weaponised to control public opinion and achieve desired goals. Since social media provides individual, continued and prolonged engagement, which no other form of media can provide, it is used by propagandists to fabricate an alternative reality, using the power of bots and fake accounts to further manipulate people (Rafiq, 2019).

A bot is a program that gives automated responses to posts on social media. These responses could be either positive or negative. Thousands of such programs, or bots, can be managed by a handful of people. Since it is machine-driven, a bot can create thousands of posts in a matter of seconds, creating the perception that there is huge support for a particular cause. A fake account is simply a ‘manufactured online identity’. Fake accounts are generally difficult for the common user to distinguish from real accounts. In essence, a single person can create, and manage, multiple fake accounts on a single social media platform. Based on the behaviour exhibited by these fake accounts, they could be termed trolls as well. A troll is an account (not necessarily fake, but people generally do not use their real identities for trolling) with the sole purpose of derailing a conversation through derogatory language, memes, threats, etc.
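The amplification arithmetic behind this can be made concrete with a short sketch. The Python snippet below uses entirely hypothetical numbers (operators, accounts per operator, posting rates) to illustrate how a handful of people controlling automated accounts can dominate the visible volume of posts around a cause or hashtag.

```python
# Illustrative sketch of bot amplification: how a few operators controlling many
# automated accounts can outpost a much larger number of genuine users.
# All figures below are hypothetical, chosen only to show the effect.
def perceived_support(genuine_users, genuine_posts_per_user,
                      operators, accounts_per_operator, bot_posts_per_account):
    genuine = genuine_users * genuine_posts_per_user
    automated = operators * accounts_per_operator * bot_posts_per_account
    total = genuine + automated
    return automated / total, total

if __name__ == "__main__":
    share, total = perceived_support(
        genuine_users=10_000, genuine_posts_per_user=1,   # real supporters
        operators=5, accounts_per_operator=200,           # 5 people, 1,000 fake accounts
        bot_posts_per_account=50,                          # automated posting volume
    )
    print(f"{total:,} posts on the hashtag, {share:.0%} of them automated")
    # -> 60,000 posts on the hashtag, 83% of them automated
```

In this hypothetical scenario, five operators generate more than four-fifths of the visible posts, which is exactly the manufactured perception of support described above.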

Fake accounts work in tandem with anonymous pages, using Facebook’s algorithms to their advantage and improving each other’s reach. Bots can then improve the working of such networks; however, the usage of bots is not a necessity (Salter, 2018). Such political promotion pages are tailor-made for Facebook and are engineered in ways to take over people’s Facebook news feeds around the world (Trottier, 2017). These accounts thus exploit regional, communal, economic and political divides to generate social movements that disintegrate societal cohesiveness.

FUD: Fear, Uncertainty and Doubt

FUD stands for Fear, Uncertainty and Doubt, a disinformation strategy used to spread negative and/or false information to trigger fear among the masses. A commonly used strategy in sales, marketing, public relations and politics, it was initially used by corporates in the social media landscape. A recent example is the circulated news about Cadbury chocolates being infected with HIV-positive blood, asking customers to stop eating these chocolates for a few weeks. It was clearly a false news story; in fact, the images being circulated for this story were the same as those used for another fake story about Pepsi drinks being infected with HIV (Mehta, 2019).

However, the use of this technique takes an even more sinister form when it comes to political propaganda. For instance, if a person uses a hashtag for a cause which is against that of the propagandist, bots automatically send messages directly to the user sowing fear and doubt. Such messages are often aimed at issuing personal threats to life and property. This effectively shuts out dissenting voices. This is backed by fake accounts that support the campaign. The two work at such a frequency that they effectively drown out any opposing views or campaigns.

Trolls and fake accounts rally people to create and spread messages and memes that have a tiny shred of truth. Such grey propaganda is one of the most efficient ways of using FUD. For example, FUD messages generally cover the following themes to discredit media: i) Bias – the media is biased against the cause; ii) Paid – the journalists are paid and corrupt; iii) Oligarchs – journalists work for vested interests; and iv) Clickbait – they use cheap headlines to garner more attention.

It is also important to note here the difference between traditional media and social media. Consider the repeatability of a news item, which is often a guiding factor for people deciding whether something is true or not. In traditional media, this duty was performed by news houses, which, at least to some extent, had to maintain objectivity towards the truth. However, in the age of social media, where users are both content generators and propagators, a lie reposted or retweeted a million times will start appearing true to most people. In addition, a viral story is a story whether it is true or not. A viral story is then picked up by news houses as well, giving it even more legitimacy. In the new age of social media, the algorithms are the new gatekeepers which decide the stories that deserve to be spread.

Social media platforms have also evolved over the years. Today, people across the world get their news from platforms like Facebook, 9Gag and Twitter. The echo chamber effect thus became even more intense as social media platforms started showing more news articles in users’ news feeds. The algorithms have now essentially become the editor.

“Given that Facebook-owned platforms, including Instagram, Messenger and WhatsApp, reach 86% of internet users aged 16 to 64 in 33 countries, and that 44% of people across 26 countries say they use it for news, that algorithm determines reality” (Propaganda War, 2016).

The biggest flaw of the algorithms, though, is that they do not distinguish between fact and fiction. Thus, anyone with an army of fake accounts and trolls can easily beat their competitors, especially in a scenario where truth gets little weightage. Similarly, in a heated political environment, emotional calls to action garner higher engagement and are then promoted by the algorithm. However, this is not a problem with an easy solution: any attempt by social networking sites to limit what people say on their platforms raises freedom-of-speech issues.

Another thing to take into account is the changing role of technology companies like Facebook. Facebook CEO Mark Zuckerberg has stated his view of Facebook’s role: "We're a technology company. We're not a media company" (Castillo, 2018). However, we cannot turn a blind eye to the impact Facebook has in disseminating news and information to the world, and it is now time to discuss applying to it the ethics and principles that have guided the fourth pillar of democracy till now.

We have discussed the techniques of propaganda as well as the behavioural and cognitive aspects of the masses that the propagandist plays on. It is also important to study what supports or helps propaganda. These supports may not be directly linked to the propagandist idea; however, propagandists use various ways to change the environment to make propaganda more effective. These changes in the environment are focussed on incapacitating the masses through:

Violence

Police brutality, accompanied by violence from organised crime networks and vigilantes, intimidates dissenters and stops them from acting against propagandists. In India this can be seen in the form of murders and intimidation of journalists, mob violence against minorities, and organised crime cartels in almost every industry.

Judicial paralysis

When the judicial system is so designed that it is almost impossible to convict the rich and powerful, the masses lose faith in the struggle against propagandist forces. The Indian judicial system is notorious for being glacially slow and has often not been able to deliver justice in high-profile cases. For instance, none of the people accused in the 2G scam, which turned out to be a multi-billion-dollar scam as per the CAG reports, have been convicted till date.

Curbing education

Education is one of the few ways of defending oneself against propaganda. Hence, propagandists have always tried to control education and keep it from reaching the masses. India’s dismal education record is a testament to this. As per a government report, only 24% of people in India pursue any kind of higher education. Although about 70% of children enrol in secondary education in India, as per a UN report, the quality of education is so poor that basic literacy has not been achieved even by these students (Education & Employability, 2016).

Curbing free speech and investigative journalism

Journalism acts as the fourth pillar of democracy. It is imperative for the propagandist not just to control the media but also to curb investigative journalism. India fares among the lowest countries when it comes to protecting its journalists, ranking lower than even some war-torn countries and dictatorships. In addition, the country’s sedition laws have been used by the government to stifle any sort of criticism of itself or its policies.

Increasing power distance

Another aspect that has hardly been investigated in the existing literature is that of power disparity. By definition, power distance represents an individual’s and society’s acceptance of power and authority. India has one of the highest power distance indices in the world. Increasing power distance incapacitates the population from standing up for what is right and dissenting against power figures. It also has a trickle-down effect whereby not only the highest authorities reap the benefit of power distance, but almost everyone who has any power over someone else also benefits.

Hate

It is interesting to analyse hate in the context of propaganda. History has shown us that where propaganda has worked to devastating effect, there were copious amounts of hate in the society. This, we believe, is not by accident but by design. Indian history shows that the British ruled India for over 100 years by the policy of ‘divide and rule’. Even after 70 years of independence, the policy can still be seen at play in India. Propagandists sow hate in the environment because the effect of propagandist messages is exaggerated in the presence of hate. In our research, we have observed that propaganda works best in a negative environment. Propagandists work on public fears and insecurities to further their cause.

Discussion

Propaganda works; there is no doubt about that. Moreover, it works better with pre-existing beliefs, and better still with negative ones. The question, then, is whether or not to use propaganda. Do we as a society stand to gain or lose anything from this? Extant literature provides evidence that the propagandist plays on the following: confirmation bias, group norms, appeals to emotions, appeals to existing attitudes, discrediting other sources of information, appeals to people’s cognitive dissonance, repeated exposure to an idea making people more apt to accept it, and the ease of inciting existing emotions over driving change.

The application of these theories by the propagandist has essentially led to the issue of polarisation in society. It is beneficial for the propagandist if people keep moving towards the extremes. An individual keeps absorbing more and more propagandist material, which keeps reinforcing his or her existing dispositions. Often these dispositions are rooted in cultural stereotypes, religious or communal vitriol, and nationalistic fervour. This leads to the development of hyper attitudes in the individual, where each of these ideas is heightened to the extreme, and to a decrease in empathy and reasoning. A person may thus not just become a more polarised and hateful individual, but may also cease to add value to society.

Similarly, the struggle for power (or market share) that the propagandist is engaged in shifts the focus away from important issues. Since logic and reason are not the weapons of propaganda, the main issues in a propagandist’s message are always general ones, which cannot be resolved easily or in the short term, and which more often than not incite negative emotions in the masses. Hence, agendas such as education, healthcare and the environment take a back seat, and attention is given to communal disputes, political vendettas, and scandals.

Another aspect of propaganda which is not widely discussed is the refusal to accept one’s fault or failure. Since propaganda works on personalities and their appeal to the masses, a propagandist cannot publicly accept any fault committed by him or her. This leads to the classic case of escalation of commitment, where the propagandist spends ever-increasing amounts of resources on keeping up the appearance of being right, meanwhile squandering any opportunity to correct the mistake.

Another aspect that deserves deeper exploration is the moral corruption due to behaviour modelling. People see that it is only important to maintain outward appearances, even while performing illegal acts. Hence, the society as a whole moves towards ends justifying means.

Researchers in the past have focussed on the effectiveness of propaganda. They have not focussed on the repercussions the use of propaganda has on society. For instance, in India the ‘Babri Masjid-Ram Mandir’ dispute has been a prominent political campaign point for decades. This has led to ever-increasing communal tensions between Hindus and Muslims in the country’s largest state. It would be interesting to see if this has had any effect on the wellbeing of the society. Has it led to more divisiveness in the country? Has it led to important issues being side-lined? These are important questions that need to be addressed on an urgent basis.

Conclusion and Further Research Orientations

In our review, we have tried to understand propaganda and its role in our society. It is evident from the literature that propaganda today is widespread. We can see propaganda at work in both war-torn countries and advanced democracies. With vast improvements in information dissemination technologies, it has become easier to spread propaganda. With complex communication networks now in place, information has become easy to access, but reliable information has become difficult to find. Institutions of power across the world find it necessary to control mass behaviour, and propaganda provides an easy tool. However, it is difficult to ascertain the nature of a communication, as an individual’s attitudes guide their perception of the received information. As discussed earlier, a propagandist’s deliberate intent to manipulate the receiver distinguishes propaganda from other forms of communication.

Occasionally, we view our actions in isolation and ignore their impact on the world. We believe the use of propaganda has a similar hidden impact on the world. Establishing the effect of propaganda on society could open avenues for further theoretical development in related fields such as policy and decision making. As the current definition of marketing includes adding value to society as a whole, it is imperative to analyse such marketing techniques for their impact on society as well.

Further research opens multiple avenues for managers as well. For instance, it may help determine for which brands propagandist techniques might work. The returns from propaganda techniques could be compared with those obtained from traditional methods of marketing. The long-term impact of using propagandist techniques on brand equity should also be researched. The effect on brand or firm credibility, in the eyes of the customer, of being found using propagandist techniques could also be explored. These are important managerial issues that deserve further probing.

References

  1. Aronson, E. (1980). The Social Animal, San Francisco. H. Freeman and Company.
  2. Auger, G.A. (2013). Fostering democracy through social media: Evaluating diametrically opposed nonprofit advocacy organizations’ use of Facebook, Twitter, and YouTube. Public Relations Review, 39(4), 369-376.
  3. Bogart, L., & Bogart, A. (1995). Cool words, cold war: A new look at USIA’s Premises for propaganda. Univ Pr of Amer.
  4. Burnett, N. (1989). Ideology and propaganda: Toward an integrative approach. Propaganda: A Pluralistic Perspective, 127-137.
  5. Cameron, K.S., Ireland, R.D., Lussier, R.N., New, J.R., & Robbins, S.P. (2003). Management textbooks as propaganda. Journal of Management Education, 27(6), 711-729.
  6. Carey, A. (1997). Taking the risk out of democracy: Corporate propaganda versus freedom and liberty. University of Illinois Press.
  7. Castillo, M. (2018). Zuckerberg tells Congress Facebook is not a media company: “I consider us to be a technology company.” CNBC. https://www.cnbc.com/2018/04/11/mark-zuckerberg-facebook-is-a-technology-company-not-media-company.html
  8. Chang, T.K., & Lin, F. (2014). From propaganda to public diplomacy: Assessing China’s international practice and its image, 1950–2009. Public Relations Review, 40(3), 450–458. https://doi.org/10.1016/j.pubrev.2014.04.008
  9. Collins, S. (2017). Star Testimonials and Trailers: Mobilizing during World War I. Cinema Journal, 57(1), 46-70. https://doi.org/10.1353/cj.2017.0055
  10. Collison, D.J. (2003). Corporate propaganda: Its implications for accounting and accountability. Accounting, Auditing & Accountability Journal.
  11. Conserva, H.T. (2003). Propaganda Techniques. AuthorHouse.
  12. Conway, M., Grabe, M.E., & Grieves, K. (2007). Villains, victims and the virtuous in Bill O’Reilly’s “No-Spin Zone”: Revisiting world war propaganda techniques. Journalism Studies, 8(2), 197-223.
  13. Cooper, C.B. (1971). A Description and Analysis of Propaganda Techniques used in Undergraduate Recruiting Materials Published and Distributed by the University of Georgia, College of Agriculture. 4.
  14. Cornelissen, J.P., & Werner, M.D. (2014). Putting framing in perspective: A review of framing and frame analysis across the management and organizational literature. Academy of Management Annals, 8(1), 181-235.
  15. Creel, G. (1920). How we advertised America: The first telling of the amazing story of the Committee on Public Information that carried the gospel of Americanism to every corner of the globe. Harper & brothers.
  16. Curnalia, R.M.L. (2005). A Retrospective on Early Studies of Propaganda and Suggestions for Reviving the Paradigm. Review of Communication, 5(4), 237-257.
  17. Czitrom, D.J. (1982). Media and the American mind: From Morse to McLuhan. Univ of North Carolina Press.
  18. DeVito, J.A. (1986). The communication handbook: A dictionary. Harpercollins.
  19. Doob, L.W. (1948). Public opinion and propaganda.
  20. Education & Employability. (2016). UN India. Retrieved August 30, 2020, from https://in.one.un.org/un-priority-areas-in-india/education-and-employability/
  21. Ellul, J. (1965). Propaganda: The formation of men’s attitudes (K. Kellen & J. Lerner, Trans.). New York: Vintage.
  22. Fleming, C.A. (1995). Understanding propaganda from a general semantics perspective. ETC.: A Review of General Semantics, 52(1), 3-13.
  23. Guth, D.W. (2009). Black, White, and Shades of Gray: The Sixty-Year Debate Over Propaganda versus Public Diplomacy. Journal of Promotion Management, 14(3-4), 309–325.
  24. Hardt, H. (1989). The return of the “critical” and the challenge of radical dissent: Critical theory, cultural studies, and American mass communication research. Annals of the International Communication Association, 12(1), 558–600.
  25. Hersh, S.M. (2003). Who Lied to Whom? The New Yorker. Retrieved August 28, 2020, from https://www.newyorker.com/magazine/2003/03/31/who-lied-to-whom
  26. Hiebert, R.E. (2003). Public relations and propaganda in framing the Iraq war: A preliminary review. Public Relations Review, 29(3), 243-255. https://doi.org/10.1016/S0363-8111(03)00047-X
  27. Jowett, G.S., & O’Donnell, V. (2012). Propaganda & persuasion (5th ed.). SAGE.
  28. Jowett, G.S., & O’Donnell, V. (2018). Propaganda & persuasion. Sage Publications.
  29. Kurtz, H. (1998). Spin cycle: Inside the Clinton propaganda machine. Free Press New York.
  30. Lamkin, F.D. (1955). An Analysis of Propaganda Techniques Used in “Why Johnny Can’t Read”: Flesch. The Reading Teacher, 9(2), 107-117. JSTOR.
  31. Lang, K. (1989). Communications research: Origins and development. International Encyclopedia of Communications, 1, 369-374.
  32. Lasswell, H.D. (1927). Propaganda technique in the world war. Ravenio Books.
  33. Lee, A.M. (1945). The Analysis of Propaganda: A Clinical Summary. American Journal of Sociology, 51(2), 126-135. JSTOR.
  34. Lee, A.M., & Lee, E.B. (1995). The iconography of propaganda analysis. ETC: A Review of General Semantics, 52(1), 13-17. JSTOR.
  35. Lock, I., & Ludolph, R. (2020). Organizational propaganda on the Internet: A systematic review. Public Relations Inquiry, 9(1), 103-127.
  36. Lock, I., Seele, P., & Heath, R.L. (2016). Where grass has no roots: The concept of ‘shared strategic communication’ as an answer to unethical astroturf lobbying. International Journal of Strategic Communication, 10(2), 87-100.
  37. Mehta, A. (2019, November 26). Cadbury products contaminated with HIV? Old image from Nigeria used to share false claim. Alt News. https://www.altnews.in/cadbury-products-contaminated-with-hiv-old-image-from-nigeria-used-to-share-false-claim/
  38. Messina, A. (2007). Public relations, the public interest and persuasion: An ethical approach. Journal of Communication Management.
  39. Milan, S. (2015). When algorithms shape collective action: Social media and the dynamics of cloud protesting. Social Media+ Society, 1(2), 2056305115622481.
  40. Moloney, K. (2004). Debate papers: Democracy and public relations. Journal of Communication Management, 9(1), 89.
  41. O’Donnell, V., & Kable, J. (1982). Persuasion: An interactive-dependency approach. Random House.
  42. O’Reilly, T. (2009). What is web 2.0. O’Reilly Media, Inc.
  43. O’Shaughnessy, N. (1996). Social propaganda and social marketing: A critical difference? European Journal of Marketing.
  44. Parry-Giles, S.J. (2002). The rhetorical presidency, propaganda, and the Cold War, 1945-1955. Greenwood Publishing Group.
  45. Peterson-Salahuddin, C., & Diakopoulos, N. (2020). Negotiated autonomy: The role of social media algorithms in editorial decision making. Media and Communication, 8(3), 27-38.
  46. Pierce, W.M. (1940). Climbing on the Bandwagon. The Public Opinion Quarterly, 4(2), 241-243. JSTOR.
  47. Propaganda war: Weaponizing the internet. (2016). Rappler. Retrieved August 30, 2020, from https://rappler.com/nation/propaganda-war-weaponizing-internet
  48. Rafiq, A. (2019). Like War: The Weaponisation of Social Media. JSTOR.
  49. Rogers, E.M. (1994). History of communication study. Free Press New York.
  50. Salter, M. (2018). Publicising privacy, weaponising publicity: The dialectic of online abuse on social media. In Digital Intimate Publics and Social Media (pp. 29-43). Springer.
  51. Santini, R.M., Agostini, L., Barros, C.E., Carvalho, D., Centeno De Rezende, R., Salles, D.G., Seto, K., Terra, C., & Tucci, G. (2018). Software Power as Soft Power. A Literature Review on Computational Propaganda Effects in Public Opinion and Political Process [Data set]. University of Salento. https://doi.org/10.1285/I20356609V11I2P332
  52. Shultz, R.H., Godson, R., Graff, R.D., Schultz, R., Sterling, C.H., Evans, J., Sreberny-Mohammadi, A., Schiller, H.I., Hawkridge, D., Robinson, J., Lewis, P.M., Bell, P., Cansell, R., Beilby, P., Muscia, W.T., & Macmillian, P.R. (1984). Foreign and International. Communication Booknotes, 15(11), 124-127.
  53. Simpson, C. (1994). Science of coercion: Communication research and psychological warfare 1945-1969. New York: S. 8f.
  54. Sproule, J.M. (1994). Channels of Propaganda. ERIC.
  55. Stelter, B., & Stone, B. (2009). Web pries lid of Iranian censorship. New York Times, 22.
  56. Stevens, J.P. (2012). Glittering Generalities and Historical Myths. University of Louisville Law Review, 51, 419.
  57. Szanto, G.H. (1977). Theater & propaganda. University of Texas Press.
  58. Taylor, M., & Kent, M.L. (2014). Dialogic engagement: Clarifying foundational concepts. Journal of Public Relations Research, 26(5), 384-398.
  59. Tilley, E. (2004). Propaganda — Who, Us? The Australian Government ‘Terror Kit.’ Media International Australia Incorporating Culture and Policy, 113(1), 30-43.
  60. Trottier, D. (2017). Digital vigilantism as weaponisation of visibility. Philosophy & Technology, 30(1), 55-72.
  61. Weaver, C.K., Motion, J., & Roper, J. (2006). From propaganda to discourse (and back again): Truth, power, the public interest and public relations. Public Relations: Critical Debates and Contemporary Practice, 7-21.
  62. Woolley, S.C., & Howard, P.N. (2016). Political communication, computational propaganda, and autonomous agents: Introduction. International Journal of Communication, 10.
  63. Yourman, J. (1939). Propaganda Techniques Within Nazi Germany. The Journal of Educational Sociology, 13(3), 148-163.