Buddhist perspectives on society and culture



Conformists, Dissenters and Contrarians

Posted in: Politics

Book Review: ‘Conformity: The Power of Social Influences’ by Cass R. Sunstein, NYU Press, 2019

Cass Sunstein is an American legal scholar who served as Administrator of the White House Office of Information and Regulatory Affairs in the Obama administration from 2009 to 2012. He is also a prolific author, having written many books on law and politics, as well as some popular ones, such as Nudge (with Richard Thaler), Infotopia, and How Change Happens.

Conformity is a very valuable book about how certain supposed facts become accepted truths, and about people who question the truth of those ‘facts’. The Preface contains two proposals which, Sunstein says, ‘capture much of the territory’. Firstly, ‘The actions and statements of other people provide information about what is true and what is right.’ Secondly, ‘The actions and statements of other people tell you what you ought to do and say if you want to remain in their good graces (or get there in the first place).’

Sunstein returns to these two proposals a little later, where he states that if a number of people believe that a proposition is true, there is reason to believe that it is in fact true. It’s a pity that in a book that I intend to praise highly, I have to disagree with one of the first things that the author says, but unfortunately that is the case: I don’t think there is any good reason to believe something simply because the majority of people think it’s true. On the contrary, I have found that most people are not very well informed about most things.1

Because Sunstein thinks there is reason to believe what the majority consider to be true, he goes on to say that conformity is often a rational course of action, and that much of the time people do better when they take close account of what others say and do. But if there isn’t a good reason to believe what the majority consider to be true, then it’s actually irrational to conform, although it depends on what we mean by ‘do better’. Conforming makes life much easier: if you follow the crowd, even when what they are thinking and doing is wrong or mistaken, you will be better liked and will probably be able to advance your career more effectively. Dissenters tend to be unpopular.

However, Sunstein also makes the point that widespread conformity deprives the public of information that it needs to have, and this lack of information can lead groups in the wrong direction. A ‘majority consensus’ can easily mislead individuals into inaccurate, irrational, or unjustified judgements. Dissent can therefore be an important corrective, and in Sunstein’s opinion, many groups and institutions have too little of it.

Sunstein goes on to introduce three points that he emphasises throughout the book. The first is that people who are confident and firm have a particularly strong influence (the confidence heuristic). The reason for this is simple: when someone says something very confidently, people assume that they really know what they are talking about. This isn’t always the case, of course, and confident people can consequently lead groups into making very big mistakes. Secondly, people are extremely vulnerable to the unanimous views of others: ‘if everyone else thinks this, then I suppose they must be right’. Hence a single dissenter can have a huge impact, because they weaken the power of unanimity. Thirdly, people who belong to a group we distrust or dislike (a kind of ‘outgroup’) are far less likely to influence us. In fact, we might say or do the very opposite. This phenomenon is known as ‘reactive devaluation’: ‘Well, if she says it, it must be wrong’, or even ‘He is right, but I don’t want to be seen to be agreeing with him.’ Conversely, people are especially likely to conform when the group consists of people they like or admire.

Sunstein also makes a very interesting distinction between compliance and acceptance: people comply when they defer to others who they believe to be wrong. In that case they conform in public but not in private. People accept when they internalize the view of the group. That is, although previously (and privately) they had disagreed with the group, they take on the majority view, not because they have been convinced by reasons and evidence, but because it is more important to them to be accepted by the group, so they censor their own point of view or even relevant information. As Sunstein remarks, if certain views are punished, unpopular views might eventually be lost to public debate, so that what was once ‘unthinkable’ is now ‘unthought’. If people know that a certain view is considered to be unacceptable by the majority, they won’t even allow themselves to think it. This of course is common in totalitarian societies, but it also happens in more democratic, ‘open’ societies, because the need to belong and be accepted is so strong.

A particularly interesting and important idea in the book is that of cascades. A cascade is a large-scale social phenomenon in which many people think, believe, or do something because of the beliefs or actions of a few ‘early movers’. According to Sunstein there are two main types of social cascade – informational and reputational. An informational cascade is an idea or a supposed fact that becomes widely accepted, so that it ‘cascades’ through society. People often mistake a cascade for a series of separate and independent judgements, but this is not the case. When something becomes a cascade, we believe what others say because we assume they have independent information that we don’t have. But those people too are just following their predecessors, whom they likewise assume to hold information that they lack. Thus, the blind lead the blind.

In a reputational cascade, some people may disagree with the majority consensus, but they nonetheless go along with the crowd, because they fear that dissenting may adversely affect their reputation. According to Sunstein, even the most confident people can fall prey to this, silencing themselves in the process. In a recent interview, the scientist Rupert Sheldrake, speaking about his own unorthodox scientific views, said that he knows many other scientists agree with him, but don’t say so for fear of the consequences:

There’s a strong distinction between people’s private opinions and what they say in public. In public, if you want to get on in the world as a scientist — if you want to get your grant applications renewed, if you want to get promoted — then if you have dissident views you keep quiet about them. I think it’s rather like Russia under Brezhnev. In the Soviet Union very few people believed in Marxist materialism, the official state atheist philosophy, but they pretended to believe it because it was the only way they’ll get promoted and get ahead in your career (sic). But when the Soviet Union collapsed, how many true believing Marxists and communists were there? There were certainly some, but it was a minority, and I think it’s so much the same in science.2

Those who don’t care about their reputation, or at least care more about the truth than about their reputation, and who consequently say what they really believe to be true, perform a valuable public service, often at their own expense. That expense can be very high for the earliest dissenters, because, as Sunstein says, they are conspicuous, individually identified, and easy to isolate for reprisals. The epidemiologist Sunetra Gupta is a critic of the lockdown approach to the Covid-19 pandemic, and was one of the primary authors of the Great Barrington Declaration.3 She has recently written about her experience following an interview in which she explained her position. She says that at the time of the interview

… I still had no idea how politicised and nasty this controversy would become – I was still happy to share early hypotheses, not realising that they would be systematically collected and deployed to try to destroy my professional credibility, including by people I had previously respected and admired.4

Not that all cascades are necessarily based on misinformation. It’s possible of course that those ‘early movers’ — the people who initiate a cascade — are correct, and that there is therefore good reason to do what they recommend. The danger with cascades is the assumption that, because so many people in so many media are relaying the same message, it must be true. That assumption could be misplaced. So what are we to do? The most responsible thing is to go to the facts — the data — to check whether the messages we are receiving are true or false. This can be done relatively easily thanks to the internet. By this I don’t mean going to what journalists, social commentators or even experts tell us (for experts often disagree with one another, and some may have a vested interest in a particular view); I mean looking at the data. For instance, if you hear that the number of deaths per year from natural disasters has risen over the last hundred years, look it up.5

If you discover that what everyone seems to be saying is not true, or not as true as is being claimed, then you have a choice: you can either speak up and try to correct the general mistake — and so become a dissenter — or you can remain silent. As Sunstein writes:

People silence themselves not because they believe they are wrong but because they do not want to face the disapproval that, they think, would follow from expressing the view they believe to be correct. The problem and the result are pluralistic ignorance: ignorance, on the part of most or all, of what most people actually think. In the face of pluralistic ignorance, people can assume, wrongly, that others have a certain view, and they alter their statements and actions accordingly.

Sunstein points out that there is an important distinction between a dissenter and a contrarian. Dissenters have important information that may benefit society, so it’s good if people listen to them and consider what they say, hard as that may be (hard because once we believe something to be true, we become attached to that view and don’t want to let it go; as Mark Twain once said, ‘It’s easier to fool people than to convince them that they have been fooled.’) Contrarians, on the other hand, think they will be rewarded, financially or otherwise, simply for disagreeing with others (thus they can also be called controversialists). Because they aren’t interested in the truth, their pseudo-disagreements confuse rather than clarify.

It’s obviously important to distinguish between dissenters and contrarians, but that isn’t easy. Sunstein doesn’t offer any ways to help us do this, and as it’s a matter of personal integrity, or the lack of it, the only thing we can do is try to get a sense of the person. Why are they making these claims? What is their motivation? But sensing another’s motivation, especially if you don’t know them personally, is fraught with difficulties, one of them being that it’s easier to doubt their motives than it is to investigate their claims (which takes time) and risk having to change your mind (which people are naturally averse to doing). Not to mention the possibility that if the evidence shows that the dissident is right, you may have to become a dissident yourself! (Which probably means becoming a bit unpopular.)

But if your interest is in the truth rather than in having an easy life, it’s best to give the benefit of the doubt to anyone who disagrees with the majority view, and look into it yourself. You may think that this would entail some wasted time if you discover that the evidence shows that the contrarian or dissident is wrong. But you won’t really have wasted time: if you look into a criticism of your opinion and find it groundless, that will turn what was perhaps an opinion taken on faith into a view based on evidence; and if you already had some evidence, checking the dissenter’s claim will clarify and strengthen your understanding of the issue.

Another thing Sunstein warns us of is the availability heuristic. Because most of us lack statistical knowledge on questions about trends, probability, etc., we rely on examples that come readily to mind (that are easily ‘available’). The problem with this is that we tend to remember only dramatic events. If, for example, we are told that floods are becoming more common, we can easily remember the last few times there were serious floods, and these memories seem to confirm the assertion. Sunstein goes on to say that the availability heuristic doesn’t operate in a social vacuum, it’s partly a function of social interactions. One person mentions the floods of a certain year, which reminds another of the floods a few years before that, which reminds another of the floods in another part of the country, and so on, thus initiating an availability cascade. ‘All these floods, yes, they are becoming more frequent.’ (Are they? Look it up.)

Another interesting social dynamic is that of group polarisation:

When like-minded people talk with one another, they usually end up thinking a more extreme version of what they thought before they started to talk.

This happens especially in social media forums, which can be mere ‘echo chambers’ where people of similar views hear only the views that they themselves hold. Perhaps they would better be called ‘amplification chambers’. In this context Sunstein explores what he calls the dynamics of outrage. People who begin with a high level of outrage become still more outraged as a result of group discussion with others who are similarly outraged. This dynamic has a great influence on feuds:

… one of the characteristic features of feuds is that members of feuding groups tend to talk only to one another, fuelling and amplifying their outrage and solidifying their impression of the relevant events.

Sunstein offers three reasons for group polarisation. The first is connected with conformity and cascades: people respond to the arguments made by others, and the arguments expressed in any group with an initial position will inevitably be skewed towards that position. The second reason is to do with confidence, corroboration, and extremism: if other people share and corroborate your views, you are likely to become more confident that you are correct, and hence to move in a more extreme direction. The third reason involves social comparison. Sunstein writes:

… people want to be perceived favourably by other group members and also to perceive themselves favourably. Their views may, to a greater or lesser extent, be a function of how they want to present themselves. Once people hear what others believe, they adjust their positions in the direction of the dominant position, to hold on to their perceived self-presentation […] People might emphasise shared views and information, and downplay unusual perspectives and new evidence, simply from a fear of group rejection and the desire for general approval.

However, it’s not inevitable that groups polarise. Sunstein proposes five factors which can increase or decrease polarisation; the first three tend to increase it and the last two to decrease it. Firstly, extremists are especially prone to polarisation, and because there is a link between confidence and extremism, the confidence of particular members plays an important role: confident people are both more influential and more prone to polarisation. Secondly, polarisation will increase if members of a group have a shared identity and a high degree of solidarity. Thirdly, group polarisation tends to increase over time because of what Sunstein terms ‘exit’: some people leave a group because they reject the direction in which it is going. The group becomes smaller, but its members will be both more like-minded and more willing to take extreme measures. Fourthly (the first factor that tends to decrease polarisation), there is a connection between truth and confidence. When one or more people in a group know the right answer to a factual question, the group is likely to shift in the direction of accuracy, because the person who knows the answer will speak with confidence and authority, and so is likely to be convincing. (Incidentally, there is a link between what prevents polarisation and what shatters cascades: a person who knows, and is known to know, the truth.) Fifthly, polarisation is less likely to happen, or can be decreased, when the group consists of individuals drawn equally from two extremes, probably because information and persuasive arguments are then available from both sides of a debate.

As I said at the beginning of this review, in my opinion Conformity is a valuable book. I value it both as a citizen and as a member of the Triratna Buddhist Community. As to the latter, I am uncomfortably aware that studies have shown that groups of people tied together by affection, friendship and solidarity tend not to perform very well, because their members lean too much towards conformism. That is because harmony is more important to the members of such groups than making sure that decisions are based on the best information. As Sunstein says:

Bonds of affection and solidarity are often important to group members, and many people do not appreciate dissent and disagreements. Perhaps the real point of the relevant group or organisation is not to perform well but to foster an optimistic outlook and good relationships. Conformists avoid creating the difficulties that come from contestation but at the expense, often, of a good outcome; dissenters tend to increase contestation while also improving performance.

Religious groups are particularly susceptible to this phenomenon for obvious reasons. The founder of the Triratna Buddhist Community, Urgyen Sangharakshita, emphasised the importance of becoming independent of the need for the approval of the group, and becoming an individual. His definition of an individual included, amongst other things, taking responsibility for one’s actions, the ability and willingness to think for oneself, the willingness to state one’s opinions, even if one knows those opinions are unpopular, and therefore the willingness to stand on one’s own if necessary. Sangharakshita’s definition of the spiritual community is ‘a free association of individuals’. In other words, a spiritual community is not merely a religious group. Or rather, a spiritual community ideally is not a religious group. In practice, of course, members of a spiritual community don’t always live up to that ideal, and when they don’t, the spiritual community degenerates into a group.

Members of the Triratna Buddhist Community therefore have a duty to be individuals, at least to the extent that they are able. This may mean at times dissenting from the opinion of the majority, with the attendant risk of causing disharmony. Obviously, each person has to weigh up the pros and cons of speaking up at any particular time: that is, to balance the need to speak the truth with the need to maintain harmonious relations. We have to ask ourselves whether the issue is important enough to justify risking the harmony of the community. There are times, I would suggest, when it is not. Why risk falling out with others over a trivial matter? However, there will be times when we do consider speaking the truth to be more important than harmony. As Sangharakshita once said, ‘honest collision is better than dishonest collusion’. In any case, there are different types of harmony. The harmony that depends on people withholding their opinions is a fragile and shallow kind of harmony, and it may well be worth risking some dissonance in order to create a stronger and deeper harmony.

But there is also an other-regarding aspect to this. As Buddhists we have a responsibility towards the wider society in which we live. To go along with the crowd when we believe it is wrong is to neglect that responsibility. It is trading our commitment to the truth for an easy life. Of course, that commitment to the truth entails the effort to find out what the truth is, which takes time. To hold an opinion, to act on that opinion, and to encourage others to act on it, without first making sure, as far as possible, that the opinion is based on the truth, is to betray our commitment to help others, and to make the world a better place.

Cass Sunstein’s Conformity: The Power of Social Influences is an important book. I do have some criticisms in addition to the one I mentioned at the beginning of this review. The book is in some ways culture-specific. Sunstein is an American, and he takes all of his examples from the USA. More specifically, he is a professor of law, and many of his discussions draw narrowly on that field, especially in the final chapter, ‘Law and Institutions’. Thus, at times the book seems to be written for everyone, but at other times only for Americans with a special interest in the law. Still, for the reader prepared to bear with this particularity, the discussions turn out to be very interesting and widely relevant in their implications. (For instance, Americans are apparently more likely to complete their tax form honestly if they think most other people are doing so. I would guess that people in other countries do likewise.) My final criticism is that nearly all of the statements in the book are based on studies carried out by social scientists and psychologists. This is not a fault of course, but if you are going to describe a study, you ought to make it as clear as possible. Unfortunately, there are times when Sunstein doesn’t explain a study very well, so readers may struggle to understand how its conclusions were reached. Apart from these quibbles, it’s a very good book, and will be of great benefit to anyone who wishes to free themselves from the group and become an individual, in Sangharakshita’s sense of the word. It will also inspire anyone who wants to live by the truth and make the world a better place. And it will challenge conformists to question and investigate issues rather than take them on faith. Highly recommended.


  1. This is supported by Hans Rosling in his book Factfulness
  2. PoetryEast with Rupert Sheldrake: Where Science Meets Consciousness – YouTube
  3. Great Barrington Declaration (gbdeclaration.org)
  4. Sunetra Gupta: have my Covid hypotheses held up? – The Post (unherd.com)
  5. Natural Disasters – Our World in Data. Scroll down to the chart entitled ‘Global Deaths from Natural Disasters 1900-2019’, and further down to ‘Global annual deaths by natural disasters, by decade’.

Ratnaguna has been a member of the Triratna Buddhist Order for 45 years. He is a well-known teacher and has written four books - The Art of Reflection, Great Faith Great Wisdom (with Dharmachari Śraddhāpa), Kindfulness (in Spanish, with Dharmachari Dharmakirit), and, under his civil name, Gary Hennessey, The Little Mindfulness Workbook.

