
Scalia's Greatest Dissent: "What Is Golf"

Daniel Bier

Progressivism vs. sarcasm

Monday, February 15, 2016

The late Justice Scalia wrote many important and eloquent majority opinions, but he may be best known for his often scathing, always witty, and sometimes hilarious dissents. It's impossible to pick one that is his objectively "best" — though many of his dissents will likely end up being more influential than the controlling opinion they criticized.

But, for my money, his dissent in the case of PGA Tour v. Martin contains the funniest, and some of the clearest, writing ever to come down from the Supreme Court.

A quick background of the case:

  • Casey Martin suffered from a circulatory disorder that made it hard for him to walk a golf course.
  • Martin asked to use a golf cart in a PGA qualifying tournament; PGA refused, citing the rule requiring golfers to walk between holes.
  • Martin sued under the Americans with Disabilities Act, which requires "public accommodations" to make "reasonable modifications" in order to allow disabled customers to use them — unless such modifications "would fundamentally alter the nature of such ... accommodations."

The question for the Supreme Court was 1) whether the ADA required pro golf tournaments to admit contestants with disabilities, and 2) whether allowing a disabled contestant to ride around in a golf cart would "fundamentally alter the nature" of a PGA golf tournament.

In a 7-2 ruling, the Court held that it wouldn't "fundamentally" alter the game and ordered the PGA Tour to let Martin ride around in a golf cart.

Scalia dissented. Vehemently.

Here is an excerpt from the dissent that should serve as a textbook model of wit and clarity and humor in legal writing. (Note also his emphasis on the difference between what one ought to do and what the law requires, and his profound appreciation for unintended consequences.)

SCALIA, J., dissenting

The Court attacks this “fundamental alteration” analysis by asking two questions: first, whether the “essence” or an “essential aspect” of the sport of golf has been altered; and second, whether the change, even if not essential to the game, would give the disabled player an advantage over others and thereby “fundamentally alter the character of the competition.” It answers no to both.

Before considering the Court’s answer to the first question, it is worth pointing out that the assumption which underlies that question is false. Nowhere is it writ that PGA TOUR golf must be classic “essential” golf.

Why cannot the PGA TOUR, if it wishes, promote a new game, with distinctive rules (much as the American League promotes a game of baseball in which the pitcher’s turn at the plate can be taken by a “designated hitter”)? If members of the public do not like the new rules – if they feel that these rules do not truly test the individual’s skill at “real golf” (or the team’s skill at “real baseball”) they can withdraw their patronage.

But the rules are the rules. They are (as in all games) entirely arbitrary, and there is no basis on which anyone – not even the Supreme Court of the United States – can pronounce one or another of them to be “nonessential” if the rulemaker (here the PGA TOUR) deems it to be essential.

If one assumes, however, that the PGA TOUR has some legal obligation to play classic, Platonic golf – and if one assumes the correctness of all the other wrong turns the Court has made to get to this point – then we Justices must confront what is indeed an awesome responsibility. It has been rendered the solemn duty of the Supreme Court of the United States, laid upon it by Congress in pursuance of the Federal Government’s power “to regulate Commerce with foreign Nations, and among the several States,” to decide What Is Golf.

I am sure that the Framers of the Constitution, aware of the 1457 edict of King James II of Scotland prohibiting golf because it interfered with the practice of archery, fully expected that sooner or later the paths of golf and government, the law and the links, would once again cross, and that the judges of this august Court would some day have to wrestle with that age-old jurisprudential question, for which their years of study in the law have so well prepared them: Is someone riding around a golf course from shot to shot really a golfer?

The answer, we learn, is yes. The Court ultimately concludes, and it will henceforth be the Law of the Land, that walking is not a “fundamental” aspect of golf.

Either out of humility or out of self-respect (one or the other) the Court should decline to answer this incredibly difficult and incredibly silly question. To say that something is “essential” is ordinarily to say that it is necessary to the achievement of a certain object. But since it is the very nature of a game to have no object except amusement (that is what distinguishes games from productive activity), it is quite impossible to say that any of a game’s arbitrary rules is “essential.”

Eighteen-hole golf courses, 10-foot-high basketball hoops, 90-foot baselines, 100-yard football fields–all are arbitrary and none is essential. The only support for any of them is tradition and (in more modern times) insistence by what has come to be regarded as the ruling body of the sport–both of which factors support the PGA TOUR’s position in the present case.

(Many, indeed, consider walking to be the central feature of the game of golf – hence Mark Twain’s classic criticism of the sport: “a good walk spoiled.”)


My belief that today’s judgment is clearly in error should not be mistaken for a belief that the PGA TOUR clearly ought not allow respondent to use a golf cart. That is a close question, on which even those who compete in the PGA TOUR are apparently divided; but it is a different question from the one before the Court. Just as it is a different question whether the Little League ought to give disabled youngsters a fourth strike, or some other waiver from the rules that makes up for their disabilities.

In both cases, whether they ought to do so depends upon (1) how central to the game that they have organized (and over whose rules they are the master) they deem the waived provision to be, and (2) how competitive – how strict a test of raw athletic ability in all aspects of the competition – they want their game to be. But whether Congress has said they must do so depends upon the answers to the legal questions I have discussed above – not upon what this Court sententiously decrees to be “decent, tolerant, [and] progressive.”

And it should not be assumed that today’s decent, tolerant, and progressive judgment will, in the long run, accrue to the benefit of sports competitors with disabilities. Now that it is clear courts will review the rules of sports for “fundamentalness,” organizations that value their autonomy have every incentive to defend vigorously the necessity of every regulation. They may still be second-guessed in the end as to the Platonic requirements of the sport, but they will assuredly lose if they have at all wavered in their enforcement.

The lesson the PGA TOUR and other sports organizations should take from this case is to make sure that the same written rules are set forth for all levels of play, and never voluntarily to grant any modifications. The second lesson is to end open tryouts. I doubt that, in the long run, even disabled athletes will be well served by these incentives that the Court has created.

Complaints about this case are not “properly directed to Congress.” They are properly directed to this Court’s Kafkaesque determination that professional sports organizations, and the fields they rent for their exhibitions, are “places of public accommodation” to the competing athletes, and the athletes themselves “customers” of the organization that pays them; its Alice in Wonderland determination that there are such things as judicially determinable “essential” and “nonessential” rules of a made-up game; and its Animal Farm determination that fairness and the ADA mean that everyone gets to play by individualized rules which will assure that no one’s lack of ability (or at least no one’s lack of ability so pronounced that it amounts to a disability) will be a handicap.

The year was 2001, and “everybody was finally equal.” (K. Vonnegut, Harrison Bergeron.)

Read the entire opinion here. (All citations omitted here, except for the last one.)

Daniel Bier

Daniel Bier is the executive editor of The Skeptical Libertarian.

How Calvin Coolidge Responded to a Voter Concerned That Republicans Had Nominated a Black Dentist For Congress

Monday, February 20, 2023

February is when we Americans pause to note Black History Month, Valentine’s Day, and Presidents’ Day. Allow me a little literary license to connect all three: Black Americans ought to love Calvin Coolidge.

For multiple reasons, Americans of every ethnicity should get better acquainted with our 30th president. Coolidge was the last chief executive to balance the budget in every year of his presidency. His administration also cut tax rates dramatically and reduced the national debt substantially. After five and a half years in the White House, he left the federal government smaller than he had found it. He earned the nickname “Silent Cal” because he was a quiet, good listener in social settings. Surprisingly, he also held more press conferences than any other president, with an average of nearly two conferences per week.

In 1924, a man named Charles Gardner wrote to Coolidge to protest the Republican Party’s nomination of a black dentist for a New York congressional seat. Coolidge’s unequivocal reply could have been written by Frederick Douglass or Martin Luther King Jr.:

My dear Sir:

Your letter is received, accompanied by a newspaper clipping which discusses the possibility that a colored man may be the Republican nominee for Congress from one of the New York districts. Referring to this newspaper statement, you say:

'It is of some concern whether a Negro is allowed to run for Congress anywhere, at any time, in any party, in this, a white man’s country. Repeated ignoring of the growing race problem does not excuse us for allowing encroachments. Temporizing with the Negro whether he will or will not vote either a Democratic or a Republican ticket, as evidenced by the recent turnover in Oklahoma, is contemptible.'

Leaving out of consideration the manifest impropriety of the President intruding himself in a local contest for nomination, I was amazed to receive such a letter. During the war 500,000 colored men and boys were called up under the draft, not one of whom sought to evade it. They took their places wherever assigned in defense of the nation of which they are just as truly citizens as are any others. The suggestion of denying any measure of their full political rights to such a great group of our population as the colored people is one which, however it might be received in some other quarters, could not possibly be permitted by one who feels a responsibility for living up to the traditions and maintaining the principles of the Republican Party.

Our Constitution guarantees equal rights to all our citizens, without discrimination on account of race or color. I have taken my oath to support that Constitution. It is the source of your rights and my rights. I propose to regard it, and administer it, as the source of the rights of all the people, whatever their belief or race. A colored man is precisely as much entitled to submit his candidacy in a party primary, as is any other citizen. The decision must be made by the constituents to whom he offers himself, and by nobody else.

You have suggested that in some fashion I should bring influence to bear to prevent the possibility of a colored man being nominated for Congress. In reply, I quote my great predecessor, Theodore Roosevelt:

'I cannot consent to take the position that the door of hope—the door of opportunity—is to be shut upon any man, no matter how worthy, purely upon the grounds of race or color.'

Yours very truly, 

Calvin Coolidge

A flinty New Englander and staunch individualist, Coolidge never possessed a racist bone in his body. Two months before his response to Gardner, black students greeted him warmly when he delivered the commencement address at Howard University.

“Racial hostility, ancient tradition, and social prejudice,” he told them, “are not to be eliminated immediately or easily. But they will be lessened as the colored people by their own efforts and under their own leaders shall prove worthy of the fullest measure of opportunity.”

In a message to Congress just weeks after taking office in August 1923, Coolidge declared that the rights of the country’s black citizens were “as sacred as those of any other” citizens and that it was “both a public and private duty to protect those rights.” 

Coolidge’s predecessor and 1920 running mate, Warren Harding, held similar views, by the way. In October 1921, Harding journeyed to Birmingham, Alabama, to denounce racism in front of an audience of 30,000 — thereby becoming the first American president of the 20th century to openly call for the political equality of blacks. Both presidents, and fellow Republican Herbert Hoover after them, did much to reverse the segregation within the federal government that Democrat President Woodrow Wilson had put in place.

Coolidge strongly pushed for anti-lynching legislation, but it was mostly blocked in Congress by Democrats. Given his personal philosophy as well as his actions as president, it’s inconceivable that Coolidge would praise the pro-Klan film The Birth of a Nation (as Wilson did) or that he would nominate a Klan member to the U.S. Supreme Court (as President Franklin Roosevelt did — namely, Hugo Black).

When black athlete Jesse Owens won four gold medals at the 1936 Olympic Games in Berlin, Roosevelt snubbed him by inviting only the white Olympians to dinner at the White House. Coolidge would never have been so callous.

Originally, the focus of Presidents’ Day was upon Washington and Lincoln, but in recent years it’s become an opportunity to more broadly assess the records of other presidents as well. It’s especially appropriate this year to dust off old Silent Cal, not just because his basic decency, character, and performance deserve widespread appreciation, but also because on Aug. 2, 2023, we will mark the centennial of his becoming president when Harding suddenly died. 

Among the reasons to pay tribute to Coolidge, add this one: On equal rights, he was ahead of his time and much of the country, and way ahead of the other major political party.

This article was originally published on the American Spectator.

The Real Race Revolutionaries: How Minority Entrepreneurship Can Overcome America's Racial and Economic Divides

Sunday, February 19, 2023

Alfredo Ortiz has a message for all of the progressive politicians and activists working to close the economic gap between white and non-white Americans: Please stop.

In his new book The Real Race Revolutionaries (December 2022), Ortiz, a long-time advocate for small business owners and their employees in the US, argues that the government policies that are ostensibly intended to equalize economic outcomes between the white majority and minority groups in America have actually had the opposite effect. Given the growth of government regulation and spending and the relative lack of progress over the last several decades, it’s hard to argue with him.

Ortiz’s argument rests on one especially important insight that will be familiar to fans of market economics but remains stubbornly resistant to being understood by the general public. The laws and regulations that have long been promoted as disciplining businesses and protecting consumers usually end up protecting large established companies and restricting market access for new entrants and outsiders. Those established players are, of course, disproportionately the straight white males we’ve been warned about (i.e., the ones who already control the majority of enterprises and the largest share of wealth in America). Greater government intervention into the productive sector of the economy therefore ends up actually harming minority workers and entrepreneurs on net, almost by definition.

Every new regulation and tax hike subtracts something from the bottom line of existing businesses and makes it harder for new start-ups to come into existence. As I wrote in 2021 regarding new environmental, social, and governance (ESG) rules, stricter standards and more expensive requirements privilege incumbents over new entrants, larger firms over smaller firms, and firms that already have large legal, regulatory compliance, and lobbying departments. Because current regulatory frameworks are already biased in favor of larger firms, heightened burdens will only reinforce that effect. As JPMorgan Chase CEO Jamie Dimon noted in a 2013 interview, the regulatory requirements of the 2010 Dodd-Frank Act – sold to the American public as an anti-Wall Street accountability measure – simply helped him build a “bigger moat” for JPMorgan against its smaller competitors.

More regulation, higher spending, bigger deficits, and higher taxes aren’t the first things most people think of as obstacles to minority success, but they absolutely create economic conditions that disproportionately harm people climbing their way up from the bottom of the socioeconomic ladder. Inflation, which hit a 40-year high in 2022, will easily kill a low-margin restaurant or store that’s just trying to get off the ground. Sound money and stable economic conditions matter more to small business owners with less capital than to large enterprises with more access to financing options and thus room to maneuver in the long term. My Competitive Enterprise Institute colleagues Iain Murray and Ryan Young made this point back in 2016 when they were countering the then-popular work of French economist Thomas Piketty. They pointed out in their paper “The Rising Tide: Answering the Right Questions in the Inequality Debate” that sound monetary policy would do more to help working Americans than most of the interventionist policies championed by progressive critics of capitalism.

With these dynamics in mind, it begins to make more sense that maximizing economic opportunity for minority Americans means lowering barriers to entry to the greatest possible extent. That includes barriers to employment as well as barriers to becoming a new business owner (which, in the case of a business with a single employee, are basically the same thing). Ortiz recently contributed an excellent short article here at FEE, “Occupational Licenses Are Killing Minority Entrepreneurship,” discussing the considerable barriers people face in starting their own businesses.

Mandatory state-level licenses are stopping hard-working Americans from becoming everything from undertakers and hair braiders to florists and interior designers. The current state of occupational licensing even stops people who are already fully licensed in one state from moving and doing the exact same job on the other side of an arbitrary geographical line.

And Ortiz is not alone in going after such restrictions. A wide variety of economists, policy experts, and politicians have now come out in favor of reforming or abolishing occupational licensing requirements altogether (or at least recognizing existing qualifications across state lines). Just to mention a few recent examples, Adam Thierer and Trace Mitchell of the Mercatus Center published “Occupational Licensing Reform and the Right to Earn a Living: A Blueprint for Action” in 2020, Shoshana Weissmann of the R Street Institute wrote last year about licensing and the student loan crisis, former Brookings Institution fellow Ryan Nunn wrote about how to eliminate the anti-competitive effects of occupational licensing in 2019, and the Obama administration started a licensing reform effort back in 2016. The Institute for Justice has also become famous over many years for defending the rights of Americans across the country to work in their preferred professions without government permission. A handful of governors and state legislators, the ones who created this problem in the first place, are now starting to take some of this advice and begin fixing the mess.

But even with incremental wins like these, we are still left with a disconnect: left-wing policy advocates claim to care about minority success, yet they champion policies that would lead to less economic growth and opportunity. In part that’s because they don’t really believe in the possibilities of free markets in the first place. From this progressive – and often, explicitly Marxist – perspective, the only path to fairness for anyone who is not already rich is government control and redistribution rather than new innovation and production. Ortiz presents an old but classic example among black economic thinkers. He contrasts Booker T. Washington, who famously advised his fellow Americans to pursue entrepreneurship, hard work, and self-reliance, with W.E.B. Du Bois, a leading socialist thinker of his day, who insisted that capitalism itself was responsible for racism and thought a future intellectual elite of black leaders should focus on winning the game of political influence instead. Their legacies are similarly divergent: in 1946, Washington was honored on the first US coin to feature a black American, the Booker T. Washington Memorial half dollar. In 1959, Du Bois received the Lenin Peace Prize (formerly the Stalin Peace Prize), an honor bestowed by the Soviet Union.

That divergence of perspectives, of course, continues today. Ortiz points to Ta-Nehisi Coates and Ibram X. Kendi as inheritors of Du Bois’ pro-socialist/anti-individualist legacy. Presumably he would also include the self-proclaimed “trained Marxists” who founded the Black Lives Matter Global Network Foundation. Meanwhile you have a deep bench of 20th and 21st century black intellectual leaders who fit quite well on Team Washington. Whether it’s economists like Thomas Sowell, the late Walter Williams, and Glenn Loury or the Hoover Institution’s Shelby Steele or Columbia University’s John McWhorter, there are plenty of writers, professors, and activists who don’t think that the future prosperity of non-white Americans has to come from the end of a government gun.

For some reason, however, the sort of well-meaning white progressive who has multiple copies of White Fragility on her bookshelf seems to have trouble understanding that.

Much of progressive legal and political theory has notoriously privileged intentions over real-world outcomes. The moral high ground ends up being ceded to people who can emote most persuasively rather than those who have ideas that actually work.

As Thomas Sowell himself put it, “Much of the social history of the Western world over the past three [now more like six] decades has involved replacing what worked with what sounded good.” Ortiz and his book are an example of a man trying to pull Americans and their ostensible leaders back from the false promise of what merely sounds good and guide them to what will actually work.

Let’s hope he’s successful.

Richard Morrison

Richard Morrison is the Senior Editor at the Competitive Enterprise Institute.

What Ayn Rand Understood about Romantic Love That so Many Fail to Grasp

Sunday, February 19, 2023

We recently recognized Valentine’s Day: a holiday dedicated to amorous love.

I spent the evening covering a campus fashion show for the Dartmouth Review all alone, so allow me to share some of my favorite quotes from Ayn Rand on the subject. While Rand is perhaps most renowned for her political and moral philosophy, Objectivism constitutes a full philosophical system that includes a beautiful theory of love.

Many people conceive of love as unconditional and selfless. While this sounds sweet and wholesome prima facie, Rand’s fiction and philosophy reveal why this conception of love is far from the ideal. In Rand’s breakthrough 1943 novel, The Fountainhead, the protagonist (Howard Roark) issues one of the pithiest, most impactful and memorable lines in all of her fiction:

To say ‘I love you’ one must know first how to say the ‘I.’

What does Roark mean by this? Love does not exist in the abstract; it exists as the union between two individuals. For this love to mean something, both individuals involved in the romance, a subspecies of the “happy commerce” of friendship, must have a robust sense of self. That is, each person must possess values independently and demonstrate the requisite virtues to achieve them. Love is not a substitute for self-esteem but a consequence thereof.

Hank Rearden, in Rand’s seminal 1957 tome Atlas Shrugged, expounds upon the role of romantic love in relation to one’s highest values:

[Lovers] can be only travelers you choose to share your journey and must be travelers going on their own power in the same direction.

To lose oneself, i.e., one’s values, virtues, and self-esteem, in the ecstasy of romance is to consign one’s relationship to the same fate to which one has consigned one’s individuality: oblivion. Such is the natural and inexorable consequence of treating love as a substitute for, rather than a complement to, the self.

The following line, uttered by Howard Roark in The Fountainhead, strikes many as antithetical to the widely accepted conception of love as sacrificial:

“I love you, Dominique. As selfishly as the fact that I exist. As selfishly as my lungs breathe air. I breathe for my own necessity, for the fuel of my body, for my survival. I’ve given you, not my sacrifice or my pity, but my ego and my naked need.”

To see the truth of love as selfish, consider the contrapositive: does selflessness imply the absence of love? I believe so. Imagine, if you will, your loved one informing you that they love you selflessly, i.e., that they love you not for their own survival, not out of their ego and naked need, but because they know you need them for your survival and out of your ego and naked need. I predict that you would be aghast at such an admission and would properly regard their feelings toward you as altruistic and well-intentioned, but not as love. It follows, then, that true love is a reflection of the mutual selfish satisfaction both partners derive from each other’s company.

Rand expounded upon her theory of love as selfish in a 1964 interview for a rather unlikely publication: Playboy.

“It is for your own happiness that you need the person you love,” Rand said, “and that is the greatest compliment, the greatest tribute you can pay to that person.”

So next year on February 14, a day dedicated to true love, I encourage you to express to your significant other that you love them “as selfishly as [your] lungs breathe air.”

Jack Nicastro

Jack Nicastro is a fellow with FEE's Henry Hazlitt Project for Educational Journalism. He is a student at Dartmouth College majoring in Economics and minoring in Philosophy. At school, Jack is a senior correspondent for the Dartmouth Review, the president of the Dartmouth Libertarians, and co-chair of the Political Economy Project Student Leadership Council. Outside of school, Jack is a regional coordinator for Students For Liberty.

Follow him on Twitter and Substack.

Two Game Theory Lessons from HBO’s Hit Show ‘Succession’

Saturday, February 18, 2023

HBO’s hit show Succession returns for its fourth season on March 26. The critically acclaimed show is one of the most interesting family dramas on TV, as the Roy family finds themselves in some fascinating situations. As a professor of economics who has taught game theory for more than fifteen years, I am always looking for game theory lessons in contexts my students will know, such as in TV shows, music, or movies.

Game theory is the study of strategic decision-making: it treats serious situations as if they were a game and analyzes the choices players will make when playing that “game.” While the term “game” is included, game theory is used to study serious issues like competition between corporations and how to prevent nuclear war. The Roy family provides plentiful opportunities to showcase game theory concepts. I will share two examples below.

In what might be the most obvious warning ever—there are many spoilers ahead.

The Prisoner’s Dilemma in Action

First, Succession shows us the prisoner’s dilemma, and shows it is alive and well. Three Roy children—Kendall, Roman, and Shiv—are considered at one time or another to succeed their father, Logan, as CEO. In conversations with their father, all of them said quite negative things about the competency of their siblings to handle the role of CEO. By doing so, all of them stand little chance of being chosen—say a 5 percent chance each. However, if they’d all chosen to say kind things about their siblings, it’s likely one of the three would have been chosen, so each one’s probability would have been at or close to one in three, or 33 percent.

So why didn’t they cooperate and say nice things about each other? It turns out they were in a prisoner’s dilemma-style game. A prisoner’s dilemma is a game in which the players all get a higher payout if they cooperate, but each has an individual incentive not to cooperate; therefore, they all end up with a lower payout. That is exactly what happens here.

Let’s think about the situation more closely. If one of the siblings, say Kendall, was the only one to speak poorly about his siblings, the others would have had negative things said about themselves while Kendall would have been universally praised. The other siblings would have had no chance, while Kendall would have been an overwhelming favorite to succeed his father as CEO. But since all three siblings have this same incentive, they all start to speak poorly about each other.
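This incentive structure can be sketched as a toy two-player version of the siblings’ game. The payoffs below (rough “chance of becoming CEO” percentages) are hypothetical numbers chosen only to match the incentives described above, not anything from the show:

```python
from itertools import product

# Hypothetical payoffs (chance of becoming CEO, in percent) for two siblings.
# Each independently chooses to "praise" or "trash" the other.
payoffs = {
    ("praise", "praise"): (33, 33),
    ("praise", "trash"):  (0, 60),
    ("trash", "praise"):  (60, 0),
    ("trash", "trash"):   (5, 5),
}

def best_response(options, other_action, player):
    """Return the player's payoff-maximizing action, given the other's action."""
    if player == 0:
        return max(options, key=lambda a: payoffs[(a, other_action)][0])
    return max(options, key=lambda a: payoffs[(other_action, a)][1])

actions = ("praise", "trash")

# A pair of actions is a Nash equilibrium if each action is a best
# response to the other player's action.
equilibria = [
    (a, b) for a, b in product(actions, actions)
    if best_response(actions, b, 0) == a and best_response(actions, a, 1) == b
]
print(equilibria)  # prints [('trash', 'trash')]
```

Each sibling’s best response is to “trash” no matter what the other does, so the only equilibrium is mutual trashing, even though mutual praise would pay both more.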

This is the most commonly taught game in game theory and economics, and another context where you often see it is competition between firms. If two firms share a market, both might do better if each keeps its prices a bit higher. However, if only one firm lowers its prices, it captures far more business. Because both firms know this, you usually see them end up with lower prices, engaging in price wars, when keeping prices high would have made them both better off.

While this might be bad for firms, it is great for consumers; a competitive market without government interference tends to give consumers quality products at lower prices.

Bargain From a Position of Strength

A second game theory lesson from Succession is about bargaining theory. When bargaining, it is helpful to have strong alternatives. In game theory, the acronym is BATNA, which stands for the Best Alternative To a Negotiated Agreement. Game theory analysis shows that the better your BATNA, the better the outcome you should be able to get when bargaining with someone. Likewise, when your opponent’s BATNA is weaker, you should get a better outcome. Therefore, if you can strengthen your BATNA or weaken your opponent’s, you’ll get a better outcome from bargaining.
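This BATNA logic can be made concrete with one standard textbook model, the symmetric Nash bargaining solution, in which each side receives its outside option (BATNA) plus half of the remaining surplus. The numbers below are made up purely for illustration:

```python
def nash_bargain(total, batna_a, batna_b):
    """Symmetric Nash bargaining: each side gets its BATNA plus half the surplus.

    total: value of the deal being divided
    batna_a, batna_b: what each side gets by walking away
    """
    surplus = total - batna_a - batna_b
    if surplus < 0:
        return None  # no deal beats walking away
    return (batna_a + surplus / 2, batna_b + surplus / 2)

# A deal worth 100 to split. With equal outside options, the split is even.
print(nash_bargain(100, 20, 20))  # prints (50.0, 50.0)

# Worsen side A's BATNA and A's share falls, even though the pie is unchanged.
print(nash_bargain(100, 0, 20))   # prints (40.0, 60.0)
```

Weakening one side’s outside option shifts the split against that side, which is exactly what happens to Kendall in the situation described below.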

Bargaining plays a central role in the action of Succession, but we’ll focus on one situation at the end of Season 1. Kendall is planning a hostile takeover of his father’s company. During a wedding party, however, Kendall takes a ride with Andrew Dodds. Both are on drugs; Kendall drives, gets into an auto accident, and Andrew dies. Kendall escapes believing that nobody knows he was at the scene. But when someone goes missing, naturally, people get concerned, and eventually Andrew’s body is discovered. Nobody seems to know that Kendall was there, except for Logan (his father) and his team, who find out, and the situation shifts.

Kendall’s BATNA is now far worse, as there is a threat of legal action based on this accident should his father or his team reveal Kendall’s involvement. Because of that, Kendall accepts a bargained position far worse than the one he would have accepted prior to the accident.

In a much less dramatic real-world example, I counsel individuals that the best time to bargain over a job offer is when they are still employed (at another job). An unemployed person has a far worse BATNA than somebody looking to change jobs. For students looking for their first job, having multiple job offers gives them more bargaining leverage than a person with no other offers.

HBO’s Succession is entertaining and well-written. It also has many strategic interactions which give us a good opportunity to learn some game theory. And with economics and entertainment combined, what could be better?

Matthew Rousu

Matthew Rousu is Dean of the Sigmund Weis School of Business and a Professor of Economics at Susquehanna University.

The History of Slavery You Probably Weren't Taught in School

Saturday, February 18, 2023

In “Recognizing Hard Truths About America’s History With Slavery,” published by FEE on February 11, 2023, I urged an assessment of slavery that includes its full “historical and cultural contexts” and that does not neglect “uncomfortable facts that too often are swept under the rug.”

The central notion of both that previous essay and this follow-up is that slavery was a global norm for centuries, not a peculiar American institution. America is not exceptional because of slavery in our past; we may, however, be exceptional because of the lengths to which we went to get rid of it. In any event, it is an age-old tragedy abolished in most places only recently (in the past two centuries or so). As British historian Dan Jones notes in Powers and Thrones: A New History of the Middle Ages,

Slavery was a fact of life throughout the ancient world. Slaves—people defined as property, forced to work, stripped of their rights, and socially ‘dead’—could be found in every significant realm of the age. In China, the Qin, Han, and Xin dynasties enforced various forms of slavery; so too did ancient rulers of Egypt, Assyria, Babylonia, and India.

Milton Meltzer’s Slavery: A World History is both comprehensive and riveting in its presentation. He too recognizes the ubiquity of human bondage:

The institution of slavery was universal throughout much of history. It was a tradition everyone grew up with. It seemed essential to the social and economic life of the community, and man’s conscience was seldom troubled by it. Both master and slave looked upon it as inevitable…A slave might be of any color—white, black, brown, yellow. The physical differences did not matter. Warriors, pirates, and slave dealers were not concerned with the color of a man’s skin or the shape of his nose.

The indigenous populations of both North and South America, pre-European settlement, also practiced slavery. Meltzer writes,

The Aztecs also made certain crimes punishable by enslavement. An offender against the state—a traitor, say—was auctioned off into slavery, with the proceeds going into the state treasury…Among the Mayans, a man could sell himself or his children into slavery…The comparatively rich Nootkas of Cape Flattery (in what is now northwestern Washington state) were notorious promoters of slaving. They spurred Vancouver tribes to attack one another so that they could buy the survivors.

Perhaps because it conflicts with race-based political agendas, slavery of Africans by fellow Africans is one of those uncomfortable truths that often flies under the radar. Likewise, industrial-scale slavery of Africans by nearby Arabs as well as Arab slavery of Europeans are historical facts that are frequently ignored. Both subjects are explored in The Forgotten Slave Trade: The White European Slaves of Islam by Simon Webb and Slavery and Slaving in African History by Sean Stilwell.

Slavery cannot be justified or excused by enlightened people, but it can be studied, explained, put in context, and understood—if all the facts of it are in the equation. It’s a painful topic, to be sure, which is even more reason to leave nothing out and to prevent political agendas from getting in the way.

The widespread sin of “presentism” poisons our understanding of such hot-button topics as slavery. As I wrote in August 2020,

Terms for this way of looking at the past range from intertemporal bigotry to chronological snobbery to cultural bias to historical quackery. The more clinical label is “presentism.” It’s a fallacious perspective that distorts historical realities by removing them from their context. In sports, we call it “Monday morning quarterbacking.”

Presentism is fraught with arrogance. It presumes that present-day attitudes didn’t evolve from earlier ones but popped fully formed from nowhere into our superior heads. To a presentist, our forebears constantly fail to measure up so they must be disdained or expunged. As one writer put it, “They feel that their light will shine brighter if they blow out the candles of others.”

Our ancestors were each a part of the era in which they lived, not ours. History should be something we learn from, not run from; if we analyze it through a presentist prism, we will miss much of the nuanced milieu in which our ancestors thought and acted.

Watch the 8-minute video Facts About Slavery Never Mentioned in School, and you might ask, “Why didn’t I hear this before?”

The answer may simply be that the facts it lays out are politically incorrect, which means they are inconvenient for the conventional wisdom. They don’t fit the “presentist” narrative.

What I personally find most fascinating about slavery is the emergence in recent centuries of ideas that would transform the world’s view of it from acceptance to rejection. Eighteenth-century Enlightenment ideals that questioned authority and sought to elevate human rights, liberty, happiness, and toleration played a role. So did a Christian reawakening in the late 18th and early 19th centuries that produced abolitionists such as William Wilberforce.

The Declaration of Independence pricked the consciences of millions who came to understand that its stirring words were at odds with the reality many black Americans experienced on a daily basis. And as capitalism and free markets spread in the 19th Century, slavery faced a competition with free labor that it ultimately could not win. Exploring the potency of those important—indeed, radical—forces would seem to me to be more fruitful and less divisive than playing the race card, cherry-picking evidence to support political agendas, or promoting perpetual victimhood.

The prolific economist and historian Thomas Sowell has written about slavery in many of his voluminous articles and books. To write Conquests and Cultures: An International History, he devoted fifteen years to research and travel (around the world twice, no less). Though the book is about much more than slavery, the author reveals a great deal about the institution that few people know.

I close out this essay with excerpts from this Sowell classic, and I strongly urge interested readers to check out the suggestions for additional information, below:


Inland tribes [in Africa] such as the Ibo were regularly raided by their more powerful coastal neighbors and the captives led away to be sold as slaves. European merchants who came to buy slaves in West Africa were confined by rulers in these countries to a few coastal ports, where Africans could bring slaves and trade as a cartel, in order to get higher prices. Hundreds of miles farther south, in the Portuguese colony of Angola, hundreds of thousands of Africans likewise carried out the initial captures, enslavement and slave-trading processes, funneling the slaves into the major marketplaces, where the Portuguese took charge of them and shipped them off to Brazil. Most of the slaves shipped across the Atlantic were purchased, rather than captured, by Europeans. Arabs, however, captured their own slaves and penetrated far deeper into Africa than Europeans dared venture….


Over the centuries, untold millions of human beings from sub-Saharan Africa were transported in captivity to other parts of the world. No exact statistics exist covering all the sources and all the destinations, and scholarly estimates vary. However, over the centuries, somewhere in the neighborhood of 11 million people were shipped across the Atlantic as slaves, and another 14 million African slaves were sent to the Islamic nations of the Middle East and North Africa. On both routes, many died in transit.


The horrors of the Atlantic voyage in packed and suffocating slave ships, together with exposure to new diseases from Europeans and other African tribes, as well as the general dangers of the Atlantic crossing in that era, took a toll in lives amounting to about 10 percent of all slaves shipped to the Western Hemisphere in British vessels in the eighteenth century—the British being the leading slave traders of that era. However, the death toll among slaves imported by the Islamic countries, many of these slaves being forced to walk across the vast, burning sands of the Sahara, was twice as high. Thousands of human skeletons were strewn along one Saharan slave route alone—mostly the skeletons of young women and girls…In 1849, a letter from an Ottoman official referred to 1,600 black slaves dying of thirst on their way to Libya.


The prime destination of the African slave trade to the Islamic world was Istanbul, capital of the Ottoman Empire, where the largest and busiest slave market flourished. There women were paraded, examined, questioned, and bid on in a public display often witnessed by visiting foreigners, until it was finally closed down in 1847 and the slave trade in Istanbul moved underground. In other Islamic countries, however, the slave markets remained open and public, both to natives and foreigners…This market functioned until 1873, when two British cruisers appeared off shore, followed by an ultimatum from Britain that the Zanzibar slave trade must cease or the island would face a full naval blockade.


From as early as the seventeenth century, most Negroes in the American colonies were born on American soil. This was the only plantation society in the Western Hemisphere in which the African population consistently maintained its numbers without continual, large-scale importations of slaves from Africa, and in which this population grew by natural increase. By contrast, Brazil over the centuries imported six times as many slaves as the United States, even though the U.S. had a larger resident slave population than Brazil—36 percent of all the slaves in the Western Hemisphere, as compared to 31 percent for Brazil. Even such Caribbean islands as Haiti, Jamaica and Cuba each imported more slaves than the United States.

For Additional Information, See:

You Can Never Again Say You Did Not Know by Lawrence W. Reed 

Uniquely Bad—But Not Uniquely American by Kay S. Hymowitz

Recognizing Hard Truths About America’s History With Slavery by Lawrence W. Reed

No, Slavery Did Not Make America Rich by Corey Iacono

Shackles of Iron: Slavery Beyond the Atlantic by Stewart Gordon

Slavery: A World History by Milton Meltzer

100 Amazing Facts About the Negro by Henry Louis Gates, Jr.

Conquests and Cultures: An International History by Thomas Sowell

Thomas Sowell on Slavery and This Fact: There are More Slaves Today Than Were Seized from Africa in Four Centuries by Mark J. Perry

Facts About Slavery Never Mentioned in School by Thomas Sowell (video)


This work is licensed under a Creative Commons Attribution 4.0 International License, except for material where copyright is reserved by a party other than FEE.
