It’s no secret why conservatives are lending financial and political support to the Asian-Americans suing Harvard for discrimination in admissions. They want to kill affirmative action and replace it with a “race-blind” system.
Spare us. If you want to destroy discrimination in college admissions, underrepresented minorities are small fry. Instead, the biggest favors are showered on the children of alumni, who are five times more likely to gain admission than those without a Harvard bloodline. Indeed, at a trial in federal court in Boston this week, the plaintiffs who are accusing the elite college of discrimination suggested that abolishing so-called legacy preferences could be a way to widen the applicant pool and keep the student body diverse, even without affirmative action.
“Legacy students” now make up almost a third of the incoming class at Harvard, with comparable numbers at other elite universities. How that came to pass is a strange story that raises profound questions about the function and future of higher-education admissions.
The first school to grapple with the problem of legacy students in the 19th century was the U.S. Military Academy. Founded in 1802, it swiftly grew, but capped the size of incoming classes, making admission increasingly selective.
In 1818, Congress debated a bill that would have given the sons of veterans killed in the War of 1812 preferential treatment in admissions to West Point. It elicited intense opposition. One congressman declared that it would “create a privileged order in the country,” while another warned that such a policy would thwart the academy’s mission to select only “the most fit and most worthy.” The bill died.
Nonetheless, fears that admissions might be rigged dogged the institution. In 1841, Alden Partridge, the former superintendent, warned that West Point was creating an aristocracy that “has already become, in a great degree, hereditary.”
In 1843, Congress stipulated that every congressional and territorial district could send one student to West Point. The applicants had to be nominated by members of Congress, and in succeeding decades, many politicians began administering competitive exams to potential beneficiaries, making admissions far more selective and meritocratic.
By contrast, private colleges and universities did not confront the problem of legacy students because anyone who could meet certain standards — mastery of Greek and Latin, among other requirements — gained admission. Like a literacy test for voting, this ensured that non-elites almost never applied, effectively guaranteeing that the children of alumni would have a place. This was particularly true because most schools did not cap the size of entering classes.
Legacy admissions began, ironically enough, out of efforts to make Harvard more inclusive. In the late 19th century, the university’s patrician president, Charles W. Eliot, began to broaden admissions beyond the pool of elite prep schools that supplied most of each year’s incoming class.
As the historian Jerome Karabel has noted, Eliot abolished the Greek requirement; he would later suspend the Latin requirement, too, under certain conditions. Soon, Harvard started to admit increasing numbers of boys from public schools, who were allowed to compete for a growing number of scholarships that paid their tuition.
These efforts to raise standards of admission — to admit the best and the brightest rather than the “stupid sons of the rich,” as Eliot pungently put it — succeeded. Harvard became more inclusive, enrolling a growing number of talented students from a wide range of backgrounds.
But these efforts, eventually emulated by other private colleges and universities, had an unanticipated effect. Increasingly, Jewish public-school students aced the exams and swept the scholarships, becoming an ever more visible presence on campus.
The Protestant elites who ran these schools wailed about the so-called Jewish problem, or what some called the “Hebrew invasion.” They instituted quotas on the number of Jewish students admitted from certain schools, but this did not lower the number of Jewish students; it simply shifted their geographic distribution.
Nor did it assuage the alumni. One graduate of Harvard reported returning a quarter century after graduation to find “Jews to the right of me, Jews to the left of me,” adding pointedly that “not one of these appeared to be of the same class as the few Jews that were in college in my day but distinctly of the class usually denominated ‘Kikes.’”
Sadly, he was hardly alone in his prejudice. As growing numbers of alumni threatened to send their precious sons elsewhere, Harvard abandoned admissions on scholarship alone, substituting a far more subjective process that evaluated personality traits and athletic ability. At the same time, the college instituted selective admission. It was no longer enough to ace an entrance exam; you had to have what prep school kids schooled in French would have described as a certain je ne sais quoi.
This move went hand in hand with an implicit or explicit policy of favoring the children of alumni. In 1925, for example, the Yale Board of Admissions voted that the new “limitation of numbers shall not operate to exclude any son of a Yale graduate who has satisfied all the requirements for admission.” A few years later, it codified this policy still further, requiring non-legacy applicants to score higher on entrance exams.
Here and elsewhere, legacy students began to supplant Jewish students, a pattern that held into the postwar era. In 1949, for example, Wilbur Bender, the head of Harvard admissions, simply said that “we do discriminate in our admissions policy … and I hope we always will.”
And discriminate they did at all the elite universities, giving special preference to legacy students while forcing everyone else to vie for the remaining seats. But calls for a more inclusive student body from the 1960s onward prompted some universities to roll back the number of legacies admitted in order to build a more diverse student body.
These policies, particularly those instituted at Princeton and Yale, sparked a bitter backlash among prominent alumni. William F. Buckley led the charge at Yale, mourning that the university had ceased to be “the kind of place where your family goes for generations.”
Buckley particularly bemoaned that a “Mexican-American from El Paso High School with identical scores on the achievement tests and identically ardent recommendations from their headmasters, had a better chance of being admitted to Yale than Jonathan Edwards the Sixteenth from St. Paul’s School.”
Faced with a growing alumni rebellion — and unlike West Point, very much dependent on tuition dollars and donations — Yale and other universities backed down from attempts to roll back legacy admissions. Private universities would not be purely meritocratic institutions the way that the military academy became.
Instead, they adopted the strategy that remains in place today: Reserve a quarter to a third of seats for legacy students, with the remaining seats reserved for those who help achieve the kind of diversity and eclecticism that may be missing among alumni children.
These competing imperatives — admit enough legacy students to keep the alumni happy; admit enough non-traditional students to make a reasonable claim to being representative — will always be at war.
But it’s worth recalling that neither constituency existed before the first attempts to democratize these otherwise elitist institutions. Discrimination and inclusion have a shared history: they emerged almost simultaneously.
Lawsuits notwithstanding, they’re likely to remain twinned for the foreseeable future.
(Bloomberg)