
4 Ways College Admissions Committees Stack the Deck in Favor of Already Privileged Applicants

It's time to stop stigmatizing affirmative action as an "unfair advantage" for historically underrepresented groups.
 
 

Photo Credit: Joy Brown via Shutterstock.com

 

Affirmative action has been the subject of much media debate recently, as the U.S. Supreme Court began hearing oral arguments on October 10 in the controversial Fisher v. University of Texas “reverse racism” case. The plaintiff, Abigail Fisher, alleges that she was denied admission to the university as a result of affirmative action policies that put her at a disadvantage because she is white.

Contemporary debates about affirmative action policies that take race into account tend to presume that, in our post-Civil Rights era, the U.S. is a pure meritocracy that rewards the best and brightest. But this isn’t quite true. Affirmative action is used to offset other arbitrary identifiers that admissions committees are allowed to consider, many of which further enshrine existing social hierarchies according to race and class. Here are just four criteria admissions committees are allowed to consider that reward already privileged students.

1. Legacy Admissions

Six years ago, I had a conversation with a Canadian professor who had earned his PhD at an Ivy League American university. He told me he could never understand the U.S. tolerance for legacy admissions (in which a student with a family member who attended a given university would be privileged in that school’s admissions process) and suggested that legacy admissions “amounts to affirmative action for the rich.” Another Canadian professor who had earned her PhD at an elite American university overheard our conversation and chimed in. She recalled teaching numerous legacy students whose academic performances she found substandard at best.

Former President George W. Bush’s mediocre academic record at Yale is one of the more famous examples of this phenomenon, one that is a particularly big problem at Ivy League schools and other elite institutions. In their 2011 book, Higher Education? How Colleges Are Wasting Our Money and Failing Our Kids – and What We Can Do About It, Andrew Hacker and Claudia Dreifus note that each legacy applicant to Brown University has the word “legacy” written in the top corner of his or her file. It isn’t clear precisely what effect this has on an applicant’s chances; the Brown Alumni Association, whose Web site instructs viewers to contact it “for a discussion of Brown legacy statistics,” tells AlterNet that Brown does not publish information about legacy admissions. But we know that legacy status matters overall.

To wit: according to the Chronicle of Higher Education, a 2011 Harvard study showed that, “all other things being equal, legacy applications got a 23.3-percentage-point increase in their probability of admission” to 30 elite universities. And students with at least one parent who attended one of these competitive institutions as an undergraduate – called “primary legacies” – had a staggering 45.1-percentage-point admissions advantage. In other words, a non-legacy applicant with, say, a 10 percent chance of admission would have a 33.3 percent chance of admission as a legacy student and a 55.1 percent chance of being admitted as a primary legacy.
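For readers who want to check the arithmetic, the percentage-point bumps work by simple addition, not multiplication. A minimal sketch (the 23.3 and 45.1 figures are the Harvard study's estimates; the 10 percent base rate and the function name are hypothetical, and results are capped at 100 percent):

```python
# Percentage-point advantages reported by the 2011 Harvard study
LEGACY_BUMP = 23.3          # any alumni relative
PRIMARY_LEGACY_BUMP = 45.1  # a parent who attended as an undergraduate

def admission_chance(base_percent, bump):
    """Add a percentage-point advantage to a base admission rate,
    capping the result at 100 percent."""
    return min(base_percent + bump, 100.0)

base = 10.0  # a hypothetical non-legacy applicant's chance of admission
print(round(admission_chance(base, LEGACY_BUMP), 1))          # 33.3
print(round(admission_chance(base, PRIMARY_LEGACY_BUMP), 1))  # 55.1
```

Note that these are percentage points, not percent: a 45.1-point bump more than quintuples a 10 percent applicant's odds, which is why the advantage is so staggering.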

Why do schools do this? Admissions departments are reluctant to discuss the practice, but as economist Peter Sacks told the New York Times, “Elite institutions have an implicit bargain with their alumni…You give us money, and we will move your kids to the front of the line.”

The inequalities perpetuated by legacy admissions are shocking on their own, but all the more so when you take into account how little consideration is given to poor students, in light of the academic and other difficulties they may have faced. In his book, Privilege: The Making of an Adolescent Elite at St. Paul’s School, Shamus Rahman Khan writes that while the legacy children of elites receive special consideration, “poorer students are afforded no such luxury.” He explains, “Though poor students experience a host of disadvantages – from lower-quality schools to difficult access to out-of-school enrichment programs to the absence of support when they struggle – colleges are largely blind to such struggles, treating poorer students as if they were the same as rich ones.” Except, that is, when those wealthier students happen to be legacies.

2. SAT Scores

It’s by now a truism that the Scholastic Aptitude Test is a racist tool for college admissions. It’s also the first thing most admissions committees see on a student’s application.

The test’s racist reputation extends to its origins. Its creator, psychologist Carl C. Brigham, was a well-known eugenicist; his 1923 Army Alpha Test, first adapted as the Scholastic Aptitude Test for Harvard admissions in 1934, culminated in a racial breakdown of scores. Brigham felt that American education was in a downward spiral and argued that deterioration in American intelligence would “proceed with an accelerating rate as the racial mixture becomes more and more extensive.”

Of course, standardized testing did not have only reactionary proponents in its early incarnations. It was first taken up by universities to create a more meritocratic – and less legacy-based – admission system. But it hasn’t shaken out in such a meritocratic way. For example, students from relatively well-off families can take expensive courses through private companies like Kaplan and the Princeton Review to learn tricks for boosting their SAT scores. Kaplan’s most popular 18-hour course costs $599. At Princeton Review, courses range from $299 to $1999.

Studies still find that SAT scores discriminate against minorities and women – and do a poor job of forecasting future student performance. According to the non-profit advocacy group Fair Test, women score an average of 35-40 points lower than men on the SAT, despite earning consistently higher first-year grades once enrolled in college. For non-native English speakers, test scores are about 91 points lower despite first-year grades equivalent to those of white native English speakers.

Finally, according to Fair Test’s Web site, “The ability of SAT I scores to predict freshman grades, undergraduate class rank, college graduation rates, and attainment of a graduate degree is weaker for African-American students than for whites.” The SAT has become such a rite of passage in US culture that its biases are rarely discussed. But, in fact, it disproportionately favors white male students, while putting equally deserving female students and students of color at a comparative disadvantage.

3. Parental Income

When I was applying to colleges from my home state of North Carolina in 1997, many of my peers chose not to apply to Duke University, believing that their parents’ middle-class incomes would be found wanting and prevent them from being admitted, since Duke was not, at the time, a “need-blind” university. In those days, students were widely under the impression that they were required to list parental income and assets directly on the Duke application. Duke’s dean of undergraduate admissions, Christoph Guttentag, tells AlterNet that any such policy was gone by at least 1992, when his tenure began. But the recent history of “need-aware” admissions at Duke helped create today’s deeply entrenched belief in urban North Carolina communities that Duke is a school for the children of wealthy parents.

Of course, any U.S. citizen who requires financial aid to attend college completes a Free Application for Federal Student Aid (FAFSA) that includes similar information about parental income. When a college has a need-aware policy, students from relatively well-off families – that is, families who may be able to pay close to full tuition – could be privileged when it comes to the admissions game. In order to become need-blind (and forgo weighing a student’s ability to pay for her or his education), a school needs a way of funding the students it admits. This means need-blind policies will always be at risk during difficult economic times, and students in need will always be vulnerable to economic forces beyond their control.

Duke and many other research institutions have long since switched to a need-blind admissions process, though not because of any legal mandate. And many colleges, especially small liberal arts schools, have started talking about ending need-blind admissions because their endowments have not fully recovered from the 2008 economic downturn. According to an October 30 Inside Higher Ed report, the colleges considering the change include “wealthy institutions like Grinnell College and not-so-wealthy institutions like Albright College, in Pennsylvania.” And the prestigious Wesleyan University has already abandoned its need-blind policy because, according to University President Michael Roth, it was simply too expensive.

4. Criminal Background Check

Since at least the mid-1990s, college applications have commonly asked whether applicants have ever been convicted of a felony. A felony conviction was – and is – considered a legitimate reason for discrimination against an applicant. Most high school seniors applying to college still face this routine question on their applications, but for some, actual criminal background checks have become much more commonplace.

In 2007, after a Virginia Tech student opened fire on students and professors on campus before killing himself, many in the public and the media asked why Virginia Tech hadn’t routinely subjected its applicants to criminal background checks. Since then, according to University Business, criminal background checks have been on the rise. That same year, the University of North Carolina system started ordering background checks on certain students “because they had unexplained gaps in their applications or admitted involvement in a crime.” The practice has been taken up only slowly, however, as universities fear being accused of profiling potential students. Thus far, the checks have become most prevalent among students entering vocational fields like teaching, pharmacy, nursing or physical therapy, in which students must work with minors or vulnerable patients.

But the practice is so new that its legal implications have yet to be hashed out, and there is no current mechanism for distinguishing non-violent felonies like marijuana possession from more serious crimes like armed burglary or assault. It’s already difficult for students with minor adolescent drug convictions to attend college – under current federal law, they are not eligible for student loans. And because black youth are imprisoned for drug offenses at a rate nearly 10 times that of white youth, and black and Hispanic youth account for about 70 percent of all youth arrests, this policy disproportionately penalizes youth of color, many of them poor. Furthermore, it means that one mistake in early adolescence can permanently destroy a student’s chances of attending college unless they earn a full scholarship or their parents can afford to pay for their education.

Clearly, there are many arbitrary factors that go into a college admissions decision -- some of which benefit already-privileged applicants, and others that attempt to correct for deep historical imbalances. It’s long past time to stop stigmatizing affirmative action, and look instead at the various ways the system still unfairly privileges well-off students, while continuing to perpetuate the inequalities affirmative action is supposed to help eradicate.

Kristin Rawls is a freelance writer whose work has also appeared in the Christian Science Monitor, GOOD Magazine, Religion Dispatches, Killing the Buddha, Global Comment and elsewhere online.