On Race and Academia

The Supreme Court outlawed the use of race-based affirmative action in college admissions.

That practice was understandable and even necessary 60 years ago. The question I have asked for some time is precisely how long it would need to continue. I’d personally come to believe that preferences focused on socioeconomic factors — wealth, income, even neighborhood — would accomplish more good while entailing less outright unfairness.

But many good-faith people believed, and continue to believe, that it is a clear boon to society for universities to explicitly take race into account. The arguments for and against have been made often, sometimes by me, so here I’d like to do something a little bit different. As an academic who is also Black, I have seen up close, over decades, what it means to take race into account. I talked about some of these experiences in interviews and in a book I wrote in 2000, but I’ve never shared them in an article like this one. The responses I’ve seen to the Supreme Court’s decision move me to venture it.

The culture that a policy helps put into place can be as important as the policy itself. And in my lifetime, racial preferences in academia — not merely when it comes to undergraduate admissions but also moving on to grad school and job applications and teaching careers — have been not only a set of formal and informal policies but also the grounds for a culture of perceptions and assumptions.

I grew up upper-middle-class in Philadelphia in the 1980s. As early as high school, I picked up — from remarks by my mother, who taught at a university, as well as comments in the air at my school — that Black kids didn’t have to achieve perfect grades and test scores in order to be accepted at top colleges. As a direct result, I satisfied myself with being an A- or B+ student, pursuing my nerdy hobbies instead of seeking the academic mountaintop. I was pretty sure it wouldn’t affect my future the way it might for my white peers.

I have no reason to think affirmative action played much of a role in the colleges I went to, as neither was extremely selective at the time. In grad school, I was told by a mentor, a Black man, that race had been the reason I wound up in the top-20 pile of applicants in linguistics in the department where I got my Ph.D. I had minimal experience with linguistics proper, and my G.P.A. was very good but nowhere near perfect. (Those hobbies!) But I have always thought of that as racial preferences the way they should have been, merely additive around the margins. I’d done well on tests like the G.R.E., my grades in language courses were top level, and I had written a senior thesis that made it clear I had a linguistics frame of mind.

But things were different later. When I was a grad student in linguistics going on the job market, I was told that I needn’t worry about whether I would get bids for tenure-track positions because I was Black and would therefore be in great demand. Deep down, it felt as if I was on my way to being tokenized, which I was, especially given that my academic chops at the time did not justify my being hired for a top job at all.

I was hired straight out of my doctoral program for a tenure-track job at an Ivy League university in its august linguistics department. It became increasingly clear to me that my skin color was not just one more thing taken into account but the main reason for my hire. It surely didn’t hurt that, owing to the color of my skin, I could apparently be paid with special funds I was told the university had set aside for minority hires. But more to the point, I was vastly less qualified by any standard than the other three people who made it onto the list of finalists. Plus, I was brought on to represent a subfield within linguistics — sociolinguistics — that has never been my actual specialty. My interest then, as now, was in how languages change over time and what happens when they come together. My dissertation had made this quite clear.

At the time I was not very politicized, and I assumed that my race had merely been a background bonus to help me get hired. Only later did the reality become more apparent, when I learned just who else had been on that shortlist. (I will never forget how awkward it was when I met one of them — older than me, with more gravitas in the field — some years later. I sensed that we both knew what had happened and why.) I had been hired by white people who, quite innocently, thought they were doing the right thing by bringing a Black person onto the faculty. I bear them no malice; under the culture we were all living in, I would have done the same thing.

Around this time I gave some really good talks and some just OK ones; I always knew the difference. But I couldn’t help noticing that I would get high praise even for the mediocre ones, from white people who were clearly gratified to acknowledge a Black academic. And in the meantime, I was hopelessly undercooked for the position I had been hired for. I was not utterly clueless, but I simply didn’t know enough yet — and especially not enough to be in a position to counsel graduate students.


I needed some years of postdoctoral study. They say you don’t really know it till you teach it, and that’s largely true: Having never actually taught a class, I needed to teach some. I needed to hang around linguistics for a longer time in general. There are formative experiences key to being a real linguist that I had not yet had, such as long-term work with speakers of my language of focus, Saramaccan.

The doctoral program I had been in had gone through a phase of allowing students perhaps too much leeway in deciding which courses to take. Many students took this as an occasion to sit at the feet of their mentors and drink in what they knew. But my natural orientation has always been autodidactic, and so I basically went off into a corner and focused like a laser on one issue that particularly interested me — how creole languages form — while developing only a passing acquaintance with linguistics beyond it. With undergrads, I could coast on stage presence, but grad students know the real thing when they see it and when they don’t. I looked like a fool.

I didn’t like it. But because I am obsessive, I ultimately dedicated myself to boning up and then some. I read and read and read. I spoke closely with as many linguists as I could. I took up new interests within the field. I did intense study of my language of focus. I taught classes outside my comfort zone. That is, I became a normal academic.

But it all felt like a self-rescue operation, an effort to turn myself into a good hire after the fact. That backfilling of needed skills is a lot to ask of someone who also needs to do the forward-looking research necessary to get tenure.

Of course, not everyone undertakes this Sisyphean task, and the culture I refer to has a way of ensuring others don’t have to. There is a widespread cultural assumption in academia that Black people are valuable as much for our sheer presence as for the rigor of what we actually do, if not more so. Thus, it is deemed unnecessary to subject us to top-level standards. This leads, too often, to things that are never written as explicit directives but are consonant with the general cultural agenda: people granted tenure with nothing approaching the publishing records of other candidates, or celebrated more for their sociopolitical orientations than for their research.


I had uncomfortable experiences on the other side of the process as well. In the 1990s, I was on some graduate admissions committees at the university where I then taught. It was apparent to me that, under the cultural directive I have described to take race into account, Black and Latino applicants were expected to be accepted much more readily than others.

I recall two Black applicants we admitted who, in retrospect, puzzle me a bit. One had, like me, grown up middle-class rather than disadvantaged in any salient way. The other, also relatively well-off, had grown up in a different country, entirely separate from the Black American experience. Neither of them expressed interest in studying a race-related subject, and neither went on to do so. I had a hard time detecting how either of them would teach a meaningful lesson in diversity to their peers in the graduate program.

Perhaps all of this can be seen as collateral damage in view of a larger goal of Black people being included, acknowledged, given a chance — in academia and elsewhere. In the grand scheme of things, my feeling uncomfortable on a graduate admissions committee for a few years during the Clinton administration hardly qualifies as a national tragedy. But I will never shake the sentiment I felt on those committees, an unintended byproduct of what we could call academia’s racial preference culture: that it is somehow ungracious to expect as much of Black students — and future teachers — as we do of others.

That kind of assumption has been institutionalized within academic culture for a long time. It is, in my view, improper. It may have been a necessary compromise for a time, but it was never truly proper in terms of justice, stability or general social acceptance. Whatever impact the Supreme Court’s ruling has on college admissions, its effects on the academic culture of racial preference — which by its nature often depends less on formulas involving thousands of applicants than on individual decisions involving dozens — will take place far more slowly.

But the decision to stop taking race into account in admissions, assuming it is accompanied by other efforts to assist the truly disadvantaged, is, I believe, the right one to make.