Should Youth Be Tried as Adults?


There are some very strict laws in the US surrounding age. You have to be 21 to drink, 18 to buy tobacco, 16 to obtain a driver’s license, and the list goes on. For the most part, these laws are enforced to the absolute degree; being 17 years and 364 days old will not allow you to legally buy a cigarette. However, there is one major exception to our country’s reliance on age-related law enforcement.

In the criminal justice system, we are free to prosecute juveniles as adults, including children below the age of 16.

There are in fact numerous reasons why juveniles should not be tried as adults, and why doing so fails as a criminal justice approach. For one, we tried a one-size-fits-all criminal justice system in the past, and it didn’t work. In the early 19th century, there was only one criminal justice system in the US, and national concern grew about youth being tried and imprisoned alongside adults. In response, the nation established the framework for our current juvenile justice system. Cook County, Illinois, established the first juvenile court in 1899, and the rest of the states followed suit over the next thirty years.

Today, the term juvenile justice encompasses the “area of criminal law applicable to persons not old enough to be held responsible for criminal acts.” This system is specifically designed to rehabilitate troubled and delinquent youth, reflecting the notion that youth can indeed be reformed into responsible citizens. There is an apparent need for a separate justice system to address the unique nature of juvenile offenses.

Despite this system, minors can still be prosecuted as adults today. Each state has different standards for determining whether cases will be sent to adult court, but courts typically examine factors such as the defendant’s age, maturity, past attempts at rehabilitation, and the harm caused. Much of the discretion in these cases is left up to the states. In fourteen states, there is no minimum age for trying a child as an adult, while other states have set minimum ages as low as 10, 12, or 14. The lack of national consistency is a major part of the issue.

More importantly, there is an ethical dilemma in the reasoning here. The entire rationale behind creating a juvenile justice system is that children are distinctly not the same as adults. Our society operates based on the principle that juveniles are less mature and overall less capable of making rational decisions for themselves. This is especially true of younger children, who would rarely be subject to the same degree and type of discipline as an adult. If research has proven that children do not possess fully developed mental faculties, then on what basis would we hold a child accountable as an adult?

A crucial difference between children and adults is that children are capable of immense improvement and change while still in their formative years. A young person is not finished developing, and we cannot know who they will turn out to be if given the proper support and rehabilitation. Although the possibility of positive change remains with adult offenders, adults are generally thought to be more set in their ways, meaning the criminal justice system is less effective at intervening to change their behavior. Subjecting youth to the same standards as adults undermines the potential of a still-growing person.

Of course, the majority of juveniles sentenced as adults are not young children, although kids as young as 12 have been sentenced in adult court. Most youth tried as adults are 16- or 17-year-olds who have committed particularly heinous crimes. Some contend that it is more justifiable to sentence someone who is 17 as an adult, since they are close to the age of criminal culpability. Despite that, and the very serious nature of some crimes, I fail to see how committing an offense “ages” a defendant, unless we are to make the false assumption that the capacity for evil is itself a mark of adulthood.

Eighteen is the standard, legally accepted age at which a person can be held responsible for his or her criminal actions. No matter what offense a juvenile commits, their age should not become a variable that can suddenly be adjusted according to our moral code. I find that this line of thinking presents a dangerously slippery slope, where we will always be left wondering where to draw the line between being “close to an adult” and being a “juvenile”.

Rather than sentence juveniles under laws that were not designed for them, I propose that we modify the current sentencing procedures of the juvenile justice system. Currently, a juvenile offender can only be imprisoned until he/she turns 21. This presents a serious problem for judges, since some juveniles have committed crimes that warrant a longer sentence, especially those who were 16 or 17 at the time of their offense. Rather than send youth to an adult prison prematurely, we should reform the law so that judges can sentence a juvenile past the age of 21. The individual can be transferred to an adult prison upon reaching the max-out age of the juvenile detention center.

My primary concern lies in enforcing laws in a consistent and rational manner. Currently, we fail to do so by transferring juvenile offenders into a system that neither reflects nor addresses the defendant’s legal age.

Being Biracial

The United States has long prided itself on being a “melting pot” with a mix of ethnicities and nationalities. Today, Americans choose to date and have children with people outside of their racial groups more than ever before, creating a true “melting pot” in a new sense of the term. Pew Social Trends estimates that the share of multiracial babies has risen from 1% in 1970 to 10% in 2013. In the new mixed-race America, questions of identity naturally arise, and our answers to these questions provide an insightful look into current racial perspectives in the United States.

The topic of my blog makes it obvious that I have a particular interest in US race relations, particularly in regard to the law and the criminal justice system. As the child of a white father and a black mother, I have always been acutely aware of the role that race plays in all of our lives; the tensions and differences present in our society exist for me internally as well. Many biracial people report feeling some conflict over their identity and how they are viewed in the United States, whether it stems from the inevitable childhood questions from peers about feeling “more black or more white” or simply from not fully identifying with any one group. In the US, people tend to be recognized as just one race, and racial roles can inform the way people dress, speak, and behave. These norms make it difficult to embrace being biracial. I imagine that every mixed child knows the struggles unique to these experiences.

That’s why, when I heard about a new children’s book called Mixed Me, written by actor Taye Diggs, I was especially intrigued. Taye Diggs has a biracial child with his former wife, Idina Menzel, and he wrote Mixed Me in order to help children understand, accept, and appreciate themselves for who they are. His book speaks to some common experiences of being biracial, including having the ubiquitous curly hair, and taking pride in who you are. Diggs has also spoken publicly about encouraging his son to identify as biracial. 


Taye Diggs and his son

Surprisingly, Diggs’ ideas about his son’s race have received a lot of backlash on social media apps like Instagram. Diggs had to defend himself against critics who accused him of being self-hating by encouraging his son to identify as mixed. Some accused him of attempting to deny his son’s blackness in a sense. This dialogue got me thinking about what it really means to be biracial in today’s world, and why Americans, including some in the African-American community, are still strangely uncomfortable with the notion of being mixed.

With statistics projecting that people will continue to intermix at increasing rates, I think it’s time to revisit the traditional American ways of thinking about race. Taye Diggs is prompting a dialogue that has its contentious roots in our nation’s past.

The United States in fact has a long, complicated history with recognizing multiracial people. As a nation, we have a tradition of being conceptually divisive about being black and white–you are either one, or the other, but not both. The infamous “one-drop” rule, originating in the South during the time of slavery, meant that just a drop of “black blood” effectively made a person “black”. More than anything else, this rule treated whiteness as an ideal of purity that would be tainted and diminished by even the slightest degree of black relation.

In Louisiana, perhaps the most racially mixed place in the country during the 1800s and 1900s, there was an elaborate set of terms to identify degrees of “blackness” in individuals. A “quadroon” was a person who was one-quarter black, while an “octoroon” was a person who was one-eighth black. The common theme here is that none of these people could ever be considered white. And today, very little has changed in the starkly rigid, separated way that we think about race. The US Census didn’t allow people to choose more than one race until the year 2000. Even as systems change, people’s attitudes haven’t necessarily changed with them.

Much of our concept of identity seems to be based on appearance, and in that sense it is logical that mixed people are often seen as their appearance suggests. But I wonder why, should someone choose to identify as biracial or even as white, that option should be denied them. Why are we still relying on “one-drop” standards, which were instituted primarily as a method of oppression and of maintaining racial separatism? No one expects anyone to magically identify others’ backgrounds, but if someone chooses to embrace and identify with two races, I think that should be respected.

I say all of this with the understanding that biracial people might willingly identify solely with their minority half, which is their absolute right. Research shows that multiracial people with a black background have attitudes and experiences that are more closely aligned with the black community. According to Pew, the opposite is true for those mixed with Asian and white, who identify more closely with their white background. I suspect that self-identification is affected partially by how we have been conditioned to view ourselves, and by how legacies hundreds of years in the making have informed our perspectives.

As a society, we might have a lot to learn from embracing multiracial-ness. After all, things don’t have to be so black and white. I support a system that embraces diversity, both between ethnic groups and within them. Taye Diggs has the right idea in not automatically asking his son to choose just one racial identity for himself. His new book Mixed Me is a wonderful reminder for multiracial children that they should feel comfortable in their own skin.


If you’re at all interested in the changing racial demographics of the United States (like I am!), check out this fascinating National Geographic article, “Visualizing Race, Identity, and Change”.

Op-Ed: Athletes Have Unique Ability to Force Change on College Campuses

On some of the most racist college campuses in the country, black students are among the most cherished and celebrated.

Not just any black students, but the black athletes who bring fame, bragging rights, and a steady revenue stream to football and basketball programs. At schools like Florida State, the 2014 BCS National Football Champions, black men make up nearly 70 percent of the football team. The same is true throughout much of the SEC, one of the nation’s premier conferences for college football. It seems that where sports reign supreme, people are willing to bury their racial prejudices beneath Heisman trophies and NFL draft prospects.

Due to their great influence, college athletes are uniquely situated to take a stand and force people to confront racial issues on their campuses. When colleges’ sports programs suffer because some of their athletes refuse to play, or because athletes choose to attend schools with better racial climates, then schools in the South and elsewhere will be forced to address race concerns that they have historically ignored.

News broke over the weekend that University of Missouri football players were protesting in response to racial incidents on campus. A group of players refused to play football until the President of the University of Missouri, Tim Wolfe, resigned from his position. Wolfe had continually refused to address racial problems on campus, and the black community had complained about his inaction all semester. Their protests and outcries had been ignored by Wolfe and the rest of the administration. But the involvement of the football team seems to have been a turning point for the university’s response.

Missouri’s athletic program generated $83.7 million in revenue last year. It is simply bad economics, and bad public relations, for members of the school’s most prized team to refuse to play. Their refusal was a move that the University could not ignore, and last week, Tim Wolfe resigned as President of the University of Missouri.

The victory won by Missouri’s football team is one that can be replayed at schools across the country. If athletes decide to engage themselves in campus concerns, they have considerable leverage to force people to listen. Missouri’s football team sent a message that is deeply profound and powerful: you cannot celebrate Black athletes while ignoring the needs of Black students. This type of thinking ushers in a new wave of possibilities for social activism on college campuses.

There are numerous ways for athletes to show solidarity with students of color at their schools. Even if athletes don’t take the bold step of refusing to play, they can take to social media to call attention to ongoing issues. College athletes have the platform at their disposal to generate media attention with every tweet and Instagram post. News of players speaking out would reach ESPN and other sports outlets that don’t typically cover race issues. The surrounding publicity is a major catalyst for change, as is evident with the recent events at the University of Missouri.

Reports of discrimination and intolerance are far from a thing of the past on US college campuses. In 2014, three students tied a noose around the neck of a campus statue of the first black student to attend Ole Miss. Earlier this year, a University of South Carolina student was suspended for distributing a photo on social media depicting herself writing racial slurs. The University of Alabama faces ongoing allegations of racial discrimination in its fraternities and sororities. And these are just a handful of recent incidents at SEC schools, to say nothing of the racial concerns at schools like Yale University, the University of Oklahoma, Duke University, and many others.

Due to their public influence, college athletes, including the large proportion of black football and basketball players, are in the best possible position to call attention to these concerns. They can force colleges to put their money where their mouth is. The average minority student simply does not have the leverage to force the type of change that occurred at the University of Missouri this weekend. Their efforts are no less important, but college athletes, joining the social movements already under way at their respective schools, lend much-needed visibility to the effort to eradicate discrimination and intolerance in higher education.

After all, diversity on college campuses must mean more than just recruiting black athletes who benefit the university. To overthrow indifferent administrations, to make diversity a priority and not an option, and to create campus climates that celebrate people of all backgrounds, we need the university’s most public and privileged figures—the athletes—to speak up. 

You’re Not Taking My Bacon From Me

A recent post by fellow blogger The Political Pawprint got me thinking. In the new era of food consciousness, as we learn more about what foods and food practices are harming us, what lifestyle changes should the average American make?

In light of new reports detailing the harmful effects that processed meats can have on our bodies, should people stop eating things like sausage, hot dogs, and (gasp) bacon?! Science has come a long way in the past century toward uncovering the best practices for living a healthy life. But no matter how many reports are published about the dangers of fried food and GMOs, people continue to purchase food products that have been deemed unhealthy. Is it because people simply refuse to believe it, as Polka Dots & Politics suggests in her comment on The Political Pawprint’s post? Is it all sheer ignorance that could be rectified through greater awareness?

I argue something different. The primary reason that most people will not dramatically change their diets is because people evaluate the costs and benefits of decisions, and many of us care a lot about the quality of our lives, not just the longevity. Sure, some people probably don’t know about the new findings on processed meat. And absolutely, many Americans simply lack the resources to completely rethink their diets. But beyond those reasons, many Americans choose not to change their diets because they simply enjoy eating harmful foods, maybe even more than they are concerned about the effects of eating these foods.

The question that interests me is what motivates people to make healthy changes. Perhaps living the healthiest lifestyle does not entail living the happiest lifestyle. Few Americans are frequently thinking about their own mortality and future health (I know I’m not on a daily basis). The gratification of the present is a strong pull against the unknowns of the future. And, I argue, maybe that’s not such a bad thing. 

New science seems to be revealing hidden health effects in so many different things that humans consume and use. Sometimes I wonder if, in fifty years, we will all be sheltered in some kind of bubble sphere to protect us from all of the harmful crap in our environment.

Is this in our future?

This is not to say I’m a science skeptic; I believe the reports revealing the negative health effects of everyday foods and items. I’m just not convinced that it is preferable to live my life avoiding a plethora of things that make me happy in order to maybe secure myself a more healthy future.

I say maybe because, in reality, life is one big maybe. There are so many uncontrollable and even random factors that dictate our lives and health. Given the unforeseeable nature of the future, I personally consider my enjoyment a very serious component of choosing what I consume. I actually enjoy eating fruits and vegetables, so I naturally eat them, but I wouldn’t judge the person who avoids all vegetables because they hate the taste. I don’t think I’m alone in weighing the quality costs/benefits of adopting a healthier lifestyle.

The Political Pawprint mentions that dietary changes often occur in light of new scientific findings being made public, but that people often revert to old habits when the news dies down. I believe this drop-off occurs not because people forget or become unable to continue with new diets. It’s because people simply decide that the changes are not worth it, and that they are better off eating what makes them happy, ideally in moderation, but often not.

Crucially, the importance of spreading health awareness cannot be overstated. One need only remember the awareness campaigns against tobacco smoking in the US, and the incredible impact they had on reducing cigarette smoking, to be convinced that greater awareness can indeed change people’s behavior. But there are still millions of Americans who smoke cigarettes, and millions more in Europe and around the world who continue to do so despite the widespread knowledge that smoking kills.

I believe the best thing to do is to inform people about the scientifically documented harms of their actions, and let them make the decisions that best suit their own lives. I don’t think it’s wrong or condemnable for people to make decisions that defy the currently “healthy” option. Moderation is the best principle to preach (rather than just eliminating unhealthy things), but it is understandable, if lamentable, that some people will choose to ignore best practices. Let people have their bacon if they so please, and don’t judge them for deciding that it’s worth the risks.

Let us never forget, too, as we think about our own health and judge the motivations of those around us, that it isn’t always possible for people to make dramatic dietary changes. Food production in this country often makes processed, unhealthy food cheaper than fresh, unprocessed options. We should never let food privilege blind us to the fact that many people do not choose their food, and instead eat what they can, not what they want.

Notwithstanding that, as this post argues: you can go ahead and put vegan tempeh bacon in every fast food place in this country. I’m still ordering the real thing on my hamburger.

In Defense of Hip Hop Culture

If you watch Fox News, you’ve seen it. If you listen to conservative politicians, you’ve heard it. There is a huge, ongoing outcry against hip-hop/rap music, and more specifically the culture that surrounds it. Critics of hip-hop culture claim that the music incites violence, glorifies crime, and perpetuates negative stereotypes about women. Some take the criticisms a step further. Fox News anchor Bill O’Reilly is one of the most vocal opponents of rap music: this year, he went so far as to blame hip-hop/rap for the decline of organized religion in the US.

Hip-hop music indeed receives a lot of backlash in mainstream society. And while O’Reilly’s idiocy always manages to expose itself, the dialogue about rap music, and whether it is a healthy art form, is a relevant topic of discussion. The lyrics in modern rap music can be quite violence-laden and off-putting, especially to the outside ear. Rappers like Chief Keef and Bobby Shmurda are just two of the many recent artists who have built their careers on music that describes drug dealing, gang activity, and gun violence. Criticism of rap music has also emerged from within the hip-hop world itself, as more socially conscious rappers claim that mainstream rap negatively influences young listeners and the ever-impressionable American public.

But hip-hop/rap music, from the “gangsta” rap described above to the more mellow sounds of Common or J. Cole, is as valid and necessary an art form as any other genre of music. The purpose of music is to relate an experience or to tell a story, more often than not a story that is personal to the music’s creator. Taylor Swift, for instance, writes music about her love life and her past relationships because those are the life experiences she wants to share with others. The late Amy Winehouse wrote music about her struggles with substance abuse. Music is, put simply, an expression of our personal selves that others can then relate to.

In the context of hip-hop/rap, urban youth tell their own personal stories, within the context of their environments. Many rap artists are from communities where violence is an everyday norm. Who are we to moderate their self-expression simply because most of us fail to relate to the environments they come from?

Take Chicago, Illinois, for example. Chicago has been a recent hotbed of popular young rap artists. The city has been dubbed “Chiraq” in reference to the astoundingly high rates of violence, especially homicides, that plague the city. A handful of neighborhoods in Chicago have higher rates of violence than the most dangerous countries in the world. The rap music coming out of Chicago reflects and expresses this reality.

One popular artist, Lil Durk, hails from the Englewood neighborhood of Chicago, an area with a higher per capita homicide rate than Belize, El Salvador, and Guatemala. Accordingly, Lil Durk’s music is rife with mentions of the violence and crime that is so common in his neighborhood. A sample of one of his more popular songs, “500 Homicides”, is a fairly standard look at urban rap. The lyrics include lines such as,

“load up the block, and reload the 8, do a drill on the op, no clones I see dots…”

“silver spoon, you don’t know how hunger feel, dreaming bout 100 mill, step on that curb with 100 pills…”

These lyrics may not be considered profound, but they communicate some of the realities that Lil Durk faces. If he can’t make music about what he has experienced and seen, then what can or should he make music about? One could even argue that Lil Durk’s success as a rap artist is keeping him off of the streets, and keeping him from committing the same crimes that he raps about. I would argue that it is better for urban youth to make music about negative activities, and perhaps achieve success doing so, than for them to actually go out and engage in said activities (although of course, as we have seen with the high-profile arrests of many of rap’s hottest stars, the two are not mutually exclusive).

People who argue that “gangsta rap” encourages violence are actually making an error in causality. The violence is not occurring because of the music; the music is being made because of the violence. As an essay from Stanford University called The Social Significance of Rap & Hip-Hop Culture explains, “…hip-hop music is a symptom of cultural violence, not the cause.” People are crafting lyrics that in some way, exaggerated or not, relate to what is actually taking place in their communities. Our outrage and horror should be directed more at the socioeconomic ills that lead to gang banging and violence than at the music that merely describes them.

Rap music has a long history of being used as a form of resistance, through which oppressed groups can speak out against the powers that subjugate them. The music gives a voice to people who otherwise might not have one. If we want to change what most young rap artists are saying in their songs, then perhaps “…we must provide them with the resources and opportunities to view the future with hope.”

More generally, it is both unfair and unrealistic to suggest that rap artists have a social responsibility to create only positive music. Not every artist is destined to be an activist. We have a tendency to place celebrities on an unfairly high pedestal, demanding that they serve as our role models, or assuming that they are somehow inherently morally superior to the average person. Famous musicians gain stardom for their talents as entertainers, not for their humanitarian feats. Let’s enjoy the music and the art for what it is. If we are continually disappointed in the content of our music, or the behavior of our musicians, it is only because we have set unnecessarily high expectations for people who achieve celebrity, even beyond the rap genre.

I think our country would do well to cease the rampant idealization of celebrities–rap artists included. Of course they are responsible for their actions and words, but we defy logic when we demand that they sacrifice authenticity for the purpose of easing our public conscience.

The Broken Politics of Jury Selection

The right to trial by jury is guaranteed by Article Three of the US Constitution as well as the Fifth Amendment. Fair and impartial juries are a crucial component of the criminal justice system. Unfortunately, the process of jury selection can undermine the credibility of the jury by introducing both bias and discrimination.

According to information found on the American Bar Association website, jury selection begins when the court clerk summons twelve people on the jury list to the jury box. The judge then speaks briefly about the case, and the judge and lawyers begin questioning the potential jurors. This process of questioning potential jurors is called voir dire; it is done to ensure that potential jurors are not biased or unfair. If either lawyer suspects that a juror might be biased in some way relating to the case, the attorney can ask the judge to dismiss the juror for cause.

The ABA goes on to explain that lawyers also have a specific number of peremptory challenges to dismiss a juror without stating a cause. Unlike challenges for cause, which can be requested on an unlimited basis, peremptory challenges are limited in number.

In an ideal setting, attorneys and judges would not discriminate on the basis of race or sex in selecting jurors. Challenges for cause are meant to root out legitimately biased jurors, such as those who know someone that is involved in the case. They are not intended to be used to systematically dismiss jurors of color, or to ensure any kind of racial majority in the jury. Yet, discrimination on the basis of race remains a very real aspect of jury selection.

In the upcoming Supreme Court case Foster v. Chatman, the Court will examine the issue of racial prejudice in jury selection yet again. The Court previously found in Batson v. Kentucky that it is unconstitutional to strike jurors because of their race. The new case shows that racial prejudice in jury selection is far from a thing of the past. According to a recent Huffington Post article, the latest case presents damning evidence of attorneys in a 1987 Georgia murder case striking jurors because of race. The attorneys presented weak, unjustifiable challenges for cause, but all were accepted by the judge, with the result that an all-white jury convicted and sentenced a Black teenager.

The Foster v. Chatman case is notable because it provides concrete evidence of racial discrimination. The attorneys took notes labeling black jurors based on race and specifically rooted out every black potential juror. Many of the reasons listed to strike black jurors applied equally well to white jurors who were allowed to stay on the case.

Perhaps most revealing, and ultimately damaging, about this kind of racial bias is that the judge allowed it to take place. It is reasonable to assume that attorneys’ motives to win the case might prompt them to create the most favorable jury possible, by striking people who might not vote their way. But why are judges so willing to accept biased challenges for cause?

In the ideal system of checks and balances, the check on an attorney’s power in jury selection is the judge. Perhaps there is an inherent conflict of interest in asking the judge, who operates within and works for the criminal justice system, to oversee the process of picking a jury. Given the wealth of evidence showing that the criminal justice system is biased against black defendants, in terms of arrest rates, sentencing, death penalty convictions, etc., it seems unlikely that all judges can be entrusted to ensure that jury selection is a fair process.

Tellingly, racial discrimination in jury selection has been an enduring phenomenon, despite legal efforts to stop it. Studies have found that in some counties, there is no African American representation on the jury in 80 percent of criminal trials (Jefferson Parish, Louisiana, for one). In addition, some District Attorney’s offices appear to actually train their prosecutors to exclude racial minorities from jury service and then to hide their racial bias under bogus challenges.

While it is absolutely necessary to preserve voir dire challenges to excuse legitimate bias, the scope of these challenges should be greatly limited. Juries will function best, and promote justice most fully, with minimal interference from attorneys and other motivated actors. People have a constitutional right to be tried in front of a jury of their peers, which necessarily includes people who might look like them or come from similar backgrounds. For prosecutors to handpick juries to deliver the verdicts they desire is to circumvent the purpose of having a jury in the first place. There is obviously a belief among prosecutors that minority jurors will be sympathetic to the defendant. This only means that prosecutors might have to do their jobs better to truly convince a fairly selected jury that a defendant is guilty.

Minorities are part of our population, and thus deserve the same right to serve on juries as any other citizen. While minorities might have differing opinions about criminal justice, or provide unfavorable verdicts for prosecutors, this is simply the design and outcome of the jury system. We cannot feasibly root out all those who might disagree with us and still claim that we have a representative “jury of peers”. As with all elements of our legal system, jury selection can only promote justice inasmuch as we implement it fairly and without regard to our own biases and preferences.

Rethinking the Mass Incarceration of Drug Offenders

The United States, not unlike many other countries, faces an ongoing problem with the use and sale of illegal drugs. In response to rising crime rates and drug usage in the 1980s and 1990s, the federal government rapidly increased the sentences for drug offenses. This resulted in a system of mass incarceration unrivaled in the developed world. Today, we have a prison system where nearly half of all federal prisoners are serving time for drug offenses (FAMM 1). Despite the nation’s efforts to combat drugs, we have overwhelmingly failed through a punitive approach.

Rather than imprison low-level, non-violent drug offenders, we should offer drug treatment and counseling to the large proportion of them suffering from addiction, a medical condition. From a criminal justice perspective, incarcerating low-level drug users does not protect public safety, nor does it reduce the offenders’ chances of abusing drugs again. To be more effective in combating drug crimes, as well as more economical, low-level drug offenders should not go to prison; rather, they should partake in court-ordered drug treatment programs to address and cure their physical drug addiction.

To evaluate whether punitive drug sentences are an effective criminal justice method, it is first important to understand how prison is designed to function in the criminal justice system. The Federal Bureau of Prisons states that its mission is to “protect society by confining offenders in the controlled environments of prisons and community-based facilities…that provide work and other self-improvement opportunities to assist offenders in becoming law-abiding citizens” (Federal Bureau of Prisons 1). Ohio’s Department of Rehabilitation and Correction, one of the nation’s largest state prison systems, is more succinct in its goals: its vision and mission are simply to “reduce crime in Ohio” and “reduce recidivism among those we touch” (Ohio Department of Rehabilitation and Correction 1). To that end, imprisoning drug offenders is supposed to deter them from further drug use while keeping criminals off the streets. However, imprisoning non-violent, or “low-level,” drug offenders–who are the majority of those in prison for drug offenses–accomplishes neither of these criminal justice goals (The Sentencing Project 1).

Specifically, in thinking about reducing recidivism, research has shown that prison sentences will not reduce drug use. According to the National Association of Drug Court Professionals, approximately 95% of offenders return to drug abuse after being released from prison (NADCP 1). This is because drug abuse is a distinctly different kind of crime than most others that result in prison sentences. According to current medical knowledge, drug addiction is a brain disease, interfering with “an individual’s ability to make voluntary decisions, leading to compulsive drug craving, seeking, and use” (National Institute on Drug Abuse 1). Imprisoning drug offenders rests on a fundamentally incorrect premise: that they have some degree of rational control over their crime.

People addicted to drugs will not respond to prison as a deterrent to reoffending because they are not weighing the consequences of their decisions as healthy people do. Researcher Valerie Wright makes this argument, stating that those who are under the influence of drugs and alcohol are unlikely to be “deterred by either the certainty or severity of punishment because of their temporarily impaired capacity to consider the pros and cons of their actions” (Wright 2). Prison then becomes a revolving door for people suffering from drug addiction, as we fail to treat the underlying cause of the crime. It is estimated that “60-80% of drug abusers commit a new crime (typically a drug-driven crime) after release from prison” (McVay, Schiraldi, and Ziedenberg 9). Thus the prison system cannot reduce recidivism among drug offenders as it is designed to for other types of criminals.

In relation to the other primary goal of the prison system, securing public safety, many low-level drug offenders do not actually pose a significant danger to the public. According to the Sentencing Project, 72.1% of federal prisoners serving time for drug offenses are non-violent offenders with no history of violence (The Sentencing Project 1). If public safety were the genuine concern, it seems misguided that more people are incarcerated for drug crimes than for crimes related to weapons, explosives, and arson (Malveaux 1). Research indicates that resources spent on drug offenders would be better spent on keeping violent criminals off the street.

David B. Kopel, the former assistant attorney general for the state of Colorado and now a research director and policy analyst, argues that drug sentencing may actually leave more violent criminals free, thus negatively impacting public safety. Specifically, he claims that “the mandatory drug minimums have led to reduced punishment for violent crime,” as parole standards have been lowered to accommodate a dramatic increase in the inmate population (Kopel 6). In this way, drug sentencing may actually be hindering public safety rather than protecting it, as the criminal justice system strives to do. The American public, whose safety is in question, agrees that criminal justice priorities are in the wrong place. The Pew Center on the States reports that 62% of Americans strongly favor sending fewer low-risk, non-violent offenders to prison in order to keep violent criminals in prison for their full sentences (FAMM 1).

Given the unique nature of drug abuse, and the inadequacies of the prison system in combating it, we should provide treatment and other services to drug offenders in lieu of incarceration. Drug treatment programs, as part of a larger alternative sentencing regime, have been shown to reduce substance abuse and recidivism among drug offenders. In a report called “Treatment or Incarceration?” released by the Justice Policy Institute, researchers McVay, Schiraldi, and Ziedenberg show that drug treatment programs, combined with education and life skills training, have been more effective than incarceration at curbing drug abuse (McVay, Schiraldi, and Ziedenberg 3).

The researchers specifically note the success of treatment programs like “Break the Cycle,” a probation program focusing on drug treatment, testing, and sanctions. Offenders who participated in Break the Cycle were less likely to be arrested during the first six months of supervision (11). The University of Maryland determined that Break the Cycle successfully reduced substance abuse and re-arrest rates for participants (11). McVay, Schiraldi, and Ziedenberg attribute these positive outcomes to drug treatment, and they highlight the fact that other similar programs exist around the country.

Importantly, implementing alternative sentencing programs is not the same as decriminalizing drugs. Treatment should be both court-mandated and court-monitored. One program, commonly referred to as Drug Court, requires the offender to actually plead guilty to an offense before entering the program (NADCP 1). The approach is still entirely within a criminal justice framework. In fact, drug treatment programs actually require an element of judicial supervision in order to be successful. According to the NADCP, 60 to 80% of offenders drop out of treatment prematurely unless they are regularly supervised by a judge (NADCP 1). Since the prison system fails to adequately rehabilitate drug abusers, it is crucial to shift state resources and attention from the penal system to the judicial system in treating offenders.

Drug treatment also provides cost-saving benefits, proving to be more economically sustainable and thus preferable for the criminal justice system. Even a residential drug treatment program can cost half as much as the average term of imprisonment (McVay, Schiraldi, and Ziedenberg 5). The Sentencing Project report highlights the “abundance of research indicating the cost-effectiveness of treatment for drug abuse rather than incarceration” (The Sentencing Project 3). In our country today, overcrowding threatens to seriously undermine our prison system. There have already been bipartisan calls to reduce the size of the prison system, as it has wreaked havoc on the budgets and resources of many states, with California being one notable example. In light of the rising cost of incarceration, it becomes even more important to release non-violent drug abusers who could be better helped through alternative programs. The cost-saving benefits also negate any argument suggesting that drug treatment should occur within the prison itself. It is cheaper, and equally advantageous, to treat offenders without further inflating the population of the penal system.

It would be impossible to discuss the mass incarceration of drug offenders without mentioning the disparate racial effects that have plagued this phenomenon from the beginning. Racial injustice illuminates yet another reason that the current system of mass incarceration fails to work. According to the NAACP, there are five times as many white drug users as black drug users, yet blacks are sent to prison for drug offenses at five times the rate of whites (NAACP 1). The roots of racial inequality in drug sentencing are complex, tracing back to major discrepancies between crack and powder cocaine laws, but the effect has been the mass incarceration of mostly people of color. African Americans now serve as much time in prison for drug offenses as whites do for violent offenses (The Sentencing Project 2). While imprisoning drug offenders is a flawed criminal justice approach on its own, it fails even more certainly when it targets certain groups of people disproportionately.

Going to prison brings about a lifetime of collateral consequences, including job discrimination and housing discrimination. These effects are disproportionately felt in Black and Latino communities, where entire generations of men have been sent to prison since the 1980s. The injustices of the War on Drugs have never been more apparent, and the call to repeal overly punitive drug laws has never been stronger. Sentencing reform is coming, with mandatory minimums being repealed in many states and federal laws being revised to reduce racial disparities (Kopel 7). President Obama has also made it a point of his presidency to offer clemency to non-violent drug offenders who are serving life sentences, or sentences of many decades. To avoid repeating these criminal justice missteps as we move forward, we must abandon entirely the approach of incarcerating low-level drug offenders. Treatment would allow drug addicts to move on from their criminal offense, without the stigma of having been in jail and the lasting loss of opportunity that comes along with it.

In order to address the prison system’s ongoing failure to rehabilitate drug abusers, the nation must begin to look toward treatment as the primary focus. Notably, the United States is alone as a developed country in its pursuit to incarcerate those suffering from drug addiction. The success of court-mandated drug treatment programs across the country should inspire a movement towards that approach. As much as drug use is a crime in and of itself, there are few public safety goals to be achieved by imprisoning addicts. If we are to recognize that drug addiction is a disease, and not a behavior to be futilely punished through imprisonment, then it becomes clear that incarceration is not the solution for drug abusers.


Works Cited

“Drugs and Crime in America.” National Association of Drug Court Professionals. Web. <http://www.nadcp.org/learn/drug-courts-work/drugs-and-crime-america>

“BOP: Agency Pillars.” Federal Bureau of Prisons. Web. 17 Oct. 2015. <https://www.bop.gov/about/agency/agency_pillars.jsp>

“The Federal Prison Population: A Statistical Analysis.” The Sentencing Project. The Sentencing Project. Web. <http://www.sentencingproject.org/doc/publications/inc_federalprisonpop.pdf>

Kopel, David B. “Prison Blues: How America’s Foolish Sentencing Policies Endanger Public Safety.” Cato Institute Policy Analysis. Cato Institute, 17 May 1994. Web. <http://www.cato.org/publications/policy-analysis/prison-blues-how-americas-foolish-sentencing-policies-endanger-public-safety>

Malveaux, Julianne. “Release Low-Level, Non-Violent Drug Offenders.” North Dallas Gazette. North Dallas Gazette, 22 July 2015. Web. 17 Oct. 2015. <http://northdallasgazette.com/2015/07/22/release-low-level-non-violent-drug-offenders/>

McVay, Doug, Vincent Schiraldi, and Jason Ziedenberg. “Treatment or Incarceration? National and State Findings on the Efficacy and Cost Savings of Drug Treatment Versus Imprisonment.” (2004): n. pag. JusticePolicy.org. Justice Policy Institute, Jan. 2004. Web. <http://www.justicepolicy.org/uploads/justicepolicy/documents/04-01_rep_mdtreatmentorincarceration_ac-dp.pdf>

“Medical Consequences of Drug Abuse.” National Institute on Drug Abuse, 14 Dec. 2012. Web. 17 Oct. 2015. <https://www.drugabuse.gov/related-topics/medical-consequences-drug-abuse>

“2015 Annual Report.” Ohio Department of Rehabilitation and Correction. Web. <http://www.drc.ohio.gov/web/Reports/Annual/Annual%20Report%202015.pdf>

“Quick Facts.” Families Against Mandatory Minimums. FAMM. Web. 17 Oct. 2015. <http://famm.org/the-facts-with-sources/>

“Criminal Justice Fact Sheet.” National Association for the Advancement of Colored People. NAACP. <http://www.naacp.org/pages/criminal-justice-fact-sheet>

The Myth of the Meritocracy

There is perhaps no institution more socially defining or class-bearing than the American university. Earning a college degree significantly improves your lifetime earnings and overall prospects. This is why, faced with alarmingly low minority enrollment in higher education, universities implemented policies of affirmative action. The goal was to give people who had been historically denied equal access to education a chance to go to college. In turn, universities benefit from having more diverse and vibrant communities with adequate minority representation.

The Supreme Court has, however, been clear that race cannot be the sole determinant of admission: racial quotas are unconstitutional, but race may still be considered as one of many factors, giving colleges the ability and freedom to diversify their campuses.

Despite the good intentions behind affirmative action, the policies are incredibly controversial. Many people claim that entrance into college should be based on achievement, not a factor such as race. Often, those who oppose affirmative action claim that it discriminates on the basis of race and should be unconstitutional. The heated debate over affirmative action routinely resurfaces in the Supreme Court and in the states, some of which, like California, have outlawed affirmative action in public universities.

I must say that I would like to agree that college admission should be based on merit alone. It certainly seems fair to say that people should be admitted based on personal achievement. That is, after all, the compelling lure of the American dream: that you can blaze your own path to success through hard work. In such a system, also known as a meritocracy, people would be evaluated on an equal playing field, irrespective of arbitrary factors like race, gender, or family income. In theory, it sounds totally fair.

The problem is, this system of fair evaluation does not exist in the United States, and it never has. It is a myth. We don’t have a meritocracy in college admissions, nor in the opportunities that underlie college admissions. Race is just one of many evaluating factors that the student has no control over, and that are not based on any sort of merit.

Let’s take a look at the main factors for college admission in the United States, and whether they can be called merit-based, or merely arbitrary. Obviously, a student’s grades play the biggest role in college admissions. However, the National Association for College Admission Counseling states that colleges evaluate many other additional factors:

In order to shape their classes, colleges may consider other factors for admission, including a student’s geographic location (especially for public universities), whether a student is the first in their family to go to college (for access purposes), a student’s race or ethnicity (for diversity purposes), a student’s relation to alumni (for the purposes of development and community-sustenance), and gender (for purposes of reflecting the population).

Analyzing any one of these factors, not just race, reveals a lack of merit-based evaluation. Additionally, some factors can actually disadvantage historically marginalized groups.

To that end, let’s think about the criterion of legacy admissions, or a student’s relation to alumni. Most colleges give extra preference to students who have family members that attended the school. At some colleges, having a family legacy can dramatically increase a student’s admissions prospects. For example, Harvard University’s admission rate hovers around 5.8%, but that number jumps to an astounding 30% for legacy students.

What personal merit is there behind someone’s parents or grandparents having attended Harvard? A person is just lucky enough to be born into a well-educated family. The student has earned nothing in order to be a legacy prospect, and yet he/she has a much greater chance of being admitted. The meritocracy myth falters.

Imagine now the grave unfairness that legacy admissions can inflict on minorities, whose grandparents or family members may have been denied the opportunity to attend college. For much of the 20th century–while certain families were establishing legacies at universities like Harvard–most minorities were denied or restricted access to these same institutions. Although it is likely not any university’s intention, legacy admissions can act much as the grandfather clause did for voting, by rewarding a system of past racial privileges.

At schools like the University of Southern California, legacy students make up 19% of the incoming fall 2015 freshman class, while African-American students total only 7%. If anything is unfair about college admission standards, it’s the fact that education has been, and continues to be, an exclusive club for generations of people.

This is not to say that legacy admissions are harmful in and of themselves. Rather, it is telling that critics of affirmative action rarely lodge a similar complaint against legacy policies. And this is to say nothing of the many other non-merit-based admissions factors: athletes who are given preference, geographic considerations that favor one location over another, or even the emphasis on standardized testing, which has long been shown to reflect expensive preparation more than academic prowess.


None of these factors is entirely merit-based, and some, like standardized testing, are a mix of merit and privilege. In fact, you can even make a strong argument that grades themselves are not truly merit-based, once you take into account the privileges of hired tutoring, better resources, and superior schools in wealthier, and usually whiter, areas. If we want to make college admissions truly merit-based, then affirmative action is not nearly the root of our problems.

The basic point reveals the inherent privileges that exist in society, whether we acknowledge them or not. Removing race from college admissions does not actually remove it from the process, because race continues to be as privilege-bearing as it always has been. Pretending that opportunities are equal for everybody, or that college admission is based only on merit, does not make it true.

The myth of the meritocracy lies in denying corrective policies to minorities while perpetuating other meritless standards that are discriminatory by nature. Fundamentally, a system of pure fairness does not exist, and it never has. It is okay for us all to admit that people achieve things based partially on privilege. Extending some of that same type of privilege to marginalized groups is a necessary and fair action.

Black Lives Matter, In Reality

The Black Lives Matter movement is founded on a true and well-documented premise: young African-Americans are disproportionately gunned down by law enforcement. A broad body of crime statistics has long supported this notion, showing that young black males are 21 times more likely to be shot dead by police than their white counterparts. There is also evidence that black victims of police shootings are more likely to be unarmed. In 2015 alone, a staggering 32% of black people killed by police were unarmed, compared to only 15% of white people killed by police. Despite the evidence, the Black Lives Matter movement, and the very need for such a movement, remains a controversial topic.

Some critics of Black Lives Matter outright refuse to acknowledge that law enforcement may be prone to racial bias. These people instead imagine that we live in a post-racial society, where race could not play a factor in a cop’s decision-making. Such a viewpoint is incredibly naive in light of our country’s long history of racially biased law enforcement. But these critics are probably the minority of the movement’s detractors.

The majority consist of those who make an even more damaging argument. They contend that activists should concern themselves with the high homicide rates in urban inner cities, rather than issues of police brutality. People making this argument attempt to use urban violence as justification for police brutality. Their argument is the equivalent of saying, “You should not care if police kill you, because you kill each other anyway”.

A quote from a recent opinion piece, titled “Black Lives Matter– but Reality Not So Much”, explains the idea more fully:

The reality is that the Michael Browns are a much bigger threat to black lives than are the police. “Every year, the casualty count of black-on-black crime is twice that of the death toll of 9/11,” wrote former New York City police detective Edward Conlon in a Journal essay on Saturday. “I don’t understand how a movement called ‘Black Lives Matter’ can ignore the leading cause of death among young black men in the U.S., which is homicide by their peers.”

This argument, quoted and advanced by author Jason L. Riley, is entirely unreasonable. First, people are working to curb the homicide rates in the black community. Leader and activist Rev. Al Sharpton recently traveled to Chicago to speak about the intolerable levels of violence. He has spoken out against street violence for years and often visits local communities around the country. He is just one of many Black activists working on this issue. There are community groups dedicated to stopping urban violence in almost every major American city. Notwithstanding these efforts, there are clear differences between inner-city violence and police brutality.

When a black person kills another black person, the killer is sentenced to prison for the crime. There is no breakdown in the justice system that sets young black murderers free. The same cannot be said for white police officers who kill unarmed black citizens. We can discuss the social ills of urban violence, but we need to separately address the systemic injustice that insulates police officers from prosecution and conviction.

Furthermore, and perhaps most importantly, we should openly hold police officers to a higher standard than the one to which we hold ordinary people. An ordinary person, living in an environment that is violent, poor, and devoid of hope, may unfortunately fall victim to the lure of crime or the senselessness of gun violence. This is not inevitably true, but it remains a fact in our current world. A police officer, however, trained and acting in his duty under government authority, should follow the law at all times. In other words, a police officer cannot use the negative behaviors of the environment he patrols to justify his own missteps. And neither should we.

In his piece, Riley facetiously says, “It’s about holding whites to a higher standard than the young black men in these neighborhoods hold each other to.” He should clarify: white police officers. This is not an unreasonable proposition. Yes, we should absolutely hold white police officers to a higher standard than the young urban poor– those same people whose behavior we are bemoaning.

This is not to say that Black Lives Matter is the perfect movement. It lacks central leadership and the legislative power of past social movements, and it may be too focused on social media. But you cannot argue with its basic premise: all people, including black people, deserve not to have their lives unnecessarily taken by the police. This is simply a human right.

The key issues here are justice and accountability. When multiple cops fire hundreds of rounds into an unarmed person’s car, someone should face consequences. When a 12-year-old boy is shot by a cop while playing at the park, someone should face consequences. And these are just two examples from my hometown of Cleveland, Ohio, in recent years. If cops were actually held accountable for using excessive force, I imagine that police brutality would be a far less pervasive issue.

The tragic events that we read about in the news, like the deaths of Eric Garner, Walter Scott, and Oscar Grant, happen every day in neighborhoods across the country. Accepting that this injustice exists, and refusing to keep making excuses for it, are the first steps toward creating positive change. Black Lives Matter has brought attention to an ugly and unjust truth about our country. The only way to stop the long pattern of police brutality against African-Americans is to finally hold police officers accountable when they use unreasonable force.

The Complicated Character of Cornel West

Dr. Cornel Ronald West defies simple description. He is a leading philosopher, author, and academic with an Ivy League resume, but he is also a hip-hop artist, a prominent activist for the African-American community, and a self-proclaimed prophet. In all of his many roles, Dr. West is a man with considerable public influence. He is famous almost as much for his public feuds as he is for his provocative theories on race and class oppression in the United States. Since Dr. West published his bestselling book Race Matters in 1993, cementing himself as a leading public intellectual, he has had many disagreements with other public figures and been subject to much criticism. West’s persona (and to some degree, his celebrity) is built on him being unapologetic, attention-grabbing, and, according to many of his critics, increasingly outrageous.

West has risen to the status of a public intellectual for his “brilliance and ability to articulate the complex lived experience of African-Americans” (Hayes 75), but it can be difficult to separate his achievements from his antics. His primary conflict as a public intellectual lies in balancing the dual role of being both public and intellectual. In the public sphere, his intellectualism is often eclipsed by his persona and his desire for recognition. The incredible genius of Dr. West is always there—in his electrifying speeches, in his impressive academic writing, and in his thoughtful sociocultural commentary—but it is much harder to discern beneath the behemoth shadow of his public self.

From an early age, Cornel West exhibited the qualities that would define his life: a commitment to Christianity, a passion for racial justice, and sheer intelligence. Growing up in Sacramento, CA, though he is a native of Tulsa, OK, West was heavily inspired by the Baptist church that he attended in his youth (Encyclopædia Britannica 1). His encyclopedia biography explains that:

“…West regularly attended services at the local Baptist church, where he listened to moving testimonials of privation, struggle, and faith from parishioners whose grandparents had been slaves.”

The religious convictions of African-Americans, and the long tradition of the black church, informed West’s own faith and likely set the stage for his later theological studies.

Just as influential as the Church was the growing power of the Black Panther Party, which impressed and inspired young West. He learned about the importance of local political activism and community building from the Black Panthers (Dartmouth University 1). At a young age, West engaged in political activism by refusing to salute the American flag in protest of the second-class treatment of African-Americans in the United States (Dartmouth University 1). Importantly, and tellingly for his future, West never separated his religious ideals from his political ideals. He found in one a continuance of the other, and thus combined and strengthened both his Christianity and his commitment to social justice.

Academically, young Cornel West excelled, entering Harvard University on a scholarship at just 17 years old and graduating magna cum laude three years later (Encyclopædia Britannica 1). Always seeking to expand his knowledge, he earned a PhD in Philosophy at Princeton University in 1980 and then launched his career as an academic (Dartmouth University 1). In the years following, Dr. West ascended the ladder of higher educational institutions. He taught classes on religion, African-American politics, and philosophy at Yale, Harvard, the University of Paris, and Princeton (“About Dr. Cornel West” 1). In 1993, in the aftermath of the Rodney King riots in LA and amidst much racial turbulence nationwide, Dr. West published his most popular book, Race Matters. The book is a collection of essays about the ongoing race debate in America, tackling topics like black sexuality, the absence of black leadership, and the relations between African Americans and other minorities (West 1). A national bestseller, Race Matters catapulted Dr. West into the national spotlight.

In Race Matters, Dr. West advances theories that significantly informed the national dialogue about race. He presents the idea of “black nihilism,” describing how African American communities suffer from a “breakdown of traditional social institutions…and the concomitant growth of hopelessness, lovelessness, and meaninglessness, particularly among the black urban dispossessed” (Hayes 76). According to West, the despair of the black urban poor helps contribute to their oppression. In his own words, West states,

“the major enemy of black survival in America has been and is neither oppression nor exploitation but rather the nihilistic threat—that is, loss of hope and absence of meaning. For as long as hope remains and meaning is preserved, the possibility of overcoming oppression stays alive.” (West, “Race Matters”)

The principle of black nihilism is rooted in West’s Christian beliefs and his pragmatist philosophy. It also harks back to his early impressions of the Black Panther Party, specifically their belief in the importance of community building. The nihilism theory resonated particularly with African-Americans who could relate to the long-held feelings of hopelessness and futility that West described. In the book, West also distinctly suggests that black nihilism is caused by an absence of black political and intellectual leadership (Hayes 76). He is very critical of America’s most famous black leaders, and he strongly suggests that we need better leaders in order to uplift the black community. In that way, his theory is somewhat conservative, because it places responsibility and agency with the minority to improve their own condition, rather than solely with the overarching system. However, West also recognizes that the system itself is oppressive, and in Race Matters he criticizes black conservatives for blaming poor black people for their predicament (Hayes 76). He is thus able to offer both liberal and conservative critiques, acknowledging the breakdown in the African-American psyche while being careful not to blame African-Americans for their own oppression. In a quote from the book, West eloquently says,

“We indeed must criticize and condemn immoral acts of black people, but we must do so cognizant of the circumstances into which people are born and under which they live” (West, “Race Matters”).

While not everyone agreed with Dr. West’s theories, he became very popular for his multi-faceted critique and for remaining grounded in his Christian morals. The race debate in America had long been very divided and polarized between liberals and conservatives, blacks and whites, but West helped reinvigorate the conversation by promoting theories that crossed partisan and racial lines. Above all else, his book firmly established the idea, on a national platform, that race is important and that it matters in American society.

After the 1993 publication of Race Matters, West’s status as a public intellectual grew. Over the next decade, he became a frequent guest on CNN, C-SPAN, and shows like the Bill Maher Show (“About Dr. Cornel West” 1). He continued to engage in activism and create new academic works, including another popular 2004 book called Democracy Matters, in which he describes the damaging imperialism and nihilism present in our democracy (Malkani 117). Yet as his celebrity grew, Dr. West strayed further and further from his scholarly work, ultimately leading some to question his actions.

According to Michael Eric Dyson, another leading African-American public intellectual, Dr. West has experienced a scholarly decline in the years since becoming famous. His music ventures, including a hip-hop spoken-word album released in 2001, were seen by some as vain efforts to garner attention. In 2002, West held a position at Harvard University, and he was admonished by then-President Lawrence Summers for devoting too much time to his side ventures (Dyson 1). Whether Summers’ criticism of West was merited is difficult to say, but it does show that some people were displeased with West’s evolution as his popularity grew.

Perhaps most damaging of all to Dr. West’s credibility, he has been incredibly vocal about his dislike and disapproval of President Obama. Once a strong supporter of Obama’s presidential campaign, Dr. West has since criticized the president for being a “Rockefeller Republican in Blackface,” a “neoliberal opportunist,” a non-progressive, and generally for not being serious about issues of injustice and equality (Dyson 1). West claims that he is unhappy with Obama’s performance as president, but others suggest that he is jealous of the success of the first black president, or simply feeling snubbed by the fact that Obama did not pander to him once he reached office (Dyson 1). Either way, West’s public bashing of Obama has been so severe and lacking in respect that it borders on irrational. Thinking back to some of West’s earlier theories from Race Matters, he has long disliked and criticized black leadership, so it makes sense that he would criticize Obama. Still, his failure to respect the first black president is damaging to his image.

In addition to attacking Obama, West has also criticized MSNBC’s Melissa Harris-Perry, as well as African-American activist giants like Jesse Jackson and Al Sharpton. Dyson shows how West often invokes the same argument in criticizing prominent black figures– suggesting that they are puppets for the white patriarchal system. In the case of Sharpton, for instance, West labeled him the “head house negro on the Obama plantation” (Dyson 1). Such attacks attempt to diminish the authenticity of other black public intellectuals, and simultaneously seek to uplift West, who in comparison can present himself as the only true fighter for the black cause. West’s attacks on others have led to deepening ideological rifts with his contemporaries, including Dyson, who was once West’s mentee and longtime friend. West and Dyson had a falling out in 2012, when West launched his familiar attack against Dyson, claiming that he was a pawn of the Obama administration (Dyson 1).

This is all to say that Dr. West struggles as a public figure because, at the end of the day, he has an unquenchable thirst to be recognized and appreciated. All of Dr. West’s public actions in recent years have been self-serving; he has even torn others down in order to make himself look more like the black prophet and revolutionary that he imagines himself to be. An article in New York Magazine put it best, saying that West’s “greatest flaw” is “his hunger for adulation” (Miller 1). This egotistical desire for recognition underlies his many disputes, bouts of self-righteousness, and unsubstantiated attacks on others.

West would serve himself well to heed the words of blogger and author Stephen Mack. In his blog post “The ‘Decline’ of the Public Intellectual (?),” Mack states that “…we need to be more concerned with the work public intellectuals must do, irrespective of who happens to be doing it.” The public should be able to talk less about Dr. West’s egotisms and more about his ideas. West misunderstands his function in society as a public intellectual. His desire to be famed and appreciated is not part of his duty. It is the public intellectual’s job to ensure that the audience is hearing things worth talking about—and that does not include personally motivated attacks against the President, or some future album of Cornel West reggae music. West would do well to divorce his public intellectual work from his ego.

Dr. West can, and has, put forth insightful and intellectual contributions to the public discourse on race. He has the platform he always wanted; the key is how he uses it.


Works Cited

“About Dr. Cornel West.” Dr. Cornel West – Official Website. http://www.cornelwest.com/bio.html#.Vf47-lpCZlJ

“Cornel West.” Encyclopædia Britannica. Encyclopædia Britannica Online. Encyclopædia Britannica Inc., 2015. http://www.britannica.com/biography/Cornel-West

“Cornel West Biography.” Dartmouth.edu. Dartmouth University. http://www.dartmouth.edu/~dof/pdfs/Cornel_West_bio.pdf

Dyson, Michael Eric. “The Ghost of Cornel West.” The New Republic, 19 Apr. 2015. http://www.newrepublic.com/article/121550/cornel-wests-rise-fall-our-most-exciting-black-scholar-ghost

Hayes III, Floyd W. “Review: Cornel West on Social Justice.” The Journal of African American History, Vol. 89, No. 1 (Winter 2004), pp. 75-79. http://www.jstor.org.libproxy1.usc.edu/stable/4134047

Mack, Stephen. “The ‘Decline’ of the Public Intellectual (?).” The New Democratic Review, 30 Aug. 2015. http://www.stephenmack.com/blog/archives/2015/08/the_decline_of_11.html#more

Miller, Lisa. “Why Cornel West Can’t Seem to Find Love and Justice in His Own Life.” NYMag.com. New York Media LLC, 06 May 2012. http://nymag.com/news/features/cornel-west-2012-5/

Malkani, Sara. “Review: Democracy Matters: Winning the Fight Against Imperialism by Cornel West.” Pakistan Horizon, Vol. 58, No. 3 (July 2005), pp. 117-120. http://www.jstor.org.libproxy1.usc.edu/stable/41394106

West, Cornel. Race Matters. Boston: Beacon Press, 1993.