December birthdays must not be forgotten

No, you cannot just buy one gift.

Like the fruitcake of celebrations, December birthdays get thrown into the mess of holiday festivities without a second thought. All of a sudden, after waiting for 12 long months, December birthdays pass with nothing more to commemorate them than a distinctly Christmas-y cake and a couple of presents wrapped in Santa’s face.

For the rest of us, our birthday is the focal point of the month. Days are taken off work, money is saved, invitations sent out, cakes baked, calendars marked and, most importantly, we still get a haul of presents and a plethora of partying when Christmas rolls around.

But when your birthday happens to be just as the year is about to come to a close, no one is too concerned with you or your less festive, sort of inconvenient celebration — especially if it means taking a break from frosting sugar cookies and donning ugly Christmas sweaters.

I have noticed December birthdays are almost viewed as selfish. Really, the whole idea of a party celebrating nothing more than our own glorious existence is a little self-centered, but I think we have developed this weirdly negative feeling about people who try to intercept a little bit of our holiday cheer for themselves.

I am ashamed to say that when someone asks me — very politely, I might add — if I am free to celebrate their birthday sometime in December, my mind immediately cringes at the thought of turning my Christmas charisma off to focus on their birthday. In addition, I am exasperated at the thought of having to find yet another present and, honestly, just kind of tired of eating cake.

But that is not the fault of the poor souls with a December birthday. By pure luck I was born in July, the perfect birth month. I am right in the middle of the year, everyone is free (unless they’re on a trip), there’s nothing else going on in that month (besides the Fourth of July, but c’mon) and everyone has a summer job conveniently lining their pockets with some birthday-present-buying dough.

However, my dear mother was born on Dec. 4 and, for every year I’ve been alive, I have watched her get the shaft on her birthday. Not only does she get overshadowed by both Thanksgiving and Christmas — being perfectly sandwiched between the two — but she also gets completely forgotten. Everyone is so caught up in Black Friday shopping, turkey stuffing, Christmas card sending, peppermint bark making, hall decking and tree trimming that Dec. 4 comes and goes with hardly a half dozen “Happy Birthday” posts on Facebook.

Now, never once has she complained. In fact, she is often the one suggesting we just mash her birthday and Christmas presents together; however, I attribute that solely to her wonderful parenting. She wouldn’t want my brother, sister and me to have to hunt down presents for her twice in one short season. Still, our hearts tighten when we think about how one of our favorite people doesn’t get celebrated properly.

There is a tweet circulating that says “Christmas shopping is hard when your mom deserves an island and you can’t afford a candle.” Well, it’s even harder when your mom deserves a whole kingdom, but the holiday season has already robbed you blind. Some years, my siblings and I have been smart (or just lucky) and can get her presents that are almost grand enough, but no matter what, we make sure to throw her a party. We do this because, honestly, it sucks having a birthday at the same time as Jesus’ birthday, but it sucks a little bit less if someone can at least scrounge up some reindeer-less decor and light a couple of candles while singing “Happy Birthday.”

So, for the sake of those with a December birthday, make sure their day isn’t accidentally forgotten between the Christmas carols and Hallmark movie marathons. Perhaps someone could write an article about December birthdays just for the chance to say “Happy Birthday, Momma!” Just an idea, though.

Colleges offer opportunities to grow, students fail to take them

Fear not. While I know many will say college does not prepare one for the job market, I beg to differ. The point of all of our work is to develop what the Washington Post’s Jeffrey J. Selingo calls “the best skill (a student) can learn in college,” which “is actually the ability to learn.”

If we start by thinking about our first class of the day — perhaps it is an 8 a.m. lecture or a 10:30 a.m. lab — we will find that, for classes with full rosters, there sure are a lot of empty seats. Brown University’s Daily Herald reports 26.2 percent of students at that university skip class at least once or twice a month.

Now, skipping class does not seem like it would do much harm, and academically, it often does not. We can easily get the notes from our friends and maybe even study a little harder for the test, but the developmental detriments of skipping class are often ignored.

By skipping class, however, we are essentially prioritizing something over our thousand-dollar education. Most of the time we claim to be putting sleep above school, which is, no doubt, a good choice for one’s health.

However, where we often go wrong is when we decide to watch one more episode of Grey’s Anatomy instead of going to bed. The next morning, when we sleep through class, the priority is not sleep, nor school, but what both were compromised for — another 30 or 45 minutes on Netflix.

Being a chronic binge watcher of “Psych,” I am guilty of this 10 times over. However, I have realized as of late that the negative effects of the choices I make might not come with immediate, lesson-teaching consequences. Therefore, unless I spend some time seriously reflecting, I might not be learning all I need to from my mistakes.

By not doing that, I am missing out on the chance to grow from being the wide-eyed high schooler I was when I first arrived on campus. In his article, Selingo referenced a study of 32,000 students at 169 colleges and universities that found “40 percent of college seniors fail to graduate with the complex reasoning skills needed in today’s workplace.”

We are graduating having completed the learning requirements for our degree, but missing the less tangible skills that make one a good employee.

Another study, referenced in a USA Today article by Melanie Dostis alarmingly titled “Degree alone not enough to prepare grads for workforce,” reported “students (are) lacking skills in areas such as organization, leadership and personal finance, as well as street smarts.”

I do not think these things could be taught effectively, though. Even if the education system were completely restructured, which many have suggested as a solution, some skills cannot be taught.

Skills like these are ones we learn through our many trials and tribulations. We pick them up along the way and implement them in our lives accordingly. But what has happened is we have stopped not only learning them, but even noticing them at all.

Instead, we focus on keeping our GPA up and hunting down as many internships as our carefully constructed résumé can handle. Our tunnel vision on success has taken over, and as we focus on having the right numbers, answers and experiences, we often pass on opportunities that promote self-development.

So, when one says our college degree is not enough, I understand the concern, but a piece of paper never should be enough to qualify us for something as dynamic as a career. Especially because in that career we will not just be employees, we will be asked to serve as confidants, teammates, problem solvers, mediators, inventors, conversationalists, debaters and leaders.

If we truly expect a degree to teach us more than information and skills for a type of trade, then that is where the problem lies. No one ever said a college degree would guarantee a successful career, because that is determined not just by knowledge, but also by personality.

For our own future, we must choose, every single day, to equip ourselves with the skills that will make us hirable, successful people. Fifty years from now, when we are all retired, it is not going to matter if we could write up a spreadsheet faster than the girl two cubicles away. Instead, it is going to matter if we have learned as much as possible about ourselves and truly grown in those 50 years.

Recent GOP debate shows poor state of U.S. politics

“The fault, dear Brutus, is not in our stars, but in ourselves...” Those famous words pulled from Shakespeare’s “Julius Caesar” might be a little more relevant today than we realize. Within the last century, we have become quick to point fingers at our politicians but seem to forget we are the ones who gave them this supposedly misused power.

As the architects of our own government, we are entrusted with the task of selecting who we use as the building blocks of our political tower. Therefore, if the tower falls, the fault is our own, right?

However, no one likes to be responsible when things go wrong. When our government goes awry and blame begins to be placed, a defensive battle often breaks out between the people and politicians, neither side wanting to be left with the smoking gun.

As a result, we have seen a great animosity rise toward our government — an animosity that has festered into a sort of antipathy on both sides, making it hard to empathize with each other.

This is a problem that could eventually lead to our downfall because, as President Abraham Lincoln warned, “a house divided against itself cannot stand.” So, perhaps the solutions to our national woes are not as complicated as we are making them.

We may not need a new tax law, but we absolutely need respect for the families surrendering their income to pay taxes. We may not need a smaller federal government, but we absolutely need one that holds a desire to serve its people above all else.

Somewhere along the way, politics stopped being about these relationships and started being about reputations. This was especially evident in last week’s GOP debate. It was made very clear that few are concerned with having productive conversations focused on growing in understanding of one another. Instead, there seems to be an insatiable desire to mar the reputations of those around us.

CNBC’s John Harwood, for example, unashamedly invited Mike Huckabee during the debate to bash one of his fellow Republican candidates, Donald Trump, asking, “When you look at (Trump), do you see someone with the moral authority to unite the country?”

Now, I am not sure when it was decided Huckabee would serve as CNBC’s morality consultant for the evening; however, I don’t think the candidates were the ones who needed to be morally evaluated.

By the end of the night, CNBC’s moderators had done quite the job of exemplifying the strife between politicians and the public. As outsiders to the political process, I fear some citizens may have forgotten that the men and women who volunteer to lead our country are, in fact, still men and women.

Even though Huckabee handled CNBC’s malicious question with grace, that doesn’t mean he, or any of the other candidates, will handle such questions that way all of the time. And when they don’t, we should not be penalizing them, but instead take comfort in their humanity. We want people leading our country who recognize the fallibility of human nature.

But, if we are going to expect them to recognize that, we should do the same in return. We must be quick to forgive, slow to forget, but steady in support of the men and women who put their necks out for us.

Yet, we still see so many striving to tear down our politicians and our government. By doing this, they do not realize they are compromising the integrity of the political tower they’ve helped build.

Instead of extending politicians the respect we expect to be treated with ourselves, the whole political process is too often made a mockery of by those whom it was created to serve. In fact, this is one reason our voter turnout rates are plummeting.

Any effort to take the election process seriously is undermined by those who cast an uninformed vote, focus on the personality of our candidates instead of their political prowess or antagonize the government they expect to protect them. Because of this, voting is becoming a terrible exercise in futility.

The Washington Post reported that only “36.4 percent of the voting-eligible population cast ballots” in the 2014 midterm election. This is because those who take elections seriously feel discouraged by the lack of reverence for our political process and those who do not take elections seriously are comfortable in their lack of involvement.

The author of this Washington Post article, Jose A. DelReal, added, “the last time voter turnout was so low during a midterm cycle was in 1942, when only 33.9 percent of eligible voters cast ballots.” That means our voter turnout rate this past year was only slightly higher than when the United States was preoccupied with World War II.

While I understand that right now we are similarly distracted with overseas fighting, we also have exponentially more information, resources and opportunities than we did in 1942 to educate ourselves on the activities of the federal government and the specifics of our elections.

Therefore, it is almost inexcusable for our voter turnout rates to be so lacking. What is especially troubling is the fact that the percentage of college-age students (ages 18 to 29) voting in presidential elections is declining, as shown in a diagram created by The Washington Post’s Kennedy Elliott and Scott Clement last October.

It is not like we are without opinions, though. Several University of Texas at Austin students, for example, have begun protesting the new campus carry law, or Senate Bill 11, which many feel extremely opposed to. However, this is a bill passed by the very legislature these students (and all of us) voted into office.

On top of that, some most likely refused to vote altogether, thus neglecting their civic duty and essentially forfeiting their right to criticize our senators and the decisions they make. We often wait too long to make choices that reflect our opinions, putting ourselves in situations like this. The time to speak up is not when the bill is already law, but long before it reaches that point.

So, as voting season comes into full swing, let’s try to make our voices heard not by bickering about those we have already voted into power, but instead by quietly standing up for the things we believe in by electing those who can protest for us.

By doing so, we will take a huge step toward rebuilding the reciprocal and respectful relationship the Founding Fathers intended us to have with our government. It is only when we are no longer fighting against the government, but instead fighting alongside it, that we will see the revival and political rebalancing of this great nation.

Students must fight trend of self-induced insomnia

Running on one hour of sleep and a few V8 energy booster drinks, I recently embarked on a weekend trip to Floydada with my friends. Like most college students, functioning on minimal amounts of sleep has become second nature. In fact, for the first few hours of the trip, my exhaustion was suppressed by copious amounts of caffeine and the fact that my body had not yet realized its one hour of sleep was actually not a nap, but all the rest it was going to get for the day.

However, after tiptoeing up a precarious stack of boulders to a waterfall, weaving our way up the side of a cliff and hiking up a canyon wall adorned with cacti, my eyes began to close themselves for me. Realizing I had thoroughly fatigued myself with all of that adventuring, I resigned myself to the back seat of my friend’s car for a nap.

We college students often do this same thing to ourselves throughout the week, and frequently do not get enough sleep for everything we try to do. For example, after pulling an all-nighter studying, probably for that one test that makes us want to cry for a while and consider dropping out (this happens to everyone, right?), we do much more than our body has the energy for. We go to class, study for quizzes, put in some hours at the gym, yawn through our shift at work, clean ourselves up for a mixer at some bar, sleep for maybe four or five hours and then start all over again the next day. College in a nutshell, right?

Eventually, however, this catches up with us. It catches up with us when we call our mom sobbing about how the world is officially ending because we forgot to submit something major on Blackboard. It catches up with us when we groggily silence the alarm that was supposed to wake us in time to take that pass/fail determining midterm, leaving us to rush to class, heart pounding and vision blurred as we throw on real pants and yank our backpack over our shoulders. It catches up with us when we realize that even though we just stayed up studying for 12 hours straight, we still do not remember enough about the classical dynamic of spinning tops to answer that one physics free response question.

This is why we need sleep. Even though it seems like investing as much of our time as possible in being awake and “productive” is the admirable thing to do, studies show our productivity is greatly lessened in more ways than one by depriving our bodies and brains of sleep.

A U.S. News article titled “College Students Not Getting Enough Z’s” quotes doctoral health student and author Adam Knowlden, saying sleep is “key for memory consolidation,” which is probably why we have trouble remembering information when we sacrifice sleep to stay up. In addition, he explains how “during sleep, the brain acts like a hard drive on a computer. It goes in and cleans up memories and makes connections stronger and it gets rid of things it doesn’t need.” If we are not letting our brains complete or, in some cases, even start that process, then we are seriously compromising our memory’s integrity.

Knowlden goes one step further to add, “if a student is sleep deprived, it affects the whole (learning) process. Students aren’t able to learn, they’re not able to remember, it’s harder to concentrate and it affects mood. They’re working their way through college and not maximizing their learning potential.” If this is the case, our attempts to work harder by sleeping less might actually be undermining our goal of reaching our full academic potential.

In an article by The Huffington Post’s Tyler Kingkade titled “College Students Aren’t Getting Enough Sleep,” Kingkade quotes Shelly D. Hershner, a professor of neurology at the University of Michigan, who says, “a lot of students realize they are sleepy, but I don’t think they understand the ramifications.” The consequences of playing fast and loose with our sleep are heavy not only in the immediate future, but also years and years down the road when we have almost forgotten what an exhaustion headache feels like.

For example, health journalist Carrie Arnold reports in a Prevention Magazine article that regular sleep deprivation could “‘speed up the development of Alzheimer’s Disease,’ according to a study published in the Neurobiology of Aging.” So, while remembering some abstract theory for one of our many tests may seem important now, it just won’t seem quite as important when we’re 75 and can’t remember the name of the college we attended, much less the name of our husband or wife.

Roxanne Prichard, assistant professor of psychology at the University of St. Thomas in Minnesota, is quoted by ABC News in an article titled, “Stressed Out College Students Losing Sleep,” saying college students, “forgo sleep, not realizing that they are sabotaging their physical and mental health.” She goes on to elaborate on potential health risks from lack of sleep, which range from “problems with a person’s immune and cardiovascular system” to substantial weight gain.

Therefore, the next time we elect to burn the midnight oil instead of counting sheep, we college students should remember that when we sacrifice our sleep, we are sacrificing so much more than a few hours of rest.

Listeria outbreak is no joke, society proves too forgiving

There are many dangerous things in the world, but ice cream should not be one of them. When my mom nestles a creamy scoop of vanilla Blue Bell ice cream next to a slice of my birthday cake, there should be little worry involved. Yet, when our trusted friends at Blue Bell Creameries recalled their products in the wake of a listeria outbreak, we started to think twice before consuming their frozen confections.

This past April, Lindsey Wise of The Seattle Times wrote, “Listeria monocytogenes is a germ that can contaminate food and cause a deadly infection, listeriosis, that’s characterized by high fever, severe headache, stiffness, nausea, abdominal pain and diarrhea.”

Similar to the flu, these symptoms really don’t seem too unusual. However, Wise goes on to say, “Listeriosis has been known to cause miscarriage or stillbirth, a particular concern for expectant mothers craving ice cream.”

Imagine an expectant mother, who already has a million things in the world she is supposed to avoid, thinking one little Blue Bell fudge pop probably wouldn’t do any harm. This should be a completely reasonable thought. After all, her baby’s life should not be at risk simply because she wanted some ice cream.

It should be obvious that a food manufacturer’s top priority is to make food that won’t kill us. It is more important than making the food taste good. It is more important than the food looking nice. All we ask is these food manufacturers make food that does not kill us.

This is why the listeria outbreak in Blue Bell ice cream is not so much an issue that needs to be resolved by forcing us to fast from Blue Bell for a little while, but an issue that needs to be resolved with some serious trust rebuilding.

Please do not get me wrong. I absolutely love Blue Bell ice cream. “Is it Blue Bell?” my dad would ask every time I offered to scoop some ice cream up for us. And if I tried to pass off a different vanilla ice cream as Blue Bell, my dad could tell. Have you ever seen someone reject ice cream? It is a sad sight.

With that said, it is quite clear Blue Bell has a serious fan base — a fact the company is very aware of. They know we serve their little mini ice cream cups at birthday parties, share a box of their ice cream sandwiches on hot summer days and construct our ice cream sundaes around their huge tubs of frozen heaven.

Therefore, when the report from their U.S. Food and Drug Administration inspection last April started by noting Blue Bell’s “failure to perform microbial testing where necessary to identify possible food contamination,” I was shocked.

They weren’t testing the “food contact surfaces” for bacteria, they didn’t know they had been harboring three deadly strains of it and they sure as heck didn’t know their lack of information was about to cost people their lives. We trust them to make a quality product, and yet they didn’t seem to be taking that seriously.

On top of that, Blue Bell knew listeria is a common bacterial problem in ice cream because it thrives in cold, damp environments and on chilly, wet products. As Alexandra Sifferlin says in her Time Magazine article titled, “How Ice Cream Gets Contaminated — and Sometimes Kills,” there have been several recent cases of listeria contaminating other chilled dairy products, all resulting in at least one death.

So it is probable Blue Bell was very aware listeria is a real and deadly problem for manufacturers of dairy products. Yet, for such a well-established company, it did not seem to be taking appropriate precautions to prevent a listeria outbreak.

In fact, the FDA even confirmed this when, in Blue Bell’s inspection report, it noted, “all reasonable precautions (were) not taken to ensure that production procedures do not contribute contamination from any source.”

Blue Bell was not even taking reasonable precautions to prevent a bacterial outbreak. The company we invite into our homes, our celebrations and our lives was not even doing the minimum of what is required to keep us safe. The report also goes on to add, “employees did not wash and sanitize hands thoroughly in an adequate hand-washing facility,” which is a basic requirement for any food handling operation. I can imagine that complaint on one of McDonald’s reports, but not in a report from our precious Blue Bell.

If you are wondering where the listeria originated, Sifferlin quotes a food safety lawyer in her aforementioned article who says, “Likely what happened is the piece of machinery (that processed the ice cream) was contaminated. The liquid form of the ice cream goes through the machine when it’s not yet frozen, but around 40 degrees, and it’s a great place for (listeria) to grow.” So, it makes a lot of sense that the outbreak could have originated on the machines, especially when the FDA described some of Blue Bell Creameries’ equipment as “rusty” and with “peeling/flaking paint.”

So, even though my stomach growls every time I walk past our empty coolers, I am just not sure I trust Blue Bell the way I used to, and I am not sure how long it will be until I trust them again.

They have been working hurriedly to return to our lives, which makes me wonder if they have cut even more corners to get their product back on the shelves as soon as possible. All of this is troubling because this whole issue was never really about the listeria. It was about the fact that Blue Bell was selling us a product that put our lives at risk, and they didn’t really care enough to stop until they got caught.

Women are not more dramatic about pain, should get same hospital attention as men

“Shin splints,” the physician told me. “It’s probably just shin splints,” he said, dismissing my atrophied leg with the catch-all diagnosis. He then prescribed a couple of Advil as he signed a doctor’s note to excuse me from track practice. Staring back at my dad and me, he said the X-ray didn’t show anything unusual and that I should be fine within a week.

I hung my head. By now we had spent a small fortune on what seemed like nothing more than my own wimpy whining. Almost all runners get shin splints — I was just the one who couldn’t handle them.

Despite the doctor’s diagnosis, something still seemed amiss. Shin splints do not cause one’s leg to shrink two sizes. They tend to affect both legs, not just one. They usually result from overworked muscles that become swollen and irritated, a weak core, “flat feet” or minor stress fractures on little bones, according to WebMD. Most importantly, however, they simply do not hurt like my leg did.

I still remember the nights when I would be doing homework and all of a sudden my shin felt like it was slowly being broken in half. Writhing on the floor, I would grasp for my leg, praying that maybe I could keep it from snapping. One night, it was so excruciating that all I could do was clutch my mom, quiet tears dampening her shirt.

Once I leveled back out to the pain I had grown used to, I felt silly and like maybe I just wasn’t being tough enough. This feeling would follow me all the way to my doctor’s appointments where I would stumble through my sentences, trying to explain how my shin felt during those upsetting episodes.

The various sports doctors I saw offered different explanations, but ultimately fell back on the belief that it was really nothing some time on the bench and light painkillers couldn’t fix.

Finally, toward the end of the season, my physician found what had been causing so much trouble in my leg. Lacing up and down my tibia was a web of fractures. They were thin enough to be hard to see in X-rays, extensive enough to look almost normal. I guess when one’s entire bone is covered in little lines, it seems like they’re supposed to be there.

Immediately, they put me in a support boot, which I had to wear for at least four months. If it wasn’t healed by then, they said they would have to find another way to mend the bone. The bone had been slowly getting worse as I hobbled along with a diagnosis of basic “shin splints” for the weeks prior.

Let me assure you, I am not the wimp I thought I was. I am the kid who laughed her way through a staph infection, the right side of my face swelling larger and larger each day. I am the kid who pierced her ears with nothing more than a cube of ice and a needle. When I was in track freshman year, I got my foot caught on a hurdle, did a full 360 and then tore up my whole right hip, leaving a six-inch long scar.

So, with that said, I am concerned the doctor’s dismissive diagnosis of “shin splints” might have been based on the fact that, as The Atlantic’s Joe Fassler quotes in his article, “How Doctors Take Women’s Pain Less Seriously,” “women cry — what can you do?”

While women might cry when Allie and Noah spend their last night together in “The Notebook” or when our brand new iPhone shatters (I know I did), we don’t cry over broken nails and spilled milk and stupid cuts and scrapes. We are tougher than that.

If one wants a really good example of this, I recommend reading Fassler’s aforementioned article about his wife, Rachel, and her experience in the emergency room. He starts by saying, “Rachel’s not the type to sound the alarm over every pinch or twinge.” However, when he watched her “collapse on [their] bed, her hands grasping and ungrasping like an infant’s, [he] called the ambulance then helped [her] to the bathroom to vomit.”

Once at the hospital, they got her on a gurney and waited. Fassler says, “Rachel was nearly crucified with pain, her arms gripping the metal rails blanched-knuckle tight.” When he finally found a nurse, she assured him Rachel would just “have to wait her turn,” then said “You’re just feeling a little pain, honey,” to Rachel’s poor face as it twisted with pain.

Several agonizing hours later, she was dismissed as just another patient with kidney stones and cast into the hospital’s periphery. Little did they know her real ailment was something called ovarian torsion. Fassler explains that she had an ovarian cyst which “grew, undetected, until it was so large that it finally weighed her ovary down, twisting the fallopian tube like you’d wring out a sponge.” He goes on to say, “it creates the kind of organ-failure pain few people experience and live to tell about.”

Finally, “14 and a half hours from when her pain had started,” Rachel was properly diagnosed and prepped for an emergency surgery. However, to me, that seems about 13 hours too late.

Fassler references the essay “The Girl Who Cried Pain” in his article, which explains something called the “Yentl Syndrome,” the phenomenon in which “women are more likely to be treated less aggressively in their initial encounters with the health care system until they prove that they are as sick as male patients.”

In addition, he writes, “nationwide, men wait an average of 49 minutes before receiving an analgesic for acute abdominal pain [while] women wait an average of 65 minutes for the same thing.” Overall, when a woman speaks up about not feeling well, the professional world doesn’t quite seem to believe her.

The unfortunate thing is, I see this all the time. When one of my childhood friends began feeling excruciating pain in her lower abdomen accompanied with frequent seizures, the doctors thought it was an attention-seeking, psychologically-induced problem, which makes sense, because everyone wants to have seizures, drop out of school and be bedridden. And don’t even get me started on the years my mom spent trying to get a doctor to acknowledge her now-diagnosed Crohn’s Disease.

This dismissal of female pain as lesser or lighter than males’ is silently growing into a huge problem. Women everywhere are biting their lips and clenching their fists to get through pain they shouldn’t be ashamed to mention. We should not be perpetuating the idea that women are wimpier, because it is an idea so ludicrous, it never should have existed at all.

Online anonymity promotes violence

If you send me a text message, I will probably not text you back. If I do, it will most likely be hours after you texted me, because I am really not on my phone enough to see your text. Lately, I have been trying to avoid using texting as my primary means of communication.

This is because there just seems to be a disconnect between virtual presence and the actual presence of the people we text, and, in addition, a disconnect between the way we portray ourselves and how we really are — the latter being more dangerous. I, for one, tend to be bolder when I don’t have to take immediate responsibility for my words. The distance technology allows two people to have between them cushions us from the instantaneous repercussions of our words.

For example, you can’t get slapped through the phone after firing off an insult or get a black eye from the little fist emoji your brother sent when you forgot to pick him up from karate practice. While I am sure we are all glad to postpone any and all condemnation, I am worried this might be promoting a significant lack of personal responsibility and accountability in the way we communicate with each other. When we can shirk all responsibility simply by pressing the off button on our phones, we tend to say things we probably would not be as quick to say in person.

This can become a potentially life-threatening practice when applied to the alarmingly anonymous Internet. If we look back about two weeks, we see a horribly real example of this with the mass shooting in Oregon.

A shooter walked onto the campus of Umpqua Community College and massacred nine students, all of whom were Christians — a group he spewed hate toward online just beforehand. In response to this, Dale Eisinger of The Daily Beast published an article titled “Lone Wolves in the Age of the Internet.” In his article, Eisinger tells us this is “a pattern becoming tragically more common: a mass shooting takes place and we later discover how blatantly the perpetrators expressed hate for their victims online.”

He substantiates this observation with results from a study conducted by Jason Chan, an assistant professor at the University of Michigan, who explains there is “a positive relationship between broadband Internet access and incidence of hate crimes.” Going further, Chan says “between 2001 and 2008, access to just one broadband ISP showed a 20 percent rise in hate crimes, particularly in areas of high racial tension.”

Since the domestication of the Internet began to pick up speed in the early 2000s, we have become progressively more exposed to Internet-induced hate and its wrath. We have fallen victim to the lack of accountability technology encourages because it puts a screen between our person and the rest of the world. Now we are suffering the consequences of its many responsibility-reducing mediums of communication.

Therefore, now that we can throw stones and point fingers at those on the opposite side of the world from us, people are becoming increasingly loose with their speech. Behind a blue screen, we say things we would not say in person.

Arthur Santana, a communications professor at the University of Houston, conducted an extensive study on how the Internet’s anonymity affects the content of comments in response to online news articles. The results were reported by The New Yorker’s Maria Konnikova, who says Santana found “a full 53 percent of anonymous commenters were uncivil, as opposed to 29 percent of registered, non-anonymous commenters. Anonymity, Santana concluded, encourage(s) incivility.”

On the other hand, if someone were to even mention the word “bomb” in an airport, they would be pulled aside and fully screened. This is because we can not only see, but also fully identify, the person who might have just issued a very serious threat. A bomb threat on the Internet, however, is usually entirely anonymous, leaving us essentially afraid of a ghost. The Internet lacks the immediacy of the real world and therefore allows for terrorizing comments to be made, and there is little we can do about it.

The most upsetting thing is many Americans want to do something about it. We do not want our safety to be so greatly at risk. In a Gallup article by Frank Newport titled, “American Views of TSA More Positive Than Negative,” it is reported that “despite negative press, a majority of Americans, 54 percent, think the U.S. Transportation Security Administration (TSA) is doing either an excellent or a good job of handling security screening at airports.”

So even though we have to hunt down mini shampoo bottles and completely remove our shoes only to put them back on five seconds later, Americans are still pleased with the TSA. This is probably because its hyper-vigilance almost always successfully protects us, which is all we want at the end of the day.

If that is the case, we need to stop allowing the dangerous anonymity that modern technology has fostered. Even if we are not the ones creating a threatening online environment, we have a responsibility to serve as watchdogs, protecting each other from faceless, nameless threats. This is not a limit on freedom of speech, but a security measure that will help make our schools, movie theaters and churches safer from those who abuse their right to unrestricted rhetoric.

At the end of the day, we are accountable for every word we speak, write, text or post, and unfortunately we might just be responsible for the words of others too. It is time we started acting accordingly because, as we saw in Oregon, our ability to take responsibility could be taken away at any time.

"Manic Pixie Dream Girl" cliché misleading to viewers

Hi, my name is Avery Aiken and I am a recovering Manic Pixie Dream Girl.

Coined by Nathan Rabin of The A.V. Club in 2007, a “Manic Pixie Dream Girl,” or MPDG, has been defined by the Oxford Dictionaries as “a type of female character depicted as vivaciously and appealingly quirky whose main purpose within the narrative is to inspire a greater appreciation of life in a male protagonist.”

The idea of this character trope, after spreading like wildfire through the Tumblr-esque realms of the online world, has not only sparked an abundance of somewhat ridiculous debate regarding sexism, but also inspired books like John Green’s “Paper Towns.”

However, its reach did not end there. I began taking a good hard look at television shows, movies, plays and books and found that this trope continuously cushions our fairytale-inclined hearts from the real world.

In an article from The Atlantic published last March, writer Sophie Gilbert points out that we have essentially grown up in a world filled with these MPDGs, from Maria von Trapp in “The Sound of Music” and Holly Golightly from “Breakfast at Tiffany’s” to the brilliant and resilient Belle from “Beauty and the Beast.”

These characters have slowly taught girls how to, as Gilbert writes, be an “agent of quirky change” and to “conquer cruelty with kindness, and embody (a) spirit of gung-ho certitude and optimism.”

With that said, I don’t really see how being deemed an MPDG could be a bad thing (except for the actual title — Manic Pixie Dream Girl — which seems a little absurd). Especially if, from this archetype, we are learning to sing about our favorite things, read books like there’s no tomorrow (or just read the same book over and over again — looking at you, Belle) and kiss the dreamy boy next door in the pouring rain.

Besides a potential case of pneumonia, there does not seem to be too big of an issue here.

However, the other side of this character, as noted in The Atlantic’s article, includes the role of being a “bearer of quirky fun and madcap outings and ultimate lifelong happiness once emotional walls have been dismantled brick by brick.”

To me, that seems like quite a large request to place upon a bright, slightly distracted young girl. The MPDG is taught that she must bring all of these things to a relationship and all she is given in return is the responsibility of her counterpart’s happiness. She is ultimately expected to maintain a consistently positive outlook on life through nearly every trial and tribulation because if her spirit falls, then so does her partner’s.

We all know, however, that it is impossible for someone to remain in a state of life-embracing bliss forever, so the almost inescapable end to a real-life MPDG relationship would only be disaster, especially if the person she invests her life in has grown dangerously used to her carrying the relationship along.

In another article by The Atlantic (they seem to enjoy editorializing on this topic) titled, “The Real-World Consequences of the Manic Pixie Dream Girl,” Hugo Schwyzer shares a piercingly personal account of his experience of being in love with an MPDG.

His captivating story of the “dark-haired and impulsive” Bettina and their skinny dipping, mildly anarchist adventures in Austria, told with the 1980s flair that only a true follower of the Sex Pistols could achieve, is probably most powerful because of its ending.

Schwyzer closes his story by leaving us with the chilling news that the only reason he lost touch with Bettina was that she no longer existed for him to be in touch with. At the devastatingly young age of 20, Bettina had taken her own life and escaped this world and all of her MPDG responsibilities.

Now, after having years to reflect, Schwyzer explains that “as much as [his] adolescent self thought it adored her, he thought less about her and more about the way she made [him] feel.” Before he even dives into his story, Schwyzer claims he was “utterly infatuated” by her.

The point must be made that the two are very different. To be in love with someone is not the same as being consumed with fascination by him or her. The second one stops being fascinating, the desire to shower that person with attention vanishes.

But poor Schwyzer could not help it. Being an MPDG was what Bettina strove to achieve, just like many other girls today. She wanted to give him an adventure and not a love story. This was made clear in Schwyzer’s story, such as when he asks if he is her boyfriend and she responds with “a short but impassioned speech about how monogamy [is] the enemy of true love.” If that is the case, then I guess marriage is useless.

Regardless, Schwyzer goes on to add that “as unstable as she may be, the MPDG not only senses a young man’s potential in a way he can’t, she intuitively knows how to lead him to his destiny.” That alone is a pretty remarkable gift, but if only it stopped there.

“She [also] knows him better than he knows himself, or so he believes. That convenient assumption allows the young man both to adore the MPDG and to avoid any responsibility for reciprocity,” he continues.

This is where the problem lies. The Manic Pixie Dream Girl is entirely too self-sufficient. When she hits a bump along the way, she must retreat back into herself in order to overcome that obstacle, often leaving the relationship to crumble.

Therefore, we must make sure our relationships are two-sided, authentic and not an infatuation. One half of the partnership cannot merely exist to be admired and adventurous because, if we are honest, our actions are not always exciting and admirable. When they are not, we must have a support system that understands our fallibility instead of becoming disillusioned.

Leave dream girls to exist in dreams and pixies to exist in fairytales. Let us avoid anything that is “manic” because the word makes me uncomfortable, and instead focus on creating real, sustainable images for ourselves. In the end, this virtue is far more remarkable.

Tinder damaging to society, changes the way singles date

The clock reads 2:23 a.m. and I realize I have been helping my friend decode text messages since the sun went down.

Now, had this been eight years ago, there would not have even been text messages to decode because, let’s be honest, no one wants to punch the “2” key three times just to get a stupid “c.” However, thanks to our pal Steve Jobs and the creators of mobile applications like Tinder, we can build an entire relationship without even having to speak a word (or triple punch our phone keys).

Since then, we’ve slowly slipped into an age of emoji-to-emoji conversation, almost entirely abandoning the chore of communicating “IRL,” or in real life. This inclination to substitute real life for the more controlled virtual world has brought us into an age of collecting friends on Facebook, curating our Instagram accounts and swiping right on Tinder as frequently as Beyoncé tells us to go “to the left.”

The development of our dependence on these things is hardly our own fault. As “hookup culture” has grown and our lives have become busier and busier, the idea of taking someone on multiple dates seems time-consuming and wasteful.

So, if there is a way to swim around in the dating pool without having to pause Netflix, why on earth would we want to do anything else?

Well, for one reason, studies show that we were never meant to know/see/date all of the single people within a 50-mile radius of us. According to MTV, that gives our little human brains too many options. In an MTV article titled, “The Science Behind Why Tinder is Effing Up Your Love Life,” right above a GIF of “The Big Bang Theory’s” Sheldon Cooper flipping out, it reads, “humans evolved to be addicted to new sexual opportunities, but not this many opportunities.”

Yes, it is exciting when we meet new people, especially when it is people we are attracted to. However, it can quickly become too much of a good thing when apps like Tinder flood us with options just as fast as the streets of Lubbock flood.

The sensation is similar to choosing food from the menu at The Cheesecake Factory or a Chili’s Grill and Bar. There are so many choices that one can hardly decide which subcategory to peruse, let alone what to actually order. And, chances are, whatever we do decide upon will most likely be rather disappointing and leave us wondering if our hour and a half out and $12.62 could have been put to better use.

This is not to say Tinder and other similar dating sites have been ineffective. Most of them have a high success rate (meaning a lot of great relationships come out of their matchmaking efforts) and that really is a good thing, but it is Tinder’s overwhelming success that leaves me a little concerned.

With sites like Match.com and eHarmony, the end goal is to be in a stable relationship. Tinder, however, the cooler, less committal (and probably a little less desperate) cousin of these websites, strives to provide its users with a one-night stand nicely wrapped up in a bow.

In a Vanity Fair article published this month, Nancy Jo Sales (who coined the term “dating apocalypse”) compares this aspect of Tinder to online shopping. We get a picture of what we’re “buying,” a few commonplace details and a location to pick up our purchase (assuming we get messaged back). We have essentially commercialized and devalued dating, and I am not sure the world is ready for that. In fact, I am positive it’s not.

First of all, as much as we want to insist that women like sleeping around just as much as men, there is no denying it is riskier for us. And, let’s be real, I am pretty sure no little girl sits at home and dreams about the day when she can spend her Friday nights answering booty calls.

As David Buss, a professor of psychology at The University of Texas at Austin, points out in the same Vanity Fair article, “when there is a surplus of women, or a perceived surplus of women, the whole mating system tends to shift toward short-term dating. Men don’t have to commit so they pursue a short-term mating strategy... and women are forced to go along with it in order to mate at all.”

So, not only is the abundance of options Tinder supplies a tad overwhelming, but it has also dramatically shifted the way we play the dating game.

With Tinder, every day we can relive our middle school years and play a solid round of “Hot or Not,” but this time with a more serious outcome. The thing is, however, we’re not in middle school anymore. This is college and if we don’t find someone to commit to by the time we’re out of here, chances are, we’ll end up on eHarmony answering hundreds of questions about what we think a perfect date would be.

So, for all of us out there who watch John Hughes movies and long for the days when the only way to get ahold of a girl was to call her landline, brace yourselves. The dating apocalypse has begun.