Category: Opinion

May 26, 2015 George Hyde

This topic is a wee bit technical, but I can guarantee that when most people see something at an unusual frame rate, they can feel the difference.

What is a frame rate, anyway? Well, first we need to know what a frame is, and it’s just that: a single still image of a film, or game, or any other video form. The number of frames that fit into a single second is what people refer to as a frame rate.
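If you want to put a number on that, the arithmetic is just one second divided by the frame rate. Here’s a quick, purely illustrative Python sketch using the frame rates discussed in this piece:

```python
# Purely illustrative: how long a single frame stays on screen
# at the frame rates discussed in this article.
for fps in (24, 30, 48, 60, 144):
    frame_time_ms = 1000 / fps  # one second (1000 ms) split across `fps` frames
    print(f"{fps:>3} fps -> each frame shown for about {frame_time_ms:.1f} ms")
```

At 24 frames per second, each frame lingers for about 42 milliseconds; at 60, only about 17.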

With me so far? I hope so.

Let’s use a standard example. An average movie at a movie theater (say, “Mad Max: Fury Road”, which I’ve reviewed here) runs at 24 frames a second. That’s the standard for films, and it’s been so for decades. But there was one movie that tried to buck that trend, back in 2012. Can you remember what it was?

The knight on the left moves at 30 frames per second, while the knight on the right moves at 60.

That’s right; it was “The Hobbit: An Unexpected Journey,” which ran at many theaters at 48 frames per second. As a result, the film felt a lot smoother. This made a lot of people very uncomfortable; after all, movies have been at half that frame rate for a really long time now, so they weren’t used to it. It felt like a making-of documentary, or something on the Discovery Channel, since many of those shows run at that frame rate.

Video game enthusiasts have it a bit more complicated. Back in the eighties and early nineties, many games ran at 60 frames per second, which was the standard. But now, after the advent of 3D graphics and prettier games like “The Witcher 3” (which I’ve also reviewed here), most modern, big-budget games run at 30 frames per second, which frees up some of the system’s processing power and lets developers make their games prettier and shinier.

And in my humble opinion, it’s time to do away with lower frame rates.

Many will argue that the human eye can’t see past 30 frames per second, and that’s not entirely true. There are some people who can’t see past that, but most people can see well beyond it. Professional baseball players, for instance, perceive motion at a blistering rate far higher than most people can, which they need to play the game efficiently.

Like every other aspect of biology, this is an area where we’re all different. Every eye has a different threshold, a phenomenon scientists call flicker fusion. But I think it’s a fair assessment that when it comes to entertainment viewed on a screen, like a movie or video game, most people can see well past 60 frames a second.

Many TV manufacturers and filmmakers have boasted about technological feats like 3D and 4K resolution that either make pictures more detailed or give them depth. I think we all know by now that 3D is a gimmick, useful only for films that cater to it. And while 4K TVs will dominate in the future, many people can’t tell the difference between 4K and existing resolutions. Increasing the frame rate of movies, however, results in a difference that the human eye will definitely pick up, and it requires nothing more than a camera that can record at that frame rate. No glasses, no expensive television; only the eyes you already possess are necessary to watch and enjoy content at a higher frame rate.

Indeed, I picked up a monitor for my gaming PC last year that’s capable of displaying up to 144 frames a second, and it looks and feels incredible. For movies and most games, that’s excessive. But I will say this.

There’s a reason that “Call of Duty” dominates the gaming landscape today, and that reason could well be that it’s among the few games that always runs at a smooth 60 frames per second, no matter what platform you play it on. To go lower would render the experience so unlike “Call of Duty” that even the most dedicated fans would swear it off.

And while many people were uncomfortable seeing “The Hobbit” at 48 frames per second, I would advise those people to give it a second chance. Audiences have yet to get used to it, but I dearly hope that they do. A film at 48 frames per second contains twice as many frames as a film at the standard 24, and that means more detail and much smoother motion. It is, objectively, a better experience, and while many look toward 3D and 4K as the future of film, I think this is where it’s at.
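To make that doubling concrete, here’s a back-of-the-envelope sketch. The two-hour runtime is an assumption for illustration, not any particular film’s:

```python
# Back-of-the-envelope arithmetic: total frames in a hypothetical two-hour feature.
runtime_seconds = 2 * 60 * 60  # assumed two-hour runtime
for fps in (24, 48):
    print(f"{fps} fps: {fps * runtime_seconds:,} total frames")
```

That’s 172,800 frames at the standard rate versus 345,600 at 48, twice the temporal information for your eyes to work with.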

So the next time you see a movie or game claiming it runs at a higher frame rate, give it a chance. I guarantee you’ll be blown away.

February 11, 2015 John Sallee

As February is here, so are the annual Grammy Awards. It will be the 57th year that the National Academy of Recording Arts and Sciences hands out awards recognizing artistic excellence. Many have dubbed the award show “music’s biggest night.” LL Cool J will host the event, which will make…

January 29, 2015 George Hyde


Larissa Kramer
Freshman
English
“Usually I tune in to the Super Bowl commercials, but the Puppy Bowl isn’t a bad option either.”


Josh Byrd
Senior
Electrical Engineering
“I play in a band called The Quiet Cull. We do practices on Sunday afternoons. So most likely, I’ll be making loud noises and rocking my face off.”


Dakota Lee
Freshman
Computer Science
“Probably working.”

July 28, 2014 Jacob Holley-Kline

With the season finale of Telltale Games’ excellent “The Wolf Among Us” finally out, it’s apparent that episodic video games have once again taken a place in the industry. Telltale Games works almost exclusively with this format.

The developer’s most well-known titles, “The Walking Dead: Season One,” “The Walking Dead: Season Two” and “The Wolf Among Us,” each consist of five episodes released roughly every two months. With this schedule, the developer can build a sustainable engine, apply it to each episode and focus on character development and story.

The result has been staggering. “The Walking Dead” is an emotional gut-punch through and through and the world of Fabletown in “The Wolf” is endlessly fascinating. This is also thanks in part to the excellent source material Telltale has worked with — Robert Kirkman’s comic “The Walking Dead” and Bill Willingham’s comic “Fables.”

Valve attempted this episodic model less successfully with the releases of “Half-Life 2: Episode One” and “Episode Two”; “Episode Three” remains so absent and under wraps that it’s become a bitter joke in the industry.

One of the bigger problems with episodic video games when they first arose was price: after buying all the episodes in a series, the cost might exceed what a gamer would pay for a new full game. With the advent of the “Season Pass” system, wherein players pay something like $24.99 for all current and future episodes, this problem is all but solved.
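As a rough sketch of that math (the a-la-carte price here is a hypothetical figure, not Telltale’s actual pricing; the $24.99 is the season pass figure cited above):

```python
# Hypothetical pricing sketch; the per-episode price is an assumption,
# not Telltale's actual pricing.
EPISODES = 5
PRICE_PER_EPISODE = 5.99   # assumed a-la-carte price per episode
SEASON_PASS = 24.99        # season pass figure cited above

a_la_carte = EPISODES * PRICE_PER_EPISODE
print(f"Bought individually: ${a_la_carte:.2f}")
print(f"Season pass:         ${SEASON_PASS:.2f}")
print(f"Savings:             ${a_la_carte - SEASON_PASS:.2f}")
```

Under those assumed numbers, the pass saves about five dollars and locks in future episodes at no extra cost.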

Adding wait time between releases also adds a new dimension to the games. Just as TV shows have their own forums and subreddits, viewers from all around the world can gather after an episode airs to speculate on and critique the series.

Rather than speculating on or critiquing an 8-20 hour experience like “Spec Ops: The Line” all at once, gamers can discuss each episode as it arrives, engineering hype for the next installment. Combine this with Telltale’s signature approach, “the story is tailored by how you play,” and players get a myriad of different stories, paths and choices to unpack on release day. Waiting is one of the best tools for gamers and developers alike. A little breathing room between installments is essential to the experience. And what an experience a well-crafted game is.

Buy into the episodic experience. It’s well worth the price and will keep you entertained for days to come.

July 8, 2014 Jacob Holley-Kline

Let’s be honest: Sitting through movie credits is boring. A bunch of names the viewer probably won’t recognize scroll from bottom to top until it fades to black or opens to a special scene.

But sitting through the credits is rewarding. It’s not about trying to recognize the names of the crew like you would the cast. It’s about acknowledging the crew’s contribution to the movie. After all, their fingerprints are all over it. Watching their names scroll by is an act of witness, not one of recognition. With the advent of CGI, movie viewers know that those creatures or robots on-screen were made by dozens, if not hundreds, of people. Being the second unit assistant best boy grip is a thankless job, and the only place crew members like that are honored is in the credits.

But when all that white text looks the same, it’s a perfect time to reflect on the movie. Leaving the theater just as the lights turn on is fine, but unless the movie was truly something special, the viewer may forget just what they saw.

The credits can turn watching into seeing. It’s easy to watch a movie, but seeing it is different. The credits are the calm after the storm, unless there’s an after-credits scene. Those scenes used to be rare; now they’re expected.

Viewers can do themselves a favor by sitting through the credits even without the promise of an extra scene. Seeing the sheer number of names passing by is enough to appreciate just what it takes to make a movie. Appreciating the movie itself means the viewer should reflect on it, and the credits are a perfect time for that.

March 3, 2014 Jacob Holley-Kline

The history taught in schools is propagandistic and white-centric. American kids are raised in a racist, post-colonial education system. Students come out of the classroom believing that, prior to the Enlightenment, people of color (PoC) did not exist in Europe. This is blatantly untrue and a shortsighted view of history.

Often, history is a study in “whiteness”: white people, the white view, white history, white influence. According to art historian Jeanne S. M. Willette, art made by PoC was classified as “folk art” or “ethnic art.” Any artist who wasn’t white was qualified by their race.

PoC have a long and complex history in Europe. The Moors, the medieval Muslims and Arabs of Iberia, coexisted with white Spaniards until their expulsion in 1609, more than a century after the Jews were expelled in 1492.

According to the BBC, records show that black men and women have lived in Britain since the 12th century. Their numbers exploded in the 16th century because of the slave trade.

Artists of the era would often position PoC at the edges and corners of canvasses, treating them as ornaments rather than people. Their role in the painting is to reinforce the white supremacist hierarchy, often gazing in wonderment at their masters.

This is reflective of that time period because PoC were, generally speaking, objects. According to the BBC, PoC were offered as gifts to the commanders of slaving vessels and sold into service at auctions. From there, their names and identities were changed, effectively erasing their heritages from public record.

Before the advent of digital editing software, PoC were regularly painted over. This implies that PoC don’t even serve an ornamental function. They are made less than even objects.

It’s been argued, according to The Walters Art Museum, that Giulia de’ Medici, featured in Jacopo da Pontormo’s 1539 “Portrait of Maria Salviati de’ Medici and Giulia de’ Medici,” was of African ancestry. Reinforcing this, the child was painted over and went undetected until a cleaning in 1937.

But why whitewash history? Because the system is whitewashed to begin with. After being taught that white people built civilization and mastered the arts, learning that PoC did the same thing invalidates our white supremacist system. It’s hard for anyone to find out they’ve been lied to.

Numerous teachers and professors were educated in the same white supremacist system as a majority of America. They teach what they were taught and what they have learned, and they have curricula to stick to and administrators to please.

There are plenty of good professors and teachers out there, but being taught that there’s nothing south of North Africa is inexcusable. Seek out this history for yourself; white men aren’t the only influential figures in world history.

 

February 23, 2014 Jacob Holley-Kline

In a world of three-second takes, Michael Bay movies and explosions, patience with long takes and wordless scenes is difficult. After all, why watch an old farmer herd goats and eat dust in “Le Quattro Volte” (“The Four Times”), or a washed-up country star try to find redemption in “Tender Mercies”? As the French philosopher Jean-Jacques Rousseau said, “Patience is bitter, but its fruit is sweet.”

In a movie like “Transformers” the viewer “sees,” but in a movie like “Volte” the viewer “watches.” It’s a natural consequence of a single shot drawn out over 10 to 15 minutes. At that point, what’s not seen is as important as, or more important than, what’s seen.

That’s not to say popcorn flicks are a bad thing. They’re often harder to read than something overtly metaphorical or allegorical.

Movies are a special thing. With a book, the reader can choose how fast or slow to read, but with a movie, the viewer’s experience is up to the director. But what’s the point? There’s a point in Béla Tarr’s swan song “The Turin Horse” where nothing interesting is happening on-screen because, if the movie’s done its job, the viewer will be somewhere else entirely.

These types of movies are meditations on ideas instead of outright explorations. They take a theme, put it through the wringer in the first 5-10 minutes and deal with the philosophical consequences for the remaining time. The viewer should meditate with it.

If one really sticks with a slow, meditative flick, they’ll reap the rewards by the end. Slow-burning cinema is meant to leave a mark.

Now, there are widely released movies with just as much emotional or existential punch — “Schindler’s List,” “12 Years a Slave” and “The Lego Movie” come to mind. And when it comes down to it, Hollywood is a business, and executives see some movies as investment-worthy and others as not. Chances are “Schindler’s List” wouldn’t have made $320 million if it were as drawn out as “The Turin Horse.”

Give the slower movies a look. They’re a cerebral experience. When attention is at a deficit, it’s more than worth it to give patience a chance.

February 23, 2014 Jacob Holley-Kline

“Love means never having to say you’re sorry” is what someone would say if they were never in love. That quote comes from the 1970 romance “Love Story.”

It’s hard growing up with messages like this, surrounded by movies like “Titanic,” “Maid in Manhattan,” “Serendipity” and any Nicholas Sparks adaptation.

In movies like this, love is immediate, eternal and wordless. Love at first sight is impossible to avoid and, even if tragedy separates two lovers, they will be together forever.

This scenario has been played out on movie screens for generations and it’s a damaging notion.

Researchers at Heriot-Watt University in Edinburgh, Scotland, found that relationship problems discussed in their counseling center reflected misconceptions about relationships and love, thanks to Hollywood movies.

One misconception is that, if your love is true, you will know everything about each other and what’s wrong in your relationship without talking.

Nobody’s a mind reader, and communication is key in a relationship. Some things are hard to talk about, sex especially.

In romantic movies, sex is something that begins (and ends) without consent or communication.

Sex can be fun and transcendent all at once, but it begins with trust and understanding, two things unachievable without talking.

Romance in movies is hyperreal, enhanced like the actors themselves. Little of what is on-screen is true. During the honeymoon phase of a relationship, it may feel like two lovers have “crossed the oceans of time” to find each other, to quote Dracula.

But once that’s over, the real work begins and one comes to find out that that initial infatuation was only a foundation on which the real and toilsome love is built. It’s worth it, in the end, if you love that person.

It’s important not to get wrapped up in romantic movies, books and video games. The characters on-screen live in an idealized world with idealized people in idealized relationships, and those ideals are unattainable.

Real life is a struggle, and getting away from it for 90 minutes or two hours at a time isn’t a bad thing; it provides an escape and catharsis. But at what cost?

January 28, 2014 Jacob Holley-Kline

There’s a long and oppressive history of cultural appropriation in pop music: white pop stars trying on different cultural and racial identities to sell records. It’s never been okay and it never will be. One of the more recent examples comes from Katy Perry.

January 21, 2014 Jacob Holley-Kline

Intimacy is a hard thing to achieve, especially for anyone raised in a home wracked by abuse, drug addiction or alcoholism. I grew up in an alcoholic household and I, like anyone who’s ever wanted to connect with other people, have a hard time connecting in any meaningful way.

November 26, 2013 Jacob Holley-Kline

I miss the Street. I mean, I miss the Street franchise. “NBA Street Vol. 2” has to be one of the best basketball video games ever made. I don’t even like basketball or sports games in particular, but “Vol. 2” grabbed me like no other sports game had.

September 24, 2013 George Hyde

I was lucky enough to land smack-dab on the head of an unwitting reporter at a university newspaper. It’s no CNN or Al-Jazeera, but it’ll do for spreading the message of mankind’s impending doom. Or something like that.

July 25, 2013 Jacob Holley-Kline

Writer Rafe Martin and illustrator Tatsuro Kiuchi detailed the folktale “Green Willow” in the book “Mysterious Tales of Japan.”

Legend has it that a samurai named Tomotada set out on a quest for his master and met a young girl named Green Willow along the way. Green Willow and her parents lived in a small hut hidden by three willow trees. They offered Tomotada haven for the night.

He and Green Willow fell in love and later married. One morning Green Willow screamed, “My tree! They are cutting my tree!” and then fell into Tomotada’s arms. She turned into a pile of golden willow leaves. Tomotada adopted a nomadic Buddhist lifestyle, wandering Japan’s hillsides and meditating.

At the end of his life, Tomotada discovered three willow tree stumps, two old and one young, in the field where he and Green Willow met. He prayed over the stumps and built a hut in their memory.

In the spring, a green willow shoot sprouted and Tomotada tended to it until the end of his life. His bones joined the earth, and a willow shoot grew in his place.

According to the book, “The trunks of the two willows grew together. The branches intertwined. Down under the earth the roots found each other in the darkness and embraced.”

Stories like this connect people not only with each other, but also with cultural history and with themselves. The character Green Willow and her parents are literally connected to the trees in their front yard.

Tomotada falls in love with the graceful Willow, and comes to love nature after she dies. It’s only after he loves and loses her that he recognizes his true purpose in life. Like Tomotada and Green Willow, our ancestors have stories to tell. Through trials and tribulations they have learned and gained wisdom.

The oral tradition of folktales began in families, small villages and tribes. These stories were passed down through generations and are an inextricable link to our past. Minute details of their historical context remain. Each story has the teller’s DNA in it.

When hearing and telling these stories, we begin to understand the era in which they took place and the values our ancestors held. Thanks to social media, stories can now be more widely read. Movie theaters have replaced the storytelling circle and Facebook has replaced traditional family meetings.

This is not necessarily a bad thing. Modern folktales are called “urban legends.” These stories are just as fantastical, but grounded in a modern setting. Think about the legendary Mothman or Bigfoot or the Loch Ness Monster — all of these creations evolved from alleged eyewitness accounts, video footage and photographs, just as older folktales evolved from eyewitness accounts, scrolls and paintings.

Any art form will perpetuate itself throughout the ages. It will be molded to fit whatever setting is necessary for people to experience it. Through this evolution the storytellers and listeners begin to understand just what brings people together and what brings people closer to themselves. There’s nothing quite so powerful as clinging to a mystery, but the mystery of folktales lies not in how mysterious they are — it’s that they’re not mysterious at all.

The story of Green Willow is a Buddhist allegory about man’s need for connection. According to biological anthropologist David Daegling, stories of Bigfoot-like creatures have appeared in indigenous folklore for generations, but only in 1958 did Bigfoot hit the world stage. The Loch Ness Monster was reportedly first seen in 1933. The Mothman first appeared in a Point Pleasant, W.Va., newspaper in November 1966.

In 1958, the prosperity that post-World War II America experienced took a sharp downturn in that year’s recession. Adolf Hitler was appointed chancellor of Germany in 1933, ushering in a new era of upheaval. In 1966, the Vietnam War was hitting its most brutal stages. All of these tales come at a time of change, and the stories of old are no different.

When faced with great change and ensuing mystery, people look for a reason to continue on. These stories, no matter what form they come in, give people something to search and hope for. Maybe the Mothman, Bigfoot and the Loch Ness Monster do exist. Maybe somewhere in Japan, Tomotada and Green Willow are embracing under the earth. No matter what, these stories have and will exist as long as people are around.

June 25, 2013 George Hyde

It’s that special time again! It’s been a while since the console wars last flared up, and I fondly remember the middle school days when I would argue with my friends that yes, the Wii could totally hold its own against the Xbox 360 and PlayStation 3. Now, with the Electronic Entertainment Expo (or “E3” for short) over, the call of war has sounded once again. So which side should we be on?

Microsoft’s entry is the Xbox One, which claims to be the one device people will need to accompany their televisions (hence the confusing “One” moniker). It can act as a game console, Blu-ray player, cable box and social media hub. It has received a lot of flak for its restrictions on used games and offline play, which would have required the console to regularly check in online. Microsoft later retracted those requirements and stated that it will allow used games and offline play. Still, there’s no reason to believe those restrictions won’t return in the future, so be wary.

Sony, on the other hand, has proudly proclaimed that the PlayStation 4 will allow used games and not require an online connection. But for those with Internet connections, it has a wide array of social features, including a “share” button on the controller allowing players to share screenshots and video footage from their games. It also allows game streaming over the popular Ustream service (it should be noted that the Xbox One has similar sharing features, instead using Twitch for its streaming services).

And lastly, there’s Nintendo, which is still holding on despite middling sales of the Wii U. Its console actually came out last year and uses technology closer to the current consoles than to true “next-gen” hardware. So the Wii U, like the Wii before it, will be a generation behind graphically. However, Nintendo is confident that its first-party titles, like the upcoming “Super Mario 3D World” and “Super Smash Bros.,” will push more units.

The three consoles are actually very similar, so the main reason to pick one over another is exclusive titles, and all three have many that appeal to a wide array of audiences.

So which one should you get? This would have been an easy answer had Microsoft stuck with its restrictions. But since the One is more open, the answer is more complex. So what I’ll say is this:

If you’re an Xbox fan, the One is the best option. If you’re a PlayStation fan, the PS4 is the best option. If you’re a Nintendo fan (or are on a budget), the Wii U is the best option. Or, alternatively, if you’re a PC gamer, stick to your guns and maybe upgrade down the line.

Whichever path is chosen, this generation will be an interesting one to see unfold.

June 11, 2013 George Hyde

“’I am Ozymandias,’ sayeth the stone,

‘the king of kings; this mighty city shows,

the wonders of my hand!’ The city’s gone,

naught but the leg remaining to disclose

the site of this forgotten Babylon.”

 

Every fan of art has seen the Mona Lisa, or at least a very accurate copy of it. Every film buff has seen “Citizen Kane.” Every serious fan of music has heard the works of Bach and Beethoven, every literary expert is familiar with the works of Shakespeare and every game enthusiast has played through “Chrono Trigger” — hopefully.

Famous works like these, of any medium, have been passed down from generation to generation, sometimes over hundreds or thousands of years. They illuminate different times and different cultures and serve as a valuable asset to the world’s culture.

But for every classic passed down, many, many more are lost to the ages and forgotten in time. Until it was released on Steam only about a month ago, “System Shock 2” suffered this fate for over a decade.

For any medium, this phenomenon is disastrous, as these artistic works are bridges to different eras and times. They influence the art of today and tomorrow, just as they did in the past.

This problem has escalated recently with the advent of Digital Rights Management, also known by its more dreaded acronym, “DRM.” DRM utilizes many different methods, from constant Internet connections (like in “SimCity”) to restrictions on how many computers the work can be installed on. This is done to authenticate digital films, games, books and others to help ward off piracy, but at what cost?

DRM hinders artistic preservation in tons and tons of ways. If, 20 years from now, a person wanted to play “Diablo III,” they wouldn’t be able to unless Blizzard’s servers were still up, which grows more unlikely as time goes on.

But then again, this is a problem that only grows more complicated as the world struggles to make these works profitable in a digital age. A concrete solution won’t come for a very long time.

So if you find a gap in your gaming schedule — which shouldn’t be an issue considering how few games come out in the summer — play through a classic you’ve never played before. Like film and music, gaming’s history is important, and it’s vital that we keep it alive, not only for our sake, but for generations to come.

May 29, 2013 Heather Hamilton

Viral ad campaigns and “leaked” information should be evil. They make fans feel preyed upon for their predictability, and stupid for being so excitable.

But man, do they get the job done.

Marvel’s “Agents of S.H.I.E.L.D.,” a new live-action television show set to air on ABC, is just the latest show or movie to use the advertising technique. Along with its first promo trailer, short videos under 20 seconds long have been popping up depicting a variety of things related to the Marvel universe and the S.H.I.E.L.D. agency. The “surveillance” videos are allegedly being posted on a blog called Rising Tide, which aims to uncover the truth about S.H.I.E.L.D.

That info alone suggests an interesting layer to the television show, indicating that such an organization will likely be a staple in the series. But the viral marketing doesn’t stop there; the blog actually exists, and it looks just like what one would expect of a blog dealing with a government agency. Allegedly, their videos keep being taken down by an outside source, and they are angry about it and determined to continue their work.

That kind of depth in a marketing campaign brings out the best in fans and usually accomplishes its goal of raising interest.

Why should we feel bad about ourselves for falling for these ploys? Because we’re predictable, and the marketing companies know that as soon as we’re interested, they can count on us to either purchase their product or tune in to their show. We should feel bad because the campaigns trick us into feeling like we’re being invited to join in rather than being asked to watch the show or movie to earn its producers money. They trick us into feeling like whatever reality their product exists in is real, even if we know better.

That’s what the many viral ad campaigns did for the “Paranormal Activity” movies as well. For instance, in preparation for the second movie, several short viral videos were “leaked” onto the Internet. Some had hidden images, others had hidden video footage viewers had to scroll sideways to unlock, and still others had deleted or alternate scenes from the movie. One viral bit included a phone number for the home the first movie took place in, and any curious fans who dialed it heard an answering machine message for the couple. The movies are supposed to be scary, and their viral campaign tried to keep the creep factor high and succeeded.

Viral ad campaigns are meant to capture our attention and entertain us. They are meant to hold our interest long enough for the product to be obtainable, so that we’ll feel compelled to partake.

They work because we want to be entertained. We want something to ponder over and investigate, we want to feel like we’re special, like we have to unravel a mystery.

The latest mystery is “S.H.I.E.L.D.,” which comes out sometime this fall. I personally don’t care about the secrets of this fictional organization though. There’s only one question I want Rising Tide to try to answer for us.

How the heck did Agent Coulson survive “The Avengers”?


 

May 29, 2013 Heather Hamilton

Heroes aren’t jokes.

Cartoon superheroes wearing their underwear on top of their pants are funny, and it’s perfectly fine to go nuts with the wisecracks. But the real heroes, the men and women who were presented with a choice of “do or do not” and chose “do,” especially when risking their lives, are not jokes. They deserve respect.

Charles Ramsey is a hero. He heard a woman’s scream and instead of ignoring it or calling 911 and letting someone else deal with it, like many people would, he went to the source and freed a woman who had been missing for 10 years. His heroics then enabled two other missing women to also be freed from a house of horrors.

And all people really talk about when they mention him is how hilarious his interviews are. News stations are highlighting certain quotes while largely ignoring others. Internet memes are already popping up about his mention of eating McDonald’s when he heard the commotion. One YouTuber has even highlighted other quotes, such as the one where Ramsey talks about barbecuing with the women’s captors, and turned them into an auto-tuned song.

Ramsey’s heroic deed is not a joke. If one were to suddenly be bombarded with reporter after reporter after learning the neighbors were kidnappers and rapists, what would the average person say? Would he or she have a prepared script? Would that person be perfectly composed? The answer is no.

If it were me, I’d stare dumbly at the reporter and have no idea what to say. At least Ramsey was composed enough to speak and describe his version of events. Did he say something to make a few people chuckle? Apparently, but there isn’t really anything to laugh at in this situation. Look at the horror this man stumbled into. What would you say if you experienced it?

I love memes, and I spend way too much time on the Internet watching ridiculous videos, but not everything should be made into a joke.

Here’s what people should be focusing on: Three women believed to be dead are alive, and now free, because of this man. The FBI was offering up to a $25,000 reward for information aiding in the investigation of Amanda Berry’s disappearance. There was also a reward offered for information about another woman, Georgina DeJesus, though the amount was undisclosed.

Ramsey doesn’t want the reward and has said he would rather the women he saved have it. This is what we should be talking about. We should be talking about a selfless man saving three women believed to be dead and not wanting to be rewarded for it.

May 1, 2013 Heather Hamilton

Don’t stop.

When seniors walk across the commencement stage and enter what professors and administrators call the “real world,” the things they had to do to graduate will fall by the wayside. Most won’t read 100 pages a night to keep up in their English classes. Most writers won’t write stories with difficult prompts. Most artists won’t challenge themselves with art techniques that don’t initially interest them. Most performers won’t learn excruciatingly tedious pieces just for the sake of learning something new.

We seniors will be free, and we’ll let all the challenges we’ve had to endure for our education drop off us like they’re burdens rather than learning tools.

That’s a mistake. Because if we let ourselves go, we’ll forget.

I took biology my senior year of high school, and the only thing I can remember is the stench of formaldehyde and baby pig innards. And how long a pig’s small intestine is. And how freshmen scream when your teacher allows you to parade pig innards through the classroom next door for giggles.

But I digress. The point is that I don’t remember anything of relevance from that class six years later because I don’t use it every day.

I also took two years of French in high school, and then our program was cut. When I came to UAA, they refused to waive my language requirement, and I failed to test out because I hadn’t studied it in two years and couldn’t remember anything. When I eventually took the class, though, it all came back after the first week. It was bliss! I coasted for the entire first semester because it covered everything I learned in high school, and the knowledge was still locked away in my head.

I love the French language, but it’s been another few years since I’ve studied it. Now if someone spoke French to me, I’d just stare at them and blink like an idiot. I let it happen again. But I’m confident that, because I love it so much, because I care, I could get it back just as easily as I did the first time.

A very wise man once told me that no new knowledge is wasted. So, my fellow peers, those things that it took to sharpen your mind and help earn you that degree? Don’t stop doing them. Creativity never dies, but skill does rot away when it goes unused.

Learn a piece by Mozart, even if you hate his compositions. Pull three writing prompts out of a hat and work them into a short story on a day off. Do you hate using watercolors? Practice anyway. You never know when you’ll need these skills for that career you’ve been fighting all these years to be qualified for. Maybe a thing that was a past chore will turn into a joy, now that you’re doing it on your own terms instead of under the whip of a professor.

Now if you’ll excuse me, I’m going to bone up on my French.

Félicitations, classe de 2013!

April 23, 2013 Heather Hamilton

If there is one thing that is completely and utterly deplorable, it’s taking a massive tragedy and creating lies about it.

Hours after the Boston Marathon tragedy April 15, Photoshopped images and videos began circulating across the Internet, especially on Facebook. One image was of a little girl who was allegedly killed in the explosions while running on behalf of Sandy Hook Elementary.

Was a little girl really running for the memory of other children? Absolutely not.

Instead, an eight-year-old boy was killed while watching the race with his family. His mother was also injured and underwent surgery for a head injury.

Other than the circulated image of a little girl running, there were no reports of a little girl dying.

Was this tragedy not bad enough? Do people honestly feel the need to embellish and worsen matters by publishing fake stories with images of children?

“Family Guy” isn’t known for being tasteful, but even creator Seth MacFarlane calls an Internet hoax involving the show deplorable. Someone mashed two clips from a March episode into a sequence that blatantly suggests the main character, Peter Griffin, bombed the Boston Marathon.

In the real episode, the character merely ran over several runners to win.

As previously stated, “Family Guy” isn’t the most sensitive TV program around, but rearranging clips out of context and claiming the episode “predicted” the bombing is both horrific and utterly repulsive.

It can be hard to accept sometimes, but coincidences do happen. Conspiracy theories can also be fun — like the one saying aliens built the pyramids and the Egyptian government is hiding the evidence — but taking a genuinely awful event and making it seem worse is horrible.

“Family Guy” didn’t predict the Boston Marathon bombing. A little girl didn’t die in the bombing either.

But a little boy was killed and his mother was hospitalized. Isn’t that terrible enough?

These happenings are heartbreaking as is. Why on earth do people feel like it’s their job to spread rumors to make events worse?

If you want to make the world a better place, act like one of those marathon runners who ran to a nearby blood bank and donated blood to save a life. When you discover that children have been murdered, send the families cards and flowers. Consider donating money to help families give their children proper funerals — they don’t come cheap. I’m sure it would mean the world to some families to know people in their corner are praying for them or even just thinking of them.

If you want to make a difference, misinforming the public is not the way to go.

April 9, 2013 Heather Hamilton

There’s a special place in “you-know-where” for bad movie sequels. Unfortunately, it’s a very crowded place.

That doesn’t stop fans from getting excited, though. Ellen DeGeneres just announced her involvement in Disney and Pixar’s upcoming sequel to “Finding Nemo.” The Internet is buzzing with general delight over “Finding Dory,” set to release in 2015.

But a good premise doesn’t guarantee a good film. There are some general pitfalls that “Finding Dory” and other sequels need to avoid to live up to the quality of their originals.

First, all returning characters should be portrayed by the original actors. This can be difficult to accomplish with scheduling and contract negotiations, but main characters need to have the same actor. To the audience, that actor is the character. No one else will ever live up to the first performance in most cases. If one of the original actors cannot be signed, a plausible reason should be given in the film. This method worked great for Megan Fox’s character in the latest “Transformers” movie — although, it may have helped that her character was replaced by a Victoria’s Secret model.

Second, certain themes from the first movie should be kept intact, but they shouldn’t go overboard. “The Lion King” did not have equally successful sequels, but at least the writers maintained a literary subtext. The original movie was based loosely on “Hamlet,” while “Lion King 2: Simba’s Pride” was a happier retelling of “Romeo and Juliet.” “Lion King 1 1/2” may not have had this element, but it was a retelling of the first movie from a different perspective, so the lack of subtext was made up for in other ways.

Third, a sequel should only be released in 3-D if the visual technology is integrated into the cinematographic composition. And I mean to James Cameron levels of epic, such as in “Avatar.” Anything less is a waste of moviegoers’ time and money. The Cameron model shouldn’t be followed exactly, though. An original plot should never be forsaken for incredible 3-D effects. That is also a no-no.

Fourth, no questions should remain unanswered. The “Saw” movies do a remarkable job of this. Overall, any question from the surprisingly intricate seven-movie series was eventually answered. It was very gratifying for those who cared about more than the guts and gore being spilled every five minutes. The only problem was that it sometimes took three movies to answer a question; in one case, it took six movies to answer the question of whether or not a character had survived the first movie. If funding for production had been cut off after any given movie, at least two to three main plot points would have been left unanswered. Waiting too long to answer the questions is dangerous, even if you fully intend to do it eventually.

Fifth, if a movie is a box office hit, the production process for the sequel should start right away to feed the audience’s desire for more. Marvel has already released a five-minute teaser about its second phase of hero movies leading up to “The Avengers 2.” The company has even given fans a timeline. “Iron Man 3” will be released later this year. “Captain America: The Winter Soldier” and “Guardians of the Galaxy” are both slated for 2014 releases. “Ant-Man” is planned for 2015, though it is slated to be released after “The Avengers 2,” which has a tentative release date of May 1, 2015.

This is how to successfully sell sequels and the movies leading up to them. It is necessary to plan ahead and put in the effort to make the first installments worthy of a follow-up.

DeGeneres is signed on and excited for “Finding Dory.” If Disney and Pixar take care of the story and its characters, it’ll be a great movie for everyone who enjoyed “Finding Nemo.” The success of the beloved “Toy Story” trilogy proves Pixar is capable of a multi-generational series. And while many movies are forced to endure terrible sequels, there are definitely steps that production companies can take to make the most of them.

April 3, 2013 Heather Hamilton

People don’t visit Alaska for the culture.

With mountains, wildlife, glaciers and cruises, we have such a breathtakingly beautiful state that people travel from all over the world to see it and be a part of it.

March 18, 2013 Heather Hamilton

Worse than planking and leagues away from flash mobbing is the “Harlem Shake” craze. Not only is it pointless, but it’s also a horrible parody of something cool.

Here’s a basic breakdown of a standard “Harlem Shake” video: One person, often wearing a mask or helmet, begins repeating a simple dance move, usually a pelvic thrust or a shrugging motion while leaning from side to side. While this is happening, the song “Harlem Shake” by Baauer plays in the background. Once the lyrics “do the Harlem Shake” pass, the bass drops and the video cuts to a shot of the same room with several people in eccentric outfits doing strange dance moves, such as humping the air or doing the worm. Each video is roughly 30 seconds long.

Comedian Filthy Frank uploaded the first video to YouTube Jan. 30. It was meant as an opener for the rest of his video, but the intro caught on like wildfire. By Feb. 2, several parody videos had been uploaded to YouTube, and the video meme only grew from there.

Some incarnations of the trend are genuinely interesting. In Egypt, four pharmaceutical students were arrested for violating the country’s strict indecency laws when they stripped to their underwear in a middle-class neighborhood to film their version of the video. Another, much larger group managed to get away with it, however — in front of the Great Pyramids, no less.

In both Egypt and Tunisia, the Harlem Shake craze is also being used as a protest against social and personal restrictions.

It is impressive that an Internet trend can play a role in global politics — but the fact that the trend in question is the Harlem Shake leaves much to be desired.

The real Harlem Shake is a dance move created in 1981 by a Harlem resident named Al B. The dance was originally named after him before being referred to as the “Harlem Shake.” It was popularized in a G. Dep music video for the song “Let’s Get It” in 2001.

The first Harlem Shake video by Filthy Frank only uses this dance move and blatantly makes a mockery of it. Many other videos do the same, even if they don’t realize it.

In a later video called “Harlem’s Reaction to the Harlem Shake,” many Harlem residents expressed frustration toward the video parodies.

One man said, “I feel like they’re trying to disrespect us.” A few seconds later, a girl said, “They’re making us look bad.”

I’m sure the residents of Harlem can take a joke as much as anyone else, but this particular dance is deeply rooted in Harlem culture. It is an expression of the area’s cultural identity. Parodies of that identity are degrading and ultimately erode the identity itself. That’s never OK.

While many have proclaimed the meme’s death, it’s still cropping up in new places, such as Alaska.

On March 4, Daniel Burgess published a Harlem Shake video of Sand Lake Community Council members on YouTube. There’s nothing very inappropriate about the video, but the fact that it exists is disappointing, especially since Burgess is the president of the council.

On a larger scale, “Supernatural” stars Jensen Ackles and Jared Padalecki recently produced a version of the Harlem Shake featuring themselves and the show’s cast and crew. It may be a marketing scheme, but new popular videos prevent the trend from dying out.

And it needs to. It’s a mockery of an awesome, existing dance move, and the parody portrayal of it is tasteless.

Even the “Gangnam Style” trend was better than this — all it did was make people look silly. Flash mobbing was fantastic and can be used to both invite spectators to participate (as was done in a “New Dances” performance at UAA last year) and make a point about something socially.

Why can’t activists speaking out about social rights try those things? Surely the message would be much stronger if its presentation were less cringe-worthy.

And please, can the next Internet phenomenon be less horrific?

March 5, 2013 Heather Hamilton

My fiance and I were browsing at Bosco’s in the Dimond Mall, and we overheard two male employees discussing Lara Croft’s redesigned look, which will debut in “Tomb Raider” March 5, 2013. I sensed they were both unhappy with the redesign, but I shrugged it off. A few minutes later, I glanced up at the TV and saw a game trailer.