In the “bad old days” of the early Web, many people (myself included) preached constantly that you couldn’t design Web pages to any specific monitor resolution. The eggheads in charge of W3C came up with CSS to try to make Web design as display-agnostic as possible.

Of course, “we” lost. Designers are going to design to a specific format no matter what you tell them. They took CSS and used it to make their pages even MORE resolution-dependent. If you ever hit a Web page that doesn’t get the CSS quite right for your browser choice, you know what that means. Elements all layered on top of each other. Fonts too small to read or too big to fit in their “boxes” so you lose half the text.

Well, now the problem is creeping into video game design.

When I bought a PS3 several years ago, I bought my first “big screen” TV along with it—42”, 1080p, LCD. PS3 games look beautiful on it. So do Blu-ray movies. Heck, so do regular DVD movies. What has never looked good on it is a computer screen. When I hook my computer up to the TV via HDMI and set the resolution to 1080p, the screen is unreadable from a normal (~10 feet) distance unless you have very good eyesight (I don’t).

But, this wasn’t a problem with video games, because games in the 360/PS3 era were not “designed to 1080p.” High-definition TVs were still rare enough, and the gaming hardware still underpowered enough, that games were generally designed to 720p—even if a game eventually sent 1080p output to the TV, the original design targeted 720p.

Now, with the PS4, I see game designers have upped their “base” resolution to at least 1080p. Here’s the problem: what looks good to designers on their high-resolution, high-pixel-density monitors a couple of feet from their faces looks horrible on my HD, low-pixel-density[1] television ten feet from my chair. Put simply, the text is too small; I can’t read it. And I’m not alone. On top of that, small elements in the game are easily overlooked[2]. And if you don’t even have an HD TV? Forget about it.

Assassin’s Creed IV. Wolfenstein: The New Order. Dragon Age: Inquisition. If I actually want to read any of the text in those games, I have to go stand a few feet away from the TV or yank my chair halfway across the room. What’s really maddening about this is that it’s easily fixable. All that’s needed is a font-size adjustment built into either the game or the console’s OS.

Or, you know, game designers could start allowing for us old, impaired-vision folks who don’t have Retina Displays right in front of our noses.
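For the curious, here’s roughly what that adjustment would have to accomplish. This is a minimal sketch in Python; the scenario and all the numbers are my assumptions for illustration (a designer on a 24-inch 1080p monitor about two feet away, a player on a 42-inch 1080p TV about ten feet away), not anything from a real console SDK:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

def ui_scale(design_ppi: float, design_dist_in: float,
             player_ppi: float, player_dist_in: float) -> float:
    """Multiplier for font pixel sizes so text subtends the same visual
    angle for the player as it did for the designer. The angle is roughly
    (pixels / ppi) / distance, so solve for the pixel multiplier."""
    return (player_ppi * player_dist_in) / (design_ppi * design_dist_in)

# Assumed scenario: designer on a 24" 1080p monitor at ~2 feet,
# player on a 42" 1080p TV at ~10 feet.
scale = ui_scale(ppi(1920, 1080, 24), 24,
                 ppi(1920, 1080, 42), 120)
print(f"Fonts need to be about {scale:.1f}x bigger")  # ~2.9x
```

Call it a threefold bump, which squares with how far across the room I end up dragging my chair.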

[1] Seriously, pixel density (pixels-per-inch or PPI) matters. Here’s a Web site that measures PPI for you. My 5.2” 1080p smartphone has a PPI of about 423—nearly twice the limit of human visual acuity. At 42”, 1080p works out to only 52 PPI. It makes a huge difference, especially with text.
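If you’d rather check the arithmetic than trust a Web calculator, it’s the same diagonal-pixels-over-diagonal-inches one-liner used in the sketch above:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f"{ppi(1920, 1080, 5.2):.1f} PPI")  # ~423.6 for the 5.2" phone
print(f"{ppi(1920, 1080, 42):.1f} PPI")   # ~52.4 for the 42" TV
```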

[2] Don’t get me started on the tiny loot bags dropped by enemies in Dragon Age: Inquisition.


A is for Alpha Protocol, better Bond than Bond
B is for Bethesda & BioWare, for games that go on and on
C is for Corvo, defender of the queen
D is for Dragons, and the Dungeons where they preen
E is for Experience Points, which every gamer needs
F is for Fallout, and super mutants to bleed
G is for GameFAQs, the world’s smartest site
H is for Hearts, collecting them is a gamer rite
I is for inFamous, yes, even Second Son
J is for Jet Set Radio, hey, graffiti is fun
K is for Knights of the Old Republic, force powers galore
L is for LEGO, Heroes & Hobbits & Jedi and more
M is for Mass Effect, an RPG in space
N is for Need for Speed, world’s greatest race
O is for Origins, Dragon Age’s first
P is for Plants vs Zombies, with plenty of sunburst
Q is for Quick-Time Events…OK, that one’s a trick
R is for Ratchet, and Clank his robot sidekick
S is for Sony, for PlayStations so fine
T is for Triforce, aged like great wine
U is for Uncharted, the tales of Nathan Drake
V is for Varric, the dwarf who’s something of a rake
W is for Wolf Among Us, a fable quite gritty
X is for X-Men Legends, with mutants quite witty
Y is for Yoshi, Mario’s great treasure
Z is for Zero Punctuation, a guilty pleasure


Ask most fans of “Western-style” RPG video games who the top developers are and they will probably give you, in some order: Bethesda, BioWare, and Blizzard. Throw out Blizzard, since they primarily make action-RPGs, and you have Bethesda and BioWare, each releasing a steady stream of high-quality, well-reviewed hits. However, the ‘B’ at the beginning of their names and the genre they share are the only things these two companies have in common. Their games are as different as night and day.

If BioWare games are of the “Choose Your Own Adventure” style, then Bethesda games are closer to Mad Libs. BioWare’s games are heavy on narrative. You play each “chapter” in the story and, at the end of the chapter, you can pick where to go next. Your choices have some effect on the story, but you still get to the end no matter what. Along the way you can pick up side quests to help fill in the spaces of your adventure, but the focus is always on playing out the main story.

Bethesda games throw narrative out the window. Each Bethesda game consists of multiple short stories that are so separate they aren’t even on a first-name basis with each other. Beyond those are even shorter stories that you won’t discover unless you just go poking your nose in where it doesn’t belong, which is sort of the point of a Bethesda game. Bethesda games are open-world games because that’s all they really are. I know that sounds obvious, but you can’t produce linear “chapters” like BioWare does unless you have, you know, actual chapters. In a story.

Now, there are good and bad points to each type of game. I happen to love both. But, based on sales figures, gamers love Bethesda’s style more than BioWare’s. So, somebody at BioWare (or at BW’s parent company, Electronic Arts) looked at the sales figures and decided that what BioWare needed to sell more games was an open world!

:sigh: And so we get Dragon Age: Inquisition. A game that’s BioWare through-and-through—after all, they only know how to make narrative-heavy games—but one with such huge “levels[1]” and so much filler that the narrative gets snowed under rather quickly. Peel back the layers, though, and BioWare’s imprint is everywhere on this game.

Meaningless step-n-fetch quests? Got those. Mini-games? Yep, at least two (there may be more I haven’t discovered yet). Crafting that does you absolutely no good because you loot better stuff from your enemies? Uh huh. Items that don’t drop from enemies until you actually get the quest to collect them? Please, don’t mention it again lest my head explode.

We fans put up with this junk because of the good stuff BioWare throws into their games. Lots of cutscenes with fantastic dialogue. Meaningful and deep relationships. Lots of character (and lots of characters—double meaning intended). Those are all in DAI as well, but they get buried under all the not-so-good junk BioWare shovels in to make this an “open-world” game.

What sets Bethesda games apart—and, apparently, attracts more players—is not the open world; it’s the fact there IS NO narrative. The only “motivation” for poking around is poking around. You can’t just take the open-world concept (which BioWare didn’t even really do properly) and shoehorn an epic narrative into it. You end up with lots of empty space when you do that, so BioWare filled the empty space with junk.

Lots and lots of junk. I’m only a dozen hours into the game and my quest log looks like the punch list for the healthcare.gov Web site designer.

It’s a slog, but I’ll keep slogging away. The narrative requires it.

[1] Don’t be fooled. DAI is NOT an “open world” game. It’s a game with individual levels. It’s just that each level is huge. Sort of like the planet exploration from the first Mass Effect game, but with more mind-numbing junk thrown around the open space.


This is the first of two articles about PC gaming. In this episode, I’m going to cover the advantages and disadvantages of PC gaming (and, yes, there are disadvantages). In the sequel, I will provide a realistic PC gaming rig build for under $1,000. I’m not going to try to artificially hit some lowball figure by leaving out parts or dumbing down components.

First, a word from our sponsor (me)…

I’m a Gamer

I’ve been playing games since I was a teen, which means my first shiny new console was an Atari 2600. In the late 80s I got an NES. I followed that up with a Super NES. In the mid-90s I purchased my first “modern[1]” PC and started playing games on that. I was primarily a PC gamer from about 1996 through 2006, building my own systems after my first one got old and tired. I skipped the N64/PS1 generation entirely and didn’t pick up a GameCube until the middle of its generation. I switched back to console gaming in January 2007 when I got a Wii. I upgraded to a PS3 in late 2008 and now have that PS3, a Vita, and a PS4. I still have a gaming PC, but it doesn’t get used much.

All of that establishes my bona fides and my bias. I have “played both sides” of the debate. Right now I prefer console gaming, but I keep my PC upgraded enough that I can play the occasional indie game or older PC game. I buy the vast majority of new games for PS3 or PS4. I don’t dismiss PC gaming out of hand, but I don’t think PC gaming is the Master Race, either. It’s just another option, one with advantages and disadvantages.


This is part two of a two-part post on PC gaming. In the first episode, I examined the advantages and disadvantages of PC gaming. In this sequel, I proffer a realistic gaming rig that you can build yourself. Note that putting together a PC from components for the first time can be a bit scary, but it’s not really that hard. All prices, unless otherwise noted, come from Amazon.com. You can generally get the same parts at the same price from Newegg.com, but all the Amazon parts are Prime, so…free shipping. Prices are current as of October 22, 2014.

The Parts

First, this build is going to be a micro-ATX build. Micro-ATX computers are smaller, have lower power requirements, and some of the parts (case, motherboard) are cheaper. The tradeoff is less upgradeability and a tighter working compartment. I’ll note regular ATX options where applicable if you want to go that route. Second, this is going to be an inexpensive gaming system. I’m not going so cheap that inferior parts leave the system unable to play new AAA games, but I’m also not going state-of-the-art (or even moderately state-of-the-art), which keeps costs down. This means you shouldn’t expect full 1080p/60fps graphics in the newest games.


There’s a humorous scene near the beginning of Mel Brooks’s History of the World, Part I. Presented as the birth of the art critic, it depicts a caveman relieving himself on a cave painting. While clearly intended as a joke, it speaks to the conflicting nature of artistic critique and how that conflict is escalating in the Internet Age. The problem with critiques of anything, but especially art, is a disconnect between what a critic is able to offer and what audiences want.

A critical assessment of something necessarily includes the critic’s subjective impressions of that thing. Even a review of a consumer good, a car for example, includes the critic’s opinions on the feel of the car, its handling, ride comfort, the placement of dashboard controls, etc. With art, subjectivity is really all the critic can report—there’s no objective way to experience an artistic work.

Meanwhile, the audience for critiques increasingly demands objectivity, despite the fact that a purely objective critique would be boring and, in the case of art, impossible. This clamor is further complicated by the fact that art has essentially become a consumer good. All consumers want to know is: is the product (even if the product is art) worth my money? But, no individual critic on Earth can answer that question “correctly” for everyone.

Since critics cannot “get it right” for everyone, everyone becomes a critic of the critics. This has been the case ever since “Letters to the Editor” first became a thing a few hundred years ago. Modern media accelerated the dissemination of “reader opinions.” I remember reading critical letters in nearly every issue of the comics I subscribed to as a teen[1].

Along comes the Internet, and not only is it that much easier to vent your spleen to the “rotten critics,” but everyone now has a platform to broadcast their opinions. Furthermore, the Internet is revealing a nasty undercurrent of psychopathic personalities who viciously attack other people rather than merely debating opinions. The result is a barrage of “opinion pieces” from every corner of the globe masquerading, in some cases, as critical reviews.

Real critiques are in-depth examinations of their subjects. Artistic critiques, especially, should be about the reviewer’s experience. What does it mean? How does it make one feel? What questions does it raise? Why are these questions important? Even reviews of consumer products need that human element rather than a rote listing of “features” and “bugs” and “does it work.”[2]

True critiques are diminishing as loud, often vulgar, and frequently hostile “review” sites take over the Web and engage in abusive battles of words with their readers. And those types of “click-bait” writers are only multiplying. In Pixar’s “The Incredibles,” the villain Syndrome says, “When everyone’s super, no one will be.”

When everyone’s a critic…

[1] ROM: Spaceknight, Micronauts, and Star Wars in case you were wondering.

[2] For that kind of review, rating aggregators and e-tail “review” systems are a wonderful replacement, showing at a glance what other consumers judge a product to be worth.