By LUDWIG VON KOOPA - And there shouldn't be.
For decades, Nintendo gamers have declared their belief in “gameplay over graphics.” That said, before the Wii, Nintendo's consoles always met or exceeded the graphical standards of their day. But go back another decade before the Wii and you land in the fifth generation of consoles: the 32-bit/64-bit era of the mid-1990s onward. This was when consoles were just starting to enter 3D graphics, slowly leaving behind the stylised 2D sprites of the 8-bit and 16-bit eras, and when they began boasting so-called realism in their graphics.
As anyone who was around back then knows, the first 3D games were far from realistic-looking, not to mention they had terrible cameras. They were a blocky, polygonal mess, filled with penguin hands. There's a reason why, when a guest poster wrote a review of Metal Gear Solid (the original one on the PlayStation 1), he didn't mention anything about the graphics: they look like this:
I admit that not having facial features helps your stealth out.
That's why so many characters back then had gloves incorporated into their designs: a glove lets the whole hand be modeled as one solid shape, which makes character modelers’ jobs a lot easier on hardware that can't render individual fingers.
So, take this thought experiment: indie developers love to make their games in 8-bit or 16-bit styles. Could an indie game possibly sell well if its developers purposefully went for 32-bit or 64-bit graphics? Did you think about it? Want to know my answer?