Bugs and exploits are quite possibly every developer’s worst nightmare—no matter how hard you try, and no matter how much time you spend, there are always a few that will slip past you and make it to launch. However, reactions amongst gamers tend to be somewhat divided; while many believe that it’s a company’s responsibility to ensure that the product is as close to perfect as possible, others feel that there’s no need to fix every single little thing, and that some exploits might even make for a better game! Tonight, two of our writers go head-to-head to discuss it!
Justin Alvey: Devs Shouldn’t Aggressively Patch All Exploits
Is everyone who uses an exploit in a video game or video game hardware a dirty “cheater”? You know, the kind of guy you empty an entire clip into in Call of Duty who still doesn’t die? The dude who hacks a Spiritomb with Wonder Guard or a Shedinja with Sturdy into their Pokémon game and then battles with them against random online players? Or worse, are they all pirates?
No. I’m living proof. I never “cheat” when playing with other people, unless I inform them of what I’m doing and they agree to it. I’ve been “cheating” since the days of the SNES, when I gave myself infinite hearts in The Legend of Zelda: A Link to the Past to make things easier while I solved the puzzles. It was my first Zelda game, after all, and now I can beat it without “cheats” in less than a day. Had I not “cheated” back then, Nintendo might never have received hundreds of dollars from the enthusiastic Zelda fan I became. And I’m not the only nice “cheater” out there. There’s a thriving community of “cheaters” on PC who only “cheat” to keep up with those whose skills in multiplayer games vastly outclass their own. After all, who wants to play a game at which they consistently lose? Console exploits and hacking also give us mods like Project M and fan translations like that of Mother 3. Valve even experimented earlier this year with paid mods.
Finding and patching each and every possible exploit also drains valuable resources. Hacking is inevitable. Trying to prevent all exploits and hacks only serves to hurt the “good guys” while not doing much against the “bad guys”. The recent Brazilian PS4 hack is proof of that. It makes far more sense for companies to adopt a triage approach and focus on patching or preventing exploits that can lead to rampant, easy piracy, letting the rest continue to exist – while keeping a watchful eye on them, of course.

But perhaps the most compelling argument I can make – at least for me – is that of save backups. Xbox Live and PSN both have cloud saves, and a PC’s entire hard drive can be easily backed up. But handhelds in general – and Nintendo products in particular – provide no legal means of backing up all of our hard work. When I got back into using exploits in the late 2000s, the ability to make backups was the primary reason. I’ve lost too many hours of hard work over the years to sit idly by while companies like Nintendo ignore what is now practically an industry standard. I’d gladly pay for cloud saves à la Xbox Live Gold and PS Plus on Nintendo devices. I’m willing to bet I’m far from alone on that one. That’s why – even if I never mod or edit a game or its data ever again – I will continue to use exploits in a responsible way. That is, at least, until the day comes that all consoles allow me to create backups of my game saves that aren’t tied to my specific hardware.
Steven Rollins: Developers Owe It to Their Consumers to Patch Exploits
In the modern age of gaming, developers should be expected to support a game for a reasonable amount of time after its release. When everything can be updated with a simple patch, all it takes to deliver a fix is effort. Developers owe it to the hundreds of thousands of people who bought their game to provide as close to an exploit-free gaming experience as they possibly can, whether in a single-player setting or a multiplayer one. Not doing so can tarnish reputations and cost them potential customers.
When a company designs a game, they have in mind a particular way in which consumers should experience its contents. Exploits, while technically part of the game, are there accidentally. Using them takes away from the experience the developer intended the player to have. Take a game like Pokémon. The idea behind Pokémon can best be expressed by the old saying “It’s about the journey, not the destination.” Sure, you beat a certain gym or finally conquered the Elite Four, but these are just small sections of the game, highlighting only a few points on your journey. Pokémon is about raising your team to its fullest potential. The battles you fought, the Pokémon you caught, the evolutions of those Pokémon: all of these things make Pokémon a personal and unique experience that you miss out on if you use exploits to get ahead. Even if it turns out that my team is horrible, I will always treasure that team more than one I would’ve cheated to obtain. You wouldn’t start reading a book only to skip the middle chapters in an effort to get to the end. Why would you do it with a game?
Online exploits are even worse, however. Using them affects not only your game but also the games of everyone you play with. Whether you take an active stance against other players, such as making yourself invincible or seeing through walls, or a passive one (for example, using an exploit to gain levels quickly and unlock newer, more powerful items), you are unfairly gaining an advantage over the majority of that game’s community.
There’s a reason any particular game is developed the way it is. The purest gaming experience we can achieve is to play the game the way it was intended. Exploits are created as the result of mistakes in the development process and, when used, go against the integrity of that ultimate, pure experience. Developers should continue to patch these exploits to uphold that integrity.