
Hitting the to-hit sweet spot

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

Throughout the evolution of Dungeons & Dragons, the game has used two mechanics to determine an attack’s damage output: to-hit rolls and damage rolls.

In D&D, hitting is fun, while missing feels like a bummer, particularly if you waited a while for your turn. Why not remove misses from the game? Once characters gain enough hit points to survive a few attacks, D&D could drop to-hit rolls and, with a few adjustments, still work.

Skipping damage rolls

Back in the third-edition era, Jonathan Tweet advised dungeon masters to speed high-level fights by substituting average damage for rolled damage. In fourth edition, first-level characters have enough hit points to make this approach possible at any level. In a post on The Dungeon Master Experience, Chris Perkins explains how he uses nearly average damage plus 1d6, added for a bit of flavor.
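
For anyone who wants to check the substitution at the table, the arithmetic is simple. Here is a minimal sketch; the dice are my own examples, not numbers from Tweet or Perkins:

```python
# A die averages (sides + 1) / 2, so a handful of dice plus a bonus
# averages num_dice * (sides + 1) / 2 + bonus.
def average_damage(num_dice, sides, bonus=0):
    return num_dice * (sides + 1) / 2 + bonus

print(average_damage(2, 8, 5))    # 2d8+5 averages 14.0

# Perkins-style: keep a little randomness by rolling 1d6 on top of a flat
# amount chosen so the total still averages out to about the same value.
print(average_damage(1, 6, 10))   # 10 + 1d6 averages 13.5
```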

The Chivalry & Sorcery game (1977) skipped damage rolls, assuming “that the damage inflicted by a particular weapon in the hands of given character will be more or less constant.”

The notion of dropping to-hit rolls may seem strange, but here’s the crux: against high hit points, to-hit rolls just turn into another damage-reduction mechanic. In fourth edition, everyone but minions has fairly high hit points, and everyone hits about 60% of the time. The damage dealt in combat would work out about the same if we just assumed everyone always hits and multiplied everyone’s hit points by 1.67.
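
Here is a quick back-of-the-envelope check of that 1.67 figure; the damage and hit point numbers are arbitrary placeholders:

```python
# With a to-hit roll, expected damage per attack is hit_chance * damage,
# so a foe with H hit points survives about H / (hit_chance * damage) attacks.
# Dropping the roll and multiplying hit points by 1 / hit_chance
# (about 1.67 at a 60% hit rate) leaves the fight the same expected length.
hit_chance = 0.60
damage = 10        # arbitrary average damage per hit
hit_points = 30    # arbitrary hit point total

attacks_with_roll = hit_points / (hit_chance * damage)
attacks_always_hit = (hit_points / hit_chance) / damage

print(attacks_with_roll, attacks_always_hit)   # both print 5.0
```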

Of course, your chance to hit is not a fixed percentage, because armor can make hitting more difficult. How can a system without to-hit rolls account for armor class? In my post, “The brilliance of unrealistic hit points,” I explained how hit points in D&D function as an ingenious damage-reduction mechanic. Virtually every tabletop role-playing game that aimed for realism made armor reduce damage. In our hypothetical rules variant, why not use D&D’s damage-reduction mechanic to represent protective armor? Suppose different types of armor multiplied your hit points. Suppose high dexterity increased hit points, granting more ability to “turn deadly strikes into glancing blows,” just like the game says.
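
A minimal sketch of how such a variant might look in play appears below; the multipliers are invented for illustration, not proposed values:

```python
# Hypothetical hit-point multipliers standing in for armor class
# in an always-hit variant of the rules.
ARMOR_MULTIPLIER = {"none": 1.0, "leather": 1.25, "chain": 1.5, "plate": 2.0}

def effective_hit_points(base_hp, armor="none", dex_mod=0):
    # Armor scales hit points directly; a Dexterity bonus adds a smaller
    # multiplier, representing turning deadly strikes into glancing blows.
    return round(base_hp * ARMOR_MULTIPLIER[armor] * (1 + 0.1 * max(dex_mod, 0)))

print(effective_hit_points(20, "plate", dex_mod=2))   # 20 * 2.0 * 1.2 = 48
```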

Does abandoning to-hit rolls seem crazy? It has been done.

Tunnels & Trolls (1975) stands as the one early system that defied the RPG hobby’s early mania for realism. Ken St. Andre, T&T’s designer, aimed for greater simplicity. T&T drops the to-hit roll entirely. Combatants simply weigh damage rolls against each other. Like Tunnels & Trolls, most RPG-inspired video games seem to drop anything equivalent to a to-hit roll. The damage simply rises as you click away at your enemy.

Drawbacks of always hitting

While undeniably simpler, and probably about as realistic as D&D, the you-always-hit approach suffers from three problems:

  • Especially with ranged attacks, rolling to hit fits real-world experience much more closely than assuming a hit and skipping straight to damage. Technically speaking, skipping the to-hit roll feels bogus.
  • The two possible outcomes of a to-hit roll offer more drama than the mostly-average damage roll.
  • The to-hit roll provides intermittent, positive reinforcement to the process of making an attack.

You know all about positive reinforcement. It makes you stay up late chasing one more level when you should be in bed. Reinforcement that randomly rewards a behavior is more powerful than reinforcement that occurs every time. In a casino, the best bet comes from the change machines, which pay 1-for-1 every single time. Intermittent, positive reinforcement drives people to the slot machines. As the Id DM might say, “The variable ratio schedule produces both the highest rate of responding and the greatest resistance to extinction.” In short, hitting on some attacks is more fun than hitting on every attack.

Although D&D uses a d20 roll to hit, the game plays best when the odds of hitting stand closer to a coin flip. At the extremes, the math gets strange. For combatants who almost always hit, bonuses and penalties become insignificant. For combatants looking for a lucky roll, small bonuses can double or triple their chance to hit. Also, the game plays better when your chance of hitting sits around 60%. When your chance nears 95%, the roll becomes a formality. When your chance dwindles toward 5%, the to-hit roll feels like a waste of time, especially because beating the long odds probably still earns minimal damage.
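
To see why the extremes misbehave, compare what a +2 bonus does at different target numbers. This is a simple sketch of the d20 odds, ignoring the natural-1 and natural-20 rules:

```python
# Chance to hit when you need to roll `target` or higher on a d20.
def hit_chance(target):
    return max(0, min(20, 21 - target)) / 20

for target in (4, 10, 19):
    base = hit_chance(target)
    boosted = hit_chance(target - 2)   # a +2 bonus lowers the needed roll by 2
    print(f"need {target}+: {base:.0%} -> {boosted:.0%} with a +2 bonus")

# need 4+:  85% -> 95%   (the bonus barely matters)
# need 10+: 55% -> 65%
# need 19+: 10% -> 20%   (the bonus doubles the chance to hit)
```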

When the chance of hitting stays between, say, 30% and 70%, the game reaps play benefits:

  • The to-hit roll provides intermittent, positive reinforcement to the process of making an attack.
  • Many attacks can hit and inflict damage, providing constant, positive feedback to players while everyone contributes to the fight.

So D&D designers face the challenge of arranging to-hit bonuses and armor classes so to-hit rolls require a result between 6 and 14, a precious small sweet spot for all the levels, magic, and armor classes encompassed by the game. In practice, designers like to push your chance to hit toward the top of the 30-70% range, because while missing remains part of the game, it’s no fun.

The Palladium Role-Playing Game (1983) recognized the play value of the to-hit roll, but it avoided D&D’s problem of finessing to-hit bonuses and armor classes to reach a sweet spot. The game simply declares that “any [d20] roll above four (5 to 20) hits doing damage to the opponent.” Armor in the game takes damage, effectively serving as an additional pool of hit points.

In upcoming posts, I’ll discuss the very different steps the D&D Next and 4E designers took to find the to-hit sweet spot.

Next: Riding the power curve

The brilliance of unrealistic hit points

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

After the role-playing game hobby’s first 10 years, designers turned from strict realism and began to design rules that both supported a game’s flavor and encouraged its core activities. Runequest’s realistically lethal combat system fit the fearful world of Call of Cthulhu (1981), as did a new sanity system. Paranoia (1984) built in rules that encouraged a core activity of treachery, while giving each character enough clones to avoid hard feelings.

Today, this innovation carries through stronger than ever. Dungeons & Dragons’ fourth-edition designers saw D&D’s fun in dynamic battles and showing off your character’s flashy capabilities, so they optimized rules that heightened that aspect of the game, possibly at the expense of other aspects.

When Dave Arneson mashed rules for ironclads into Chainmail, he probably gave little thought to supporting the D&D play style that would launch a hobby, but he created some brilliant conventions.

The best idea was to give characters steadily increasing hit point totals that “reflect both the actual physical ability of the character to withstand damage – as indicated by constitution bonuses – and a commensurate increase in such areas as skill in combat and similar life-or-death situations, the ‘sixth sense’ which warns the individual of otherwise unforeseen events, sheer luck and the fantastic provisions of magical protections and/or divine protection.” (Gary wrote this rationale for hit points in the first edition Dungeon Master’s Guide.)

Every “realistic” system to follow D&D used hit points to measure the physical capacity of a character’s body to survive injury. In D&D, rising hit points work as an elegant damage-reduction mechanic. Using hit points for damage reduction boasts a number of virtues:

  • Combat plays fast because players do not have to calculate reduced damage for every single hit.
  • Although damage is effectively reduced, the reduction never makes a combatant impervious to damage.
  • Once characters gain enough points to survive a few blows, hit points provide a predictable way to see the course of battle. If a fight begins to go badly, the players can see their peril and bring more resources like spells and potions to the fight, or they can run. In a realistic fight, things can go bad in an instant, with a single misstep resulting in death.
  • Most attacks can hit and inflict damage, providing constant, positive feedback to players while everyone contributes to the fight. Realistic combatants do not wear down from dozens of damaging blows; instead each hit is likely to kill or maim. In more realistic systems like Runequest and GURPS, when two very skilled combatants face off, they block or dodge virtually all attacks. The duels turn static until someone muffs a defense roll and lets a killing blow slip through. This model may be realistic (it reminds me of those Olympic competitions where years of training turn on a single, split-second misstep), but the realistic model lacks fun. No popular sports begin as sudden-death competitions where the first to score wins.
  • Battles can gain a dramatic arc. Fights climax with bloodied and battle-worn combatants striving to put their remaining strength into a killing blow. No one likes to see the climactic battle fizzle with a handful of bad rolls, especially at their character’s expense.

Bottom line: Using hit points for damage reduction enables a combat system where you can hit a lot, and hitting is fun.

Critics of inflated hit points still had a point. Using hit points as a damage-reduction mechanic can strain credulity, especially when you cannot explain how a character could reasonably reduce the damage he takes. Why should an unconscious or falling hero be so much more durable than a first-level mook?  Why does cure light wounds completely heal the shopkeeper and barely help a legendary hero? Over the years, we’ve seen attempts to patch these problems. For example, I liked how fourth edition’s healing surge value made healing proportional to hit points, so I’m sorry to see D&D Next turn back to the traditional hierarchy of cure spells.

D&D maintains a deliberate vagueness about the injuries inflicted by a hit. This abstraction makes possible D&D’s brilliant use of hit points as a damage-reduction mechanic. Fourth edition exploits the ambiguity more than ever, making plausible the second wind and the healing power of a warlord’s inspiration. 4E explicitly makes hit points as much a measure of resolve as of skill, luck and physical endurance. Damage apparently exists as enough of an abstraction that even if a hit deals damage, it doesn’t necessarily draw blood.

Even as 4E aims for the loosest possible interpretation of a hit, it makes the hit roll more important than in any prior edition. In 4E, melee hits can inflict crippling effects without saves. Just getting hit automatically subjects you to poison, or paralysis, or whatever. In past editions, if the spider bit or the ghoul clawed, you took the damage, but you still got an immediate save.

In the early days of the RPG hobby, many games attempted to fuse D&D’s fantastic setting with a more realistic model of combat damage. Although a few of these games enjoyed success, none recreated the combat-intensive, dungeon-bashing play style pioneered by D&D. At the time, no one seemed to realize that the clever damage-reduction mechanism built into the game enabled the game’s play style.

Video game designers figured it out. Virtually every video game that combines fighting with character improvement features D&D-style rising hit points.

Next: Hitting the to-hit sweet spot

What does D&D have to do with ironclad ships?

When Dave Arneson set out to create the combat system that would become a pillar of Dungeons & Dragons, he did not aim to create a realistic simulation. In a 2004 interview, he describes the system’s genesis from Gary Gygax’s Chainmail rules.

Combat in Chainmail is simply rolling two six-sided dice, and you either defeated the monster and killed it…or it killed you. It didn’t take too long for players to get attached to their characters, and they wanted something detailed which Chainmail didn’t have. The initial Chainmail rules was a matrix. That was okay for a few different kinds of units, but by the second weekend we already had 20 or 30 different monsters, and the matrix was starting to fill up the loft.

I adopted the rules I’d done earlier for a Civil War game called Ironclads that had hit points and armor class. It meant that players had a chance to live longer and do more. They didn’t care that they had hit points to keep track of because they were just keeping track of little detailed records for their character and not trying to do it for an entire army. They didn’t care if they could kill a monster in one blow, but they didn’t want the monster to kill them in one blow.

So the D&D rules for hit points and armor class stem from rules for ironclad ships trading cannon blasts, hardly the basis for an accurate simulation of hand-to-hand battles.

Soon after I began playing D&D, the unrealistic combat rules began to gnaw at me. In the real world, armor reduces the damage from blows rather than making you harder to hit. Shouldn’t it work the same way in the game? And how could a fighter, no matter how heroic, survive a dozen arrow hits, each dealing enough damage to kill an ordinary man? In reality, a skilled fighter would stand a better chance of evading blows, but no better chance of surviving a single hit.

Quest for realism

In the decade after D&D’s introduction, a mania for creating realistic alternatives to D&D dominated the hobby. Every D&D player who ever wielded a sword in the Society for Creative Anachronism cooked up a more realistic alternative to the D&D combat system. Runequest (1978) stands as the greatest early success. Characters’ hit points remained constant, but they became more able to dodge and block blows. Hit locations transformed characters from blobs of hit points into flesh and bone. Armor reduced damage by deflecting and cushioning blows.

If you enjoyed the AD&D Weapon Armor Class Adjustment table, but felt it needed to go much, much further, the Rolemaster Arms Law (1980) system offered more than 30 tables matching weapons versus armor.

In this era, everyone formulated a critical hit table, because nothing adds fun to a system like skewered eyes, fountaining stumps, and sucking chest wounds. (Follow this blog for my upcoming list of supposedly fun, but not fun, things we did in the early days of role playing.)

I sought realism as much as anyone, first with Runequest, and then with GURPS. I quickly learned that making combat more realistically deadly made D&D-style, combat-intensive play impractical. Forget dungeon crawls; even skilled characters would eventually perish to a lucky blow. As I described in Melee, Wizard, and learning to love the battle map, early D&D combat lacked excitement anyway, so I hardly missed all the fights.

But I would come to realize that my dismissal of the D&D combat system was completely wrong.

Next: The brilliance of unrealistic hit points

But how do you win?

I discovered Dungeons & Dragons in 1977 with the blue basic set. This was before the general public came to understand that D&D was a possibly satanic form of play-acting typically performed in steam tunnels. When I described my new passion to folks, they always asked the same question: “How do you win?” I would explain that you could improve your character, but no one actually won.

I inevitably got the same follow up question. “If you can’t win, then what’s the point?” The notion that you played to have fun without any chance of winning puzzled everyone.

The how-do-you-win question, more than anything else, reveals the gulf between how people thought of games in 1977 and just a few years later. Now, for example, we think of video games as something you finish rather than something you win. But in 1977, the first few video games, Space War and Pong, only featured head-to-head competition; the point had to be to win.

I would like to credit D&D and other tabletop RPGs with broadening games from competition to fun activity, but I trace the change to video games. In 1978, Space Invaders began appearing in every bar and bowling alley in the country. Even though you could never win, the game became a sensation. The space invaders marched endlessly toward your cannon until you inevitably lost. The point of the game turned from winning to performing a fun activity.


Space Invaders and the avalanche of video games that followed changed the way people saw games. From time to time, people still ask me to explain Dungeons & Dragons, but it’s been at least thirty years since anyone asked me the question that used to be inevitable. “If you can’t win, then what’s the point?”

Next: What does D&D have to do with ironclad ships?

Designing for spells that spoil adventures

In my last two posts, starting with Spells that can ruin adventures, I discussed the various spells with the potential to spoil Dungeons & Dragons adventures, turning hours of fun into a quick ambush. You may say, “Why worry? Just rule that these spells don’t exist in your campaign.” Clearly, you have enough foresight to carefully examine the spell lists, establishing a list of dangerous spells and magic items that might ruin your campaign plans. Of course, you could also rule that Zone of Truth doesn’t exist in your game the minute it becomes a problem. But your players will hate that.

The D&D system’s spells and magic contribute to an implied setting that most D&D players and DMs share. As a DM, you can ban spells, but that offers no help for authors of adventures for organized play or for publication. Authors writing D&D fiction also must work around these spells, or ignore them and hope the readers fail to notice.

The fourth edition attempted to eliminate every last adventure-ruining effect. Fly effects really just let you jump. The ethereal plane is gone, or at least inaccessible. Linked portals replace the long-range teleport spell. While I favor this approach over keeping all the problem spells in the system, I concede that the purge might have been heavy-handed.

So that brings us to today. Seeing Zone of Truth in the D&D Next spell list inspired me to write these posts. These spells and effects need careful weighing of the benefits they offer to the game, and more thought about how they affect adventures and the implied game setting.

For the designers of D&D, I have the following suggestions:

  • Spells that compel honesty or discern lies do not add enough to earn a place in the game. These spells could exist as optional elements.
  • Spells that detect evil should only detect the supernatural evil of undead, outsiders and the like.
  • Divination spells must provide hints and clues rather than unequivocal answers, and should discourage players from seeking answers too often.
  • Scry spells must be subject to magical and mundane counters such as the metal sheeting that blocked Clairvoyance and Clairaudience in the first edition.
  • Scry spells should never target creatures, as Scrying does, but only known locations, as Clairvoyance and Clairaudience do.
  • Ethereal travel must be subject to barriers such as gorgon’s blood mortar, permanent ethereal objects, and perhaps even vines, as mentioned in the original Manual of the Planes.
  • The game should offer some magical countermeasures to teleportation, such as Anticipate Teleportation, and the ability to make these spells permanent.
  • The Dungeon Master’s Guide needs a chapter on magical effects that the DM should plan for in campaign and adventure design, starting with fly and divination.

Next: But how do you win?

Scry and fry

(Part 2 of a series, which begins with Spells that can ruin adventures.)

Third edition Dungeons & Dragons added the Scrying spell, which unlike Clairvoyance and Clairaudience could target a creature rather than just a familiar location. Scrying worked in conjunction with Teleport to make villains vulnerable to the scry-buff-teleport system of ambush, also known as scry and fry.

The target of the Scrying spell gets a save, but the wizard can always wait for another attempt, or just scry Igor or minion #3. The game offers a couple of eighth-level defensive spells in Screen and Mind Blank. Will the Dark Lord mind blank Igor too? None of these spells can be made permanent, so apparently, every high-level villain needs archmages on staff just to thwart do-gooder knuckleheads who can cast a sixth-level spell. In practice, the best defense might be a DM with the chutzpah to fudge an improbable number of saves. The Pathfinder system makes the spell easy to counter with lead shielding. Why didn’t Gary Gygax think of that? He did. In the first edition, metal sheeting blocked Clairvoyance and Clairaudience.

The teleport ambush worked so well, and the game offered so few countermeasures, that Monte Cook stepped up in 2001 and included spells like Teleport Block, Teleport Tracer, and Teleport Redirect in his Book of Eldritch Might. In 2005, the Spell Compendium finally added practical countermeasures to the base game with Anticipate Teleportation and Greater Anticipate Teleportation. These spells delay the arrival of teleporting creatures into an area long enough to foil an ambush.

Anticipate Teleportation serves as an excellent example of a countermeasure that allows problematic spells to continue working, while adding interesting complications that make using them risky. If Anticipate Teleportation can be made permanent, then it adds a perfect solution to the game. Teleport Redirect, on the other hand, counters Teleport with an easily lethal trap. As a DM, I want to avoid killing an entire party due to an unwise teleport.

Players don’t like having the DM nullify the cool things they can do, even if it’s cloaked in the guise of the villain’s wards and traps. If your villain happens to use some of the gotcha effects, you’re really going to see some angry glares across the table. “So you’re saying that after you heard us talking about teleporting, the bad guy just ‘happened’ to have Teleport Redirect in place.”

Players hate when you use your DM’s knowledge of their plans to invalidate their cleverness or cool toys. (And unless you can point to the gotcha, spelled out in advance, in ink, players will always suspect this.)

The ideal defenses to game-ruining spells, like Anticipate Teleportation, make the spells riskier to use without invalidating them. The counter effects cannot be so devastating that players feel punished for daring to use their hard-earned magic against the DM’s pet villains. And some countermeasures, like metal sheeting, need to be within reach of canny villains who cannot afford to keep archmages on retainer.

Next: Designing for spells that spoil adventures

Spells that can ruin adventures

Have you ever had an adventure spoiled by a spell? Through the history of Dungeons & Dragons, a variety of spells carried the potential to short-circuit or spoil whole categories of adventures—at least without significant planning to work around them.

Spells like Detect Lie (later Discern Lies) and Zone of Truth threaten to eliminate intrigue. They would turn A Song of Ice and Fire into a short story.

With spells like Commune and Speak with Dead in the game, you can forget whodunits.

The Prince of Murder’s army of assassins cannot keep him safe in his mountain aerie if the characters can scry and fry.

Many of the adventure-spoiling spells existed in the early days, but given the play styles of the times, they posed few problems.

Once upon a time D&D games took place in huge sprawling dungeons like the one under Castle Greyhawk, where monsters wandered and players balanced their own encounters by deciding how deep they dared to go.

Adventures never featured intrigue. You never needed to find the real killer from among a group of suspects. As the Dungeon Crawl Classics adventures advertised, “NPCs were there to be killed.”

Detect Lie probably started as a way to determine if the captive kobold was lying about the treasure behind the “untrapped” door ahead. It also deterred the thief from stealing your stuff. Know Alignment simply existed so the cleric could tell the paladin who to kill first.

A few troublesome spells existed in the early days, but Gary built in solutions for the DM. The description of Commune says, “It is probable that the referee will limit the use of Commune to one per adventure, one per week, or even one per month, for the gods dislike frequent interruption.” Strangely, when you want to know who betrayed the party, the gods always prove too busy. The Contact Other Plane spell could potentially gather lies or drive the caster insane. How bad do you want to know? In practice, these spells typically provided the Dungeon Master with a way to give hints to stuck players.

In the early days, information spells couldn’t ruin adventures, but travel and movement spells could.

As long as the players stayed indoors, Fly wasn’t a big deal. Outside, it let players fly past obstacles and enemies or just bomb and strafe them from out of reach. Every DM who fails to plan for flying will see mid-level encounters ruined, but you learn fast.

Ethereal travel can threaten to take dungeons right out of the game. Any cleric with the 5th-level Plane Shift spell could take seven friends ethereal, allowing them to waft through the dangerous dungeon stuff and go straight for the treasure. AD&D attempted to limit the problem by populating the ethereal with tough wandering monsters and the random Ether Cyclone. Apparently that failed to deter enough adventurers, because Tomb of Horrors includes this note: “Characters who become astral or ethereal in the Tomb will attract a type I-IV demon 1 in 6, with a check made each round.”

The Manual of the Planes finally gave Acererak and other dungeon makers options other than contracting with the Abyss for ethereal security. Now you could overlap your stronghold with barriers such as ethereal stone, or you could mix gorgon blood into your mortar. Inexplicably, third edition made the gorgon-blood trick an optional rule. Thanks, guys. Whose side are you on?

By the time 3E came around, some designers had become so immersed in the story slant of D&D that they forgot how broken ethereal travel could be. How else can we explain Ghostform? Just add invisibility and you can phase through any dungeon. Ghostform appeared at 4th level and rose to 8th in errata! The four-level revision must be a record.

Eventually, even in the early days, the mega-dungeon seemed a little tired to a lot of folks. Dave Arneson started mocking the routine in his Blackmoor campaign, where the dungeon entrance featured turnstiles and holy water dispensers.

In the mid 70s, at a kitchen table somewhere, for the first time ever, a DM told his players that their characters met a cloaked stranger in the back of the inn with a special job. The plotted adventure was born. Suddenly the DM needed to plan adventures around a class of spells that could ruin everything.

You might suppose the new interest in plot would lead the second edition designers to reconsider all the spells that stand as an obstacle to fun plot elements like mystery, double-dealing, and skulduggery. Mostly, the designers doubled down by adding spells like Zone of Truth. At least they added a saving throw to Detect Lie, giving any DMs willing to fudge die rolls the power to save their adventures. (Unless the players just rely on Detect Evil to determine who to kill.)

I cannot imagine situations where the truth and alignment-determining spells add to the game. They only stand as an obstacle to certain types of adventures.

Next: Scry and fry