Changing the balance of power

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

Skip Williams's second edition adventure Axe of the Dwarvish Lords staged a type of battle no other Dungeons & Dragons adventure has attempted before or since. This adventure pitted 13th- to 15th-level characters against a warren full of goblins. As you might expect, the warren's individual goblins typically only hit on a 20, and only because everything hits on a 20. If one earned a lucky shot, he would inflict minimal damage. With any edition's standing rules, 13th-level characters faced with goblins would simply grind out countless attacks against inconsequential resistance. The scenario fails. So Skip cheated, I mean, he designed new rules. The adventure adds two pages of rules for group tactics that allow the goblins to do things like volley arrows in area attacks and combine melee attacks to earn bonuses to hit. In this fourth-edition era, we're used to monsters making exceptions to the rules, but not in 1999. Back then, monsters broke the rules because a bad DM thought he could win D&D. Personally, I liked the way the new rules enabled an otherwise unplayable confrontation, but when the goblins start breaking the rules as previously understood, I can imagine some players calling it a cheat.

For the first time in D&D’s history, the next iteration attempts to enable playable confrontations between powerful characters and hordes of weak monsters, without resorting to special rules. The key, as I discussed in “Hitting the to-hit sweet spot,” is arranging everyone’s to-hit bonuses and armor classes into the small range that grants everyone a reasonable chance to hit.

D&D Next hits the sweet spot by limiting the to-hit bonuses characters gain in exchange for greater bonuses to the damage they inflict.

This exchange intentionally shifts one aspect of the game’s balance of power.

Low-power combatants benefit against high-power opposition

Mobs of weak monsters can threaten higher-level characters: the monsters still hit often enough for their numbers to overcome the characters' higher hit points. On the flip side, the dungeon master can pit parties against fewer, more powerful monsters without having to select monsters specifically designed as solos or elites. This re-enables the sort of sandbox play where players choose a difficulty level by plunging as deep into the dungeon as they dare.

High-power combatants lose against low-power opposition

When your legendary hero faces goblins, the damage each blow deals hardly matters, because dead is dead. But your hero's chance of hitting a lowly goblin barely improves with level. Your hero feels like a zero.

Meanwhile, in the DM’s chair, if you want to pit a single giant against a party of lower-level characters, the fight can go badly. The giant’s one attack often misses, but when it hits, it kills. As a DM, I still prefer a solo with lots of attacks, each inflicting lower damage. If monster designers look to give brutes alternate attacks that threaten many targets at once, then we enjoy the best of both worlds.

Fighters suffer the most

The accuracy-for-damage trade matters most to fighters. Fireball and Blade Barrier work as well as ever. The rogue remains content to sneak up on the goblin king. But fighter-types should hew through the rabble like grass until, bloodied and battle worn, they stand triumphant. Instead, they wind up muffing to-hit rolls against one mook.

The game could stick with exponential power curves and narrow tiers of level-appropriate monsters, but I think better fixes exist.

For example, cleave-like maneuvers help by spreading damage across a string of attacks, but if your fighter's first attack misses, your turn finishes and all the goblins laugh at you. Next's whirlwind attack maneuver lets a fighter attack several adjacent enemies with a single attack roll, but whiffing against a whole crowd of goblins somehow seems even less heroic than missing just one.

Is the medicine worse than the disease?

Earlier editions of the game offer a solution, a solution so odious that I hesitate to mention it. If fighters gain multiple attacks per round, the misses matter less because there’s more where that came from!

Multiple attacks stink because resolution takes too long, especially if the fighter must roll damage and resolve each attack before moving on to the next swing. Also, D&D's designers have struggled to parcel out extra attacks as fighters gain levels. Jumping from one attack directly to two results in a rather sudden leap in power. Instead, AD&D gave fighters extra half attacks, forcing players to remember which rounds granted the extra swing. Third edition traded half attacks and the memory issue for weaker extra attacks and fiddly attack penalties. Yuck.

Multiple attacks also solve a problem Mike Mearls mentioned in a tweet: "Ability mod to damage unbalances at low levels, is irrelevant at high levels." Without multiple attacks per round, a high-level fighter's strength bonus to damage becomes inconsequential. With multiple attacks, each attack benefits from the bonus.
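A quick sketch shows the arithmetic. The dice and bonus values here are illustrative stand-ins, not D&D Next's actual damage progression:

```python
# Share of a fighter's average damage that comes from a +4 strength bonus.
# Illustrative numbers only -- not D&D Next's actual damage progression.

def avg_damage(dice, sides, bonus):
    """Average of <dice>d<sides> + bonus."""
    return dice * (sides + 1) / 2 + bonus

# One attack with a swollen damage roll: the bonus counts once.
one_big_attack = avg_damage(6, 6, 4)                           # 25.0 average
print(f"one attack: bonus share {4 / one_big_attack:.0%}")     # 16%

# Three smaller attacks: the bonus counts on every swing.
three_attacks = 3 * avg_damage(2, 6, 4)                        # 33.0 average
print(f"three attacks: bonus share {12 / three_attacks:.0%}")  # 36%
```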

If D&D Next’s designers can find a good way to allow fighters and brutish monsters to gain multiple attacks against weaker opponents, then a key piece of the Next design puzzle falls into place.

Next:  Tracking initiative (I’m done with theory for a while.)

D&D Next trades to-hit bonuses for enhanced damage

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

As I discussed in "Riding the power curve," the next iteration of Dungeons & Dragons attempts to straighten out fourth edition's exponential power curve by refusing to let characters benefit from both steep bonuses to hit and big increases to damage. Instead, characters mostly get increases to damage.

When we compare D&D Next to early editions, Next limits the to-hit bonuses characters gain as they advance in exchange for greater bonuses to the damage they inflict.

Before I delve into the benefits and drawbacks of this exchange, I ought to address two practical objections to trading to-hit bonuses for damage.

Should skill increase damage?

Some argue that a more skillful combatant’s blows should not deal more damage. After all, a crossbow bolt always hits with the same force, so it should always strike with the same damage. Personally, when I’m struck by a crossbow bolt, I care deeply about where it hits. Maybe that’s just me.

As I explained in "The brilliance of unrealistic hit points," hit points in D&D work as a damage-reduction mechanic. As characters increase in level, their rising hit points reduce the effective damage they suffer. Reasonably, as characters increase in level, they could also grow better at inflicting damage by overcoming defenses to strike vulnerable places or by applying more force to a blow. I'm no Miyamoto Musashi, but I've earned enough bruises sparring with practice swords to know that finding an opening to tap an opponent demands less skill than finding enough room for a kill strike, or even a cut.

And if you worry about unusual cases of oozes struck by crossbows, adjust at the table.

“The primary thing when you take a sword in your hands is your intention to cut the enemy, whatever the means. Whenever you parry, hit, spring, strike or touch the enemy’s cutting sword, you must cut the enemy in the same movement. It is essential to attain this. If you think only of hitting, springing, striking or touching the enemy, you will not be able actually to cut him.” Miyamoto Musashi, The Book of Five Rings

Hits inflict more than damage

In D&D, a hit can bring the threat of poison, level drain, and many other secondary effects. In these cases, the attack's damage matters less than dealing the hit. A higher-level character's chance to hit improves less, so their chance of inflicting secondary effects sees little improvement.

This matters, but it matters less than you may think.

First, to-hit rolls take a much smaller place in D&D Next than in 4E. D&D Next switches from non-AC defenses back to saving throws. Virtually all spell attacks return to skipping the to-hit roll entirely.

Second, attacks versus AC return to focusing on damage. To an extent, I liked how 4E added tactical richness to combat by devising interesting attacks. However, for my taste, too many effects appeared in play. I grew tired of seeing combatants perched on stacks of Alea markers, unable to do anything but stand and make saves.

In D&D Next, as in early editions, weapon attacks mostly inflict damage, and the attacks that threaten something like poison or level drain usually come from monsters.

Third, the saving throw returns as a defense against bad things other than damage. In 4E, hits against AC can inflict crippling effects without saves. Just getting hit automatically subjects you to poison, or paralysis, or whatever. In older editions, when the spider bit or the ghoul clawed, you took the damage, but you also saved versus poison or paralysis. I appreciate 4E's streamlined system, but dropping the defensive saving throw contributed to battlefields bogged down with more conditions and other markers than even the designers anticipated.

D&D Next brings back saving throws as a defense against effects like poison and level drain. We no longer need to rely on to-hit rolls as the final test of whether a poisoned dagger drew enough blood to overcome your constitution. Because monsters make most of the attacks that poison, paralyze, drain, and so on, most players should be happy to see the save return. Plus, despite the extra roll, the save probably speeds play by reducing the number of harmful conditions that take effect.

Despite these three points, in D&D Next, your high-level character is weaker when she makes attacks versus AC to inflict crippling effects. If I were to design, say, a poisoner class, I would make their chance to hit nearly automatic, and make saving throws the principal defense against poison.

Next: Changing the balance of power

Bounded accuracy and matters of taste

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

In my last post, I wrote about how to-hit and damage bonuses contributed to Dungeons & Dragons' power curve. When we compare D&D Next to early editions of D&D, we see a key trade-off: The Next design reins in the to-hit bonuses characters gain as they advance. In compensation, characters gain greater bonuses to the damage they inflict. This trade-off stems from something the designers called bounded accuracy, which spurred controversy. While most of the discussion focuses on bounded accuracy's place in combat, in "Two problems that provoked bounded accuracy," I wrote about bounded accuracy and ability checks.

Months ago, I wrote to explain that ability bonuses influence ability checks too little, so you might suppose I would like to see characters earning big to-hit pluses as they advance levels. But characters engage in many combats and make countless attack rolls, so even small bonuses earn big payoffs, and I'm fine with that. However, I understand that aspects of the bounded-accuracy controversy hinge on matters of taste.

In fourth edition, as characters leveled, they enjoyed steep increases in to-hit bonuses matched with continuing increases in the damage each attack dealt. This led to characters increasing exponentially in power. If you hit twice as often, and each hit does twice the damage, then you boast four times the power. Of course, monsters follow a similar power curve, so you never notice unless characters face creatures outside their narrow level band.

In character, your exponential increase in power feels exciting as unbeatable monsters and impossible challenges quickly become possible, and then easy.

If you want to keep suspension of disbelief, do not dare to consider the world-building implications of the 4E power curve. I checked the stats for a town guard in a heroic-tier Living Forgotten Realms adventure. As scaled for party level 10, this rank-and-file guard has AC 26 and 106 hit points. Where were these super guards a few adventures ago when the goblins attacked the town? The goblins could only hit AC 26 on a 20, so they would have needed to make an average of 262 attacks on each guard to earn a kill. Of course, you can suppose that in your world, you have no super guards, but what happens when you reverse the roles, and a lone giant shows up to defeat an army? Obviously, many players never consider this balance of power, so the game hums along. Those of us who cannot help thinking of such things find it all distasteful.
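The arithmetic behind that figure is easy to check. Here's a minimal sketch; the goblins' average damage is my assumption, since the number quoted above implies roughly 8 points per successful hit:

```python
# Expected number of attacks a mob needs to drop one defender.
# Goblin damage is an assumed value; ~8 per hit roughly reproduces
# the 262-attack figure quoted above.

def attacks_to_kill(hit_points, hit_chance, damage_per_hit):
    hits_needed = hit_points / damage_per_hit
    return hits_needed / hit_chance

guard_hp = 106
hit_chance = 1 / 20   # AC 26: only a natural 20 connects
print(round(attacks_to_kill(guard_hp, hit_chance, 8)))  # ~265 attacks
```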

What if there are no super guards? Nowadays, the D&D rules specifically limit players to non-evil characters. In the early days, no such limitation existed. D&D focused more on killing things for selfish gains than on heroically driving back the darkness. I remember players musing that it made little sense to loot the dungeon when easy pickings lay in town. What happens when a player decides to “role play” his evil character by singlehandedly massacring and looting a town full of level-0 folk? Fortunately, my players always honored the social contract and returned to the dungeon.

Beyond the exponential power curve, players have other preferences. How high a level do you need to be before you should be allowed to hit Asmodeus on a 19? (Keep in mind, since first edition, a roll of 20 always hits.) How much of a bonus should attributes provide as compared to your per-level bonuses? I don’t think I can sway you on these matters any more than I can coax you into a new favorite ice cream flavor.

Next: D&D Next trades to-hit bonuses for enhanced damage

Riding the power curve through D&D’s editions

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

In the very first set of Dungeons & Dragons (1974) rules, every weapon dealt 1d6 damage. Short of magic, characters could only improve their damage output by improving their bonus to hit. More hits equals more damage. Soon, Supplement I: Greyhawk (1975) gave different weapons different damage dice and introduced the strength bonus to damage. Since then, each edition seems to give characters more ways to hit for more damage.

By fourth edition, as characters leveled, they enjoyed steep increases in to-hit bonuses matched with unprecedented increases in the damage each attack dealt. This contributed to characters increasing exponentially in power. It explains why 4E monsters only remain effective for narrow bands of levels, and it explains the nervous tics of every DM who has run an epic table. In past editions, only the wizard saw that kind of power curve, and the non-wizards eventually grew tired of serving as wand caddies for the Wiz.

D&D Next aims to create a power curve in line with earlier editions, while preventing the runaway power traditional for wizards. If you prefer the exponential power curve created in 4E, then you might have to look for a legendary hero module in Next, or stick with 4E and bless any dungeon master eager to run a high-level game.

Greyhawk also introduced Weapon Armor Class Adjustment, a chart that granted bonuses to hit based on how well your particular weapon worked against a style of armor. The table only makes sense because, in the original game, armor class really represented a particular style of armor, such as leather or chainmail. Obviously, dexterity and magical bonuses to armor class quickly turned the table into nonsense. (If you want to make sense of the table, you must apply the dexterity and magical modifiers as penalties to the attack roll.) In practice, no one used the table, and the "class" in armor class lost significance.

While D&D Next thankfully steers clear of weapon armor class adjustment, the system returns to the older practice of making armor class a measure of actual armor, or at least something equivalent.

The D&D Next approach brings back a problem that has bedeviled every edition of the game except fourth. In D&D, to-hit bonuses rise automatically, level after level, while armor class remains roughly the same. Sure, as characters acquire better equipment, armor class improves a little, and in most D&D editions AC starts a little ahead. But characters gain to-hit bonuses automatically, and eventually, inevitably, to-hit bonuses outrun armor class. Everyone begins to hit all the time. As I explained in “Hitting the to-hit sweet spot,” D&D works best when combatants hit between 30% and 70% of the time.

Fourth edition fixed the problem by granting everyone automatic increases to AC to match their automatic increases in to-hit bonuses. Armor class became a function of a character or monster's role and its level. Any reasonably optimal character boasted the same AC as peers in the same role. Armor existed as the flavorful means some characters used to reach the armor class dictated by their role. This kept armor classes on par with bonuses to hit, while making monster design simple.
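In code, 4E-style monster armor class reduces to a trivial function of level plus a role offset. The offsets below are illustrative approximations, not the Dungeon Master's Guide table:

```python
# 4E-style AC: a function of level and role, not of worn armor.
# Role offsets here are illustrative approximations, not official values.
ROLE_AC_OFFSET = {"soldier": 16, "skirmisher": 14, "brute": 12}

def monster_ac(level, role):
    return level + ROLE_AC_OFFSET[role]

print(monster_ac(10, "soldier"))  # 26: stays in the sweet spot because
                                  # attack bonuses climb at the same rate
```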

D&D Next attacks the old problem from the opposite direction. Instead of matching automatic increase with automatic increase, D&D Next limits to-hit bonuses so they never overwhelm the relatively static range of armor classes.

In 4E, in defense as in offense, characters increase exponentially in power. The fixed AC bonuses that 4E granted with each level combined with rising hit points to grant everyone steady increases to two forms of defense. You automatically get harder to hit even as the hits do less effective damage. If you're twice as hard to hit and you can sustain twice the damage, your defenses are four times better.
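You can make the double-dip concrete by folding both defenses into a single figure, effective hit points: the raw damage an attacker must attempt, on average, before you fall. A minimal sketch:

```python
# Effective hit points: hit points divided by the enemy's chance to hit.

def effective_hp(hit_points, enemy_hit_chance):
    return hit_points / enemy_hit_chance

print(effective_hp(30, 0.6))   # 50.0
print(effective_hp(60, 0.3))   # 200.0 -- double both factors, get 4x
```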

D&D Next attempts to straighten out the exponential power curve by refusing to let characters double-dip. Rather than gaining steep bonuses to hit along with increases to damage, you just get increases to damage. Rather than gaining constant improvements to armor class along with additional hit points, you just gain additional hit points. Of course, I'm simplifying to make a point. Characters still gain bonuses to hit as they advance, but at a fraction of the rate seen in third and fourth edition.

When we compare D&D Next to early editions of D&D, the design reins in the to-hit bonuses characters gain as they advance. In compensation, characters gain greater bonuses to the damage they inflict. Like any design decision, this strategy makes some trade-offs, which I will explore in an upcoming post.

Next: Bounded accuracy and matters of taste

Hitting the to-hit sweet spot

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

Through the evolution of Dungeons & Dragons, the game has used two mechanics to determine an attack's damage output: to-hit rolls and damage rolls.

In D&D, hitting is fun, while missing feels like a bummer, particularly if you waited a while for your turn. Why not remove misses from the game?  Once characters gain enough hit points to survive a few attacks, D&D could drop to-hit rolls and─with a few adjustments─still work.

Skipping damage rolls

Back in the third-edition era, Jonathan Tweet advised dungeon masters to speed high-level fights by substituting average damage for rolled damage. In fourth edition, first-level characters have enough hit points to make this approach possible at any level. In a post for The Dungeon Master Experience, Chris Perkins explains how he uses nearly average damage plus 1d6, added for a bit of flavor.

The Chivalry & Sorcery game (1977) skipped damage rolls, assuming “that the damage inflicted by a particular weapon in the hands of given character will be more or less constant.”

The notion of dropping to-hit rolls may seem strange but here’s the crux: Against high hit points, to-hit rolls just turn into another damage-reduction mechanic. In fourth edition, everyone but minions has fairly high hit points and everyone hits about 60% of the time. The damage dealt in combat would work out about the same if we just assumed everyone always hits and multiplied everyone’s hit points by 1.67.

Of course, your to-hit percentage varies, because armor can make hitting more difficult. How can a system without to-hit rolls account for armor class? In my post, "The brilliance of unrealistic hit points," I explained how hit points in D&D function as an ingenious damage-reduction mechanic. Virtually every tabletop role-playing game that aimed for realism made armor reduce damage. In our hypothetical rules variant, why not use D&D's damage-reduction mechanic to represent protective armor? Suppose different types of armor multiplied your hit points. Suppose high dexterity increased hit points, giving more ability to "turn deadly strikes into glancing blows," just like the game says.
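A minimal sketch of this hypothetical variant, with armor and dexterity padding hit points instead of blocking hits. The 1.67 baseline comes from 4E's roughly 60% hit rate; the armor multipliers are invented for illustration:

```python
# Hypothetical always-hit variant: the old miss chance, armor, and
# dexterity all become hit-point multipliers. Armor values are invented
# for illustration, not playtested numbers.

BASELINE = 1 / 0.6   # ~1.67, compensating for a ~60% hit rate

ARMOR_MULTIPLIER = {"none": 1.0, "leather": 1.2, "chainmail": 1.5, "plate": 1.8}

def adjusted_hp(base_hp, armor, dex_mod=0):
    """Hit points after folding the to-hit roll into durability."""
    multiplier = BASELINE * ARMOR_MULTIPLIER[armor] * (1 + 0.05 * dex_mod)
    return round(base_hp * multiplier)

print(adjusted_hp(30, "plate"))               # 90
print(adjusted_hp(30, "leather", dex_mod=2))  # 66
```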

Does abandoning to-hit rolls seem crazy? It has been done.

Tunnels and Trolls (1975) stands as the one early system that defied the RPG hobby's early mania for realism. Ken St. Andre, T&T's designer, aimed for greater simplicity. T&T drops the to-hit roll entirely. Combatants simply weigh damage rolls against each other. Like Tunnels & Trolls, most RPG-inspired video games seem to drop anything equivalent to a to-hit roll. The damage simply rises as you click away at your enemy.

Drawbacks of always hitting

While undeniably simpler, and probably about as realistic as D&D, the you-always-hit approach suffers three problems:

  • Especially with ranged attacks, rolling to hit fits real-world experience much more closely than assuming a hit and skipping straight to damage. Technically speaking, skipping the to-hit roll feels bogus.
  • The two possible outcomes of a to-hit roll offer more drama than the mostly-average damage roll.
  • The to-hit roll provides intermittent, positive reinforcement to the process of making an attack.

You know all about positive reinforcement. It makes you stay up late chasing one more level when you should be in bed. Reinforcement that randomly rewards a behavior is more powerful than reinforcement that occurs every time. In a casino, the best bet comes from the change machines, which pay 1-for-1 every single time. Intermittent, positive reinforcement drives people to the slot machines. As the Id DM might say, “The variable ratio schedule produces both the highest rate of responding and the greatest resistance to extinction.” In short, hitting on some attacks is more fun than hitting on every attack.

Although D&D uses a d20 roll to hit, the game plays best when the odds of hitting stand closer to a coin flip. At the extremes, the math gets strange. For combatants who almost always hit, bonuses and penalties become insignificant. For combatants looking for a lucky roll, small bonuses can double or triple their chance to hit. Also, the game plays better when your chance of hitting sits around 60%. When your chance nears 95%, the roll becomes a formality. When your chance dwindles toward 5%, the to-hit roll feels like a waste of time, especially because beating the long odds probably still earns minimal damage.
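The d20 arithmetic makes the strangeness easy to show. This sketch computes the chance to hit for a given needed roll, and what a +1 bonus is worth at each point on the die:

```python
# Chance to hit when you need <target> or better on a d20 (a natural 20
# always hits, a natural 1 always misses), and what a +1 bonus is worth.

def hit_chance(target):
    return max(1, min(19, 21 - target)) / 20

for target in (19, 15, 11, 7, 3):
    base, bonused = hit_chance(target), hit_chance(target - 1)
    relative_gain = (bonused - base) / base
    print(f"need {target:2}: hit {base:.0%}, +1 is worth {relative_gain:+.0%}")

# need 19: hit 10%, +1 is worth +50%   <- long shots: +1 is huge
# need  3: hit 90%, +1 is worth +6%    <- near-sure hits: +1 is noise
```

Note that needed rolls from 7 through 15 are exactly the ones that keep the hit chance inside the 30-70% band discussed next.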

When the chance of hitting stays between, say, 30% and 70%, the game reaps play benefits:

  • The to-hit roll provides intermittent, positive reinforcement to the process of making an attack.
  • Many attacks can hit and inflict damage, providing constant, positive feedback to players while everyone contributes to the fight.

So D&D designers face the challenge of arranging to-hit bonuses and armor classes so to-hit rolls require a result between a 7 and a 15, a preciously small sweet spot for all the levels, magic, and armor classes encompassed by the game. In practice, designers like to push your chance to hit toward the top of the 30-70% range, because while missing remains part of the game, it's no fun.

The Palladium Role-Playing Game (1983) recognized the play value of the to-hit roll, but it avoided D&D’s problem of finessing to-hit bonuses and armor classes to reach a sweet spot. The game simply declares that “any [d20] roll above four (5 to 20) hits doing damage to the opponent.” Armor in the game takes damage, effectively serving as an additional pool of hit points.

In upcoming posts, I’ll discuss the very different steps the D&D Next and 4E designers took to find the to-hit sweet spot.

Next: Riding the power curve

The brilliance of unrealistic hit points

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

After the role-playing game hobby's first 10 years, designers turned from strict realism and began to design rules that both supported a game's flavor and encouraged its core activities. Runequest's realistically lethal combat system fit the fearful world of Call of Cthulhu (1981), as did a new sanity system. Paranoia (1984) built in rules that encouraged a core activity of treachery, while giving each character enough clones to avoid hard feelings.

Today, this innovation carries through stronger than ever. Dungeons & Dragons' fourth-edition designers saw D&D's fun in dynamic battles and showing off your character's flashy capabilities, so they optimized rules that heightened that aspect of the game, possibly at the expense of other aspects.

When Dave Arneson mashed rules for ironclads into Chainmail, he probably gave little thought to supporting the D&D play style that would launch a hobby, but he created some brilliant conventions.

The best idea was to give characters steadily increasing hit point totals that "reflect both the actual physical ability of the character to withstand damage─as indicated by constitution bonuses─and a commensurate increase in such areas as skill in combat and similar life-or-death situations, the 'sixth sense' which warns the individual of otherwise unforeseen events, sheer luck and the fantastic provisions of magical protections and/or divine protection." (Gary wrote this rationale for hit points in the first edition Dungeon Master's Guide.)

Every "realistic" system to follow D&D used hit points to measure the body's physical capacity to survive injury. In D&D, rising hit points work as an elegant damage-reduction mechanic. Using hit points for damage reduction boasts a number of virtues:

  • Combat plays fast because players do not have to calculate reduced damage for every single hit.
  • Although damage is effectively reduced, the reduction never makes a combatant impervious to damage.
  • Once characters gain enough points to survive a few blows, hit points provide a predictable way to see the course of battle. If a fight begins to go badly, the players can see their peril and bring more resources like spells and potions to the fight, or they can run. In a realistic fight, things can go bad in an instant, with a single misstep resulting in death.
  • Most attacks can hit and inflict damage, providing constant, positive feedback to players while everyone contributes to the fight. Realistic combatants do not wear down from dozens of damaging blows; instead each hit is likely to kill or maim. In more realistic systems like Runequest and GURPS, when two very skilled combatants face off, they block or dodge virtually all attacks. The duels turn static until someone muffs a defense roll and lets a killing blow slip through. This model may be realistic─it reminds me of those Olympic competitions where years of training turn on a single, split-second misstep─but the realistic model lacks fun. No popular sport begins as a sudden-death competition where the first to score wins.
  • Battles can gain a dramatic arc. Fights climax with bloodied and battle-worn combatants striving to put their remaining strength into a killing blow. No one likes to see the climactic battle fizzle with a handful of bad rolls, especially at their character’s expense.

Bottom line: Using hit points for damage reduction enables a combat system where you can hit a lot, and hitting is fun.

Critics of inflated hit points still had a point. Using hit points as a damage-reduction mechanic can strain credulity, especially when you cannot explain how a character could reasonably reduce the damage he takes. Why should an unconscious or falling hero be so much more durable than a first-level mook?  Why does cure light wounds completely heal the shopkeeper and barely help a legendary hero? Over the years, we’ve seen attempts to patch these problems. For example, I liked how fourth edition’s healing surge value made healing proportional to hit points, so I’m sorry to see D&D Next turn back to the traditional hierarchy of cure spells.
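The proportional fix is easy to see in miniature. A sketch comparing a fixed-dice cure to a 4E healing surge, whose value is one quarter of maximum hit points (ignoring the caster-level bonus older cure spells added):

```python
# Fixed-dice healing vs. 4E's proportional healing surge.

def cure_light_wounds():
    return 4.5                 # average of 1d8, ignoring level bonuses

def healing_surge(max_hp):
    return max_hp // 4         # surge value: a quarter of max hit points

print(cure_light_wounds(), healing_surge(8))    # shopkeeper: 4.5 vs 2
print(cure_light_wounds(), healing_surge(100))  # legendary hero: 4.5 vs 25
```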

D&D maintains a deliberate vagueness about the injuries inflicted by a hit. This abstraction makes possible D&D’s brilliant use of hit points as a damage-reduction mechanic. Fourth edition exploits the ambiguity more than ever, making plausible the second wind and the healing power of a warlord’s inspiration. 4E explicitly makes hit points as much a measure of resolve as of skill, luck and physical endurance. Damage apparently exists as enough of an abstraction that even if a hit deals damage, it doesn’t necessarily draw blood.

Even as 4E aims for the loosest possible interpretation of a hit, it makes the hit roll more important than in any prior edition. In 4E, melee hits can inflict crippling effects without saves. Just getting hit automatically subjects you to poison, or paralysis, or whatever. In past editions, if the spider bit or the ghoul clawed, you took the damage, but you still got an immediate save.

In the early days of the RPG hobby, many games attempted to fuse D&D's fantastic setting with a more realistic model of combat damage. Although a few of these games enjoyed success, none recreated the combat-intensive, dungeon-bashing play style pioneered by D&D. At the time, no one seemed to realize that the clever damage-reduction mechanism built into the game enabled the game's play style.

Video game designers figured it out. Virtually every video game that combines fighting with character improvement features D&D-style rising hit points.

Next: Hitting the to-hit sweet spot

What does D&D have to do with ironclad ships?

When Dave Arneson set out to create the combat system that would become a pillar of Dungeons & Dragons, he did not aim to create a realistic simulation. In a 2004 interview, he describes the system's genesis from Gary Gygax's Chainmail rules.

Combat in Chainmail is simply rolling two six-sided dice, and you either defeated the monster and killed it…or it killed you. It didn’t take too long for players to get attached to their characters, and they wanted something detailed which Chainmail didn’t have. The initial Chainmail rules was a matrix. That was okay for a few different kinds of units, but by the second weekend we already had 20 or 30 different monsters, and the matrix was starting to fill up the loft.

I adopted the rules I’d done earlier for a Civil War game called Ironclads that had hit points and armor class. It meant that players had a chance to live longer and do more. They didn’t care that they had hit points to keep track of because they were just keeping track of little detailed records for their character and not trying to do it for an entire army. They didn’t care if they could kill a monster in one blow, but they didn’t want the monster to kill them in one blow.

So the D&D rules for hit points and armor class stem from rules for ironclad ships trading cannon blasts, hardly the basis for an accurate simulation of hand-to-hand battles.

Soon after I began playing D&D, the unrealistic combat rules began to gnaw at me. In the real world, armor reduces the damage from blows rather than making you harder to hit. Shouldn’t it work the same way in the game? And how could a fighter, no matter how heroic, survive a dozen arrow hits, each dealing enough damage to kill an ordinary man? In reality, a skilled fighter would stand a better chance of evading blows, but no better chance of surviving a single hit.

Quest for realism

In the decade after D&D's introduction, a mania for creating realistic alternatives to D&D dominated the hobby. Every D&D player who ever wielded a sword in the Society for Creative Anachronism cooked up a more realistic alternative to the D&D combat system. Runequest (1978) stands as the greatest early success. Characters' hit points remained constant, but they became more able to dodge and block blows. Hit locations transformed characters from blobs of hit points into flesh and bone. Armor reduced damage by deflecting and cushioning blows.

If you enjoyed the AD&D Weapon Armor Class Adjustment table, but felt it needed to go much, much further, the Rolemaster Arms Law (1980) system offered more than 30 tables matching weapons versus armor.

In this era, everyone formulated a critical hit table, because nothing adds fun to a system like skewered eyes, fountaining stumps, and sucking chest wounds. (Follow this blog for my upcoming list of supposedly fun, but not fun, things we did in the early days of role playing.)

I sought realism as much as anyone, first with Runequest, and then with GURPS. I quickly learned that making combat more realistically deadly made D&D-style, combat-intensive play impractical. Forget dungeon crawls; even skilled characters would eventually perish to a lucky blow. As I described in Melee, Wizard, and learning to love the battle map, early D&D combat lacked excitement anyway, so I hardly missed all the fights.

But I would come to realize that my dismissal of the D&D combat system was completely wrong.

Next: The brilliance of unrealistic hit points

But how do you win?

I discovered Dungeons & Dragons in 1977 with the blue basic set. This was before the general public came to understand that D&D was a possibly satanic form of play-acting typically performed in steam tunnels. When I described my new passion to folks, they always asked the same question: "How do you win?" I would explain that you could improve your character, but no one actually won.

I inevitably got the same follow up question. “If you can’t win, then what’s the point?” The notion that you played to have fun without any chance of winning puzzled everyone.

The how-do-you-win question, more than anything else, reveals the gulf between how people thought of games in 1977 and just a few years later. Now, for example, we rarely think of video games as something you win rather than something you finish. But in 1977, the first few video games, Space War and Pong, only featured head-to-head competition; the point had to be to win.

I would like to credit D&D and other tabletop RPGs with broadening games from competition to fun activity, but I trace the change to video games. In 1978, Space Invaders began appearing in every bar and bowling alley in the country. Even though you could never win, the game became a sensation. The space invaders marched endlessly toward your cannon until you inevitably lost. The point of the game turned from winning to performing a fun activity.

Space Invaders and the avalanche of video games that followed changed the way people saw games. From time to time, people still ask me to explain Dungeons & Dragons, but it’s been at least thirty years since anyone asked me the question that used to be inevitable. “If you can’t win, then what’s the point?”

Next: What does D&D have to do with ironclad ships?

Designing for spells that spoil adventures

In my last two posts, starting with Spells that can ruin adventures, I discussed the various spells with the potential to spoil Dungeons & Dragons adventures, turning hours of fun into a quick ambush. You may say, "Why worry? Just rule that these spells don't exist in your campaign." That works if you have enough foresight to carefully examine the spell lists and establish a list of dangerous spells and magic items that might ruin your campaign plans. Of course, you could also rule that Zone of Truth doesn't exist in your game the minute it becomes a problem. But your players will hate that.

The D&D system’s spells and magic contribute to an implied setting that most D&D players and DMs share. As a DM, you can ban spells, but that offers no help for authors of adventures for organized play or for publication. Authors writing D&D fiction also must work around these spells, or ignore them and hope the readers fail to notice.

The fourth edition attempted to eliminate every last adventure-ruining effect. Fly effects really just let you jump. The ethereal plane is gone, or at least inaccessible. Linked portals replace the long-range teleport spell. While I favor this approach over keeping all the problem spells in the system, I concede that the purge might have been heavy-handed.

So that brings us to today. Seeing Zone of Truth in the D&D Next spell list inspired me to write these posts. These spells and effects need careful weighing of the benefits they offer to the game, and more thought about how they affect adventures and the implied game setting.

For the designers of D&D, I have the following suggestions:

  • Spells that compel honesty or discern lies do not add enough to earn a place in the game. These spells could exist as optional elements.
  • Spells that detect evil should only detect the supernatural evil of undead, outsiders and the like.
  • Divination spells must provide hints and clues rather than unequivocal answers, and should discourage players from seeking answers too often.
  • Scry spells must be subject to magical and mundane counters such as the metal sheeting that blocked Clairvoyance and Clairaudience in the first edition.
  • Scry spells should never target creatures, as Scrying does, but only known locations, as Clairvoyance and Clairaudience do.
  • Ethereal travel must be subject to barriers such as gorgon’s blood mortar, permanent ethereal objects, and perhaps even vines, as mentioned in the original Manual of the Planes.
  • The game should offer some magical countermeasures to teleportation, such as Anticipate Teleport, and the ability to make these spells permanent.
  • The Dungeon Master’s Guide needs a chapter on magical effects that the DM should plan for in campaign and adventure design, starting with fly and divination.

Next: But how do you win?

Scry and fry

(Part 2 of a series, which begins with Spells that can ruin adventures.)

Third edition Dungeons & Dragons added the Scrying spell, which unlike Clairvoyance and Clairaudience could target a creature rather than just a familiar location. Scrying worked in conjunction with Teleport to make villains vulnerable to the scry-buff-teleport system of ambush, also known as scry and fry.

The target of the Scrying spell gets a save, but the wizard can always wait for another attempt, or just scry Igor or minion #3. The game offers a couple of eighth-level, defensive spells in Screen and Mind Blank. Will the Dark Lord mind blank Igor too? None of these spells can be made permanent, so apparently, every high-level villain needs archmages on staff just to thwart do-gooder knuckleheads who can cast a sixth-level spell. In practice, the best defense might be a DM with the chutzpah to fudge an improbable number of saves. The Pathfinder system makes the spell easy to counter with lead shielding. Why didn’t Gary Gygax think of that? He did. In the first edition, metal sheeting blocked Clairvoyance and Clairaudience.

The teleport ambush worked so well, and the game offered so few countermeasures that Monte Cook stepped up in 2001 and included spells like Teleport Block, Teleport Tracer, and Teleport Redirect in his Book of Eldritch Might. In 2005, the Spell Compendium finally added practical countermeasures to the base game with Anticipate Teleportation and Greater Anticipate Teleportation. These spells delay the arrival of teleporting creatures into an area long enough to foil an ambush.

Anticipate Teleportation serves as an excellent example of a countermeasure that allows problematic spells to continue working, while adding interesting complications that make using them risky. If Anticipate Teleportation can be made permanent, then it adds a perfect solution to the game. Teleport Redirect, on the other hand, counters Teleport with an easily lethal trap. As a DM, I want to avoid killing an entire party due to an unwise teleport.

Players don’t like having the DM nullify the cool things they can do, even if it’s cloaked in the guise of the villain’s wards and traps. If your villain happens to use some of the gotcha effects, you’re really going to see some angry glares across the table. “So you’re saying that after you heard us talking about teleporting, the bad guy just ‘happened’ to have Teleport Redirect in place.”

Players hate when you use your DM’s knowledge of their plans to invalidate their cleverness or cool toys. (And unless you can point to the gotcha, spelled out in advance, in ink, players will always suspect this.)

The ideal defenses to game-ruining spells make the spells riskier to use without invalidating them, like Anticipate Teleportation. The counter effects cannot be so devastating that players feel punished for daring to use their hard-earned magic against the DM's pet villains. And some countermeasures, like metal sheeting, need to be within reach of canny villains who cannot afford to keep archmages on retainer.

Next: Designing for spells that spoil adventures