Tag Archives: armor class

The Tangled Origins of D&D’s Armor Class, Hit Points, and Twenty-Sided Die Rolls To-Hit

In 1977, when I first read the Dungeons & Dragons basic rules, the way armor class improved as it shrank from 9 to 2 puzzled me. Shouldn’t higher numbers be better? Players just used AC to find a row on a table, so rising ACs would have worked as well. Magic armor introduced negative ACs, making the descending numbers even more awkward. Also, many of the demons described in the 1976 Eldritch Wizardry supplement sported negative armor classes.

D&D’s designers seemed to think rising armor classes made more sense. The game rules stemmed from co-creator Gary Gygax’s Chainmail rules for miniature-figure battles. Chainmail rated armor from 1 to 8, with better armor gaining higher values. Co-creator Dave Arneson based his Blackmoor fantasy campaign on Chainmail. His campaign developed into D&D. In Blackmoor, higher armor classes represented better armor.

So how did the first D&D rules set the puzzling convention of descending armor class?

The answer lies toward the end of the genesis of D&D’s combat system.

In the original D&D rule books, the combat system that everyone used appears as the Alternative Combat System. “Alternative” because players could just use the combat system from Chainmail instead. When Dave launched Blackmoor, he tried the Chainmail system. But it focused on battles between armies sprinkled with legendary heroes and monsters. For ongoing adventures in the dungeon under Castle Blackmoor, the rules needed changes. Original Blackmoor player Greg Svenson recalls that within about a month of play, the campaign created new rules for damage rolls and hit points. (More recently, Steve Winter, a D&D designer since 1st edition, tells of playing the original game with the Chainmail combat rules.)

Much of what we know about how Dave adapted the rules for his Blackmoor campaign comes from two sources: a 2004 interview and The First Fantasy Campaign, a raw publication of notes for his game. Most quotes in this post come from those sources.

Chainmail’s melee combat matrix

To resolve melee combat, Chainmail used a combat matrix. Players matched the attacking weapon or creature against the defender, rolled a pair of 6-sided dice, and consulted the table for an outcome. “That was okay for a few different kinds of units, but by the second weekend we already had 20 or 30 different monsters, and the matrix was starting to fill up the loft.”

Dave abandoned the matrix and extended Chainmail’s rules for missile attacks to melee combat. In Chainmail, ranged attackers rolled 2d6 and tried to roll higher than a target number that increased with the target’s armor class. Blackmoor gained melee to-hit rolls.

Chainmail’s man-to-man combat and ranged combat tables

In Chainmail, creatures lacked hit points, so a single hit killed. But with extraordinary individuals like heroes, wizards, and dragons, a saving throw allowed a last chance to survive. For example, the rules say, “Dragon fire will kill any opponent it touches, except another Dragon, Super Hero, or a Wizard, who is saved on a two dice roll of 7 or better.”

Under rules where one hit destroyed a character, Dave tried to spare player characters by granting saving throws against any hit. “Thus, although [a character] might be ‘Hit’ several times during a melee round, in actuality, he might not take any damage at all.”

But the system of saving throws still made characters too fragile to suit players. “It didn’t take too long for players to get attached to their characters, and they wanted something detailed which Chainmail didn’t have,” Dave explains.

Chainmail battle on a sand table

“I adopted the rules I’d done earlier for a Civil War game called Ironclads that had hit points and armor class. It meant that players had a chance to live longer.” In a Chainmail battle that featured armies spanning a sand table, hit points would have overwhelmed players with bookkeeping. But the Blackmoor players liked the rule. “They didn’t care that they had hit points to keep track of because they were just keeping track of little detailed records for their character and not trying to do it for an entire army. They didn’t care if they could kill a monster in one blow, but they didn’t want the monster to kill them in one blow.”

When players rolled characters, they determined hit points. For monsters, hit points were set based “on the size of the creature physically and, again, on some regard for its mythical properties.” Dave liked to vary hit points among individual monsters. To set the strength of a type of monster while rolling for an individual’s hit points, he probably invented hit dice.

Dave said he took the armor class from Ironclads, but the concept came from Chainmail and the term came from its 1972 revisions. I suspect Dave meant that he pulled the notion of hit points and damage from a naval game that featured both armor ratings and damage points. Game historian Jon Peterson explains, “The concepts of armor thickness and withstanding points of damage existed in several naval wargames prior to Chainmail.” Still, nobody has found the precise naval rules that inspired Dave. Even his handwritten rules for ironclad battles lack properties resembling armor class. Perhaps he just considered using the concept in a naval game before bringing the notion to D&D.

In Blackmoor, Dave sometimes used hit locations. Perhaps naval combat inspired that rule. When ships battle, shells that penetrate to a boiler or powder keg disable more than a cannonball through the galley. Likewise, in man-to-man combat, a blow to the head probably kills.

Dave’s rules for hit locations only reached D&D in the Blackmoor supplement, which came a year after the game’s release. But hit locations made combat more complicated and dangerous. Realistic combat proved too deadly for the dungeon raids in D&D. So D&D players never embraced hit locations. Even Dave seemed to save the rule for special occasions. “Hit Location was generally used only for the bigger critters, and only on a man-to-man level were all the options thrown in. This allowed play to progress quickly even if the poor monsters suffered more from it.” Dave ran a fluid game, adapting the rules to suit the situation.

By the time Dave showed his fantasy game to Gary Gygax, it had established hit points, 2d6 to-hit rolls, and damage rolls.

Next: Gary Gygax improves hit points by making them more unrealistic, and then adds funny dice

Fourth Edition Proved D&D Works Without Saving Throws, So Why Did They Come Back?

Fourth edition dropped saving throws in favor of to-hit rolls and showed that D&D works without saves.

Mathematically, to-hit rolls and saving throws just flip the numbers so that a high roll benefits the person casting the die. Rather than having a lightning bolt trigger saves, why not just let wizards make lightning attacks against their targets? Why not just have poison attack a character’s fortitude?
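If the equivalence isn’t obvious, a minimal sketch in Python shows the same event computed from both sides of the table. The +5 bonus, defense 16, and DC 16 are hypothetical numbers, and the sketch ignores natural-1 and natural-20 rules:

```python
def p_attack_hits(attack_bonus, defense):
    # Chance that d20 + attack_bonus meets or beats the defense number.
    need = min(max(defense - attack_bonus, 1), 21)  # die result required to hit
    return (21 - need) / 20

def p_save_fails(save_bonus, dc):
    # Chance that d20 + save_bonus falls short of the DC -- the same event,
    # with the die handed to the other side of the table.
    need = min(max(dc - save_bonus, 1), 21)  # die result required to save
    return (need - 1) / 20

# A +5 attack against defense 16 matches a DC 16 effect against a +5 save.
print(p_attack_hits(5, 16))  # 0.5
print(p_save_fails(5, 16))   # 0.5
```

Flipping who rolls changes nothing but which player picks up the die.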

By dropping saving throws, the fourth-edition designers eliminated a redundant mechanic. The change added consistency and elegance to D&D. Wizards finally got to cast spells and to make attack rolls.

If banishing saving throws made D&D more elegant, why did fifth edition bring them back? After all, the fifth-edition designers made elegance a key goal for their design. See “From the brown books to next, D&D tries for elegance.”

Until fourth edition, saving throws survived based on tradition and feel.

The tradition dates to when Tony Bath had toy soldiers saving versus arrows. (See my last post.) The fifth-edition designers aimed to capture tradition, but also the best qualities of earlier editions. Why not capture some of the elegant design of fourth edition?

The feel comes from a sense that the player controlling the most active character should roll the dice. D&D could drop to-hit rolls in favor of saves versus swords, but that feels wrong. On the other hand, characters seem active when they resist a charm, shake off a ghoul’s paralysis, or spring away from rushing flames. Sure, a wizard is saying magic words, a dragon is exhaling, but the action focuses on the heroes escaping the flames.

Plus, the saving throw mechanic tends to send a few more rolls to the players. Players like to roll dice, especially when the roll decides their character’s fate. When attack rolls replaced saving throws, spellcasters got to make more attack rolls, but most characters lack spells. Without saving throws, players flamed by dragon breath never get to take fate in their hands and roll a save. Instead, they just subtract damage.

So saving throws returned to D&D.

If saving throws and attack rolls share a common place in the game, what makes them different from each other?

As a dungeon master, have you ever asked a player dodging a trap’s darts to make a dexterity or reflex save? I have. I handled it wrong. Don’t fault me too much. A save gives a character a chance to escape. Characters springing away from darts or scything blades or falling stones seem to deserve a save. But that intuition is wrong. Such traps should make attacks. The Dungeon Master’s Guide never spells out this distinction.

Just as the reflex defense and AC in fourth edition defended against different sorts of attacks, in fifth edition, dexterity saves and armor class apply to different hazards. The difference comes from armor. D&D’s lead designer Mike Mearls explains that to determine whether to use an attack roll or a save, ask “Would a suit of plate mail protect from this?” Armor protects against darts, scythes, and so on, so traps using such hazards make attacks. Poisonous fumes, lightning, and mind blasts all ignore armor, so targets make saves. I would rather face a fireball protected by plate, but the rules emphasize the agility needed to escape the flames.

Originally, Tony Bath’s saving throws represented the value of armor. Now, saving throws only apply when armor can’t help.

Mearls confesses that the D&D rules don’t always make this save-or-attack distinction consistently. Plate mail certainly protects against falling rocks, and the falling-rock traps in the third-edition Dungeon Master’s Guide all make attacks. But the falling-rock traps in Lost Mine of Phandelver prompt dexterity saves. Better to leap from harm’s way, I suppose.

One area of inconsistency irks me.

Why should plate armor protect against the incorporeal, life-draining touch of creatures like specters and wraiths? Here, tradition and feel led the D&D designers to use attack rolls in a place where saving throws make more sense. If insubstantial creatures forced a target to make a dexterity saving throw, their life draining would imitate third edition’s touch attacks without a single extra rule. Plus, these undead would play like more distinct and interesting threats. Forget the feel of a to-hit roll: incorporeal creatures should force saving throws.

Proficiency and bounded accuracy in D&D Next

In my last post, I wrote about how the Dungeons & Dragons Next proficiency bonus jams all the tables and rules for attack bonuses and saving throw bonuses and check bonuses into a single rising bonus. This consolidation yields a simpler system, but the proficiency mechanic influences every corner of the game.

Attack roll tables from D&D Rules Cyclopedia

Proficiency bonuses increase slowly compared to similar bonuses in earlier versions of the game. They top out at a mere +6 at 19th level. This slow progression stems from a principle the designers called bounded accuracy, because none of the designers come from the marketing team. Actually, “accuracy” refers to bonuses to the d20 rolls made to hit, to land spells, and to make checks. Accuracy is “bounded” because the game no longer assumes characters will automatically gain steep bonuses as they advance to higher levels. See the Legends and Lore post “Bounded Accuracy” for more.

Bonus to attack

Before third-edition D&D, armor class never rose much. In “‘To Hit’ vs. Armor Class,” longtime D&D designer Steve Winter charts the progression between to-hit rolls and AC. Steve concludes, “In AD&D, as characters advance up the level scale, they constantly gain ground against the monsters’ defenses. A 15th-level fighter doesn’t just hit lower-level monsters more often; he hits all monsters, even those of his own level, more reliably than before.”

This meant that rising attack bonuses eventually made attack rolls into a formality. Mechanically that works, because in early editions, as fighters gained levels, their damage increased not because each blow dealt more damage, but because they hit more often.

But attack rolls benefit D&D for two reasons:

  • Hit-or-miss attack rolls add fun. To-hit rolls offer more drama than damage rolls, and the rolls provide intermittent, positive reinforcement to attacks. See “Hitting the to-hit sweet spot” for more.
  • If to-hit bonuses overwhelm armor bonuses, armor and armor class become meaningless to high-level combatants. Perhaps this finally explains the chainmail bikini.

To keep attack rolls meaningful, fourth edition makes ACs rise automatically, even though nothing in the game world justifies the rise. (You might say that the rise in AC reflects combatants’ rising ability to evade attacks, but a rise in hit points reflects the same slipperiness.) The steep rise in AC meant that lower-level creatures couldn’t hit higher-level combatants, and it forced all battles to feature combatants of similar levels. In 4E, physical armor just provides a flavorful rationale for the AC number appropriate for a level and role.

D&D Next returns to the older practice of making armor class a measure of actual armor, or at least something equivalent. At high levels, the game keeps to-hit rolls meaningful by limiting the proficiency bonus to that slight +6 at 19th level. With such a small bonus, to-hit rolls never climb enough to make armor pointless. For more, see “Bounded accuracy and matters of taste.”

In the last public playtest, and for the first time in D&D history, every class shares the same attack bonuses. In Next, characters don’t stand out as much for how often they hit as for what happens when they hit.

Bonus to checks

In third and fourth editions, characters gained steep bonuses to skill checks as they advanced in levels. Each game managed the bonuses in a different way, and each approach led to different problems.

In 3E, characters who improved the same skills with every level became vastly better at those skills than any character who lacked the skill. Eventually, DCs difficult enough to challenge specialists became impossible for parties that lacked a specialist. On the other hand, DCs easy enough to give non-specialists a chance became automatic for specialists. By specialist, I don’t mean a hyper-optimized, one-trick character, just a character who steadily improved the same skills.

In 4E, skills grant a constant +5 bonus, and every character gains a half-level bonus to every check, so everyone gets steadily better at everything. This approach means that no character grows vastly better than their peers at the same level. It does mean that by level 10, a wizard with an 8 strength gains the ability to smash down a door as well as a first-level character with an 18 strength (the wizard’s −1 strength modifier plus a +5 half-level bonus matches the new character’s +4 modifier). To keep characters challenged, and to prevent suddenly mighty, strength-8 wizards from hulking out, 4E includes the “Difficulty class by level” table, which appears on page 126 of the Rules Compendium. With this table in play, characters never improve their chance of making any check; they just face higher DCs. Most players felt like their characters walked a treadmill that offered no actual improvements.

For more on checks in 3E and 4E, see “Two problems that provoked bounded accuracy.”

With the proficiency bonuses, D&D Next attempts to thread a needle. High-level bonuses should not reach so high that challenges for proficient characters become impossible for the rest. But the bonuses should go high enough to give proficient characters a chance to stand out and shine.

At the top end, a 19th-level character with a suitable 20 ability score and proficiency will enjoy a +11 to checks. This bonus falls well within the 1-20 range of a die roll, so most tasks within reach of a specialist also fall within the ability of a lucky novice. If anything, the maximum +11 for a talented, proficient, level-20 superhero seems weak.

Two bonuses form that +11, the proficiency bonus and the ability modifier. To me, a proficiency bonus that starts at +2 at level 1 and rises to +6 at level 19 threads the needle well enough.

New characters gain a +2 proficiency bonus as opposed to the +4 or +5 skill bonuses in the last two editions. This paints new D&D Next characters as beginners, little better than untrained. New characters must rely on talent to gain an edge.

However, talented characters barely gain any edge either. Typical new characters gain a +3 ability modifier from their highest score. I’ve shown that ability modifiers are too small for checks. Players make 11.3 attack rolls for every 1 check, according to plausible research that I just made up. With so many attacks, a +3 to-hit bonus lands extra hits. With so few checks, a +3 bonus ranks with the fiddly little pluses that the designers eliminate in favor of the advantage mechanic.

The playtest package’s DM Guidelines advise skipping ability checks when a character uses a high ability score: “Take into account the ability score associated with the intended action. It’s easy for someone with a Strength score of 18 to flip over a table, though not easy for someone with a Strength score of 9.” The D&D Next rules demand this sort of DM intervention because the system fails to give someone with Strength 18 a significant edge over a Strength 9 character. The result of the d20 roll swamps the puny +4 bonus. In practice, the system math makes flipping the table only slightly easier at strength 18.
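To put rough numbers on that claim, here’s a back-of-the-envelope sketch in Python. The DC 10 for flipping a table is my hypothetical number, not one from the playtest:

```python
def check_success(mod, dc):
    # Chance that d20 + mod meets or beats the DC.
    need = min(max(dc - mod, 1), 21)  # die result required to succeed
    return (21 - need) / 20

# Strength 18 (+4) versus Strength 9 (-1) against a hypothetical DC 10.
print(check_success(4, 10))   # 0.75
print(check_success(-1, 10))  # 0.50
```

A 25-point swing matters, but the strength-9 character still flips the table half the time, so the die drowns out most of the difference the fiction promises.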

Update: The published game grants level-one characters a +2 proficiency bonus as opposed to the +1 that appeared in the final playtest.

In a curious move, the final public playtest packet eliminates the Thievery skill. Instead, the designers opt to make thieves proficient with thieves’ tools. Why? This results from the elimination of fiddly little pluses such as the +2 once granted by thieves’ tools. Without the +2, why bother with the tools? Now thieves need the tools to gain their proficiency bonus. Somewhere, sometime, a confused player will add a proficiency bonus they assume they have for Thievery to a bonus for the tools, double-dipping.

Next: Saving throw proficiency and ghouls

Riding the power curve through D&D’s editions

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

Signed Greyhawk Cover

In the very first set of Dungeons & Dragons (1974) rules, every weapon dealt 1d6 damage. Short of magic, characters could only improve their damage output by improving their bonus to hit. More hits equals more damage. Soon, Supplement I: Greyhawk (1975) gave different weapons different damage dice and introduced the strength bonus to damage. Since then, each edition seems to give characters more ways to hit for more damage.

By the fourth edition, as characters leveled, they enjoyed steep increases in to-hit bonuses matched with unprecedented increases in the damage each attack dealt. This contributed to characters increasing exponentially in power. It explains why 4E monsters only remain effective for narrow bands of levels, and it explains the nervous tics of every DM who has run an epic table. In past editions, only the wizard saw that kind of power curve, and the non-wizards eventually grew tired of serving as wand caddies for the Wiz.

D&D Next aims to create a power curve in line with earlier editions, while preventing the runaway power traditional for wizards. If you prefer the exponential power curve created in 4E, then you might have to look for a legendary hero module in Next, or stick with 4E and bless any dungeon master eager to run a high-level game.

Greyhawk also introduced Weapon Armor Class Adjustment, a chart that granted bonuses to hit based on how well your particular weapon worked against a style of armor. The table only makes sense because, in the original game, armor class really represented a particular style of armor, such as leather or chainmail. Obviously, dexterity and magical bonuses to armor class quickly turned the table into nonsense. (If you want to make sense of the table, you must apply the dexterity and magical modifiers as penalties to the attack roll.) In practice, no one used the table, and the “class” in armor class lost significance.

While D&D Next thankfully steers clear of weapon armor class adjustment, the system returns to the older practice of making armor class a measure of actual armor, or at least something equivalent.

The D&D Next approach brings back a problem that has bedeviled every edition of the game except fourth. In D&D, to-hit bonuses rise automatically, level after level, while armor class remains roughly the same. Sure, as characters acquire better equipment, armor class improves a little, and in most D&D editions AC starts a little ahead. But characters gain to-hit bonuses automatically, and eventually, inevitably, to-hit bonuses outrun armor class. Everyone begins to hit all the time. As I explained in “Hitting the to-hit sweet spot,” D&D works best when combatants hit between 30% and 70% of the time.

Fourth edition fixes the problem by granting everyone automatic increases to AC to match their automatic increases in to-hit bonuses. Now armor class becomes a function of a character or monster’s role and its level. Any reasonably optimal character would boast the same AC as peers in the same role. Armor exists as the flavorful means that some characters used to reach the armor class dictated by their role. This kept armor classes on par with bonuses to hit, while making monster design simple.

D&D Next attacks the old problem from the opposite direction. Instead of matching automatic increase with automatic increase, D&D Next limits to-hit bonuses so they never overwhelm the relatively static range of armor classes.

In 4E, in defense as in offense, characters increase exponentially in power. The fixed AC bonuses that 4E granted with each level combined with rising hit points to grant everyone steady increases to two forms of defense. You automatically get harder to hit even as the hits do less effective damage. If you’re twice as hard to hit and you can sustain twice the damage, your defenses are four times better.

D&D Next attempts to straighten out the exponential power curve by refusing to let characters double-dip. Rather than gaining steep bonuses to hit along with increases to damage, you just get increases to damage. Rather than gaining constant improvements to armor class along with additional hit points, you just gain additional hit points. Of course, I’m simplifying to make a point. Characters still gain bonuses to hit as they advance, but they gain at a fraction of the rate seen in third and fourth edition.

When we compare D&D Next to early editions of D&D, the design reins in the to-hit bonuses characters gain as they advance. In compensation, characters gain greater bonuses to the damage they inflict. Like any design decision, this strategy makes some trade-offs, which I will explore in an upcoming post.

Next: Bounded accuracy and matters of taste

Hitting the to-hit sweet spot

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

Through the evolution of Dungeons & Dragons, the game has used two mechanics to determine an attack’s damage output: to-hit rolls and damage rolls.

In D&D, hitting is fun, while missing feels like a bummer, particularly if you waited a while for your turn. Why not remove misses from the game? Once characters gain enough hit points to survive a few attacks, D&D could drop to-hit rolls and, with a few adjustments, still work.

Skipping damage rolls

Back in the third-edition era, Jonathan Tweet advised dungeon masters to speed high-level fights by substituting average damage for rolled damage. In fourth edition, first-level characters have enough hit points to make this approach possible at any level. In a post for The Dungeon Master Experience, Chris Perkins explains how he uses nearly average damage plus 1d6, added for a bit of flavor.

The Chivalry & Sorcery game (1977) skipped damage rolls, assuming “that the damage inflicted by a particular weapon in the hands of [a] given character will be more or less constant.”

The notion of dropping to-hit rolls may seem strange but here’s the crux: Against high hit points, to-hit rolls just turn into another damage-reduction mechanic. In fourth edition, everyone but minions has fairly high hit points and everyone hits about 60% of the time. The damage dealt in combat would work out about the same if we just assumed everyone always hits and multiplied everyone’s hit points by 1.67.
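A quick simulation backs up the arithmetic. This Python sketch uses hypothetical numbers: a 30-hit-point monster, a 60% hit chance, and 6 average damage per blow:

```python
import random

def avg_rounds_to_kill(hp, hit_chance, avg_damage, trials=100_000):
    # Average rounds needed to grind through hp when some swings miss.
    total = 0
    for _ in range(trials):
        remaining, rounds = hp, 0
        while remaining > 0:
            rounds += 1
            if random.random() < hit_chance:
                remaining -= avg_damage
        total += rounds
    return total / trials

print(avg_rounds_to_kill(30, 0.6, 6))  # ~8.3 rounds with misses
print(avg_rounds_to_kill(50, 1.0, 6))  # 9 rounds always hitting (30 x 1.67 hp)
```

The fights last about the same number of rounds; the to-hit roll just spreads the monster’s effective hit points across some swings that do nothing.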

Of course, your to-hit percentage varies from a set percentage, because armor can make hitting more difficult. How can a system without to-hit rolls account for armor class? In my post, “The brilliance of unrealistic hit points,” I explained how hit points in D&D function as an ingenious damage-reduction mechanic. Virtually every tabletop role-playing game that aimed for realism made armor reduce damage. In our hypothetical rules variant, why not use D&D’s damage-reduction mechanic to represent protective armor? Suppose different types of armor multiplied your hit points. Suppose high dexterity increased hit points, giving more ability to “turn deadly strikes into glancing blows,” just like the game says.

Does abandoning to-hit rolls seem crazy? It has been done.

Tunnels & Trolls (1975) stands as the one early system that defied the RPG hobby’s early mania for realism. Ken St. Andre, T&T’s designer, aimed for greater simplicity. T&T drops the to-hit roll entirely. Combatants simply weigh damage rolls against each other. Like Tunnels & Trolls, most RPG-inspired video games seem to drop anything equivalent to a to-hit roll. The damage simply rises as you click away at your enemy.

Drawbacks of always hitting

While undeniably simpler, and probably about as realistic as D&D, the you-always-hit approach suffers three problems:

  • Especially with ranged attacks, rolling to hit fits real-world experience much more closely than assuming a hit and skipping straight to damage. Technically speaking, skipping the to-hit roll feels bogus.
  • The two possible outcomes of a to-hit roll offer more drama than the mostly-average damage roll.
  • The to-hit roll provides intermittent, positive reinforcement to the process of making an attack.

You know all about positive reinforcement. It makes you stay up late chasing one more level when you should be in bed. Reinforcement that randomly rewards a behavior is more powerful than reinforcement that occurs every time. In a casino, the best bet comes from the change machines, which pay 1-for-1 every single time. Intermittent, positive reinforcement drives people to the slot machines. As the Id DM might say, “The variable ratio schedule produces both the highest rate of responding and the greatest resistance to extinction.” In short, hitting on some attacks is more fun than hitting on every attack.

Although D&D uses a d20 roll to hit, the game plays best when the odds of hitting stand closer to a coin flip. At the extremes, the math gets strange. For combatants who almost always hit, bonuses and penalties become insignificant. For combatants looking for a lucky roll, small bonuses can double or triple their chance to hit. Also, the game plays better when your chance of hitting sits around 60%. When your chance nears 95%, the roll becomes a formality. When your chance dwindles toward 5%, the to-hit roll feels like a waste of time, especially because beating the long odds probably still earns minimal damage.
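The math at the extremes is easy to verify with a short Python sketch, here assuming a natural 1 always misses and a natural 20 always hits:

```python
def hit_chance(need, bonus=0):
    # Chance that d20 + bonus meets or beats the needed target number,
    # with a natural 1 always missing and a natural 20 always hitting.
    roll_needed = min(max(need - bonus, 2), 20)
    return (21 - roll_needed) / 20

print(hit_chance(20), hit_chance(20, 1))  # 0.05 -> 0.10, a +1 doubles the odds
print(hit_chance(4), hit_chance(4, 1))    # 0.85 -> 0.90, a +1 barely registers
```

The same +1 bonus doubles one combatant’s chances while the other hardly notices it.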

When the chance of hitting stays between, say, 30% and 70%, the game reaps play benefits:

  • The to-hit roll provides intermittent, positive reinforcement to the process of making an attack.
  • Many attacks can hit and inflict damage, providing constant, positive feedback to players while everyone contributes to the fight.

So D&D designers face the challenge of arranging to-hit bonuses and armor classes so that to-hit rolls need a result between 7 and 15, a narrow sweet spot for all the levels, magic, and armor classes encompassed by the game. In practice, designers like to push your chance to hit toward the top of the 30-70% range, because while missing remains part of the game, it’s no fun.

The Palladium Role-Playing Game (1983) recognized the play value of the to-hit roll, but it avoided D&D’s problem of finessing to-hit bonuses and armor classes to reach a sweet spot. The game simply declares that “any [d20] roll above four (5 to 20) hits doing damage to the opponent.” Armor in the game takes damage, effectively serving as an additional pool of hit points.

In upcoming posts, I’ll discuss the very different steps the D&D Next and 4E designers took to find the to-hit sweet spot.

Next: Riding the power curve