
Fourth Edition Proved D&D Works Without Saving Throws, So Why Did They Come Back?

Fourth edition dropped saving throws in favor of to-hit rolls and showed that D&D works without saves.

Mathematically, to-hit rolls and saving throws just flip the numbers so that a high roll benefits the person casting the die. Rather than having a lightning bolt trigger saves, why not just let wizards make lightning attacks against their targets? Why not just have poison attack a character’s fortitude?
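To see the flip in numbers, here is a minimal sketch using made-up bonuses and target numbers, not figures from any edition’s tables. Framed as an attack roll or as a saving throw, the chance of harm works out the same; the only change is which side wants a high roll. (The offsets differ by a point because a tie favors whoever casts the die.)

    from fractions import Fraction

    def chance_at_least(target):
        """Chance that a d20 shows `target` or higher."""
        return Fraction(max(0, min(20, 21 - target)), 20)

    # Attack-roll framing: the wizard rolls, and harm comes on a hit.
    attack_bonus, reflex_defense = 5, 15        # hypothetical numbers
    harm_as_attack = chance_at_least(reflex_defense - attack_bonus)

    # Saving-throw framing: the target rolls, and harm comes on a failed save.
    save_bonus, save_dc = 4, 16                 # hypothetical numbers
    harm_as_save = 1 - chance_at_least(save_dc - save_bonus)

    print(harm_as_attack, harm_as_save)         # both 11/20, a 55% chance of harm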

By dropping saving throws, the fourth-edition designers eliminated a redundant mechanic. The change added consistency and elegance to D&D. Wizards finally got to cast spells and to make attack rolls.

If banishing saving throws made D&D more elegant, why did fifth edition bring them back? After all, the fifth-edition designers made elegance a key goal for their design. See “From the brown books to next, D&D tries for elegance.”

Until fourth edition, saving throws survived based on tradition and feel.

The tradition dates to when Tony Bath had toy soldiers saving versus arrows. (See my last post.) The fifth-edition designers aimed to capture tradition, but also the best qualities of earlier editions. Why not capture some of the elegant design of fourth edition?

The feel comes from a sense that the player controlling the most active character should roll the dice. D&D could drop to-hit rolls in favor of saves versus swords, but that feels wrong. On the other hand, characters seem active when they resist a charm, shake off a ghoul’s paralysis, or spring away from rushing flames. Sure, a wizard is saying magic words and a dragon is exhaling fire, but the action focuses on the heroes escaping the flames.

Plus, the saving throw mechanic tends to send a few more rolls to the players. Players like to roll dice, especially when the roll decides their character’s fate. When attack rolls replaced saving throws, spellcasters got to make more attack rolls, but most characters lack spells. Without saving throws, players flamed by dragon breath never get to take fate in their hands and roll a save. Instead, they just subtract damage.

So saving throws returned to D&D.

If saving throws and attack rolls share a common place in the game, what makes them different from each other?

As a dungeon master, have you ever asked a player dodging a trap’s darts to make a dexterity or reflex save? I have. I handled it wrong. Don’t fault me too much. A save gives a character a chance to escape. Characters springing away from darts or scything blades or falling stones seem to deserve a save. But that intuition is wrong. Such traps should make attacks. The Dungeon Master’s Guide never spells out this distinction.

Just as the reflex defense and AC in fourth edition defended against different sorts of attacks, in fifth edition, dexterity saves and armor class apply to different hazards. The difference comes from armor. D&D’s lead designer Mike Mearls explains that to determine whether to use an attack roll or a save, ask “Would a suit of plate mail protect from this?” Armor protects against darts, scythes, and so on, so traps using such hazards make attacks. Poisonous fumes, lightning, and mind blasts all ignore armor, so targets make saves. I would rather face a fireball protected by plate, but the rules emphasize the agility needed to escape the flames.

Originally, Tony Bath’s saving throws represented the value of armor. Now, saving throws only apply when armor can’t help.

Mearls confesses that the D&D rules don’t apply this save-or-attack distinction consistently. Plate mail certainly protects against falling rocks, and the falling-rock traps in the third-edition Dungeon Master’s Guide all make attacks. But the falling-rock traps in Lost Mine of Phandelver prompt dexterity saves. Better to leap from harm’s way, I suppose.

One area of inconsistency irks me.

Why should plate armor protect against the incorporeal, life-draining touch of creatures like specters and wraiths? Here, tradition and feel led the D&D designers to use attack rolls in a place where saving throws make more sense. If insubstantial creatures forced a target to make a dexterity saving throw, their life draining would imitate third edition’s touch attacks without a single extra rule. Plus, these undead would play like more distinct and interesting threats. Forget the feel of a to-hit roll: incorporeal creatures should force saving throws.

For 25 Years, D&D Put Saving Throws In Groups Made For Just 3 Creatures and 2 Spells

Today, Dungeons & Dragons matches saving throws to ability scores. But for most of the game’s history, D&D grouped saving throws by 5 sources: spell or staff, wand, death ray or poison, turned to stone, and dragon breath. These categories and the game’s saving throw table made me wonder: What made clerics so resistant to poison? Of all possible threats, why set aside a specific save for being turned to stone? (Also, “stone” isn’t even a source, just shorthand for a list of creatures.) Dragon breath seemed overly specific too, but at least the game’s title featured dragons. And how could the remaining 3 types of save cover every other hazard in a fantastic world?

Gary Gygax based his Chainmail rules on a 1966 pamphlet by Tony Bath titled Rules for Medieval Wargames. In those rules, attackers make a roll to hit, and then defenders roll a “saving throw” to see if their armor protects them. The heavier the armor, the easier the save.

Attack rolls in Dungeons & Dragons combine Bath’s to-hit roll and saving throw. But Gary reused the name “saving throw” for another, similar purpose. This rework began in Chainmail, the mass-battle rules that formed the basis for D&D. In Chainmail, creatures lacked hit points, so successful attacks destroyed units. But with extraordinary individuals like heroes, wizards, and dragons, a saving throw allowed a last chance to survive. For example, the rules say, “Dragon fire will kill any opponent it touches, except another Dragon, Super Hero, or a Wizard, who is saved on a two dice roll of 7 or better.”

Chainmail’s rules for spells, magical attacks, and similar hazards led to saving throws in original D&D. The first Dungeon Master’s Guide explained the concept. “Because the player character is all-important, he or she must always—or nearly always—have a chance, no matter how small, of somehow escaping what otherwise would be inevitable destruction.”

“Someone once criticized the concept of the saving throw as ridiculous,” Gary wrote. “Could a man chained to a rock, they asked, save himself from a blast of red dragon’s breath? Why not? I replied. Imagine that the figure, at the last moment, of course, manages to drop beneath the licking flames, or finds a crevice in which to shield his or her body, or succeeds in finding some way to be free of the fetters. Why not?” Saving throws grant a last chance, but they leave the details of why a save worked to the storytelling of the player and DM. The brilliance of hit points comes from their abstraction, and in original D&D saves are just as abstract as hit points.

Chainmail includes saves for only 4 effects: dragon breath, basilisk gaze, spider poison, and spells (just fireball and lightning). Original D&D turned those 4 saves into categories, and then added “wands” as a 5th.

The detailed numbers in the saving throw table suggest that Gary set saves to simulate nuances of the game world. However, the original saving-throw table reveals no nuances and few patterns. Fighters grow to become the best at dragon slaying. Wizards gain the best spell saves and so become best at dueling other spellcasters. But why, say, should magic users boast the best resistance to petrification? And why does the death ray from a Finger of Death spell get grouped with poison rather than staying in the tougher spell category? Apparently, when Gary Gygax told players to save or die, he also gave them a break. Go figure.

In Advanced Dungeons & Dragons, the revamped saving-throw matrix reveals more patterns.

Clerics show the best saves versus paralyzation, poison, and death magic. Perhaps divine favor helps spare them from death.

Magic users start with the best saves and end with the worst. Did Gary intend to help low-level mages and to balance high-level wizards?

Fighters start with the worst saves and end with the best. Was this designed to balance the class, or just to simulate the resilience of a front-line survivor?

Although the saving throw categories lasted 25 years, these types proved clumsy.

As D&D added countless threats, the designers folded them into the same 5 categories descended from Chainmail. For dungeon masters, these peculiar categories made guesswork of choosing the right save for a new effect.

In AD&D, characters with high wisdom or dexterity gained bonuses to saves that demanded willpower or agility. But neither bonus matched a saving-throw category. The bonuses could apply to some saves in multiple categories. Remembering these bonuses—and sometimes gaining the DM’s approval to use them—added an awkward extra step.

Third edition switched to grouping saving throws by the aptitude needed to survive the danger. Now a high wisdom or dexterity cleanly benefited will or reflex saves. Finally, a high constitution helped shake off the effects of poison.

The switch made saving throws simpler, but they became less abstract. The odds of someone chained to a rock using quick reflexes to survive dragon fire seemed lower than ever—low odds that fifth edition simulates by imposing disadvantage on a restrained character’s dexterity save.

When I imagine someone making a reflex or dexterity save, I picture someone leaping and tumbling from rushing flames. Then I look at the battle map and see a figure in the same square. In my overly complicated version of D&D, evasion would make targets move out of an area of effect.

Next: Fourth edition proved D&D works without saving throws, so why did they come back?