
Would Dungeons & Dragons Play Better If It Stayed Loyal to How Gary Gygax Awarded Hit Points?

In a typical fifth-edition Dungeons & Dragons adventure, characters reach every battle with full hit points. Healing comes too easily for anyone to enter a fight at less than full health. Above level 10 or so, spells like Aid and Heroes' Feast mean parties routinely pass their day with hit point totals above their ordinary maximums.

By the time characters near level 10, few monsters inflict enough damage to seem threatening. Except for a few outliers like giants, foes lack the punch to dent characters at maximum hit points. If a round of combat results in a gargoyle hitting a 90-hit-point character for 6 damage, then the fight seems like a bookkeeping exercise. “At this rate, I can only survive 14 more rounds!”

The fifth-edition design limits the highest armor classes so weaker monsters can attack stronger characters and still hit on rolls less than a natural 20. This design aims to enable hordes of low-level monsters to challenge high-level characters. In practice, the hits inflict such pitiful damage that the hero would feel less pain than the bookkeeping causes to the player. It’s the pencils that suffer the most.

The obvious fix to high-level creatures and their feeble damage is to make monsters’ attacks hurt more. Mike “Sly Flourish” Shea routinely makes creatures inflict maximum damage on every hit.

But what if the solution doesn’t come from the monsters? What if characters at double-digit levels just have too many hit points?

If high-level characters had fewer hit points, high-level monsters with their puny attacks would suddenly become a bit more threatening. Lower-level monsters could pose more of a threat to high-level heroes without becoming too dangerous to low-level characters. High-level PCs would still rip through weak foes, but the survivors could deal enough damage to seem dangerous rather than laughable.

D&D no longer focuses entirely on dungeon crawls where characters judge when to rest based on their remaining store of hit points and spells. The game’s move to storytelling means characters often face just one fight per day. Healing comes cheap and easy, so characters start fights at full hit points. Lower hit points at high levels would suit the reality that characters enter every fight at maximum health. In more battles, foes would seem like credible opponents.

Of course, no one has ever argued that low-level characters sport too many hit points. New characters feel as fragile as soap bubbles. Before level 5, don’t get too attached to your hero. As characters near level 10, they begin to seem stout. They rarely go down in anything short of a slugfest, so they feel like superheroes, but not invulnerable.

But in double-digit levels character hit points keep rising at the same steep rate until DMs resort to letting monsters routinely deal maximum damage. D&D might play better if, somewhere around level 10, characters stopped gaining so many hit points.

When I first considered this notion, I dismissed it as too big a break from D&D's conventions. For nearly two decades, characters have gained a full die worth of hit points at every level.

Except for most of D&D’s history, somewhere around level 10, characters stopped gaining so many hit points.

From the original game through second edition, when D&D characters reached level 9 or so, they started gaining hit points at a much slower rate. In Advanced Dungeons & Dragons, fighters rising above 9th level gained 3 hit points per level with no bonus for constitution. Other classes gained even fewer points. Continuing to let characters gain a full hit die plus constitution bonus at every level defies D&D's origins.
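To put rough numbers on the difference, here is a sketch comparing average fighter hit points under AD&D's soft cap with fifth edition's full-die-per-level progression. The +2 Constitution bonus for the 5e fighter, and the omission of any Constitution bonus on the AD&D side, are simplifying assumptions for illustration, not rules citations.

```python
def adnd_fighter_hp(level):
    """AD&D-style: an average 5.5 hp per d10 through level 9,
    then a flat 3 hp per level with no Constitution bonus."""
    if level <= 9:
        return 5.5 * level
    return 5.5 * 9 + 3 * (level - 9)

def fifth_ed_fighter_hp(level, con_bonus=2):
    """5e-style: a maximized d10 at level 1, then an average
    5.5 plus the Constitution bonus at every level after."""
    return 10 + con_bonus + (5.5 + con_bonus) * (level - 1)

for level in (9, 15, 20):
    print(level, adnd_fighter_hp(level), fifth_ed_fighter_hp(level))
```

By level 20 the averages land around 82 hit points versus roughly 155, which shows how far the steady full-die progression outruns the old soft cap.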

The original limits to hit dice served as co-creator Gary Gygax’s way of putting a soft level cap on D&D. The cap kept the game’s link to the Chainmail mass-combat rules, where the best fighters acted as “superheroes” who could match the power of 8 soldiers. Gary wanted a game where crowds of orcs or goblins could still challenge the heroes.

Admittedly, when I started playing D&D, I disliked how characters’ hit points topped out. Gary and his hit-dice tables seemed to punish players of high-level characters—especially fighters.

Although the soft cap on hit points lasted 25 years, the cap on the other perks of leveling started to disappear as soon as the first Greyhawk supplement reached gamers. While the original box topped out at 6th-level spells, Greyhawk included spells of up to 9th level. Gary never intended player characters to cast the highest-level spells, but that didn’t stop players.

By the time designers started work on third edition, they aimed to deliver perks to every class at every level from 1 to 20. The soft cap on hit points must have seemed vestigial. The designers felt the game’s math could handle a steep rise in hit points past level 10. The design abandoned any aim of making groups of low-level mooks a match for high-level heroes. Besides, a steady rise in HP made the multi-classing rules simpler.

Today’s D&D game does a fine job of awarding every class—even fighters—perks at every level. Nobody leveling into the teens gets excited about another helping of hit points. Reverting to smaller hit point advances doesn’t spoil anyone’s fun.

Fifth edition keeps levels and monsters at power levels broadly similar to those in the original game. This loose compatibility lets adventures written during D&D's first 20 years continue to work with the new edition. In theory, a DM can just swap in monster stats from the new game and play. In practice, higher-level characters have more hit points and more healing, and the creatures fail to deal enough damage to keep up. Story-centered adventures make the mismatch worse.

Suppose Gary Gygax had hit points right all along. Would D&D play better if characters stopped gaining so many after level 9?

Changing the balance of power

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

Skip Williams's second-edition adventure Axe of Dwarvish Lords staged a type of battle no other Dungeons & Dragons adventure has tried before or since: it pitted 13th- to 15th-level characters against a warren full of goblins. As you might expect, the warren's individual goblins typically only hit on a 20, and only because everything hits on a 20. If one earned a lucky shot, he would inflict minimal damage. Under any edition's standing rules, this scenario fails; the 13th-level characters would simply grind out countless attacks against inconsequential resistance. So Skip cheated. I mean, he designed new rules.

The adventure adds two pages of rules for group tactics that allow the goblins to do things like volley arrows in area attacks and combine melee attacks to earn bonuses to hit. In this fourth-edition era, we're used to monsters making exceptions to the rules, but not in 1999. Back then, monsters broke the rules because a bad DM thought he could win D&D. Personally, I liked the way the new rules enabled an otherwise unplayable confrontation, but when goblins start breaking the rules as previously understood, I can imagine some players calling a cheat.

For the first time in D&D’s history, the next iteration attempts to enable playable confrontations between powerful characters and hordes of weak monsters, without resorting to special rules. The key, as I discussed in “Hitting the to-hit sweet spot,” is arranging everyone’s to-hit bonuses and armor classes into the small range that grants everyone a reasonable chance to hit.

D&D Next hits the sweet spot by limiting the to-hit bonuses characters gain in exchange for greater bonuses to the damage they inflict.

This exchange intentionally shifts one aspect of the game’s balance of power.

Low-power combatants benefit against high-power opposition

Mobs of weak monsters can threaten higher-level characters: they can still hit, and their numbers can overcome the characters' higher hit points. On the flip side, the dungeon master can pit parties against fewer, more powerful monsters without having to select monsters specifically designed as solos or elites. This re-enables the sort of sandbox play where players choose a difficulty level by plunging as deep into the dungeon as they dare.
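As a back-of-the-envelope sketch of why bounded to-hit bonuses help mobs, the snippet below computes a mob's expected damage per round. The goblin's +4 attack bonus and 5.5 average damage are assumed example numbers, not stats quoted from any edition.

```python
def hit_chance(attack_bonus, armor_class):
    """d20 chance to hit, clamped to the 5%-95% that natural 1s and 20s allow."""
    return min(max((21 + attack_bonus - armor_class) / 20, 0.05), 0.95)

def mob_expected_damage(mob_size, attack_bonus, armor_class, avg_damage):
    """Expected damage per round from a mob of identical attackers."""
    return mob_size * hit_chance(attack_bonus, armor_class) * avg_damage

# Eight goblins against AC 18 under flat, bounded bonuses: ~15.4 damage per round.
bounded = mob_expected_damage(8, 4, 18, 5.5)
# The same mob against level-scaled AC 30, hitting only on natural 20s: ~2.2.
scaled = mob_expected_damage(8, 4, 30, 5.5)
print(bounded, scaled)
```

The mob stays relevant only while its members hit on something better than a natural 20, which is exactly the range bounded accuracy preserves.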

High-power combatants lose against low-power opposition

When your legendary hero faces goblins, the damage each blow deals hardly matters, because dead is dead. But your hero’s chance of hitting a lowly goblin rarely improves. Your hero feels like a zero.

Meanwhile, in the DM’s chair, if you want to pit a single giant against a party of lower-level characters, the fight can go badly. The giant’s one attack often misses, but when it hits, it kills. As a DM, I still prefer a solo with lots of attacks, each inflicting lower damage. If monster designers look to give brutes alternate attacks that threaten many targets at once, then we enjoy the best of both worlds.

Fighters suffer the most

The accuracy-for-damage trade matters most to fighters. Fireball and Blade Barrier work as well as ever. The rogue remains content to sneak up on the goblin king. But fighter-types should hew through the rabble like grass until, bloodied and battle-worn, they stand triumphant. Instead, they wind up muffing to-hit rolls against one mook.

The game could stick with exponential power curves and narrow tiers of level-appropriate monsters, but I think better fixes exist.

For example, cleave-like maneuvers help by spreading damage across a string of attacks, but if your fighter’s first attack misses, your turn finishes and all the goblins laugh at you. Next’s whirlwind attack maneuver lets a fighter attack several adjacent enemies with a single attack roll, but fanning a bunch of goblins somehow seems even less heroic than missing just one.

Is the medicine worse than the disease?

Earlier editions of the game offer a solution, a solution so odious that I hesitate to mention it. If fighters gain multiple attacks per round, the misses matter less because there’s more where that came from!

Multiple attacks stink because resolution takes too long, especially if the fighter must roll damage and resolve each attack before moving on to the next swing. Also, D&D's designers have struggled to parcel out extra attacks as fighters gain levels. Jumping from one attack directly to two results in a rather sudden leap in power. Instead, AD&D gave fighters extra half attacks, and a need to remember which rounds granted the extra swing. Third edition traded half attacks and the memory issue for weaker attacks and fiddly attack penalties. Yuck.

Multiple attacks also solve a problem Mike Mearls mentioned in a tweet: “Ability mod to damage unbalances at low levels, is irrelevant at high levels.” Without multiple attacks per round, a high-level fighter's strength bonus to damage becomes inconsequential. With multiple attacks, each attack benefits from the bonus.
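A quick bit of arithmetic shows the effect Mearls describes. The dice and the +4 Strength bonus below are invented for illustration: compare one high-level attack with scaled-up dice to three smaller attacks that each re-apply the bonus.

```python
# One attack with bigger dice: 4d6 + 4. The flat bonus shrinks to
# about 22% of the round's expected damage.
one_attack_total = 4 * 3.5 + 4            # 18.0 expected damage
one_attack_bonus_share = 4 / one_attack_total

# Three attacks at 1d6 + 4 each: the bonus lands on every swing and
# supplies about 53% of the round's expected damage.
three_attack_total = 3 * (3.5 + 4)        # 22.5 expected damage
three_attack_bonus_share = (3 * 4) / three_attack_total

print(one_attack_bonus_share, three_attack_bonus_share)
```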

If D&D Next’s designers can find a good way to allow fighters and brutish monsters to gain multiple attacks against weaker opponents, then a key piece of the Next design puzzle falls into place.

Next:  Tracking initiative (I’m done with theory for a while.)

D&D Next trades to-hit bonuses for enhanced damage

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

As I discussed in “Riding the power curve,” the next iteration of Dungeons & Dragons attempts to straighten out fourth edition's exponential power curve by refusing to let characters benefit from both steep bonuses to hit and big increases to damage. Instead, characters mostly get increases to damage.

When we compare D&D Next to early editions, Next limits the to-hit bonuses characters gain as they advance in exchange for greater bonuses to the damage they inflict.

Before I delve into the benefits and drawbacks of this exchange, I ought to address two practical objections to trading to-hit bonuses for damage.

Should skill increase damage?

Some argue that a more skillful combatant’s blows should not deal more damage. After all, a crossbow bolt always hits with the same force, so it should always strike with the same damage. Personally, when I’m struck by a crossbow bolt, I care deeply about where it hits. Maybe that’s just me.

As I explained in “The brilliance of unrealistic hit points,” hit points in D&D work as a damage-reduction mechanic. As characters increase in level, their rising hit points reduce the effective damage they suffer. Reasonably, as characters increase in level, they could also grow better at inflicting damage by overcoming defenses to strike vulnerable places or to apply more force to a blow. I'm no Miyamoto Musashi, but I've earned enough bruises sparring with practice swords to know that finding an opening to tap an opponent demands less skill than finding enough room for a kill strike, or even a cut.

And if you worry about unusual cases of oozes struck by crossbows, adjust at the table.

“The primary thing when you take a sword in your hands is your intention to cut the enemy, whatever the means. Whenever you parry, hit, spring, strike or touch the enemy’s cutting sword, you must cut the enemy in the same movement. It is essential to attain this. If you think only of hitting, springing, striking or touching the enemy, you will not be able actually to cut him.” Miyamoto Musashi, The Book of Five Rings

Hits inflict more than damage

In D&D, a hit can bring the threat of poison, level drain, and many other secondary effects. In these cases, the attack's damage matters less than landing the hit. A higher-level character's chance to hit improves less, so their chance of inflicting secondary effects sees little improvement.

This matters, but it matters less than you may think.

First, to-hit rolls take a much smaller place in D&D Next than in 4E. D&D Next switches from non-AC defenses back to saving throws. Virtually all spell attacks return to skipping the to-hit roll entirely.

Second, attacks versus AC return to focusing on damage. To an extent, I liked how 4E added tactical richness to combat by devising interesting attacks. However, for my taste, too many effects appeared in play. I grew tired of seeing combatants perched on stacks of Alea markers, unable to do anything but stand and make saves.

In D&D Next, as in early editions, weapon attacks mostly inflict damage, and the attacks that threaten something like poison or level drain usually come from monsters.

Third, the saving throw returns as a defense against bad things other than damage. In 4E, hits against AC can inflict crippling effects without saves. Just getting hit automatically subjects you to poison, or paralysis, or whatever. In older editions, when the spider bit or the ghoul clawed, you took damage, but you also saved versus poison or paralysis. I appreciate 4E's streamlined system, but dropping the defensive saving throw contributed to battlefields bogged down with more conditions and other markers than even the designers anticipated.

D&D Next brings back saving throws as a defense against effects like poison and level drain. We no longer need to rely on to-hit rolls as the final test of whether a poisoned dagger drew enough blood to overcome your constitution. Because monsters make most of the attacks that poison, paralyze, drain, and so on, most players should be happy to see the save return. Plus, despite the extra roll, the save probably speeds play by reducing the number of harmful conditions that take effect.

Despite these three points, in D&D Next, your high-level character is weaker when she makes attacks versus AC to inflict crippling effects. If I were to design, say, a poisoner class, I would make their chance to hit nearly automatic and focus on saving throws as the principal defense against poison.

Next: Changing the balance of power

Riding the power curve through D&D’s editions

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

In the very first set of Dungeons & Dragons (1974) rules, every weapon dealt 1d6 damage. Short of magic, characters could only improve their damage output by improving their bonus to hit. More hits equals more damage. Soon, Supplement I: Greyhawk (1975) gave different weapons different damage dice and introduced the strength bonus to damage. Since then, each edition seems to give characters more ways to hit for more damage.

By fourth edition, as characters leveled, they enjoyed steep increases in to-hit bonuses matched with unprecedented increases in the damage each attack dealt. This contributed to characters increasing exponentially in power. It explains why 4E monsters only remain effective for narrow bands of levels, and it explains the nervous tics of every DM who has run an epic table. In past editions, only the wizard saw that kind of power curve, and the non-wizards eventually grew tired of serving as wand caddies for the Wiz.

D&D Next aims to create a power curve in line with earlier editions, while preventing the runaway power traditional for wizards. If you prefer the exponential power curve created in 4E, then you might have to look for a legendary hero module in Next, or stick with 4E and bless any dungeon master eager to run a high-level game.

Greyhawk also introduced Weapon Armor Class Adjustment, a chart that granted bonuses to hit based on how well your particular weapon worked against a style of armor. The table only makes sense because, in the original game, armor class really represented a particular style of armor, such as leather or chainmail. Obviously, dexterity and magical bonuses to armor class quickly turned the table into nonsense. (To make sense of the table, you must apply the dexterity and magical modifiers as penalties to the attack roll.) In practice, no one used the table, and the “class” in armor class lost significance.

While D&D Next thankfully steers clear of weapon armor class adjustment, the system returns to the older practice of making armor class a measure of actual armor, or at least something equivalent.

The D&D Next approach brings back a problem that has bedeviled every edition of the game except fourth. In D&D, to-hit bonuses rise automatically, level after level, while armor class remains roughly the same. Sure, as characters acquire better equipment, armor class improves a little, and in most D&D editions AC starts a little ahead. But characters gain to-hit bonuses automatically, and eventually, inevitably, to-hit bonuses outrun armor class. Everyone begins to hit all the time. As I explained in “Hitting the to-hit sweet spot,” D&D works best when combatants hit between 30% and 70% of the time.

Fourth edition fixes the problem by granting everyone automatic increases to AC to match their automatic increases in to-hit bonuses. Now armor class becomes a function of a character or monster’s role and its level. Any reasonably optimal character would boast the same AC as peers in the same role. Armor exists as the flavorful means that some characters used to reach the armor class dictated by their role. This kept armor classes on par with bonuses to hit, while making monster design simple.

D&D Next attacks the old problem from the opposite direction. Instead of matching automatic increase with automatic increase, D&D Next limits to-hit bonuses so they never overwhelm the relatively static range of armor classes.

In 4E, in defense as in offense, characters increase exponentially in power. The fixed AC bonuses that 4E granted with each level combined with rising hit points to grant everyone steady increases to two forms of defense. You automatically get harder to hit even as the hits do less effective damage. If you're twice as hard to hit and you can sustain twice the damage, your defenses are four times better.
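The multiplication works out because effective durability combines both defenses. A minimal sketch, using invented numbers:

```python
def expected_rounds_survived(hit_points, chance_to_be_hit, avg_damage):
    """Rounds a combatant expects to last against one attack per round."""
    return hit_points / (chance_to_be_hit * avg_damage)

# Invented baseline: 50 hp and a 60% chance of being hit...
base = expected_rounds_survived(50, 0.6, 10)
# ...versus double the hit points and half the chance of being hit.
leveled = expected_rounds_survived(100, 0.3, 10)
print(leveled / base)  # ~4: twice as hard to hit times twice the hit points
```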

D&D Next attempts to straighten out the exponential power curve by refusing to let characters double-dip. Rather than gaining steep bonuses to hit along with increases to damage, you just get increases to damage. Rather than gaining constant improvements to armor class along with additional hit points, you just gain additional hit points. Of course, I'm simplifying to make a point. Characters still gain bonuses to hit as they advance, but at a fraction of the rate seen in third and fourth edition.

When we compare D&D Next to early editions of D&D, the design reins in the to-hit bonuses characters gain as they advance. In compensation, characters gain greater bonuses to the damage they inflict. Like any design decision, this strategy makes trade-offs, which I will explore in an upcoming post.

Next: Bounded accuracy and matters of taste

Hitting the to-hit sweet spot

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

Through its evolution, Dungeons & Dragons has used two mechanics to determine an attack's damage output: to-hit rolls and damage rolls.

In D&D, hitting is fun, while missing feels like a bummer, particularly if you waited a while for your turn. Why not remove misses from the game?  Once characters gain enough hit points to survive a few attacks, D&D could drop to-hit rolls and─with a few adjustments─still work.

Skipping damage rolls

Back in the third-edition era, Jonathan Tweet advised dungeon masters to speed high-level fights by substituting average damage for rolled damage. In fourth edition, first-level characters have enough hit points to make this approach possible at any level. In a post for The Dungeon Master Experience, Chris Perkins explains how he uses nearly average damage plus 1d6, added for a bit of flavor.

The Chivalry & Sorcery game (1977) skipped damage rolls, assuming “that the damage inflicted by a particular weapon in the hands of given character will be more or less constant.”

The notion of dropping to-hit rolls may seem strange, but here's the crux: against high hit points, to-hit rolls just turn into another damage-reduction mechanic. In fourth edition, everyone but minions has fairly high hit points, and everyone hits about 60% of the time. The damage dealt in combat would work out about the same if we just assumed everyone always hits and multiplied everyone's hit points by 1.67.
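The arithmetic behind that 1.67 multiplier, as a sketch with invented numbers: expected time-to-drop stays the same whether you roll to hit or inflate hit points by the reciprocal of the hit chance.

```python
avg_damage = 10
hit_points = 60
hit_chance = 0.6   # everyone hits 60% of the time

# With to-hit rolls: expected damage per round is hit_chance * avg_damage.
rounds_with_roll = hit_points / (hit_chance * avg_damage)

# Without to-hit rolls: every attack lands, but hit points grow by
# 1 / 0.6, about 1.67 times.
inflated_hp = hit_points / hit_chance
rounds_always_hit = inflated_hp / avg_damage

print(rounds_with_roll, rounds_always_hit)  # the same expected 10 rounds
```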

Of course, your to-hit percentage varies from a set percentage, because armor can make hitting more difficult. How can a system without to-hit rolls account for armor class? In my post “The brilliance of unrealistic hit points,” I explained how hit points in D&D function as an ingenious damage-reduction mechanic. Virtually every tabletop role-playing game that aimed for realism made armor reduce damage. In our hypothetical rules variant, why not use D&D's damage-reduction mechanic to represent protective armor? Suppose different types of armor multiplied your hit points. Suppose high dexterity increased hit points, giving more ability to “turn deadly strikes into glancing blows,” just like the game says.
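A minimal sketch of this hypothetical variant, with multipliers invented purely for illustration (nothing here comes from any published edition):

```python
# Hypothetical armor-as-damage-reduction variant: armor and Dexterity
# multiply hit points instead of raising armor class.
ARMOR_MULTIPLIER = {"none": 1.0, "leather": 1.2, "chain": 1.5, "plate": 1.8}

def effective_hp(base_hp, armor, dex_bonus):
    """Each point of Dexterity bonus adds an assumed 5% more hit points."""
    return base_hp * ARMOR_MULTIPLIER[armor] * (1 + 0.05 * dex_bonus)

# A 40 hp character in plate with a +2 Dexterity bonus effectively
# fights with 40 * 1.8 * 1.1 = 79.2 hit points.
print(effective_hp(40, "plate", 2))
```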

Does abandoning to-hit rolls seem crazy? It has been done.

Tunnels & Trolls (1975) stands as the one early system that defied the RPG hobby's early mania for realism. Ken St. Andre, T&T's designer, aimed for greater simplicity. T&T drops the to-hit roll entirely; combatants simply weigh damage rolls against each other. Like Tunnels & Trolls, most RPG-inspired video games drop anything equivalent to a to-hit roll. The damage simply rises as you click away at your enemy.

Drawbacks of always hitting

While undeniably simpler, and probably about as realistic as D&D, the you-always-hit approach suffers three problems:

  • Especially with ranged attacks, rolling to hit fits real-world experience much closer than assuming a hit and skipping straight to damage. Technically speaking, skipping the to-hit roll feels bogus.
  • The two possible outcomes of a to-hit roll offer more drama than the mostly-average damage roll.
  • The to-hit roll provides intermittent, positive reinforcement to the process of making an attack.

You know all about positive reinforcement. It makes you stay up late chasing one more level when you should be in bed. Reinforcement that randomly rewards a behavior is more powerful than reinforcement that occurs every time. In a casino, the best bet comes from the change machines, which pay 1-for-1 every single time. Intermittent, positive reinforcement drives people to the slot machines. As the Id DM might say, “The variable ratio schedule produces both the highest rate of responding and the greatest resistance to extinction.” In short, hitting on some attacks is more fun than hitting on every attack.

Although D&D uses a d20 roll to hit, the game plays best when the odds of hitting stand closer to a coin flip. At the extremes, the math gets strange. For combatants who almost always hit, bonuses and penalties become insignificant. For combatants counting on a lucky roll, small bonuses can double or triple their chance to hit. Also, the game plays better when your chance of hitting sits around 60%. When your chance nears 95%, the roll becomes a formality. When your chance dwindles toward 5%, the to-hit roll feels like a waste of time, especially because beating the long odds probably still earns minimal damage.
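A small sketch of that strange math at the extremes, assuming the common rule that a natural 20 always hits and a natural 1 always misses:

```python
def chance_to_hit(target):
    """Chance a d20 roll meets or beats the target number,
    clamped to the 5%-95% that natural 1s and 20s allow."""
    return min(max((21 - target) / 20, 0.05), 0.95)

# For someone who needs a natural 20, a +2 bonus triples the odds.
print(chance_to_hit(20), chance_to_hit(18))  # 0.05 -> 0.15
# For someone who hits on a 4, the same +2 barely registers.
print(chance_to_hit(4), chance_to_hit(2))    # 0.85 -> 0.95
```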

When the chance of hitting stays between, say, 30% and 70%, the game reaps play benefits:

  • The to-hit roll provides intermittent, positive reinforcement to the process of making an attack.
  • Many attacks can hit and inflict damage, providing constant, positive feedback to players while everyone contributes to the fight.

So D&D designers face the challenge of arranging to-hit bonuses and armor classes so to-hit rolls require a result between 7 and 15, a small sweet spot for all the levels, magic, and armor classes the game encompasses. In practice, designers like to push your chance to hit toward the top of the 30-70% range, because while missing remains part of the game, it's no fun.

The Palladium Role-Playing Game (1983) recognized the play value of the to-hit roll, but it avoided D&D’s problem of finessing to-hit bonuses and armor classes to reach a sweet spot. The game simply declares that “any [d20] roll above four (5 to 20) hits doing damage to the opponent.” Armor in the game takes damage, effectively serving as an additional pool of hit points.

In upcoming posts, I’ll discuss the very different steps the D&D Next and 4E designers took to find the to-hit sweet spot.

Next: Riding the power curve