Tag Archives: to-hit rolls

D&D’s Inconspicuous Phrases That You Notice Once You Master the Rules

Despite using common language, the Dungeons & Dragons rules feature such precise wording that a close reading answers most questions and foils many schemes to break the game. You can tell that the designers dreamed up plenty of min-maxing exploits, and then engineered text that prevented any shenanigans.

Sometimes the implications of the game’s precise phrasing take experience to spot.

For example, the description for alchemist’s fire says, “Make a ranged attack against a creature or object, treating the alchemist’s fire as an improvised weapon.” That text includes plenty to unpack. Alchemist’s fire is treated as an improvised weapon, so unless you’re a tavern brawler, you don’t add your proficiency bonus to the attack. Because the throw counts as a ranged attack, you add your Dexterity bonus to your attack roll. Most players miss the next implication: Ranged attacks add your Dexterity bonus to the damage roll. The specific rule for alchemist’s fire changes the general rule for when a ranged attack inflicts damage. “On a hit, the target takes 1d4 fire damage at the start of each of its turns.” As with any other damage bonus, the one for Dexterity applies only once, not to each turn’s recurring damage.
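
As a sketch of that reading, here is how the whole throw might resolve in code. The Dexterity modifier, target AC, and Tavern Brawler flag are hypothetical inputs for illustration, not anything the rules spell out:

```python
import random

def throw_alchemists_fire(dex_mod, target_ac, tavern_brawler=False, prof=2):
    """Resolve a thrown flask under the reading above (inputs hypothetical).

    Improvised weapon: no proficiency bonus unless you're a tavern brawler.
    Ranged attack: the Dexterity modifier applies to attack and damage.
    """
    attack_bonus = dex_mod + (prof if tavern_brawler else 0)
    if random.randint(1, 20) + attack_bonus < target_ac:
        return 0  # a miss splashes no fire
    # On a hit, 1d4 fire damage recurs at the start of each of the target's
    # turns; the Dexterity damage bonus applies only once, on the first tick.
    return random.randint(1, 4) + dex_mod

# A Dex 16 (+3) character without Tavern Brawler lobs a flask at AC 13.
print(throw_alchemists_fire(dex_mod=3, target_ac=13))
```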

(For another example of how a close reading of the rules differs from the common interpretation, check out the strict method for rolling damage from a magic missile.)

As I learned the D&D rules, I noticed phrases that once seemed innocuous, but that now reveal their importance.

For example, consider the phrase “that you can see” in spell descriptions. Many spells require the caster to see the target of an effect. Invisibility rates as the game’s most potent defensive spell because so much magic requires sight for targeting. Sometimes the phrase “that you can see” turns against the players. Spirit Guardians lets casters spare any number of creatures they can see from the spell’s effect. Any invisible or otherwise out-of-sight allies must suffer the guardians’ effects.

Many monsters can cast spells “requiring no material components.” This enables a flameskull to cast Fireball despite lacking pockets full of bat guano and sulfur. (Flameskulls also cast without somatic components—an essential accommodation for their lack of hands.)

Monsters able to cast spells “requiring no components” gain a significant advantage: These creatures can cast spells without being interrupted by a Counterspell. “To be perceptible, the casting of a spell must involve a verbal, somatic, or material component.” With no components, no one notices the casting until it finishes.

The monsters able to cast without components mainly fall into two categories:

• psionic creatures like githyanki and mind flayers
• constructs

Many character features allow extra attacks “when you use the Attack action,” which creates a limitation that often goes unnoticed. For example, a monk’s extra unarmed strike requires an Attack action, so a monk cannot just take the Dash or Dodge action and then use a Bonus action to get some licks in. This same phrase prevents two-weapon rangers from casting a spell, and then making an attack with their off-hand weapon.

Most extra attacks delivered “when you use the Attack action” cost a Bonus action, but the barbarian’s Form of the Beast feature lets you make extra claw attacks as part of your Attack action. This enables such barbarians to rage and to still make that extra attack.

The D&D rules overload the terms “attack,” “melee,” and “ranged,” giving them different meanings in different contexts. That can fuel confusion. The Attack action usually includes an attack (unless you choose to grapple). But sometimes you can make an attack with a Bonus action, often “when you use the Attack action.” Spellcasters can take the Cast a Spell action, and then make a spell attack with something like a Fire Bolt. Spells like Booming Blade and Green-Flame Blade have you make a melee attack (and not a spell attack) with a weapon as part of the Cast a Spell action.

No wonder the 2nd edition of Pathfinder attempts to cut the fog by calling a single attack a strike.

“Melee” and “ranged” can describe types of weapons and types of attacks. Usually the weapons and attacks stay in their lanes, but when you hurl a melee weapon it crosses into oncoming traffic.

A melee weapon, such as a dagger or handaxe, remains a melee weapon even when you make a ranged attack by throwing it. Normally a ranged attack adds your Dexterity bonus to damage, but the thrown property can change that general rule. The thrown property says, “If the weapon is a melee weapon, you use the same ability modifier for that attack roll and damage roll that you would use for a melee attack with the weapon. If you throw a dagger, you can use either your Strength or your Dexterity, since the dagger has the finesse property.”

When used to make a ranged attack, melee weapons that lack the thrown property count as improvised weapons. They add your Dexterity bonus to the attack and damage rolls, and deal 1d4 damage.
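
A small sketch of how those modifier rules fit together, using hypothetical weapon records rather than any official data format:

```python
def attack_ability_mod(weapon, str_mod, dex_mod, hurled=False):
    """Pick the ability modifier for the attack and damage rolls.

    A melee weapon with the thrown property keeps its melee modifier when
    hurled; finesse lets you use the better of Strength and Dexterity. A
    melee weapon without the thrown property counts as improvised when
    hurled, so the ranged attack falls back on Dexterity (and 1d4 damage).
    """
    if hurled and not weapon["thrown"]:
        return dex_mod  # improvised ranged attack
    if weapon["finesse"]:
        return max(str_mod, dex_mod)
    return str_mod

dagger = {"thrown": True, "finesse": True}
handaxe = {"thrown": True, "finesse": False}
longsword = {"thrown": False, "finesse": False}

print(attack_ability_mod(dagger, str_mod=1, dex_mod=3, hurled=True))     # 3
print(attack_ability_mod(handaxe, str_mod=3, dex_mod=1, hurled=True))    # 3
print(attack_ability_mod(longsword, str_mod=3, dex_mod=1, hurled=True))  # 1
```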

If I were king of D&D, my edition would adopt “strike” for a single attack, and I would consider phrases like “close attack” and “distance attack” in place of the overworked “ranged” and “melee.”

Sometimes a close reading of the D&D rules leads to interpretations that might differ from what the designers first intended. Perhaps lead designer Jeremy Crawford got questions about sneak attack, reviewed the rules, and then thought, I didn’t mean that, but it still works.

Your rogue can use the sneak attack feature “once per turn,” but it’s not limited to your turn. During a round, rogues can sneak attack on their turn and again on someone else’s turn, typically when a foe provokes an opportunity attack.

For spells like Wall of Fire and Blade Barrier, the distinction between turns and rounds also becomes important. These spells deal damage the first time you enter their effect on a turn—anyone’s turn. This means that if a monster gets forced through a Wall of Fire on consecutive turns, it accumulates more damage in a round than if it had just stayed in the fire. I suppose you get used to the heat.

The Tangled Origins of D&D’s Armor Class, Hit Points, and Twenty-Sided Die Rolls To-Hit

In 1977, when I first read the Dungeons & Dragons basic rules, the way armor class improved as it shrank from 9 to 2 puzzled me. Shouldn’t higher numbers be better? Players just used AC to find a row on a table, so rising ACs would have worked as well. Magic armor introduced negative ACs, making the descending numbers even more awkward. Also, many of the demons described in 1976 in the Eldritch Wizardry supplement sported negative armor class.

D&D’s designers seemed to think rising armor classes made more sense. The game rules stemmed from co-creator Gary Gygax’s Chainmail rules for miniature-figure battles. Chainmail rated armor from 1 to 8, with better armor gaining higher values. Co-creator Dave Arneson based his Blackmoor fantasy campaign on Chainmail. His campaign developed into D&D. In Blackmoor, higher armor classes represented better armor.

So how did the first D&D rules set the puzzling convention of descending armor class?

The answer lies toward the end of the genesis of D&D’s combat system.

In the original D&D rule books, the combat system that everyone used appears as the Alternative Combat System. “Alternative” because players could just use the combat system from Chainmail instead. When Dave launched Blackmoor, he tried the Chainmail system. But it focused on battles between armies sprinkled with legendary heroes and monsters. For ongoing adventures in the dungeon under Castle Blackmoor, the rules needed changes. Original Blackmoor player Greg Svenson recalls that within about a month of play, the campaign created new rules for damage rolls and hit points. (More recently, Steve Winter, a D&D designer since 1st edition, tells of playing the original game with the Chainmail combat rules.)

Much of what we know about how Dave adapted the rules for his Blackmoor campaign comes from two sources: a 2004 interview and The First Fantasy Campaign, a raw publication of notes for his game. Most quotes in this post come from those sources.

Chainmail’s melee combat matrix

To resolve melee combat, Chainmail used a combat matrix. Players matched the attacking weapon or creature against the defender, rolled a pair of 6-sided dice, and consulted the table for an outcome. “That was okay for a few different kinds of units, but by the second weekend we already had 20 or 30 different monsters, and the matrix was starting to fill up the loft.”

Dave abandoned the matrix and extended Chainmail’s rules for missile attacks to melee combat. In Chainmail, ranged attackers rolled 2d6, and tried to roll higher than a target number based on increasing armor classes. Blackmoor gained melee to-hit rolls.

Chainmail’s man-to-man combat and ranged combat tables

In Chainmail, creatures lacked hit points, so a single hit killed. But with extraordinary individuals like heroes, wizards, and dragons, a saving throw allowed a last chance to survive. For example, the rules say, “Dragon fire will kill any opponent it touches, except another Dragon, Super Hero, or a Wizard, who is saved on a two dice roll of 7 or better.”

Under rules where one hit destroyed a character, Dave tried to spare player characters by granting saving throws against any hit. “Thus, although [a character] might be ‘Hit’ several times during a melee round, in actuality, he might not take any damage at all.”

But the system of saving throws still made characters too fragile to suit players. “It didn’t take too long for players to get attached to their characters, and they wanted something detailed which Chainmail didn’t have,” Dave explains.

Chainmail battle on a sand table

“I adopted the rules I’d done earlier for a Civil War game called Ironclads that had hit points and armor class. It meant that players had a chance to live longer.” In a Chainmail battle that featured armies spanning a sand table, hit points would have overwhelmed players with bookkeeping. But the Blackmoor players liked the rule. “They didn’t care that they had hit points to keep track of because they were just keeping track of little detailed records for their character and not trying to do it for an entire army. They didn’t care if they could kill a monster in one blow, but they didn’t want the monster to kill them in one blow.”

When players rolled characters, they determined hit points. For monsters, hit points were set based “on the size of the creature physically and, again, on some regard for its mythical properties.” Dave liked to vary hit points among individual monsters. To set the strength of a type of monster while rolling for an individual’s hit points, he probably invented hit dice.

Dave said he took the armor class from Ironclads, but the concept came from Chainmail and the term came from its 1972 revisions. I suspect Dave meant that he pulled the notion of hit points and damage from a naval game that featured both armor ratings and damage points. Game historian Jon Peterson explains, “The concepts of armor thickness and withstanding points of damage existed in several naval wargames prior to Chainmail.” Still, nobody has found the precise naval rules that inspired Dave. Even his handwritten rules for ironclad battles lack properties resembling armor class. Perhaps he just considered using the concept in a naval game before bringing the notion to D&D.

In Blackmoor, Dave sometimes used hit locations. Perhaps naval combat inspired that rule. When ships battle, shells that penetrate to a boiler or powder keg disable more than a cannonball through the galley. Likewise, in man-to-man combat, a blow to the head probably kills.

Dave’s rules for hit locations only reached D&D in the Blackmoor supplement, which came a year after the game’s release. But hit locations made combat more complicated and dangerous. Realistic combat proved too deadly for the dungeon raids in D&D. So D&D players never embraced hit locations. Even Dave seemed to save the rule for special occasions. “Hit Location was generally used only for the bigger critters, and only on a man-to-man level were all the options thrown in. This allowed play to progress quickly even if the poor monsters suffered more from it.” Dave ran a fluid game, adapting the rules to suit the situation.

By the time Dave showed the game to Gary Gygax, his fantasy campaign had established hit points, 2d6 to-hit rolls, and damage rolls.

Next: Gary Gygax improves hit points by making them more unrealistic, and then adds funny dice

Fourth Edition Proved D&D Works Without Saving Throws, So Why Did They Come Back?

Fourth edition dropped saving throws in favor of to-hit rolls and showed that D&D works without saves.

Mathematically, to-hit rolls and saving throws just flip the numbers so that a high roll benefits the person casting the die. Rather than having a lightning bolt trigger saves, why not just let wizards make lightning attacks against their targets? Why not just have poison attack a character’s Fortitude?
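
The flip is pure arithmetic. Here is a sketch comparing the two framings; the bonuses and target numbers are hypothetical, chosen so the mirror-image odds line up, and natural 1s and 20s are ignored:

```python
def hit_chance(attack_bonus, defense):
    """Chance that d20 + attack_bonus meets or beats a static defense."""
    return (21 - (defense - attack_bonus)) / 20

def failed_save_chance(save_bonus, dc):
    """Chance that d20 + save_bonus falls short of the DC."""
    return ((dc - save_bonus) - 1) / 20

# The same lightning bolt, framed both ways: a +5 attack against a
# defense of 16 equals a +2 Dexterity save against DC 13.
print(hit_chance(5, 16))          # 0.5
print(failed_save_chance(2, 13))  # 0.5 -- identical odds, mirrored roller
```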

By dropping saving throws, the fourth-edition designers eliminated a redundant mechanic. The change added consistency and elegance to D&D. Wizards finally got to cast spells and to make attack rolls.

If banishing saving throws made D&D more elegant, why did fifth edition bring them back? After all, the fifth-edition designers made elegance a key goal for their design. See “From the brown books to next, D&D tries for elegance.”

Until fourth edition, saving throws survived based on tradition and feel.

The tradition dates to when Tony Bath had toy soldiers saving versus arrows. (See my last post.) The fifth-edition designers aimed to capture tradition, but also the best qualities of earlier editions. Why not capture some of the elegant design of fourth edition?

The feel comes from a sense that the player controlling the most active character should roll the dice. D&D could drop to-hit rolls in favor of saves versus swords, but that feels wrong. On the other hand, characters seem active when they resist a charm, shake off a ghoul’s paralysis, or spring away from rushing flames. Sure, a wizard is saying magic words, a dragon is exhaling, but the action focuses on the heroes escaping the flames.

Plus, the saving throw mechanic tends to send a few more rolls to the players. Players like to roll dice, especially when the roll decides their character’s fate. When attack rolls replaced saving throws, spellcasters got to make more attack rolls, but most characters lack spells. Without saving throws, players flamed by dragon breath never get to take fate in their hands and roll a save. Instead, they just subtract damage.

So saving throws returned to D&D.

If saving throws and attack rolls share a common place in the game, what makes them different from each other?

As a dungeon master, have you ever asked a player dodging a trap’s darts to make a dexterity or reflex save? I have. I handled it wrong. Don’t fault me too much. A save gives a character a chance to escape. Characters springing away from darts or scything blades or falling stones seem to deserve a save. But that intuition is wrong. Such traps should make attacks. The Dungeon Master’s Guide never spells out this distinction.

Just as the reflex defense and AC in fourth edition defended against different sorts of attacks, in fifth edition, dexterity saves and armor class apply to different hazards. The difference comes from armor. D&D’s lead designer Mike Mearls explains that to determine whether to use an attack roll or a save, ask “Would a suit of plate mail protect from this?” Armor protects against darts, scythes, and so on, so traps using such hazards make attacks. Poisonous fumes, lightning, and mind blasts all ignore armor, so targets make saves. I would rather face a fireball protected by plate, but the rules emphasize the agility needed to escape the flames.

Originally, Tony Bath’s saving throws represented the value of armor. Now, saving throws only apply when armor can’t help.

Mearls confesses that the D&D rules don’t always make this save-or-attack distinction consistently. Plate mail certainly protects against falling rocks, and the falling-rock traps in the third-edition Dungeon Master’s Guide all make attacks. But the falling-rock traps in Lost Mine of Phandelver prompt dexterity saves. Better to leap from harm’s way, I suppose.

One area of inconsistency irks me.

Why should plate armor protect against the incorporeal, life-draining touch of creatures like specters and wraiths? Here, tradition and feel led the D&D designers to use attack rolls in a place where saving throws make more sense. If insubstantial creatures forced a target to make a dexterity saving throw, their life draining would imitate third edition’s touch attacks without a single extra rule. Plus, these undead would play like more distinct and interesting threats. Forget the feel of a to-hit roll, incorporeal creatures should force saving throws.

Proficiency and bounded accuracy in D&D Next

In my last post, I wrote about how the Dungeons & Dragons Next proficiency bonus jams all the tables and rules for attack bonuses and saving throw bonuses and check bonuses into a single rising bonus. This consolidation yields a simpler system, but the proficiency mechanic influences every corner of the game.

Attack roll tables from D&D Rules Cyclopedia

Proficiency bonuses increase slowly compared to similar bonuses in earlier versions of the game. They top out at a mere +6 at 19th level. This slow progression stems from a principle the designers called bounded accuracy, because none of the designers come from the marketing team. Actually, “accuracy” refers to bonuses to the d20 rolls made to hit, land spells, and make checks. Accuracy is “bounded” because the game no longer assumes characters will automatically gain steep bonuses as they advance to higher levels. See the Legends and Lore post, “Bounded Accuracy,” for more.
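
For comparison, the progression the published game eventually settled on compresses to a single line. A sketch (the playtest’s breakpoints differed slightly, topping out at 19th level rather than 17th):

```python
def proficiency_bonus(level: int) -> int:
    """Published fifth-edition progression: +2 at 1st, +6 from 17th on."""
    return 2 + (level - 1) // 4

print([proficiency_bonus(lv) for lv in (1, 5, 9, 13, 17, 20)])
# [2, 3, 4, 5, 6, 6] -- a four-point climb spread across twenty levels
```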

Bonus to attack

Before third-edition D&D, armor class never rose much. In “‘To Hit’ vs. Armor Class,” longtime D&D designer Steve Winter charts the progression between to-hit rolls and AC. Steve concludes, “In AD&D, as characters advance up the level scale, they constantly gain ground against the monsters’ defenses. A 15th-level fighter doesn’t just hit lower-level monsters more often; he hits all monsters, even those of his own level, more reliably than before.”

This meant that rising attack bonuses eventually made attack rolls into a formality. Mechanically that works, because in early editions, as fighters gained levels, their damage increased not because each blow dealt more damage, but because they hit more often.

But attack rolls benefit D&D for two reasons:

  • Hit-or-miss attack rolls add fun. To-hit rolls offer more drama than damage rolls, and the rolls provide intermittent, positive reinforcement to attacks. See “Hitting the to-hit sweet spot” for more.
  • If to-hit bonuses overwhelm armor bonuses, armor and armor class become meaningless to high-level combatants. Perhaps this finally explains the chainmail bikini.

To keep attack rolls meaningful, fourth edition makes ACs rise automatically, even though nothing in the game world justifies the rise. (You might say that the rise in AC reflects combatants’ rising ability to evade attacks, but a rise in hit points reflects the same slipperiness.) The steep rise in AC meant that lower-level creatures couldn’t hit higher-level combatants and forced all battles to feature combatants of similar levels. In 4E, physical armor just provides a flavorful rationale for the AC number appropriate for a level and role.

D&D Next returns to the older practice of making armor class a measure of actual armor, or at least something equivalent. At high levels, the game keeps to-hit rolls meaningful by limiting the proficiency bonus to that slight +6 at 19th level. With such a small bonus, to-hit rolls never climb enough to make armor pointless. For more, see “Bounded accuracy and matters of taste.”

In the last public playtest, and for the first time in D&D history, every class shares the same attack bonuses. In Next, characters don’t stand out as much for how often they hit as for what happens when they hit.

Bonus to checks

In third and fourth editions, characters gained steep bonuses to skill checks as they advanced in levels. Each game managed the bonuses in a different way, and each approach led to different problems.

In 3E, characters who improved the same skills with every level became vastly better at those skills than any character who lacked the skill. Eventually, DCs difficult enough to challenge specialists became impossible for parties that lacked a specialist. On the other hand, DCs easy enough to give non-specialists a chance became automatic for specialists. By specialists, I don’t mean a hyper-optimized, one-trick character, just a character who steadily improved the same skills.

In 4E, skills grant a constant +5 bonus, and every character gains a half-level bonus to every check, so everyone gets steadily better at everything. This approach means that no character grows vastly better than their peers at the same level. It does mean that by level 10, a wizard with an 8 Strength gains the ability to smash down a door as well as a first-level character with an 18 Strength. To keep characters challenged, and to prevent suddenly mighty, Strength-8 wizards from hulking out, 4E includes the “Difficulty class by level” table which appears on page 126 of the Rules Compendium. With this table in play, characters never improve their chance of making any checks; they just face higher DCs. Most players felt like their characters walked a treadmill that offered no actual improvements.

For more on checks in 3E and 4E, see “Two problems that provoked bounded accuracy.”

With the proficiency bonuses, D&D Next attempts to thread a needle. High-level bonuses should not reach so high that challenges for proficient characters become impossible for the rest. But the bonuses should go high enough to give proficient characters a chance to stand out and shine.

At the top end, a 19th-level character with a suitable 20 ability score and proficiency will enjoy a +11 to checks. This bonus falls well within the 1-20 range of a die roll, so most tasks within reach of a specialist also fall within the ability of a lucky novice. If anything, the maximum +11 for a talented, proficient, 19th-level superhero seems weak.

Two bonuses form that +11: the proficiency bonus and the ability modifier. To me, a proficiency bonus that starts at +2 at level 1 and rises to +6 at level 19 threads the needle well enough.
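
Worked out against a hard task, that needle-threading looks like this. The DC and the novice’s ability modifier are illustrative:

```python
def check_chance(bonus, dc):
    """Chance that d20 + bonus meets or beats the DC."""
    return max(0.0, min(1.0, (21 - (dc - bonus)) / 20))

HARD_DC = 20  # an illustrative high-level challenge

# 19th-level specialist: +6 proficiency plus +5 from a 20 ability score.
print(check_chance(11, HARD_DC))  # 0.6
# Lucky novice: no proficiency, +3 from a decent ability score.
print(check_chance(3, HARD_DC))   # 0.2 -- hard for the expert, possible for anyone
```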

New characters gain a +2 proficiency bonus as opposed to the +4 or +5 skill bonuses in the last two editions. This paints new D&D Next characters as beginners, little better than untrained. New characters must rely on talent to gain an edge.

However, talented characters barely gain any edge either. Typical new characters gain a +3 ability modifier from their highest score. I’ve shown that ability modifiers are too small for checks. Players make 11.3 attack rolls for every 1 check, according to plausible research that I just made up. With so many attacks, a +3 to-hit bonus lands extra hits. With so few checks, a +3 bonus ranks with the fiddly little pluses that the designers eliminate in favor of the advantage mechanic.

The playtest package’s DM Guidelines advise skipping ability checks when a character uses a high ability score: “Take into account the ability score associated with the intended action. It’s easy for someone with a Strength score of 18 to flip over a table, though not easy for someone with a Strength score of 9.” The D&D Next rules demand this sort of DM intervention because the system fails to give someone with Strength 18 a significant edge over a Strength 9 character. The result of the d20 roll swamps the puny +4 bonus. In practice, the system math makes flipping the table only slightly easier at Strength 18.
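
The arithmetic behind that complaint, with a hypothetical DC for flipping the table:

```python
def check_chance(bonus, dc):
    """Chance that d20 + bonus meets or beats the DC."""
    return max(0.0, min(1.0, (21 - (dc - bonus)) / 20))

TABLE_FLIP_DC = 12  # hypothetical

print(check_chance(4, TABLE_FLIP_DC))   # Strength 18: 0.65
print(check_chance(-1, TABLE_FLIP_DC))  # Strength 9:  0.40
# A 25-point swing; the d20 decides far more of the outcome than the scores.
```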

Update: The published game grants level-one characters a +2 proficiency bonus as opposed to the +1 that appeared in the final playtest.

In a curious move, the final public playtest packet eliminates the Thievery skill. Instead, the designers opt to make thieves proficient with thieves’ tools. Why? This results from the elimination of fiddly little pluses such as the +2 once granted by thieves’ tools. Without the +2, why bother with the tools? Now thieves need the tools to gain their proficiency bonus. Somewhere, sometime, a confused player will add a proficiency bonus they assume they have for thievery to a bonus for the tools, double-dipping.

Next: Saving throw proficiency and ghouls

Changing the balance of power

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

Skip Williams’s second-edition adventure Axe of Dwarvish Lords staged a type of battle no Dungeons & Dragons adventure has tried before or since: it pitted 13th- to 15th-level characters against a warren full of goblins. As you might expect, the warren’s individual goblins typically only hit on a 20, and only because everything hits on a 20. If one earned a lucky shot, he would inflict minimal damage. With any edition’s standing rules, 13th-level characters faced with goblins would simply grind out countless attacks against inconsequential resistance; the scenario fails. So Skip cheated, I mean, he designed new rules. The adventure adds two pages of rules for group tactics that allow the goblins to do things like volley arrows in area attacks and combine melee attacks to earn bonuses to hit. In this fourth-edition era, we’re used to monsters making exceptions to the rules, but not in 1999. Back then, monsters broke the rules because a bad DM thought he could win D&D. Personally, I liked the way the new rules enabled an otherwise unplayable confrontation, but when the goblins start breaking the rules as previously understood, I can imagine some players calling a cheat.

For the first time in D&D’s history, the next iteration attempts to enable playable confrontations between powerful characters and hordes of weak monsters, without resorting to special rules. The key, as I discussed in “Hitting the to-hit sweet spot,” is arranging everyone’s to-hit bonuses and armor classes into the small range that grants everyone a reasonable chance to hit.

D&D Next hits the sweet spot by limiting the to-hit bonuses characters gain in exchange for greater bonuses to the damage they inflict.

This exchange intentionally shifts one aspect of the game’s balance of power.

Low-power combatants benefit against high-power opposition

Mobs of weak monsters can threaten higher-level characters, still be able to hit, and let their numbers overcome the characters’ higher hit points. On the flip side, the dungeon master can pit parties against fewer, more powerful monsters, without having to select monsters specifically designed as solos or elites. This re-enables the sort of sandbox play where players can choose a difficulty level by plunging as deep into the dungeon as they dare.

High-power combatants lose against low-power opposition

When your legendary hero faces goblins, the damage each blow deals hardly matters, because dead is dead. But your hero’s chance of hitting a lowly goblin rarely improves. Your hero feels like a zero.

Meanwhile, in the DM’s chair, if you want to pit a single giant against a party of lower-level characters, the fight can go badly. The giant’s one attack often misses, but when it hits, it kills. As a DM, I still prefer a solo with lots of attacks, each inflicting lower damage. If monster designers look to give brutes alternate attacks that threaten many targets at once, then we enjoy the best of both worlds.

Fighters suffer the most

The accuracy-for-damage trade matters most to fighters. Fireball and Blade Barrier work as well as ever. The rogue remains content to sneak up on the goblin king. But fighter-types should hew through the rabble like grass until, bloodied and battle-worn, they stand triumphant. Instead, they wind up muffing to-hit rolls against one mook.

The game could stick with exponential power curves and narrow tiers of level-appropriate monsters, but I think better fixes exist.

For example, cleave-like maneuvers help by spreading damage across a string of attacks, but if your fighter’s first attack misses, your turn finishes and all the goblins laugh at you. Next’s whirlwind attack maneuver lets a fighter attack several adjacent enemies with a single attack roll, but fanning a bunch of goblins somehow seems even less heroic than missing just one.

Is the medicine worse than the disease?

Earlier editions of the game offer a solution, a solution so odious that I hesitate to mention it. If fighters gain multiple attacks per round, the misses matter less because there’s more where that came from!

Multiple attacks stink because resolution takes too long, especially if the fighter must roll damage and resolve each attack before moving on to the next swing. Also, D&D’s designers have struggled to parcel out extra attacks as fighters gain levels. Jumping from one attack directly to two results in a rather sudden leap in power. Instead, AD&D gave fighters extra half attacks, and a need to remember those half attacks between rounds, as the sketch below shows. Third edition traded half attacks and the memory issue for weaker attacks and fiddly attack penalties. Yuck.
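
For readers who never suffered it, the half-attack bookkeeping went something like this; a sketch of the AD&D-style 3/2 rate:

```python
def attacks_this_round(round_number, per_cycle=3, rounds_per_cycle=2):
    """AD&D-style fractional attacks, e.g. 3/2 attacks per round.

    The fighter swings twice on odd rounds and once on even rounds, so the
    player must remember which half of the cycle the fight is in.
    """
    base, extra = divmod(per_cycle, rounds_per_cycle)
    return base + (1 if (round_number - 1) % rounds_per_cycle < extra else 0)

print([attacks_this_round(r) for r in range(1, 5)])  # [2, 1, 2, 1]
```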

Multiple attacks also solve a problem Mike Mearls mentioned in a tweet: “Ability mod to damage unbalances at low levels, is irrelevant at high levels.” Without multiple attacks per round, a high-level fighter’s strength bonus to damage becomes inconsequential. With multiple attacks, each attack benefits from the bonus.

If D&D Next’s designers can find a good way to allow fighters and brutish monsters to gain multiple attacks against weaker opponents, then a key piece of the Next design puzzle falls into place.

Next:  Tracking initiative (I’m done with theory for a while.)

D&D Next trades to-hit bonuses for enhanced damage

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

As I discussed in “Riding the power curve,” the next iteration of Dungeons & Dragons attempts to straighten out fourth edition’s exponential power curve by refusing to let characters benefit from both steep bonuses to hit and big increases to damage. Instead, characters mostly get increases to damage.

When we compare D&D Next to early editions, Next limits the to-hit bonuses characters gain as they advance in exchange for greater bonuses to the damage they inflict.

Before I delve into the benefits and drawbacks of this exchange, I ought to address two practical objections to trading to-hit bonuses for damage.

Should skill increase damage?

Some argue that a more skillful combatant’s blows should not deal more damage. After all, a crossbow bolt always hits with the same force, so it should always strike with the same damage. Personally, when I’m struck by a crossbow bolt, I care deeply about where it hits. Maybe that’s just me.

As I explained in “The brilliance of unrealistic hit points,” hit points in D&D work as a damage-reduction mechanic. As characters increase in level, their rising hit points reduce the effective damage they suffer. Reasonably, as characters increase in level, they could also grow better at inflicting damage by overcoming defenses to strike vulnerable places or to apply more force to a blow. I’m no Miyamoto Musashi, but I’ve earned enough bruises sparring with practice swords to know that finding an opening to tap an opponent demands less skill than finding enough room for a kill strike—or even a cut.

And if you worry about unusual cases of oozes struck by crossbows, adjust at the table.

“The primary thing when you take a sword in your hands is your intention to cut the enemy, whatever the means. Whenever you parry, hit, spring, strike or touch the enemy’s cutting sword, you must cut the enemy in the same movement. It is essential to attain this. If you think only of hitting, springing, striking or touching the enemy, you will not be able actually to cut him.” Miyamoto Musashi, The Book of Five Rings

Hits inflict more than damage

In D&D, a hit can bring the threat of poison, level drain, and many other secondary effects. In these cases, the attack’s damage matters less than dealing the hit. A higher-level character’s chance to hit improves less, so their chance of inflicting secondary effects sees little improvement.

This matters, but it matters less than you may think.

First, to-hit rolls take a much smaller place in D&D Next than in 4E. D&D Next switches from non-AC defenses back to saving throws. Virtually all spell attacks return to skipping the to-hit roll entirely.

Second, attacks versus AC return to focusing on damage. To an extent, I liked how 4E added tactical richness to combat by devising interesting attacks. However, for my taste, too many effects appeared in play. I grew tired of seeing combatants perched on stacks of Alea markers, unable to do anything but stand and make saves.

In D&D Next, as in early editions, weapon attacks mostly inflict damage, and the attacks that threaten something like poison or level drain usually come from monsters.

Third, the saving throw returns as a defense against bad things other than damage. In 4E, hits against AC can inflict crippling effects without saves. Just getting hit automatically subjects you to poison, or paralysis, or whatever. In older editions, when the spider bit or the ghoul clawed, you took the damage but you also saved versus poison or paralysis. I appreciate 4E’s streamlined system, but dropping the defensive saving throw contributed to battlefields bogged down with more conditions and other markers than even the designers anticipated.

D&D Next brings back saving throws as a defense against effects like poison and level drain. We no longer need to rely on to-hit rolls as the final test of whether a poisoned dagger drew enough blood to overcome your constitution. Because monsters make most of the attacks that poison, paralyze, drain, and so on, most players should be happy to see the save return. Plus, despite the extra roll, the save probably speeds play by reducing the number of harmful conditions that take effect.

Despite these three points, in D&D Next, your high-level character is weaker when she makes attacks versus AC to inflict crippling effects. If I were to design, say, a poisoner class, I would make their chance to hit nearly automatic, and focus on saving throws as the principal defense against poison.

Next: Changing the balance of power

Riding the power curve through D&D’s editions

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

In the very first set of Dungeons & Dragons (1974) rules, every weapon dealt 1d6 damage. Short of magic, characters could only improve their damage output by improving their bonus to hit. More hits equals more damage. Soon, Supplement I: Greyhawk (1975) gave different weapons different damage dice and introduced the strength bonus to damage. Since then, each edition seems to give characters more ways to hit for more damage.

By the fourth edition, as characters leveled, they enjoyed steep increases in to-hit bonuses matched with unprecedented increases in the damage each attack dealt. This contributed to characters increasing exponentially in power. It explains why 4E monsters only remain effective for narrow bands of levels, and it explains the nervous tics of every DM who has run an epic table. In past editions, only the wizard saw that kind of power curve, and the non-wizards eventually grew tired of serving as wand caddies for the Wiz.

D&D Next aims to create a power curve in line with earlier editions, while preventing the runaway power traditional for wizards. If you prefer the exponential power curve created in 4E, then you might have to look for a legendary hero module in Next, or stick with 4E and bless any dungeon master eager to run a high-level game.

Greyhawk also introduced Weapon Armor Class Adjustment, a chart that granted bonuses to hit based on how well your particular weapon worked against a style of armor. The table only makes sense because, in the original game, armor class really represented a particular style of armor, such as leather or chainmail. Obviously, dexterity and magical bonuses to armor class quickly turned the table into nonsense. (If you want to make sense of the table, you must apply the dexterity and magical modifiers as penalties to the attack roll.) In practice, no one used the table and the “class” in armor class lost significance.

While D&D Next thankfully steers clear of weapon armor class adjustment, the system returns to the older practice of making armor class a measure of actual armor, or at least something equivalent.

The D&D Next approach brings back a problem that has bedeviled every edition of the game except fourth. In D&D, to-hit bonuses rise automatically, level after level, while armor class remains roughly the same. Sure, as characters acquire better equipment, armor class improves a little, and in most D&D editions AC starts a little ahead. But characters gain to-hit bonuses automatically, and eventually, inevitably, to-hit bonuses outrun armor class. Everyone begins to hit all the time. As I explained in “Hitting the to-hit sweet spot,” D&D works best when combatants hit between 30% and 70% of the time.

Fourth edition fixes the problem by granting everyone automatic increases to AC to match their automatic increases in to-hit bonuses. Now armor class becomes a function of a character or monster’s role and its level. Any reasonably optimal character would boast the same AC as peers in the same role. Armor exists as the flavorful means that some characters used to reach the armor class dictated by their role. This kept armor classes on par with bonuses to hit, while making monster design simple.

D&D Next attacks the old problem from the opposite direction. Instead of matching automatic increase with automatic increase, D&D Next limits to-hit bonuses so they never overwhelm the relatively static range of armor classes.

In 4E, in defense as in offense, characters increase exponentially in power. The fixed AC bonuses that 4E granted with each level combined with rising hit points to grant everyone steady increases to two forms of defense. You automatically get harder to hit even as the hits do less effective damage. If you’re twice as hard to hit and you can sustain twice the damage, your defenses are four times better.
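
The four-times figure falls out of treating both defenses as damage reduction. A sketch with illustrative numbers:

```python
def expected_attacks_survived(hit_points, chance_to_be_hit, avg_damage=10):
    """Fold chance-to-be-hit into hit points as one pool of staying power."""
    return hit_points / (chance_to_be_hit * avg_damage)

before = expected_attacks_survived(30, 0.60)  # 5 attacks
after = expected_attacks_survived(60, 0.30)   # 20 attacks
print(after / before)  # 4.0 -- twice the hit points times twice as hard to hit
```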

D&D Next attempts to straighten out the exponential power curve by refusing to let characters double-dip. Rather than gaining steep bonuses to hit along with increases to damage, you just get increases to damage. Rather than gaining constant improvements to armor class along with additional hit points, you just gain additional hit points. Of course, I’m simplifying to make a point. Characters still gain bonuses to hit as they advance, but they gain at a fraction of the rate seen in third and fourth edition.

When we compare D&D Next to early editions of D&D, the design reins in the to-hit bonuses characters gain as they advance. In compensation, characters gain greater bonuses to the damage they inflict. Like any design decision, this strategy makes some trade-offs, which I will explore in an upcoming post.

Next: Bounded accuracy and matters of taste

Hitting the to-hit sweet spot

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

Throughout the evolution of Dungeons & Dragons, the game has used two mechanics to determine an attack’s damage output: to-hit rolls and damage rolls.

In D&D, hitting is fun, while missing feels like a bummer, particularly if you waited a while for your turn. Why not remove misses from the game? Once characters gain enough hit points to survive a few attacks, D&D could drop to-hit rolls and—with a few adjustments—still work.

Skipping damage rolls

Back in the third-edition era, Jonathan Tweet advised dungeon masters to speed high-level fights by substituting average damage for rolled damage. In fourth edition, first-level characters have enough hit points to make this approach possible at any level. In a post for The Dungeon Master Experience, Chris Perkins explains how he uses nearly average damage plus 1d6, added for a bit of flavor.

The Chivalry & Sorcery game (1977) skipped damage rolls, assuming “that the damage inflicted by a particular weapon in the hands of given character will be more or less constant.”

The notion of dropping to-hit rolls may seem strange, but here’s the crux: Against high hit points, to-hit rolls just turn into another damage-reduction mechanic. In fourth edition, everyone but minions has fairly high hit points and everyone hits about 60% of the time. The damage dealt in combat would work out about the same if we just assumed everyone always hits and multiplied everyone’s hit points by 1.67.
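
The 1.67 figure is just the reciprocal of the hit chance: expected damage per swing equals hit chance times average damage, so dropping the roll while scaling hit points by 1/0.6 leaves the expected length of a fight unchanged. A sketch with illustrative numbers:

```python
AVG_DAMAGE = 9
HIT_CHANCE = 0.60
HIT_POINTS = 54

# With to-hit rolls: expected swings needed to drop the target.
swings_with_rolls = HIT_POINTS / (HIT_CHANCE * AVG_DAMAGE)

# Always hitting, with hit points multiplied by 1 / 0.6 (about 1.67).
swings_always_hit = (HIT_POINTS / HIT_CHANCE) / AVG_DAMAGE

print(swings_with_rolls, swings_always_hit)  # 10.0 10.0 -- the same on average
```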

Of course, your to-hit percentage varies from a set percentage, because armor can make hitting more difficult. How can a system without to-hit rolls account for armor class? In my post, “The brilliance of unrealistic hit points,” I explained how hit points in D&D function as an ingenious damage-reduction mechanic. Virtually every tabletop role-playing game that aimed for realism made armor reduce damage. In our hypothetical rules variant, why not use D&D’s damage-reduction mechanic to represent protective armor? Suppose different types of armor multiplied your hit points. Suppose high dexterity increased hit points, giving more ability to “turn deadly strikes into glancing blows,” just like the game says.

Does abandoning to-hit rolls seem crazy? It has been done.

Tunnels and Trolls (1975) stands as the one early system that defied the RPG hobby’s early mania for realism. Ken St Andre, T&T’s designer, aimed for greater simplicity. T&T drops the to-hit roll entirely. Combatants simply weigh damage rolls against each other. Like Tunnels & Trolls, most RPG-inspired video games seem to drop anything equivalent to a to-hit roll. The damage simply rises as you click away at your enemy.

Drawbacks of always hitting

While undeniably simpler, and probably about as realistic as D&D, the you-always-hit approach suffers three problems:

  • Especially with ranged attacks, rolling to hit fits real-world experience much more closely than assuming a hit and skipping straight to damage. Technically speaking, skipping the to-hit roll feels bogus.
  • The two possible outcomes of a to-hit roll offer more drama than the mostly-average damage roll.
  • The to-hit roll provides intermittent, positive reinforcement to the process of making an attack.

You know all about positive reinforcement. It makes you stay up late chasing one more level when you should be in bed. Reinforcement that randomly rewards a behavior is more powerful than reinforcement that occurs every time. In a casino, the best bet comes from the change machines, which pay 1-for-1 every single time. Intermittent, positive reinforcement drives people to the slot machines. As the Id DM might say, “The variable ratio schedule produces both the highest rate of responding and the greatest resistance to extinction.” In short, hitting on some attacks is more fun than hitting on every attack.

Although D&D uses a d20 roll to hit, the game plays best when the odds of hitting stand closer to a coin flip. At the extremes, the math gets strange. For combatants who almost always hit, bonuses and penalties become insignificant. For combatants looking for a lucky roll, small bonuses can double or triple their chance to hit. Also, the game plays better when your chance of hitting sits around 60%. When your chance nears 95%, the roll becomes a formality. When your chance dwindles toward 5%, the to-hit roll feels like a waste of time, especially because beating the long odds probably still earns minimal damage.
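
A sketch of that strangeness, charting hit chance against the roll you need (natural 1s and 20s ignored):

```python
def hit_chance(needed_roll):
    """Chance of rolling the needed number or higher on a d20."""
    return (21 - needed_roll) / 20

# Near-certain hitters barely notice a +2 bonus ...
print(hit_chance(4), hit_chance(2))    # 0.85 -> 0.95
# ... while a combatant needing a lucky roll doubles their odds with it.
print(hit_chance(19), hit_chance(17))  # 0.10 -> 0.20
```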

When the chance of hitting stays between, say, 30% and 70%, the game reaps play benefits:

  • The to-hit roll provides intermittent, positive reinforcement to the process of making an attack.
  • Many attacks can hit and inflict damage, providing constant, positive feedback to players while everyone contributes to the fight.

So D&D designers face the challenge of arranging to-hit bonuses and armor classes so to-hit rolls require a result between a 7 and 15, a preciously small sweet spot for all the levels, magic, and armor classes encompassed by the game. In practice, designers like to push your chance to hit toward the top of the 30-70% range, because while missing remains part of the game, it’s no fun.

The Palladium Role-Playing Game (1983) recognized the play value of the to-hit roll, but it avoided D&D’s problem of finessing to-hit bonuses and armor classes to reach a sweet spot. The game simply declares that “any [d20] roll above four (5 to 20) hits doing damage to the opponent.” Armor in the game takes damage, effectively serving as an additional pool of hit points.

In upcoming posts, I’ll discuss the very different steps the D&D Next and 4E designers took to find the to-hit sweet spot.

Next: Riding the power curve