
Riding the power curve through D&D’s editions

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

In the very first set of Dungeons & Dragons (1974) rules, every weapon dealt 1d6 damage. Short of magic, characters could only improve their damage output by improving their bonus to hit. More hits equals more damage. Soon, Supplement I: Greyhawk (1975) gave different weapons different damage dice and introduced the strength bonus to damage. Since then, each edition seems to give characters more ways to hit for more damage.

By the fourth edition, as characters leveled, they enjoyed steep increases in to-hit bonuses matched with unprecedented increases in the damage each attack dealt. This contributed to characters increasing exponentially in power. It explains why 4E monsters only remain effective for narrow bands of levels, and it explains the nervous tics of every DM who has run an epic table. In past editions, only the wizard saw that kind of power curve, and the non-wizards eventually grew tired of serving as wand caddies for the Wiz.

D&D Next aims to create a power curve in line with earlier editions, while preventing the runaway power traditional for wizards. If you prefer the exponential power curve created in 4E, then you might have to look for a legendary hero module in Next, or stick with 4E and bless any dungeon master eager to run a high-level game.

Greyhawk also introduced Weapon Armor Class Adjustment, a chart that granted bonuses to hit based on how well your particular weapon worked against a style of armor. The table only makes sense because, in the original game, armor class really represented a particular style of armor, such as leather or chainmail. Obviously, dexterity and magical bonuses to armor class quickly turned the table into nonsense. (If you want to make sense of the table, you must apply the dexterity and magical modifiers as penalties to the attack roll.) In practice, no one used the table and the “class” in armor class lost significance.

While D&D Next thankfully steers clear of weapon armor class adjustment, the system returns to the older practice of making armor class a measure of actual armor, or at least something equivalent.

The D&D Next approach brings back a problem that has bedeviled every edition of the game except fourth. In D&D, to-hit bonuses rise automatically, level after level, while armor class remains roughly the same. Sure, as characters acquire better equipment, armor class improves a little, and in most D&D editions AC starts a little ahead. But characters gain to-hit bonuses automatically, and eventually, inevitably, to-hit bonuses outrun armor class. Everyone begins to hit all the time. As I explained in “Hitting the to-hit sweet spot,” D&D works best when combatants hit between 30% and 70% of the time.

Fourth edition fixes the problem by granting everyone automatic increases to AC to match their automatic increases in to-hit bonuses. Armor class becomes a function of a character or monster’s role and its level. Any reasonably optimal character boasts the same AC as peers in the same role. Armor exists as the flavorful means some characters use to reach the armor class dictated by their role. This keeps armor classes on par with bonuses to hit, while making monster design simple.

D&D Next attacks the old problem from the opposite direction. Instead of matching automatic increase with automatic increase, D&D Next limits to-hit bonuses so they never overwhelm the relatively static range of armor classes.

In 4E, in defense as in offense, characters increase exponentially in power. The fixed AC bonuses that 4E granted with each level combined with rising hit points to grant everyone steady increases to two forms of defense. You automatically get harder to hit even as the hits do less effective damage. If you’re twice as hard to hit and you can sustain twice the damage, your defenses are four times better.
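The arithmetic behind that claim can be sketched in a few lines. The numbers below are invented for illustration, not drawn from any rulebook:

```python
def effective_durability(hit_points, chance_to_be_hit):
    """Expected number of enemy attacks needed to drop a combatant,
    with each attack dealing one unit of damage and landing with the
    given probability."""
    return hit_points / chance_to_be_hit

# Doubling hit points while halving the enemy's chance to hit
# quadruples durability.
before = effective_durability(30, 0.6)
after = effective_durability(60, 0.3)
print(round(after / before))   # -> 4
```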

D&D Next attempts to straighten out the exponential power curve by refusing to let characters double-dip. Rather than gaining steep bonuses to hit along with increases to damage, you just get increases to damage. Rather than gaining constant improvements to armor class along with additional hit points, you just gain additional hit points. Of course, I’m simplifying to make a point. Characters still gain bonuses to hit as they advance, but they gain at a fraction of the rate seen in third and fourth edition.

Compared with early editions of D&D, the D&D Next design reins in the to-hit bonuses characters gain as they advance. In compensation, characters gain greater bonuses to the damage they inflict. Like any design decision, this strategy makes some trade-offs, which I will explore in an upcoming post.

Next: Bounded accuracy and matters of taste

Hitting the to-hit sweet spot

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

Through its evolution, Dungeons & Dragons has used two mechanics to determine an attack’s damage output: to-hit rolls and damage rolls.

In D&D, hitting is fun, while missing feels like a bummer, particularly if you waited a while for your turn. Why not remove misses from the game?  Once characters gain enough hit points to survive a few attacks, D&D could drop to-hit rolls and─with a few adjustments─still work.

Skipping damage rolls

Back in the third-edition era, Jonathan Tweet advised dungeon masters to speed high-level fights by substituting average damage for rolled damage. In fourth edition, first-level characters have enough hit points to make this approach possible at any level. In a post for his The Dungeon Master Experience column, Chris Perkins explains how he uses nearly average damage plus 1d6, added for a bit of flavor.
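The substitution rests on simple arithmetic: a die with S sides averages (S + 1) / 2. A minimal sketch:

```python
def average_damage(num_dice, sides, bonus=0):
    """Average result of rolling num_dice dice with the given number of
    sides, plus a flat bonus: each die averages (sides + 1) / 2."""
    return num_dice * (sides + 1) / 2 + bonus

# A 2d6+4 swing averages 11 damage, so a DM in a hurry can simply
# deal 11 instead of rolling.
print(average_damage(2, 6, 4))   # -> 11.0
```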

The Chivalry & Sorcery game (1977) skipped damage rolls, assuming “that the damage inflicted by a particular weapon in the hands of given character will be more or less constant.”

The notion of dropping to-hit rolls may seem strange, but here’s the crux: against high hit points, to-hit rolls just turn into another damage-reduction mechanic. In fourth edition, everyone but minions has fairly high hit points and everyone hits about 60% of the time. The damage dealt in combat would work out about the same if we just assumed everyone always hits and multiplied everyone’s hit points by 1.67.
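As a rough sketch of that equivalence (the damage and hit point figures here are invented for illustration):

```python
hit_chance = 0.6      # everyone hits about 60% of the time in 4E
avg_damage = 10       # illustrative figure, not from any rulebook
hit_points = 120

# Rolling to hit: each attack deals damage only 60% of the time.
rounds_with_to_hit = hit_points / (hit_chance * avg_damage)

# Always hitting, against a hit point pool inflated by 1 / 0.6.
rounds_always_hit = (hit_points / hit_chance) / avg_damage

# Both fights last the same expected number of rounds.
print(round(rounds_with_to_hit, 6), round(rounds_always_hit, 6))
print(round(1 / hit_chance, 2))   # -> 1.67
```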

Of course, your chance to hit varies from that set percentage, because armor can make hitting more difficult. How can a system without to-hit rolls account for armor class? In my post, “The brilliance of unrealistic hit points,” I explained how hit points in D&D function as an ingenious damage-reduction mechanic. Virtually every tabletop role-playing game that aimed for realism made armor reduce damage. In our hypothetical rules variant, why not use D&D’s damage-reduction mechanic to represent protective armor? Suppose different types of armor multiplied your hit points. Suppose high dexterity increased hit points, giving more ability to “turn deadly strikes into glancing blows,” just like the game says.
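As a sketch of this hypothetical variant, armor and dexterity could fold into the hit point pool instead of armor class. The multipliers below are my own invention, not drawn from any edition of D&D:

```python
# Hypothetical no-to-hit-roll variant: armor and dexterity, which
# normally make you harder to hit, instead multiply your hit points.
ARMOR_MULTIPLIER = {"none": 1.0, "leather": 1.25, "chain": 1.5, "plate": 2.0}

def effective_hit_points(base_hp, armor, dex_bonus=0):
    """Fold damage avoidance into the hit point pool."""
    dex_multiplier = 1.0 + 0.1 * dex_bonus  # assumed: +10% hp per point of bonus
    return base_hp * ARMOR_MULTIPLIER[armor] * dex_multiplier

# A 40-hp fighter in plate with a +2 dexterity bonus:
print(round(effective_hit_points(40, "plate", dex_bonus=2)))   # -> 96
```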

Does abandoning to-hit rolls seem crazy? It has been done.

Tunnels & Trolls (1975) stands as the one early system that defied the RPG hobby’s early mania for realism. Ken St. Andre, T&T’s designer, aimed for greater simplicity. T&T drops the to-hit roll entirely. Combatants simply weigh damage rolls against each other. Like Tunnels & Trolls, most RPG-inspired video games seem to drop anything equivalent to a to-hit roll. The damage simply rises as you click away at your enemy.

Drawbacks of always hitting

While undeniably simpler, and probably about as realistic as D&D, the you-always-hit approach suffers three problems:

  • Especially with ranged attacks, rolling to hit fits real-world experience much more closely than assuming a hit and skipping straight to damage. Technically speaking, skipping the to-hit roll feels bogus.
  • The two possible outcomes of a to-hit roll offer more drama than the mostly-average damage roll.
  • The to-hit roll provides intermittent, positive reinforcement to the process of making an attack.

You know all about positive reinforcement. It makes you stay up late chasing one more level when you should be in bed. Reinforcement that randomly rewards a behavior is more powerful than reinforcement that occurs every time. In a casino, the best bet comes from the change machines, which pay 1-for-1 every single time. Intermittent, positive reinforcement drives people to the slot machines. As the Id DM might say, “The variable ratio schedule produces both the highest rate of responding and the greatest resistance to extinction.” In short, hitting on some attacks is more fun than hitting on every attack.

Although D&D uses a d20 roll to hit, the game plays best when the odds of hitting stand closer to a coin flip. At the extremes, the math gets strange. For combatants who almost always hit, bonuses and penalties become insignificant. For combatants looking for a lucky roll, small bonuses can double or triple their chance to hit. Also, the game plays better when your chance of hitting sits around 60%. When your chance nears 95%, the roll becomes a formality. When your chance dwindles toward 5%, the to-hit roll feels like a waste of time, especially because beating the long odds probably still earns minimal damage.
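A quick sketch shows the odd math at the extremes (the helper below is my own, not a rulebook formula):

```python
def chance_to_hit(target_number):
    """Chance of rolling target_number or higher on a d20, clamped so
    a natural 20 always hits and a natural 1 always misses."""
    successes = max(1, min(20, 21 - target_number))
    return successes / 20

# Near the middle, +1 to hit is a modest relative gain:
print(chance_to_hit(11), chance_to_hit(10))   # 0.5 0.55 -- 10% better

# Against long odds, the same +1 doubles the chance to hit:
print(chance_to_hit(20), chance_to_hit(19))   # 0.05 0.1
```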

When the chance of hitting stays between, say, 30% and 70%, the game reaps play benefits:

  • The to-hit roll provides intermittent, positive reinforcement to the process of making an attack.
  • Many attacks can hit and inflict damage, providing constant, positive feedback to players while everyone contributes to the fight.

So D&D designers face the challenge of arranging to-hit bonuses and armor classes so to-hit rolls require a result between 6 and 14, a preciously small sweet spot for all the levels, magic, and armor classes encompassed by the game. In practice, designers like to push your chance to hit toward the top of the 30-70% range, because while missing remains part of the game, it’s no fun.

The Palladium Role-Playing Game (1983) recognized the play value of the to-hit roll, but it avoided D&D’s problem of finessing to-hit bonuses and armor classes to reach a sweet spot. The game simply declares that “any [d20] roll above four (5 to 20) hits doing damage to the opponent.” Armor in the game takes damage, effectively serving as an additional pool of hit points.

In upcoming posts, I’ll discuss the very different steps the D&D Next and 4E designers took to find the to-hit sweet spot.

Next: Riding the power curve

The brilliance of unrealistic hit points

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

After the role-playing game hobby’s first 10 years, designers turned from strict realism and began to design rules that both supported a game’s flavor and encouraged its core activities. Runequest‘s realistically lethal combat system fit the fearful world of Call of Cthulhu (1981), as did a new sanity system. Paranoia (1984) built in rules that encouraged a core activity of treachery, while giving each character enough clones to avoid hard feelings.

Today, this innovation carries through stronger than ever. Dungeons & Dragons’ fourth-edition designers saw D&D’s fun in dynamic battles and showing off your character’s flashy capabilities, so they optimized rules that heightened that aspect of the game, possibly at the expense of other aspects.

When Dave Arneson mashed rules for ironclads into Chainmail, he probably gave little thought to supporting the D&D play style that would launch a hobby, but he created some brilliant conventions.

The best idea was to give characters steadily increasing hit point totals that “reflect both the actual physical ability of the character to withstand damage─as indicated by constitution bonuses─and a commensurate increase in such areas as skill in combat and similar life-or-death situations, the ‘sixth sense’ which warns the individual of otherwise unforeseen events, sheer luck and the fantastic provisions of magical protections and/or divine protection.” (Gary wrote this rationale for hit points in the first edition Dungeon Master’s Guide.)

Every “realistic” system to follow D&D used hit points to measure a character’s physical capacity to survive injury. In D&D, rising hit points instead work as an elegant damage-reduction mechanic. Using hit points for damage reduction boasts a number of virtues:

  • Combat plays fast because players do not have to calculate reduced damage for every single hit.
  • Although damage is effectively reduced, the reduction never makes a combatant impervious to damage.
  • Once characters gain enough points to survive a few blows, hit points provide a predictable way to see the course of battle. If a fight begins to go badly, the players can see their peril and bring more resources like spells and potions to the fight, or they can run. In a realistic fight, things can go bad in an instant, with a single misstep resulting in death.
  • Most attacks can hit and inflict damage, providing constant, positive feedback to players while everyone contributes to the fight. Realistic combatants do not wear down from dozens of damaging blows; instead each hit is likely to kill or maim. In more realistic systems like Runequest and GURPS, when two very skilled combatants face off, they block or dodge virtually all attacks. The duels turn static until someone muffs a defense roll and lets a killing blow slip through. This model may be realistic─it reminds me of those Olympic competitions where years of training turn on a single, split-second misstep─but the realistic model lacks fun. No popular sports begin as sudden-death competitions where the first to score wins.
  • Battles can gain a dramatic arc. Fights climax with bloodied and battle-worn combatants striving to put their remaining strength into a killing blow. No one likes to see the climactic battle fizzle with a handful of bad rolls, especially at their character’s expense.

Bottom line: Using hit points for damage reduction enables a combat system where you can hit a lot, and hitting is fun.

Critics of inflated hit points still had a point. Using hit points as a damage-reduction mechanic can strain credulity, especially when you cannot explain how a character could reasonably reduce the damage he takes. Why should an unconscious or falling hero be so much more durable than a first-level mook?  Why does cure light wounds completely heal the shopkeeper and barely help a legendary hero? Over the years, we’ve seen attempts to patch these problems. For example, I liked how fourth edition’s healing surge value made healing proportional to hit points, so I’m sorry to see D&D Next turn back to the traditional hierarchy of cure spells.

D&D maintains a deliberate vagueness about the injuries inflicted by a hit. This abstraction makes possible D&D’s brilliant use of hit points as a damage-reduction mechanic. Fourth edition exploits the ambiguity more than ever, making plausible the second wind and the healing power of a warlord’s inspiration. 4E explicitly makes hit points as much a measure of resolve as of skill, luck and physical endurance. Damage apparently exists as enough of an abstraction that even if a hit deals damage, it doesn’t necessarily draw blood.

Even as 4E aims for the loosest possible interpretation of a hit, it makes the hit roll more important than in any prior edition. In 4E, melee hits can inflict crippling effects without saves. Just getting hit automatically subjects you to poison, or paralysis, or whatever. In past editions, if the spider bit or the ghoul clawed, you took the damage, but you still got an immediate save.

In the early days of the RPG hobby, many games attempted to fuse D&D’s fantastic setting with a more realistic model of combat damage. Although a few of these games enjoyed success, none recreated the combat-intensive, dungeon-bashing play style pioneered by D&D. At the time, no one seemed to realize that the clever damage-reduction mechanism built into the game enabled the game’s play style.

Video game designers figured it out. Virtually every video game that combines fighting with character improvement features D&D-style rising hit points.

Next: Hitting the to-hit sweet spot

What does D&D have to do with ironclad ships?

When Dave Arneson set out to create the combat system that would become a pillar of Dungeons & Dragons, he did not aim to create a realistic simulation. In a 2004 interview, he describes the system’s genesis from Gary Gygax’s Chainmail rules.

Combat in Chainmail is simply rolling two six-sided dice, and you either defeated the monster and killed it…or it killed you. It didn’t take too long for players to get attached to their characters, and they wanted something detailed which Chainmail didn’t have. The initial Chainmail rules was a matrix. That was okay for a few different kinds of units, but by the second weekend we already had 20 or 30 different monsters, and the matrix was starting to fill up the loft.

I adopted the rules I’d done earlier for a Civil War game called Ironclads that had hit points and armor class. It meant that players had a chance to live longer and do more. They didn’t care that they had hit points to keep track of because they were just keeping track of little detailed records for their character and not trying to do it for an entire army. They didn’t care if they could kill a monster in one blow, but they didn’t want the monster to kill them in one blow.

So the D&D rules for hit points and armor class stem from rules for ironclad ships trading cannon blasts, hardly the basis for an accurate simulation of hand-to-hand battles.

Soon after I began playing D&D, the unrealistic combat rules began to gnaw at me. In the real world, armor reduces the damage from blows rather than making you harder to hit. Shouldn’t it work the same way in the game? And how could a fighter, no matter how heroic, survive a dozen arrow hits, each dealing enough damage to kill an ordinary man? In reality, a skilled fighter would stand a better chance of evading blows, but no better chance of surviving a single hit.

Quest for realism

In the decade after D&D’s introduction, a mania for creating realistic alternatives to D&D dominated the hobby. Every D&D player who ever wielded a sword in the Society for Creative Anachronism cooked up a more realistic alternative to the D&D combat system. Runequest (1978) stands as the greatest early success. Characters’ hit points remained constant, but they became more able to dodge and block blows. Hit locations transformed characters from blobs of hit points into flesh and bone. Armor reduced damage by deflecting and cushioning blows.

If you enjoyed the AD&D Weapon Armor Class Adjustment table, but felt it needed to go much, much further, Rolemaster’s Arms Law (1980) offered more than 30 tables matching weapons versus armor.

In this era, everyone formulated a critical hit table, because nothing adds fun to a system like skewered eyes, fountaining stumps, and sucking chest wounds. (Follow this blog for my upcoming list of supposedly fun, but not fun, things we did in the early days of role playing.)

I sought realism as much as anyone, first with Runequest, and then with GURPS. I quickly learned that making combat more realistically deadly made D&D-style, combat-intensive play impractical. Forget dungeon crawls; even skilled characters would eventually perish to a lucky blow. As I described in Melee, Wizard, and learning to love the battle map, early D&D combat lacked excitement anyway, so I hardly missed all the fights.

But I would come to realize that my dismissal of the D&D combat system was completely wrong.

Next: The brilliance of unrealistic hit points

Designing for spells that spoil adventures

In my last two posts, starting with Spells that can ruin adventures, I discussed the various spells with the potential to spoil Dungeons & Dragons adventures, turning hours of fun into a quick ambush. You may say, “Why worry? Just rule that these spells don’t exist in your campaign.” That assumes you have enough foresight to examine the spell lists in advance, flagging the dangerous spells and magic items that might ruin your campaign plans. Of course, you could also rule that Zone of Truth doesn’t exist in your game the minute it becomes a problem. But your players will hate that.

The D&D system’s spells and magic contribute to an implied setting that most D&D players and DMs share. As a DM, you can ban spells, but that offers no help for authors of adventures for organized play or for publication. Authors writing D&D fiction also must work around these spells, or ignore them and hope the readers fail to notice.

Fourth edition attempted to eliminate every last adventure-ruining effect. Fly effects really just let you jump. The ethereal plane is gone, or at least inaccessible. Linked portals replace the long-range teleport spell. While I favor this approach over keeping all the problem spells in the system, I concede that the purge might have been heavy-handed.

So that brings us to today. Seeing Zone of Truth in the D&D Next spell list inspired me to write these posts. These spells and effects need a careful weighing of the benefits they offer to the game, and more thought about how they affect adventures and the implied game setting.

For the designers of D&D, I have the following suggestions:

  • Spells that compel honesty or discern lies do not add enough to earn a place in the game. They could exist as optional elements.
  • Spells that detect evil should only detect the supernatural evil of undead, outsiders and the like.
  • Divination spells must provide hints and clues rather than unequivocal answers, and should discourage players from seeking answers too often.
  • Scry spells must be subject to magical and mundane counters such as the metal sheeting that blocked Clairvoyance and Clairaudience in the first edition.
  • Scry spells should never target creatures, like Scrying, but only known locations, like Clairvoyance and Clairaudience.
  • Ethereal travel must be subject to barriers such as gorgon’s blood mortar, permanent ethereal objects, and perhaps even vines, as mentioned in the original Manual of the Planes.
  • The game should offer some magical countermeasures to teleportation, such as Anticipate Teleport, and the ability to make these spells permanent.
  • The Dungeon Master’s Guide needs a chapter on magical effects that the DM should plan for in campaign and adventure design, starting with fly and divination.

Next: But how do you win?

The Dungeon Master’s Guide 2 remakes the skill challenge

(Part 4 of a series, which begins with Evolution of the skill challenge.)

Just a year after fourth edition’s debut, the Dungeon Master’s Guide 2 upended the original skill challenge. The new material makes just one specific revision to the original rules:  It provides new numbers for challenge complexity and difficulty class to address serious problems with skill challenge math.

Beyond the numbers, I suspect the designers sought to remake the skill challenge as much as possible without scrapping the existing rules. The big changes come from original rules that are now ignored, and from advice and examples that completely remake how challenges run at the table.

The Dungeon Master’s Guide 2 strips away the formal game-within-a-game implied by the original skill challenge: The structure of rolling for initiative and taking turns is gone; the new summary contains no mention of it. In the example skill challenge, the players jump in to act as they wish.

I disliked the story-game style implied by the original skill challenge rules, and I welcomed the new advice. But the core of the original rules remained, and some friction existed between those rules and the recast skill challenge. In this post, I will explore some points of friction, and discuss some ways to overcome them.

Scoring with failed checks discourages broad participation

The 4E designers tried to match the formulas for constructing a combat encounter with similar formulas for a skill challenge. So a skill challenge’s complexity stems from the number and difficulty of successes required─an odd choice in a way. You don’t grant experience in a combat encounter by counting how many attacks score hits.

This scorekeeping works fine when you run a skill challenge as a collaborative storytelling game within a game.

In the original skill challenge, every character had a turn, and no one could pass. This forced every player to participate. The new challenge drops the formal structure, leaving the DM with the job of getting everyone involved. The DMG2 helps with advice for involving every character. However, the players know three failed skill checks add up to a failed challenge, so some players will resist making any checks for fear of contributing to an arbitrary count of failures and a failed challenge. This stands in total opposition to the original ideal where everyone contributes.

Obviously, some failed skill checks will bring the players closer to a disaster, by alerting the guards, collapsing the tunnel, or whatever. On the other hand, the foreseeable, game-world consequences of some failures do not lead to disaster, yet players worry about attempting, say, an innocuous knowledge check because they metagame the skill challenge.

Hint: You can encourage more players to participate in a skill challenge by forcing the characters to tackle separate tasks simultaneously. For instance, if the characters only need to gain the support of the head of the merchant council, then typically one player makes all the diplomacy rolls. If the characters must split up to convince every member of the merchant council before their vote, then every player must contribute. Just give the players enough information to know which methods of persuasion will work best on which members of the council.

Scorekeeping may not match game world

In the story-game style of the original skill challenge, the players’ score can exist as a naked artifice of the game, just like the turns the rules forced them to take. I suspect that the original vision of the skill challenge assumed the DM would tell players their score of successes and failures. After all, the players could even keep accurate score themselves. This avoided the need to provide game-world signs of success or failure as the players advanced through the challenge. After the skill challenge finished, you could always concoct a game-world explanation for the challenge’s outcome.

Now on page 83, the DMG2 tells you to “grant the players a tangible congruence for the check’s success or failure (as appropriate), one that influences their subsequent decisions.” (In word choices like “tangible congruence,” Gary’s spirit lives!)

This works best if the challenge’s route to failure differs from the players’ route to success. For example, if the players must infiltrate the center of the enemy camp without raising an alarm, then their successes can bring them closer to their goal even as their failures raise suspicion and take them closer to failure. These sorts of challenges create a nice tension as the players draw closer to both victory and defeat.

If moving toward success necessarily moves the players away from failure, then running the challenge poses a problem.

The first fourth edition Dungeon Master’s Guide introduced the skill challenge mechanic with an example where the players attempt to persuade the duke before he grows too annoyed to listen. Good luck role-playing the duke’s demeanor as he stands poised one success away from helping while also one failure away from banishing the players.

Even worse, if a skill challenge lacks any clear marker of failure, running the challenge presents a problem. The first D&D Encounters season, Halaster’s Last Apprentice, included a challenge where the players seek to find hidden chambers in the Undermountain before they amass the three failures allowed by the rules. Why do three failures end this challenge? Is it because the players grow restless and are now all on their smart phones? The adventure suggests that rival groups might be seeking the lost chambers, but it fails to capitalize on this. The adventure follows the conventional advice by taxing each player a healing surge, and then saying that they found the crypt anyway.

“Why do we lose a healing surge?”

“Well, you know, dungeon stuff.”

Why is the game turning the dungeon stuff into a die-rolling abstraction? I thought some of us liked dungeon stuff.

Hint: You can fix a lot of bad skill challenges by adding time pressure. Every failed attempt wastes time. Too many failures and time runs out. Convince the duke before he is called to the wedding that will cement his alliance with the enemy. Find the hidden crypt before the sun sets and the dead rise.

Next: Spinning a narrative around a skill challenge

Speed factor, weapon armor class adjustments, and skill challenges

(Part 3 of a series, which begins with Evolution of the skill challenge.)

The first edition of Advanced Dungeons & Dragons included lots of rules that no one used: weapon speed factor and weapon armor class adjustments. A little of that tradition lived on in the first year of fourth edition. No one played skill challenges exactly as written in the first fourth edition Dungeon Master’s Guide. At the very least, you did not start skill challenges by rolling for initiative.

According to the book, the Dungeon Master announces a skill challenge, the players roll initiative, and then take turns deciding on a skill to use and inventing a reason why that skill might apply to the situation. No one may pass a turn.

In short, everyone interrupts the D&D game and starts playing a storytelling game.

At Gen Con 2012, Robin D. Laws, one of the authors of the 4E Dungeon Master’s Guide 2, held a panel discussion on story advice. The Tome Show podcast recorded this panel as episode 201. When giving advice on running skill challenges, Robin Laws gives a succinct description of the original skill challenge.

“What I found myself doing when I was running 4E was putting a lot more onus on the players to describe what they were doing and make it much more of a narrative world-building than just here’s these particular obstacles that you have to overcome.
“‘You go on an arduous journey. Each of you contributes in a significant way as you’re going through the desert, and some of you wind up in a disadvantageous position. So tell me what it is you do to contribute to the survival of the party.’ And then I go around the table round-robin style and everyone would have to think of something cool and defining that they might have done.”

This flips the normal play style of D&D. Normally players encounter obstacles, and then find ways to overcome them. Now the players participate in the world building, inventing complications that their skills can overcome. I’m not saying this is wrong for a game. The market is full of storytelling games where players cooperate to tell stories, a process that can include taking turns inventing complications. This sort of collaborative storytelling may even be the preferred style of play for some D&D groups, though I have to wonder why those groups would choose to play D&D over a game that better suits their interests. I argue that for a lot of D&D players, this style no longer felt much like D&D, and that is why skill challenges evolved over the course of fourth edition.

Robin’s description of the players’ role in the skill challenge is particularly interesting. He says players search for “cool and defining” things they could do. That could be fun, but challenges never play out that way. Most players just search their sheets for their best skills and try to imagine ways to justify using them. I suppose under Robin’s coaching, or with a game that encourages that play style, players might seek out cool and defining things. Unlike D&D, story games can encourage that play style mechanically. For example, story games often have mechanics where you define your characters by simply listing their unique and interesting aspects. This might be as simple as coming up with a list of adjectives or keywords describing your character.

Neither D&D’s tradition nor the skill challenge mechanic encourages players to overcome the challenge by inventing cool and defining actions for their character. D&D’s mechanics encourage players to look for their highest skill bonus, and then concoct an excuse to use it. I am certain that Robin Laws and I both agree that this strategy makes D&D less fun than it can be.

He prefers a game where players share more of the narration and world-building role. Many fun games support that style of play, but D&D is not one of those games. (Robin mentions that his HeroQuest game inspires the way he runs skill challenges.)

When I play D&D, I want to immerse myself in the game world and think of ways to overcome obstacles. My actions might involve skill checks, but they often do not.

Less than three months after the 4E release, Mike Mearls began his Ruling Skill Challenges column. He writes, “In many ways, the R&D department at Wizards of the Coast has undergone the same growing pains and learning experiences with skill challenges, much as DMs all over the world have.” The column started a process of recasting the skill challenge, making it fit better with the usual D&D play style.

Next: The Dungeon Master’s Guide 2 remakes the skill challenge

The skill challenge: good intentions, half-baked

(Part 2 of a series, which begins with Evolution of the skill challenge.)

The fourth edition rules make the encounter the central activity of the Dungeons & Dragons game. The Dungeon Master’s Guide says, “Encounters are the exciting part of the D&D game,” (p.22) and encourages dungeon masters to shorten the intervals between encounters. “Move the PCs quickly from encounter to encounter, and on to the fun!” (p.105)

Page 105 includes more revealing advice. “As much as possible, fast-forward through the parts of an adventure that aren’t fun. An encounter with two guards at the city gate isn’t fun. Tell the players they get through the gate without much trouble and move on to the fun. Niggling details about food supplies and encumbrance usually aren’t fun, so don’t sweat them, and let the players get to the adventure and on to the fun. Long treks through endless corridors in the ancient dwarven stronghold beneath the mountains aren’t fun.”

Personally, I think that two of those activities do seem fun—especially the trek through the dwarven stronghold. I think the passage reveals something about how the 4E designers disastrously misread some of the audience for the fourth edition game, but that’s a topic for another post.

More to the point, the passage lists the sorts of interaction and exploration that skill challenges try to turn into encounters.

The 4E designers recognized that D&D includes more than combat, so they needed a game activity that gave players an opportunity to use skills and that held the same weight as the game’s core activity, the encounter. I imagine the 4E designers filling a white board with goals like these:

  • Skill challenges should be worth experience points to give them importance equal to a combat encounter.
  • Skill challenges need a difficulty and mechanical rigor similar to a combat encounter.
  • The skill challenge mechanic should enable every player to participate, not just the players with obvious skills.

The last goal reverses the early class balance of the game, in a good way. Through most of D&D history, some characters fared poorly in combat, but got a chance to shine in exploration and role playing. In the original game, thieves were not particularly useful in a fight, but fights were short and the players spent most of their time exploring, so the thief enjoyed plenty of time in the spotlight. In 4E, the rogue ranks as one of the most effective classes in combat, but every other class gets an equal chance to shine outside of combat.

The original skill challenge rules have players rolling initiative and taking turns. To make sure that everyone has a chance to contribute on their turn, players take the role of inventing circumstances where their characters can contribute. The turn structure ensures that everyone must contribute. You cannot pass a turn. “Characters must make a check on their turns using one of the identified, primary skills or they must use a different skill, if they can come up with a way to use it to contribute to the challenge.” (p.74) This often leads to strained justifications for skill checks.

“Does the chieftain like acrobatics? By using acrobatics and interpretive dance, perhaps I can convince him not to attack the village.”

As the name suggests, skill challenges focus on skills, not on the players’ problem-solving abilities. As I wrote in Player skill without player frustration, 4E attempted to eliminate frustration by emphasizing skill checks and skill challenges over concrete obstacles and over players’ problem solving skills. When every obstacle has a DC and multiple skills, then no one gets frustrated. If you find a locked door, you can pick the lock with thievery, or break the door with strength.

The designers saw another benefit of focusing on skills. Social skills such as diplomacy, bluff, and intimidate allow players who feel uncomfortable with play-acting to contribute without stepping out of their comfort zone. As a DM, I’ve encountered plenty of players who freeze up when I encourage them to speak as their character. I think they miss a fun aspect of the game, but I don’t force it. Nonetheless, I insist players say more than, “I diplomacize the king and I roll….”

Next: Speed factor, weapon armor class adjustments, and skill challenges

Evolution of the skill challenge

When Dungeons & Dragons fourth edition came out, I found a lot to like, and one thing I hated: the skill challenge mechanic—not the underlying idea of giving non-combat activities center stage, but the rules framework of the original skill challenge. As originally presented, the skill challenge seemed like an element from a story game awkwardly forced into D&D. I’m not saying that story game elements are wrong for another game, but they do not fit with the D&D game that I want to play.

Other fourth edition players may not have shared my dislike for the original skill challenge, but the mechanic certainly puzzled them. Less than three months after the 4E release, Mike Mearls began his Ruling Skill Challenges column. He writes, “In many ways, the R&D department at Wizards of the Coast has undergone the same growing pains and learning experiences with skill challenges, much as DMs all over the world have.” This column began a process of recasting the skill challenge into something that fit more comfortably into a classic D&D game. Aside from tweaks to the numbers, this process never changed the underlying rules for skill challenges—though it did ignore some.

Based on the new advice, I learned to run skill challenges that stayed true to the rules, but that felt more like classic D&D. I found that I could enjoy skill challenges and I think my players liked them more too. Nonetheless, the original skill challenge rules and the original format for describing skill challenges remained, and that foundation made running them well a bit harder.

In my upcoming series of posts, I will write about the original skill challenge, how the skill challenge evolved, and how something like a skill challenge might best appear in a future iteration of D&D.

Two Problems that Provoked Bounded Accuracy

One of the key design features of D&D Next is something the designers call bounded accuracy. Bounded accuracy reins in the steady escalation of bonuses to checks and attacks that characters received in earlier editions. I love bounded accuracy.

To explain my affection, I want to consider two problems with (nearly) unbounded accuracy in the third and fourth editions.

Third and fourth edition both assumed a steep and steady increase of plusses to your skill numbers as your character advanced. This rewarded you with a sense of accomplishment as you saw your character improve, but the increases led to problems at higher levels.

In third edition, at each level, characters received an allotment of points to improve selected skills. If you reached high level, and concentrated your improvements on the same skills, you gained huge bonuses to those skills.

The huge bonuses created a dilemma for dungeon masters and authors trying to set DCs for high level adventures. You could set very high DCs that challenged players who specialized in a skill. These DCs were impossibly high for non-specialists, so if the party lacked a specialist in a particular skill, the task became flat out impossible. Alternately, you could set DCs low enough to give non-specialists a chance, but these DCs granted the specialists an automatic success. (Again, by specialists, I just mean a character who concentrates skill improvements on the same skill, not a super-optimized character.)

Third edition assumes that the DM will justify the sky-high DCs required to challenge high-level specialists by describing obstacles of legendary proportions. At first level, the rogue must climb a rough dungeon wall; by 20th level, he must climb a glass-smooth wall covered in wet slime—in an earthquake. At first level, you must negotiate with the mayor; by 20th level, he’s king. And you killed his dog.

In the skill section of the third edition Epic Level Handbook, the epic-level obstacles become absurd. Here we find the DC for balancing on clouds, sweet talking hostile creatures into sacrificing their lives for you, and so on. I understand that some folks enjoy playing characters as mythic, godlike creatures, but to me, that game doesn’t seem like D&D anymore. Given the rarity of epic play, I suspect I stand with the majority.

Fourth edition tried to resolve the problem of high-level DCs becoming either impossible for typical characters or automatic for specialists. The system granted every character a flat, half-level bonus to checks. Now skilled characters maintained a flat +5 bonus when compared to their peers. Everyone enjoyed steady increases, but no one fell too far behind. This approach fixed the math, but when you compare characters of different levels, it defies logic and breaks your suspension of disbelief.

By level 10, a wizard with an 8 strength gains the same ability to smash down a wooden door as a first-level character with an 18 strength.

“Wow, Wiz, have you been working out?”

“Thanks for noticing. My strength will be 9 soon.”

Of course, Wiz never gets a chance to show off his new prowess, because those DC 16 wooden doors have all been replaced by level-appropriate, DC 20 barred doors.

In truth, the characters never really advance because they stand on a treadmill.

You can see the treadmill on page 126 of the 4E Rules Compendium in the “Difficulty Class By Level” table. Using this table, your character no longer gets better at easy checks, she just faces higher DCs. That table makes Living Forgotten Realms adventures work across entire tiers.

Fourth edition is inconsistent about whether the rising DCs in the “Difficulty Class By Level” table represent increasingly legendary barriers in the game world. For example, the DCs for breaking doors rise as the doors become sturdier. But social skills tend to be pegged to the DC-by-level table. (The system just assumes you killed the king’s dog.) The Living Forgotten Realms adventures used for organized play mostly abandon any attempt to flavor the rising DCs as increasingly legendary challenges. The challenges never change, just the DCs.

By reining in the scale of skill bonuses as characters advance, D&D Next solves both problems. The system does not reward players with the same magnitude of improvements as their characters advance, but the small improvements are real improvements, not steps on a treadmill.

While bounded accuracy solves problems, characters still need to stand out from their peers. A specialist should stand out and enjoy a chance to shine. The current ability bonuses are too small to achieve this. You can read my opinion on ability bonuses and checks here.