
A Game Design History of the Dump Stat

In 1974, Dungeons & Dragons introduced roleplaying games and—less significantly—dump stats, where players put their worst score in their least useful ability. According to the original D&D rules, players rolled abilities in order. Actually, by the rules as written, “it is necessary for the referee to roll three six-sided dice in order,” but everyone let players roll instead. Innovations like point-buy character generation or even rearranging rolled scores were years away.

Still, original D&D had dump stats of a sort. Fighters could trade Intelligence for Strength, the fighter’s “prime requisite.” Clerics could trade Intelligence for Wisdom. Magic users could trade Wisdom for Intelligence. Every class came with at least one potential dump stat, and these exchanges cost 2 or 3 points for 1 point of the prime requisite. When I first read those offers, the exchange rates struck me as a bad deal. I was wrong. None of those classes gained anything from their dump stat, so the trades only benefited the characters. In the original rules, Strength, Intelligence, and Wisdom just brought advantages to the class that used the ability as a prime requisite. (Intelligence brought extra languages; few players cared.) The rules prevented players from reducing Constitution and Charisma, but those abilities could help every character with more hit points or more loyal followers.

In 1977, the hand-to-hand combat game Melee by American designer Steve Jackson showed a different and influential approach to ability scores. Melee used just two attributes, Strength and Dexterity, but the scores brought bigger mechanical effects than in D&D. Strength permitted more damaging weapons and stouter armor, and it doubled as hit points. Dexterity determined to-hit rolls and who struck first. In this combat game, dueling characters needed to enter the battlefield evenly matched, so rather than rolling attributes, players bought them with points. Modern role-playing games virtually always let players build their characters, but in 1977 the point-buy system proved a massive innovation.

Also in 1977, the obscure game Superhero ’44 used a point-buy system. In Heroic Worlds (1991), D&D designer Lawrence Schick called that game “primitive,” but also “ground breaking.” Superhero ’44 even let players trade flaws for more points. “Characters who accept weaknesses or disabilities (Kryptonite, for instance) should be awarded with extra power.” This innovation spread to games like Champions (1981), GURPS (1986), and Savage Worlds (2003).

When I played Melee, I marveled at the balance between Strength and Dexterity. Every point moved between the two attributes traded a tangible benefit for a painful detriment, and the difficult trade-off made character generation fascinating. Just as important, the simple choice led to fighters who played differently but who proved equally effective. No other game would ever feature such a precise balance between ability scores, but with 2 scores and just one character type, Melee’s narrow scope helped.

A magic system to accompany Melee appeared in Wizard (1978). This addition introduced a third stat, Intelligence, but wizards still needed Strength to power spells and Dexterity to cast them. Intelligence became a dump stat for the original game’s fighters, while wizards gained enough from spells to offset the need to invest in three stats. When Melee and Wizard became The Fantasy Trip roleplaying game, IQ also bought skills, so some balance between stats remained.

Some games lump Strength and some of Constitution’s portfolio together. In both The Fantasy Trip and Tunnels & Trolls (1975), wizards drew from their Strength to power their spells, and since characters in both games increased stats as they advanced, experienced TFT and T&T wizards grew muscles as swollen as steroid-fueled bodybuilders.

Choosing ability scores introduced a complication avoided when players just roll: some stats prove more useful than others. Chivalry & Sorcery included an attribute for bardic voice. No one but bards would have invested there, and C&S lacked a bard class. Also, the attributes that power your character’s key abilities bring much more value than the rest. The original D&D rules recognized that factor in the unequal exchanges that let players increase their character’s prime requisites.

In Advanced Dungeons & Dragons (1978), the recommended technique for generating ability scores allowed players to rearrange scores any way they liked. For most classes, Intelligence just brought extra languages and Wisdom only gave a saving throw bonus against magic “involving will force,” so these abilities became favored places to dump low scores.

In D&D, the value of an ability score mainly comes from what it offers to classes that don’t require it. Constitution always comes out ahead because it adds hit points and improves a common saving throw. You may never see a fifth edition class based on Constitution because the attribute offers so much already. In earlier editions of D&D, Strength proved useful because every class sometimes made melee attacks. Nowadays, classes get at-will alternatives to melee attacks that use their prime requisite.

The value of ability score depends on what characters do in a campaign, and that adds challenge to balancing. In original D&D, shrewd players paid hirelings and henchmen to accompany their dungeon expeditions and share the danger. Characters needed Charisma to recruit and keep followers, so by some measures Charisma offered more benefits than any other attribute. But not every campaign played with hirelings. The 1977 D&D Basic Set skipped the rules for hiring and retaining help, so Charisma offered no value at all unless a DM happened to improvise a Charisma check—the game lacked formal rules for checks.

A similar factor makes Strength a common dump stat in fifth edition D&D. Strength provides the potentially valuable ability to carry more stuff, and more treasure, but few players even bother accounting for carrying capacity. The rules make dealing with encumbrance an optional variant. In the original D&D games, part of the challenge of looting the dungeon came from the logistical challenge of hauling out the loot. RuneQuest (1978) featured an encumbrance system that allowed characters to carry a number of “things” equal to their Strength before the weight hampered them. I remember the importance this system attached to Strength and the difficult choices of armor and equipment players faced. The secret to making Strength valuable is creating an encumbrance system that players use.

When encumbrance feels like an accounting exercise that players ignore, Dexterity becomes king. By selecting ranged or finesse weapons, a Dexterity-based character can approach the damage of a similar character based on Strength. Plus, a high Dexterity enables an AC nearly as stout as the heaviest armor, wins initiative, and improves common Dexterity saves rather than rare Strength saves.
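The damage gap between the two builds can be put in rough numbers. In this sketch, the weapon dice are the standard fifth edition values for a rapier (1d8, finesse) and a greatsword (2d6), and the +5 modifier is an assumed maxed ability score:

```python
def expected_damage(dice, sides, mod):
    """Average result of rolling `dice` d`sides` plus a flat modifier."""
    return dice * (sides + 1) / 2 + mod

# Dex build with a rapier vs. Str build with a greatsword,
# both adding a +5 ability modifier to damage:
rapier = expected_damage(1, 8, 5)       # 9.5 per hit
greatsword = expected_damage(2, 6, 5)   # 12.0 per hit
```

The Dex build gives up about 2.5 points per swing and gets back armor class, initiative, and the far more common Dexterity save.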

Fifth edition D&D makes Intelligence another common choice for a dump stat. Of the classes in the Player’s Handbook, only wizard requires Intelligence, a prime requisite that rarely figures in saving throws. (See If a Mind Flayer Fed on D&D Characters’ Brains, It Would Go Hungry. Should PC Intelligence Matter?)

Third edition D&D boosted the value of Intelligence by awarding smart characters more skills. The fifth edition designers probably weighed the same approach, but with skills serving as key traits in the two pillars of interaction and exploration, perhaps the designers opted to award skills equally to characters of any Intelligence. So unlike in earlier editions, high Intelligence no longer brings D&D characters more skills or even languages.

Obvious dump stats limit the choices that lead to effective characters. Dump stats encourage players to create characters that fit common, optimal patterns. A fifth edition D&D party may include a wide range of classes and backgrounds, but almost everyone fits the mold of healthy, agile folks with low-average Intelligence. And not even the barbarian can open a pickle jar. (He’s dex based.)

Fourth Edition Proved D&D Works Without Saving Throws, So Why Did They Come Back?

Fourth edition dropped saving throws in favor of to-hit rolls and showed that D&D works without saves.

Mathematically, to-hit rolls and saving throws just flip the numbers so that a high roll benefits the person rolling the die. Rather than having a lightning bolt trigger saves, why not just let wizards make lightning attacks against their targets? Why not just have poison attack a character’s fortitude?
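The flip is easy to verify by counting die faces. In this sketch, an attack that hits on a 12 or better and a save made on a 10 or better describe the same 45% chance; only the person holding the die changes. The target numbers here are illustrative, not drawn from any edition's tables:

```python
def attack_hit_chance(needed):
    """Attacker rolls a d20 and hits on `needed` or better."""
    return sum(1 for roll in range(1, 21) if roll >= needed) / 20

def save_fail_chance(needed):
    """Defender rolls a d20 and escapes on `needed` or better."""
    return sum(1 for roll in range(1, 21) if roll < needed) / 20

# "Hit on 12+" and "save on 10+" are the same odds, mirrored:
assert attack_hit_chance(12) == save_fail_chance(10) == 0.45
```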

By dropping saving throws, the fourth-edition designers eliminated a redundant mechanic. The change added consistency and elegance to D&D. Wizards finally got to cast spells and to make attack rolls.

If banishing saving throws made D&D more elegant, why did fifth edition bring them back? After all, the fifth-edition designers made elegance a key goal for their design. See From the brown books to next, D&D tries for elegance.

Until fourth edition, saving throws survived based on tradition and feel.

The tradition dates to when Tony Bath had toy soldiers saving versus arrows. (See my last post.) The fifth-edition designers aimed to capture tradition, but also the best qualities of earlier editions. Why not capture some of the elegant design of fourth edition?

The feel comes from a sense that the player controlling the most active character should roll the dice. D&D could drop to-hit rolls in favor of saves versus swords, but that feels wrong. On the other hand, characters seem active when they resist a charm, shake off a ghoul’s paralysis, or spring away from rushing flames. Sure, a wizard is saying magic words, a dragon is exhaling, but the action focuses on the heroes escaping the flames.

Plus, the saving throw mechanic tends to send a few more rolls to the players. Players like to roll dice, especially when the roll decides their character’s fate. When attack rolls replaced saving throws, spellcasters got to make more attack rolls, but most characters lack spells. Without saving throws, players flamed by dragon breath never get to take fate in their hands and roll a save. Instead, they just subtract damage.

So saving throws returned to D&D.

If saving throws and attack rolls share a common place in the game, what makes them different from each other?

As a dungeon master, have you ever asked a player dodging a trap’s darts to make a dexterity or reflex save? I have. I handled it wrong. Don’t fault me too much. A save gives a character a chance to escape. Characters springing away from darts or scything blades or falling stones seem to deserve a save. But that intuition is wrong. Such traps should make attacks. The Dungeon Master’s Guide never spells out this distinction.

Just as the reflex defense and AC in fourth edition defended against different sorts of attacks, in fifth edition, dexterity saves and armor class apply to different hazards. The difference comes from armor. D&D’s lead designer Mike Mearls explains that to determine whether to use an attack roll or a save, ask “Would a suit of plate mail protect from this?” Armor protects against darts, scythes, and so on, so traps using such hazards make attacks. Poisonous fumes, lightning, and mind blasts all ignore armor, so targets make saves. I would rather face a fireball protected by plate, but the rules emphasize the agility needed to escape the flames.

Originally, Tony Bath’s saving throws represented the value of armor. Now, saving throws only apply when armor can’t help.

Mearls confesses that the D&D rules don’t always make this save-or-attack distinction consistently. Plate mail certainly protects against falling rocks, and the falling-rock traps in the third-edition Dungeon Master’s Guide all make attacks. But the falling-rock traps in Lost Mine of Phandelver prompt dexterity saves. Better to leap from harm’s way, I suppose.

One area of inconsistency irks me.

Why should plate armor protect against the incorporeal, life-draining touch of creatures like specters and wraiths? Here, tradition and feel led the D&D designers to use attack rolls in a place where saving throws make more sense. If insubstantial creatures forced a target to make a dexterity saving throw, their life draining would imitate third edition’s touch attacks without a single extra rule. Plus, these undead would play like more distinct and interesting threats. Forget the feel of a to-hit roll, incorporeal creatures should force saving throws.

For 25 Years, D&D Put Saving Throws In Groups Made For Just 3 Creatures and 2 Spells

Today, Dungeons & Dragons matches saving throws to ability scores. But for most of the game’s history, D&D grouped saving throws by 5 sources: spell or staff, wand, death ray or poison, turned to stone, and dragon breath. These categories and the game’s saving throw table made me wonder: What made clerics so resistant to poison? Of all possible threats, why set aside a specific save for being turned to stone? (Also, “stone” isn’t even a source, just shorthand for a list of creatures.) Dragon breath seemed overly specific too, but at least the game’s title featured dragons. And how could the remaining 3 types of save cover every other hazard in a fantastic world?

Gary Gygax based his Chainmail rules on a 1966 pamphlet by Tony Bath titled Rules for Medieval Wargames. In those rules, attackers make a roll to hit, and then defenders roll a “saving throw” to see if their armor protects them. The heavier the armor, the easier the save.

Attack rolls in Dungeons & Dragons combine Bath’s to-hit roll and saving throw. But Gary reused the name “saving throw” for another, similar purpose. This rework began in Chainmail, the mass-battle rules that formed the basis for D&D. In Chainmail, creatures lacked hit points, so successful attacks destroyed units. But with extraordinary individuals like heroes, wizards, and dragons, a saving throw allowed a last chance to survive. For example, the rules say, “Dragon fire will kill any opponent it touches, except another Dragon, Super Hero, or a Wizard, who is saved on a two dice roll of 7 or better.”

Chainmail’s rules for spells, magical attacks, and similar hazards led to saving throws in original D&D. The first Dungeon Master’s Guide explained the concept. “Because the player character is all-important, he or she must always—or nearly always—have a chance, no matter how small, of somehow escaping what otherwise would be inevitable destruction.”

“Someone once criticized the concept of the saving throw as ridiculous,” Gary wrote. “Could a man chained to a rock, they asked, save himself from a blast of red dragon’s breath? Why not? I replied. Imagine that the figure, at the last moment, of course, manages to drop beneath the licking flames, or finds a crevice in which to shield his or her body, or succeeds in finding some way to be free of the fetters. Why not?” Saving throws grant a last chance, but they leave the details of why a save worked to the storytelling of the player and DM. The brilliance of hit points comes from their abstraction, and in original D&D saves are just as abstract as hit points.

Chainmail only includes saves for 4 effects: dragon breath, basilisk gaze, spider poison, and spells (just fireball and lightning). Original D&D turned those 4 saves into categories, and then added “wands” as a 5th.

The detailed numbers in the saving throw table suggest that Gary set saves to simulate nuances of the game world. However, the original saving-throw table reveals no nuances and few patterns. Fighters grow to become the best at dragon slaying. Wizards gain the best spell saves and so become best at dueling other spellcasters. But why, say, should magic users boast the best resistance to petrification? And why does the death ray from a Finger of Death spell get grouped with poison rather than staying in the tougher spell category? Apparently, when Gary Gygax told players to save or die, he also gave them a break. Go figure.

In Advanced Dungeons & Dragons, the revamped saving-throw matrix reveals more patterns.

Clerics show the best saves versus paralyzation, poison, and death magic. Perhaps divine favor helps spare them from death.

Magic users start with the best saves and end with the worst. Did Gary intend to help low-level mages and to balance high-level wizards?

Fighters start with the worst saves and end with the best. Was this designed to balance the class, or just to simulate the resilience of a front-line survivor?

Although the saving throw categories lasted 25 years, these types proved clumsy.

As D&D added countless threats, the designers folded them into the same 5 categories descended from Chainmail. For dungeon masters, these peculiar categories made guesswork of choosing the right save for a new effect.

In AD&D, characters with high wisdom or dexterity gained bonuses to saves that demanded willpower or agility. But neither bonus matched a saving-throw category. The bonuses could apply to some saves in multiple categories. Remembering these bonuses—and sometimes gaining the DM’s approval to use them—added an awkward extra step.

Third edition switched to grouping saving throws by the aptitude needed to survive the danger. Now a high wisdom or dexterity cleanly benefited will or reflex saves. Finally, a high constitution helped shake the effect of poison.

The switch made saving throws simpler, but they became less abstract. The odds of someone chained to a rock using quick reflexes to survive dragon fire seemed lower than ever—low odds that fifth edition simulates by imposing disadvantage on a restrained character’s dexterity save.

When I imagine someone making a reflex or dexterity save, I picture someone leaping and tumbling from rushing flames. Then I look at the battle map and see a figure in the same square. In my overly-complicated version of D&D, evasion would make targets move out of an area of effect.

Next: Fourth edition proved D&D works without saving throws, so why did they come back?

Would You Play With a Dungeon Master Who Kept Your Character Sheet and Hid Your PC’s Hit Points?

Have you heard of dungeon masters who keep character sheets from players and who make all the die rolls? Instead of revealing hit points, these DMs say, “Your character feels badly injured and close to death.”

To improve a TV audience’s immersion and to avoid numbers, the Dungeons & Dragons games on Community adopted this style. The creator of Community, Dan Harmon, brought this style to his HarmonQuest live-play show. For performance, the style makes sense.

Some real DMs also adopted the style for simulation and immersion. They explained that the characters don’t know their numbers, so why should the players? In theory, hiding the mechanical guts of the game lets players focus on the game world and on immersing themselves in their characters.

In practice, when a DM takes such measures, players see a control freak. Players worry that the DM will fudge numbers to force a plot. But even when players trust their DM’s impartiality, the hidden numbers create discomfort. The game rules serve as the physics of the characters’ world. When the DM hides numbers and mechanics, the players lose some ability to make good choices for their characters. They feel robbed of control.

Also, everyone likes to roll their own dice.

Aside from performing D&D for an audience, most stories of DMs hiding the game’s numbers date from role playing’s early days. Then, gamers experimented with styles of play that no longer seem appealing. In White Dwarf issue 75 from 1986, an article titled “Gamemanship” recommended preventing players from reading the game rules. “Players who haven’t read the rules will be unable to spring anything ‘new’ on you.” The author, Martin Hytch, aims for better role playing, but he seems like a control freak.

Nowadays, tales of DMs who hide the game’s numbers from players seem like legend. Any DMs committed to the style probably wonder why no one wants to join their game.

But every DM weighs how many game numbers to share with players. My research turned up contemporary game masters willing to hide a character’s hit points from their players.

Martin Hytch would approve. “Telling a fighter he has lost eleven hit points can have a totally different effect if the DM says, ‘The beast strikes you in the face, breaking your nose.’” I suspect few players share Martin’s devotion to immersion.

A mere description of damage leaves players confused about their characters’ conditions. The broken-nose example falls particularly short. In the DM’s estimation, does the injury leave a character halfway to death, or just a little battered? The DM knows. In the game world, the fighter knows. Only the player feels baffled.

The characters see, hear, smell, and touch the game world. They sense more of their world than even the most vivid description shows the players. The characters bring years of training and experience. They know nothing of hit points, but hit point numbers provide a measure to bridge the information gap between a character living the battle and the player at the table.

DMs rarely hide hit-point and damage numbers from players, but most DMs conceal difficulty classes. Until recently, I kept DCs to myself, but now I typically share them.

As with hit points, the difficulty class number helps span the gulf between a character’s vivid sense of the game world and a player learning from a DM’s description. When a rogue decides whether to climb a wall, she can see the bricks, mortar, and slick condensation. She can compare to walls she has climbed. At the kitchen table, a DC just sums a character’s experience.

Some folks object to sharing DC numbers because they feel numbers hinder immersion. But hiding the DC leaves plenty of immersion-foiling game in the check. The player still looks up numbers on the character sheet, still rolls a die, and still adds the result to a number from that same sheet. What’s another number?

In games like Runequest, Call of Cthulhu, and GURPS, players make checks by rolling under a skill number. These games put a chance of success on the character sheet and make hiding difficulties cumbersome. These games still thrive. Even though players almost always know their chance of success, no one accuses Call of Cthulhu of undermining immersion.

Characters might have more trouble telling the odds of making a saving throw than the difficulty of a jump or climb, but the benefits of revealing save DCs encourage me to reveal them too.

Especially with a high-stakes save, revealing a DC heightens the drama of a die roll. When a character’s fate rests on a roll, when a roll seizes the table’s attention, a player can figure the number they need before throwing the die. A known DC tells players that the DM can’t fudge the line between success and failure. Then the moment the die lands, everyone knows the outcome without asking the DM for an interpretation. By revealing a DC, the DM sides with the players. No one sees where the next roll will take the game. Only the dice decide.

As a DM, revealing a DC frees me from any urge to nudge the narrative by moving a hidden target to land a success or failure. My transparency shows players that their characters’ fate rests on their choices and the luck of the die rather than on a DM’s whims.

Revealing DCs also speeds those situations where several players need to make a roll. Instead of forcing each player to report their roll to learn an outcome, just announce the DC and let them figure the result.

None of this applies when the characters can’t know the difficulty of a task. Don’t reveal DCs for checks…

  • made to gain information using skills like Insight and Perception.
  • involving a non-player character’s state of mind, such as with Persuasion and Deception.
  • where characters know too little to estimate a difficulty.

The rest of the time, try sharing DCs. I think it makes my game better.

Saving throw proficiency and ghouls

Even at the end of the Dungeons & Dragons Next public playtest, the designers wrestled with one aspect of Next that remains broken: the ghoul problem. A live-streamed playtest session showed the problem when 4 ghouls faced a party of fifth-level characters and threatened a total-party kill. Mike Mearls recounts, “The thing that irritated me the most about it was I think that this fight would be just as hard if you were 10th-level characters. Four ghouls jumping on a 10th-level cleric, as opposed to a 5th-level cleric, would have had roughly the same ability to take you down.” When ghouls hit, they force their targets to save versus paralysis. One failed save removes you from the fight. Because armor class doesn’t rise much from level to level, ghouls can hit even high-level characters. The damage doesn’t endanger the heroes with high hit points, but the saving throws still stand.

If your character lacks the proficiency needed to shrug off constitution saves, then one hit can easily drop you from the battle. As the game stands, most classes enjoy proficiency in just two of six types of save. Without proficiency, your 20th-level archmage suffers as poor a chance of shrugging off the ghoul’s touch as a level 1 initiate. Even with a +6 proficiency bonus, my money is on the ghouls.
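A rough probability sketch shows why my money is on the ghouls. The paralysis DC of 13 here is a made-up but plausible number, and each of the four ghouls is assumed to land one hit:

```python
def save_chance(bonus, dc):
    """Chance that a d20 + bonus meets or beats the DC."""
    return max(0, min(20, 21 - (dc - bonus))) / 20

untrained = save_chance(0, 13)   # no proficiency: 0.40 per save
trained = save_chance(6, 13)     # +6 proficiency bonus: 0.70 per save

def paralyzed_at_least_once(p_save, hits):
    """Chance of failing one or more of `hits` independent saves."""
    return 1 - p_save ** hits

# Over four hits, the proficient archmage still freezes about 76%
# of the time; the untrained character, about 97% of the time.
```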

Of course the problem isn’t unique to ghouls. It applies to anything that makes attacks that force saves.

The designers recognize this problem and the final version of Next will feature a fix. Mike explained how the game should play. “As creatures become lower level relative to you, their damage attacks remain a threat, which is nice because that’s a threat to all characters, but their special effects start to fade out. Like lower level characters worry about ghoul paralysis, higher level ones don’t because they know that they can probably make the save. The DC is low enough; their bonus is high enough.

“One of the things I’ve been thinking of is if we just did something simple, like you add half your character level to all your saving throws. And so then we know saving throw DCs scale up a bit. The important thing for me being low-level creatures can have lower DCs; high-level creatures can have higher DCs, just like you kind of expect and that fits into what should be going on in the game.”
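The fix Mearls floats is easy to sketch. With made-up numbers, say a +0 Constitution modifier against a hypothetical DC 13 ghoul, adding half your level makes the old threat fade as you advance:

```python
def success_chance(bonus, dc):
    """Chance that a d20 + bonus meets or beats the DC."""
    return max(0, min(20, 21 - (dc - bonus))) / 20

def save_bonus(level, ability_mod, half_level_fix=True):
    """The proposed fix: add half your character level to every save."""
    return ability_mod + (level // 2 if half_level_fix else 0)

chances = {lvl: success_chance(save_bonus(lvl, 0), 13) for lvl in (1, 10, 20)}
# {1: 0.4, 10: 0.65, 20: 0.9}: the DC stays put while the save scales,
# so the low-level monster's special effect fades without new rules.
```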

Another possible fix could allow characters with hit points above a certain threshold to save automatically. While such a threshold mechanic probably won’t apply to the lowly ghoul, I expect to see it apply to various save-or-die effects.

Update: (June 19, 2014): In the Dungeons & Dragons Q&A: Starter Set and Basic rules, Mike Mearls says that the saving throw rules remain unchanged from the final playtest. This means two things:

  • The fifth edition designers have an obsessive devotion to minimal core rules. The 5E design makes several compromises to enable the game to resolve everything using the same ability modifiers and proficiency bonuses.
  • Ghouls and other monsters that force saves against debilitating conditions will need designs that limit their lethality. For example, a threshold mechanic that allows characters with hit points above a certain threshold to save automatically.

Next: 9 popular things in D&D that I just don’t understand

D&D Next trades to-hit bonuses for enhanced damage

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

As I discussed in “Riding the power curve,” the next iteration of Dungeons & Dragons attempts to straighten out fourth edition’s logarithmic power curve by refusing to let characters benefit from both steep bonuses to hit and big increases to damage. Instead, characters mostly get increases to damage.

When we compare D&D Next to early editions, Next limits the to-hit bonuses characters gain as they advance in exchange for greater bonuses to the damage they inflict.

Before I delve into the benefits and drawbacks of this exchange, I ought to address two practical objections to trading to-hit bonuses for damage.

Should skill increase damage?

Some argue that a more skillful combatant’s blows should not deal more damage. After all, a crossbow bolt always hits with the same force, so it should always strike with the same damage. Personally, when I’m struck by a crossbow bolt, I care deeply about where it hits. Maybe that’s just me.

As I explained in “The brilliance of unrealistic hit points,” hit points in D&D work as a damage-reduction mechanic. As characters increase in level, their rising hit points reduce the effective damage they suffer. Reasonably, as characters increase in level, they could also grow better at inflicting damage by overcoming defenses to strike vulnerable places or to apply more force to a blow. I’m no Miyamoto Musashi, but I’ve earned enough bruises sparring with practice swords to know that finding an opening to tap an opponent demands less skill than finding enough room for a kill strike—or even a cut.

And if you worry about unusual cases of oozes struck by crossbows, adjust at the table.

“The primary thing when you take a sword in your hands is your intention to cut the enemy, whatever the means. Whenever you parry, hit, spring, strike or touch the enemy’s cutting sword, you must cut the enemy in the same movement. It is essential to attain this. If you think only of hitting, springing, striking or touching the enemy, you will not be able actually to cut him.” Miyamoto Musashi, The Book of Five Rings

Hits inflict more than damage

In D&D, a hit can bring the threat of poison, level drain, and many other secondary effects. In these cases, the attack’s damage matters less than dealing the hit. A higher level character’s chance to hit improves less, so their chance of inflicting secondary effects sees little improvement.

This matters, but it matters less than you may think.

First, to-hit rolls take a much smaller place in D&D Next than in 4E. D&D Next switches from non-AC defenses back to saving throws. Virtually all spell attacks return to skipping the to-hit roll entirely.

Second, attacks versus AC return to focusing on damage. To an extent, I liked how 4E added tactical richness to combat by devising interesting attacks. However, for my taste, too many effects appeared in play. I grew tired of seeing combatants perched on stacks of Alea markers, unable to do anything but stand and make saves.

In D&D Next, as in early editions, weapon attacks mostly inflict damage, and the attacks that threaten something like poison or level drain usually come from monsters.

Third, the saving throw returns as a defense against bad things other than damage. In 4E, hits against AC can inflict crippling effects without saves. Just getting hit automatically subjects you to poison, or paralysis, or whatever. In older editions, when the spider bit or the ghoul clawed, you took the damage but you also saved versus poison or paralysis. I appreciate 4E’s streamlined system, but dropping the defensive saving throw contributed to battlefields bogged down with more conditions and other markers than even the designers anticipated.

D&D Next brings back saving throws as a defense against effects like poison and level-drain. We no longer need to rely on to-hit rolls as the final test of whether a poisoned dagger drew enough blood to overcome your constitution. Because monsters make most of the attacks that poison, paralyze, drain, and so on, most players should be happy to see the save return. Plus, despite the extra roll, the save probably speeds play by reducing the number of harmful conditions that take effect.

Despite these three points, in D&D Next, your high-level character is weaker when she makes attacks versus AC to inflict crippling effects. If I were to design, say, a poisoner class, I would make their chance to hit nearly automatic, and focus on saving throws as the principal defense against poison.

Next: Changing the balance of power