Never split the party—except when it adds fun

Everyone who plays role-playing games learns the Dungeons & Dragons adage never split the party.

In the hobby’s early days, when dungeon masters were referees and players chose difficulty by dungeon level, never splitting the party always made good strategy. Parties found safety in numbers.

The danger of splitting the party

In a dungeon stocked with encounters suited for a full party, splitting the party jeopardizes everyone. But despite the adage, players sometimes find reasons to split the party. New players and kids always seem tempted.

Faced with a divided group, some dungeon masters will scale the challenges for smaller groups. Typically, I don’t. I only shrink the challenges for those new players and kids.

Experienced players who split up know they’re taking an extra risk. They feel a sense of jeopardy that the usual game can’t match. They use stealth and cunning in ways they might not with a full group, when they assume they can defeat any monsters set before them. I don’t want to lose that sense of peril, or to block their chance to approach the game differently. In a way, adjusting threats steals the players’ agency by nullifying the consequences of their actions.

Why split the party?

In today’s game, player characters do more than assault dungeons. Sometimes the elf and wizard must persuade the elven emissary, the thief and warlock need to infiltrate a manor house, and the bard and noble paladin need to charm guests at a ball. They could work better separately, but players insist on keeping the party together. So the dwarf insults the emissary, the paladin’s chainmail racket alerts the manor guards, and a motley band of killers sours the ball. Then midnight tolls and evil triumphs.

Game masters often avoid challenges suited to split parties, but I invite them. Sometimes I relish a chance to split a party.

Splitting the party can give soft-spoken players a chance in the spotlight. Players gain unique chances to reveal their characters’ personalities and talents.

Way back in a post on skill challenges, I suggested using time pressure to force each PC to participate. “If the characters only need to gain the support of the head of the merchant council, then typically one player makes all the diplomacy rolls. If the characters must split up to convince every member of the merchant council before their vote, then every player must contribute.” Formal skill challenges are gone, but forcing a party to divide and conquer still invites everyone to contribute.

One limitation of role-playing games is that even when the entire party joins a role-playing scene, typically only one or two players do the talking. The rest watch. Sometimes players find themselves overshadowed by players with more forceful personalities. Splitting the party gives more players a solo. Meanwhile, the thief finally gets to sneak. The wizard finally gets to cast Sending.

If done well, splitting the party creates more spotlight time for every player at the table. More on that later.

Why keep everyone together?

Never split the party started as good strategy, but now it feels like part of the game’s social contract. Even when splitting the party seems logical, players keep the group together for three metagame reasons.

1. Players fear encounters designed for a full party.

Players expect combat encounters designed to challenge a group of 4 to 7 characters. If they split up before a fight erupts, then an undermanned party becomes overmatched.

But that happens less often than you think, because you, as a game master, see the situations that invite splitting the party and can plan challenges for smaller groups.

2. Players stay together as a courtesy to the game master.

By staying together, players avoid forcing the GM to juggle two separate narratives.

For the GM, balancing two threads can be fun—in the right situation. For a split to work, either (1) it cannot take more time than the idle players need to grab a snack, or (2) each subgroup needs to meet separate challenges. You can’t leave half of the party inactive for more than 5 minutes.

So the trick of handling a split party comes from devising situations that keep each part of the group busy. If someone goes to scout while the party rests, either the scouting should finish by the time the idle players grab a drink, or something had better stumble into the campsite.

3. Players stay together to keep everyone involved in the action.

A split party inevitably forces some players to wait until the spotlight returns to them. To minimize the problem of downtime, use two techniques.

Cut between scenes

Cut from one group to the next every 2-4 minutes. Some GMs advise setting a timer for about 4 minutes. If you tend to lose track, then a timer helps, but I prefer to use my own sense of time and pacing to switch scenes.

Every role-playing game reaches moments when the players make plans while the GM sits idle. Those moments bring my favorite times to switch scenes. While players debate their next move, I cut to the other half of the table. This sort of switch keeps half the players busy planning while the rest act. Instead of waiting for decisions, I can give more players time in the spotlight. The tempo of the game feels faster.

If I can’t switch scenes on a decision point, I switch on a moment of tension, ideally a cliffhanger.

Delegate the monsters to the idle players

Depending on your players’ dispositions, you might recruit idle players to run monsters in a battle. This works especially well in a simple fight where you expect the PCs to win. If the foes bring complicated abilities or motives, or if their power threatens to slay characters, I would avoid giving up control. When a GM kills a character, it comes in the line of duty, but a player should not take the heat for killing a PC.

If half the party lands in a fight, then the split plays best if the other half finds a battle too. You can run two fights on two maps with the same initiative count.

If you run simultaneous fights and let the players run the monsters, then you can leave the room for a drink. Your greatest GM triumphs often come when you have nothing to do.

Game master Rich Howard goes beyond letting players run foes. He casts idle players as the non-player characters who interact with the rest of the party. I admire the approach, but I feel unready to surrender so much of the game world.

Splitting the room

Even when you split a party, players tend to remain at the same table. This lets inactive players watch the story and lets the GM switch easily from one subgroup to another.

While sharing a table, the spectators learn things that their characters don’t. Most players take it as a point of honor not to use their unearned knowledge. If not, remind them to play in character based on what their character knows.

Separating players into different rooms can add fun, though. No player overhears information their character lacks, so decisions become more interesting. Everyone feels an added sense of peril and concern for their missing comrades.

If you do separate players, you still need to switch groups every 2-4 minutes, so the groups should be as near as the kitchen and the dining room. Make the separation temporary. Your players came to play together.

Back when phones featured dials, I would separate players to sow suspicion about what other party members could be plotting. This fit the early game, when players betrayed each other for loot. Now such mind games only fit Paranoia sessions. Now I insist that my D&D players contrive reasons to cooperate.

Split the party

So split the party. For a GM running a divided party, the second hardest trick comes from finding situations where all the subgroups remain engaged. The hardest trick? Encouraging the players to defy protocol and split up when splitting makes sense.

Why second-edition Dungeons & Dragons dropped thieves and assassins

I have only run an evil-themed D&D campaign once, and only because Wizards of the Coast cornered me. They released the Drow Treachery cards and the Menzoberranzan campaign book and promoted the products with the Council of Spiders season of Dungeons & Dragons Encounters. I’ve served as a dungeon master for every season of Encounters and never considered skipping Council of Spiders, but I questioned the wisdom of promoting an evil, backstabbing campaign, especially in a program geared for new and returning players. My concerns proved valid. Two of the regulars at my table seemed uncomfortable with the evil theme, and one player, call him Benedict, embraced the spirit of the treachery too well.

Lloth and Drow at Gen Con

In the final encounter, Benedict joined the season’s villain and killed the rest of the party. “It’s not personal. I’m just playing my character,” he apologized. Over the years, when someone excuses their character’s actions with “I’m just playing my character,” I’d grown to expect trouble. This time, two regular players from my table never came to encounters again. Maybe they had other obligations, but I suspect the unsatisfactory season contributed to them moving on.

I cannot blame Benedict. Like him, I started in the early years of the hobby, an era that celebrated a character’s ability to attempt any action, and where simulation dominated role playing. How better to simulate an imaginary world than to portray characters of all stripes? By this early ethos, total immersion in character trumped everything. If you failed to play your character to the hilt, then you did the game a disservice. Any game master who interfered with a player’s freedom of action was guilty of an abuse of power. If the player’s actions defied her alignment, penalties might be in order, but if not, anything goes.

And the Council of Spiders Encounters season encouraged treachery.

Still, I should have discouraged Benedict’s betrayal. Some players relish in-party conflict, but unless everyone at the table welcomes such conflict, in-party feuding just encourages hard feelings and lost friends. Folks who welcome treachery should play Paranoia, a game invented for the play style.

Before second edition, D&D promoted classes that fostered party conflict. With thieves and assassins, the trouble begins with class names that encourage bad behavior. What sort of thief fails to steal, and who presents richer targets than the rest of the party? What sort of assassin fails to murder?

As soon as thieves and assassins reached playtesting in 1974, Gary Gygax’s Greyhawk campaign saw trouble. On the EN World forums Gary reminisced, “One or two assassin PCs were played, but the party was always chary about them. Minor pilfering of party treasure was tolerated but having a PC offed by an assassin was most annoying. That happened once, maybe twice, with the offending PC then leaving the game, the player returning as a different character.”

Even as late as 1985’s Unearthed Arcana, the original barbarian class provoked trouble: “Barbarians in general detest magic and those who use it. They will often seek to destroy magic items, and if successful, they receive an experience point award as if they possessed the destroyed items.” What could possibly go wrong?

The designers of D&D’s second edition started moving away from classes with names that encouraged trouble. In a podcast recalling second edition’s design, Steve Winter says, “The assassin went away because we had seen through letters from customers and talking to people so many cases of assassins ruining campaigns. People who played assassins felt like that was carte blanche to murder their fellow player characters. We got all the time letters from people asking what do I do with this player? He wants to play an assassin, but he keeps assassinating the other PCs.”

In third edition, “thieves” became “rogues” to discourage similar mischief. Steve Winter explains, “When you’re sitting around the table and the thief player is getting a little bored, and there is another PC standing right in front of him… I can’t count the times that I was at the table and somebody was like, ‘I’m going to pick his pocket.’ And right away everyone is like, ‘Oh don’t, please don’t,’ because everyone knows it’s just going to cause problems within the party.”

Of course, you don’t have to play a thief or assassin to “just play your character,” and to instigate fights among the party. In the Legacy of the Crystal Shard Encounters season, one player embraced the corruption of the black ice and seemed tempted to disrupt the party. This time, I felt willing to forbid any action that would make the players war amongst themselves. But first, I set in-game events that challenged the character to choose between the black ice and his other loyalties, and to the player’s credit, he chose to cast aside the corruption.

Games of Paranoia aside, I no longer see “I’m just playing my character” as an excuse for disruptive play.

[February 15, 2014: Updated to indicate that “thief” became “rogue” in third edition.]

Next: A role-playing game player’s obligation

The brilliance of unrealistic hit points

(This post continues a discussion I started in “What does D&D have to do with ironclad ships?”)

After the role-playing game hobby’s first 10 years, designers turned from strict realism and began to design rules that both supported a game’s flavor and encouraged its core activities. Runequest’s realistically lethal combat system fit the fearful world of Call of Cthulhu (1981), as did a new sanity system. Paranoia (1984) built in rules that encouraged a core activity of treachery, while giving each character enough clones to avoid hard feelings.

Today, this innovation carries through stronger than ever. Dungeons & Dragons’ fourth-edition designers saw D&D’s fun in dynamic battles and showing off your character’s flashy capabilities, so they optimized rules that heightened that aspect of the game, possibly at the expense of other aspects.

When Dave Arneson mashed rules for ironclads into Chainmail, he probably gave little thought to supporting the D&D play style that would launch a hobby, but he created some brilliant conventions.

The best idea was to give characters steadily increasing hit point totals that “reflect both the actual physical ability of the character to withstand damage—as indicated by constitution bonuses—and a commensurate increase in such areas as skill in combat and similar life-or-death situations, the ‘sixth sense’ which warns the individual of otherwise unforeseen events, sheer luck and the fantastic provisions of magical protections and/or divine protection.” (Gary wrote this rationale for hit points in the first edition Dungeon Master’s Guide.)

Every “realistic” system to follow D&D used hit points to measure the body’s physical capacity to survive injury. In D&D, rising hit points work as an elegant damage-reduction mechanic. Using hit points for damage reduction boasts a number of virtues:

  • Combat plays fast because players do not have to calculate reduced damage for every single hit.
  • Although damage is effectively reduced, the reduction never makes a combatant impervious to damage.
  • Once characters gain enough points to survive a few blows, hit points provide a predictable way to see the course of battle. If a fight begins to go badly, the players can see their peril and bring more resources like spells and potions to the fight, or they can run. In a realistic fight, things can go bad in an instant, with a single misstep resulting in death.
  • Most attacks can hit and inflict damage, providing constant, positive feedback to players while everyone contributes to the fight. Realistic combatants do not wear down from dozens of damaging blows; instead each hit is likely to kill or maim. In more realistic systems like Runequest and GURPS, when two very skilled combatants face off, they block or dodge virtually all attacks. The duels turn static until someone muffs a defense roll and lets a killing blow slip through. This model may be realistic—it reminds me of those Olympic competitions where years of training turn on a single, split-second misstep—but the realistic model lacks fun. No popular sports begin as sudden-death competitions where the first to score wins.
  • Battles can gain a dramatic arc. Fights climax with bloodied and battle-worn combatants striving to put their remaining strength into a killing blow. No one likes to see the climactic battle fizzle with a handful of bad rolls, especially at their character’s expense.

Bottom line: Using hit points for damage reduction enables a combat system where you can hit a lot, and hitting is fun.
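The contrast between the two models can be sketched in a toy simulation. The numbers below are illustrative assumptions, not drawn from any edition’s actual tables: the D&D-style veteran has a deep hit-point pool and attacks that usually land, while the “realistic” duelist has a tiny pool and parries nearly everything.

```python
import random

def simulate(hp, hit_chance, damage, rng):
    """Attack a combatant until they fall; count total attacks and landed hits."""
    attacks = hits = 0
    while hp > 0:
        attacks += 1
        if rng.random() < hit_chance:
            hits += 1
            hp -= damage
    return attacks, hits

rng = random.Random(42)

# D&D-style model: large hit-point pool, most swings connect for chip damage.
dnd_attacks, dnd_hits = simulate(hp=60, hit_chance=0.65, damage=8, rng=rng)

# "Realistic" model: small pool, almost every attack is blocked or dodged.
real_attacks, real_hits = simulate(hp=10, hit_chance=0.05, damage=8, rng=rng)

print(f"D&D-style: {dnd_hits} hits over {dnd_attacks} attacks")
print(f"Realistic: {real_hits} hits over {real_attacks} attacks")
```

Under these assumptions the D&D-style fight lands a hit on most rolls, giving the steady feedback and visible arc described above, while the realistic duel whiffs for long stretches and then ends abruptly after a couple of killing blows.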

Critics of inflated hit points still had a point. Using hit points as a damage-reduction mechanic can strain credulity, especially when you cannot explain how a character could reasonably reduce the damage he takes. Why should an unconscious or falling hero be so much more durable than a first-level mook? Why does cure light wounds completely heal the shopkeeper and barely help a legendary hero? Over the years, we’ve seen attempts to patch these problems. For example, I liked how fourth edition’s healing surge value made healing proportional to hit points, so I’m sorry to see D&D Next turn back to the traditional hierarchy of cure spells.

D&D maintains a deliberate vagueness about the injuries inflicted by a hit. This abstraction makes possible D&D’s brilliant use of hit points as a damage-reduction mechanic. Fourth edition exploits the ambiguity more than ever, making plausible the second wind and the healing power of a warlord’s inspiration. 4E explicitly makes hit points as much a measure of resolve as of skill, luck and physical endurance. Damage apparently exists as enough of an abstraction that even if a hit deals damage, it doesn’t necessarily draw blood.

Even as 4E aims for the loosest possible interpretation of a hit, it makes the hit roll more important than in any prior edition. In 4E, melee hits can inflict crippling effects without saves. Just getting hit automatically subjects you to poison, or paralysis, or whatever. In past editions, if the spider bit or the ghoul clawed, you took the damage, but you still got an immediate save.

In the early days of the RPG hobby, many games attempted to fuse D&D’s fantastic setting with a more realistic model of combat damage. Although a few of these games enjoyed success, none recreated the combat-intensive, dungeon-bashing play style pioneered by D&D. At the time, no one seemed to realize that the clever damage-reduction mechanism built into the game enabled its play style.

Video game designers figured it out. Virtually every video game that combines fighting with character improvement features D&D-style rising hit points.

Next: Hitting the to-hit sweet spot