[Theory] A different axiom

edited August 2014 in Story Games
A caveat: This will likely be a big ball of theory and no implemented practicality, at least to begin with, so be forewarned.

I woke up this morning with an idea in my head, and no idea of where it came from or what to do with it, so here it is:

A decision cannot simultaneously be both moral and tactical.

The idea, illustrated:

"How can you shoot women and children?"
"You just don't lead 'em as much."

This just restates the same sort of thing that's been in theory discussions forever, of course. I know that. But it still sort of feels like it might be a slightly different point from which to start, even if we all end up in the same place.

But I could be wrong about some or all of this, and all y'all are about the smartest gang of people I know. So go ahead and take a kick at it if you like. If this isn't enough yet and you need more, that's fine too.



Cheers,
Roger

Comments

  • Maybe it's wrong of me to take this to the real world(TM), but here's where my mind went...

    I think a decision can be both moral and tactical - political activists choosing a "hands up" tactic to confront militarized police?
    But it also depends on your definition of morals - US military forces enforcing their own ideas of democracy in the Middle East?

    Unless you're saying that the moral decision and the tactical decision are separate decisions.

  • Here's how I see it:

    Tactical choice: How can I most effectively reach my goal?
    Moral choice: Which goal should I strive to reach?

    In a complex situation, where there are several different goals, a choice can have both moral and tactical dimensions. Choosing to do A might make B impossible, but will make it more likely that C happens. You want to achieve both B and C, so abandoning B for C is a moral choice, but at the same time you're making tactical estimates of how likely B and C are anyway.

    The thing is, in boardgames and most combat systems, there's a single goal, which means moral choices go out the window. There's a defined thing that means "winning" and you either achieve it or not. But if the goals are yours to define for yourself, you can make tactical choices to try to maximize all of them, yet still be forced to prioritize among them when it becomes clear you can't do it all.

    So, no: I can see what you mean, but I don't agree. Good post!
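    A hedged illustration of that (purely made-up numbers, not from any game text): the tactical half of the choice is estimating how likely each goal is under each action, and the moral half is how heavily each goal is weighted. Both feed the same decision, as in this minimal Python sketch.

        # Hypothetical example: P(goal achieved | action) is the tactical estimate.
        actions = {
            "abandon_B_for_C": {"B": 0.0, "C": 0.8},
            "hedge_between":   {"B": 0.5, "C": 0.4},
        }

        # How much you care about each goal is the moral weight.
        moral_weights = {"B": 1.0, "C": 1.0}

        def expected_value(action):
            """Combine tactical probabilities with moral priorities."""
            return sum(moral_weights[g] * p for g, p in actions[action].items())

        # Which action comes out "best" changes as soon as the weights change.
        best = max(actions, key=expected_value)
        print(best, {a: round(expected_value(a), 2) for a in actions})

    With equal weights the hedging action wins; drop the weight on B to 0.5 and abandoning B for C wins instead - same tactical estimates, different moral weighting, different choice.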
  • Despite my agreement with Tod and Simon's points, I do think tactics and ethics are frequently at odds in RPGs. If the player's traction in the game is based on action options on the character sheet, or certain game actions get you points, then most players will choose those actions, morality be damned. There's a reason why Burning Wheel gives you points for acting on your character's morals. Similarly, in games where players show up explicitly to wrestle with ethics, tactical concerns must sometimes be ignored -- a test of how far you'll go is nullified by stacking the odds too far in your favor from the get-go.
  • There's a reason why Burning Wheel gives you points for acting on your character's morals.
    Burning Wheel gives you a bribe to act in a specific way, whereas a choice between a tactical decision and a moral one is a dilemma. "Should you risk ten soldiers' lives in order to save one?" A huge difference.
  • I think this is a powerful aesthetic axiom. As such, it's really a way of defining what tactics and morals mean to someone who holds this aesthetic. Calling it true or not is a category error. The best you can say is that this axiom fits a particular perspective - say, that of players engaged in Macho Narr (a focus on really painful moral no-win situations).

    The way I look at it, implicit in this axiom is that you have two value systems which cannot be exchanged: tactical and moral. Every decision must be all one or the other - there can be no middle choice. There can be a non-choice which is neither tactical nor moral, but in this aesthetic it isn't even worth mentioning.

    On the other hand, if you come at things from the perspective of a single value system - if it is a moral stance - then effectively the opposite axiom is true: "The most tactical decision you can make right now is also the most moral," since tactics are how you achieve your moral ends. To make that interesting in fiction, you focus on things like learning from errors and recovering from ignorance to make more effective decisions in achieving your moral ends.

    Most fiction (and most aesthetics) isn't at one extreme or the other. We typically have a variety of values and can't always shake them down into agreement or a strict priority. That means we have to guess at the trade-off. But often that trade-off isn't completely opposed. And often we can learn to make better decisions. The problem with purely sharp dichotomies between these value systems is that learning isn't possible. All you find out is which impulse is stronger, but I prefer to hope we live in a world where it is possible to make your effectiveness more moral and your morality more effective.
  • Nicely said. Idealistic and nondualistic, which might make it sound strange to American ears, but nice nonetheless.

  • "Should you risk ten soldiers' lives in order to save one?"
    Just wanted to jump in and say I really like this as an example to analyze in more depth. Otherwise I'm still letting everything percolate in and ferment, so carry on with the good work here.
  • Rickard's question is a classic one, upon which rests the whole of Utilitarianism as a philosophical concept. You can find numerous YouTube videos discussing variations on this question with the word "Utilitarianism" in the title. At the core of Utilitarianism is the attempt to make ethical decisions mathematically, based strictly on quantifiable data. Which is horribly fuzzy in real life, but works well enough for a game. :-)

  • There's a reason why Burning Wheel gives you points for acting on your character's morals.
    Burning Wheel gives you a bribe to act in a specific way, whereas a choice between a tactical decision and a moral one is a dilemma. "Should you risk ten soldiers' lives in order to save one?" A huge difference.
    Does it really, though? Putting aside the loaded language of "bribe", I've honestly never had such a dilemma come up in my Burning Wheel play experience. Indeed, Beliefs have at times been rather pragmatic courses of action. Then, the question becomes, "are you absolutely sure that you agree with your character's morals?"

    I mean, sure. Burning Wheel is heavily predicated on the idea of challenging those Beliefs. But you most easily challenge them by making them difficult and costly to stick to. The central question is "what are you willing to put up with to stick to your Beliefs?", and you get rewarded one way for sticking to your Beliefs, and rewarded the other way for pushing them aside to take the way forward.

    It never really comes down to the sterile land of ethical thought experiments. Not how I've seen it.
  • Actually, you get rewarded *the same way* for both fulfilling and breaking your Belief.
  • I was thinking about the Fate for pursuing a Belief but not fulfilling it. That's definitely another wrinkle, though: that system is nuanced and detailed.
  • Let me introduce you to a little game called Vampire: the Masquerade, in which premeditated murder for personal gain is easy and the best way to solve a lot of your problems.
  • Rickard's question is a classic one, upon which rests the whole of Utilitarianism as a philosophical concept. You can find numerous YouTube videos discussing variations on this question with the word "Utilitarianism" in the title. At the core of Utilitarianism is the attempt to make ethical decisions mathematically, based strictly on quantifiable data. Which is horribly fuzzy in real life, but works well enough for a game. :-)
    Here's something interesting. "Moral dilemmas" of this sort, posed as questions, usually try to remove the tactical element and single out the moral one. "There's a bunch of people in the boat and you need to eat one to survive. Do you do it, and if so, whom do you eat?" And it's assumed that the chances of survival don't depend on who is eaten, that you can be absolutely sure of survival if you eat someone, and absolutely sure of death if you don't.

    But a dilemma like that is purposefully stripped of all the tactical* elements. Rickard's "risk ten to save one" is full of tactical implications. You need to estimate the chances of saving the one soldier, and the risk of one or more soldiers dying in the attempt, to be able to make the moral choice.

    And yes, I do agree that tactical and moral choice are often at odds in games today, but I don't think they have to be. And I think the difficulty is partially in being able to make tactical decisions without reducing the variables to boardgame elements like points, turns and dice rolls, which hinder moral choice and engage a player's "board game mentality", where one is less engaged with the characters in the fiction. Maybe?

    ______________________
    * I'm using "tactical" as a technical term here. I realize it sounds awful to talk about "tactics" when discussing eating people.
  • "TACTICAL CANNIBALISM" - The new game from Simon Pettersson :-)

  • And yes, I do agree that tactical and moral choice are often at odds in games today, but I don't think they have to be. And I think the difficulty is partially in being able to make tactical decisions without reducing the variables to boardgame elements like points, turns and dice rolls, which hinder moral choice and engage a player's "board game mentality", where one is less engaged with the characters in the fiction. Maybe?
    Another example of a tactical and moral decision shows up in the meta-game: "Is my fun in winning more important than the other participants' fun?" Developing schadenfreude when you play, for example. Bullying whoever is in last place in order to feel powerful. Smack-talking the other participants in order to make them lose their concentration.

    Like you said, they can be at odds, but I think it's the same thing as playing suboptimally because "that's not how the hat would have done it" in Monopoly. That's not a moral choice, but playing to express yourself. It's seldom combined with tactical choice, but sometimes it is.

    In your game Svart av kval, vit av lust, there are two versions of the tactical-versus-moral decision. One comes up when you play your own character: you will achieve your goals, but at what cost? If the player doesn't care about the consequences, because of that board game mentality, then (I think) the game isn't as fun.

    Where I really enjoy the game is when I'm making tactical decisions to put the other participants into moral ones (with the beast commands). You set up two situations and then bring them together to put the other person in front of an ambiguous decision. That's when the despair comes, and that's a great payoff from the tactical play.
  • Actually, I think a clearer tactical/moral interplay is at work in Det sjätte inseglet. Your character has a clear goal stated on the character sheet. You want to overthrow the king. It's also explicitly stated that your character is a hero who wants to be good, but suffers from a moral weakness, let's say greed. The other players will offer you extra dice in exchange for giving in to your greed. It's pretty much impossible to succeed at your goal without ever giving in to your weakness. The difficulty in the game is trying to achieve your goal without turning into a monster. This requires you to gauge the moral repugnance of each act against the extra resources it will grant you towards achieving your goal. That's, I think, a tactical and moral choice at the same time. Gauging the effectiveness of the dice offered against the resistance you expect to face is a tactical choice, but weighing the result of that estimation against the act you have to perform is a moral one.

    I've seen people focus solely on the tactical or the moral side, which gives you pretty cool stories of a hero sinking into depravity or a valiant fighter struggling against impossible odds. But the game really shines when you try to have your cake and eat it, too. That gives you the most heart-rending stories.
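    The tactical half of that gauging is just probability. Here's a minimal sketch of it in Python, assuming a generic d6 pool where each die succeeds on a 4+ and you need a fixed number of successes - hypothetical numbers, not the actual Det sjätte inseglet rules.

        from math import comb

        def p_success(dice, needed, p=0.5):
            """Chance of rolling at least `needed` successes on `dice` dice."""
            return sum(comb(dice, k) * p**k * (1 - p)**(dice - k)
                       for k in range(needed, dice + 1))

        base, bonus, needed = 4, 2, 4
        print(round(p_success(base, needed), 2))          # refuse the offer: ~0.06
        print(round(p_success(base + bonus, needed), 2))  # give in for 2 extra dice: ~0.34

    A jump from roughly 6% to 34% is the kind of number the moral side of the decision then has to answer for.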
  • If "moral versus tactical" implies a means/ends relationship, then the two certainly can conflict (and it is often interesting when one's ideals conflict with one's character or actions), but there is just as certainly no necessary conflict between the two.

    The more interesting case is if "moral versus tactical" refers to a choice between values. Then we're fully channelling the ghost of Cicero--On Duties is all about what happens when the moral (honestas) conflicts with the expedient (utile). Or Machiavelli, I suppose, if your answer is the opposite of Cicero's. :)

    But, if the second case is what you mean, this seems to be just the sort of "moral dilemma" that certain sorts of narrative games (and the sorts of fiction they follow) are all about--do you rescue the train car full of innocent people or the one that's empty except for your husband and child?
  • So things are still kind of gestating in here, which is great.

    The statement under review is pretty terse, so of course it's fairly open to interpretation. I'll try to explain my interpretation, which should not be considered authoritative in any way, despite the evidence that I am apparently the author.

    With "decision", I'm looking at the critical, in the moment, act of deciding at the moment of decision. I don't think it's generally possible to look at a decided-upon action in the past tense and say, yep, the decision to throw a grenade in that bunker was a moral (or tactical) one.

    I still kind of like "tactical" but I can see how it's carrying around more than its fair share of baggage. Possibly 'quantifiable' is more what I had in mind with tactical. Moral, then, is sort of on the axiomatic side of things. One result of this is that quantifiable decisions are amenable to discussion -- you might be able to convince someone: "no, wait, you might think there's only a 10% chance of death here, but if you integrate over the long haul you'll see it rise to 25%, so maybe the optimal choice is something different" (there's a rough sketch of that kind of arithmetic at the end of this comment). Whereas no amount of discussion is likely to persuade you that you don't love your kids.

    I'm trying to keep this all on this side of the Theory fence for now, but it becomes awfully seductive to suspect that this has implications for non-story-game decision-making as well. Probably best to leave that dog sleeping as long as we can.

    And someone mentioned Utilitarianism as a way to have the cake and slice it up too. I'm keeping an eye on this, because I think it'll turn out to be something sort of interesting, but I'm not really feeling ready to tackle it quite yet.

    I'm still sort of happy with this rubric, if only because I think it brings a more dynamic, shifting feel to the ongoing decision-making process in the moment, in a way that a term like 'Agenda' maybe doesn't. Yes, I know we've all been jumping up and down trying to impress upon people that Agenda is as fluid as anything, but I'm not convinced it's actually working.

    Some people have been jumping ahead right into Applied Theory and pointing out all the wonderful ways you can encourage people in one direction or the other, which is good stuff.
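    And since the "integrate over the long haul" bit above is exactly the sort of thing that's open to argument, here's a rough, purely illustrative Python sketch (no game system implied) of how a small per-attempt risk compounds over repeated attempts:

        per_attempt = 0.10
        for attempts in (1, 2, 3, 5):
            cumulative = 1 - (1 - per_attempt) ** attempts
            print(attempts, round(cumulative, 2))   # 1 -> 0.1, 2 -> 0.19, 3 -> 0.27, 5 -> 0.41

    By the third attempt the cumulative risk is already past 25%, which is precisely the kind of correction a quantifiable decision admits and an axiomatic one doesn't.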