
Game Theory, Zen, and Decision-making

The Story-Telling, World-Traveling Game Theorist, Troubleshooter, and Zen Problem-Solver

Helping people to create better lives through efficiency, effectiveness, precision, self-awareness, Eastern Wisdom and game-based strategies

 

Q. What is the link between game-based thinking, Zen, and decision-making?

A. I write this to you from a hotel in Toronto, where I am teaching a master class on Zen thought and strategizing. There are many certified life coaches in this class.

The world is filled with life coaches who took a weekend workshop on being an advisor and hung out a shingle. The challenge for these individuals, as well as for self-help gurus, peak-performance experts, and decision-makers, is that although they may be skilled at developing winning strategies, they are not very skilled at avoiding errors in judgment. One cannot be an effective strategist and problem-solver without mastery of both skill sets.

To develop these skills, we must have a profound understanding of numerous concepts drawn from game theory, Zen, and design thinking, including collaborative intelligence, critical-mass tipping points, ripple effects, Black Swans, the Butterfly Effect, the Support Triangle, cognitive bias, the Trembling Hand, beginner’s mind, counter-intuitive truth, and other elements.

Within the Game Thinking community, there are troubleshooters whose job is to check the understanding of experts and specialists for mental errors. Any skilled game theorist knows that the more brilliant, knowledgeable, and expert a specialist is, the more likely they are to make some error, often a small one, that can have a major impact on any system, especially a game space. These troubleshooters are individuals, often experts themselves, who are skilled at focusing on just the types of thinking errors certain experts are likely to make. Acclaimed experts are often confused, and even annoyed, as to how someone as respected and knowledgeable as themselves could possibly need someone less acclaimed and knowledgeable looking over their shoulder and making suggestions. Any serious strategist soon comes to appreciate these troubleshooters.

The more acclaimed a person is in a specialty, the greater the likelihood that they may come to see themselves as an infallible expert. This is when they get themselves into trouble. Wherever there is a genius at work, there will be judgments made with some level of uncertainty. Taking this stream of thought to the next level, wherever there is an opportunity for human fallibility and preventable human error, an error will eventually take place.

There are a number of reasons why skilled individuals make mistakes with major ripple effects, mistakes that can cost millions of dollars and often lead to death and destruction. These include:

  • Failing to see that information received from other experts on their own team or project was unreliable.
  • Paying attention mainly to what they were asked to pay attention to, thus missing some bigger picture.
  • Failing to notice what they were not directly asked to notice.
  • Addressing a small problem without realizing that the problem is an indication of a much larger problem.

The reverse also happens. In this situation, an expert deals with the larger problem without realizing that there is a very small, seemingly irrelevant constraint at its source. When either a large or small problem is ignored, one may win the battle but lose the war.

Troubleshooters often train experts to notice small details in an environment that they might not have noticed before. They ask, “Is something missing that is usually there? Has there been a change in an old pattern without any apparent reason or explanation for that change?” Any skilled mentalist or magician can tell you that there is a lot to be learned about people and situations through careful observation.

The great challenge for most experts is that they tend to notice only what they were trained to notice.  This is especially so among engineers and medical doctors.

One of the great challenges here is information bias, a type of cognitive bias that involves a distorted evaluation of information. An example of information bias is believing that the more information one can acquire before making a decision, the better, even if that extra information is irrelevant to the decision. This is a common problem among physicians, who may attempt to properly diagnose what are actually fictitious diseases.

Experts often isolate a constraint* in a system without statistically considering whether another cause for that constraint is more likely than the one that appears on the surface. Many game thinkers think statistically, but not statistically enough. They seldom consider that probabilities apply to their own situation, problem, or game scenario. Even more so, most ordinary individuals don’t believe statistical probabilities apply to them. Most drunk drivers don’t think the statistics showing that they are more likely to be killed driving under the influence than driving sober apply to them.

I have many brilliant friends who believe that things are absolutely in alignment with their beliefs and who find statistical analyses to support those beliefs. The problem is that they are not skilled statisticians. Even if they were statisticians, these are not individuals likely to reach out to a game theorist troubleshooter to verify that their “numbers,” and the conclusions they have reached from those numbers, are correct. Often my friends are wrong, and the statistics they have chosen are wrong as well. The specific problem here is that they have ignored the “representativeness heuristic*” when making judgments about the probability of an event under uncertainty. It is just one of a group of heuristics (simple rules governing judgment or decision-making) proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s.
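To make the base-rate trap behind the representativeness heuristic concrete, here is a minimal sketch with made-up numbers (none of them from the text above): even when an indicator “looks representative” of a rare condition, the condition usually remains unlikely once its base rate is taken into account.

```python
# A minimal base-rate sketch with hypothetical numbers: a condition affects
# 1% of a population, and a "representative-looking" indicator flags 90% of
# true cases but also 10% of non-cases.
base_rate = 0.01          # P(condition)
hit_rate = 0.90           # P(flag | condition)
false_alarm_rate = 0.10   # P(flag | no condition)

# Bayes' rule: P(condition | flag)
p_flag = hit_rate * base_rate + false_alarm_rate * (1 - base_rate)
posterior = (hit_rate * base_rate) / p_flag

print(f"P(condition | flag) = {posterior:.1%}")  # about 8.3%, far lower than intuition suggests
```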

Experts, more than anyone else, need to be conscientious when defining the source of a problem. Often an expert will use what seems obvious to explain the core constraint and all the related elements that seem to ripple out from it. Just before the expert acts on what seems obvious, it is important to bring in a troubleshooter, a backup expert who helps the decision-maker just before they act. Often what first came to mind was correct, yet this is not always the case, and if the original expert is wrong, the consequences can be grave.

Kahneman and Tversky discovered long ago that a person making a prediction can only be allowed to ignore statistics that question a conclusion if they are completely certain they are correct. In a complex game scenario, one can never be completely certain about anything. This is why a skilled game thinking troubleshooter can be invaluable.

Let’s review as well as explore this idea further.

Game Thinking Troubleshooting* (GTT) is a form of problem-solving applied to repairing strategies and resolving problems or processes in any game space or system. It is a logical, systematic search for the source of a problem or constraint in order to solve it. To renew or recreate a winning strategy, the first step is to identify the symptoms of the problem. Determining the most likely cause is then a process of elimination, ruling out potential causes or constraints one by one. Finally, GTT requires confirmation that the solution restores the strategy or process to its peak state.

As we discuss throughout this book, a system can be described in terms of its expected, desired, or intended behavior. Events or specific strategies are expected to generate specific results or outputs. A simple example of this might be selecting the “print” option from various computer applications; this is intended to result in a hard copy emerging from some specific device. Any unexpected or undesirable behavior is a symptom that there is a problem. GTT, then, is the process of isolating the specific cause or causes of the symptom. Frequently the symptom is nothing more than a failure of the strategy to produce any results (nothing was printed, for example). Corrective action can then be taken to prevent further failures of a similar kind. But which action is best to take? There are many ways to determine this, and they are explored throughout this book.
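As a rough illustration of that loop (notice the symptom, eliminate candidate causes, confirm the fix), here is a minimal sketch in Python. The component names and checks are hypothetical, and the system state is faked so the example runs on its own; it is not meant as a definitive implementation of GTT.

```python
# A minimal sketch of the GTT loop for the "nothing was printed" example.
# The true state of the system is faked here so the example runs on its own.
system_state = {"power": True, "paper": False, "driver_ok": True}

def observe():
    """Observed behavior: the job prints only if every component is healthy."""
    return "page printed" if all(system_state.values()) else "nothing printed"

def symptom_present(expected):
    """A symptom is any gap between expected and observed behavior."""
    return observe() != expected

# Candidate causes, ordered simplest-to-check first.
candidate_causes = [
    ("power", "Is it plugged in?"),
    ("paper", "Is paper loaded?"),
    ("driver_ok", "Is the driver configured?"),
]

def troubleshoot(expected="page printed"):
    if not symptom_present(expected):
        return "no symptom, nothing to fix"
    for component, question in candidate_causes:
        if not system_state[component]:        # this cause cannot be eliminated
            system_state[component] = True     # apply the corrective action
            if not symptom_present(expected):  # confirmation step
                return f"fixed: {component} ({question})"
    return "cause not isolated; bring in a troubleshooter"

print(troubleshoot())  # -> fixed: paper (Is paper loaded?)
```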

Some of the most skilled game thinking troubleshooters recognize that one source of a problem or constraint in a game space is failed “tools.” Here a GTT* practitioner may need to use methods drawn from forensic engineering to trace a problem or constraint. Forensic engineering is the investigation of materials, products, structures, or components that fail or do not operate or function as intended. The consequences of this type of failure are dealt with by GTT through the application of the law of product liability.

In Game Thinking there is a wide range of analytical techniques available to determine the cause(s) of specific failures. Corrective action can then be taken to prevent further failures of a similar kind. Preventive action is possible using failure mode and effects analysis (FMEA)* and fault tree analysis (FTA*) before full-scale production takes place, and these methods can also be used for failure analysis*.
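For readers unfamiliar with FMEA, a common convention is to rate each failure mode from 1 to 10 for severity, occurrence, and detection, and to rank the modes by the product of the three (the Risk Priority Number). The sketch below uses hypothetical failure modes and ratings purely for illustration.

```python
# A minimal FMEA-style sketch with hypothetical failure modes and ratings.
# RPN (Risk Priority Number) = severity x occurrence x detection, each 1-10;
# the highest RPNs are addressed first.
failure_modes = [
    # (failure mode,                 severity, occurrence, detection)
    ("print head clogs",                    6,          4,         3),
    ("power supply fails",                  8,          2,         2),
    ("firmware corrupts job queue",         7,          3,         8),
]

ranked = sorted(
    ((name, s * o * d) for name, s, o, d in failure_modes),
    key=lambda pair: pair[1],
    reverse=True,
)

for name, rpn in ranked:
    print(f"RPN {rpn:3d}  {name}")
# -> firmware corrupts job queue (168), print head clogs (72), power supply fails (32)
```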

Usually, troubleshooting is applied to something that has suddenly stopped working, since its previously working state forms the expectations about its continued behavior. So the initial focus is usually on recent changes to the system or to the environment in which it exists (for example, a printer that “was working when it was plugged in over there”). However, there is a well-known principle that “correlation does not imply causality.” For example, the failure of a device shortly after it has been plugged into a different outlet doesn’t necessarily mean that the events were related; the failure could have been a matter of coincidence. Therefore effective troubleshooting demands critical thinking, and at times expertly crafted and understood statistical analysis, rather than magical thinking or amateurish number crunching. It’s useful to consider our common experience with light bulbs. Light bulbs “burn out” more or less at random; eventually, the repeated heating and cooling of the filament and fluctuations in the power supplied to it cause the filament to crack or vaporize. The same principle applies to most other electronic devices, and similar principles apply to mechanical devices. Some, though not all, failures are part of the normal wear and tear of components in a system.
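A small simulation can make the coincidence point concrete. In the sketch below the daily failure probability, the day of the “move,” and the window are all made-up numbers; the only point is that when failures occur at random, a noticeable fraction will land shortly after an unrelated event purely by chance.

```python
# A minimal simulation of coincidental failure: bulbs fail at random, an
# unrelated "moved to a new outlet" event happens on day 30, and yet some
# failures still land in the week right after the move purely by chance.
import random

random.seed(0)
DAILY_FAILURE_PROB = 0.01   # hypothetical chance a bulb burns out on any given day
MOVE_DAY = 30               # the unrelated event
WINDOW = 7                  # "shortly after" = within a week
TRIALS = 100_000

def day_of_failure():
    day = 1
    while random.random() > DAILY_FAILURE_PROB:
        day += 1
    return day

suspicious = sum(
    1 for _ in range(TRIALS)
    if MOVE_DAY < day_of_failure() <= MOVE_DAY + WINDOW
)

# Roughly 5% of purely random failures happen "right after" the move,
# even though the move has no causal effect at all.
print(f"{suspicious / TRIALS:.1%} of failures fall in the week after the move")
```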

A basic principle in troubleshooting is to start with the simplest possible problems first. This is an expression of the old maxim known as the KISS principle* (Keep it simple, stupid!). This principle is behind the common complaint about help desks and manuals: that they sometimes begin by asking, “Is it plugged in, and does that receptacle have power?” This should not be taken as an affront; rather, it should serve as a reminder, or conditioning, to always check the simple things first before calling for help.

A Zen-Game theorist could check each element in a system one by one, substituting known-good components or approaches for each potentially “suspect” one. However, this process of “serial substitution” is ineffectual (degenerate) when components are substituted without regard to a hypothesis concerning how their failure could result in the symptoms being diagnosed.

Simple and intermediate systems are characterized by lists or “information trees*” of dependencies among their components or subsystems. More complex systems require more sophisticated approaches.
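As a rough illustration, such an information tree can be written down as a simple dependency map and walked beneath a symptomatic element to list the candidate causes, simplest components first. The components below are hypothetical and continue the printing example.

```python
# A minimal sketch of an "information tree" of dependencies for a hypothetical
# printing setup. Walking the tree beneath a symptomatic element yields the
# candidate causes to eliminate, leaves (the simplest elements) first.
dependencies = {
    "printed page": ["printer", "application"],
    "printer": ["power outlet", "cable", "driver"],
    "application": ["print dialog"],
    "power outlet": [], "cable": [], "driver": [], "print dialog": [],
}

def candidate_causes(component):
    """Post-order walk: return every dependency beneath the failing component."""
    causes = []
    for dep in dependencies.get(component, []):
        causes.extend(candidate_causes(dep))
        causes.append(dep)
    return causes

print(candidate_causes("printed page"))
# -> ['power outlet', 'cable', 'driver', 'printer', 'print dialog', 'application']
```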

For a more in-depth exploration of these ideas, and to financially support my work through donations, go to www.patreon.com/askLewis

oooooooooooooooooooooooo

Though I blog here every Monday, you can see my daily blogs on problem-solving, Eastern Wisdom, and game theory at https://www.patreon.com/AskLewis

 

These informational and entertaining postings will help you to be more effective, efficient, productive, and self-aware. Consider learning game-based thinking and life strategies through Harrison’s Applied Game Theory (HAGT).

oooooooooooooooooooooooo

Learn to free your inner visionary through game-based thinking and gamification with my book:

http://realuguru.com/products/ebooks/the-realugurus-tips-techniques-and-strategies-for-creating-wealth-success-through-visionary-thinking/

 

Learn how to win at the game of life through my new book “How to Hack Your Life through Game Thinking”

 

https://realuguru.com/products/ebooks/how-to-hack-your-life-through-game-thinking/

 

If you are interested in gaining a basic understanding of Applied Game Theory (and you need one), here is an interview I did with James Selman, a pioneer and innovator in leadership research.

 

Just click below to watch the entire interview.

https://www.youtube.com/watch?v=8hRf87puZPY

Listen here as Lewis explains the RealUGuru Project and how we can give up unnecessary struggle through visionary thinking, in this insightful interview with award-winning journalist Phyllis Haynes.

https://www.youtube.com/watch?v=zp4DtXpPBeM

oooooooooooooooooooooooo

 
