A while back I posted an example outlining how to use petersburg to simulate the St. Petersburg Paradox. In this post, I'd like to dig a little deeper into what petersburg is and what it does.

Petersburg is a minimal Python library for representing complex decisions (decisions in the decision-theory sense) as probabilistic graphs. Once a game or decision is represented, it can be simulated with some context (like finite bankrolls, ruin, or risk aversion). So let's dive into the inner workings:

A graph is a set of nodes connected by edges. Nodes represent some object or state, and the edges connecting them may or may not have direction. Both nodes and edges can carry attributes, such as weights, labels, or other data. If it is possible to start at a node, progress along directed edges, and return to that same node, the path is called a cycle; a directed graph with no cycles is "acyclic". These directed acyclic graphs (DAGs) are what petersburg uses to represent decisions.
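As a small aside, acyclicity is easy to check mechanically. The sketch below is a hypothetical illustration (the node names and the adjacency-list representation are mine, not petersburg's internals), using Kahn's algorithm: a directed graph is acyclic exactly when every node can be peeled off in topological order.

```python
# A tiny directed graph as adjacency lists. Hypothetical example
# structure; not petersburg's internal representation.
edges = {"start": ["fee"], "fee": ["heads", "tails"], "heads": [], "tails": []}

def is_acyclic(adj):
    """Kahn's algorithm: repeatedly remove nodes with no incoming
    edges; the graph is a DAG iff every node gets removed."""
    indeg = {n: 0 for n in adj}
    for outs in adj.values():
        for m in outs:
            indeg[m] += 1
    queue = [n for n, d in indeg.items() if d == 0]
    seen = 0
    while queue:
        n = queue.pop()
        seen += 1
        for m in adj[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                queue.append(m)
    return seen == len(adj)
```

A graph with a back-edge (say, `{"a": ["b"], "b": ["a"]}`) fails this check, which is why cycles are ruled out up front.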

Nodes in petersburg have one attribute: payoff, a numeric value accumulated when the node is reached. The one special case is the starting node, where the process begins. Other than that, every decision is made up of these simple one-attribute nodes connected by directed edges. Edges have two attributes: cost and weight. The cost, like a node's payoff, is accumulated when the edge is traversed. Weights are positive numeric values representing the relative likelihood of taking one edge versus another when more than one path out of a node is available.
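As a quick sketch of those attributes (dict shapes mirroring the example dictionary later in the post, not necessarily petersburg's internals), note in particular that weights are relative, not probabilities:

```python
# Hypothetical shapes mirroring the example dictionary in this post.
node = {"payoff": 2}                            # accumulated on arrival
edge = {"weight": 1, "cost": 0, "node_id": 2}   # cost paid on traversal

# Weights are relative: the probability of taking one edge out of a
# node is its weight divided by the total weight of its siblings.
siblings = [{"weight": 1}, {"weight": 3}]
p_first = siblings[0]["weight"] / sum(e["weight"] for e in siblings)
print(p_first)  # 0.25
```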

With these two objects, a graph (simply a reference to the starting node of a DAG) can be built very simply to represent very complex scenarios. Revisiting the St. Petersburg game in its more classical form (with a finite bankroll), the dictionary representing the game for just one flip would be:

```json
{
  "1": {"after": [], "payoff": 0},
  "2": {"after": [{"cost": 10, "node_id": 1}], "payoff": 0},
  "3": {"after": [{"weight": 1, "cost": 0, "node_id": 2}], "payoff": 2},
  "4": {"after": [{"weight": 1, "cost": 0, "node_id": 2}], "payoff": 0}
}
```

In this graph, there are 4 nodes:

- Start
- Entrance Fee
- Flip #1: Heads
- Flip #1: Tails (fail)

The (3) and (4) node pattern can be repeated indefinitely to represent the St. Petersburg game.
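To make the traversal rules concrete, here is a minimal re-implementation sketch (not petersburg's own API) that simulates one play directly from the dictionary format above, where each node's "after" list holds the edges arriving at it:

```python
import random

# The one-flip St. Petersburg graph from the post. Keys are node ids;
# "after" lists the edges arriving at that node from earlier nodes.
game = {
    "1": {"after": [], "payoff": 0},
    "2": {"after": [{"cost": 10, "node_id": 1}], "payoff": 0},
    "3": {"after": [{"weight": 1, "cost": 0, "node_id": 2}], "payoff": 2},
    "4": {"after": [{"weight": 1, "cost": 0, "node_id": 2}], "payoff": 0},
}

def outcome(graph, start="1"):
    """Walk the DAG from start, choosing edges by weight, and return
    total payoff minus total cost for one simulated play."""
    total = graph[start]["payoff"]
    current = start
    while True:
        # Outgoing edges of `current` are entries in other nodes'
        # "after" lists that point back at it.
        options = []
        for node_id, node in graph.items():
            for edge in node["after"]:
                if str(edge["node_id"]) == current:
                    options.append((node_id, edge))
        if not options:
            return total  # terminal node reached
        weights = [edge.get("weight", 1) for _, edge in options]
        node_id, edge = random.choices(options, weights=weights)[0]
        total += graph[node_id]["payoff"] - edge.get("cost", 0)
        current = node_id
```

With a 10-unit entry fee and a 2-unit payoff on heads, every play of this one-flip game nets either -8 (heads) or -10 (tails), which is exactly why the interesting questions involve repeating the (3)/(4) pattern.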

While these classical decision-theoretic problems are interesting, and I have two more to post in the coming weeks, the real intention here is to study more interesting relationships in complex decision structures. In the case where the weights in one part of the graph aren't known, but the structure and costs are, how can you use simulation to understand the influence of those weights and make informed (and safe-from-ruin) decisions? How can likelihood, criticality, and the ability to effect change be estimated and consolidated in complex decision scenarios?
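As a toy illustration of that kind of sensitivity analysis (using the 10-unit fee and 2-unit heads payoff from the one-flip example above, and treating the heads probability as the unknown weight), the expected net payoff can be swept directly:

```python
# Toy sensitivity sketch for the one-flip game: entry fee 10, heads
# pays 2, tails pays 0. Treat the heads probability p as the unknown
# weight and see how the expected net payoff responds to it.
def expected_net(p_heads, fee=10.0, heads_payoff=2.0):
    """E[net] = -fee + p * payoff for the one-flip game."""
    return -fee + p_heads * heads_payoff

for p in (0.25, 0.5, 0.75):
    print(f"p(heads)={p}: expected net {expected_net(p):+.2f}")
```

Even this trivial sweep shows the one-flip game is a loser for any weight; the interesting structure only appears once the doubling pattern is repeated and a bankroll constraint is layered on.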

Petersburg aims to help answer these kinds of questions by giving structure to how the problems are represented, and by doing so in a way that lets simpler subsets of a decision be examined in isolation.

So check out petersburg on GitHub. It's still very raw, but there are a bunch of examples that I'll keep developing as the library itself fleshes out:
