[Solved]: Travelling Salesman which can repeat cities

Problem Detail: In the TSP, we usually assume a complete graph. If each city may be visited only once, we need a complete graph to ensure that there is a path from every city to every other city. This is easy to arrange: if there is no direct edge between A and B, we can simply add a new edge whose length is the shortest-path distance between A and B. However, if we have a sparse graph, we might benefit from not completing it. In that case, we might be forced to repeat vertices. If our graph has only 3 cities and only two edges, connecting (1, 2) and (2, 3), then the solution must repeat city 2: 1 – 2 – 3 – 2 – 1. I am struggling to find examples of the TSP in the scientific literature which assume a non-complete directed graph. Any known references? Any keywords I may be missing? In this setting, cycles are allowed, and I am especially interested in how we could separate subtour inequalities when cycles are present. I am hoping for a solution based on integer linear programming and branch-and-bound, and am wondering how to add subtour elimination inequalities. Any ideas?
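As an aside on the remark that a sparse instance can be completed by adding edges with shortest-path lengths: here is a minimal sketch of that preprocessing step using Floyd–Warshall (the function name `metric_completion` is my own, not from the question).

```python
# Sketch: complete a sparse symmetric graph by setting the distance between
# every pair of vertices to their shortest-path distance, so a standard TSP
# solver can be applied to the completed instance.
INF = float("inf")

def metric_completion(n, edges):
    """edges: dict {(i, j): length} on vertices 0..n-1, treated as symmetric.
    Returns an n x n matrix d where d[i][j] is the shortest-path distance."""
    d = [[0.0 if i == j else INF for j in range(n)] for i in range(n)]
    for (i, j), w in edges.items():
        d[i][j] = min(d[i][j], w)
        d[j][i] = min(d[j][i], w)
    # Floyd-Warshall: relax every pair through every intermediate vertex k.
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```

On the 3-city example with edges (1, 2) and (2, 3) of length 1, the completed graph gets a new edge of length 2 between cities 1 and 3, and the optimal tour on the completed graph (length 4) corresponds exactly to the walk 1 – 2 – 3 – 2 – 1 on the original graph.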

Asked By : Chicoscience

Answered By : Chicoscience

I actually found this solution some time ago, but here we go. This is a formulation on a directed graph that may repeat vertices but does not repeat arcs. It requires that for every arc $(i,j)$ there exists an arc $(j,i)$, and that the distances satisfy $d_{ij} = d_{ji} \geq 0$.

We work with a directed graph $G = (V, A)$. For $W \subseteq V$, we define $\delta^-(W) = \{ (i,j) \in A : i \notin W,\ j \in W \}$, $\delta^+(W) = \{ (i,j) \in A : i \in W,\ j \notin W \}$ and $A(W) = \{ (i,j) \in A : i \in W,\ j \in W \}$. In graph terminology, we are looking for a closed walk that covers all vertices (a walk may contain cycles).

The decision variables are:

- $x_{ij} = 1$ if arc $(i,j)$ is in the walk, $0$ otherwise (binary variable);
- $g_i = $ the out-degree of vertex $i$ (continuous variable).

The formulation is given by:
$$ \min \sum_{(i,j) \in A} d_{ij} x_{ij} $$
subject to:
$$ \sum_{(i,j) \in \delta^+(i)} x_{ij} = g_i \qquad \forall i \in V $$
$$ \sum_{(i,j) \in \delta^+(i)} x_{ij} = \sum_{(j,i) \in \delta^-(i)} x_{ji} \qquad \forall i \in V $$
$$ \sum_{(i,j) \in \delta^+(i)} x_{ij} \geq 1 \qquad \forall i \in V $$
$$ \sum_{i \in W} g_{i} \geq 1 + \sum_{(i,j) \in A(W)} x_{ij} \qquad \forall W \subsetneq V,\ |W| > 1 $$

These last constraints are subtour elimination constraints, which can be expressed in this clean form thanks to the continuous variables $g_i$. The separation of these constraints is actually polynomial, as it can be solved using a max-flow algorithm. Notice that for every subset of vertices $W$, the constraint reads $\sum_{i \in W} g_{i} - \sum_{(i,j) \in A(W)} x_{ij} \geq 1$. Since $g_i$ counts all arcs leaving vertex $i$, and the arcs inside $W$ are subtracted off, the left-hand side is the number of arcs leaving $W$; the 1 in this constraint means that at least one arc must leave $W$.

Given a solution to a linear relaxation, represented by $\overline{g}_i$ and $\overline{x}_{ij}$, we need to find a $W$ that minimises the expression $\sum_{i \in W} \overline{g}_{i} - \sum_{(i,j) \in A(W)} \overline{x}_{ij}$. Solving this problem is equivalent to finding a $W$ that is separated from the rest of the graph by a minimum cut (fixing any vertex as source and trying every other vertex as sink, in both directions), which in turn is computed via max flow. If the max-flow value is less than 1, we have found a violated cut and we can add it to the model.
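To make the separation step concrete: by the first constraint, $\overline{g}_i$ equals the total $\overline{x}$-weight leaving vertex $i$, so the quantity to minimise is exactly the $\overline{x}$-weight crossing out of $W$, i.e. a minimum cut in the support graph with capacities $\overline{x}_{ij}$. Here is a self-contained sketch of that routine using Edmonds–Karp max flow; the function names and the tolerance `eps` are my own choices, not part of the answer.

```python
# Sketch: separate the subtour elimination constraints by computing, for a
# fixed source s and every other vertex t, a minimum s-t cut (in both
# directions) in the graph with arc capacities x_bar. If some cut has value
# below 1, its source side W gives a violated inequality.
from collections import defaultdict, deque

def max_flow_min_cut(cap, s, t, eps=1e-9):
    """Edmonds-Karp max flow. cap: dict {(i, j): capacity}.
    Returns (flow value, source side of a minimum s-t cut)."""
    residual = defaultdict(float)
    adj = defaultdict(set)
    for (i, j), c in cap.items():
        residual[(i, j)] += c
        adj[i].add(j)
        adj[j].add(i)  # reverse direction needed for residual traversal
    flow = 0.0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v in adj[u]:
                if v not in parent and residual[(u, v)] > eps:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            break
        path = []
        v = t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(residual[e] for e in path)
        for (u, v) in path:
            residual[(u, v)] -= aug
            residual[(v, u)] += aug
        flow += aug
    # Min-cut source side = vertices reachable from s in the residual graph.
    W = {s}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in W and residual[(u, v)] > eps:
                W.add(v)
                queue.append(v)
    return flow, W

def separate_subtour_cut(nodes, x_bar, eps=1e-6):
    """Return a subset W violating the subtour constraint, or None.
    Singletons are skipped: they are covered by the degree constraints."""
    s = nodes[0]
    for t in nodes[1:]:
        for source, sink in ((s, t), (t, s)):  # W may or may not contain s
            value, W = max_flow_min_cut(x_bar, source, sink)
            if value < 1 - eps and 1 < len(W) < len(nodes):
                return W
    return None
```

For example, a fractional solution consisting of two disjoint 2-cycles on vertices {1, 2} and {3, 4} is cut off (the routine returns {1, 2}), whereas the feasible walk 1 – 2 – 3 – 2 – 1 on the path graph admits no violated cut.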

Question Source : http://cs.stackexchange.com/questions/43731