
Weak duality. The value $|f|$ of any $s$-$t$ flow $f$ is no greater than the capacity $c(S,T)$ of any $s$-$t$ cut $(S,T)$.

Strong duality (Max-flow min-cut theorem). The maximum value of an $s$-$t$ flow is equal to the minimum capacity over all $s$-$t$ cuts.
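Weak duality can be checked numerically. Below is a small sketch on an assumed toy network (nodes $0..3$ with $s=0$, $t=3$; capacities and a feasible, non-maximum flow are hard-coded for illustration): the flow value never exceeds the capacity of any cut.

```python
from itertools import combinations

# Assumed toy network: nodes 0..3, s = 0, t = 3; edge -> capacity.
cap  = {(0, 1): 3, (0, 2): 2, (1, 2): 1, (1, 3): 4, (2, 3): 3}
# A feasible (not necessarily maximum) s-t flow of value 4.
flow = {(0, 1): 2, (0, 2): 2, (1, 2): 0, (1, 3): 2, (2, 3): 2}
value = sum(f for (i, j), f in flow.items() if i == 0)

# Weak duality: |f| <= c(S, T) for every s-t cut (S, T).
for k in range(3):
    for sub in combinations([1, 2], k):
        S = {0, *sub}
        cut = sum(c for (i, j), c in cap.items() if i in S and j not in S)
        assert value <= cut
print(value)  # 4
```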



The max-flow problem can be formulated as the following linear program:

$\begin{align} \mbox{maximize}\quad & v \\
\mbox{subject to}\quad & \sum_{e\in\delta^+(i)}x_e - \sum_{e\in\delta^-(i)}x_e =
\begin{cases}v &\text{if } i=s \\ -v &\text{if } i=t \\ 0 &\text{if } i\in V\setminus\set{s,t} \end{cases} \quad \forall i\in V, \\
& 0\le x_e \le c_e \quad \forall e\in E, \end{align}$

where $\delta^+(i)$ and $\delta^-(i)$ denote the sets of edges leaving and entering node $i$, respectively.
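In practice this LP need not be handed to a generic LP solver; combinatorial algorithms compute a maximum flow directly. A minimal sketch of Edmonds-Karp (BFS augmenting paths in the residual graph), run on an assumed toy network:

```python
from collections import deque

def max_flow(n, capacity, s, t):
    """Edmonds-Karp: repeatedly augment along shortest residual paths."""
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        # BFS in the residual graph for an s-t path.
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and capacity[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:           # no augmenting path left: flow is maximum
            return total, flow
        # Find the bottleneck residual capacity along the path.
        bottleneck = float('inf')
        v = t
        while v != s:
            u = parent[v]
            bottleneck = min(bottleneck, capacity[u][v] - flow[u][v])
            v = u
        # Augment (negative entries encode residual backward arcs).
        v = t
        while v != s:
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
        total += bottleneck

# Assumed toy network: nodes 0 = s, 1, 2, 3 = t.
cap = [[0, 3, 2, 0],
       [0, 0, 1, 4],
       [0, 0, 0, 3],
       [0, 0, 0, 0]]
value, flow = max_flow(4, cap, 0, 3)
print(value)  # 5
```

Flow conservation holds at the internal nodes: e.g. the net flow out of node 1 (`sum(flow[1])`) is zero.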


A cut $(S,T)$ can be represented by indicator variables $z_i$ such that $z_i=0$ if $i\in S$ and $z_i=1$ if $i\in T$. The chosen edges are represented by indicator variables $y_e$. Note that we choose only the edges that go from $S$ to $T$, thus $y_{ij}=1$ if and only if $z_i=0$ and $z_j=1$. The minimum cut problem can therefore be formulated as:

$\begin{align} \mbox{minimize}\quad & \sum_{e\in E}y_e c_e \\ \mbox{subject to}\quad &y_{ij} =1 \text{ if } (z_i,z_j)=(0,1) \text{ and } 0 \text{ otherwise}, \quad \forall ij\in E,\\ &z_s=0, z_t=1,\\ & z_i\in\set{0,1}\quad\forall i\in V, y_e\in\set{0,1}\quad\forall e\in E.\end{align}$
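On a tiny example the integer program can be checked by brute force, enumerating every set $S$ with $s\in S$ and $t\notin S$ (an illustrative sketch on an assumed toy network, not how min-cut is computed in practice):

```python
from itertools import combinations

def min_cut_brute_force(n, capacity, s, t):
    """Enumerate all vertex sets S with s in S, t not in S, and return
    the one minimizing the total capacity of edges leaving S."""
    others = [v for v in range(n) if v not in (s, t)]
    best_value, best_S = float('inf'), None
    for k in range(len(others) + 1):
        for subset in combinations(others, k):
            S = {s, *subset}
            value = sum(capacity[i][j]
                        for i in S for j in range(n) if j not in S)
            if value < best_value:
                best_value, best_S = value, S
    return best_value, best_S

# Assumed toy network: nodes 0 = s, 1, 2, 3 = t.
cap = [[0, 3, 2, 0],
       [0, 0, 1, 4],
       [0, 0, 0, 3],
       [0, 0, 0, 0]]
value, S = min_cut_brute_force(4, cap, 0, 3)
print(value, S)  # 5 {0}
```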

The dual of the max-flow problem is:

$\begin{align} \mbox{minimize}\quad & \sum_{e\in E}y_e c_e \\ \mbox{subject to}\quad &y_{ij} +z_i-z_j &&\ge 0 \quad \forall ij\in E,\\ &z_t-z_s &&= 1,\\ & y_e &&\ge 0 \quad \forall e\in E.\end{align}$

This is a relaxation of the min-cut problem: every cut $(S,T)$, encoded by its indicators $(y,z)$, satisfies the dual constraints (one checks $y_{ij}+z_i-z_j\ge 0$ for all four combinations of $(z_i,z_j)$), so the dual optimum is at most the minimum cut capacity. Conversely, the dual LP can be shown to have an integral optimal solution, which is precisely a minimum cut.
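The relaxation claim can be verified numerically on an assumed toy network: every cut, encoded by its indicator variables, is feasible for the dual LP.

```python
from itertools import combinations

# Assumed toy network: edge (i, j) -> capacity, with s = 0, t = 3.
edges = {(0, 1): 3, (0, 2): 2, (1, 2): 1, (1, 3): 4, (2, 3): 3}
s, t, nodes = 0, 3, range(4)

def dual_feasible(y, z):
    """Check the three dual constraints for given (y, z)."""
    return (all(y[e] + z[e[0]] - z[e[1]] >= 0 for e in edges)
            and z[t] - z[s] == 1
            and all(y[e] >= 0 for e in edges))

# Enumerate all cuts S with s in S, t not in S; each one is dual feasible.
for k in range(3):
    for sub in combinations([1, 2], k):
        S = {s, *sub}
        z = {v: (0 if v in S else 1) for v in nodes}
        y = {e: (1 if e[0] in S and e[1] not in S else 0) for e in edges}
        assert dual_feasible(y, z)
print("all cuts are dual feasible")
```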

Complementary slackness conditions. $(x,v)$ and $(y,z)$ are optimal solutions to the primal and the dual, respectively, if and only if:

  1. $(x,v)$ is primal feasible;
  2. $(y,z)$ is dual feasible;
  3. $x_{ij}(y_{ij}+z_i-z_j)=0 \quad\forall ij\in E$;
  4. $y_e(c_e-x_e)=0 \quad\forall e\in E$.
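Conditions 3 and 4 can be checked directly on an assumed toy example (network, optimal flow, and minimum cut are all hard-coded below for illustration):

```python
# Assumed toy network with s = 0, t = 3: capacities c, an optimal flow x
# of value 5, and the minimum cut S = {0} (z_i = 0 on S, 1 on T).
c = {(0, 1): 3, (0, 2): 2, (1, 2): 1, (1, 3): 4, (2, 3): 3}
x = {(0, 1): 3, (0, 2): 2, (1, 2): 0, (1, 3): 3, (2, 3): 2}
z = {0: 0, 1: 1, 2: 1, 3: 1}
y = {(i, j): (1 if z[i] == 0 and z[j] == 1 else 0) for (i, j) in c}

# Condition 3: flow is positive only where the dual constraint is tight.
cond3 = all(x[i, j] * (y[i, j] + z[i] - z[j]) == 0 for (i, j) in c)
# Condition 4: the cut uses only saturated edges.
cond4 = all(y[e] * (c[e] - x[e]) == 0 for e in c)
print(cond3, cond4)  # True True
```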


Recovering an optimal max-flow solution from an optimal min-cut solution. Let $(y,z)$ encode a minimum cut $(S,T)$. Condition 4 forces every edge crossing the cut from $S$ to $T$ to be saturated: $y_e=1$ implies $x_e=c_e$. Condition 3 forces every edge crossing back from $T$ to $S$ to carry no flow: for such an edge $ij$ we have $y_{ij}+z_i-z_j = 0+1-0 > 0$, hence $x_{ij}=0$.

From this, we can recover the value of the maximum flow, which equals the cut capacity $c(S,T)$; the remaining flow values are then determined by the conservation constraints.
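The converse recovery is also instructive: given an optimal flow, a minimum cut can be read off as the set $S$ of nodes reachable from $s$ in the residual graph. A sketch on an assumed toy network, with an optimal flow (e.g. as produced by any max-flow algorithm) hard-coded:

```python
from collections import deque

# Assumed toy network: nodes 0 = s, 1, 2, 3 = t.
n, s = 4, 0
cap = [[0, 3, 2, 0],
       [0, 0, 1, 4],
       [0, 0, 0, 3],
       [0, 0, 0, 0]]
# An optimal flow of value 5 (negative entries encode reverse arcs).
flow = [[0, 3, 2, 0],
        [-3, 0, 0, 3],
        [-2, 0, 0, 2],
        [0, -3, -2, 0]]

# S = nodes reachable from s in the residual graph; (S, V \ S) is a min cut.
S, q = {s}, deque([s])
while q:
    u = q.popleft()
    for v in range(n):
        if v not in S and cap[u][v] - flow[u][v] > 0:
            S.add(v)
            q.append(v)

cut_capacity = sum(cap[i][j] for i in S for j in range(n) if j not in S)
print(S, cut_capacity)  # {0} 5
```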

