Preflow-push: Difference between revisions

From Algowiki
Revision as of 18:11, 26 October 2014

Abstract view

Algorithmic problem: max-flow problem (standard version)

Type of algorithm: loop.

Definition:

  1. For <math>a\in A</math>, let <math>f(a)</math> be a real number (not necessarily forming a flow). For <math>v\in V\setminus\{s,t\}</math>, the excess of <math>v</math> with respect to <math>f</math> is defined by <math>e_f(v):=\sum_{w:(w,v)\in A}f(w,v)-\sum_{w:(v,w)\in A}f(v,w)</math>.
  2. A node <math>v\in V\setminus\{s,t\}</math> is called active if its excess is positive.
  3. Let <math>\Delta</math> denote the sum of the distance labels <math>d(v)</math> of all active nodes <math>v\in V\setminus\{s,t\}</math>.
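In code, the excess is simply inflow minus outflow. A minimal Python sketch, assuming the arc values <math>f</math> are represented as a dictionary from arcs to numbers (the representation and all names are illustrative, not from the article):

```python
def excess(f, v):
    """e_f(v): flow into v minus flow out of v, where f is a dict
    mapping arcs (a, b) to real numbers."""
    inflow = sum(val for (a, b), val in f.items() if b == v)
    outflow = sum(val for (a, b), val in f.items() if a == v)
    return inflow - outflow

# Example: 3 units enter v from s and 1 unit leaves v towards t,
# so v has excess 2 and is therefore active.
f = {("s", "v"): 3, ("v", "t"): 1}
```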

Invariant: Before and after each iteration:

  1. For each arc <math>a\in A</math>, we have <math>0\leq f(a)\leq u(a)</math>.
  2. For each node <math>v\in V\setminus\{s,t\}</math>, we have <math>e_f(v)\geq 0</math>.
  3. The node labels <math>d</math> form a valid distance labeling, and <math>d(s)=|V|</math>.
  4. The currently active nodes are stored in a set <math>S</math>.

Variant: At least one of the following events occurs:

  1. The label <math>d(v)</math> of at least one node <math>v\in V\setminus\{s,t\}</math> is increased.
  2. A saturating push is performed.
  3. The value of <math>\Delta</math> decreases (in case of a non-saturating push).

Break condition: <math>S=\emptyset</math>.

Induction basis

Abstract view:

  1. For all arcs <math>a\in A</math>, set <math>f(a):=0</math>.
  2. For each arc <math>(s,v)\in A</math>, overwrite this value by <math>f(s,v):=u(s,v)</math>.
  3. Compute a valid distance labeling <math>d</math>, for example, the true distances from all nodes to <math>t</math> in the residual network.
  4. Set <math>d(s):=|V|</math>.

Proof: Obvious.
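The basis can be sketched in Python. This is a hedged sketch assuming a node list, an arc list, and a capacity dictionary; the exact distances to <math>t</math> are computed by backward BFS in the residual network (all identifiers are illustrative):

```python
from collections import deque

def initialize(nodes, arcs, u, s, t):
    """Induction basis: f := 0 everywhere, then saturate every arc
    leaving s; d := exact distances to t in the residual network;
    finally d(s) := |V|."""
    n = len(nodes)
    f = {a: 0 for a in arcs}
    for (v, w) in arcs:
        if v == s:
            f[(v, w)] = u[(v, w)]          # overwrite f(s,w) := u(s,w)
    # Reverse adjacency of the residual network: arc (v,w) is residual
    # while f(v,w) < u(v,w); the reverse arc (w,v) while f(v,w) > 0.
    rev = {v: [] for v in nodes}
    for (v, w) in arcs:
        if f[(v, w)] < u[(v, w)]:
            rev[w].append(v)               # residual arc v -> w
        if f[(v, w)] > 0:
            rev[v].append(w)               # residual arc w -> v
    # Backward BFS from t gives d(v) = dist(v, t); nodes that cannot
    # reach t keep label n, which is still a valid labeling.
    d = {v: n for v in nodes}
    d[t] = 0
    queue = deque([t])
    while queue:
        w = queue.popleft()
        for v in rev[w]:
            if v != t and d[v] == n:
                d[v] = d[w] + 1
                queue.append(v)
    d[s] = n                               # step 4: d(s) := |V|
    return f, d

# Example: a path s -> v -> t; the arc (s, v) gets saturated.
nodes = ["s", "v", "t"]
arcs = [("s", "v"), ("v", "t")]
u = {("s", "v"): 2, ("v", "t"): 1}
f0, d0 = initialize(nodes, arcs, u, "s", "t")
```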

Induction step

Abstract view:

  1. Choose an active node <math>v</math> from <math>S</math>.
  2. If there is at least one [[Basic flow definitions#Valid distance labeling|admissible arc]] leaving <math>v</math>:
    1. Choose one such arc <math>(v,w)</math>, say.
    2. If <math>e_f(w)=0</math>, insert <math>w</math> in <math>S</math>.
    3. Increase the flow over <math>(v,w)</math> by the minimum of <math>e_f(v)</math> and the residual capacity of <math>(v,w)</math>.
    4. If <math>e_f(v)=0</math> now, extract <math>v</math> from <math>S</math>.
  3. Otherwise: increase <math>d(v)</math> to one more than the minimum label <math>d(w)</math> over all arcs <math>(v,w)</math> in the residual network.

Proof: Obvious.
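Putting basis and step together, the whole loop can be sketched in Python. This is a sketch, not a reference implementation: it picks active nodes in FIFO order and starts from the simplest valid labeling (<math>d(v)=0</math> for <math>v\neq s</math>, <math>d(s)=|V|</math>) rather than exact distances; all identifiers are illustrative:

```python
from collections import defaultdict, deque

def max_flow_preflow_push(n, arcs, u, s, t):
    """Preflow-push sketch: nodes are 0..n-1, arcs is a list of pairs,
    u maps arcs to capacities. Returns the value of a maximum flow."""
    cap = defaultdict(int)        # residual capacities
    adj = defaultdict(set)        # neighbors in either direction
    for (v, w) in arcs:
        cap[(v, w)] += u[(v, w)]
        adj[v].add(w)
        adj[w].add(v)
    excess = defaultdict(int)
    d = {v: 0 for v in range(n)}  # simplest valid distance labeling
    d[s] = n
    # Induction basis: saturate every arc leaving s.
    for w in adj[s]:
        delta = cap[(s, w)]
        cap[(s, w)] -= delta
        cap[(w, s)] += delta
        excess[w] += delta
    S = deque(w for w in adj[s] if excess[w] > 0 and w != t)
    while S:                      # break condition: S is empty
        v = S[0]
        pushed = False
        for w in adj[v]:
            # Admissible arc: residual capacity left and d(v) = d(w) + 1.
            if cap[(v, w)] > 0 and d[v] == d[w] + 1:
                if excess[w] == 0 and w not in (s, t):
                    S.append(w)   # w becomes active
                delta = min(excess[v], cap[(v, w)])
                cap[(v, w)] -= delta
                cap[(w, v)] += delta
                excess[v] -= delta
                excess[w] += delta
                if excess[v] == 0:
                    S.popleft()   # v is no longer active
                pushed = True
                break
        if not pushed:
            # Relabel: one more than the minimum label over residual arcs;
            # v has positive excess, so at least one residual arc leaves it.
            d[v] = 1 + min(d[w] for w in adj[v] if cap[(v, w)] > 0)
    return excess[t]

# Example: a four-node network whose maximum flow value is 5.
arcs = [(0, 1), (0, 2), (1, 3), (2, 3), (1, 2)]
u = {(0, 1): 3, (0, 2): 2, (1, 3): 2, (2, 3): 3, (1, 2): 1}
```

At termination <math>S=\emptyset</math>, so no node has positive excess and the preflow is a flow; by the validity of the labeling it is a maximum flow.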