Preflow-push
Revision as of 19:34, 26 October 2014
Abstract view
Algorithmic problem: max-flow problem (standard version)
Type of algorithm: loop.
Definition:
- For [math]\displaystyle{ a\in A }[/math], let [math]\displaystyle{ f(a) }[/math] be a real number (not necessarily forming a flow). For [math]\displaystyle{ v\in V\setminus\{s,t\} }[/math], the excess of [math]\displaystyle{ v }[/math] with respect to [math]\displaystyle{ f }[/math] is defined by [math]\displaystyle{ e_f(v):=\sum_{w:(w,v)\in A}f(w,v)-\sum_{w:(v,w)\in A}f(v,w) }[/math].
- A node [math]\displaystyle{ v\in V\setminus\{s,t\} }[/math] is called active if its excess is positive.
- Let [math]\displaystyle{ \Delta }[/math] denote the sum of the distance labels [math]\displaystyle{ d(v) }[/math] of all active nodes [math]\displaystyle{ v\in V\setminus\{s,t\} }[/math].
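The definition of excess can be sketched in a few lines of Python; representing the (pre)flow as a dict keyed by arcs is an illustrative assumption, not part of the article.

```python
# Sketch of the excess e_f(v), assuming the (pre)flow f is given as a
# dict mapping each arc (x, y) to its flow value.

def excess(f, v):
    """e_f(v) = total inflow minus total outflow at v."""
    inflow = sum(val for (x, y), val in f.items() if y == v)
    outflow = sum(val for (x, y), val in f.items() if x == v)
    return inflow - outflow

# Example: 3 units enter v and 1 unit leaves, so e_f(v) = 2 and v is active.
f = {('s', 'v'): 3, ('v', 't'): 1}
```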
Invariant: Before and after each iteration:
- For each arc [math]\displaystyle{ a\in A }[/math], we have [math]\displaystyle{ 0\leq f(a)\leq u(a) }[/math].
- For each node [math]\displaystyle{ v\in V }[/math], we have [math]\displaystyle{ e_f(v)\geq 0 }[/math].
- The node labels [math]\displaystyle{ d }[/math] form a valid distance labeling, and [math]\displaystyle{ d(s)=|V| }[/math].
- The currently active nodes are stored in a set [math]\displaystyle{ S }[/math].
Variant: In each iteration, at least one of the following cases will take place:
- The label [math]\displaystyle{ d(v) }[/math] of at least one node [math]\displaystyle{ v\in V\setminus\{s,t\} }[/math] is increased.
- A saturating push is performed.
- The value of [math]\displaystyle{ \Delta }[/math] decreases (in case of a non-saturating push).
Break condition: [math]\displaystyle{ S=\emptyset }[/math].
Induction basis
Abstract view:
- For all arcs [math]\displaystyle{ a\in A }[/math], set [math]\displaystyle{ f(a):=0 }[/math].
- For each arc [math]\displaystyle{ (s,v)\in A }[/math], overwrite this value by [math]\displaystyle{ f(s,v):=u(s,v) }[/math].
- Compute a valid distance labeling [math]\displaystyle{ d }[/math], for example, the true distances from all nodes to [math]\displaystyle{ t }[/math] in the residual network.
- Set [math]\displaystyle{ d(s):=n }[/math].
Proof: Obvious.
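The induction basis can be sketched as follows; the graph representation (node list `V`, capacity dict `u`) and the backward BFS are illustrative choices consistent with the steps above, not the article's own notation.

```python
from collections import deque

def initialize(V, u, s, t):
    """Induction basis: zero flow, saturate all arcs leaving s, compute the
    true residual distances to t by backward BFS, then set d(s) := n."""
    n = len(V)
    f = {a: (u[a] if a[0] == s else 0) for a in u}
    # Residual predecessors: (v, w) is a residual arc iff f(v, w) < u(v, w);
    # the reverse arc (w, v) is residual iff f(v, w) > 0.
    pred = {v: set() for v in V}
    for (v, w), cap in u.items():
        if f[(v, w)] < cap:
            pred[w].add(v)
        if f[(v, w)] > 0:
            pred[v].add(w)
    d = {v: n for v in V}          # nodes that cannot reach t keep label n
    d[t] = 0
    queue = deque([t])
    while queue:
        w = queue.popleft()
        for v in pred[w]:
            if d[v] == n:
                d[v] = d[w] + 1
                queue.append(v)
    d[s] = n                       # overwrite: d(s) = |V|
    return f, d
```

On the two-arc example with u(s,a)=2 and u(a,t)=1, the arc (s,a) is saturated, a gets label 1 (one residual arc from t), and s gets label n=3.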
Induction step
Abstract view:
- Choose an active node [math]\displaystyle{ v }[/math] from [math]\displaystyle{ S }[/math].
- If there is at least one admissible arc leaving [math]\displaystyle{ v }[/math]:
- Choose one such arc [math]\displaystyle{ (v,w) }[/math], say.
- If [math]\displaystyle{ w\neq s }[/math] and [math]\displaystyle{ e_f(w)=0 }[/math], insert [math]\displaystyle{ w }[/math] in [math]\displaystyle{ S }[/math].
- Increase the flow over [math]\displaystyle{ (v,w) }[/math] by the minimum of [math]\displaystyle{ e_f(v) }[/math] and the residual capacity of [math]\displaystyle{ (v,w) }[/math].
- If [math]\displaystyle{ e_f(v)=0 }[/math] now, extract [math]\displaystyle{ v }[/math] from [math]\displaystyle{ S }[/math].
- Otherwise: increase [math]\displaystyle{ d(v) }[/math] to one more than the minimal label [math]\displaystyle{ d(w) }[/math] over all arcs [math]\displaystyle{ (v,w) }[/math] leaving [math]\displaystyle{ v }[/math] in the residual network.
Remark: The preflow-push algorithm is also known as the push-relabel algorithm. The push operation is step 2.2; the relabel operation is step 3.
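The push (step 2.2) and relabel (step 3) operations can be sketched as follows. Keeping the residual network directly as a dict `r` of residual capacities is an illustrative simplification: `r[(v, w)]` combines the unused capacity of the forward arc and the flow on the reverse arc.

```python
def push(r, e, v, w):
    """Step 2.2: send min(e_f(v), residual capacity of (v, w)) over (v, w)."""
    delta = min(e[v], r[(v, w)])
    r[(v, w)] -= delta                      # consume residual capacity
    r[(w, v)] = r.get((w, v), 0) + delta    # the reverse arc gains capacity
    e[v] -= delta
    e[w] += delta

def relabel(r, d, v):
    """Step 3: raise d(v) to one more than the minimum label d(w) over all
    residual arcs (v, w) leaving v -- the largest increase that keeps the
    distance labeling valid."""
    d[v] = 1 + min(d[w] for (x, w), cap in r.items() if x == v and cap > 0)
```

A push is saturating when `delta` equals the residual capacity of `(v, w)`, and non-saturating when it equals `e[v]` (then `v` becomes inactive).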
Proof: Points 1, 2, and 4 of the invariant are obviously fulfilled. Point 3 of the invariant is affected by step 3 only; there, [math]\displaystyle{ d(v) }[/math] is raised exactly to the largest value that still satisfies [math]\displaystyle{ d(v)\leq d(w)+1 }[/math] for every residual arc [math]\displaystyle{ (v,w) }[/math], so point 3 is preserved.
To prove the variant, consider a step in which step 2.2 applies, but the arc [math]\displaystyle{ (v,w) }[/math] is not saturated by that. Potentially, [math]\displaystyle{ w }[/math] becomes active. However, [math]\displaystyle{ v }[/math] definitely becomes inactive. Since [math]\displaystyle{ d(w)=d(v)-1 }[/math] for the admissible arc [math]\displaystyle{ (v,w) }[/math], the sum [math]\displaystyle{ \Delta }[/math] decreases by at least one, which proves the variant.
It remains to show termination; this is proved by the following considerations.
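Putting the loop together, a compact self-contained sketch of the whole algorithm might look as follows. For brevity it starts from the all-zero labeling rather than the BFS labeling of the induction basis (both are valid distance labelings once all arcs leaving [math]\displaystyle{ s }[/math] are saturated); the graph representation and variable names are illustrative.

```python
from collections import deque

def max_flow(V, u, s, t):
    """Preflow-push sketch: returns the max-flow value from s to t.
    V: node list; u: dict mapping arcs (v, w) to capacities."""
    n = len(V)
    r = dict(u)                         # residual capacities
    d = {v: 0 for v in V}               # all-zero labels are valid initially
    d[s] = n                            # d(s) = |V|
    e = {v: 0 for v in V}
    S = deque()                         # the currently active nodes
    for (v, w), cap in u.items():       # induction basis: saturate arcs at s
        if v == s and cap > 0:
            r[(v, w)] = 0
            r[(w, v)] = r.get((w, v), 0) + cap
            e[w] += cap
            if w != t:
                S.append(w)
    while S:                            # break condition: S is empty
        v = S[0]
        pushed = False
        for w in V:
            cap = r.get((v, w), 0)
            if cap > 0 and d[v] == d[w] + 1:        # admissible arc
                delta = min(e[v], cap)              # push (step 2.2)
                r[(v, w)] -= delta
                r[(w, v)] = r.get((w, v), 0) + delta
                e[v] -= delta
                e[w] += delta
                if w not in (s, t) and e[w] == delta:
                    S.append(w)                     # w just became active
                pushed = True
                if e[v] == 0:
                    S.popleft()                     # v is inactive now
                    break
        if not pushed:                  # relabel (step 3): no admissible arc
            d[v] = 1 + min(d[w] for w in V if r.get((v, w), 0) > 0)
    return e[t]                         # excess at t equals the flow value
```

On the four-node example with capacities u(s,a)=3, u(s,b)=2, u(a,b)=1, u(a,t)=2, u(b,t)=2 this returns 4, matching the capacity of the cut formed by the two arcs into t.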
Complexity
Statement: The asymptotic complexity is in [math]\displaystyle{ \mathcal{O}(n^2m) }[/math], where [math]\displaystyle{ n=|V| }[/math] and [math]\displaystyle{ m=|A| }[/math].
Proof: First we show that the total number of relabels (step 3 of the main loop) is in [math]\displaystyle{ \mathcal{O}(n^2) }[/math]. To see that, let [math]\displaystyle{ v\in V\setminus\{s,t\} }[/math] be an active node before/after an iteration of the main loop.