It is natural to compare the efficiency of two methods when their computational loads per iteration are equal. In this paper, two classes of contraction methods for monotone variational inequalities are studied in a unified framework. The methods of both classes can be viewed as prediction-correction methods: they generate the same test vector in the prediction step and adopt the same step-size rule in the correction step, differing only in the search direction they use. Consequently, the per-iteration computational loads of the two classes are equal. Our analysis explains theoretically why one class of contraction methods usually outperforms the other. It is demonstrated that many known methods belong to these two classes. Finally, numerical results are presented that confirm the validity of our analysis.
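To make the shared template concrete, here is a minimal sketch of one common instantiation of such a prediction-correction scheme (in the spirit of He-type projection-contraction methods). The abstract does not spell out the two search directions, so the specific choices d1 = (u − ũ) − β(F(u) − F(ũ)) and d2 = βF(ũ), the box constraint set, and all identifiers below are assumptions made for illustration, not the paper's definitive algorithm.

```python
import numpy as np

def project_box(v, lo, hi):
    """Euclidean projection onto the box [lo, hi] (componentwise clip)."""
    return np.clip(v, lo, hi)

def prediction_correction(F, project, u0, beta=0.05, gamma=1.8,
                          direction="I", tol=1e-8, max_iter=10000):
    """Generic prediction-correction contraction sketch for VI(Omega, F).

    Both classes share the prediction step (the test vector u_tilde) and
    the step-size rule; they differ only in the correction direction.
    beta is assumed small enough that beta * Lip(F) < 1.
    """
    u = np.asarray(u0, dtype=float).copy()
    for k in range(max_iter):
        Fu = F(u)
        u_tilde = project(u - beta * Fu)       # prediction: test vector
        e = u - u_tilde                        # projection residual
        if np.linalg.norm(e) < tol:
            return u, k
        Fut = F(u_tilde)
        d1 = e - beta * (Fu - Fut)             # class-I search direction
        alpha = e.dot(d1) / d1.dot(d1)         # shared step-size rule
        if direction == "I":
            u = u - gamma * alpha * d1                   # class-I correction
        else:
            u = project(u - gamma * alpha * beta * Fut)  # class-II correction
    return u, max_iter

# Illustrative use: affine monotone F(u) = M u + q on the box [0, 1]^n.
n = 4
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
M = A.T @ A + np.eye(n)                        # positive definite, so F is monotone
q = rng.standard_normal(n)
F = lambda u: M @ u + q
proj = lambda v: project_box(v, 0.0, 1.0)
for cls in ("I", "II"):
    u_star, iters = prediction_correction(F, proj, np.zeros(n), direction=cls)
    print(cls, iters, np.round(u_star, 4))
```

Note that both branches evaluate F twice per iteration (at u and at ũ) and perform at most one extra projection, so their per-iteration costs are essentially equal, which is the premise of the efficiency comparison described above.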