An efficient conjugate gradient method for nonsmooth optimization problems and its application in image restoration

Article Type: Research Article

Authors

  • Atefe Bay
  • Zohreh Akbari

Department of Applied Mathematics, Faculty of Mathematical Sciences, University of Mazandaran, Babolsar, Iran.

Abstract

In this paper, an efficient conjugate gradient method is introduced for solving unconstrained nonsmooth optimization problems. Conjugate gradient methods are among the most popular methods for smooth optimization because of their simplicity and low memory requirements; however, their use on nonsmooth problems has received far less attention. To address this, the Lipschitz continuous objective function is first converted into a smooth function by means of the Moreau–Yosida regularization. A new descent direction is then proposed by combining first-order information of the smoothed function with the previous descent direction. Using an inexact line search, the generated direction is shown to satisfy the sufficient descent condition, and the new iterate is shown to lie in a suitable trust region around the previous iterate. Global convergence of the proposed method is established under standard assumptions. Finally, the method is evaluated on image restoration problems, and the results show that it outperforms existing methods.
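Since the abstract does not reproduce the underlying formulas, the sketch below records the standard Moreau–Yosida regularization used in the smoothing step and the generic shape of a conjugate gradient direction built from its gradient. The specific direction, conjugacy parameter, and line search rule of the proposed method are defined only in the full paper, so the displays below are an illustrative outline, not the authors' exact scheme.

For a proper, lower semicontinuous convex function f and a parameter \lambda > 0, the Moreau–Yosida regularization and its gradient are

    F_\lambda(x) = \min_{y \in \mathbb{R}^n} \Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\}, \qquad
    \nabla F_\lambda(x) = \frac{x - p_\lambda(x)}{\lambda},

where p_\lambda(x) is the unique minimizer of the inner problem (the proximal point of x); F_\lambda is continuously differentiable with Lipschitz gradient and has the same minimizers as f. A conjugate-gradient-type method applied to F_\lambda then generates iterates of the generic form

    x_{k+1} = x_k + t_k d_k, \qquad
    d_k = \begin{cases} -g_k, & k = 0, \\ -g_k + \beta_k d_{k-1}, & k \ge 1, \end{cases} \qquad g_k := \nabla F_\lambda(x_k),

with t_k chosen by an inexact line search and \beta_k a conjugacy parameter. The sufficient descent and trust-region properties referred to in the abstract are typically stated as

    g_k^{\top} d_k \le -c_1 \|g_k\|^2 \quad \text{and} \quad \|d_k\| \le c_2 \|g_k\|

for positive constants c_1 and c_2; the second bound controls the step length \|x_{k+1} - x_k\| and thus keeps the new iterate in a ball (trust region) around the previous one.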


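As a concrete example of the application area, image restoration is commonly posed as a nonsmooth minimization problem of the following generic form (given here only for illustration; the exact experimental model used in the paper is described in the full text):

    \min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\,\|Ax - b\|_2^2 + \mu\,\|x\|_1,

where x is the vectorized image to be recovered, A models blurring or subsampling, b is the observed degraded image, and \mu > 0 balances data fidelity against the regularizer. The \ell_1 term is convex and Lipschitz continuous but not differentiable, which is precisely the kind of nonsmoothness the smoothing conjugate gradient approach described above is intended to handle.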

Keywords

  • Conjugate gradient method
  • Nonsmooth optimization
  • Moreau–Yosida regularization
  • Global convergence
  • Image restoration