
Low-Rank Tensor Completion via Novel Sparsity-Inducing Regularizers

Wang, Zhi-Yong ; So, Hing Cheung ; Zoubir, Abdelhak M. (2024)
Low-Rank Tensor Completion via Novel Sparsity-Inducing Regularizers.
In: IEEE Transactions on Signal Processing
doi: 10.1109/TSP.2024.3424272
Article, Bibliography

Abstract

To alleviate the bias generated by the ℓ1-norm in the low-rank tensor completion problem, nonconvex surrogates/regularizers have been suggested to replace the tensor nuclear norm, although both can achieve sparsity. However, the thresholding functions of these nonconvex regularizers may not have closed-form expressions and thus iterations are needed, which implies a high computational load. To solve this issue, we devise a framework to generate sparsity-inducing regularizers with closed-form thresholding functions. These regularizers are applied to low-tubal-rank tensor completion, and efficient algorithms based on the alternating direction method of multipliers are developed. Furthermore, convergence of our methods is analyzed and it is proved that the generated sequences are bounded and converge to a stationary point. Experimental results using synthetic and real-world datasets show that the proposed algorithms outperform the state-of-the-art methods in terms of restoration performance.
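To make the idea concrete for readers of this record, the following is a minimal sketch in Python/NumPy, not the paper's algorithm: the specific regularizers proposed in the paper are not reproduced here. It contrasts ℓ1 soft thresholding with the firm thresholding rule of the minimax concave penalty, a well-known nonconvex regularizer that also admits a closed-form thresholding function, and shows how such a rule can be applied slice-wise in the Fourier domain in a tensor singular value thresholding step of the kind used inside ADMM solvers for low-tubal-rank completion. All function names and parameters are illustrative.

import numpy as np

def soft_threshold(x, lam):
    # Closed-form thresholding function of the l1 penalty (baseline).
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def firm_threshold(x, lam, gamma=2.0):
    # Closed-form thresholding function of the minimax concave penalty
    # (requires gamma > 1). Used here only as a stand-in for a nonconvex
    # regularizer whose proximal step needs no inner iterations; it is not
    # one of the regularizers proposed in the paper.
    return np.where(np.abs(x) <= gamma * lam,
                    gamma / (gamma - 1.0) * soft_threshold(x, lam),
                    x)

def tensor_svt(Y, lam, threshold=soft_threshold):
    # Generic tensor singular value thresholding for an n1 x n2 x n3 array:
    # FFT along the third mode, threshold the singular values of each
    # frontal slice with the supplied closed-form rule, then transform back.
    # A step of this form appears inside ADMM iterations for
    # low-tubal-rank tensor completion.
    Yf = np.fft.fft(Y, axis=2)
    Xf = np.empty_like(Yf)
    for k in range(Y.shape[2]):
        U, s, Vh = np.linalg.svd(Yf[:, :, k], full_matrices=False)
        Xf[:, :, k] = (U * threshold(s, lam)) @ Vh
    return np.real(np.fft.ifft(Xf, axis=2))

Swapping soft_threshold for firm_threshold (or any other rule with an explicit expression) changes the implied penalty on the tubal singular values without introducing inner iterations, which is the computational advantage the abstract refers to.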

Item type: Article
Published: 2024
Author(s): Wang, Zhi-Yong ; So, Hing Cheung ; Zoubir, Abdelhak M.
Type of entry: Bibliography
Title: Low-Rank Tensor Completion via Novel Sparsity-Inducing Regularizers
Language: English
Date of publication: 11 July 2024
Publisher: IEEE
Journal or publication title: IEEE Transactions on Signal Processing
Collation: 17 pages
DOI: 10.1109/TSP.2024.3424272
Related links:

Additional information:

Early Access

Department(s)/Field(s): 18 Department of Electrical Engineering and Information Technology
18 Department of Electrical Engineering and Information Technology > Institute of Telecommunications
18 Department of Electrical Engineering and Information Technology > Institute of Telecommunications > Signal Processing
Date deposited: 17 Jul 2024 12:24
Last modified: 17 Jul 2024 12:24
PPN: