Distributed Quantile Regression with Non-Convex Sparse Penalties
Norwegian Univ Sci & Technol NTNU, Norway.
Norwegian Univ Sci & Technol NTNU, Norway.
Linköping University, Department of Science and Technology, Physics, Electronics and Mathematics. Linköping University, Faculty of Science & Engineering. ORCID iD: 0000-0001-8145-7392
Norwegian Univ Sci & Technol NTNU, Norway.
2023 (English). In: 2023 IEEE Statistical Signal Processing Workshop (SSP), IEEE, 2023, p. 250-254. Conference paper, Published paper (Refereed)
Abstract [en]

The surge in data generated by IoT sensors has increased the need for scalable and efficient data analysis methods, particularly for robust algorithms like quantile regression, which can be tailored to a variety of situations, including nonlinear relationships, heavy-tailed distributions, and outliers. This paper presents a sub-gradient-based algorithm for distributed quantile regression with non-convex and non-smooth sparse penalties such as the Minimax Concave Penalty (MCP) and Smoothly Clipped Absolute Deviation (SCAD). These penalties selectively shrink non-active coefficients towards zero, addressing the limitations of traditional penalties like the ℓ1-penalty in sparse models. Existing quantile regression algorithms with non-convex penalties are designed for centralized settings, whereas our proposed method applies to distributed quantile regression with non-convex penalties, thereby improving estimation accuracy. We provide a convergence proof for our proposed algorithm and demonstrate through numerical simulations that it outperforms state-of-the-art algorithms in sparse and moderately sparse scenarios.
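To make the ingredients of the abstract concrete, the following is a minimal, centralized sketch of penalized quantile regression solved by subgradient descent: the pinball (quantile) loss combined with the SCAD penalty, using the standard SCAD derivative with shape parameter a = 3.7 and a diminishing step size. This is an illustrative assumption-laden sketch, not the paper's distributed algorithm; all function names and parameter choices here are hypothetical.

```python
import numpy as np

def pinball_subgrad(r, tau):
    """Subgradient of the pinball loss rho_tau w.r.t. the residual r = y - Xw.

    For r > 0 the derivative is tau; for r < 0 it is tau - 1
    (at r = 0 any value in [tau - 1, tau] is a valid subgradient).
    """
    return np.where(r > 0, tau, tau - 1.0)

def scad_deriv(w, lam, a=3.7):
    """Derivative of the SCAD penalty (a = 3.7 is the conventional choice).

    Acts like the l1 penalty (slope lam) near zero, tapers off linearly,
    and applies no shrinkage for |w| > a * lam, which reduces bias on
    large coefficients compared with the l1 penalty.
    """
    aw = np.abs(w)
    d = np.where(aw <= lam, lam,
                 np.where(aw <= a * lam, (a * lam - aw) / (a - 1.0), 0.0))
    return d * np.sign(w)

def qr_scad_subgradient(X, y, tau=0.5, lam=0.1, steps=5000, lr=0.5):
    """Centralized subgradient descent on pinball loss + SCAD penalty."""
    n, p = X.shape
    w = np.zeros(p)
    for t in range(steps):
        r = y - X @ w
        # Subgradient of (1/n) * sum_i rho_tau(y_i - x_i^T w) w.r.t. w
        g_loss = -(X.T @ pinball_subgrad(r, tau)) / n
        g = g_loss + scad_deriv(w, lam)
        w -= (lr / np.sqrt(t + 1.0)) * g  # diminishing step size
    return w

# Toy sparse problem: only coefficients 0 and 3 are active.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0])
y = X @ w_true + rng.normal(size=200)
w_hat = qr_scad_subgradient(X, y, tau=0.5, lam=0.1)
```

The distributed version in the paper would replace the centralized subgradient step with local updates exchanged across a sensor network; the weak convexity of SCAD and MCP (concave penalty plus a quadratic) is what the convergence analysis leans on.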

Place, publisher, year, edition, pages
IEEE, 2023, p. 250-254
Series
IEEE/SP Workshop on Statistical Signal Processing, ISSN 2373-0803
Keywords [en]
Distributed learning; quantile regression; non-convex and non-smooth penalties; weak convexity; sparse learning
National Category
Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:liu:diva-198533
DOI: 10.1109/SSP53291.2023.10208080
ISI: 001051091700051
ISBN: 9781665452458 (electronic)
ISBN: 9781665452465 (print)
OAI: oai:DiVA.org:liu-198533
DiVA id: diva2:1805601
Conference
22nd IEEE Statistical Signal Processing Workshop (SSP), VNU University of Engineering & Technology, Hanoi, Vietnam, July 2-5, 2023
Note

Funding Agencies|Research Council of Norway

Available from: 2023-10-17 Created: 2023-10-17 Last updated: 2023-10-17

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text

Search in DiVA

By author/editor
Kumar Dasanadoddi Venkategowda, Naveen
By organisation
Physics, Electronics and Mathematics; Faculty of Science & Engineering