
Details

Author(s) / Contributors
Title
Bayesian variable selection and estimation in maximum entropy quantile regression
Is part of
  • Journal of applied statistics, 2017-01, Vol.44 (2), p.253-269
Place / Publisher
Abingdon: Taylor & Francis
Year of publication
2017
Source
Taylor & Francis Journals Auto-Holdings Collection
Descriptions/Notes
  • Quantile regression has gained increasing popularity as it provides richer information than the regular mean regression, and variable selection plays an important role in the quantile regression model building process, as it improves the prediction accuracy by choosing an appropriate subset of regression predictors. Unlike the traditional quantile regression, we consider the quantile as an unknown parameter and estimate it jointly with other regression coefficients. In particular, we adopt the Bayesian adaptive Lasso for the maximum entropy quantile regression. A flat prior is chosen for the quantile parameter due to the lack of information on it. The proposed method not only addresses the problem of which quantile would be the most probable one among all the candidates, but also reflects the inner relationship of the data through the estimated quantile. We develop an efficient Gibbs sampler algorithm and show that the performance of our proposed method is superior to the Bayesian adaptive Lasso and Bayesian Lasso through simulation studies and a real data analysis.
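The abstract describes treating the quantile as an unknown parameter with a flat prior and estimating it jointly with the regression coefficients under a Bayesian adaptive Lasso. The sketch below illustrates the general idea only, not the authors' method: it uses the asymmetric Laplace working likelihood common in Bayesian quantile regression, a plain Laplace (Lasso-type) shrinkage prior in place of the adaptive Lasso, and a simple random-walk Metropolis sampler rather than the paper's Gibbs sampler; the data, variable names, and tuning constants are assumptions chosen for illustration.

```python
# Illustrative sketch (not the authors' algorithm): joint sampling of the
# regression coefficients beta and the unknown quantile tau, with an
# asymmetric Laplace working likelihood, a Laplace (Lasso-type) prior on
# beta, and a flat prior on tau over (0, 1).
import numpy as np

rng = np.random.default_rng(0)

# Simulated data with a sparse coefficient vector (two active predictors).
n, p = 200, 5
X = rng.normal(size=(n, p))
beta_true = np.array([1.5, 0.0, -2.0, 0.0, 0.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

def check_loss(u, tau):
    """Quantile (check) loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def log_post(beta, tau, lam=1.0):
    """Log posterior up to a constant: asymmetric Laplace log-likelihood
    plus Laplace(lam) shrinkage prior on beta, flat prior on tau in (0, 1)."""
    if not (0.0 < tau < 1.0):
        return -np.inf
    resid = y - X @ beta
    loglik = n * np.log(tau * (1 - tau)) - np.sum(check_loss(resid, tau))
    logprior = -lam * np.sum(np.abs(beta))
    return loglik + logprior

# Random-walk Metropolis over (beta, tau).
beta, tau = np.zeros(p), 0.5
lp = log_post(beta, tau)
samples = []
for it in range(20000):
    beta_prop = beta + rng.normal(scale=0.05, size=p)
    tau_prop = tau + rng.normal(scale=0.02)
    lp_prop = log_post(beta_prop, tau_prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        beta, tau, lp = beta_prop, tau_prop, lp_prop
    if it >= 5000:                      # discard burn-in
        samples.append(np.append(beta, tau))

samples = np.array(samples)
print("posterior mean beta:", samples[:, :p].mean(axis=0).round(2))
print("posterior mean tau :", samples[:, p].mean().round(2))
```

With the symmetric simulated errors used here, the posterior for tau would be expected to concentrate near 0.5; the point made in the abstract is that the jointly estimated quantile indicates which quantile is most probable for the data at hand while the shrinkage prior performs variable selection.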
