References
- Koenker R. Quantile regression. New York (NY): Cambridge University Press; 2005.
- Efron B. Regression percentiles using asymmetric squared error loss. Stat Sin. 1991;1:93–125.
- Newey W, Powell JL. Asymmetric least squares estimation and testing. Econometrica. 1987;55(4):819–847.
- Friedman JH. Greedy function approximation: a gradient boosting machine. Ann Stat. 2001;29(5):1189–1232.
- Sobotka F, Kneib T. Geoadditive expectile regression. Comput Stat Data Anal. 2012;56(4):755–767.
- Yang Y, Zou H. Nonparametric multiple expectile regression via ER-Boost. J Stat Comput Simul. 2015;85(7):1442–1458.
- Waldmann E, Sobotka F, Kneib T. Bayesian regularisation in geoadditive expectile regression. Stat Comput. 2017;27(6):1539–1553.
- Schnabel SK, Eilers PHC. Optimal expectile smoothing. Comput Stat Data Anal. 2009;53(12):4168–4177.
- Vapnik V. Statistical learning theory. New York: John Wiley; 1998.
- Farooq M, Steinwart I. An SVM-like approach for expectile regression. Comput Stat Data Anal. 2017;109:159–181.
- Yang Y, Zhang T, Zou H. Flexible expectile regression in reproducing kernel Hilbert space. Technometrics. 2018;60(1):26–35.
- Osuna E, Freund R, Girosi F. An improved training algorithm for support vector machines. In: Proceedings of the IEEE Workshop on Neural Networks for Signal Processing; 1997. p. 276–285.
- Osuna E, Freund R, Girosi F. Training support vector machines: an application to face detection. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 1997.
- Platt J. Fast training of support vector machines using sequential minimal optimization. In: Schölkopf B, Burges C, Smola A, editors. Advances in kernel methods – support vector learning. Cambridge (MA): MIT Press; 1998.
- Zheng S. KLERC: kernel lagrangian expectile regression calculator. Comput Stat. 2021;36(1):283–311.
- Choi K-L, Shim J, Seok K. Support vector expectile regression using IRWLS procedure. J Korean Data Info Sci Soc. 2014;25(4):931–939.
- Rahimi A, Recht B. Random features for large-scale kernel machines. In: Proceedings of Advances in Neural Information Processing Systems; 2007.
- Chitta R, Jin R, Jain AK. Efficient kernel clustering using random Fourier features. In: Proceedings of the IEEE Int'l Conf. on Data Mining; 2012.
- Zheng S. Speeding up L2-loss support vector regression by random Fourier features. Commun Stat Simul Comput. 2022. Available from: https://doi.org/10.1080/03610918.2022.2037638
- Avron H, Kapralov M, Musco C, et al. Random Fourier features for kernel ridge regression: approximation bounds and statistical guarantees. In: Proceedings of the Int'l Conf. on Machine Learning; 2017.
- Lopez-Paz D, Sra S, Smola A, et al. Randomized nonlinear component analysis. In: Proceedings of the 31st Int'l Conf. on Machine Learning; 2014.
- Cutajar K, Bonilla EV, Michiardi P, et al. Practical learning of deep Gaussian processes via random Fourier features. Stat. 2016;1050:14.
- Kapushev Y, Kishkun A, Ferrer G, et al. Random Fourier features based SLAM. arXiv:2011.00594. 2021.
- Huang P-S, Deng L, Hasegawa-Johnson M, et al. Random features for kernel deep convex network. In: Proceedings of the IEEE Int'l Conf. on Acoustics, Speech, and Signal Processing; 2013.
- Mehrkanoon S, Suykens JAK. Deep hybrid neural-kernel networks using random Fourier features. Neurocomputing. 2018;298:46–54.
- Francis DP, Raimond K. Major advancements in kernel function approximation. Artif Intell Rev. 2021;54(2):843–876.
- Li Z, Ton J-F, Oglic D, et al. Towards a unified analysis of random Fourier features. In: Proceedings of the 36th Int'l Conf. on Machine Learning; Long Beach; 2019.
- Liu F, Huang X, Chen Y, et al. Random features for kernel approximation: a survey on algorithms, theory, and beyond. arXiv preprint arXiv:2004.11154v3. 2020.
- Mangasarian OL, Musicant DR. Successive overrelaxation for support vector machines. IEEE Trans Neural Networks. 1999;10(5):1032–1037.
- Mangasarian OL, Musicant DR. Lagrangian support vector machines. J Mach Learn Res. 2001;1:161–177.
- Musicant DR, Feinberg A. Active set support vector regression. IEEE Trans Neural Networks. 2004;15(2):268–275.
- Kutner MH, Nachtsheim CJ, Neter J, et al. Applied linear statistical models. New York (NY): McGraw-Hill; 2004.
- Kimeldorf GS, Wahba G. A correspondence between Bayesian estimation on stochastic processes and smoothing by splines. Ann Math Stat. 1970;41(2):495–502.
- Rudin W. Fourier analysis on groups. New York: Wiley-Interscience; 1994.
- Sobotka F, Schnabel S, Schulze WL. Expectreg: expectile and quantile regression. R package version 0.39; 2014.
- Yeh I-C. Modeling of strength of high performance concrete using artificial neural networks. Cement Concrete Res. 1998;28(12):1797–1808.
- Pennington J, Yu FX, Kumar S. Spherical random features for polynomial kernels. In: Proceedings of Advances in Neural Information Processing Systems (NIPS 2015); 2015.
- Kar P, Karnick H. Random feature maps for dot product kernels. In: Proceedings of the 15th Int'l Conf. on Artificial Intelligence and Statistics; 2012.
- Chang W-C, Li C-L, Yang Y, et al. Data-driven random Fourier features using Stein effect. In: Proceedings of the Int'l Joint Conf. on Artificial Intelligence; 2017.
- Damodaran BB, Courty N, Gosselin P-H. Data dependent kernel approximation using pseudo random Fourier features. arXiv preprint arXiv:1711.09783. 2017.
- Li Y, Zhang K, Wang J, et al. Learning adaptive random features. In: Proceedings of the AAAI Conf. on Artificial Intelligence; 2019.
- Shahrampour S, Beirami A, Tarokh V. On data-dependent random features for improved generalization in supervised learning. In: Proceedings of the AAAI Conf. on Artificial Intelligence; 2018.
- Sinha A, Duchi JC. Learning kernels with random features. In: Proceedings of Advances in Neural Information Processing Systems 29 (NIPS 2016); 2016.