Theory and Methods

First-Order Newton-Type Estimator for Distributed Estimation and Inference

Pages 1858–1874 | Received 01 Nov 2019, Accepted 12 Feb 2021, Published online: 12 Apr 2021

References

  • Angrist, J., Chernozhukov, V., and Fernández-Val, I. (2006), “Quantile Regression Under Misspecification, With an Application to the US Wage Structure,” Econometrica, 74, 539–563. DOI: 10.1111/j.1468-0262.2006.00671.x.
  • Banerjee, M., Durot, C., and Sen, B. (2019), “Divide and Conquer in Non-Standard Problems and the Super-Efficiency Phenomenon,” The Annals of Statistics, 47, 720–757. DOI: 10.1214/17-AOS1633.
  • Battey, H., Fan, J., Liu, H., Lu, J., and Zhu, Z. (2018), “Distributed Estimation and Inference With Statistical Guarantees,” The Annals of Statistics, 46, 1352–1382.
  • Chen, X., Lee, J. D., Li, H., and Yang, Y. (2021), “Distributed Estimation for Principal Component Analysis: A Gap-Free Approach,” Journal of the American Statistical Association (to appear).
  • Chen, X., Lee, J. D., Tong, X. T., and Zhang, Y. (2020), “Statistical Inference for Model Parameters in Stochastic Gradient Descent,” The Annals of Statistics, 48, 251–273. DOI: 10.1214/18-AOS1801.
  • Chen, X., Liu, W., Mao, X., and Yang, Z. (2020), “Distributed High-Dimensional Regression Under a Quantile Loss Function,” Journal of Machine Learning Research, 21, 1–43.
  • Chen, X., Liu, W., and Zhang, Y. (2019), “Quantile Regression Under Memory Constraint,” The Annals of Statistics, 47, 3244–3273. DOI: 10.1214/18-AOS1777.
  • Chen, X., and Xie, M. (2014), “A Split-and-Conquer Approach for Analysis of Extraordinarily Large Data,” Statistica Sinica, 24, 1655–1684.
  • Fan, J., Wang, D., Wang, K., and Zhu, Z. (2019), “Distributed Estimation of Principal Eigenspaces,” The Annals of Statistics, 47, 3009–3031. DOI: 10.1214/18-AOS1713.
  • He, X., and Shao, Q.-M. (2000), “On Parameters of Increasing Dimensions,” Journal of Multivariate Analysis, 73, 120–135. DOI: 10.1006/jmva.1999.1873.
  • Huang, C., and Huo, X. (2019), “A Distributed One-Step Estimator,” Mathematical Programming, 174, 41–76. DOI: 10.1007/s10107-019-01369-0.
  • Johnson, R., and Zhang, T. (2013), “Accelerating Stochastic Gradient Descent Using Predictive Variance Reduction,” in Advances in Neural Information Processing Systems.
  • Jordan, M. I., Lee, J. D., and Yang, Y. (2019), “Communication-Efficient Distributed Statistical Inference,” Journal of the American Statistical Association, 114, 668–681. DOI: 10.1080/01621459.2018.1429274.
  • Lai, T. L. (2003), “Stochastic Approximation,” The Annals of Statistics, 31, 391–406. DOI: 10.1214/aos/1051027873.
  • Lee, J. D., Lin, Q., Ma, T., and Yang, T. (2017), “Distributed Stochastic Variance Reduced Gradient Methods by Sampling Extra Data With Replacement,” Journal of Machine Learning Research, 18, 4404–4446.
  • Lee, J. D., Liu, Q., Sun, Y., and Taylor, J. E. (2017), “Communication-Efficient Sparse Regression,” Journal of Machine Learning Research, 18, 1–30.
  • Li, R., Lin, D. K., and Li, B. (2013), “Statistical Inference in Massive Data Sets,” Applied Stochastic Models in Business and Industry, 29, 399–409.
  • Li, T., Kyrillidis, A., Liu, L., and Caramanis, C. (2018), “Approximate Newton-Based Statistical Inference Using Only Stochastic Gradients,” arXiv no. 1805.08920.
  • Pang, L., Lu, W., and Wang, H. J. (2012), “Variance Estimation in Censored Quantile Regression via Induced Smoothing,” Computational Statistics & Data Analysis, 56, 785–796.
  • Polyak, B. T., and Juditsky, A. B. (1992), “Acceleration of Stochastic Approximation by Averaging,” SIAM Journal on Control and Optimization, 30, 838–855. DOI: 10.1137/0330046.
  • Shi, C., Lu, W., and Song, R. (2018), “A Massive Data Framework for M-Estimators With Cubic-Rate,” Journal of the American Statistical Association, 113, 1698–1709. DOI: 10.1080/01621459.2017.1360779.
  • Volgushev, S., Chao, S.-K., and Cheng, G. (2019), “Distributed Inference for Quantile Regression Processes,” The Annals of Statistics, 47, 1634–1662. DOI: 10.1214/18-AOS1730.
  • Wang, J., and Zhang, T. (2017), “Improved Optimization of Finite Sums With Minibatch Stochastic Variance Reduced Proximal Iterations,” arXiv no. 1706.07001.
  • Wang, X., Yang, Z., Chen, X., and Liu, W. (2019), “Distributed Inference for Linear Support Vector Machine,” Journal of Machine Learning Research, 20, 1–41.
  • Yang, J., Meng, X., and Mahoney, M. (2013), “Quantile Regression for Large-Scale Applications,” in International Conference on Machine Learning.
  • Zhang, Y., Duchi, J., and Wainwright, M. (2015), “Divide and Conquer Kernel Ridge Regression: A Distributed Algorithm With Minimax Optimal Rates,” Journal of Machine Learning Research, 16, 3299–3340.
  • Zhao, T., Cheng, G., and Liu, H. (2016), “A Partially Linear Framework for Massive Heterogeneous Data,” The Annals of Statistics, 44, 1400–1437. DOI: 10.1214/15-AOS1410.
