Job Insecurity and Its Effect on Organizational Performance: Does Senior Executive Services (SES) Accountability Make a Difference?

ABSTRACT

Previous research on the association between job insecurity and performance has yielded inconsistent results. Using longitudinal data from the 2012, 2013, 2014, 2015, and 2016 U.S. Federal Employee Viewpoint Survey (FEVS), this study examines the effect of job insecurity on organizational performance. The results indicate that the overall effect of job insecurity at the U.S. Department of Veterans Affairs (VA) is an increase of approximately 12 percentage points in organizational performance. The study finds that job insecurity plays an important role in determining organizational performance in the context of the at-will employment system for the Senior Executive Service (SES).

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1. Yang and Kassekert (2009) argue that, in a study like this, optimal results would be achieved by analyzing each dataset in panel form. However, according to Abadie (2005) and Wooldridge (2008), repeated cross-sectional datasets are useful for constructing difference-in-differences estimators.

2. In this study, common method bias is not a concern because the estimation model is based on a difference-in-differences (DID) quasi-experimental design using longitudinal data. Methodologists (Brannick et al., 2010; Jakobsen & Jensen, 2015, p. 16) argue that one statistical remedy for common method bias is to combine panel data with a quasi-experimental research design. They explain that a useful feature of panel data is that they can control for all time-invariant factors and, therefore, for any method bias that is stable across time. To control for such time-invariant factors, this study applies a DID quasi-experimental design, which corresponds to including dummy variables for each respondent in the model, so that any respondent characteristics that do not change between the surveys are held constant (Jakobsen & Jensen, 2015, p. 22; Brannick et al., 2010). A stylized sketch of this DID specification follows the notes.

3. Regarding the 2014 FEVS, OPM organized the launch dates in two waves, with approximately six-week administration periods beginning April 29 and May 6, 2014. The 2014 FEVS data are therefore used in the regressions to represent the pre-Veterans’ Access, Choice, and Accountability Act (VACAA) period, not the post period (OPM, 2014).

4. Because the 2016 FEVS was administered online from May 5 to June 16, 2016, the 2016 FEVS data could also be used in the regressions to represent the post-VACAA period (OPM, 2016).
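As a minimal, stylized illustration of the DID design described in note 2, the sketch below estimates a treatment-by-period interaction on pooled FEVS-style data using Python and statsmodels. It is an assumption-laden sketch, not the study's actual estimation code: the variables performance, va, post_vacaa, year, and agency, and the input file fevs_pooled.csv, are hypothetical placeholders.

# Stylized DID sketch; all names below are hypothetical placeholders,
# not the study's actual FEVS variables or files.
import pandas as pd
import statsmodels.formula.api as smf

# One row per respondent-wave: va = 1 for VA respondents, 0 for comparison
# agencies; post_vacaa = 1 for survey waves after the 2014 VACAA reform.
fevs = pd.read_csv("fevs_pooled.csv")

# The va:post_vacaa interaction is the DID estimate of the change in
# perceived organizational performance associated with the reform;
# year dummies absorb shocks common to all agencies in a given wave.
model = smf.ols("performance ~ va * post_vacaa + C(year)", data=fevs)
result = model.fit(cov_type="cluster", cov_kwds={"groups": fevs["agency"]})
print(result.summary())

Clustering standard errors at the agency level is an assumed choice here, reflecting common practice for grouped survey data rather than a documented feature of the study's model.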
