
The effects of review form and task complexity on auditor performance*

Pages 449-462 | Received 08 Nov 2016, Accepted 20 Jun 2017, Published online: 01 Jul 2017
 

Abstract

This study examines the effects of review form and task complexity levels on auditor performance. Adopting a 2 × 2 (face-to-face vs. e-mail review and low vs. high task complexity) between-subjects design, we recruit auditors with limited experience to perform going-concern evaluations in an experimental setting. Our results reveal that auditors in the face-to-face review group perform better than those in the e-mail review group, and that auditor performance is lower for more complex tasks. More importantly, consistent with our hypotheses, auditors in the face-to-face review group perform better than those in the e-mail review group when task complexity is low, but not when task complexity is high.

Acknowledgement

We thank the Editor and the anonymous reviewer for their comments and suggestions. Financial support from the Ministry of Science and Technology, Taiwan, is gratefully acknowledged.

Notes

* Accepted by Hong Hwang upon recommendation by Jong Hag Choi

1. Chang, Ho, and Liao (1997) recruit MBA and undergraduate students as participants to perform problem-solving tasks rather than audit judgments; whether their findings apply to practicing auditors performing audit judgments remains an open question. Both Tan and Kao (1999) and Tan, Ng, and Mak (2002) employ three different tasks to operationalize task complexity and manipulate task complexity within subjects. In our study, we employ one task and adopt a between-subjects manipulation, varying task complexity by the number of evidence items and the clarity of their impact on going-concern judgments. Pany and Reckers (1987) report that the design of the experiment, within-subjects vs. between-subjects, is critical to the results obtained. Unlike Tan and Kao (1999) and Tan, Ng, and Mak (2002), we measure auditor performance by workpaper quality. Using a different operationalization of the focal variable and a different design can help ascertain the generalizability of prior findings. More importantly, the above accountability studies do not examine the interaction effect of review form and task complexity.

2. Social psychology research has found that accountability makes decision makers process information more vigilantly, use information more completely, and exert more effort, which enhances the quality of their decisions and reduces biases such as the primacy effect (Tetlock 1983a, 1985; Tetlock and Kim 1987). However, accountability also leads decision makers to over-interpret irrelevant information, thereby exacerbating another judgmental bias, the dilution effect (Tetlock and Boettger 1989). Moreover, when decision makers expect to justify their attitudes to an audience with a known attitude, they shift their own attitudes strategically to impress that audience (Tetlock 1983b). See Lerner and Tetlock (1999) and Hall, Frink, and Buckley (2015) for reviews.

3. Another line of research examines reviewers' strategy choices or performance. See, e.g., Bible, Graham, and Rosman (2005); Rosman et al. (2007); Agoglia, Hatfield, and Brazel (2009); and Agoglia et al. (2010).

4. The order of the items was determined randomly.

5. Approval was granted by the audit firm that provided the auditors who participated in the experiment.

6. All 59 participants have at least two years of audit experience. Participants without a CPA license indicate that they have not passed the CPA examinations.

7. In the following, p-values are based on one-tailed tests unless two-tailed tests are indicated.

8. The order was determined at random by the COO.

9. The reason for this arrangement was explained to the participants.

10. The going-concern judgment and rationales are both considered.

11. A majority of the participants did not record either the starting time or the ending time, making an analysis of time spent infeasible.

12. Contrast-coded ANOVA is increasingly adopted in accounting research (e.g. Sedor 2002).

13. Guggenmos, Piercey, and Agoglia (2016) note the limitations of assigning zero weights to cells in contrast-coded analysis. Hence, we assign −2, −2, 1, and 3 as the contrast weights and re-run the contrast-coded ANOVA. We still find a significant result (p = 0.03, two-tailed).

14. We exclude 19 participants at ranks higher than senior (assistant manager or manager), leaving us with 40 auditors (3.58 years of audit experience on average). The results are similar to those using the full sample.

15. We are grateful to the reviewer for suggesting an examination of the mediation process. While structural equation modeling (SEM) would be an ideal approach, our sample size is too small to perform SEM (Fritz and MacKinnon 2007). PROCESS produces almost the same coefficient estimates as SEM but does not require as large a sample. While PROCESS does not provide model-fit statistics, it provides a direct measure of moderated mediation that can be tested for statistical significance. See Hayes, Montoya, and Rockwood (2017) for a comparative demonstration.

16. These four items are accountability to the reviewer, motivation to perform well, effort expended on the task, and pressure to impress the reviewer.

17. We thank the reviewer for pointing out this issue.
