Health Technology

Quantifying the economic impact of a digital self-care behavioral health platform on Missouri Medicaid expenditures

Pages 1084-1090 | Received 25 Jul 2018, Accepted 06 Aug 2018, Published online: 31 Aug 2018

Abstract

Aim: The primary objective of this study is to estimate the economic benefits relative to return on investment (ROI) of a state-wide initiative to integrate digital behavioral health (BH) self-care into the community BH system.

Methods: The observational study reviewed claims data of 799 people who registered with the digital tool (myStrength) and 715 matched control study participants. The control group was formed via coarsened exact match with blocking variables, including presence on the claims file, volume of health care utilization, participation in a medical health home, BH diagnosis and volume of psychotherapy claims. The primary study analysis of cost differences for the BH self-care tool group versus the control group were calculated by cost setting and the ROI of the BH self-care tool was estimated. Cost settings assessed include inpatient, emergency services, general and psychiatric outpatient, and outpatient psychotherapy.

Results: An incremental annual difference in difference reduction of $382 per user was observed over the 11 month study period in the self-care tool group. Sensitivity analysis indicated an ROI of between 142% and 695%.

Conclusions: Augmenting BH offerings to include digital BH self-care tools appears to generate positive ROI via reduced total cost of care.


Introduction

While about 20% of Americans (44.7 million in 2016) live with a behavioral health (BH) condition, only about 43% of these will receive BH care in a given yearCitation1. Stigma, low perceived need, lack of access, cost and dearth of qualified providers all contribute to this care gapCitation2–4.

At the same time, BH conditions, including depression, anxiety, insomnia and substance use disorders, are major drivers of cost in the US health care system, doubling or tripling total cost of careCitation5. Increased utilization of emergency departments and inpatient days is a significant contributor to these costsCitation6. Much of this medical consumption is avoidableCitation7–9 and can be effectively reduced through the expansion of BH servicesCitation9.

Digital health solutions have emerged as a potential answer to the dual challenge of increasing access to care while reducing the BH cost burden. Over the past decade, evidence has accumulated pointing to the clinical efficacy of digital solutions across a wide range of BH concerns, including depression, anxiety, insomnia, stress management and problematic substance useCitation10–18. Digital self-care solutions can be highly cost effective, potentially at only a small fraction of the cost of live care provision. Introducing digital self-care tools may thus create secondary cost savings via reduced need for, and/or duration of, in-person BH careCitation19. While initial studies suggest that digital tools may generate cost savings, early work in this area has also called for increased research focus on the cost-of-care impact of widespread introductions of digital health toolsCitation20–23.

The current work focuses on a cost–efficacy analysis of the use of a digital BH tool across the state of Missouri (MO). Missouri faces trends aligned with the national statistics above. Nearly 859,000 people in the state experienced some form of mental illness in 2014, and 55% of them received no treatmentCitation24. Likewise, approximately 419,000 people in Missouri struggled with a substance use disorder (SUD) in 2016, and 84% did not receive treatmentCitation24. The annual direct medical cost burden of untreated SUD and BH conditions in MO was estimated at over $250 million in 2016Citation25.

In 2016, a unique partnership between the Missouri Coalition for Community Behavioral Healthcare, the Missouri Department of Mental Health and the Missouri Health Foundation brought myStrength, a leading digital BH self-care tool, to 25 community mental health clinics (CMHCs) across the state. It was introduced both to each agency’s client base and to each agency’s broader community via the centers’ outreach efforts.

As part of this effort, a formal evaluation of the program’s financial impact was conducted. While the tool’s clinical impact has been documented in other workCitation13,Citation26, the cost impact and return on investment (ROI) of a broad implementation were not previously available. The evaluator was therefore tasked with making an actuarially sound and highly conservative assessment of the medical cost impact of this statewide initiative. More specifically, the evaluation assessed: (1) to what extent introducing the digital tool helped drive reductions in avoidable medical costs beyond those achieved by other available services; (2) to what extent its introduction reduced direct cost of care for BH services; and (3) to what degree there was indication of a positive ROI from the statewide roll-out. This paper presents the approach and results of this third-party evaluation.

Methods

Program

The myStrength program is a web- and mobile-based BH platform delivering evidence-based resources to consumers. Interactive applications address challenges including depression, anxiety, insomnia, chronic pain management, stress and SUD. The platform integrates empirically proven psychotherapy models – such as cognitive behavioral therapy, motivational interviewing, acceptance and commitment therapy, positive psychology and mindfulness – along with mood tracking, sharing of community and personal inspirations, and a searchable library of BH and wellness/well-being resources. The user experience is highly personalized based on individual interests. The tool can be used stand-alone or in conjunction with in-person, professional BH care. To foster engagement, users are sent weekly BH-focused emails encouraging them to log in and use the tool.

Study participants

The digital BH self-care tool was made available via 25 CMHCs across the state of Missouri. The mission of the CMHCs is to provide BH services to Missouri residents suffering from severe and persistent mental illness, substance use and addiction disorders, poverty, homelessness and trauma. Approximately 60% of Missouri-based CMHCs serve rural populations. The continuum of services provided includes screening and assessment, case management, medication management, individual and group outpatient therapy, coordination for inpatient treatment, community psychiatric and substance abuse treatment rehabilitation, 24-hour emergency services, community support, consultation, education and prevention, and administrative coordination. In addition, all the CMHCs also provide outreach and education into the communities in their local catchment areas.

Agency staff were trained to make referrals to the BH self-care tool both as a supplement to clinical care as well as for more general community outreach. Agencies used the tool as an extension to in-session treatment, wait-list management, group session content, post-discharge support and community outreach. There were no limits placed on who could set up an account beyond the need to have an agency-wide access code which was made widely available on marketing materials.

As part of the registration process, new users were asked to voluntarily provide their first name, last name and date of birth for use in research linking them to their Medicaid claims. The first users to register with this process enrolled in November 2016. Users who were a part of this analysis joined the program no later than July 2017. Thirty-five percent of new registrants during this period chose to share this identifying data (n = 1869). In terms of age and gender, the volunteer sample is representative of all newly registered users affiliated with Missouri CMHCs during the study period: 67% are female, and approximately 60% are 31 years of age or older.

Data source and institutional review board oversight

A full-service, private institutional review board (IRB) with expertise in BH protocols approved the study design and research objectives (Solutions IRB #2017/12/13). Solutions IRB is registered with the Office for Human Research Protections. All data storage was compliant with Health Insurance Portability and Accountability Act guidelines.

Via a partnership with the Missouri BH coalition, the third-party evaluator received a de-identified file containing claims data for the 869 users who were successfully matched to Medicaid claims. In addition, claims data on a random sample of 4029 adults enrolled in Missouri Medicaid who were not part of the identified user set were also delivered to the third-party evaluator for use in this evaluation.

Evaluation approach

The evaluation presented here faced several significant challenges which drove the selection of the methodological framework. First, the project’s funding cycle dictated that the evaluation be completed within 15 months of the program’s launch, creating a very tight time-frame to assess economic impact; due to the nature of the program, users were introduced to the BH self-care tool throughout the evaluation period. Second, the state of Missouri had launched a successful health homes initiative with demonstrated cost savings in both BH and avoidable medical expenses, which was being expanded into the same CMHCs at the same time as the digital BH program was introduced. Finally, given the real-world nature of the program structure, there was considerable variability in the extent to which other BH services were accessed alongside the BH self-care tool and in how much people were exposed to it prior to setting up an account. For example, in many clinics the BH self-care tool is used in direct care sessions via a provider’s account; in those settings, patients were encouraged to set up their own accounts only after they had become familiar with the platform.

In light of these considerations, instead of conceptualizing the study as a classic experimental or quasi-experimental pre–post design, the evaluation framed the launch of the BH self-care tool as akin to the introduction of a coupon campaign into a retail sales environment. Rather than trying to determine impact from the day a coupon entered a person’s hand (the classical experimental perspective), the impact of a coupon campaign is best modeled by looking at differential sales (sales lift) between an exposed and a naïve population across the entire period of the campaign. By analogy, this work compares difference in difference costs between people with evidence of exposure to the “campaign” (BH self-care tool registration) and those who remained BH self-care tool naïve (no account created), regardless of the date on which they registered. This shift afforded an approach that is robust across the rolling window of registration dates and ensured equivalent claims windows for both conditions.

In addition, the real-world considerations of this observational study clearly required a matching strategy that would be highly effective at balancing conditions and reducing potential bias. King and NielsenCitation27 argue that a propensity score approach, the general go-to tool in observational studies, is highly problematic as it systematically reduces power, increases model dependence and also introduces bias. Instead, they advocate the use of coarsened exact matching to ensure tight and unbiased control over multiple factors.

Therefore, a coarsened exact match using micro-strata was implemented. Pre-analysis of the data indicated that there were five available covariates which were highly influential on total cost of care. Five coarsened blocking variables were used to form a set of mutually exclusive micro-strata which fully covered the experimental population. The five blocking variables are as follows:

  1. Presence on the claims file and Medicaid eligibility in the given period: Presence on the claims file was matched across two pre-launch and two post-launch periods to ensure similar eligibility between the control and experimental groups.

  2. Disease management and health care home participation: The micro-stratification scheme nested participation in neither, either or both of these cost control initiatives which were being introduced in MO concurrent to the BH self-care tool initiative.

  3. Psychiatric diagnosis: The stratification scheme created deciles by volume of claims for core BH (depression and anxiety), severe BH (bipolar and schizophrenia) and/or SUD over the entire study period. These were then reduced to four groups – high, medium, low and no claims – to form strata.

  4. Claims setting: The stratification scheme created deciles of claims by setting over the entire study period. Settings included inpatient, ambulance/emergency, ambulatory procedures and outpatient health clinic/psychiatric facility. These were then reduced to four groups – high, medium, low and no claims – to form strata.

  5. Psychotherapy: Volume deciles of claims over the entire study period with a Current Procedural Terminology (CPT) code indicating a psychotherapy encounter. These were then reduced to four groups – high, medium, low and no claims – to form strata.
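The decile-to-four-group coarsening described in blocking variables 3–5 can be sketched as follows. This is an illustrative reconstruction, not the study’s actual code; the column names and synthetic data are hypothetical.

```python
# Hypothetical sketch of the coarsened-exact-match blocking described above:
# claims volumes are ranked into deciles, then collapsed into four groups
# (high, medium, low, no claims), and strata are keyed on the combination.
import numpy as np
import pandas as pd

def coarsen_volume(claims: pd.Series) -> pd.Series:
    """Collapse raw claims volume into the four coarsened blocking groups."""
    groups = pd.Series("no claims", index=claims.index, dtype=object)
    nonzero = claims > 0
    # Rank nonzero volumes into deciles, then collapse to low/medium/high.
    deciles = pd.qcut(claims[nonzero].rank(method="first"), 10, labels=False)
    groups[nonzero] = pd.cut(deciles, bins=[-1, 2, 6, 9],
                             labels=["low", "medium", "high"]).astype(str)
    return groups

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "health_home": rng.integers(0, 2, 60).astype(bool),   # blocking variable 2
    "bh_claims": rng.poisson(4, 60),                      # diagnosis volume
    "psychotherapy_claims": rng.poisson(2, 60),           # psychotherapy volume
})
df["bh_group"] = coarsen_volume(df["bh_claims"])
df["psych_group"] = coarsen_volume(df["psychotherapy_claims"])

# Each unique combination of coarsened blocking variables is one micro-stratum.
df["stratum"] = (df["health_home"].astype(str) + "|" +
                 df["bh_group"] + "|" + df["psych_group"])
```

Within each resulting micro-stratum, controls can then be randomly pruned to match the treatment-group n, as described below.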

Why do claims volume metrics figure so prominently in the stratification scheme? Returning to the analogous situation of a coupon campaign, controlling for the regularity with which a shopper visits a store is critical when looking for sales lift impact between a coupon exposed intervention group and a naïve control group. By analogy, controlling for the amount of interaction with the health care system was important for this study, whose mandate was to establish a highly conservative estimate of the impact of launching the “campaign”. Because the propensity to interact with the health care system varies greatly among people and this variability clearly impacts claims costs, several of the stratification levers are based on volume of claims over the entire study period.

The 869 users were placed into the micro-strata produced by the blocking schema, as were the 4029 available controls. Within each stratum, random pruning reduced the controls to the n of the treatment group. In 25 of 182 strata there were not enough controls available to fully fill the stratum. Likewise, 70 users were pruned from the data set because no controls matched their micro-strata. Figure 1 presents a consort diagram for this data pre-processing. Table 1 presents core demographic data on the BH self-care tool versus control groups.

Figure 1. Consort flow for subject selection.


Table 1. Demographic break-down of self-care tool group versus control group.

Because there were significant differences in gender and age between control and BH self-care tool groups, the stratification schema was re-run with these added as blocking variables. Doing this resulted in 241 additional subjects being dropped due to non-matching strata. Review of historical myStrength data showed no evidence of differential impact of the program by age or gender. Therefore, a decision was made to retain the original blocking scheme in favor of the larger sample size.

Claims were broken out into four mutually exclusive cost centers: inpatient, emergency department (ED) and ambulance (emergency services), outpatient psychiatric facility/health clinic (outpatient) and psychotherapy. The psychotherapy category was first pooled to encompass all claims billed under CPT codes for outpatient therapy; thus, the ED, outpatient and inpatient categories include no claims for psychotherapy, even if therapy was received in one of those settings. Difference in difference comparisons on all members with at least one claim in both the pre- and the post-period for any given cost area were used to evaluate deltas in total cost of care between conditions.
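The difference in difference comparison above can be sketched as a short calculation. The column names and dollar figures below are hypothetical, and having at least one claim in a period is proxied here by a positive cost in that period.

```python
# Hypothetical sketch of the per-setting difference in difference comparison.
import pandas as pd

def did_estimate(df: pd.DataFrame) -> float:
    """Mean (post - pre) cost change for tool users minus that of controls."""
    # Restrict to members with claims (positive cost) in both periods.
    eligible = df[(df["pre_cost"] > 0) & (df["post_cost"] > 0)]
    delta = eligible["post_cost"] - eligible["pre_cost"]
    users = delta[eligible["group"] == "user"].mean()
    controls = delta[eligible["group"] == "control"].mean()
    return users - controls

df = pd.DataFrame({
    "group":     ["user", "user", "control", "control"],
    "pre_cost":  [1200.0, 800.0, 1150.0, 900.0],
    "post_cost": [900.0, 700.0, 1100.0, 950.0],
})
# A negative estimate indicates a larger cost reduction among tool users.
print(did_estimate(df))  # prints -200.0
```

In the study, this delta was computed separately for each of the four cost settings.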

Results

Before assessing difference in difference costs between the intervention and control conditions, a check on the stability of the sampling frame was made by running four independent control group pools and looking at variability across each estimate. Total claims cost showed no significant or even borderline significant (p < .1) differences across all four control group pools by treatment setting and/or in any pairwise comparisons.

Results, presented in Table 2, indicate that core highly avoidable medical care consumption, reflected in the inpatient and emergency settings, evidenced the largest difference in difference cost reductions. While the p values for both of these differences do not meet traditional levels of statistical significance, given the skewed nature of claims data in general and of this data in particular, it is still possible that there is a genuine trend here.

Table 2. Difference in difference health care spend by setting.

Costs associated with visits to psychiatric facilities and/or to outpatient clinics evidence significantly more difference in difference cost reduction in the BH self-care tool group than in the control group. At the same time, the absolute magnitude of these differences is smaller than those for high cost avoidable claims.

While not included in Table 2, difference in difference spending was also calculated for outpatient procedures, a setting in which cost was not believed to be impacted by BH. Outpatient procedures included any planned procedure done in a hospital setting with no associated hospital admission. The difference in difference cost for these procedures between groups was $7, on a total pre–post cost difference of about $600 in both conditions (p = .98).

In keeping with the mandate of this evaluation to establish conservative estimates of ROI, a series of ROI calculations were undertaken to determine both the likely range of ROI for the study period and some guidance on what future ROI might look like. ROI is calculated as the expected per-user difference in difference savings times the number of enrolled users that year, divided by the annual cost of the statewide BH self-care tool contract. Note that because the study period did not align with the contract-year cost period, we elected to use the number of actual enrollments over the first year of the program contract, as opposed to that of the claims analysis window.

ROI estimates were made for: (1) all observed savings, in keeping with the generally accepted actuarial standard of p < .25 indicating probable cost differences; (2) a more conservative standard at p < .1; and (3) the normative standard in experimental research of p < .05. In addition, three scenarios were computed to give a sense of where ROI lands under different levels of program engagement (e.g. with 3000 vs. 5000 vs. 8000 users) (Table 3). These were all computed with the most conservative estimate of cost savings (p < .05).
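The ROI formula defined above reduces to a one-line calculation. The per-user savings, user counts and contract cost below are illustrative placeholders only, not the study’s actual figures.

```python
# Hypothetical sketch of the ROI formula: per-user difference in difference
# savings times enrolled users, divided by the annual program cost.

def roi_pct(per_user_savings: float, enrolled_users: int, annual_cost: float) -> float:
    """Return ROI as a percentage of the annual program cost."""
    return 100.0 * per_user_savings * enrolled_users / annual_cost

# Engagement scenarios analogous to the 3000 vs. 5000 vs. 8000 user cases,
# using an assumed $200 per-user savings and a $500,000 annual contract.
for users in (3000, 5000, 8000):
    print(f"{users} users: ROI = {roi_pct(200.0, users, 500_000):.0f}%")
```

With these placeholder inputs, ROI scales linearly with enrollment, which is why the engagement scenarios matter for program planning.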

Table 3. Program ROI scenarios.

Discussion

Results of this study indicate that, from an ROI perspective, the statewide introduction of the BH self-care tool does appear to have generated more savings than expenses – netting the state between 1.9 and 6.9 times the program’s cost in reduced medical spend. The most convincing evidence of cost savings is the reduced cost of BH care in psychotherapy and psychiatric facility and/or health clinic settings. Results indicate that people provided with a digital self-care tool have lower total cost for in-person BH and routine outpatient care. Informal review of the claims volume underlying the cost data hints that this finding may be driven by a reduction in claims volume among the people who used the most psychotherapy in the pre-study period. In other words, providing “frequent flyers” with a digital BH self-support tool may be helping them make more prudent use of BH and other outpatient resources – perhaps because they feel more empowered to manage on their own.

In terms of high cost medical expenses, while not quite achieving traditional levels of statistical significance as assessed with t-tests based on a normal distribution, the data do evidence a trend toward reduced spending on avoidable high cost services, namely emergency department visits and inpatient stays. Review of claims volume by consumption level suggests a slightly different driver of cost trends here. For emergency department usage, volume did not decrease more for users than for controls between pre- and post-periods, but costs did. This suggests that, when emergency services were used, they were used more efficiently; perhaps self-help tools help constrain such visits to a more focused set of needs and thus drive down their cost. With regard to inpatient stays, volume changed most for people in the middle claims-volume brackets. While certainly speculative, this could indicate that the mechanism by which BH self-care reduces inpatient costs is fewer inpatient days for people who have more subjectivity in their stay length and who feel able to go home earlier when armed with a bit of additional support.

These findings must be viewed in light of several notable limitations. First, demographic differences between the users of the BH self-care tool and the control sample might bias results. While we had no reason to believe that the program worked differently by age or gender, we did re-run the difference in difference model using samples generated from a coarsened exact match which included gender and age. Results trended in the same ways as those presented here, though the reduced sample size impacted statistical significance levels. A deeper pool of potential control members might have allowed us to match on demographics as well as to achieve full matching in all strata; unfortunately, the stipulations of the partnership which granted us access to the claims file precluded such access.

Interesting as these hints from the claims volume review may be, they also point to a major limitation of this work. Because of tight restrictions on the type and window of data available, we felt it was necessary to block on claims volume – thus making it impossible to also examine claims volume as a dependent variable. Future work that delves deeply into the dynamics of for whom, and under what circumstances, digital BH tools reduce health care usage would be welcome.

An additional source of bias might come from users in the BH self-care tool group self-selecting to try the program, resulting in a population with elevated initiative and interest in self-improvement. At the same time, the program cost–efficacy frame of this work focuses on understanding whether the cost of bringing the program was less than the cost of not bringing it, net to the state. In that regard, even if the program only provided an alternative service to a more motivated population, it would still reduce overall net cost of care.

This work may err on the overly conservative side with regard to cost savings estimation. First, our decision to block on claims volume across the entire study period, while essential for ensuring that the control and experimental conditions were well balanced on proclivity to utilize health care, has likely also reduced the observable program cost effect. In addition, this work looks only at direct health care impacts. Given the demonstrated dramatic cost of BH conditions in terms of workplace productivityCitation28 as well as disability and unemploymentCitation29, it is possible that indirect cost savings are also being generated by the initiative evaluated here.

Past research gives voice to the concern that evaluation of digital health tools is predominantly undertaken on crowd-sourced or other convenience samples, as opposed to in real-world implementations and actual clinical practicesCitation30,Citation31. The current work grapples with the complexities of a real-world evaluation – complete with real-world constraints on timing and availability of data and a myriad of varied and uncontrolled use cases. Drawing on a methodological frame common in the complex world of consumer spending patterns, we are able to demonstrate that even under very conservative scenarios the statewide initiative to supplement traditional BH services with a broadly available digital BH tool appears to have generated a positive ROI.

In addition, our results begin to move the literature on digital self-care interventions for BH beyond “do they work” and into “do they deliver economic value in real-world settings”. A recent international working group on digital behavior change interventions concluded that two of the most pressing research needs in this area are: (1) work which takes projected uptake and reach into account in economic evaluations; and (2) research which isolates health care cost impact from other program impacts (such as health outcomes and productivity)Citation32. The current evaluation provides a range of ROI scenarios under different uptake conditions while retaining focus on health care cost impact.

Future research would be well served by continued focus on understanding how actual consumers in real-world settings are impacted by the explosion of digital health tools entering the marketplace. Likewise, increased scrutiny of claims by digital health providers to deliver population health and/or cost benefits would likely result in more economic impact studies such as this one. This would both better inform the marketplace and enrich our understanding of the potential of digital BH tools.

Transparency

Declaration of funding

This research was funded by myStrength Inc. and the Missouri Health Foundation.

Declaration of financial/other relationships

SA has disclosed that he is the owner of Propensity4 and was paid as a consultant to provide methods and analysis for the study. A.H. has disclosed that she is a paid consultant for, and owns stock in, myStrength and serves as its Chief Clinical Officer.

Peer reviewers on this manuscript have received an honorarium from JME for their review work, but have no relevant financial or other relationships to disclose.

Acknowledgements

Krista Schladweiler coordinated work and data exchange between partners and helped with writing early drafts of the paper. Alissa Link and Britni Meyers helped collect citations and made final edits to the paper. Ed Jones and Steven Schwartz both provided high-level strategy consultation.

References

  • NIMH. Mental Illness. Available at: https://www.nimh.nih.gov/health/statistics/mental-illness.shtml [Last accessed 29 May 2018]
  • Kazdin AE. Innovations in Psychosocial Interventions and Their Delivery: Leveraging Cutting-edge Science to Improve the World’s Mental Health. Oxford University Press, 2018
  • Mojtabai R, Olfson M, Sampson NA, et al. Barriers to mental health treatment: results from the National Comorbidity Survey Replication. Psychol Med 2011;41:1751-61
  • Mojtabai R, Chen L-Y, Kaufmann CN, et al. Comparing barriers to mental health treatment and substance use disorder treatment among individuals with comorbid major depression and substance use disorders. J Subst Abuse Treat 2014;46:268-273
  • Milliman. Potential economic impact of integrated medical–behavioral healthcare: updated projections for 2017. Milliman. Available at: /insight/2018/Potential-economic-impact-of-integ-rated-medical-behavioral-healthcare-Updated-projections-for-2017/ [Last accessed 10 June 2018]
  • Assessment #9_The Uncoordinated Costs of Behavioral and Primary Health Care.pdf. Available at: https://www.nasmhpd.org/sites/default/files/Assessment%20%239_The%20Uncoordinated%20Costs%20of%20Behavioral%20and%20Primary%20Health%20Care.pdf [Last accessed 10 June 2018]
  • Lanoye A, Stewart KE, Rybarczyk BD, et al. The impact of integrated psychological services in a safety net primary care clinic on medical utilization: impact of integrated services on utilization. J Clin Psychol 2017;73:681-92
  • Chiles JA, Lambert MJ, Hatch AL. The impact of psychological interventions on medical cost offset: a meta‐analytic review. Clin Psychol Sci Pract 1999;6:204-20
  • Blount A, Schoenbaum M, Kathol R, et al. The economics of behavioral health services in medical settings: a summary of the evidence. Prof Psychol Res Pract 2007;38:290-7
  • Firth J, Torous J, Nicholas J, et al. The efficacy of smartphone-based mental health interventions for depressive symptoms: a meta-analysis of randomized controlled trials. World Psychiatry 2017;16:287-98
  • Firth J, Torous J, Nicholas J, et al. Can smartphone mental health interventions reduce symptoms of anxiety? A meta-analysis of randomized controlled trials. J Affect Disord 2017;218:15-22
  • Aboujaoude E, Salame W, Naim L. Telemental health: a status update. World Psychiatry 2015;14:223-30
  • Schladweiler K, Hirsch A, Jones E, et al. Real-World Outcomes Associated with a Digital Self-Care Behavioral Health Platform. Ann Clin Res Trials 2017;19(7):e271
  • Carolan S, Harris PR, Cavanagh K. Improving employee well-being and effectiveness: systematic review and meta-analysis of web-based psychological interventions delivered in the workplace. J Med Internet Res 2017;19:e271
  • Deady M, Choi I, Calvo RA, et al. eHealth interventions for the prevention of depression and anxiety in the general population: a systematic review and meta-analysis. BMC Psychiatry 2017;17:310
  • Mohr DC, Burns MN, Schueller SM, et al. Behavioral intervention technologies: evidence review and recommendations for future research in mental health. Gen Hosp Psychiatr 2013;35:332-8
  • Lindhiem O, Bennett CB, Rosen D, et al. Mobile technology boosts the effectiveness of psychotherapy and behavioral interventions: a meta-analysis. Behav Modif 2015;39:785-804
  • Erbe D, Eichert H-C, Riper H, et al. Blending face-to-face and internet-based interventions for the treatment of mental disorders in adults: systematic review. J Med Internet Res 2017;19:e306
  • van den Berg S, Shapiro DA, Bickerstaffe D, et al. Computerized cognitive–behaviour therapy for anxiety and depression: a practical solution to the shortage of trained therapists. J Psychiatr Ment Health Nurs 2004;11:508-13
  • Donker T, Blankers M, Hedman E, et al. Economic evaluations of Internet interventions for mental health: a systematic review. Psychol Med 2015;45:3357-76
  • Beecham J, Bonin E-M, Görlich D, et al. Assessing the costs and cost-effectiveness of ICare internet-based interventions (protocol). Internet Interventions 2018. In press. Available at: https://doi.org/10.1016/j.invent.2018.02.009
  • Kolovos S, Dongen JM van, Riper H, et al. Cost effectiveness of guided Internet-based interventions for depression in comparison with control conditions: an individual-participant data meta-analysis. Depression Anxiety 2018;35:209-19
  • Ahern E, Kinsella S, Semkovska M. Clinical efficacy and economic evaluation of online cognitive behavioral therapy for major depressive disorder: a systematic review and meta-analysis. Expert Rev Pharmacoecon Outcomes Res 2018;18:25-41
  • Enomoto K. Behavioral Health Barometer. Missouri, 2015
  • The Cost of Untreated Mental Illness: State of Missouri. Health Care Foundation of Greater Kansas City. Available at: https://hcfgkc.org/infographics/the-cost-of-untreated-mental-illness-state-of-missouri/ [Last accessed 10 June 2018]
  • Hirsch A, Luellen J, Holder JM, et al. Managing Depressive Symptoms in the Workplace Using a Web-Based Self-Care Tool: A Pilot Randomized Controlled Trial. JMIR Res Protoc 2017;6(4):e51
  • King G, Nielsen R. Why Propensity Scores Should Not Be Used for Matching. 2016. Available at: http://j.mp/2ovYGsW [Last accessed 28 March 2018]
  • Stewart WF, Ricci JA, Chee E, et al. Cost of lost productive work time among US workers with depression. JAMA 2003;289:3135-44
  • Higgens E. Is mental health declining in the U.S.? Scientific American 2017;316(1)
  • Torous J, Firth J. Bridging the dichotomy of actual versus aspirational digital health. World Psychiatry 2018;17:108-9
  • Mohr DC, Weingardt KR, Reddy M, et al. Three problems with current digital mental health research…and three things we can do about them. PS 2017;68:427-9
  • Michie S, Yardley L, West R, et al. Developing and evaluating digital interventions to promote behavior change in health and health care: recommendations resulting from an international workshop. J Med Internet Res 2017;19(6):e232
