Original Research

Undermining a common language: smartphone applications for eye emergencies

Pages 21-40 | Published online: 15 Jan 2019

Abstract

Background

Emergency room physicians are frequently called upon to assess eye injuries and vision problems in the absence of specialized ophthalmologic equipment. Technological applications that can be used on mobile devices are only now becoming available.

Objective

To review the literature on the evidence of clinical effectiveness of smartphone applications for visual acuity assessment marketed by two providers (Google Play and iTunes).

Methods

The websites of two mobile technology vendors (iTunes and Google Play) in Canada and Ireland were searched on three separate occasions using the terms “eye”, “ocular”, “ophthalmology”, “optometry”, “vision”, and “visual assessment” to determine which applications were currently available. Four medical databases (Cochrane, Embase, PubMed, Medline) were subsequently searched with the same terms AND “mobile” OR “smartphone” for papers in English published in years 2010–2017.

Results

A total of 5,024 Canadian and 2,571 Irish applications were initially identified. After screening, 44 were retained. Twelve relevant articles were identified from the health literature. After screening, only one validation study referred to one of our identified applications, and this one only partially validated the application as being useful for clinical purposes.

Conclusion

Mobile device applications in their current state are not suitable for emergency room ophthalmologic assessment, because systematic validation is lacking.

Background

Clinical utility of available smartphone applications for emergency health care providers who evaluate ophthalmologic complaints has not yet been established. Emergency room physicians evaluate a variety of ophthalmologic emergencies, including acute glaucoma, retinal detachment, and episcleritis/scleritis. These emergencies potentially threaten vision and require careful visual examination. A quick, accessible, portable electronic tool that evaluates vision in patients of all ages at the bedside is required.Citation1–Citation4 Before use, however, such tools need to be rigorously evaluated. Transferring a tool from its paper to smartphone version does not necessarily mean that reliability and validity remain intact.Citation5–Citation7

Visual acuity (VA) tools (eg, Eye Handbook, Visual Acuity XL) are available on smartphones, and are employed variably in emergency departments. VA tests give clinicians an estimate of a patient’s ability to perceive spatial detail,Citation8 and are one aspect of a full assessment. VA is the easiest and most important test for bedside evaluation, because it correlates positively with both quality of life and degree of limitation in independent activities of daily living, especially in the geriatric population.Citation9–Citation11

Evaluation of VA faces a number of challengesCitation12–Citation14 as it comprises detection acuity (ability to interpret visual stimulus and note if present or absent), resolution acuity (ability to evaluate and express if all the spatial detail is absorbed and resolved from the background), and recognition acuity (ability to identify a target and recognize it). This evaluation can be especially difficult when assessing young childrenCitation13 or the elderly.Citation15

Today, eye care professionals use the Bailey–Lovie chartCitation14 and the Early Treatment Diabetic Retinopathy Study (ETDRS) chart.Citation16 Both tools have standard letter optotypes (letter-like images) with five optotypes per line. These tools correlate well with ocular pathology in the adult population and are the gold standard for VA. In research circles, VA is now expressed in terms of logarithm of minimum angle of resolution (logMAR) equivalents, as opposed to Snellen equivalent distances (eg, 20/40 feet [6/12 m]), although the latter are often still used in modern emergency departments.Citation13,Citation17
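The two notations are related by a simple logarithm: the reciprocal of the Snellen fraction gives the minimum angle of resolution (MAR) in arcminutes, and logMAR is its base-10 logarithm. A minimal sketch of the conversion (the function name is ours, for illustration only):

```python
import math

def snellen_to_logmar(test_distance: float, letter_distance: float) -> float:
    """Convert a Snellen fraction (test distance / letter distance)
    to its logMAR equivalent.

    MAR in arcminutes is the reciprocal of the Snellen fraction;
    logMAR is the base-10 logarithm of the MAR.
    """
    mar = letter_distance / test_distance  # e.g. 20/40 -> MAR = 2 arcmin
    return math.log10(mar)

# 20/20 (6/6 m) vision corresponds to logMAR 0.0;
# 20/40 (6/12 m) corresponds to logMAR ~0.30.
print(round(snellen_to_logmar(20, 20), 2))  # 0.0
print(round(snellen_to_logmar(20, 40), 2))  # 0.3
```

This is why a result reported as 20/40 feet or 6/12 m maps to the same logMAR line (≈0.30) on a research chart.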

There are smartphone applications for VA tests that could replace the older paper versions. The problem is that these applications may not have undergone the rigorous methodological assessment necessary for either screening out pathological conditions or arriving at an accurate diagnosis. Previous reviews of smartphone applications in other areas of medicine have demonstrated substantial variation in quality.Citation17–Citation20 Quality is especially important in acute care settings, where urgent treatment decisions need to be made. The aim of this paper was to assess the evidence for the usefulness and validity of selected smartphone applications intended for ophthalmologic assessment of acute eye emergencies.

Methods

This systematic review identified relevant smartphone applications in Canada and Ireland through a search of the websites of two mobile technology vendors (iTunes and Google Play) on three separate occasions between November 2014 and July 2017, using the search terms “eye”, “ocular”, “ophthalmology”, “optometry”, “vision”, and “visual assessment”. Second, we searched four medical databases (Cochrane, Embase, PubMed, Medline) for research papers on the applications we had identified, using the same search terms with the addition of “mobile” or “smartphone”. We included only papers written in English from 2010 to 2017. Throughout the analysis, the two authors performed data extraction independently, and conflicts on pertinence were resolved by discussion. This systematic review thus evaluates existing smartphone applications marketed to health care professionals for the determination of VA in Canada and Ireland.

Application search

Identification of mobile applications

An iterative, ongoing search for applications was conducted in both countries in the two online stores, Apple’s App Store (iTunes) and Google Play, with search terms (Figure S1) altered for use in each database. Both authors independently reviewed applications for inclusion on the basis of a priori criteria (Figure S2). The final update was completed in November 2017. The search was limited to these two stores, as they represent the majority (99.7%) of smartphone user platforms according to the International Data Corporation Worldwide Quarterly Mobile Phone Tracker (May 2017) and make up the majority of the market share in the two target regions.Citation21

Selection criteria

English language applications marketed for evaluation of vision by health care professionals were screened by title and description. Applications that were targeted for educational purposes/knowledge dissemination, games, self-monitoring, multimedia/graphics, recreational health and/or fitness, business, travel, weather, or sports or were clinically outdated were all excluded. Where it was unclear whether the application should be included, further review of any linked websites was performed.

Data extraction and encoding

Data elements extracted included year of release, affiliation (academic, commercial), target as stated, content source, and cross-platform availability (for use with tablets and/or computers). A preliminary coding system was developed based on the first application store.

Health literature search

A systematic search was conducted of the four major databases () from January 1, 2010 to July 31, 2016. The search strategy was developed in consultation with a medical librarian and methodological search professional (Figure S1). This was supplemented with a review of relevant reference material for any missed literature.

Figure 1 Identification of relevant smartphone studies.


Identification of articles for literature review

Selection criteria, data extraction, and coding

Twelve relevant articles were identified in the literature using the processes described (). Additional exclusion criteria for articles included preclinical studies or those addressing clinically specialized ophthalmologic/neurological populations not seen in the emergency room (Table S1).

Results

Applications

A total of 44 applications were retained in the final data set after screening of 7,595 applications. In the Canadian iTunes store, 2,526 applications met our initial search criteria. Of those, 927 were unique and suitable for detailed review (): 229 passed title screening, of which 21 were selected based on description. Similar selections were made in Google Play. The results were combined and four additional duplicates removed, yielding a final data set of 24 applications whose characteristics are summarized in . The Irish iTunes store had substantially fewer applications and duplicates: 1,100 applications were identified and 307 duplicates removed, leaving 793 applications to be screened by title ().

Figure 2 Selection of Canadian smartphone applications.


Figure 3 Selection of Irish smartphone applications.


Table 1a Results of systematic application review, Canada (n=24)

Table 1b Results of systematic application review, Ireland (n=24)

Of the two Canadian stores, Google Play had a much lower percentage of on-target applications (0.5% vs 2.3%), and iTunes had a much higher percentage of paid vs free applications (52% vs 29%; ). Applications in the Google Play store were more likely to be free and less expensive overall. This trend was also seen in the Irish store (). In the iTunes store, only a single application cost above $200, but 19% of the identified applications carried a cost of $50 or more (). Of applications that appeared in both stores, 50% (n=2) remained free regardless of where they appeared. One application was more expensive in the iTunes store, and one was more expensive in Google Play. Again, this trend was also seen in the Irish stores.

Figure 4 Cost of Canadian smartphone applications by platform.


Figure 5 Cost of Irish smartphone applications by platform.


None of the listed providers are academic institutions. One application (4%) referred to an academic affiliation in the form of a validation study from the University of Auckland. The majority (75%, n=18) of the identified applications originated from companies. Individuals provided the remaining 21% (n=5) of applications, while 4% (n=1) of applications did not list any provider.

Eighteen of 24 applications in the two Canadian stores vs 22 of 24 in the two Irish stores (75 vs 92%) explicitly indicated which health care professionals they were directly marketed toward. Targeted professionals included medical doctors, ophthalmologists, opticians, and medical students, to name a few. However, 58% of these applications in Canadian stores vs 42% in Irish stores included some sort of disclaimer, eg, “Should not be used as a primary visual acuity measuring tool, it can provide a handy rough vision screen when a chart is not available, or it can be used to complement static, wall-based Snellen charts” (Eye Chart HD by Dok LLC).

In the Canadian stores, four applications (17%) claimed they were validated, although none, when investigated, had used large-scale head-to-head studies under clinical conditions or included a variety of modern smartphone or tablet technologies. In the Irish stores, even fewer (two of 24, 8%) applications stated some type of validation. Of the four validated Canadian applications, Visual Acuity XL showed the most methodological rigor, as evidenced by its reference to an affiliation with the University of Auckland and a 2013 validation study by Black et al.Citation22 All four of the validated applications were available in the iTunes store, but only one, Pocket Eye Exam by Nomad, was also available on the Google Play platform. In the Irish stores, validated applications appeared only in the iTunes store ().

Table 2 Quality assessment of selected apps

Of note, applications varied by country, although 58% (14 of 24) of applications were present in both countries. Each country had ten distinct applications. Moreover, within a given country, applications were not universally available for both platforms. For example, KyberVision Japan’s Visual Acuity XL, the most robustly validated application, was available in both countries, but only on the iTunes platform.

Overall, only 21% (n=5) of applications in the Canadian store were affiliated with academic institutions, which (given limited information) was our best proxy measure for academic rigor. Only one application, Visual Acuity XL, explicitly mentioned a validation study. In the Irish store, the same four applications were affiliated with academic institutions. The additional application in the Canadian store spoke of a research group without explicitly naming an affiliated academic institution. However, it was retained based on consensus of the investigators.

Systematic review of the literature

Twelve relevant articles were retained from an initial 5,648 identified by the systematic review of the literature (). Only ten of 15 assessed methods of validation for smartphone applications. Two review articles discussed a variety of applications, but primary data sources were not identified. Cheng et alCitation23 discussed in limited fashion some of the challenges in this area in 2014. Most articles compared one to three applications on the basis of a specific smartphone or tablet. One article discussed validation of the Eye Handbook against a near vision chart and found that the application tends to overestimate near VA relative to a conventional near vision card by an average of 0.11 logMAR (P<0.0001).

Given the rate of mobile technology change and upgrade,Citation21 the specific applications reviewed here are likely already obsolete. Moreover, cross-referencing between the identified applications and the health literature is poor: only one application cited a validation study. Furthermore, the health literature, when referenced, may not always be accessible to the consumer, due to copyright limitations when journals are not open access. While the Eye Handbook did have some validation in the health literature,Citation24,Citation25 this was not referred to in the online platform of the application (). This makes it impossible for busy health care professionals to distinguish a validated application from among the others.

Table 3 Demographic data of articles identified in a systematic search of the health literature (n=12)

Discussion

This systematic review demonstrates that despite the availability of many mobile device applications for ophthalmologic assessment, they are either not suitable for the emergency room or lack systematic validation. A combined 5,024 Canadian and 2,571 Irish applications were identified on Google Play and iTunes as having potential for use in ocular emergency diagnostics. Less than 1% of the identified applications (n=44) were unique and on target as potentially suitable. Five applications (four available in both stores and one available only in a Canadian store) were affiliated with an academic institution. Only a single application explicitly cited a validation study in its online store. By current standards of best practice, this validation would have to be described as only partial.

When the academic literature was searched, three applications – Visual Acuity XL, Eye Handbook, and AmblyoCare – had some evidence of validation. The Eye Handbook was validated by a single studyCitation26 on an iPhone 5, but that study did not address issues of glare, which in studies of other applications have been shown to make results unreliable.Citation27 Black et alCitation22 used a cross-sectional design with a convenience sample of 85 healthy volunteers to demonstrate that a first-generation iPad running Visual Acuity XL could reproduce gold-standard eye chart evaluation data, but only with significant attention to positioning and modifications to the tablet’s screen to avoid glare. Outside these standardized conditions, iPad results were significantly poorer than those of standardized paper-based or wall-mounted eye charts.Citation28

Arora et alCitation28 noted the advantage of preliminary screening with the Eye Handbook application for purposes of home monitoring or public health data, but considered it not yet useful in the clinical setting. These authors did not comment on the reliability or validity of the Eye Handbook’s adaptation of the gold-standard ETDRS chart into a now-obsolete application for an iPod touch and an iPhone 3G. Nor did they address the impact of screen glare, or of device size on the patient’s required distance from it, and how these variables affected test validity.

The online store text for the AmblyoCareCitation28 application indicates that it is registered as a medical device with the Austrian medical board. However, no original research data for this application emerged in our literature review, leaving the limits of this application for clinical use unclear.Citation28 There is therefore a lack of transparent validation of this tool accessible to health care professionals.

The difficulty with all these applications is that adapting a visual chart, eg, a Snellen chart, to the varying screen sizes and properties of different smartphone devices does not ensure that the size, font, or required distance from the image preserves the diagnostic properties of the original paper chart.Citation5,Citation7,Citation29,Citation30 One may argue that these tools could provide a rough estimate of vision sufficient for an emergency clinician. We would respond that the results of tools that have not been validated cannot be usefully compared to known benchmarks to make treatment decisions concerning a patient’s eyesight. For example, if an application categorizes someone’s sight as normal, that is not comparable to 20/20 vision on a logMAR chart. Such results may generate treatment decisions based on faulty information.
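The geometry underlying this concern is straightforward: a standard optotype must subtend 5 minutes of arc at the testing distance, so its required physical height, and hence the number of pixels needed to render it, depends directly on viewing distance and the device’s pixel density. A minimal sketch of that arithmetic (the function names are ours, not drawn from any reviewed application):

```python
import math

# A standard optotype subtends 5 arcminutes at the testing distance.
OPTOTYPE_ARCMIN = 5.0

def optotype_height_mm(distance_m: float, arcmin: float = OPTOTYPE_ARCMIN) -> float:
    """Physical letter height (mm) needed for the optotype to subtend
    `arcmin` minutes of arc at `distance_m` metres."""
    radians = math.radians(arcmin / 60.0)
    return 2 * distance_m * math.tan(radians / 2) * 1000.0  # metres -> mm

def optotype_height_px(distance_m: float, pixels_per_mm: float) -> float:
    """Pixels required on a given display to render that height."""
    return optotype_height_mm(distance_m) * pixels_per_mm

# A 20/20-line letter viewed at 6 m must be about 8.7 mm tall.
print(round(optotype_height_mm(6.0), 1))  # 8.7
```

An application that does not know the actual screen dimensions, pixel pitch, and viewing distance of the device it runs on therefore cannot guarantee that its letters subtend the correct visual angle, which is precisely the property the paper charts were validated on.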

Limitations

We did not examine smaller electronic markets, and our analysis looked only at the description of the application. Applications for non-English-speaking foreign markets were excluded from the review. We did not address the variation in regulatory requirements in the different global markets. We did not ask the opinion of professionals in this field, as some studies similar to ours have done.

Conclusion

We conclude that efficient regulation and standardization of valid clinical tools for smartphones are needed.Citation18,Citation31,Citation32 This is a major challenge. One possible solution could come from the business world. Instead of an ad-based revenue model that rewards individual developers for producing low-quality free applications, a business case could be made for amalgamating resources, first nationally and perhaps eventually internationally, to fund high-quality applications that can be used globally. This would consolidate funds and expertise into a high-quality, validated application with international value (on a par with paper tools).

Despite the bright future of smartphone technology, mobile device applications in their current state are not suitable for emergency room ophthalmologic assessment. Furthermore, education for clinicians about measurement science and the limits of technological validation is required. The importance of quality electronic diagnostic tools for patients, and the challenges introduced by nonvalidated tools, need to be disseminated to all health professionals.

Data-sharing statement

Most data generated or analyzed during this study are included in this published article and its supplementary information files. The original data sets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Author contributions

JMC conceived and designed the study and supervised the conduct of the study and data collection. Both authors collected and analyzed the data, managed the data and quality control, provided statistical advice, drafted the manuscript, and contributed substantially to its revision, gave final approval of the version to be published, and agree to be accountable for all aspects of the work.

Acknowledgments

This work would not have been possible without the generous support of the following dedicated individuals. Thank you: Dr Mary V Seeman, Mrs Jennifer Desmarais, Mr Tobias Feih, Dr Joshua Chan, Ms Chelsea Lefaivre, Ms Johanna Tremblay, Ms Jennifer Lay, Dr Amanda Carrigan, Dr Clarissa Potter, Dr Gerald Lane, and Dr Vinnie Krishnan.

Disclosure

The authors report no conflicts of interest in this work.

References

  • Bourges JL, Boutron I, Monnet D, Brézin AP. Consensus on Severity for Ocular Emergency: The BAsic SEverity Score for Common OculaR Emergencies [BaSe SCOrE]. J Ophthalmol. 2015;2015:576983. PMID: 26294965.
  • Collignon NJ. Emergencies in glaucoma: a review. Bull Soc Belge Ophtalmol. 2005;(296):71–81.
  • Muth CC. Eye Emergencies. JAMA. 2017;318(7):676. PMID: 28810025.
  • Tarff A, Behrens A. Ocular Emergencies: Red Eye. Med Clin North Am. 2017;101(3):615–639. PMID: 28372717.
  • Bellamy N, Campbell J, Hill J, Band P. A comparative study of telephone versus onsite completion of the WOMAC 3.0 osteoarthritis index. J Rheumatol. 2002;29(4):783–786. PMID: 11950022.
  • Bond M, Davis A, Lohmander S, Hawker G. Responsiveness of the OARSI-OMERACT osteoarthritis pain and function measures. Osteoarthritis Cartilage. 2012;20(6):541–547. PMID: 22425883.
  • Hawker GA, Davis AM, French MR, et al. Development and preliminary psychometric testing of a new OA pain measure – an OARSI/OMERACT initiative. Osteoarthritis Cartilage. 2008;16(4):409–414. PMID: 18381179.
  • Kniestedt C, Stamper RL. Visual acuity and its measurement. Ophthalmol Clin North Am. 2003;16(2):155–170. PMID: 12809155.
  • Chou R, Dana T, Bougatsos C, Grusing S, Blazina I. Screening for Impaired Visual Acuity in Older Adults: Updated Evidence Report and Systematic Review for the US Preventive Services Task Force. JAMA. 2016;315(9):915–933. PMID: 26934261.
  • Matthews K, Nazroo J, Whillans J. The consequences of self-reported vision change in later-life: evidence from the English Longitudinal Study of Ageing. Public Health. 2017;142:7–14. PMID: 28057201.
  • Hochberg C, Maul E, Chan ES, et al. Association of vision loss in glaucoma and age-related macular degeneration with IADL disability. Invest Ophthalmol Vis Sci. 2012;53(6):3201–3206. PMID: 22491415.
  • Gerra G, Zaimovic A, Gerra ML. Pharmacology and toxicology of Cannabis derivatives and endocannabinoid agonists. Recent Pat CNS Drug Discov. 2010;5(1):46–52. PMID: 19832688.
  • Sonksen PM, Salt AT, Sargent J. Re: the measurement of visual acuity in children: an evidence-based update. Clin Exp Optom. 2014;97(4):369. PMID: 24912601.
  • Bailey IL, Lovie JE. New design principles for visual acuity letter charts. Am J Optom Physiol Opt. 1976;53(11):740–745. PMID: 998716.
  • Abdolali F, Zoroofi RA, Otake Y, Sato Y. Automatic segmentation of maxillofacial cysts in cone beam CT images. Comput Biol Med. 2016;72:108–119. PMID: 27035862.
  • Elliott DB, Whitaker D, Bonette L. Differences in the legibility of letters at contrast threshold using the Pelli-Robson chart. Ophthalmic Physiol Opt. 1990;10(4):323–326. PMID: 2263364.
  • Anstice NS, Thompson B. The measurement of visual acuity in children: an evidence-based update. Clin Exp Optom. 2014;97(1):3–11. PMID: 23902575.
  • Bender JL, Yue RY, To MJ, Deacken L, Jadad AR. A lot of action, but not in the right direction: systematic review and content analysis of smartphone applications for the prevention, detection, and management of cancer. J Med Internet Res. 2013;15(12):e287. PMID: 24366061.
  • Lalloo C, Shah U, Birnie KA, et al. Commercially available smartphone apps to support postoperative pain self-management: scoping review. JMIR Mhealth Uhealth. 2017;5(10):e162. PMID: 29061558.
  • Larsen ME, Nicholas J, Christensen H. A systematic assessment of smartphone tools for suicide prevention. PLoS One. 2016;11(4):e0152285. PMID: 27073900.
  • International Data Corporation. Smartphone OS Market Share, 2017 Q1. 2017. Available from: https://www.idc.com/promo/smartphone-market-share/os. Accessed November 17, 2017.
  • Black JM, Jacobs RJ, Phillips G, et al. An assessment of the iPad as a testing platform for distance visual acuity in adults. BMJ Open. 2013;3(6):e002730.
  • Cheng NM, Chakrabarti R, Kam JK. iPhone applications for eye care professionals: a review of current capabilities and concerns. Telemed J E Health. 2014;20(4):385–387. PMID: 24476190.
  • Perera C, Chakrabarti R. Response to: ‘Comment on The Eye Phone Study: reliability and accuracy of assessing Snellen visual acuity using smartphone technology’. Eye. 2015;29(12):1628.
  • Perera C, Chakrabarti R, Islam A, Crowston J. The Eye Phone Study (EPS): reliability and accuracy of assessing Snellen visual acuity using smartphone technology. Clin Exp Ophthalmol. 2012. Conference abstract, 44th Annual Scientific Congress of the Royal Australian and New Zealand College of Ophthalmologists (RANZCO 2012), Melbourne, VIC, Australia, November 24–28, 2012.
  • Tofigh S, Shortridge E, Elkeeb A, Godley BF. Effectiveness of a smartphone application for testing near visual acuity. Eye. 2015;29(11):1464–1468. PMID: 26206531.
  • Black JM, Hess RF, Cooperstock JR, To L, Thompson B. The measurement and treatment of suppression in amblyopia. J Vis Exp. 2012;(70):e3927.
  • Arora KS, Chang DS, Supakontanasan W, Lakkur M, Friedman DS. Assessment of a rapid method to determine approximate visual acuity in large surveys and other such settings. Am J Ophthalmol. 2014;157(6):1315–1321. PMID: 24548874.
  • Bastawrous A, Rono HK, Livingstone IA, et al. Development and Validation of a Smartphone-Based Visual Acuity Test (Peek Acuity) for Clinical Practice and Community-Based Fieldwork. JAMA Ophthalmol. 2015;133(8):930–937. PMID: 26022921.
  • Sullivan GM. A primer on the validity of assessment instruments. J Grad Med Educ. 2011;3(2):119–120. PMID: 22655129.
  • Gagnon L. Time to rein in the “Wild West” of medical apps. CMAJ. 2014;186(8):E247. PMID: 24733763.
  • Cole BL. Measuring visual acuity is not as simple as it seems. Clin Exp Optom. 2014;97(1):1. PMID: 24345028.
  • Aslam TM, Parry NR, Murray IJ, Salleh M, Col CD, Mirza N. Development and testing of an automated computer tablet-based method for self-testing of high and low contrast near visual acuity in ophthalmic patients. Graefes Arch Clin Exp Ophthalmol. 2016;254(5):891–899. PMID: 26899899.
  • Brady CJ, Eghrari AO, Labrique AB. Smartphone-Based Visual Acuity Measurement for Screening and Clinical Assessment. JAMA. 2015;314(24):2682–2683. PMID: 26720028.
  • Chacon A, Rabin J, Yu D, Johnston S, Bradshaw T. Quantification of color vision using a tablet display. Aerosp Med Hum Perform. 2015;86(1):56–58. PMID: 25565534.
  • Kollbaum PS, Jansen ME, Kollbaum EJ, Bullimore MA. Validation of an iPad test of letter contrast sensitivity. Optom Vis Sci. 2014;91(3):291–296. PMID: 24413274.
  • Perera C, Chakrabarti R, Islam FM, Crowston J. The Eye Phone Study: reliability and accuracy of assessing Snellen visual acuity using smartphone technology. Eye (Lond). 2015;29(7):888–894. doi:10.1038/eye.2015.60. PMID: 25931170.
  • Phung L, Gregori NZ, Ortiz A, Shi W, Schiffman JC. Reproducibility and comparison of visual acuity obtained with Sightbook mobile application to near card and Snellen chart. Retina. 2016;36(5):1009–1020. PMID: 26509223.
  • Toner KN, Lynn MJ, Candy TR, Hutchinson AK. The Handy Eye Check: a mobile medical application to test visual acuity in children. J AAPOS. 2014;18(3):258–260. PMID: 24924280.
  • Zhang ZT, Zhang SC, Huang XG, Liang LY. A pilot trial of the iPad tablet computer as a portable device for visual acuity testing. J Telemed Telecare. 2013;19(1):55–59. PMID: 23434538.