ABSTRACT
Purpose: To evaluate the content quality, accuracy, and readability of websites commonly visited by patients contemplating cataract surgery.
Setting: Freely available online information.
Design: Cross-sectional study.
Methods: Ten websites were evaluated for content using a grading sheet of 40 questions, each scored independently by three ophthalmologists. Quality was assessed against the JAMA benchmarks, and readability was assessed with the online tool Readable.
Results: There was a significant difference in content and accuracy across websites by Kruskal-Wallis test (H = 22.623, P = .007). The mean grading-sheet score across all websites was 90.85 of 160 points, or 57% (SD 29.93, 95% CI ±17.69). There was no significant correlation between a website's rank on Google.com and its content quality (r = 0.049, P = .894). No website met all 4 JAMA benchmarks. There was no significant correlation between a website's content quality and the number of JAMA benchmarks it met (r = −0.563, P = .09). The mean Flesch Reading Ease Score across all websites was 52.64 (SD 11.94, 95% CI ±7.40), and the average Mean Reading Grade was 10.72 (SD 1.58, 95% CI ±0.98). There was a significant difference in Mean Reading Grades between websites (H = 23.703, P = .005). There was no significant correlation between a website's content quality and its Mean Reading Grade (r = −0.552, P = .098).
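For readers unfamiliar with the readability metrics above: Readable is a commercial tool whose exact implementation is not public, but the standard published Flesch Reading Ease and Flesch-Kincaid Grade Level formulas can be sketched as below. The syllable counter here is a naive vowel-group heuristic of our own (an assumption for illustration); production tools typically use pronunciation dictionaries.

```python
import re

def count_syllables(word: str) -> int:
    """Naive heuristic: count vowel groups; real tools use dictionaries."""
    n = len(re.findall(r"[aeiouy]+", word.lower()))
    if word.lower().endswith("e") and n > 1:
        n -= 1  # treat a final 'e' as silent
    return max(n, 1)

def flesch_scores(text: str) -> tuple[float, float]:
    """Return (Flesch Reading Ease, Flesch-Kincaid Grade Level) for text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / len(sentences)          # words per sentence
    spw = syllables / len(words)               # syllables per word
    ease = 206.835 - 1.015 * wps - 84.6 * spw  # higher = easier to read
    grade = 0.39 * wps + 11.8 * spw - 15.59    # approximate US school grade
    return round(ease, 2), round(grade, 2)
```

Under these formulas, the observed mean Reading Ease of 52.64 falls in the "fairly difficult" band, consistent with the mean reading grade of 10.72, well above the sixth-grade level commonly recommended for patient materials.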
Conclusion: Commonly accessed online resources on cataracts and cataract surgery are insufficient to give patients a clear and complete understanding of their condition and of the available medical and surgical treatment options.
Disclaimer
The views expressed in this article are those of the authors and do not represent an official position of the University of Miami.
Source of Support
N/A
Disclosure of Interest
The authors report no conflict of interest.