
An evaluation of the feasibility and usability of a proof of concept mobile app for adverse event reporting post influenza vaccination

Pages 1738-1748 | Received 07 Oct 2015, Accepted 05 Feb 2016, Published online: 25 Mar 2016

ABSTRACT

The Canadian National Vaccine Safety network (CANVAS) gathers and analyzes safety data on individuals receiving the influenza vaccine during the early stages of annual influenza vaccination campaigns, with data collected via participant surveys through the Internet. We sought to examine whether it was feasible to use a mobile application (app) to facilitate AEFI reporting for the CANVAS network. To explore this, we developed a novel smartphone app, recruited participants from a hospital influenza immunization clinic and by word of mouth, and instructed them to download and use the app. The app reminded participants to complete the CANVAS AEFI surveillance surveys (“AEFI surveys”) on days 8 and 30 and a survey capturing app usability metrics at day 30 (“usability survey”), and provided a mechanism to report AEFI events spontaneously throughout the whole study period. All survey results and spontaneous reports were recorded on a privacy-compliant cloud server. A software plug-in, Lookback, was used to record the on-screen experience of app sessions. Of the 76 participants who consented to participate, 48 (63%) successfully downloaded the app and created a profile. In total, 38 unique participants completed all of the required surveillance surveys, transmitting 1104 data points (survey question responses and spontaneous reports) from 83 completed surveys, including 21 usability surveys and one spontaneous report. In total, we received information on new or worsening health conditions after receiving the influenza vaccine from 11 (28%) participants. Of the usability survey responses, 86% agreed or strongly agreed that they would prefer to use a mobile app-based reporting system instead of a web-based system. The single spontaneous report received was from a participant who had also reported using the Day 8 survey. Among sessions observable in Lookback, the accurate transmission proportion for data points was 100% (n=290). We demonstrated that a mobile app can be used for AEFI reporting, although download and survey completion proportions suggest potential barriers to adoption. Future studies should examine implementation of mobile reporting in a broader audience and its impact on the quality of reporting of adverse events following immunization.

Introduction

Post-market surveillance to identify adverse events following immunization (AEFI) is a critical component of immunization programs globally.Citation1,2 It is particularly important for seasonal influenza vaccines, which are modified annually to include up-to-date circulating strains and, thus, must be monitored for safety each year. In Canada, in addition to passive safety surveillance conducted by provinces and territoriesCitation3 and active surveillance conducted by the Canadian Immunization Monitoring Program Active (IMPACT),Citation4,5 the Canadian National Vaccine Safety (CANVAS) network monitors for AEFIs related to the seasonal influenza vaccine. The CANVAS program relies on web-based survey responses from adults and parents of children 8 d following vaccine receipt regarding their post-immunization experience. Additional telephone follow-up is used for reports of severe adverse events.Citation6

Mobile devices have enhanced the provision of health care for both individuals and providers. Smartphones offer an opportunity to enhance immunization practice and facilitate tailored communication between individuals and public health officials.Citation7-9 With increased mobile device and application (app) usage, the potential exists to capture, transmit, and monitor post-immunization experience information in real time using personal mobile devices. Thus, mobile apps could potentially serve as a mechanism for AEFI reporting. Specifically, we saw an opportunity to use mobile apps to complement the efforts of the CANVAS network by providing individuals with the ability both to answer post-vaccination surveys using their mobile devices and to spontaneously report AEFIs as they occur, instead of waiting until the time of survey distribution. However, the feasibility and usability of mobile reporting of AEFIs are unknown. We developed a standalone CANVAS app for iPhones and sought to examine whether it could facilitate AEFI reporting for the CANVAS network.

The objectives of this proof of concept study were to determine the feasibility and usability of mobile AEFI reporting for the CANVAS network. Specifically, we sought to: 1) determine whether participant data from smartphone devices could be accurately transmitted to a secure cloud server accessible by study staff, by determining the proportion of data elements accurately transmitted (i.e., the accurate transmission proportion), and 2) evaluate participant usability of the app as measured by download rates, in-app AEFI survey response rates, and responses to a usability survey. We also report on the use of the app to report AEFIs.

Results

Participant recruitment and response rates

A total of 76 individuals consented to participate in this study: 59 (78%) were recruited via hospital immunization clinics and 17 (22%) via word of mouth (Appendix A). There were 48 (63%) participants who downloaded and logged into the app using an email address provided at enrollment. Throughout the study period, 38 unique participants submitted 83 completed surveys. Of the AEFI surveys, there were 35 Day 8, 24 Day 30 and 3 combined Day 8/30 surveys. Twenty-one participants completed usability surveys: 15 (71%) recruited via hospital clinics and 6 (29%) via word of mouth (Appendix A).

Figure 1. Overview of participant recruitment and response rates.


AEFI surveys

Day 8 AEFI surveys were completed by 38 participants (50% of total participants, 79% of those who downloaded the app). Day 30 AEFI surveys were completed by 27 participants (35% of total participants, 56% of those who downloaded the app). Three participants chose to complete the Day 8 and Day 30 AEFI surveys through a combined “Day 8 and Day 30” AEFI survey. Of the 38 who completed a Day 8 AEFI survey, participant ages ranged across the brackets 20-29 (34%), 30-39 (29%), 40-49 (26%) and 50-59 (11%) (Table 1). Half (19) of participants were male and half (19) were female. All reported for themselves, rather than for a family member. Over 80% (32) had received their seasonal influenza vaccine for the past 2 influenza seasons; 31 (82%) received it in a hospital immunization clinic, 3 at pharmacies and 4 at public health or other types of clinics. It took participants an average of 12 d from the time of vaccination to submit the Day 8 survey (4 d from survey availability) and 32 d from vaccination to submit the Day 30 survey.

Table 1. Demographics of study participants who submitted survey responses (n = 38).

Usability surveys

Usability surveys were completed by 21 participants (28% of total participants, 44% of those who downloaded the app). Of those who completed the usability survey (n=21), the average age was 34.7 y (IQR 13) (Table 2).

Table 2. Demographics of study participants who completed the usability survey (n = 21).

Reporting of AEFIs

In total, we received an AEFI report on new or worsening health conditions after receiving the influenza vaccine from 11 (28%) participants. We received 5 reports via the Day 8 AEFI survey, 4 via the Day 30 AEFI survey and 2 via the combined Day 8 and Day 30 survey. The single spontaneous report received was from a participant who had also reported using the Day 8 AEFI survey.

Of the Day 8 AEFI survey responses, 3 participants indicated they experienced a new health problem or exacerbation of an existing health problem in the first week following vaccination. All indicated this problem was “not severe enough to miss work/school or prevent/stop normal activities” and did not report it to a health care provider. Two additional participants answered, “Yes” to “Do you have any other concerns that you would like us to know about” and specified onset of new health conditions within 24 hours of vaccination.

Of the Day 30 AEFI survey responses, 4 participants reported developing a new health problem or exacerbation of an existing health problem in the last month following the flu shot. These four participants were distinct from the 5 who reported events in the Day 8 AEFI survey. Three provided details on the condition and 2 of the participants who reported conditions in the Day 30 AEFI survey said this was severe enough to “miss work/school or prevent/stop normal activities.” None of the participants saw their health care provider.

Of the participants who completed the combined Day 8 and Day 30 AEFI survey, one reported a condition in the first week that was severe enough to see their health care provider (specified as “Clinic/Family Physician”) and the other reported a condition in the first week that was not severe enough to miss work/school or prevent/stop normal activities or to see a health care provider.

Accurate transmission proportion

A total of 374 data elements from completed AEFI surveys and spontaneous reports were sent from the app and recorded on the server throughout the study period (Appendix A). In comparison, 321 data points were observed in AEFI surveillance and spontaneous reporting sessions captured using Lookback. Overall, 291 participant-entered data elements were captured by Lookback and matched with the server, meaning they were successfully submitted to the server and observed as identical in Lookback. All of the data points from completed, submitted surveys that were observed in properly recorded Lookback sessions were identical, resulting in an accurate transmission proportion of 100%.

The discrepancy between the number of participant-entered data elements observed by Lookback (321) and the number present on the server (374) exists because the Lookback software failed to record some sessions. On average, 25% of data points were not observed in Lookback (Appendix A, Table 5).

Usability

Among the 21 participants who completed the usability survey, half (11) had reported events during the study period. The average age of participants who completed the usability survey was 34.7 y (IQR 13) and their mean technology readiness index score was 3.35 (Table 2). Our sample scored slightly lower than average in the dimension of “Optimism” (3.73 compared to the public average of 3.75), which describes the degree to which an individual has a positive view of technology and a belief that it offers people increased control, flexibility and efficiency in their lives. Approximately half (52%) were male and the majority (89%) had some form of university or college education. On average, individuals had owned their device for 5 y. Of the participants, 86% agreed or strongly agreed that they would “prefer to use a mobile app-based reporting system, such as the AEFI app, instead of a web-based system on a computer,” whereas only 14% disagreed, preferring the web-based method (Table 3). Participants agreed or strongly agreed that it was easy to navigate the app (95%), create a user profile (86%), locate the AEFI surveys in the app (67%), and complete the AEFI surveys (76%).

Table 3. Usability Survey Responses (n=21).

Table 4. App usability feedback submitted by participants.

Participants also agreed or strongly agreed that the app was inviting to use (62%), that it was an efficient and realistic method of reporting vaccine adverse events (91%) and that they found it convenient to complete AEFI surveys on their mobile device (85%). Only 14% of participants agreed, and no one strongly agreed, that they found it difficult to complete a task within the app on one or more occasions (Table 3). Throughout the study period, we received 3 pieces of feedback from participants related to app usability (Table 4).

Discussion

The primary objective of this proof of concept study was to develop and test the functionality of a mobile app for the monitoring of safety outcomes after influenza immunization. Throughout the study period, we received 62 AEFI surveys, 1 spontaneous report, 3 pieces of usability feedback and 21 usability surveys. Participants reported a total of 11 events following immunization, one of which was categorized as serious. We were able to demonstrate that a mobile app could be used to successfully and accurately transmit data from vaccine recipients to a secure server, as measured by a 100% accurate transmission proportion. These results must be interpreted in the context of the proof of concept nature of the study, and there may be limitations to the generalizability of the findings. Our findings support the potential of utilizing mobile applications to enhance or complement adverse event reporting following immunization.

However, a functioning app is not sufficient for this purpose if there are significant barriers to use, and data from this study suggest this is an important consideration.Citation10 Only 63% of recruited participants successfully downloaded the app and logged in. Of those who did, less than half (43%) completed the usability survey. While there was support for the use of mobile apps for AEFI reporting in this subset, it is difficult to draw conclusions based on our low response rate. Additionally, we only recruited individuals who owned and used iOS devices. Apple's smartphone market share is approximately 30-35%, compared to Android's estimated share of over 60%, which may have introduced challenges in recruiting participants.Citation11

Another potential impediment to post-recruitment adoption of the app may have been related to user motivation. Logistical barriers to downloading mobile apps are greater than barriers to web usage and may have discouraged participants from engaging from the beginning. At the time of the study, downloading an app on an iPhone required first locating the app in the App Store and then authorizing the download through entry of a password. Shortly after, Apple released an update allowing users to disable password entry when downloading free applications from the App Store, which may reduce this inconvenience in the future. Including AEFI reporting features within an existing app on a user's device, such as an app for vaccine tracking, might also reduce this barrier.

Strategies to facilitate downloading and use of mobile applications center upon the Technology Acceptance Model concept of “perceived usefulness.”Citation12 Perceived usefulness can be thought of as the degree to which a person believes using a tool would enhance their ability to perform a task in home or work life. With this in mind, an application solely for AEFI reporting may not be useful enough to motivate people to download it. However, if AEFI reporting tools were integrated into another app that serves a useful purpose for the user, they may be motivated to engage in the process. In either case, a scenario where the app would be downloaded by a health care provider may also improve usage.

Another possible factor contributing to the low response rate is user adeptness or comfort with new technologies. Even among the 21 who completed the usability survey, the mean technology readiness score was 3.35, compared to the public average of 3.02.Citation13 Our sample scored higher in Innovativeness and lower than average in Discomfort, Optimism and Insecurity, potentially indicating that our population was more comfortable trying new technologies than the general public, although it is difficult to draw conclusions given our small sample size. Given that less than 30% of total recruits completed the usability survey, it is plausible that the participants who consented but did not download the app or complete the usability survey would have differed significantly from those for whom we did obtain scores.

Assuming the barriers to use can be overcome, significant advantages exist for post-market vaccine safety monitoring using mobile apps. Our observation that 11 participants (28%) used the app to report some sort of event, with one event meeting the criteria of “severe,” suggests the potential of this modality to facilitate reporting. Apps provide a direct, accessible mechanism for individuals to spontaneously report AEFIs as they occur, which could potentially improve reporting time. The Canadian Adverse Events Following Immunization Surveillance System relies primarily on immunization or health care providers to report adverse events to local public health units.Citation14,15 Provincial/territorial and local health officials collect and analyze these reports, which contribute to their post-market surveillance activities. Relying solely or mostly on health care providers to complete AEFI reports could serve as a barrier to AEFI surveillance.

Mobile reporting of AEFIs using SMS and web-based reporting has previously been examined in several settings, including Australia and Cambodia, and was demonstrated to have high response rates and to provide real-time reporting.Citation16-20 Mobile AEFI reporting using apps could also leverage other smartphone functionality. Assessment of local reactions or rash could be facilitated if individuals were able to photograph lesions and transmit them with their reports. Using smartphone cameras to scan the 2D bar codes on vaccine vials could permit the integration of the lot number and global trade identification number with the AEFI report.Citation21,22 This would improve the quality of the information received by public health officials and assist in the identification of lot-specific issues, as well as assist individuals by automating identification of the product instead of requiring entry via a drop-down menu. This capability would also permit public health officials to send notifications about defective lots, with the app determining whether the individual requires notification.

There are potential disadvantages to mobile AEFI reporting. Enabling individuals to more easily report AEFIs could result in a high number of reports of mild reactions that are not currently under surveillance, which would have resource implications, as they may require follow-up by local public health agencies and introduce new challenges for signal detection. However, one study of consumer reporting of AEFIs demonstrated a higher rate of serious events being reported by vaccinees than by health care providers, although more research is needed to explain why this occurs.Citation23,17 The impact of an app that facilitates mobile AEFI reporting on vaccine hesitancy also needs to be studied. Mobile AEFI reporting permitting direct reporting by individuals would resemble the United States Vaccine Adverse Event Reporting System (VAERS) in several ways and may share the advantages and disadvantages of this system.Citation24,25 Given that we have determined that the technology is functional and secure, a more comprehensive study is warranted to examine these questions.

While this was a proof of concept, there are some important strengths to this study. To the best of our knowledge, this is the first evaluation of mobile monitoring of vaccine safety in North America. Globally, at least one other adverse event reporting app exists.Citation26 iDsurv was developed by the Indian Academy of Pediatrics as part of its Infectious Disease Surveillance program for reporting serious AEFIs, with the objectives “to develop an early warning system for pediatric infectious diseases in India, to generate data on burden of infectious diseases in India and to generate data on serious AEFI.”Citation27,28 Our use of Lookback was also unique, as it permitted us to assess whether the data entered by the user matched the data on the server.

The primary limitations of this study relate to its sample size and the generalizability of the results, given the nature of those who would participate in a technology-oriented study. As this was a proof of concept study focused on determining whether the technology was functional, it was not powered to examine usability as a primary outcome. A related limitation is that we only have usability data on those who used the app and completed the usability survey at day 30, which likely represented a biased sample. In the future it would be useful to have baseline surveys of participants to determine whether those who completed the study differed from those who did not. A final limitation is that 25% of completed AEFI survey data was not recorded in Lookback as a result of Lookback failure and could therefore not be compared with the server data. However, it is unlikely that the unrecorded data would differ systematically from the rest of the data with respect to accurate transmission proportions.

We have demonstrated that a mobile AEFI app is functional and may have the potential to complement existing post-market surveillance systems. Whether this technology is usable and acceptable to a large enough segment of the population to be of value still requires elucidation. While this study suggests a standalone app is technically feasible, we believe that integration into a general immunization app could encourage downloads and usage. In this scenario the app could be useful not only for health care workers' influenza AEFI reporting, as in this study, but also for general adverse event reporting. Future studies should examine broader implementation of mobile AEFI reporting to assess its acceptability and whether it has an impact on the quality and comprehensiveness of reporting of adverse events following immunization.

Methods

Population and setting

English-speaking individuals over the age of 18 y who owned and used a compatible smartphone were eligible to participate in this study. Participants were required to use an iPhone running iOS 7 or higher. We formally recruited participants face-to-face for one day at a staff and family members' influenza immunization clinic run by Occupational Health at The Ottawa Hospital in November 2014, as well as through word of mouth and posters in the Occupational Health offices at the Civic Campus. The Ottawa Hospital is one of the largest teaching hospitals in Canada, with over 1100 beds across 3 campuses and over 1300 physicians. The clinic used for recruiting was held on a Saturday (11/16) at the Riverside, a day hospital with no inpatient beds. The clinic was open to staff members and their immediate family members only. After being immunized, attendees were asked to wait in a post-immunization waiting area in case of adverse reactions. Two study staff members approached potential participants during this waiting period and asked attendees if they were interested in participating. If they agreed, written informed consent was collected immediately. Participants had to provide study staff with their email address at the time of enrollment. Study staff then registered participants' email addresses in the backend administration system, permitting users to access the app after downloading it. When an email address was registered in the administration system, it generated an email to the participant with a link to download the app. Participants who clicked on this link were taken to the CANVAS app in the iOS App Store, where they could download the app, log in and create a profile. If participants did not download the app and log in using the email address provided, a reminder email notification with the link to the app was generated and sent to the address provided at enrollment.

In addition to the clinic, we recruited informally through word of mouth and posters in Occupational Health offices throughout November and December 2014. For participants recruited by this method, the same process of collecting consent and registering email addresses was followed.
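
Purely as an illustration of the enrollment flow just described, a hypothetical backend routine might register the email address, trigger the download-link email, and later send a reminder to anyone who has not yet logged in. None of these type or function names come from the study's actual administration system.

```swift
// Hypothetical participant record in the backend administration system.
struct Participant {
    let email: String
    var linkEmailSent = false
    var hasLoggedIn = false
}

// Registering the email unlocks app access and triggers the download-link email.
func register(_ participant: inout Participant, sendEmail: (String, String) -> Void) {
    sendEmail(participant.email, "Download the CANVAS app: <App Store link>")
    participant.linkEmailSent = true
}

// Participants who have not yet downloaded the app and logged in receive a reminder.
func remindIfNeeded(_ participant: Participant, sendEmail: (String, String) -> Void) {
    if participant.linkEmailSent && !participant.hasLoggedIn {
        sendEmail(participant.email, "Reminder: please download the CANVAS app and log in.")
    }
}
```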

The study was approved by the Ottawa Health Science Network Research Ethics Board and the University of British Columbia Research Ethics Board.

Intervention

Canadian national vaccine safety network

Since 2009, the CANVAS network has gathered and analyzed safety data on thousands of individuals who receive the influenza vaccine to provide select safety information during the early stages of the annual influenza vaccination campaign.Citation29,30 Participants for CANVAS are recruited from acute care, public health and pharmacy influenza vaccination clinics in 5 provinces across Canada (British Columbia, Alberta, Ontario, Quebec, Nova Scotia) and complete a short web-based survey 8 d after immunization. Surveys identify whether participants had severe events, defined as any new health problem or the exacerbation of an existing condition severe enough to cause work or school absenteeism, prevent daily activities, or require a medical consultation. Trained nurses follow up on selected reported severe events for validation. Longer-term follow-up at 28 d and 6 months post-vaccination occurred in previous years but has not been included in recent years.

CANVAS surveys

The AEFI surveillance surveys administered through the app were identical to surveys used by CANVAS in either the 2014/2015 influenza season or prior seasons. Their content was not modified for the app.

The Day 8 survey contained a maximum of 10 questions. The first 5 questions pertained to demographics and history of influenza vaccination, including age bracket, on whose behalf the survey was being completed and where the flu shot was received. Participants were then asked, “In the first week (7 days) after your flu shot did you develop a new health problem or did an existing health problem get worse?” If respondents answered “no,” they were directed to the final survey questions. If they answered “yes,” they were asked an additional 2 questions gathering details on their changed health condition.

The Day 30 survey contained a maximum of 8 questions, the first asking on whose behalf the survey was being completed, before moving directly to “In the last month after your flu shot did you develop a new health problem or did an existing health problem get worse?” Again, if respondents answered “no,” they were directed to the final survey questions. If they answered “yes,” they were asked an additional 2 questions gathering details on their changed health condition.

The combined Day 8 and Day 30 survey had a maximum of 23 questions, covering the same demographic questions as the surveys above and asking about health problems occurring both within the first week following vaccination and in the subsequent days leading up to Day 30.
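
To make the yes/no branching described above concrete, the following is a minimal sketch in Swift of how such a skip pattern could be modeled. The types, identifiers and question numbering are illustrative and are not taken from the study's actual implementation.

```swift
// Illustrative model of the yes/no branching used by the AEFI surveys.
enum NextStep {
    case question(id: Int)   // continue to a specific follow-up question
    case finalQuestions      // skip ahead to the closing questions
}

struct SurveyQuestion {
    let id: Int
    let text: String
    let branch: (String) -> NextStep   // maps an answer to the next step
}

// The Day 8 gate question: "no" skips to the final questions,
// "yes" leads to the follow-up questions about the health problem.
let day8Gate = SurveyQuestion(
    id: 6,
    text: "In the first week (7 days) after your flu shot did you develop a new health problem or did an existing health problem get worse?",
    branch: { answer in
        answer.lowercased() == "yes" ? .question(id: 7) : .finalQuestions
    }
)
```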

Usability survey

Unlike the AEFI Surveillance surveys, the usability survey was developed specifically for use in this study. The survey was first pilot tested on 10 individuals to ensure the instructions and questions were clear. Data from the pilot surveys were not included in the final analysis.

The survey collected additional participant demographic information (e.g. age, education, employment status, experience with smartphones, smartphone device currently using, internet use), opinions regarding perceived ease of use and future intent to use the app, and concerns about the security and privacy of personal health information while communicating from mobile devices. The survey also measured key usability domains of the app and participant baseline technology readiness index (TRI) scores.Citation13,31 TRI quantifies 4 attitudes toward technology, 2 motivators (Optimism, Innovativeness) and 2 inhibitors (Discomfort and Insecurity) which gauge individual propensity to adopt a new technology to accomplish a goal in home or work life.
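
As a rough illustration only (the TRI 2.0 instrument, its exact items and its scoring belong to Parasuraman and Colby and are not reproduced here), an overall readiness score can be derived by averaging the two motivator dimensions together with the reverse-coded inhibitor dimensions of 5-point Likert items:

```swift
// Hypothetical aggregation of TRI-style dimension scores; not the published TRI 2.0 scoring.
struct TRIResponses {
    let optimism: [Double]        // motivator items (1-5 Likert)
    let innovativeness: [Double]  // motivator items (1-5 Likert)
    let discomfort: [Double]      // inhibitor items (1-5 Likert)
    let insecurity: [Double]      // inhibitor items (1-5 Likert)
}

func mean(_ xs: [Double]) -> Double { xs.reduce(0, +) / Double(xs.count) }

func overallReadiness(_ r: TRIResponses) -> Double {
    // Inhibitor items are reverse-coded (6 - score) so that a higher overall value
    // always indicates greater readiness to adopt a new technology.
    let dimensions = [
        mean(r.optimism),
        mean(r.innovativeness),
        mean(r.discomfort.map { 6 - $0 }),
        mean(r.insecurity.map { 6 - $0 })
    ]
    return mean(dimensions)
}
```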

App usability was captured on a 5 point Likert scale with questions such as: “I would prefer to use a mobile app-based reporting system, such as the AEFI app, instead of a web-based system on a computer,” “I found it easy to navigate through the app,” “I found it easy to complete the adverse event surveys,” and “the CANVAS app is an efficient and realistic method of reporting vaccine adverse events.”

Mobile app and study procedure

The CANVAS app (Appendix B) was developed on the iOS platform for this study.Citation32 The app included a profile section, where participants were asked to enter their name, phone number, influenza vaccine date and the vaccine product received. A drop-down menu was available for participants to select the product received (Agriflu®, Fluad®, FluMist®, Fluviral®, FluZone®, Influvac®, Intanza®, Vaxigrip®, don't know). The app was pre-programmed with the capability of delivering all 4 surveys. We also included the opportunity for participants to report AEFIs spontaneously at any time, outside of the CANVAS surveys.

Participants were prompted to complete the surveys via automatically generated internal notifications at days 8 and 30 after immunization, similar to the email reminders sent to participants in the current validated CANVAS web-based methods. Additional automated internal notifications at days 11 and 33 after immunization were sent to participants who had not yet completed their survey. At day 30, a combined Day 8 and Day 30 survey was provided to participants who had not completed the Day 8 survey.
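
As a hedged sketch, reminders pegged to the immunization date could be scheduled as local notifications along the following lines. This example uses the current UserNotifications framework, whereas the study app targeted iOS 7 and its actual notification implementation is not described.

```swift
import Foundation
import UserNotifications

// Schedule survey reminders at fixed offsets (in days) from the vaccination date.
// Days 8 and 30 are the survey prompts; days 11 and 33 are follow-up reminders,
// which in practice would be cancelled once the participant completes the survey.
func scheduleSurveyReminders(vaccinationDate: Date) {
    let center = UNUserNotificationCenter.current()
    for offset in [8, 11, 30, 33] {
        guard let fireDate = Calendar.current.date(byAdding: .day, value: offset, to: vaccinationDate) else { continue }

        let content = UNMutableNotificationContent()
        content.title = "CANVAS survey reminder"
        content.body = "Please open the app and complete your post-vaccination survey."

        let components = Calendar.current.dateComponents([.year, .month, .day, .hour, .minute], from: fireDate)
        let trigger = UNCalendarNotificationTrigger(dateMatching: components, repeats: false)
        let request = UNNotificationRequest(identifier: "survey-reminder-day-\(offset)", content: content, trigger: trigger)
        center.add(request)
    }
}
```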

At completion of the Day 30 AEFI survey, participants were prompted to complete the usability survey to evaluate their user experience with the app. The app also provided a mechanism for participants to provide feedback to the research team on app performance and usability throughout the study period and to request technical assistance.

In each survey, those who reported no health changes after immunization were taken to the end of the survey and exited the app. Participants who reported a serious event (i.e., an event that prevented daily activities or resulted in a medical visit) were asked additional questions about the event.

All data entered into the app by the user were encrypted before being transmitted to the secure server. Study staff were provided with a web portal through which to access the survey results and user data. An alert system was built in to notify staff if specific severe adverse events were reported through the app. Participants were notified in the app that they might be contacted by a nurse if they reported a severe adverse event.
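
The paper does not describe the encryption scheme or key management used, so the following is only an assumed illustration of on-device encryption of a survey payload before upload; it uses the modern CryptoKit framework rather than anything available to the iOS 7-era app.

```swift
import Foundation
import CryptoKit

// Hypothetical example: serialize a survey payload and encrypt it with AES-GCM
// before it is sent to the server. Key distribution is out of scope here.
func encryptedPayload(_ payload: [String: String], using key: SymmetricKey) throws -> Data? {
    let json = try JSONSerialization.data(withJSONObject: payload)
    let sealedBox = try AES.GCM.seal(json, using: key)
    return sealedBox.combined   // nonce + ciphertext + authentication tag
}

// Usage sketch:
// let key = SymmetricKey(size: .bits256)
// let blob = try encryptedPayload(["q6": "no"], using: key)
```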

Data collection

The proportion of participants who downloaded the app and completed the Day 8, Day 30 or combined surveys, or who used the spontaneous adverse event reporting option, was recorded. The date and time at which participants submitted the surveys were also recorded. Responses to each survey question were also recorded on the server and defined as “data points”; these included responses to multiple-choice questions, drop-down menus and free-text fields.

Lookback, a software plug-in, was used to record a screen-capture video of participants' app use.Citation33 Lookback was selected for this study because it permitted study staff to visually compare video of participants generating data points in the app with the data points residing on the secure server. Each app session recorded by Lookback in which participants engaged with an AEFI surveillance survey or a spontaneous report was extracted and manually reviewed by one member of the study staff; observed data points were entered into an Excel spreadsheet and then compared to the data points found on the server. Not all app sessions were successfully recorded in Lookback, as the software was still in its beta testing phase.

Analysis

We defined the accurate transmission proportion as the percentage of survey responses successfully delivered to the server that matched the survey responses sent from the app, as observed using Lookback. The total response rate and the Day 8, Day 30 and combined survey response rates were calculated by dividing the number of individuals who submitted surveys by the total number of participants. We also calculated the mean time to Day 8 and Day 30 survey submission, both from the time of immunization and from when the survey was made available to the participant. We calculated mean values for Likert-scaled responses from the usability survey. We recorded all reports of adverse events as well as comments on the usability of the app.
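
The calculation of the accurate transmission proportion can be expressed compactly. The sketch below assumes each data point is keyed by participant, survey and question, which is an illustrative choice rather than the study's actual data model.

```swift
// Compare Lookback-observed data points against the server copy and compute
// the accurate transmission proportion over the points that could be checked.
struct DataPointKey: Hashable {
    let participantID: String
    let surveyID: String
    let questionID: Int
}

func accurateTransmissionProportion(
    serverData: [DataPointKey: String],        // values recorded on the server
    lookbackObserved: [DataPointKey: String]   // values seen on screen in Lookback
) -> Double {
    // Only data points observed in a recorded Lookback session can be verified.
    let checkable = lookbackObserved.filter { serverData[$0.key] != nil }
    guard !checkable.isEmpty else { return 0 }
    let matched = checkable.filter { serverData[$0.key] == $0.value }.count
    return 100 * Double(matched) / Double(checkable.count)
}
```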

Acknowledgments

Thank you to the Canadian Association for Immunization Research and Evaluations (CARIE) for their networking assistance. Finally, the authors gratefully acknowledge the expert assistance provided by the Vaccine Evaluation Center, public health and hospital collaborators.

Funding

Thank you to our funders, the Public Health Agency of Canada and the Canadian Institutes of Health Research.

References

  • Chen RT, Glasser JW, Rhodes PH, Davis RL, Barlow WE, Thompson RS, Mullooly JP, Black SB, Shinefield HR, Vadheim CM, et al. Vaccine Safety Datalink project: a new tool for improving vaccine safety monitoring in the United States. The Vaccine Safety Datalink Team. Pediatrics 1997; 99(6):765-73; PMID:9164767
  • World Health Organization (WHO) Revised aide-memoire on AEFI investigation. Switzerland: Essential Medicines and Health Products (EMP) Department, 2015. Accessed May 12, 2015.
  • Conference of Deputy Ministers of Health (Canada), Advisory Committee on Population Health and Health Security. National Immunization Strategy: Final Report 2003. Ottawa, Ont: Public Health Agency of Canada; 2003.
  • Scheifele DW, Halperin SA. Immunization Monitoring Program, Active: a model of active surveillance of vaccine safety. Semin Pediatr Infect Dis. 2003;14(3):213-9; PMID:12913834; http://dx.doi.org/10.1016/S1045-1870(03)00036-0
  • Scheifele D.W. IMPACT after 17 years: lessons learned about successful networking. Paediatr Child Health 2009. 14(1):33-5.
  • Bettinger JA, Vanderkooi OG, MacDonald J, Kellner JD. Rapid Online Identification of Adverse Events Following Influenza Immunization in Children by PCIRN's National Ambulatory Network. Pediatr Infect Dis J 2014; 33:1060-4; PMID:25361187
  • Wilson K, Atkinson KM, Deeks SL, Crowcroft NS. Improving vaccine registries through mobile technologies: a vision for mobile enhanced Immunization information systems. Journal of the American Medical Informatics Association : JAMIA. 2016; 23:207-11.
  • Wilson K, Atkinson K, Deeks S. Opportunities for utilizing new technologies to increase vaccine confidence. Expert Rev Vaccines. 2014(0):1-9.
  • Wilson K, Atkinson KM, Westeinde J. Apps for immunization: Leveraging mobile devices to place the individual at the center of care. Human vaccines & immunotherapeutics. 2015;11(10):2395-9; PMID:26110351
  • Atkinson KM, Ducharme R, Westeinde J, Wilson SE, Deeks SL, Pascali D, et al. Vaccination attitudes and mobile readiness: A survey of expectant and new mothers. Hum Vaccin immunother 2015; 11(4): 1039-45.
  • Kantar Worldpanel ComTech. Smartphone OS sales market share. Kantar Worldpanel http://www.kantarworldpanel.com/global/smartphone-os-market-share/intro. Accessed January 5, 2015.
  • Davis FD. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS quarterly. 1989; 13:319-40; http://dx.doi.org/10.2307/249008
  • Parasuraman A, Colby CL. An updated and streamlined technology readiness index TRI 2.0. Journal of Service Research. 2014; 18(1): 59-74; PMID:25530983
  • Law BJ, Laflèche J, Ahmadipour N, Anyoti H. Canadian Adverse Events Following Immunization Surveillance System (CAEFISS): Annual report for vaccines administered in 2012. Canada Communicable Disease Report 2015; 40(S3):7.
  • National Advisory Committee on Immunization (NACI) Vaccine safety and adverse events following immunization. Part 2. Recommended Immunization. Active Immunizing Agents-Influenza Vaccine. Canadian Immunization Guide. 7th ed. Her Majesty the Queen in Right of Canada represented by the Minister of Public Works and Government Services Canada, 80-4.
  • Regan AK, Blyth CC, Effler PV. Using SMS technology to verify the safety of seasonal trivalent influenza vaccine for pregnant women in real time. Med J Aust. 2013;199(11):744-6; PMID:24329634; http://dx.doi.org/10.5694/mja13.10712
  • Clothier HJ, Selvaraj G, Easton ML, Lewis G, Crawford NW, Buttery JP. Consumer reporting of adverse events following immunization. Hum Vaccin Immunother 2014;10(12):3726-30; PMID:25483686
  • Regan AK, Blyth CC, Mak DB, Richmond PC, Effler PV. Using SMS to monitor adverse events following trivalent influenza vaccination in pregnant women. Aust N Z J Obstet Gynaecol 2014; 54(6):522-8; Epub 2014 Oct 11; PMID:25306915; http://dx.doi.org/10.1111/ajo.12266
  • Leeb A, Regan AK, Peters IJ, Leeb C, Leeb G, Effler PV. Using automated text messages to monitor adverse events following immunisation in general practice. Med J Aust. 2014; 200(7):416-8; PMID:24794676; http://dx.doi.org/10.5694/mja13.11166
  • Baron S, Goutard F, Nguon K, Tarantola A. Use of a text message-based pharmacovigilance tool in Cambodia: pilot study. J Med Internet Res 2013;15(4); PMID:23591700; http://dx.doi.org/10.2196/jmir.2477
  • Pereira JA, Quach S, Hamid JS, Quan SD, Diniz AJ, Van Exan R, et al. The integration of barcode scanning technology into Canadian public health immunization settings. Vaccine. 2014; 32:2748-55.
  • Pereira JA, Quach S, Hamid JS, Heidebrecht CL, Quan SD, Nassif J, Diniz AJ, Van Exan R, Malawski J, Gentry A, et al. Exploring the feasibility of integrating barcode scanning technology into vaccine inventory recording in seasonal influenza vaccination clinics. Vaccine. 2012; 30(4):794-802; PMID:22119585; http://dx.doi.org/10.1016/j.vaccine.2011.11.043
  • Parrella A, Gold M, Braunack-Mayer A, Baghurst P, Marshall H. Consumer reporting of adverse events following immunization (AEFI): identifying predictors of reporting an AEFI. Hum Vaccin Immunother 2014; 10(3):747-54; PMID:24406315
  • Varricchio F, Iskander J, Destefano F, Ball R, Pless R, Braun MM, Chen RT. Understanding vaccine safety information from the vaccine adverse event reporting system. Pediatr Infect Dis J. 2004; 23(4):287-94; PMID:15071280; http://dx.doi.org/10.1097/00006454-200404000-00002
  • Braun MM. Vaccine Adverse Event Reporting System (VAERS): usefulness and limitations. Institute for Vaccine Safety; 2006. http://www.vaccinesafety.edu/VAERS.htm
  • Google Play Store (2015). IDsurv app info page. Accessed May 12, 2015 from https://play.google.com/store/apps/details?id=com.idsurv&hl=en.
  • Thacker D, Vashishtha V, Bansal C, Yewale V, Saxena V. Novel surveillance system demonstrates burden of enteric fever in India. Int J Infect Dis 2014; 21:47; http://dx.doi.org/10.1016/j.ijid.2014.03.515
  • Thacker DN, Vashishtha V, Bansal CP, Yewale V, Saxena V. Novel surveillance system demonstrates burden of enteric fever in India.16th International Congress on Infectious Diseases. 2014; 21:47.
  • De Serres G, Gariepy M-C, Coleman B, Rouleau I, McNeil S, Benoît M, McGeer A, Ambrose A, Needham J, Bergeron C, et al. Short and Long-Term Safety of the 2009 AS03-Adjuvanted Pandemic Vaccine. PLoS One 2012; 7(7):e38563; PMID:22802929; http://dx.doi.org/10.1371/journal.pone.0038563
  • Bettinger J, Rouleau I, Gariépy M, Bowie W, Valiquette L, Vanderkooi O, et al. Successful methodology for large-scale surveillance of severe events following influenza vaccination in Canada, 2011 and 2012. Euro Surveill 2014; 20(29)
  • Parasuraman A. Technology Readiness Index (TRI) a multiple-item scale to measure readiness to embrace new technologies. J Serv Res 2000; 2(4):307-20; http://dx.doi.org/10.1177/109467050024001
  • Ottawa Hospital Research Institute. Canadian National Vaccine Safety Network iTunes Preview; URL:https://itunes.apple.com/ca/app/canadian-national-vaccine/id938262382?mt=8. Accessed December 2014.
  • Lookback Inc. User search simplified. https://lookback.io/ 2015.

Appendix A

 

Figure 1. Total Consenting Individuals Participating in Study.


Table 1. Demographics of total study participants who submitted survey responses.

Table 2. Demographics of study participants who completed the usability survey.

Table 3. Usability Survey Responses Combined.

Table 4. Source of Data Transmitted to Server.

Table 5. Sessions examined in Lookback.
