RSVP IconMessenger: icon-based brain-interfaced alternative and augmentative communication

Pages 192-203 | Received 14 Aug 2014, Accepted 04 Dec 2014, Published online: 22 Dec 2014