A phonology-free mobile communication app

Purpose: Aphasia, the loss of comprehension or expression of language, is a devastating functional sequela of stroke. There are as yet no effective methods for rehabilitation of aphasia. An assistive device that allows aphasia patients to communicate and interact at speeds approaching real time is urgently needed.
Methods: Behavioral and linguistic studies of aphasia patients show that they retain normal thinking processes and most aspects of language. They lack only phonology: the ability to translate input sounds (or written words) such as "ta-ble" into the image of a four-legged object with a top at which one works or eats, and to produce such output.
Results: We have built a phonology-free mobile communication app that may be useful for patients with aphasia and other communication disorders. Particular innovations include calling Google Images as a "subroutine," giving patients a near-infinite number of choices (e.g., food or clothing items) without our having to create countless images, and using animation for words, phrases, or concepts that cannot be represented by a single image. We have tested the app successfully in one patient.
Conclusions: The app may be of great benefit to patients with aphasia and other communication disorders.

Aphasia, the loss of speech, is a devastating consequence of stroke or brain injury. In the 1860s, Paul Broca [1] noted that individuals with damage to the left anterior/frontal lobe retained the ability to understand speech but had great difficulty producing it. In 1874, Carl Wernicke [2] described a different, even more profound form of aphasia. His patients, with damage in the left posterior/temporal lobe, fluently produced speech, but it was incomprehensible gibberish. Furthermore, they could not understand anything said to them, nor could they read. There is still no method of rehabilitation for Broca's or Wernicke's aphasia [3,4].

Patients with aphasia have normal cognitive functioning in everyday situations not requiring language: they will roll up a sleeve for a daily morning injection but extend a hand for pills in the afternoon, and using public transit is usually completely viable. Furthermore, we reported that a Wernicke's patient became visibly upset and physically tried to reverse an illegal chess move made (deliberately) by investigators [5].

Language studies show that even Wernicke's patients may not have lost language, only "phonology": the ability to translate sounds such as "ta-ble" into the image of a four-legged object with a top at which one works or eats [6-9]. These studies raise the exciting possibility that by using visual instead of phonologic input/output, one can bypass this deficit and allow aphasia patients near-normal communication. We designed such a system of visual "boxes" and incorporated it into a mobile phone application, which we describe here.
The home screen of the app is shown in Figure 1. The food (Figure 2) and shopping boxes (Supplementary Video 1; video descriptions in the Appendix) ultimately call Google Images as a "subroutine," giving access to a near-infinite number of choices while minimizing the number of original icons needed. Selections made by a patient can be saved in a folder with a thumbnail of the box of origin.
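The "subroutine" call can be thought of as mapping a chain of icon selections onto an image-search query. The sketch below illustrates one way this might work; the names (`Selection`, `buildImageSearchUrl`) and the exact query format are illustrative assumptions, not taken from the app's source.

```typescript
// Illustrative sketch: turning a patient's icon selections into a
// Google Images search URL, so the app need not store its own image
// for every possible item.
interface Selection {
  box: string;      // top-level box, e.g. "food"
  category: string; // intermediate choice, e.g. "breakfast"
  item: string;     // final choice, e.g. "pancakes"
}

function buildImageSearchUrl(sel: Selection): string {
  // tbm=isch requests Google's image-search results
  const query = encodeURIComponent(`${sel.category} ${sel.item}`);
  return `https://www.google.com/search?tbm=isch&q=${query}`;
}
```

Handing the final step to a general image search is what keeps the icon set small: the app only needs original artwork for the boxes and categories, not for every individual item a patient might want.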
Pointing out specifics in a complex distant scene is extremely difficult for aphasic patients, but the in-app camera makes this possible using a moveable arrow pointer (Supplementary Video 2). This feature lets the user store pictures of important people, places, or things as ''favorites'' into folders.
The needs and wants of aphasic patients can also be nonmaterial, such as a desire to relax or to accurately express an emotional reaction. To allow users to communicate a desire to engage in hobbies and relaxation, we developed an entertainment box with integrated social interaction (Figure 3a). The emotions box provides a completely pictorial, nonverbal system for communicating feelings, with sliders to accurately quantify input and translate it into phonologic output (Figure 3b).
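The slider-to-phrase translation can be sketched as quantizing a slider position into an intensity modifier for the chosen emotion. The thresholds and wording below are assumptions for illustration only.

```typescript
// Illustrative sketch of the emotions box: a slider value (0-100) is
// quantized into an intensity word and combined with the selected
// emotion icon's label to form the phonologic output.
function emotionPhrase(emotion: string, intensity: number): string {
  if (intensity < 34) return `slightly ${emotion}`; // lower third of slider
  if (intensity < 67) return emotion;               // middle third
  return `very ${emotion}`;                         // upper third
}
```

The key design point is that the patient never handles words: they manipulate only a picture and a slider, and the phrase is generated on their behalf for the listener.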
We also have a box that brings up icons for common conversational phrases. Despite the old saying that a picture is worth a thousand words, we found some phrases, such as "congratulations," "thank you," "how are you," or even "pass the salt," that we could not effectively illustrate with a single figure. Instead, we used animated video loops to express these phrases. Using icons and animations, together with selected pictures from the customized folders, the in-app camera, and the other boxes of the app, a Wernicke's patient can effectively hold a two-way conversation with another individual about both physical and nonphysical topics (Supplementary Video 3).
A few features of the app serve very specialized purposes. We created a medical box that can facilitate the patient-physician exchange, in particular the history of present illness, which may be known only to the patient (Figure 4a). (Other aspects of the medical history, such as current medications, allergies, and past surgeries, can be entered into the app by family members or caretakers.) Our calendar box gives users the ability to convey time and date without words. Users can add medications, appointments, and custom events via the in-app camera to the pictorial calendar, which is closely integrated with the medical box (Figure 4b).

In emergency situations, the lack of phonology poses a potentially dangerous problem. Our help box serves as a workaround: users can program the phone to shout "Help!" or their caretaker's name, as well as to call a starred contact shown as a photo icon (Supplementary Video 4).

We have included tutorials for each box (a sample is illustrated in Figure 5) and for navigation around the home screen and output menus. The tutorials are composed entirely of pictures and symbols, training the user to use our icons as a phonology replacement, and are accessible from the red question mark displayed on all screens. Large, bold, white icons against a dark background provide increased visual contrast for these often elderly and visually impaired patients.
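The help box behavior described above can be sketched as a small piece of logic that decides what to shout and whom to dial; actual speech and dialing would go through platform APIs (e.g., text-to-speech and `tel:` links). The names and structure here are illustrative assumptions.

```typescript
// Illustrative sketch of the help box: compute the spoken alert and
// the call target from the user's stored configuration.
interface HelpConfig {
  caretakerName?: string; // optional: shout the caretaker's name first
  starredContact: string; // phone number behind the starred photo icon
}

function helpActions(cfg: HelpConfig): { shout: string; dial: string } {
  const shout = cfg.caretakerName ? `${cfg.caretakerName}! Help!` : "Help!";
  return { shout, dial: `tel:${cfg.starredContact}` };
}
```

Keeping the trigger to a single tap matters here: in an emergency, the patient cannot be asked to navigate menus, so everything is precomputed from saved settings.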
Figure 2. Expression of the user's material wishes, desires, and needs through selection of visual input from large arrays: the food box. Selecting a location (takeout, home, or restaurant) and a meal (breakfast, lunch, or dinner) takes the user to a variety of food types for that meal; choosing a food type then leads to a Google Images search for a specific food item.

We studied use of the app in a patient with transcortical motor aphasia (severe expressive/Broca's aphasia with preserved repetition). After only minimal instruction and demonstration, the patient was easily able to navigate and use the app; the food and shopping features were particularly appreciated. When asked whether she liked the app, the patient smiled broadly and said "cool!" The patient's family also thought the app was most helpful, and the patient was excited to demonstrate that she could rapidly order a series of different foods with it. A further benefit is that the app lets patients interact at normal speed. The nurse also found it helpful for communicating with the patient. While we think our app will be helpful for aphasia patients in general, formal controlled trials are necessary to demonstrate this.
Other apps for aphasia exist [10]. Many are designed to teach speech, which is not the purpose of our app; others require the ability to understand and/or produce intelligible speech or writing. To our knowledge, ours is the only app that is fully phonology independent in both content and design.
Our app could also help facilitate language output for patients with communication deficits other than aphasia, for example individuals on the autism spectrum. It could likewise enable detailed testing of language and thinking processes in Wernicke's patients. Furthermore, it could aid communication between neurologically intact individuals who speak different languages, that is, individuals who lack phonology for each other's language. Linguistically, the app raises the interesting question of which aspects of language can be expressed with a single image versus animation. Finally, it could be used to study a question originating with Wernicke himself [2]: why do these patients appear oblivious to the fact that they cannot understand what anyone says to them, and that no one can understand them?