Alternative Access for Paper-based Augmentative and Alternative Communication (AAC)

Description

Individuals with complex communication needs may require an alternative method of accessing their vocabulary to express their wants, needs, and ideas. In this quick win, you will find key words pertaining to AAC, information about three types of scanning and eye gaze, and guidance on how to model these alternative forms of communication.

Why Try Something New?

Individuals with complex communication needs (CCN) who have physical challenges may be unable to point directly to a symbol on their AAC system when they want to say something. Alternative access provides a way for these individuals to communicate.

Key Words

AAC: Stands for Augmentative and Alternative Communication; a tool or support that assists a learner with communicating a message.

CCN: Stands for Complex Communication Needs; describes an individual who has difficulty expressing their wants, needs, and ideas using speech alone.

Communication Partner: Any person who interacts with the AAC user. A communication partner may model language on the AAC tool (to help the person with CCN learn language) or assist the learner in communicating with alternative access (e.g., partner-assisted scanning or low-tech eye gaze). AAC users benefit from interactions with both other AAC users and non-AAC users.

Communication display: The screen, page, or picture that provides the symbol options a learner can use to access vocabulary. Communication displays can be static (i.e., does not change when a symbol, word, or letter is selected), dynamic (i.e., changes when a symbol, word, or letter is selected), or a combination of the two.

Complex body: A way to describe the body of a learner who has physical limitations or impairments that prohibit them from moving in a typical manner. Learners with complex bodies often need an alternative method of accessing an AAC tool or device. 

Label: The spoken and/or printed word paired with a symbol.

Symbol: The image or graphic representing a word or concept on a communication display.

 

How to Model Alternative Access

Individuals with CCN learn how to communicate with their AAC system through “aided language input” (i.e., modeling). Modeling simply means using the AAC tool the same way the individual with CCN is expected to use it, so that the individual can see how it works.

AAC systems can be used by pointing to or touching words and/or symbols. People who need a different way to access their AAC systems use “alternative access,” such as partner-assisted scanning.

Modeling looks a little different for systems set up for alternative access, since the process of partner-assisted scanning takes more time. 

The amount of time you have can influence the type of modeling you use with your learner.

If you have a lot of time, use a full model.

However, if you have less time, a direct model or a partial model is a good start.

Direct Model: The communication partner points to one or more symbols while speaking to communicate a message. This method demonstrates the paths used to express a message, but it does not model how individuals could express themselves using AAC.

Partial Model: The communication partner uses direct pointing to navigate the AAC system until the final page, then uses a full model to demonstrate the scanning required to communicate a message.

Full Model: The communication partner models how the individual could communicate using AAC. This type of modeling takes longer, as it involves using the specific scanning or eye gaze interactions required for the user of AAC to communicate a message. It may be easier to demonstrate a “full model” when two other people are present. One person scans, one person talks; the individual observes the interaction.

Tips for Modeling Alternative Access:

  • Model how the individual initiates or indicates that they have something to say (e.g., vocalizing, activating a switch, raising an arm).
  • Remember that individuals learning a new way to communicate need models of how they can access their AAC system (full model).
  • Because time constraints or distractions can be an issue, direct and partial models can be used more often. 
  • If available, use the “navigational” or “operational” options for all types of models:
    • Navigational options move the individual to another page of words/symbols.
    • Operational options allow the individual to add, edit, or indicate a message is finished (e.g., “oops mistake,” “another word,” or “that’s all I have to say about that”).

Partner-Assisted Visual Scanning

Who Could Benefit: Individuals with CCN whose physical challenges interfere with their ability to point directly to symbols on a display, but who can see the symbols and locate the desired symbol on a page.

Steps: 

  1. The communication partner runs their finger down one column and says “this one.”
  2. Wait for a “yes/no” response, or allow the appropriate processing time to elapse if it has been mutually decided that the individual only gives “yes” responses.
    1. If no: move to the next column and repeat step 1 by running their finger down the column and saying “this one.”
    2. If yes: move on to step 3.
  3. The partner holds their finger beside the first symbol, says “this one,” then waits for the individual’s response.
    1. If no: repeat step 3 by holding their finger beside the next symbol down in the same column.
    2. If yes and the symbol has a navigational option: follow navigation to the appropriate page. Repeat step 1 on the new page.
    3. If yes and the symbol does not have a navigational command: move on to step 4.
  4. After each symbol is selected, if operational options are available (e.g., “oops,” “another word on this page”), the partner scans them as in step 3.
  5. When the individual indicates their message is finished (for example, “that’s all I have to say about that”), the partner repeats the full message and responds as an interactive partner (with spoken language and/or modeling).

Other Tips: 

  • The communication partner should only indicate the columns, symbols, and selections. They do not say the name of the symbol or word.
  • Try not to anticipate or guess what the communicator has to say. Simply follow the steps in the order listed above.
  • Keep an even pace. Don’t lead the individual into saying something based on your nonverbal response or longer pauses. Spend the same amount of time pointing to each column and symbol.

Partner-Assisted Auditory Scanning

Who Could Benefit: Individuals with complex communication needs who have very limited or no vision.

Steps: 

  1. The partner reads a word or group of words aloud.
  2. Wait for a “yes/no” response, or allow the appropriate processing time to elapse if it has been mutually decided that the individual only gives “yes” responses.
    1. If no: repeat step 1 for the next group of words.
    2. If yes: move on to step 3.
  3. The partner reads the first word from the group aloud, then waits for the individual’s response (or for the processing time to elapse).
    1. If no: repeat step 3 by verbally scanning the next word in the same group.
    2. If yes and the word is a navigational command: follow the command to the appropriate page. Repeat step 1 on the new page.
    3. If yes and the word is not a navigational command: move on to step 4.
  4. The partner reads the operational options one at a time (e.g., “oops,” “another word on this page”), repeating the scanning as in step 3.
  5. When the individual indicates their message is complete (for example, “that’s all I have to say about that”), the partner repeats the full message and responds as a communication partner (with spoken language and/or modeling).

Other Tips: 

  • Remember to use a flat tone so you don’t lead the individual to say something other than what they meant to say.
  • Avoid using a rising intonation, as if you are asking a question.

Partner-Assisted Visual + Auditory Scanning

Who Could Benefit: Individuals with CCN and visual challenges who inconsistently locate the desired symbol(s) on a page.

When using visual + auditory scanning, the communication partner points to each symbol and reads its label aloud. The individual may rely on their understanding of the spoken labels, visually recognize the symbols, or both.

  • Often used to enable children to use their current understanding of spoken language to communicate while providing opportunities for them to learn to visually recognize symbols (for future use).
  • May be used with children who have cortical visual impairment to provide visual stimulation that supports the development of their visual skills.
  • May be used with children who do not yet have any language skills, as a form of multi-modal receptive language input.

Burkhart & Porter, ISAAC 2006

Steps: 

  1. The partner reads the group of words in the first column aloud while running their finger down the column.
  2. Wait for a “yes/no” response, or allow the appropriate processing time to elapse if it has been mutually decided that the individual only gives “yes” responses.
    1. If no: repeat step 1 for the next column.
    2. If yes: move on to step 3.
  3. The partner reads the first symbol aloud, then waits for the individual’s response.
    1. If no: repeat step 3 by reading the next symbol down in the same column aloud.
    2. If yes and the symbol has a navigational command: follow the command to the appropriate page. Repeat step 1 on the new page.
    3. If yes and the symbol does not have a navigational command: move on to step 4.
  4. The partner then auditorily and visually scans the operational options (e.g., “oops,” “another word on this page”) by repeating the scanning as in step 3.
  5. When the individual indicates their message is complete (for example, “that’s all I have to say about that”), the partner repeats the full message and responds as a communication partner (with spoken language and/or modeling).

Other Tips: 

  • Read groups of words quickly and together. Understanding the group of words is harder when the words are read slowly.
  • Keep in mind that the individual may use auditory recognition of spoken words, visual recognition of the symbols, or a combination of both.

Eye Gaze

Who Could Benefit: 

  • Individuals with physical challenges that prevent them from directly pointing to communication symbols.
  • Individuals who can learn to look around a communication display, look at a partner to indicate readiness, then look at a symbol to select it.
  • Individuals who can maintain their eye gaze to one of the areas of a display, allowing the communication partner to understand what the individual is saying.

Steps: 

  1. Hold the communication system with symbols facing the individual with complex communication needs. 
  2. Engage in a “self talk” script:
    1. “I’m looking around”
    2. “When I’ve found it, I look at (partner’s name)”
    3. “I look at the one I want... now”
  3. Put your finger on the symbol the individual is looking at. This reinforces the communicator’s message.
  4. Follow navigational/operational commands if appropriate.
  5. Respond to the individual's message.

Other Tips: 

  • Use the individual's AAC to have conversations just like the conversations you have with non-AAC users.
  • Model the individual's method of showing that they would like to start communicating so the individual can see how it works.
  • Model all “operational” symbols, such as “turn the page.”
  • Speak the label aloud as you indicate a symbol.
  • Verbally reference (or self talk) what you, the communication partner, are doing.
  • Regularly recap the message along the way.
  • Repeat the message in normal, spoken English at the end.
