...

See also Greg Fields' slides, which will be attached to this page.

  • Braille TTY exists
  • Space to take mobile beyond desktop - don't duplicate, expand
    • There is effort to extend browser capabilities to all APIs on a device
    • Zoomba
  • Gregg's "Portable Accessibility Device"
  • Using mobile phones as a "universal remote control" URC
    • Gregg
    • Georgia Tech
  • Portable OCR/barcode reader, e.g. the KNFB Reader
  • Gestures - standardization?
    • Gregg's kiosk work
  • auto-location of potential assistance based on preference profile
    • e.g. deaf person looking for a translator
    • social networking
    • social proximity awareness
  • iPhone shortcut button: blurs the line between native and web-based apps; the distinction no longer matters
  • Android apps: being made in the image of iPhone interactions.
    • iPhone interactions moving to ubiquity
  • Ontario: ICT standards expect information to be produced according to WCAG 2.0; employment accessibility standards reference the ICT standards. Having a mobile device as a tool for access could decrease the initial effort organizations need to meet these standards.
    • e.g. mobile RCC, mobile CART: real-time, crowd-sourced closed captioning
  • User-centred design
    • Need to do user studies: observe real users in real scenarios. What really matters to the users?
  • Disability awareness training
  • Usability != checklist
    • Instead of doing "accessibility testing", make friends who can help with usability design
  • Participative/joint design methodologies are absent in all or most instances, including mobile

Demo: Jorge's Bluetooth/Serial port adapter

Problem Statement

Carve out a new space:

  • speech assistant (face-to-face)
    • real-time translation
  • universal remote (powered wheelchairs)
  • real gesture mapping -> need to involve real people
    • API (access)
    • gesture profile (cross-platform)
    • compatibility with HID/bluetooth
    • allow users to define their own gestures for particular actions? (see the sketch after this list)
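
A cross-platform gesture profile could be as simple as a serializable mapping from user-defined gestures to abstract actions that each platform layer interprets. The sketch below is a hypothetical illustration; none of the type or value names come from an existing API.

    // Hypothetical sketch of a user-editable, cross-platform gesture profile.
    // All type and value names are illustrative, not an existing API.

    type GestureKind = "tap" | "doubleTap" | "swipeLeft" | "longPress" | "shake";
    type ActionId = "activate" | "goBack" | "readAloud" | "openSpeechAssistant";

    interface GestureBinding {
      gesture: GestureKind;
      action: ActionId;
      holdMs?: number;          // relaxed timing for users with limited dexterity
    }

    interface GestureProfile {
      owner: string;            // links to the user's preference profile
      bindings: GestureBinding[];
    }

    // Any platform layer (browser, native shell, kiosk) feeds recognized
    // gestures in; the profile decides what each gesture means for this user.
    function dispatch(profile: GestureProfile, gesture: GestureKind,
                      handlers: Record<ActionId, () => void>): void {
      const binding = profile.bindings.find(b => b.gesture === gesture);
      if (binding) {
        handlers[binding.action]();
      }
    }

    // Example: a user who finds double-tap difficult maps "activate" to long-press.
    const myProfile: GestureProfile = {
      owner: "user-123",
      bindings: [
        { gesture: "longPress", action: "activate", holdMs: 900 },
        { gesture: "shake", action: "readAloud" },
      ],
    };

    dispatch(myProfile, "longPress", {
      activate: () => console.log("activate the current control"),
      goBack: () => console.log("go back"),
      readAloud: () => console.log("read the screen aloud"),
      openSpeechAssistant: () => console.log("open the speech assistant"),
    });

Because the profile carries the user's intent rather than raw touch data, it stays compatible with HID/Bluetooth input paths and can be honoured by any platform that can recognize the underlying gestures.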

Priority: Flexibility

Next Steps:

  • consider how to make the UI flexible: what do we need for that? (see the sketch after this list)
    • access to hardware APIs
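
One way to get that flexibility is to drive the UI from the user's preference set, in the spirit of ISO 24751 / Access for All and the UI Options demo listed under Deliverables below. A minimal browser-based sketch; the preference names are illustrative, not an existing schema:

    // Minimal sketch: restyle a web UI from a user preference set.
    // Preference names are illustrative, loosely echoing ISO 24751
    // ("Access for All") terms; they are not an existing schema.

    interface PreferenceSet {
      fontSizeMultiplier?: number;   // e.g. 1.5 = 150% text size
      lineSpacing?: number;          // e.g. 1.8
      highContrast?: boolean;
    }

    function applyPreferences(prefs: PreferenceSet, root: HTMLElement = document.body): void {
      if (prefs.fontSizeMultiplier) {
        root.style.fontSize = `${prefs.fontSizeMultiplier * 100}%`;
      }
      if (prefs.lineSpacing) {
        root.style.lineHeight = String(prefs.lineSpacing);
      }
      if (prefs.highContrast) {
        root.style.backgroundColor = "#000";
        root.style.color = "#ff0";
      }
    }

    // The same preference set could travel with the user and be honoured by
    // kiosks, mobile browsers, or native shells.
    applyPreferences({ fontSizeMultiplier: 1.5, lineSpacing: 1.8, highContrast: true });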

Problem Statement:

From Greg's presentation, we have a good definition of the problem space. We've identified flexibility in UI and UX as the priority. The focus should be on finding a cohesive way to bring it all together and make sure the mobile space becomes flexible.

Proposal theme possibilities:

  • making sure standards like ISO 24751 (Access for All) are supported
  • don't include "new gestures" because everyone is already trying to do this

Who is involved in the proposal?

  • AEGIS
  • RIM

Deliverables:

  • Jorge: possible solution space: the Web
  • Jess: Fluid Engage demo (Safari only); UI Options demo

Proposal?:

  • The Web as a development platform for flexibility. Work there can be extended to the device.
  • Let the UI and the open web structure how we might address the points raised in brainstorming.

Relation to location-based awareness:

  • Tagging: most systems have closed APIs. Firefox opened its APIs for recording wifi info, but this doesn't make sense on a laptop.
  • Difficulty: access to the accelerometer is not normally available from the browser (see the sketch after this list).
  • Prediction: the Web will eventually be on equal footing with native apps in terms of richness.
  • Issue: using the web means using data, which costs money on a mobile device.
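
Some of this device access is exposed to browsers today, though support and permission prompts vary by browser and device. A minimal sketch, assuming a browser context; the assistance-matching comment is an assumption drawn from the brainstorming above, not an existing service:

    // Minimal sketch: device capabilities exposed to the browser.
    // Availability and permission rules vary by browser and device.

    // Wifi/cell/GPS-assisted position via the W3C Geolocation API.
    navigator.geolocation.getCurrentPosition(
      (pos) => {
        console.log(`lat=${pos.coords.latitude}, lon=${pos.coords.longitude}`);
        // A location-aware assistance service could match this position
        // against nearby helpers chosen by the user's preference profile.
      },
      (err) => console.warn("geolocation unavailable:", err.message)
    );

    // Accelerometer readings via the devicemotion event, where supported.
    window.addEventListener("devicemotion", (event: DeviceMotionEvent) => {
      const a = event.accelerationIncludingGravity;
      if (a) {
        console.log(`accel x=${a.x}, y=${a.y}, z=${a.z}`);
      }
    });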

Multi-phase proposal process:

  • First grant: a user study to assess high-priority user needs. Where are the common links, and which avenues of research, development, or design will improve the accessibility of mobile devices for many users?
  • Second grant: follow up on the findings of the user study.

AEGIS has identified areas of need (Greg, please fill in?).
They didn't cover some disability groups (deaf-blind, mobility impaired, speech impaired). Should we try to cover those groups, using their techniques?

  • The opportunities identified in Greg's slides cover these three groups.

Potential candidate: Speech Assistant

  • real need
  • work on it would move several areas forward
  • can add translation

Problem:

  • can't communicate face-to-face ad hoc

Use Cases:

  1. Face-to-face communication with a hearing person
    • inclusiveness
      • order coffee, talk dirty

User:

  • speech impairment
    • temporary
    • long-term

Functional Requirements:

  • translation
  • real-time for face-to-face
  • text-to-speech (see the sketch after this list)
  • don't need an accessibility API
  • spell check
  • Auto Text
  • TTS direct access
  • predictive text
  • saved, forward
  • local
  • "Notepad"
  • cross-platform
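
As a concrete starting point for the text-to-speech requirement, here is a minimal sketch using the Web Speech API's speechSynthesis, which current browsers expose. Translation, predictive text, spell check, and saving/forwarding would layer on top; the function name and phrases below are illustrative only.

    // Minimal sketch of the text-to-speech core of a web-based speech
    // assistant, using the Web Speech API (window.speechSynthesis).
    // Translation, predictive text, and spell check would layer on top.

    function speak(text: string, lang = "en-US", rate = 1.0): void {
      const utterance = new SpeechSynthesisUtterance(text);
      utterance.lang = lang;   // a translation step could swap both text and lang
      utterance.rate = rate;   // slower rates may help face-to-face listeners
      window.speechSynthesis.speak(utterance);
    }

    // Saved "Auto Text" phrases, kept locally for quick ad hoc conversation.
    const savedPhrases: string[] = [
      "Hello, I use this device to speak.",
      "A medium coffee, please.",
    ];

    // Example: speak a typed message, then a saved phrase.
    speak("Nice to meet you.");
    speak(savedPhrases[1]);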

Assumptions:

  • first pass won't address cognitive impairments, such as aphasia

Exploitation:

  • App store
  • Speech therapists
  • disability advocacy community
  • OATsoft