About UniDescription

Conference room at Harpers Ferry

Some of the UniD research team members at Harpers Ferry Center in Harpers Ferry, W.Va., in September 2016, working on the National Park Service's first "Descriptathon," an event that brought together parks from throughout the country to audio describe their brochures. Clockwise from the far right are principal investigator Brett Oppegaard, research assistant Philipp Jordan, web developer Joe Oppegaard, and NPS Media Accessibility Coordinator Michele Hartley.

The UniD (“UniDescription”) Project officially began in the fall of 2014, when principal investigator Dr. Brett Oppegaard moved from Washington State University to the University of Hawai‘i. During this transition, he was working with Michele Hartley at Harpers Ferry Center on accessibility issues related to printed National Park Service products, such as the “Unigrid” brochures, and started envisioning the potential of mobile technologies to remediate and translate those static texts into acoustic forms. Once in Manoa, he began collaborating with two scholars who have spent their careers focused on issues of accessibility, Dr. Megan Conway and Tom Conway, both serving in the UH Center on Disability Studies. For a behind-the-scenes look at the process of developing this project, please see the blog.

For a bit of additional background, in the late 1970s, designer Massimo Vignelli worked with Harpers Ferry Center staff to create the "Unigrid System," upon which all National Park Service brochures have since been based. The self-described "information architect," who also designed the innovative New York subway map, favored a modular system with a subtextual grid that facilitated order and consistency.

Our web-based project – with direct connections to Harpers Ferry, the National Park Service, those brochures, and those basic beliefs – has been called UniD, in tribute. That name should be pronounced like "unity," serving both as an abbreviation of the wonkier original label, "UniDescription," and as an inspiration for our mission:

To bring unity to the world of audio description.

Translating a map from a visual experience to an acoustic experience has been one of the most complicated challenges of the UniD project. As part of our research, Philipp Jordan, a UH Communication and Information Science Ph.D. student and UniD research assistant, went to Monocacy National Battlefield in Frederick, MD, to examine and experiment with the sights and sounds of its three-dimensional, fiber-optic map, which also included a soundtrack and closed captioning.

Audio description (often called verbal description) can be thought of as a medium equivalent to open and closed captioning, only for audiences that prefer information in acoustic rather than visual forms. In some cases, that involves the simple verbalization of a transcript (as in text-to-voice translation), but what we are mostly concerned with here is the more complex audiovisual translation of visual material into audible material. For example, how would you describe an Ansel Adams photograph of a scene within Yellowstone National Park to a person who cannot see, has low vision, has difficulty interpreting print materials, or simply prefers information in audible forms? Those varied audiences (including people who are blind, have low vision, are print dyslexic, or are audio-oriented) deserve full access to public discourse, and this project has been created to serve them, under the core principles of Universal Design.

In turn, this UniD project has been developed to help people create more audio description and to be a robust resource for those interested in the topic, including "best practices" guidelines, updated scholarly research, and a forum for related thoughts and discussions. Our hope is that, much as Vignelli's system did for NPS brochures, the UniD Project will bring greater clarity and quality to this acoustic communication form, especially in public spaces.

This federally funded project is free and open source. To start making your own audio description, just create an account, sign in, and follow the directions.

  • It can help you translate static visual media of any kind (texts, photographs, paintings, posters, statues, etc.) into audio content that can be freely shared
  • It will convert text to speech (in an audio file format)
  • It will help you manage multiple text-to-speech projects, which can be organized around any audiovisual translation context, such as a static media source (a brochure, for example), a grouping of artifacts (by theme or location), or whatever other arrangement you find useful
  • It provides templates for common audiovisual translation contexts
  • It provides best practices and scholarly research related to audiovisual translation issues
  • It includes a forum for discourse about audiovisual translation, including audio description, verbal description, and many of the other terms used for the similar process of verbally describing something visual and sharing that description with others
  • It creates deliverables that are accessible in many ways; your audiovisual translation can be exported as text, audio files, or even mobile apps (in Android and iOS formats)
  • This is a grant-sponsored program, so all of this is offered to you for free – and its products created for free distribution – in the hopes of making the world a more accessible place for people of all abilities.

The principal investigator on this project is Dr. Brett Oppegaard of the School of Communications in the College of Social Sciences at the University of Hawai‘i.

All inquiries about this project should be directed to him, either by email or by phone.

Sushil at the Lincoln Memorial

During their fall 2016 visit to the Lincoln Memorial, Sushil Adhikari, from Nepal (right), and Nang Attal, from Afghanistan, discovered there was no audio description available at the site. So Attal read the wall text to Adhikari and did his best to describe the surroundings. The UniD project is intended to help visitors like Adhikari, who are blind or visually impaired, have equivalent experiences, vetted by park staff, so that more people can participate fully in important societal and cultural discussions.

University of Hawai‘i at Manoa

Collaborator

A transdisciplinary collaboration between the School of Communications and the Center on Disability Studies

National Park Service

Collaborator

Including Harpers Ferry Center, the design hub of the bureau, and park sites nationwide

The Hawaii-Pacific Islands Cooperative Ecosystem Studies Unit

Collaborator

A coalition of governmental agencies, non-governmental organizations, and universities promoting research within the Pacific region

Montana Banana

Developer

Seattle-based web and mobile app development company

Brett Oppegaard's bio photo

Brett Oppegaard

Principal Investigator

An Assistant Professor in the UH School of Communications

Megan Conway's bio photo

Megan Conway

Co-PI

An Assistant Professor in the UH Center on Disability Studies

Thomas Conway's bio photo

Thomas Conway

Co-PI

Media Coordinator in the UH Center on Disability Studies

Michele Hartley's bio photo

Michele Hartley

NPS Liaison

Media Accessibility Coordinator at Harpers Ferry Center, WV

Joe Oppegaard's bio photo

Joe Oppegaard

Chief Technology Officer

Montana Banana
Seattle, WA

Jason Kenison's bio photo

Jason Kenison

Senior Programmer

Montana Banana
Seattle, WA

Tuyet Hayes's bio photo

Tuyet Hayes

Research Assistant

University of Hawai‘i
Honolulu, HI

Philipp Jordan's bio photo

Philipp Jordan

Research Assistant

University of Hawai‘i
Honolulu, HI

Sina Bahram's bio photo

Sina Bahram

Consultant

Prime Access Consulting, NC

Annie Leist's bio photo

Annie Leist

Consultant

Art Beyond Sight, NY

Additional contributions by Sean Zdenek (Texas Tech University) and Marsha Matta (graphic designer).