One big change over the last year is my slow move toward projects that could have more of an impact on the world than just scratching my love for solving puzzles. I’ve also been wanting to work on stuff that encompasses more of my interests! So I’ve been starting some new projects, and one of them is looking at new ways to help programmers with vision impairments, well, program!
There is surprisingly little work on this, especially when it comes to helping general industry programmers rather than just students with vision impairments who are learning to program. My hope is not only to make programming easier and more enjoyable for vision-impaired programmers, but to provide a framework that lets programming language creators add such accessibility features without a lot of overhead. It’s a big dream!
My student, Sala, and I are reading a bunch of papers on this topic, and I thought, why not throw my notes in a post? So today I’m reading the paper:
A. Stefik, A. Haywood, S. Mansoor, B. Dunda and D. Garcia, "SODBeans," 2009 IEEE 17th International Conference on Program Comprehension, 2009, pp. 293-294, doi: 10.1109/ICPC.2009.5090064.
Introduction
They make some really nice comments on the difficulty of the problem and the lack of support:
The main commercially available programming environments largely focus on visual stimuli to present information to the programmer. Thus, programmers with vision impairments must rely on screen readers to interact with these environments.
It might seem like accessibility APIs are sufficient for those with vision impairments, but this is not the case. In fact, commercial institutions have little or no financial incentive to create effective interfaces for the vision impaired. Thus, most accessibility APIs are created as a form of charity or to comply with national accessibility standards.
Making effective environments for the vision impaired is extremely hard.
When navigating without sight, logical context (e.g., knowing that the current item being explored is connected to another item) is more meaningful than spatial context. This is known as the “Where am I?” problem. Screen readers do not support this type of exploration.
I’ve also realized this in my research, and have some ideas on how we can make progress on this problem using some newish PL ideas.
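To make “logical context” concrete, here’s a minimal sketch of one way to answer “Where am I?”: given a cursor line, report the chain of enclosing syntactic constructs rather than a screen position. This is purely my own illustration (in Python, using its `ast` module); the function name and spoken phrasing are my assumptions, not anything from the paper.

```python
# A minimal sketch of answering "Where am I?" with logical rather than
# spatial context: report the enclosing constructs, not an (x, y) position.
import ast

SOURCE = """\
def total(items):
    result = 0
    for item in items:
        if item > 0:
            result += item
    return result
"""

def where_am_i(source: str, line: int) -> str:
    """Describe the chain of constructs that enclose `line`."""
    tree = ast.parse(source)
    path = []

    def visit(node):
        start = getattr(node, "lineno", None)
        end = getattr(node, "end_lineno", None)  # available in Python 3.8+
        if start is not None and end is not None and start <= line <= end:
            if isinstance(node, ast.FunctionDef):
                path.append(f"function {node.name}")
            elif isinstance(node, ast.For):
                path.append("for loop")
            elif isinstance(node, ast.While):
                path.append("while loop")
            elif isinstance(node, ast.If):
                path.append("if statement")
        for child in ast.iter_child_nodes(node):
            visit(child)

    visit(tree)
    return "inside " + ", inside ".join(path) if path else "at top level"

print(where_am_i(SOURCE, 5))
# -> inside function total, inside for loop, inside if statement
```

A real tool would cover far more node types and hook into the editor’s cursor events, but the point is that the answer is structural, not spatial.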
Screen reader companies are small companies, and yet they have to try and support every type of application a person might use. This is super hard! So they generally concentrate on the most popular applications.
Programming Blind
Then they go into the issues and needs of a programmer with vision impairments:
- Braille displays exist, but are uncommon in the vision-impaired programming community.
- Programmers develop custom screen-reader scripts for development environments to read code using auditory cues.
- A programmer may want to hear descriptions of a number of different contexts (there’s a small sketch of the first of these just after this list):
  - For syntax errors, they may want to hear every character of a line of code.
  - After a syntax error is spoken, they may want to jump to the location of the error.
  - The programmer may want a summary of the information around a given location in the code.
  - The programmer might want to determine the behavior of the program at runtime, and thus a number of auditory cues would need to be available for exploring the running program, like reading loop iterations and quantitative information during debugging.
  - The programmer might want meta-auditory cues: cues about the program that are not represented by the data in a file. For example:
    - Contextual information like the structure of the parse tree; e.g., nesting of control blocks or type information surrounding a hole.
    - Exploring the lines of a program during execution; e.g., moving forward and backward during debugging.
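To make the first of those contexts concrete, here’s a toy sketch of a “read every character” mode, where punctuation is spelled out instead of skipped, the way a custom screen-reader script might render a line. I’m assuming Python and inventing the spoken names (`SPOKEN`, `speak_line`); none of this comes from the paper.

```python
# Spoken names for characters that screen readers often skip or mumble.
# These particular word choices are my own guesses.
SPOKEN = {
    "(": "open paren", ")": "close paren",
    "{": "open brace", "}": "close brace",
    ";": "semicolon", "=": "equals",
    "+": "plus", '"': "quote", " ": "space",
}

def speak_line(line: str) -> str:
    """Render one line of code character by character as speakable text."""
    return ", ".join(SPOKEN.get(ch, ch) for ch in line)

print(speak_line("x = f(y);"))
# -> x, space, equals, space, f, open paren, y, close paren, semicolon
```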
They give several examples of bad accessibility support among common programming environments.
- Visual Studio says “Graphic 53” plus the current line of code during execution of a program. Furthermore, it always says “F11” (the Step Into key) while stepping. These are super unhelpful.
SODBeans
SODBeans is an extension of NetBeans that adds the Sonified Omniscient Debugger (SOD): a framework consisting of a C compiler, a custom debugging architecture, sound libraries, and a sophisticated system for auditory cue design.
The problem with this section is that it doesn’t go into any more detail, nor does it give any citations backing up what SODBeans can do and how it’s designed. I managed to find another paper by the same authors:
A. Stefik, C. Hundhausen and R. Patterson, "The Sonified Omniscient Debugger: A Program Execution and Debugging Environment for Non-Sighted Programmers Built from the Ground up."
I’m hoping this one sheds more light on the inner workings of SOD.
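In the meantime, here’s a toy sketch of the core “omniscient debugger” idea as I understand it: record every execution step so the programmer can move both forward and backward, with each step available to a speech or sound front end. This is purely my own illustration in Python; the paper says nothing about how SOD actually implements this, and all the names below (`record_trace`, `countdown`) are hypothetical.

```python
# Toy "omniscient" debugging: record (line, locals) for every executed line,
# then treat navigation as an index into that recording.
import sys

def record_trace(func, *args):
    """Run `func`, recording (line number, locals) for every executed line."""
    trace = []

    def tracer(frame, event, arg):
        # Only record line events inside the function under inspection.
        if event == "line" and frame.f_code is func.__code__:
            trace.append((frame.f_lineno, dict(frame.f_locals)))
        return tracer

    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return trace

def countdown(n):
    while n > 0:
        n -= 1

trace = record_trace(countdown, 3)

# Navigation is just an index into the recording: +1 steps forward,
# -1 steps backward. An auditory front end would speak or sonify each entry.
position = len(trace) - 1   # start at the last executed line
position -= 1               # "step backward" once
line, variables = trace[position]
print(f"line {line}, locals: {variables}")
```

The real SOD presumably does far more (sound design, navigation commands, handling long-running programs), but recording execution so you can walk it in both directions is the “omniscient” part.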