Braille and Cognition

A couple of weeks ago, I went into the Smith-Kettlewell Eye Research Institute in San Francisco to hear a presentation by my friend Robert Englebretson. Robert is a professor of linguistics at Rice University. He is blind, and I met him through an email group of blind academics. In addition to his primary linguistic research on Indonesian languages, Robert has begun to research Braille reading using the tools of cognitive linguistics. It’s exciting research that points the way toward better Braille for everyone.

This particular presentation was about a study he had done on Braille contractions with his Rice colleague Simon Fischer-Baum. I haven’t said much about contracted Braille since my very first post, so I’ll recap: in order to save space and increase reading speed, there are a bunch of contractions that reduce the length of common letter combinations and words. Some of these contractions have to appear by themselves, but others can be used as parts of larger words. These include small groups of letters like [er], [ch], [ing], and [the], as well as words like [time], [ever], and [less].

In English Braille, these contractions were determined largely by statistical frequency. The most common letter combinations and words got contracted, meaning the most possible space got saved on the page. In the days when huge tomes had to be hand-Brailled, this made a lot of sense, but it’s a practice that benefits the producers of Braille, and its effect on readers had never been studied. 

For decades, the prevailing assumption in Braille education has held that blind Braille readers expand all of the contracted words into their full print form before interpreting them. If this is true, then all contractions should be equally readable. But are they?

Granted, there are some contracted spellings that everybody agrees are a bad idea. My favorite is this gem:


In English characters, that’s 

[ch] [e] [mother] [a] [p] [y]

Chemotherapy. Sure, it’s space-efficient—it shortens a twelve-letter word to seven cells. But the two-cell (dot 5, m) contraction for [mother] is so deeply ingrained as “mother” that everyone inevitably reads it something like “Key-mother-a-pee.” Is saving the space of five cells on the page worth losing a few seconds every time you have to make sense of this word? Not to me.

Robert set out to show that the same process happens in less egregious situations as well. You don’t often run into contractions as bad as “chemotherapy,” but there are many, many examples of contractions that cause momentary confusion and delays in comprehension. Take “redo” for example, which can be written in Braille as [r] [ed] [o]. Most people can’t help but read it as “red-o.”

Basically, Robert theorized that reading problems occur when contractions conflict with our understanding of a word’s sublexical structure. We all intuitively understand that some words are made up of smaller units that have been smooshed together. “Redo” is the verb “do,” plus the prefix “re-.” We understand it as a two-part word, and when the [ed] contraction effectively erases the boundary between the two parts, we stumble over the reading.

And this is exactly what he found in a large number of similar cases. I won’t go into the methodology or technical details here, but Robert was able to show that fluent Braille readers took longer to comprehend words when contractions crossed sublexical boundaries than they did when the contractions did not cross boundaries. This implies that Braille readers are reading those contractions as single units, covering up any dividing lines that may exist within them. In other words, the theory that Braille readers uncontract words before they process them does not hold up.

To me, sitting in the audience, Robert’s results seemed right on. To be fair, I remember a time when I did have to uncontract every word I read. When I was first learning contracted Braille, I had to picture every word in my head, building each one letter by letter as I read each cell. But this was just a symptom of my inexperience, and I moved past it quickly. After a month or so, I no longer had to picture the words I was reading. Instead, the cells brought sounds into my mind—whether letters, syllables, or entire words. 

Then, maybe a month ago, something different happened. I brushed my finger over some Braille text and all of a sudden the meaning of the words came into my head. Not the sounds, the meanings. It felt surreal, how automatic it was. I took that for granted with print reading—for most print readers it becomes so automatic that it isn’t even a choice. Text is read as soon as it is seen, whether you like it or not. It was very strange to have that sensation in Braille. 

Right now that only happens with very simple language. Less common words take a little longer, and rare words or weird contracted spellings can trip me up pretty bad. As I gain experience, more will become automatic and everything will come faster, but Robert’s research suggests that some contracted spellings will never stop slowing me down.

The study shows that cognitive linguistics can provide valuable insights into Braille reading, and points the way toward further research. In time, we may be able to more precisely distinguish between contractions that help and contractions that hinder. We may also come to see which other features of the current Braille system improve speed and comprehension, and which do not. Eventually, research like this could influence best practices that optimize Braille for the human beings who read it rather than the pages it is embossed on.


A photograph of the San Antonio Riverwalk. Just like me, the people in this photo have not fallen in the river.

I just got home from San Antonio, where I attended the annual meeting of the Society of Biblical Literature. SBL, as we call it, is the largest conference of biblical scholars in the world, and it’s held in conjunction with the American Academy of Religion. This means that for one weekend a year, thousands upon thousands of scholars in biblical and religious studies descend upon one unsuspecting American city.

The scope of the thing is truly mind-boggling. There are hundreds of papers and presentations each day, covering every conceivable topic within and around biblical studies and religion. It’s a great place for scholars to hear about new discoveries, new ideas, and new approaches, and to share their own ideas and get feedback before committing them to print. But the best part is reconnecting with friends and colleagues and meeting new ones. The conversations over dinners and drinks are consistently stimulating and gloriously nerdy.

For me, SBL also offers a chance to reflect on how I am changing and still need to change as my sight wanes. It’s a different experience every year.

The most noticeable change this year: I used my white cane. Last year I brought it with me, and I carried it in my bag throughout the weekend, but I never actually took it out to use it. This time I had it out almost the entire time. Inside and in daylight I don’t really need it for the large-scale tasks of orientation and navigation, but it warns me of curbs, planters, and benches, and warns people around me that I will definitely run into them if they don’t move. At night on the narrow and winding Riverwalk, on the other hand, I needed the cane (and a few timely nudges from my friends) to keep from taking a swim.

The social and networking aspect of the conference keeps changing, too. I’ve never been great at picking faces out of a crowd, but now it’s near impossible. I’m sure I walked right by dozens of people I should have said hello to, and failed to notice friends and colleagues attending sessions with me. Name tags are useless, so I’m sure I also miss out on meeting scholars whose work I know but whom I’ve never met in person.

Even meeting close friends can be disorienting. Suddenly someone appears right in front of me, or beside me walking in lockstep, and it takes a few seconds to piece together who they are from their voice and whatever visual clues I can get. Sometimes I start a hug or a handshake without knowing who’s on the other side, and recognize them halfway through. Luckily, my assumption that people who want to hug me or shake my hand generally have a good reason to do so hasn’t backfired yet! 

I am curious to see how my experience of the conference changes over time. It will continue to provide a benchmark for my vision and adaptation, as well as for my scholarly career. Next year I plan to present a paper—a dual novelty since it will be my first paper at an academic conference and my first presentation from Braille notes!

Last Month and Last Year

A series of butterfly cocoons. Some are still closed, a few are open. A butterfly has just emerged from one of them.

It’s been a quiet month here on the blog, because life outside the blog has been a whirlwind. My parents came into town for two weeks to help me clear some of the last medical and bureaucratic hurdles involved in relocating and getting social services. There’s a lot of bureaucracy involved in going blind—a lot of paperwork and visits to doctors and government buildings, then a lot of waiting, then more visits to doctors and government buildings. My parents graciously ferried me all over the Bay, condensing errands that would have taken me weeks into a few days.

It was the big push at the end of a long year of change. About this time last year my dissertation proposal was accepted by the faculty in my department and I became officially A.B.D. I had already been legally blind for a year before that, but the proposal marked a turning point. It was the last piece of work I was able to complete without radically changing my process to accommodate reduced vision.

And since then? Well, I haven’t made substantial progress on the actual dissertation in almost a year.

It’s not like I’ve been idle. Our family moved across the country and settled in a new town. Kristin started a new job in a new career. Jane turned from a baby into a toddler. I learned Braille and mobility skills, dove deep into accessibility, and developed new workflows for research and writing.

Looking back, it feels like I spent the year in some kind of professional chrysalis, a space that allowed me to process, change, and transform. Now I feel like I can finally return to the work I set out to do in the first place. It won’t be smooth and easy sailing as a beautiful blind butterfly, but at least I’m ready to start moving forward.

Writing with Sound

I sense that there is a change happening in my writing, because there is a change in how I write. I used to write, like most people, silently. My eyes and fingers worked together to lay down words on the page.

When you compose with your eyes, you read over what you’ve written with your internal voice. You supply all of the missing elements of speech: tone, emphasis, pause, and all the other things that add texture and life to the words on the page.

When I used to write with my eyes, I would become so familiar with the way I read a piece of text in my head that I could not imagine it any other way. I never considered that those invisible auditory elements would not be immediately evident to any other reader. At least, not until I came back to a piece of writing a few days later, when I had forgotten the finer details, the shape and flow of each individual sentence. Then I would inevitably puzzle over a phrase or paragraph until I realized how I had meant it to be read. I found these pitfalls lurking in the writing of others, too—places where the emphasis or tone was important but not apparent, waiting like rocks in tall grass to trip up unsuspecting readers.

Now I write with VoiceOver. It is Apple’s main accessibility feature, which gives immediate audio feedback for every character, word and sentence that I write, and reads me paragraphs so I can remind myself of their flow and argument. It is a multi-sensory experience, since I still see the words appearing on my screen, sometimes zoomed in close enough to read them but most often not. I used to find the constant wash of letters and words distracting and intrusive, but in the few weeks since I’ve started writing this way exclusively, it has become natural and I feel adrift when it is turned off.

When VoiceOver reads every word, line, sentence, and paragraph, it supplies its own intonation, emphasis, and rhythm. Computer voices today are not the abrasive robot voices of the eighties and nineties. They are getting closer and closer to the sound of natural speech. No one would mistake them for human, but developers are focused on improving their realism and fluency in reading long passages. So the computer adds pauses, shifts its tone, and inflects words up or down based on how it interprets the context. All those choices I used to make in composing, usually without even thinking about it, are now made by the computer. The computer becomes a controlling voice in my writing, because if it reads something awkwardly, I often change the text.

Of course, I don’t always cede to the computer. Sometimes it is obviously wrong (like when it pronounces a homonym inappropriately, like “read” as red instead of reed) or its interpretation is clumsily rule-bound (like when it thinks “Mr.” is the end of a sentence). But if the computer’s reading is misleading, ambiguous, or simply awkward, I will often rewrite it. I just can’t bear to hear the voice stumble over the same passage dozens of times as I write and edit, so I alter it to accommodate the computer’s flow.

The big question is, how will this affect my writing? Will it become more stilted or robotic as I adapt to a computerized interpretation of natural language? I hope not. My hope is that the instant feedback will remind me to consider the auditory dimensions of language more carefully than I did before. VoiceOver, imperfect as it is, is an audience that is present and interactive at every stage of my writing process, letting me know what I’ve written and at least one way it can be read. If it makes a mistake, there’s a chance another reader would also have made that mistake. VoiceOver was developed by people, after all, and those people defined the interpretive choices it makes.

VoiceOver is changing my writing, but how remains to be seen. I guess I’ll just have to wait for you, my human readers, to let me know.


Hebrew Braille: First Impressions

An image of my twenty-volume Hebrew Bible in Braille, sitting on my bookshelf.

I finally took my first stab at reading a second language in Braille.

My twenty-volume Bible in Hebrew Braille has been sitting around for five months, ever since Jewish Braille International graciously sent it to me, free of charge. This particular copy is used. It once belonged to a certain Nancy Ellen Jaslow, presented to her “on the wonderful occasion of her Bat Mitzvah, October 11, 1963.” So thank you, Ms. Jaslow, for your Bible. I hope I will put it to good use.

I cracked it this weekend and read through the introductory material. The project of creating a Braille system for Hebrew and transcribing the Bible was conducted by a team of blind and sighted Jewish rabbis and scholars from New York, London, and Vienna. They began in the early 1930s and finally published in 1954, hindered by “the stringencies of the time,” as the introduction so euphemistically admits. It’s not a scholarly edition of the text, but I was impressed to see that well-known biblical scholars like H. L. Ginsberg and Theodore Gaster had reviewed the text and notes.

On Monday morning, I perused the key to the text and began to read. At this time, I have read exactly one page of Hebrew in Braille. Since some of you have asked, I thought I would share some of my first impressions here.

First Thing: What is It?

All Braille, everywhere and in every language, is made up of cells, which are made up of six or eight dots in two columns, like so:

⠿   ⣿

It has to be embossed very precisely and uniformly; there are no fonts or scripts or cursive in Braille. Braille already pushes the fingers to their perceptive limits, and there is no room for fanciful embellishments. Eight-dot Braille is mostly reserved for specialized technical notation, while every language that I know of uses six-dot cells.

Six dots allow for sixty-three different combinations of dots, not counting the blank cell. Every language has the exact same stock of cells to choose from, and each language gets to choose how it will use those cells. Since English only has 26 letters, it uses the rest of the cells to represent punctuation, common letter combinations, or whole words. Chinese, which has thousands of characters, has to get more creative: it uses two- or three-cell combinations to represent each character. Hebrew is like English in that it has fewer than 63 letters in its alphabet: 22 consonants (5 of which have a second form that appears at the end of words) and 15 or so vowels. This means one cell can be used to represent each letter or vowel, and there will still be some left over for punctuation.
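Those sixty-three combinations are easy to verify, and Unicode happens to encode Braille the same way: its Braille Patterns block starts at U+2800, and dot n of a cell corresponds to bit n−1 of the codepoint offset. A quick sketch in Python (illustrative only, not a Braille translator):

```python
from itertools import combinations

BRAILLE_BASE = 0x2800  # start of Unicode's Braille Patterns block

def cell(dots):
    """Build a Braille character from a collection of dot numbers (1-8).
    In Unicode, dot n corresponds to bit n-1 of the codepoint offset."""
    code = BRAILLE_BASE
    for d in dots:
        code |= 1 << (d - 1)
    return chr(code)

# Dot 1 alone is the letter "a"; all six dots give the full cell.
print(cell([1]))                  # ⠁
print(cell([1, 2, 3, 4, 5, 6]))  # ⠿

# Every non-empty pattern of the six dots:
patterns = [cell(c) for n in range(1, 7)
            for c in combinations(range(1, 7), n)]
print(len(patterns))  # 63
```

Counting subsets of six dots, minus the empty cell, gives 2⁶ − 1 = 63, which is exactly the stock of cells every language shares.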

But regardless of how a language uses Braille, it’s still just combinations of those same 63 cells. So no matter how different two languages are, and no matter how different their written scripts look, in Braille the cells look exactly the same, and lines of text look very similar.

⠠⠓ ⠁ ⠝⠊⠉⠑ ⠐⠙

⠚⠪⠍⠂ ⠝⠊⠋⠄⠇⠣⠁

See? One of the lines above is Hebrew, the other English. Can you tell which is which? The first line says “Have a nice day” in English, the second says יום נפלא “have a wonderful day” in Hebrew.
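For the curious, the English line above can be unpacked word by word. Here is a toy decoder in Python, hard-coded with just the handful of signs this one line uses (⠠ is the capital indicator, ⠓ standing alone is the wordsign for “have,” and ⠐⠙—dot 5, d—is a two-cell contraction, like the [mother] contraction from the chemotherapy example); it is a sketch, not a real Braille translator:

```python
# Lookup table for the few signs in this one example line.
SIGNS = {
    "⠓": "have",     # wordsign: "h" standing alone reads as "have"
    "⠁": "a",
    "⠝⠊⠉⠑": "nice",  # spelled out letter by letter: n-i-c-e
    "⠐⠙": "day",     # two-cell (dot 5, d) contraction
}

def decode(line):
    """Decode a space-separated line of Braille cells using SIGNS."""
    words = []
    for cells in line.split():
        capital = cells.startswith("⠠")  # capital indicator prefix
        if capital:
            cells = cells[1:]
        word = SIGNS[cells]
        words.append(word.capitalize() if capital else word)
    return " ".join(words)

print(decode("⠠⠓ ⠁ ⠝⠊⠉⠑ ⠐⠙"))  # Have a nice day
```

Reading the same cells with the Hebrew letter values instead is what produces gibberish like “jowm, nif’lgha” below.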

Before I started learning the Hebrew Braille system, I worried that I would sometimes not know what language I was reading in. No one would ever mistake a page of printed Hebrew for English, because the scripts are just too different. But since the Braille script is universal, and reading it with fingers doesn’t allow for that same full-page first impression you get with printed text, I thought sometimes I might get really confused for a while.

It turns out this is not a problem. It could be confusing for one letter, maybe two, but then it becomes completely incomprehensible. If I tried to read the Hebrew sentence above as English, it would be “jowm, nif’lgha”—no confusion there!

I guess it’s like looking at a page of German or French. They use the same letters as English, but you immediately know that it’s not English.

So, one less thing to worry about.

Second Thing: How does it compare to reading printed Hebrew?

I knew that reading Hebrew in Braille would be a different experience from reading it printed on a page or written on a manuscript. Hebrew Braille is written from left to right, like English, so some people have asked me if it’s more like reading Hebrew transliterated into English characters. So far, I would say it’s not like reading transliteration or Hebrew script. It’s like reading Hebrew in Braille.

Classical Hebrew, the Hebrew of the Bible, was originally written with only consonants. This is a fine way of writing for people who grew up speaking the language, but once it fell out of everyday use, readers needed help remembering proper pronunciation. Scribes and copyists added in vowels and other pronunciation aids, in the form of small dots and marks surrounding the consonants. Now when you see Hebrew, it looks like this:

וְלֹא־לְמַרְאֵה עֵינָיו יִשְׁפּוֹט

And transliterated Hebrew looks like this:

wᵉlōʾ lᵉmarʾēh ʿênāyw yišpôṭ

Both Hebrew script and transliteration include marks above and below the letter: vowel points in Hebrew and diacritical marks in transliteration. In Braille, it is impossible to modify a letter by placing something above or below it. Everything has to be linear. Each of those marks needs to be represented by a character that either precedes or (more often) follows the letter it modifies.

This has a couple of effects. For one, it somewhat obscures the similarities between related vowels. One example is holem and holem waw (the ō and ô in the transliteration above). These two vowels make the same sound and are interchangeable in the spelling of many Hebrew words. The transliteration and the Hebrew script both make this similarity apparent. In Braille, holem is ⠕ and holem waw is ⠪—two completely different cells. For those who know Hebrew, the same principle applies to shureq and qibbutz, hireq and hireq yod, and the hatef vowels.

The feel of reading Hebrew (pardon the pun) also changes, because the vowels don’t play second fiddle to the consonants the way they do in print. They are given equal weight on the page. Apart from making words feel longer, though, I’m not sure how this will affect my experience of reading Hebrew. Let me get back to reading and I’ll let you know.

And of course, until next time, “jowm, nif’lgha!”

The Value of Photographs

An image of my daughter at the kitchen table, grinning over her breakfast. She is in striped pajamas and has wild, messy bed-head.

I spent a long time looking at this photograph today. Many minutes, because I had to, and because it was worth it.

Kristin took it at breakfast this morning. I was sitting right there, and I was enjoying the moment, but seeing the photograph was a different experience entirely.

In general, I look at a lot fewer photographs these days. It takes me longer to make sense of what’s going on in them. The colors and lines just won’t resolve into recognizable objects the way they used to. Sometimes it takes a few seconds, or even a few minutes, before I realize who or what is in a picture.

But there’s a flip side to this, because I don’t see life as quickly as I used to either. It takes time for me to make sense of what I see, for my brain to construct an image out of the faulty and partial information my eyes pass along. And life doesn’t always stick around waiting for me to make sense of it. 

My daughter is a toddler right now, and she doesn’t stick around long enough for almost anything. Sometimes, if the light is just right and at just the right angle, and she moves in just the right way so she’s framed in it perfectly, and my eyes feel like behaving themselves at just the right moment, I get this clear, fleeting glimpse of her face, and I get to see just how beautiful she is. These moments stick in my mind, but they are rare, and all too brief. I miss so many details so much of the time.

But when Kristin takes a photo like this—a crystal clear, stunning capture of a living moment—the world slows down. It stops, allowing me all the time I need to pore over the scene, working out and appreciating every feature and every nuance.

I know that a day will come when I won’t be able to see her this way, to know her face in this much detail, even from photographs. And I know that some day after that, I won’t know what she looks like at all.

And I will miss it.

It won’t affect my love for her, or my care for her. I will still play with her and laugh with her and teach her and share life with her. I will know a million things about her that are more important than her physical appearance.

But I will still miss it.

I’ll miss that shining grin and those sparkling blue eyes, those looks of joy, inquisitiveness, mischief, and wonder.

So for now, I will treasure the photographs, and I will gladly take all the time I need to etch every detail into my mind and into my memory.

My Quest for the Perfect Word Processor: Act Two

Photograph of a winding path through a dark forest. This is a quest, after all.

In Act One of this epic tale, our hero had fallen on dark days. Forced away from Mellel, his comfortable word-processing home, he began to wander the land seeking new possibilities and brighter horizons.

Now we see him revisiting familiar territory. Microsoft Word for Mac 2011 is already installed on his machine, after all. But it too offers only disappointment. Hazardous to navigate and full of unmarked and unlabeled dangers, it is a VoiceOver nightmare. 

He considers other options: Pages, VoiceDream Writer. These are friendly and accessible, but nowhere near full-featured enough for a dissertation. He falls to using TextEdit—at least it works well with VoiceOver. Perhaps he will write his whole dissertation in plain text and typeset it with LaTeX. But of course this is absurd. Navigating a document as long as a dissertation in plain text would be next to impossible. Plus he would have to learn LaTeX, so…

And then at last, on the verge of despair, he finds hope. There is a new version of Microsoft Word for Mac, and it has been substantially rebuilt and reconfigured. Word has always had features galore, of course, and is capable of handling large projects like books and dissertations. In the new 2016 version, the development team has increased VoiceOver compatibility and improved support for Hebrew (as long as the Hebrew keyboard is used). 

Almost all of the buttons, tabs, and menus are clearly labeled for VoiceOver, and navigating the interface is relatively easy. Setting VoiceOver Hotspots for the ribbon and main text pane makes it even more painless. The only problem with this is that the Hotspots for the ribbon are document-specific, so if you have two documents open at the same time, you have to make sure you go to the correct ribbon. 

Navigating long documents can also be cumbersome. You can navigate by page or line, but it would be very useful to be able to navigate within your document structure. The VoiceOver rotor could come in handy here, connecting the headings menu to document headings and allowing users to skip back and forth that way.

The biggest bug in Word for Mac 2016 comes when documents get long and cover multiple pages. If you make changes to early pages in the document that affect later pages, VoiceOver can get confused about what it should be reading. When you use the “Read Line” or “Read Paragraph” commands, it will read the wrong line or paragraph, or start or stop too early. When this happens, closing and reopening the document solves the problem. It is not insurmountable, but it does get very tedious.

Track Changes and Comments—two critical tools in academic work—are also difficult to use, but these are acknowledged issues that the Word team is working to improve.

So our hero takes up this tool, imperfect though it is, and sets his hand to the work. But his vigilance remains constant, and from afar he hears rumors of a new kind of tool: a powerful writing suite with deep VoiceOver compatibility. Tune in next time, brave readers, as our hero encounters…the Scrivener.


(This epic post reviewed MS Word for Mac 2016 Version 15.24. Any subsequent improvements to accessibility in later versions are not covered.)

Connection and Collaboration

For the past several years, being a blind person in the academy has been a very lonely and often discouraging road. I was able to find a few stories about blind professors online, but I had difficulty finding any in real life. Faculty and students at my university were supportive and willing to help, but no one (including myself!) was well acquainted with the particular problems of blindness or how to address them.

A couple weeks ago, however, in a crazy series of events, I suddenly began to find the blind academic camaraderie I’ve wanted. My wife Kristin, who is an interaction and usability designer, decided on a whim to attend an accessibility event in San Francisco, hosted by Cordelia McGee-Tubb. The event was great, and while she was there, she fell into conversation with Jennifer Sutton, an accessibility consultant who also happens to be something of an expert on Braille, and also happens to be blind (go see what she does on Twitter or on LinkedIn). Kristin took her card and told me I should get in touch with her.

So I gave her a call, and I couldn’t be happier that I did. Jennifer was generous with her time and resources. In a phone call and a number of subsequent emails, she connected me to blind scholars and academics in a variety of fields—a professor of linguistics at Rice, an English professor at Berkeley. Through these connections, I found an email group of blind academics. There are smart blind people doing great academic work in practically every field imaginable—English, history, art, psychology, human-computer interaction, you get the idea. They share knowledge and advice freely, and I have already learned a great deal from this group.

Perhaps most important of all, she connected me with a small group of people who are working to increase access to biblical language study for people who are blind. They have transcribed the Hebrew Bible, New Testament, and some other original language documents into Braille. Now they are working on scholarly research tools, grammars, and other ancient language materials. 

Check out their site for more information, and this blog post in particular to see the current state of the project. My skills at Braille pale in comparison to theirs, but I hope to contribute to this work as I improve.

It’s been exciting and slightly overwhelming to realize how many other talented blind people there are out in the world and in the academy. Those I have met are kind, generous, and resourceful. They are determined to succeed and eager to help others do the same. They have thought through and worked around many of the difficulties associated with academic work, and they are happy to give advice and encouragement. Of course, there will still be many challenges in my future, many problems left to solve, but the task feels a little lighter.

My Quest for the Perfect Word Processor: Act One

An image of a big, unlabeled red button.

“Button. You are currently on a button. To click this button, press Control-Option-Space.”

Uh oh. This is the sound of VoiceOver non-compliance, and the first time I heard it, my heart sank. I am still new at VoiceOver, but I was even newer then, just learning basic commands and navigation skills. I was testing the various apps I use on a regular basis, experimenting to see how I would use them when I could no longer use my sight. VoiceOver is the main accessibility feature on Mac—it identifies objects and reads text on the screen, and allows the user to control everything with the keyboard and trackpad. So what is the problem? It’s helpful to know you’re on a button, right? It is, but it would also be nice to know what the button does. Exploring sloppy apps like this is like breaking into a super-villain’s secret lair. There are buttons—oh so many buttons—but none of them are labelled. Does this button open a trap door to the dungeons, or order minions to bring coffee? Does this button save my file, or delete it?

What that button above should have said was something like “Save button. Save. You are currently on a button. To click this button, press Control-Option-Space.” See? Proper labelling makes everything so much clearer.

The problem app in this case—the app that made my heart sink—was Mellel, my favorite word processor. It is the word processor of choice in my field because it was developed by an Israeli team and handles right-to-left languages (Hebrew, Aramaic) without a hitch. It also includes a robust set of options for formatting, structuring, and managing citations in long documents like academic papers and dissertations. In short, it was the perfect tool while I had sight.

But the developers had not considered blind users and had not put in the effort to make Mellel VoiceOver compatible by labelling buttons and ensuring that the menus and palettes were navigable. It would not even read the text I had written back to me. Now I came face-to-face with the realization that I couldn’t use this familiar tool to write my dissertation. Worse, everything I’d written for the last eight years was inaccessible.

For now, I can muddle through. I can still see enough to spot-read and navigate the on-screen geography of buttons and banners, and I can zoom in to read the smaller text. But this is getting harder, and it certainly won’t last forever. I need a word processor that will still work when I can no longer see at all, one that is

  • VoiceOver Compatible
  • Robust enough to handle a dissertation-sized project
  • Capable of dealing with all the languages I use

It may be a tall order. We’ll see. In upcoming posts, I’ll talk about some of my experiments and experiences with other word processors. As it turns out, I’ve just found one that I think is going to work. Stay tuned for my review, and in the meantime, feel free to share with me any recommendations for accessible word processors that have worked for you!