How to Use Instagram’s New(ish) Edit Tools to Make Your Posts More Disability-Accessible
Late last year, Instagram rolled out new features to help make the photo-sharing app more accessible to people with visual impairments. And if you didn’t know the features existed, now’s the perfect time to try them out and make it easier for people with certain disabilities to interact with your feed.
“First, we’re introducing automatic alternative text so you can hear descriptions of photos through your screen reader when you use Feed, Explore and Profile,” Instagram wrote in a blog post to promote the new features. “This feature uses object recognition technology to generate a description of photos for screen readers so you can hear a list of items that photos may contain as you browse the app.”
The post continues: “Next, we’re introducing custom alternative text so you can add a richer description of your photos when you upload a photo. People using screen readers will be able to hear this description.” Users can manually add image descriptions under Advanced Settings (found below the location field on the new-post screen in the app).
In other words, you can go back and add more detailed, literal descriptions to existing posts to help visually impaired ‘gram users understand what’s in each photo. So, for example, you could take an existing photo of a squirrel in a tree with a giant nut in its cheek and the caption, “Aw nuts,” and add an alt-text description that reads something like, “Happy squirrel sits in a tree with a nut in its cheek.”
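For a rough sense of what Instagram means by object recognition generating “a list of items that photos may contain,” here is a minimal sketch of the same idea using an off-the-shelf image classifier. This is not Instagram’s code: the model choice, the image filename, and the “Photo may contain” wording are all illustrative assumptions.

```python
# Minimal sketch: turn a photo into a "may contain" list, the way a screen
# reader might read out automatic alt text. Uses a generic pretrained
# classifier from torchvision; Instagram's actual system is far richer.
import torch
from torchvision import models
from PIL import Image

# Load a pretrained classifier along with its preprocessing and label set.
weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights)
model.eval()

preprocess = weights.transforms()
labels = weights.meta["categories"]

def describe(image_path: str, top_k: int = 3) -> str:
    """Return a rough 'Photo may contain: ...' string for a screen reader."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)          # shape: [1, 3, H, W]
    with torch.no_grad():
        probs = model(batch).softmax(dim=1)[0]      # class probabilities
    top = probs.topk(top_k)
    items = [labels[int(i)] for i in top.indices]   # most likely objects
    return "Photo may contain: " + ", ".join(items)

# "squirrel.jpg" is a placeholder filename for illustration.
print(describe("squirrel.jpg"))
```

A production system would use a much more capable model, but the shape of the output, a short spoken list of likely objects, is the same idea, and it is what a custom description written by the poster can improve on.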
But the alt-text isn’t just about helping people with visual impairments. As Vox pointed out just after the rollout, the company’s object recognition software should, in theory, already be able to generate basic descriptions of what’s in a photo without being told what each individual photo contains. After all, parent company Facebook has long used object recognition in its artificial intelligence work, which powers its facial recognition software, among other potential uses. Which brings us to one possible reason Instagram decided these features were worth rolling out now: the custom alt-text descriptions users write in amount to exactly the kind of hand-labeled data Facebook says its systems depend on.
“We rely almost entirely on hand-curated, human-labeled data sets,” Mike Schroepfer, Facebook’s chief technology officer, told the crowd at last year’s F8 developer conference. “If a person hasn’t spent the time to label something specific in an image, even the most advanced computer vision systems won’t be able to identify it.”
(Photos via Getty Images)