How enCaption Changed the Game
As a speech recognition company, we're always championing accessibility in media. The news, for example, is essential viewing all over the world, which means it must be ready to adapt to the 7,100+ languages spoken globally. One of the ways broadcasters are ensuring accessibility for all is through captioning. Sometimes, broadcasters will use automated captioning – an automatic speech recognition system that turns speech into text. Others use human transcribers to convert speech to text in real time.
Taking a genuine step toward accessibility in media means broadcasters need to move away from human transcription towards automated captioning, given the sheer volume of media being produced every day.
In steps ENCO.
We recently witnessed ENCO's enCaption system provide live subtitling for the main stage at NAB in Las Vegas. It applies 'computer-based automation to broadcasting,' attracting big-name customers in the broadcast and entertainment industries as well as enterprises in other vertical markets. Simply put, it’s better and more accurate than other automated captioning solutions.
enCaption enables broadcasters and content producers to add closed or open captions quickly and easily to live and pre-recorded content. Closed captions (where you have the option to turn subtitles off) are a legal requirement in the UK and USA for nationally broadcast media. Of course, laws such as the Americans with Disabilities Act make it apparent why such rules are necessary. Without prioritized accessibility, people with hearing impairments aren't afforded the same chance to consume media as others. However, captioning goes further than that.
For starters, Digiday research has shown that a remarkably high percentage of viewers who consume content on their personal mobile devices do so with the sound muted. Captioning allows them to understand and enjoy the content this way.
Additionally, journalist Sean Neumann wrote in The Outline about how closed captioning saved his relationship with Game of Thrones, helping him keep track of the names and proper titles of the show's many characters. So, it's clear that captioning not only offers inclusivity for all viewers; it can also maximize enjoyment.
Powered by Speechmatics’ Autonomous Speech Recognition (ASR) engine, enCaption learns local names and places, has multi-speaker identification and offers on-premises, cloud and hybrid deployments. If that wasn't enough, customers can generate captions for pre-recorded content while offline.
After all, accessibility doesn't always have a Wi-Fi signal.
While enCaption has helped pave the way for accessibility in media, it isn't the only solution ENCO offers. enTranslate, an optional plug-in for enCaption, allows customers to automatically translate video captions. Once translated, customers can embed the captions into video on demand (VOD) content, making it understandable to non-native-speaking viewers. These two forces combined represent a real win for media and entertainment, especially given enTranslate supports 34 languages.
Nowhere is ENCO's innovation more apparent than with enCaption. The latest version – number five – was launched at InfoComm 2022. The most important change from earlier editions is the new cloud-native architecture, which negates the need for on-premises equipment and takes advantage of public cloud infrastructure. Then there's the new light/dark mode and a modernized design, as well as the remote management system, which allows customers to use enCaption5 from anywhere. Looking deeper, ENCO has added 'intelligent numeric capabilities' – what we call entity formatting – the ability to transcribe entities such as numbers and currencies in numerical or written format.
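To make entity formatting concrete, here is a minimal, hypothetical sketch of the idea: rewriting spoken-form entities such as numbers and currencies into their written numeric form. This toy function is purely illustrative – it handles only a few simple patterns and is not ENCO's or Speechmatics' actual implementation, which relies on far richer models.

```python
# Illustrative sketch of entity formatting (NOT the real enCaption logic):
# replace simple spoken numbers, and a trailing "dollars", with digits.

UNITS = {"zero": 0, "one": 1, "two": 2, "three": 3, "four": 4,
         "five": 5, "six": 6, "seven": 7, "eight": 8, "nine": 9}
TENS = {"twenty": 20, "thirty": 30, "forty": 40, "fifty": 50,
        "sixty": 60, "seventy": 70, "eighty": 80, "ninety": 90}

def format_entities(text: str) -> str:
    """Rewrite simple spoken numbers (and '<number> dollars') as numerals."""
    words = text.split()
    out = []
    i = 0
    while i < len(words):
        w = words[i].lower()
        value = None
        consumed = 1
        if w in TENS:
            value = TENS[w]
            # "twenty five" -> 25
            if i + 1 < len(words) and words[i + 1].lower() in UNITS:
                value += UNITS[words[i + 1].lower()]
                consumed = 2
        elif w in UNITS:
            value = UNITS[w]
        if value is not None:
            nxt = i + consumed
            # "<number> dollars" -> "$<number>"
            if nxt < len(words) and words[nxt].lower() == "dollars":
                out.append(f"${value}")
                consumed += 1
            else:
                out.append(str(value))
            i += consumed
        else:
            out.append(words[i])
            i += 1
    return " ".join(out)

print(format_entities("tickets cost twenty five dollars each"))
# -> tickets cost $25 each
```

A production ASR engine would do this contextually (so "one of the ways" keeps the word "one"), whereas this sketch converts every number word it recognizes.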
ENCO has also made a 'significant investment' to expand its research and design team, allowing the company to enhance enCaption4's capabilities while launching enCaption5.
The Guardian reported on a striking Ofcom study from 2006, which found that of the estimated 7.5 million UK television viewers using subtitles, only 1.5 million had a hearing impairment. Despite the study's age, a spokesperson for the regulator said:
"Our understanding is that subtitle use has increased as the use of smart/mobile devices has increased, as more and more people watch programmes or videos on commutes."Ofcom Study
While that may be true, the need for automated captioning within all broadcasting stretches far beyond commuters. The future of media is accessible, as ENCO displayed at NAB. In the coming years, the continued innovation of our industry will make the lagged, mistake-laden captions you see on the news a thing of the past.
Read about our technology or sign up for free today. Speechmatics' any-context speech recognition engine can be consumed directly in the Microsoft Azure technology stack, enabling businesses to start using the technology quickly and without barriers to adoption.
We work with great companies – read some of our partner case studies.
Looking to collaborate?