
Barry Olsen: "The Tech-Savvy Interpreter: A first look at the Interprefy Remote Interpretation Platform"

Barry Olsen of InterpretAmerica is one of the world's leading experts on remote interpreting. He regularly gives webinars and workshops on the topic.

In the 256th edition of the Tool Box Journal, he shared his first impressions of the Interprefy platform.

Last month I introduced you to WebRTC, the new technology "baked in" to the Google Chrome and Mozilla Firefox browsers that makes in-browser audio, video, text and file sharing possible without installing any plug-ins. In an effort to show you how WebRTC is enabling remote interpreting over the web, I thought it might be fun to take a look at a WebRTC-based remote interpreting platform. So, I reached out to Kim Ludvigsen, CEO of the Zurich-based startup Interprefy and asked for a look-see at their remote interpreting platform, which runs on WebRTC. He readily obliged.
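
To make that concrete, here is a minimal sketch of the browser APIs involved -- generic WebRTC, not Interprefy's own code. The signaling step, which carries the session offer and network candidates between peers, is application-specific and only stubbed out as a callback here.

    // Minimal WebRTC call setup in the browser -- no plug-ins required.
    async function startCall(signal: (msg: object) => void): Promise<RTCPeerConnection> {
      // Capture microphone and camera directly in the browser.
      const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });

      const pc = new RTCPeerConnection({ iceServers: [{ urls: "stun:stun.l.google.com:19302" }] });

      // Attach our local tracks so they are sent to the remote peer.
      stream.getTracks().forEach((track) => pc.addTrack(track, stream));

      // Hand network candidates to whatever signaling channel the application uses.
      pc.onicecandidate = (e) => { if (e.candidate) signal({ candidate: e.candidate }); };

      // Create the session offer and publish it through the signaling channel.
      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);
      signal({ sdp: pc.localDescription });

      return pc;
    }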

Some Background

Like many startups that are looking to disrupt the interpreting space, Interprefy is a newcomer. Their idea was born out of a dissatisfaction with the way conference interpreting services were provided in meetings the company founders attended themselves.

It is important to understand that "remote interpretation" is an umbrella term that covers a broad range of interpreting modalities and service delivery models. Is it consecutive or simultaneous? Are the participants in the same room or are they remote as well? Will the interaction be just a few minutes long or will it last all day? Can the interpreters see the participants? How many languages will there be? Etc. The list of questions needed to define a specific use case is long indeed. So, whenever looking at a remote interpreting platform, it is important to understand its use case, which is to say where and how it will be used.

That said, Interprefy has chosen to tackle one of the most difficult and controversial remote interpreting use cases -- remote simultaneous interpretation for meetings where the participants are all physically in the same room. Only the interpreters are connected remotely through technology. Think of it as meeting delegates seated around the same table with the interpreters seated across town or halfway around the world and connected to the meeting over the Internet.

For many conference interpreters this is a doomsday scenario. I disagree. While this technology does have the potential to displace some on-site conference interpreting work, it is not going to change 60+ years of professional practice overnight and probably not ever. To the contrary, it has the potential to create more new remote work than the traditional on-site conference interpreting work it displaces.

So How Does It Work?

Interprefy uses WebRTC technology to transmit both audio and video over the Internet: interpreters receive the source audio and video feed and interpret into the target language for attendees who are physically present in the room where the speaker is. It goes without saying that there has to be fast, dependable broadband Internet available both at the meeting venue and at the offices where the interpreters will be working.
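
Conceptually, the media flow on the interpreter's side can be sketched as follows. This is my own illustration of the model, not Interprefy's code: the browser plays the incoming floor feed and publishes an audio-only stream carrying the target language.

    // Render the incoming source (floor) audio and video in the browser.
    function playFloorFeed(pc: RTCPeerConnection, floorVideo: HTMLVideoElement): void {
      pc.ontrack = (e) => { floorVideo.srcObject = e.streams[0]; };
    }

    // The interpreter's outgoing stream is audio only: the target language.
    async function publishInterpretation(pc: RTCPeerConnection): Promise<void> {
      const mic = await navigator.mediaDevices.getUserMedia({ audio: true, video: false });
      mic.getAudioTracks().forEach((track) => pc.addTrack(track, mic));
    }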

What Do the Interpreters Need to Connect?

Interpreters connect to the Interprefy platform using either the Google Chrome or the Mozilla Firefox web browser on a PC or Mac. They need to have a quality headset and a webcam. They must have a wired broadband connection to the Internet (in technical terms, according to Interprefy, that means a minimum of 2 Mbps down, 2 Mbps up, and a ping of under 50 milliseconds). As is the case when working remotely on any platform, interpreters must be in a quiet space where they can work uninterrupted.
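
Those figures are easy to sanity-check before a session. The sketch below measures the average round-trip time from the browser against the 50-millisecond requirement; the test URL is a placeholder, and a proper throughput test would need a larger transfer.

    // Rough pre-session latency check against the stated requirement (ping < 50 ms).
    async function averagePingMs(url: string, samples = 5): Promise<number> {
      let total = 0;
      for (let i = 0; i < samples; i++) {
        const start = performance.now();
        // A HEAD request keeps the payload tiny so timing reflects latency, not bandwidth.
        await fetch(url, { method: "HEAD", cache: "no-store" });
        total += performance.now() - start;
      }
      return total / samples; // average round-trip time in milliseconds
    }

    averagePingMs("https://example.com/ping").then((ms) =>
      console.log(ms < 50 ? `Ping OK: ${ms.toFixed(0)} ms` : `Ping too high: ${ms.toFixed(0)} ms`));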

What's the Interpreter Interface Like?

The interpreter interface has many elements. The most prominent is the video of the speaker. It also includes one chat box for communicating with the organizer on site at the venue and another for chatting with a virtual booth mate. A mute button turns the microphone on and off. The interface can also provide a video feed of one's booth mate.
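
As an aside, a mute button in a WebRTC interface is typically implemented with a standard browser pattern, shown below as a general sketch rather than Interprefy's implementation: the outgoing audio track is disabled rather than stopped, so unmuting is instantaneous.

    // Toggle the microphone on an active local stream.
    function toggleMute(localStream: MediaStream): boolean {
      const track = localStream.getAudioTracks()[0];
      // A disabled track keeps the connection alive but transmits silence.
      track.enabled = !track.enabled;
      return !track.enabled; // true means the microphone is now muted
    }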

One feature I found very helpful was the inclusion of a two-way video and audio link between interpreter and speaker before the event begins -- a kind of "virtual green room" that makes pre-speech briefings for the interpreters a reality.

How Do Delegates Listen to the Interpretation?

Delegates can use any Android or iOS smartphone with earbuds or headphones. They have to download the Interprefy Connect app, which is simple to do. Each conference or meeting is given a unique code (Interprefy calls it a "token"). Participants enter the meeting token and they are then connected to the alternate language channel (i.e. the language the interpreters are working into). The smartphone becomes a simple audio receiver. I was able to demo this myself on my iPhone 6. Connecting to the audio stream was fast and simple -- type a six-digit code and you are in. The audio was clear and there was very little latency.
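
Interprefy has not published the app's internals, but the delegate flow described above can be sketched roughly as follows. The endpoint URL, response shape and function names are assumptions for illustration only.

    interface ChannelInfo {
      language: string;  // the target language the interpreters are working into
      streamUrl: string; // where the audio for that channel is served
    }

    async function joinByToken(token: string, language: string): Promise<HTMLAudioElement> {
      // Look up the meeting's language channels using the six-digit token.
      const res = await fetch(`https://example.com/api/meetings/${token}/channels`);
      const channels: ChannelInfo[] = await res.json();

      const channel = channels.find((c) => c.language === language);
      if (!channel) throw new Error(`No ${language} channel for token ${token}`);

      // The smartphone becomes a simple audio receiver.
      const audio = new Audio(channel.streamUrl);
      await audio.play();
      return audio;
    }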

The folks at Interprefy have ambitiously designed their platform for a wide range of use cases: large presentations, seminars, workshops and smaller meetings. In its current form, based on what I saw, it seems best suited for large presentations where the communication is one to many. Meetings with lots of dialog going back and forth, like negotiations or discussions, entail additional complexity, and I haven't seen Interprefy in that kind of use case, so I'm not yet in a position to make an assessment.

Undoubtedly, the ability to use a smartphone or other smart device as a receiver for interpretation is compelling. For meeting organizers, it means saving on equipment costs while still providing interpreting services for attendees. However, it does shift the burden of setup to the participant, and as many technicians have pointed out to me, smartphone battery life limits how long a person can use a smartphone to listen to interpretation, and how long he or she may be willing to.

I raised this concern with Interprefy. They explained that for longer meetings with interpretation they supply 2,500 mAh power packs that double the battery life of an average smartphone, making full-day meetings with interpretation feasible. The power packs work with both Android and Apple devices. Multi-device charging stations are also available for longer meetings.

Overall Assessment and Reflections

End User Experience: The Interprefy model seems simple and easy for meeting participants to use. If you have earbuds (Interprefy can also supply these at the meeting venue for participants who didn't bring their own) and a smartphone or tablet, you can listen to the interpretation. That simplicity is powerful but it also raises the question: How willing will a meeting participant be to use his/her own smartphone to listen to the interpretation?

I expect this kind of interpretation setup to be used initially for meetings that last a few hours, not an entire day or longer. This technology will also make it possible to provide interpretation to a larger group of people than may have been feasible in the past. Think of it this way. If someone is giving a speech in a large auditorium or a stadium with tens of thousands of attendees, hundreds or thousands could listen to the interpretation with an app by simply using their smartphones and earbuds. No logistical headache of having to distribute and then collect and clean headphones and receivers.

It is important to note that simultaneous interpretation technicians will still need to be on site to ensure the technology is working and to resolve any problems that may emerge during a remotely interpreted meeting.

Interpreter Experience: The interpreter experience on Interprefy is headed in the right direction but still needs refinement. For example, currently the meeting room video and the interpreter controls have to be opened in separate windows that can end up buried under other programs open on the desktop. Additionally, it still takes several clicks and the introduction of passwords, tokens and codes to get the system configured to interpret. Volume control and microphone selection can also require several steps. These are failings that I have noted in multiple remote interpreting platforms. While some tech-savvy interpreters are able to get through these processes easily, many others struggle.
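
For what it's worth, browsers already expose everything needed to fold microphone selection into a single screen. Here is a general sketch, not Interprefy's code, of how a unified interface could list the available input devices and switch to the one the interpreter picks:

    // Enumerate all media devices and keep only the audio inputs.
    async function listMicrophones(): Promise<MediaDeviceInfo[]> {
      const devices = await navigator.mediaDevices.enumerateDevices();
      return devices.filter((d) => d.kind === "audioinput");
    }

    // Reopen the local stream with the microphone the interpreter selected.
    async function switchMicrophone(deviceId: string): Promise<MediaStream> {
      return navigator.mediaDevices.getUserMedia({ audio: { deviceId: { exact: deviceId } } });
    }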

The good news is the engineers at Interprefy are working to integrate all the necessary features in a single interface that will allow the remote simultaneous interpreter to connect quickly and painlessly. I've seen a screenshot of the new interface and I think they are headed in the right direction. Even so, as is the case with any interpreter console, physical or virtual, interpreters need time to get used to its features so they can use them under the stress of simultaneous interpretation. Adapting to online work takes time and practice.

The long and short of it is that the Interprefy platform is up, running and commercially available. They have been providing remote interpretation for meetings since early 2015. The platform makes remote simultaneous interpretation of face-to-face meetings and conferences a reality without having to spend significant amounts of money on equipment, transportation and lodging for interpreters. This means that there is a potential for significant change in the conference interpretation market. Interprefy is still a little rough around the edges, but the great thing about technology hosted in the cloud is that improvements can be made on an ongoing basis, which means an already functional platform is only going to get better.

Do you have a question about a specific technology? Or would you like to learn more about a specific interpreting platform, interpreter console or supporting technology? Send us an email at inquiry@interpretamerica.com.
