Watch NPR's Christian McBride talk to FarPlay co-creator Dan Tepfer about low-latency audio and real-time remote musical collaboration:
How it works
In order to play music with other people, especially in styles where rhythmic synchrony is important, it's essential that the players hear each other as quickly as possible. The time between the moment one musician makes a sound and the moment another hears it is called latency. Even when playing together in the same room, there's some amount of latency, because sound travels through air at roughly 1 foot (30 cm) per millisecond. So, if a drummer is 10 ft (3 m) away from a bassist in a room and they're playing acoustically, there's a one-way latency of about 10 ms between them. It's been established that musicians start to feel uncomfortable playing rhythmic music together once the one-way latency exceeds about 20 ms. This makes intuitive sense: it feels strange to play rhythmic music with someone 20 ft (6 m) away from you, while 10 ft is common and perfectly comfortable. It's also one of the reasons large orchestras need conductors.
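The arithmetic above can be sketched in a few lines. This is just an illustration of the rule of thumb in the text (roughly 1 foot of air per millisecond, with discomfort above a ~20 ms one-way latency); the names and threshold constant are ours, not part of any FarPlay API.

```python
SPEED_OF_SOUND_FT_PER_MS = 1.0  # sound covers roughly 1 foot of air per millisecond
COMFORT_THRESHOLD_MS = 20       # musicians report discomfort above ~20 ms one-way

def acoustic_latency_ms(distance_ft: float) -> float:
    """One-way latency, in milliseconds, for sound crossing distance_ft feet of air."""
    return distance_ft / SPEED_OF_SOUND_FT_PER_MS

for distance in (10, 20, 30):
    latency = acoustic_latency_ms(distance)
    ok = latency <= COMFORT_THRESHOLD_MS
    print(f"{distance:>2} ft -> {latency:4.1f} ms one-way, comfortable: {ok}")
```

By this measure, a 10 ft gap between drummer and bassist sits well inside the comfort zone, while 30 ft does not.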
In contrast, apps like Zoom, FaceTime, Skype or WhatsApp have latencies on the order of half a second (500ms), more than twenty times longer. Anyone who's tried to make music through the internet using these conventional means knows it's an exercise in frustration. So how does FarPlay reduce the latency so much? In several ways. First, FarPlay transmits completely uncompressed audio. As a result, there's no time-consuming compression stage on the sending side, and no decompression stage on the receiving end (and as an added benefit, your audio gets transmitted in the highest-possible quality). Second, FarPlay sends audio directly between participants, from device to device, rather than passing through a third-party server. This minimizes the distance that the data has to travel. Third, FarPlay gives you complete control over the amount of buffering applied to the sound you receive.
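To see why sending uncompressed audio is practical, it helps to estimate the bandwidth it takes. The figures below assume typical settings (48 kHz sample rate, 16-bit samples, stereo); the actual rates FarPlay uses may differ, so treat this as a back-of-the-envelope sketch rather than a specification.

```python
# Back-of-the-envelope bandwidth for uncompressed audio.
# All three parameters are assumed typical values, not FarPlay settings.
SAMPLE_RATE_HZ = 48_000  # samples per second
BIT_DEPTH = 16           # bits per sample
CHANNELS = 2             # stereo

bits_per_second = SAMPLE_RATE_HZ * BIT_DEPTH * CHANNELS
print(f"{bits_per_second / 1e6:.2f} Mbit/s per direction")
```

At these settings the stream needs about 1.5 Mbit/s each way, comfortably within most home connections, and in exchange the sender and receiver skip the codec delay entirely.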
What's buffering? It's how long the app waits before sending the audio it's received to your ears. Why wait at all? Because on the internet, data is sent in small chunks called packets, and the amount of time it takes for a packet to get from one place to another varies constantly due to changing traffic conditions. So, if you play a packet the instant it arrives, it's possible that the following packet won't have arrived by the time the first is finished playing. This would result in a glitch in the sound, a short moment of silence that sounds like static. So it's a good idea to wait until you have a few packets stored up before playing the first one, to give yourself a little margin. With high enough buffering, you'll never have a glitch in the sound, but you'll have high latency. With buffering too low, you'll hear too much static. FarPlay allows you to fine-tune this so you can find the perfect buffering sweet spot.
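The buffering logic described above can be sketched as a minimal jitter buffer: hold packets back until a target margin is stored, then play them out one per interval, and when the queue runs dry, emit silence (the audible glitch). This is an illustrative sketch of the general technique, not FarPlay's actual implementation; the class and method names are ours.

```python
from collections import deque

class JitterBuffer:
    """Minimal jitter-buffer sketch: wait until `target_packets` packets are
    stored before starting playback, so variable packet arrival times don't
    immediately cause audible gaps."""

    def __init__(self, target_packets: int):
        self.target = target_packets
        self.queue = deque()
        self.started = False

    def receive(self, packet):
        """Called whenever a packet arrives from the network."""
        self.queue.append(packet)
        if len(self.queue) >= self.target:
            self.started = True  # enough margin stored; playback may begin

    def next_packet(self):
        """Called once per playback interval. Returns audio, or None on
        underrun (heard as a brief glitch of silence)."""
        if self.started and self.queue:
            return self.queue.popleft()
        return None
```

Raising `target_packets` means fewer glitches but more latency; lowering it means the opposite. That trade-off is exactly the "sweet spot" the buffering control lets you tune.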
In March 2020, as the COVID-19 pandemic spread worldwide, pianist / composer / coder Dan Tepfer suddenly found himself isolated from other musicians, unable to play with anyone. He came across the experimental open-source software JackTrip, developed by Chris Chafe and Juan-Pablo Caceres at Stanford University, which specializes in low-latency audio through the internet. Provided that two musicians are close enough together (generally less than 500 miles apart, but in some cases, up to 3000 miles), JackTrip reduces the latency between the moment one musician plays a note and the moment the other hears it to the point that playing together in real time, even in highly rhythmic music, is possible.
As experimental software, JackTrip wasn't designed to be easy to use. It had no visual interface and could only be accessed through the command line. To use it, one participant had to open ports on their internet router, which is potentially insecure. Still, with his experience as a coder, Dan was able to get JackTrip working, and within weeks he was beginning to livestream remote collaborations with other musicians, including a live four-player session on May 15, 2020 for jazz radio station WBGO with bassist Jorge Roeder and saxophonists Melissa Aldana and Dayna Stephens.
This early experience with JackTrip led Dan to realize that for livestreaming purposes, an additional feature should be added to JackTrip, enabling the data stream to be read in two separate ways, first with a low buffer for monitoring, and second with a high buffer for broadcast. He reached out to Chris Chafe, who put him in touch with Anton Runov, an experienced developer based in Saint Petersburg who had been actively contributing to the JackTrip open-source project. Together, Anton and Dan developed what became JackTrip's Broadcast Output, which Dan used throughout the pandemic to stream live duo concerts with musicians such as Cécile McLorin Salvant, Fred Hersch, Gilad Hekselman, Melissa Aldana, Linda May Han Oh, Miguel Zenon, Antonio Sanchez, Kristin Berardi, Christian McBride, Aaron Diehl, and others.
Still, JackTrip remained forbiddingly difficult for the average user. At the beginning of 2021, Anton and Dan decided to collaborate on an app that would make JackTrip accessible to everyone. This required solving a number of challenges. Dan, with his extensive experience using JackTrip as a musician, designed the visual interface and the web server, aiming to simplify the process of connection and setup as much as possible. Anton, with his extensive experience as a VoIP developer, rewrote the entire JackTrip codebase, circumventing the need to open ports on the user's router and integrating direct latency measurement, among many other features. The result was FarPlay, which launched in public beta in October 2021.
Reach out to the FarPlay team at firstname.lastname@example.org.