Some smartphone companies are bringing back the headphone jack, yet others remain staunchly opposed to reintegrating it. Whether or not your phone supports wired audio, there’s no denying the convenience of going wireless. Having fewer wires clutter my office is a relief, but wireless audio introduces a host of new concerns. Now, let’s get you up to speed on the what’s what of Bluetooth codecs.
What you should know
Before breaking down the various wireless codecs, we need to define a few terms and cover a few concepts.
- Sample rate (Hz): the number of samples captured per second in an audio file. To reconstruct a given frequency accurately, you need at least two samples per cycle (the Nyquist theorem). This is why audio is sampled at just over twice the upper limit of human hearing (~20kHz), giving us the CD standard of 44.1kHz. High-res formats are exported at 96kHz or greater.
- Bit depth (-bit): the number of bits per audio sample, which dictates a file’s resolution. CD quality is 16-bit, while DVDs and Blu-ray discs may support 24-bit audio. Just like a higher sample rate, a higher bit depth yields larger files.
- Bit rate (kbps): the number of bits processed per unit of time, usually a second. We typically record this as kilobits per second (kbps) or megabits per second (Mbps). For uncompressed audio, bit rate = sample rate x bit depth x number of channels.
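The bit-rate formula above is easy to sanity-check in a few lines of Python. The CD (44.1kHz/16-bit/stereo) and hi-res (96kHz/24-bit/stereo) figures are the standard values, used here purely for illustration:

```python
def bit_rate_kbps(sample_rate_hz: int, bit_depth: int, channels: int) -> float:
    """Uncompressed PCM bit rate: sample rate x bit depth x channels."""
    return sample_rate_hz * bit_depth * channels / 1000

# CD quality: 44.1kHz, 16-bit, stereo
print(bit_rate_kbps(44_100, 16, 2))  # 1411.2 kbps

# Hi-res: 96kHz, 24-bit, stereo
print(bit_rate_kbps(96_000, 24, 2))  # 4608.0 kbps
```

Note how the hi-res file demands more than three times the bits per second of a CD, which is exactly why Bluetooth codecs lean so heavily on compression.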
Bluetooth data transfer rates are unstable. SoundGuys has demonstrated this with regard to Sony’s LDAC and Android’s encoding of AAC, along with its overarching latency issues. When a company advertises a transfer rate for its codec, it’s almost certainly not a constant rate. Rather, it’s the maximum bit rate, and your streaming speed may never reach that maximum unless conditions are consistently optimal.
What’s more, Bluetooth devices have a limited connectivity range, typically about 10 meters for headphones. The farther you move from your phone, the more interference the signal has to combat. This interference comes from physical barriers (e.g. walls, people, even air) and from competing transmissions in the same radio spectrum (e.g. Wi-Fi and other 2.4GHz signals).
Something else to be aware of is psychoacoustics. This is the study of how humans perceive sound, and it is profoundly complicated. To put it briefly: a psychoacoustic model is applied to digital media, and it determines which data points can be compressed or deleted without notable degradation of sound quality. If you want to impress your friends at Wednesday night trivia, tell them how psychoacoustics paved the way for the MP3 format and its compression scheme, which has influenced subsequent audio formats. To learn more about compressed, uncompressed, and lossy formats, check out SoundGuys’ article.
What is a Bluetooth codec?
Congratulations: you’ve passed Wireless Audio 101 with flying colors. Now let’s apply that knowledge.
A codec determines how audio is transmitted over Bluetooth from a source (e.g. smartphone, tablet, or computer) to your headphones. No matter the codec, it’s responsible for encoding and decoding digital audio data into a specified format. The goal is to transmit a high-fidelity signal at the minimum bit rate. Doing so minimizes space and bandwidth requirements for storage and playback. As you may have guessed from skimming the earlier definitions, a low bit rate results in greater compression and reduced audio quality. Meanwhile, a high bit rate results in less compression and greater audio quality.
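To put that compression tradeoff in perspective, here is a rough Python sketch comparing each codec’s advertised maximum bit rate (the figures discussed later in this article) against uncompressed CD audio. Treat the ratios as approximations, since real-world rates fluctuate:

```python
# Uncompressed CD-quality LPCM: 44.1kHz x 16-bit x 2 channels
CD_LPCM_KBPS = 44_100 * 16 * 2 / 1000  # 1411.2 kbps

# Advertised maximum bit rates (kbps); real-world rates vary.
codec_max_kbps = {
    "SBC": 320,
    "AAC": 250,
    "aptX": 352,
    "aptX HD": 576,
    "LDAC": 990,
}

for codec, kbps in codec_max_kbps.items():
    ratio = CD_LPCM_KBPS / kbps
    print(f"{codec}: {kbps} kbps (~{ratio:.1f}:1 compression vs. CD)")
```

Even LDAC at its best setting still has to squeeze CD audio down by roughly 30%, which is why every Bluetooth codec here is lossy to some degree.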
Bluetooth’s greatest hindrance is its limited bandwidth, which can result in connection stutters.
Your gut reaction may be to favor audio quality over compression. However, as last week’s fortune cookie informed me, there’s a time and place for everything. For instance, in a highly trafficked area, a low bit rate may make your connection more stable. Sure, it’s at the expense of sound quality and accuracy, but in such settings, say a subway car or gym, you’re unlikely to reap the benefits of high-quality codecs anyway.
Sifting through Bluetooth codec alphabet soup
The low-complexity sub-band codec (SBC) is the lowest common denominator of Bluetooth codecs. It’s the Bluetooth Special Interest Group’s (SIG) bare minimum requirement for the A2DP profile, which determines how audio may be streamed between devices. SBC was initially designed to provide passable audio quality at medium bit rates, thus minimizing streaming complexity. It’s engineered to accommodate Bluetooth bandwidth limitations and variable processing power across devices. Transfer rates (maximum: 320kbps) are manageable at the expense of data loss.
Then there’s Qualcomm’s host of aptX codecs: aptX, aptX Low Latency (LL), and aptX HD; once aptX Adaptive rolls out, it will replace aptX LL. These codecs are most relevant to Android users. They lessen streaming latency, though how much latency improves depends on the smartphone in use. Users should look out for headphones and earbuds that support aptX if they want more accurate and detailed audio. aptX alone supports 48kHz/16-bit LPCM audio data (352kbps), while aptX HD supports 48kHz/24-bit LPCM audio data (576kbps). Even so, both formats are still lossy.
For now, if you have an iPhone, aptX is irrelevant as iOS devices only support SBC and AAC.
Advanced audio coding (AAC) is the standard for lossy digital audio compression. It’s also the license-free standard used by YouTube and is Apple’s preferred mode of transfer. iPhone users benefit most from AAC playback, which tops out at 250kbps.
Although Android supports AAC, its performance is severely underwhelming because of inconsistent streaming quality. This isn’t a criticism of AAC itself; rather, Android OS has yet to deliver a universal way of handling the codec. AAC is power-hungry, something Apple can manage because of its tightly gatekept ecosystem.
Sony’s LDAC seems promising. Its variable bit rate is brilliant in theory and should consistently transfer three times the data of SBC. However, in practice, this isn’t the case. LDAC has three modes: 990kbps, 660kbps, and 330kbps. The two highest bit rates lose fidelity above 20kHz, and LDAC (330kbps) is outperformed by both aptX and SBC. While the option for LDAC 990kbps is great, the fact of the matter is that most phones default to LDAC 330kbps, requiring you to enter developer settings and force the higher option.
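LDAC’s three bit rates exist so the connection can trade fidelity for robustness as link quality changes. Sony’s actual selection algorithm isn’t public, so the sketch below is only an illustration of the idea, with hypothetical thresholds:

```python
# LDAC's three advertised bit rates, in kbps, from
# quality-priority down to connection-priority.
LDAC_MODES_KBPS = (990, 660, 330)

def pick_ldac_mode(link_quality: float) -> int:
    """Choose an LDAC bit rate from a 0.0-1.0 link-quality estimate.

    The thresholds here are hypothetical, not Sony's actual values.
    """
    if link_quality > 0.8:
        return 990  # best fidelity; needs a strong, stable link
    if link_quality > 0.5:
        return 660  # middle ground
    return 330      # most robust, lowest fidelity

print(pick_ldac_mode(0.9))  # 990
print(pick_ldac_mode(0.3))  # 330
```

This also illustrates why defaulting to 330kbps is the conservative choice for phone makers: it is the mode least likely to stutter in poor radio conditions.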
Do Bluetooth codecs actually make a difference?
Well, yes and no. If you suffer from noise-induced hearing loss or are listening in a particularly noisy environment, it’s unlikely you’ll be able to discern the differences between SBC, aptX, and AAC. That said, there are more benefits to high-quality codecs than sound quality, namely responsiveness. If you’ve ever skipped a track via your headphones’ on-board controls only to wait a second before the next song actually began playing, your device was probably streaming over SBC. This lag is even more noticeable in video playback, where a late-night host’s punchline is sullied by the audio-visual lag that accompanies SBC streaming. Granted, there is compensation built in to keep video and audio in sync, but this preemptive calculation doesn’t completely negate skips and stutters.
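The built-in compensation mentioned above amounts to delaying the picture by however long the audio pipeline takes. A minimal sketch of the idea, with made-up latency figures (real values vary widely by device, OS, and implementation):

```python
# Rough, illustrative Bluetooth audio latencies in milliseconds.
# These are NOT measured values; they only rank the codecs plausibly.
ESTIMATED_LATENCY_MS = {
    "SBC": 200,
    "AAC": 150,
    "aptX": 120,
    "aptX LL": 40,
}

def video_delay_ms(codec: str) -> int:
    """Delay video presentation by the audio pipeline's estimated
    latency so lips and speech stay in sync."""
    # Unknown codec: assume the worst case rather than drift out of sync.
    return ESTIMATED_LATENCY_MS.get(codec, 200)

print(video_delay_ms("aptX LL"))  # 40
print(video_delay_ms("SBC"))     # 200
```

Because the delay is a fixed, preemptive estimate, a mid-stream stutter still knocks audio and video out of step until the buffer recovers, which matches the caveat above.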
Ultimately, this barrage of information is hard to keep straight. To sum it up in a thought: aptX and aptX HD are the codecs of choice for Android users, while iPhone users should stick to AAC-supported headphones. To continue geeking out over Bluetooth codecs, check out our in-depth explainer.