It’s loss-less, not loss-none
Dang it, was going to make this same joke lol
It’s a good joke
We really need someone other than Qualcomm & Apple to come up with lossless Bluetooth audio codecs.
TBF the whole Bluetooth audio situation is a complete mess
Opus! It’s a merge of a codec designed for speech (from Skype!) with one designed for high quality audio by Xiph (same people who made OGG/Vorbis).
Although it needs some more work on latency: by default it prefers bigger frames than Bluetooth packets like, but I’ve seen there’s work on standardizing a version that fits Bluetooth. Google even has it implemented now on Pixel devices.
Fully free codec!
opus isn’t lossless
Nobody needs lossless over Bluetooth
Edit: plenty of downvotes by people who have never done ABX tests comparing high quality lossy versus lossless
At high bitrates you literally can’t distinguish lossy from lossless. There’s math to prove it:
https://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem
At 44.1 kHz / 16 bit, with good encoders at 192 kbps or higher, your ear literally can’t physically discern the difference
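A quick back-of-the-envelope check of those numbers (the ~20 kHz hearing ceiling is my own assumed figure, not something stated above):

```python
# Rough sanity check of the CD-quality numbers above.
SAMPLE_RATE_HZ = 44_100    # CD sample rate
BIT_DEPTH = 16
CHANNELS = 2               # stereo
HEARING_LIMIT_HZ = 20_000  # commonly cited upper limit of human hearing (assumption)

nyquist_hz = SAMPLE_RATE_HZ / 2                        # highest representable frequency
raw_kbps = SAMPLE_RATE_HZ * BIT_DEPTH * CHANNELS / 1000

print(f"Nyquist limit: {nyquist_hz:.0f} Hz (vs ~{HEARING_LIMIT_HZ} Hz hearing limit)")
print(f"Raw CD PCM bitrate: {raw_kbps:.0f} kbps")      # ~1411 kbps before any compression
```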
The minute lossless becomes available wirelessly I’ll ditch my ridiculous headphone cable.
Nobody “needs” to listen to music over Bluetooth at all, but why not make it sound like it’s supposed to?
Why use lossless for that when transparent lossy compression already does that with so much less bandwidth?
Opus is indistinguishable from lossless at 192 Kbps. Lossless needs roughly 800 - 1400 Kbps. That’s a savings of between 4x - 7x with the exact same quality.
Your wireless antenna often draws more energy in proportion to bandwidth use than the decoder chip does, so using high quality lossy even gives you better battery life, on top of also being more tolerant to radio noise (easier to add error correction) and having better latency (less time needed to send each audio packet). And you can even get better range with equivalent radio chips due to needing less bandwidth!
You only need lossless for editing or as a source for transcoding, there’s no need for it when just listening to media
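To put the savings claim above in concrete numbers, here is a rough sketch; the 20 ms frame duration and the exact lossless bitrates are assumptions for illustration:

```python
# Illustrative numbers for the bandwidth argument above (assumed, not measured).
OPUS_KBPS = 192                  # "transparent" lossy bitrate discussed above
LOSSLESS_KBPS = (800, 1400)      # rough FLAC range for 44.1 kHz/16-bit stereo
FRAME_MS = 20                    # a common audio frame duration (assumption)

for lossless in LOSSLESS_KBPS:
    print(f"{lossless} kbps lossless vs {OPUS_KBPS} kbps Opus: {lossless / OPUS_KBPS:.1f}x more data")

# Fewer bytes per frame means less radio air time per frame, which is where
# the battery, latency and error-correction headroom arguments come from.
for kbps in (OPUS_KBPS, *LOSSLESS_KBPS):
    bytes_per_frame = kbps * 1000 / 8 * FRAME_MS / 1000
    print(f"{kbps:>5} kbps -> {bytes_per_frame:.0f} bytes per {FRAME_MS} ms frame")
```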
This has strong “nobody needs a monitor over 120Hz because the human eye can’t see it” logic. Transparency is completely subjective and people have different perceptions and sensitivities to audio and video compression artifacts. The quality of the hardware playing it back is also going to make a difference, and different setups are going to have a different ceiling for what can be heard.
The vast majority of people are genuinely going to hear zero difference between even 320kbps and a FLAC but that doesn’t mean there actually is zero difference, you’re still losing audio data. Even going from a 24-bit to a 16-bit FLAC can have a perceptible difference.
The Nyquist-Shannon sampling theorem isn’t subjective, it’s physics.
Your example isn’t great because it’s about misconceptions about the eye, not about physical limits. The physical limits for transparency are real and absolute, not subjective. The eye can perceive quick flashes of objects that take less than a thousandth of a second. The reason we rarely go above 120 Hz for monitors (other than cost) is that differences in continuous movement can barely be perceived, so it’s rarely worth it.
We know where the upper limits for perception are. The difference typically lies in the encoder / decoder or physical setup, not the information a good codec is able to embed at that bitrate.
for bluetooth to be a proper replacement for wired audio it needs to support 56kbps dial up.
Ah yes, good old TS3 and Mumble times.
discord also uses opus
is opus the one that allows high quality mic and headphone at the same time over Bluetooth?
That’s more than a codec question, that’s a Bluetooth audio profile question. Bluetooth LE Audio should support higher quality (including with Opus)
Isn’t LDAC made by sony?
Correct. Qualcomm makes aptX
Proprietary by Sony, but they did open source it
Don’t they make the encoder free, but license the decoder?
Wait, did Apple implement its own codec? I thought even the Airpods Max used AAC, which is lossy.
As for Qualcomm, only aptX Lossless is lossless and I’m not aware of many products supporting it (most support aptX HD at most)
Yeah, the problem (imo) isn’t lossy v lossless. It’s that the supported codecs are part of the Bluetooth standard and they were developed in like the 90s.
There are far better codecs out there and we can’t use them without incompatible extensions on Bluetooth.
There’s a push for Opus now, it’s the perfect codec for Bluetooth because it’s a singular codec that fits the whole spectrum from low bandwidth speech to high quality audio, and it’s fully free
Opus is great, but there is no option to make it lossless, like what WavPack (also a free-as-in-freedom codec) provides for example.
Transparency is good enough, it’s intended to be a good fit for streaming, not masters for editing
Why not have the option for true lossless available so that Bluetooth can be scaled up to sound good on even the highest end of systems.
You literally cannot distinguish 192 kbps Opus from true lossless. Not even with movie theater grade speakers. You only benefit from lossless if you’re editing / applying multiple effects, etc, which you will not do at the receiving end of a Bluetooth connection.
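If you want to test that on your own ears, here is a minimal sketch for generating a comparison pair; it assumes ffmpeg with libopus support is installed, and the filenames are placeholders of mine:

```python
# Minimal sketch: produce a lossless reference and a 192 kbps Opus candidate
# from the same source, so you can run your own blind ABX comparison.
# Assumes ffmpeg with libopus is on PATH; filenames are placeholders.
import subprocess

def make_abx_pair(source_wav: str) -> None:
    # Lossless reference keeps the exact PCM samples.
    subprocess.run(
        ["ffmpeg", "-y", "-i", source_wav, "-c:a", "flac", "reference.flac"],
        check=True,
    )
    # Lossy candidate at the bitrate discussed above.
    subprocess.run(
        ["ffmpeg", "-y", "-i", source_wav, "-c:a", "libopus", "-b:a", "192k", "candidate.opus"],
        check=True,
    )

make_abx_pair("sample.wav")
```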
The newer H2 SoC AirPods support ALAC, Apple’s lossless codec; however, their phones don’t yet support it, so the only way to use it is with the Vision Pro.
AFAIK, ALAC will not be actually lossless over bluetooth for the same reason LDAC can’t be lossless; there simply isn’t enough bandwidth. That doesn’t mean that it won’t sound great or perhaps work better than LDAC.
It runs over a 5GHz connection, not a 2.4GHz connection like bluetooth.
Oh, so they aren’t on bluetooth at all? That is an entirely different story, thanks for the info.
Sony created LDAC
On Windows, the Alternative A2DP Driver provides LDAC support. It’s a few bucks, but also the only option I know of.
Just use uncompressed 16bit/48khz! We’re not bats that would need 96khz audio!
Well bluetooth doesn’t carry enough bitrate to accomplish this. Besides, Apple won’t and doesn’t need to, because their AAC encoder is superior. There is no other bluetooth codec that comes even close. Every codec that claims to be the best one yet is more marketing than anything.
Vendors reframed the narrative for SBC to be dog shit so they can push their own as cutting edge new tech. In reality SBC isn’t that bad. The vendor codecs aren’t that good. And Apple has some kind of secret sauce in their AAC encoder that results in really good quality reproduction of audio.
As far as I’ve seen most of the gimmicky codecs are spins of existing old technology. AAC itself is old too, but at least one vendor, Apple, has focused on making their implementation good. We don’t need another standard+1. We just need a common standard done well. If only Apple would open theirs.
Except Opus. It beats it at most bitrates
BT 5 has a max bandwidth of 2 Mbps, which would in theory be enough for “CD quality”, i.e. 44.1 kHz/16-bit raw uncompressed audio, as that’s around 1.4 Mbps. In real life conditions it isn’t. AFAIK aptX Lossless gets close by doing some compression.
But if you go full audiophile levels and start demanding lossless 192 kHz 24-bit audio, that’s over 9 Mbps and not even remotely possible over BT no matter what you’d try.
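The raw PCM arithmetic behind those figures, assuming stereo:

```python
# Raw PCM bitrate for the formats mentioned above (stereo assumed, no compression).
def pcm_kbps(sample_rate_hz: int, bits: int, channels: int = 2) -> float:
    return sample_rate_hz * bits * channels / 1000

print(pcm_kbps(44_100, 16))    # 1411.2 kbps -> "CD quality"
print(pcm_kbps(48_000, 16))    # 1536.0 kbps
print(pcm_kbps(192_000, 24))   # 9216.0 kbps -> ~9.2 Mbps, far beyond Bluetooth
```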
Ah, misleading use of terminology that indicates one thing, but will win in court even if it actually means, or can later be said to mean, another.
I hope those involved in helping companies win these lawsuits choke on bones from food sold as boneless. Because that won a court case after “boneless” was redefined as a cooking method.
I don’t want them to choke to death. Just a little lesson, you know?
I vote they choke indefinitely. But not to death; I want them to die of old age, spending decade upon decade choking endlessly.
I work in pro AV and so many companies do this. Wow, you say LOSSLESS video on a Valens chip? Oh, you’ve never actually done a side-by-side comparison, have you…
Extron differentiates between lossless and “visually lossless” which I appreciate.
Transparent or indistinguishable lossy compression are other common terms
I’m perfectly OK with those, too.
Of course it’s Ohio!
Just a little to death.
As unfortunate as the naming misdirection is, I have to say: LDAC sounds significantly better (to me) than other Bluetooth codecs I have tried. It also works on Linux and android with no issues whatsoever. Open source is good.
I use it with a pair of Sony XM5’s, which can also be used in wired mode, so you kind of get the best of both worlds.
at high signal strength LDAC should default to 990 kbps… which is kind of ridiculous since it’s higher than some lossless formats, like uncompressed 16-bit 48 kHz (which is higher than standard CD quality)
Uncompressed 16 bit 48KHz stereo is 1536 kbps, which is just slightly higher than what bluetooth 5 is capable of.
Oh I forgot about stereo, ha.
The bitrate is manually enforceable on Linux, too
*specifically using PipeWire
PipeWire or the PulseAudio Bluetooth codec add-on. The PipeWire implementation seems to mimic the old PulseAudio plugin.
That’s assuming raw PCM data, no compression (lossy or lossless) whatsoever.
LDAC can do lossless redbook audio (16 bit 44.1 KHz) at 990kbps. All other modes are lossy.
It’s probably doing something much like FLAC: lossy encoder + residual corrections to ensure you get the original waveform back out, but with less bandwidth than raw PCM.
I highly doubt that. Do a proper ABx test (such as the one on digitalfeed.xyz) I have yet to meet someone who can pass the tests with a reasonable degree of accuracy.
You highly doubt my personal experience?
Do you mean abx.digitalfeed.net?
Many lossless codecs are lossy codecs + residual encoders. For example, FLAC has a predictor (a lossy codec) + residual.
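A toy sketch of that predictor-plus-residual idea (a first-order predictor, not the actual FLAC algorithm) to show why the roundtrip is bit-exact:

```python
# Toy "predictor + residual" coder: predict each sample from the previous one,
# store only the prediction error, and reconstruct exactly. Real FLAC uses
# higher-order predictors and Rice-codes the residuals, but the principle is the same.
def encode(samples: list[int]) -> list[int]:
    residuals, prev = [], 0
    for s in samples:
        residuals.append(s - prev)  # small residuals are what compress well
        prev = s
    return residuals

def decode(residuals: list[int]) -> list[int]:
    samples, prev = [], 0
    for r in residuals:
        prev += r
        samples.append(prev)
    return samples

pcm = [0, 3, 7, 8, 6, 2, -1]
assert decode(encode(pcm)) == pcm  # bit-exact reconstruction: lossless
```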
Could also stand for Lazy DumbAss Cat if the pic is any relation
Does this meme format / cat have a name? I was trying to find the raw version the other day and could not.
“Cat looks inside”
> knowyourmeme link
> look inside
> cat
Thanks!
To my knowledge it’s lossless in CD quality only, in high-res modes it becomes lossy
It’s nearly lossless if you can connect and maintain a 990 kbps connection, but it still doesn’t have enough bandwidth to do it truly lossless. I think it would require 1411 kbps to be actually lossless. It is still better than any codec I know of for bluetooth as far as that goes, but bluetooth just kinda sucks for that sort of application.
1411 kbps before compression. FLACs can go as low as 200 kbps based on the content of a file
Interesting. If that is so, then I am surprised that neither actually supports true lossless at that res without blowing up the noise floor.
FLAC is a lossless compression format. It will reduce file size but keeps the audio quality. So-called “high-res” formats on streaming platforms like Spotify (mandatory fuck Spotify here) are usually 320 kbps MP3, so heavily compressed and lossy, indeed.
“On 17 September 2019, the Japan Audio Society (JAS) certified LDAC with their Hi-Res Audio Wireless certification.”
Something something oxymoron. Bluetooth is trash, it’s why I still use wired whenever I can.
I don’t understand what’s funny. It’s developed with no competition, it’s open source, it’s definitely better than the current options out there and doesn’t cost money. Is it just audio snobs in here? I consider myself somewhat snobby re:audio but even I use wireless headphones. Some grade A snobbery in this thread. LDAC is great. You’re not convincing anyone to go back to wired headphones for day to day use
it’s as simple as
loss-less vs. lossy
within only a few words of the main description of the thing - no judgement on the tech whatsoever (at least from my side)
Ignorant of the subject matter, but I ripped a bunch of CDs to FLAC some time ago. Would that not work for this purpose?
The Sound Guys do a good job of breaking down LDAC; however, the main point of criticism I have about the article is that they say that LDAC isn’t great because most smartphones don’t auto-choose the highest 990 kbps bitrate. That doesn’t seem like an LDAC problem, that seems like a phone problem. My phone is admittedly a Sony, but it always chooses the highest bitrate first. There’s even a setting to force it to use 990.
The other criticism I have is that the sound guys kind of overlook the fact that, when your phone is in your pocket, it’s close enough to the headphones that you’ll almost always get the 990 bitrate. And the sound quality at 990 is fantastic. I cannot tell a difference between it and a wired connection for CD-quality FLACs. Even the 660 stepdown bitrate of the LDAC codec is really good.
Ldac is a Bluetooth thingy, so my understanding is that flacs will be re-encoded on the fly when you play 'em on bt headphones with ldac.
Bluetooth has fairly low bitrate which also helps save power. The throughput will also vary with signal quality. It needs to somehow adjust to worse conditions, otherwise it will just keep cutting out. Streaming CD quality FLAC could probably be done over Bluetooth 5 2M PHY, but 2Mbps is just the physical layer. There’s also some overhead. Perhaps just enough would be left, but the bitrate will also vary with the content. Not everything can be compressed much, while some audio can be compressed quite a bit.
Probably would work, but the reliability is also a question.
Anyway, just guessing. Perhaps the 3Mbps EDR could be used just fine.
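A back-of-the-envelope version of that guess; the usable fraction and the FLAC bitrates here are pure assumptions on my part:

```python
# Pure guesswork illustration: does average-rate CD-quality FLAC fit under BT5 2M PHY?
PHY_KBPS = 2000            # Bluetooth 5 LE 2M PHY raw rate
USABLE_FRACTION = 0.6      # assumed share left after protocol overhead (guess)
FLAC_AVG_KBPS = 900        # assumed typical CD-quality FLAC average (content-dependent)
FLAC_PEAK_KBPS = 1400      # near-incompressible passages approach raw PCM

usable = PHY_KBPS * USABLE_FRACTION
print(f"usable ~{usable:.0f} kbps: "
      f"average {'fits' if FLAC_AVG_KBPS < usable else 'does not fit'}, "
      f"peaks {'fit' if FLAC_PEAK_KBPS < usable else 'do not fit'}")
```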
Oh, Bluetooth 3.0 + HS could do 24Mbps. Sort of. It used WiFi to do that.
My favorite is that most people are listening to already lossy compressed music that gets decoded and then recompressed in another lossy manner… I miss my cable sometimes.