[–] aleph@lemm.ee 6 points 8 months ago (3 children)

But 24-bit audio is useless for playback. The difference is literally inaudible. In fact, the application of dynamic range compression during the mixing/mastering process has a far greater impact on perceptible audio quality than bit depth or bitrate does (the placebo effect notwithstanding).

If you care about audio quality, seek out album masters and music that is well-recorded and not dynamically crushed to oblivion. The bitrate isn't really all that important, in the greater scheme of things.
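If you want to see what "dynamically crushed" means in numbers, crest factor (peak-to-RMS ratio) is one rough proxy. Here's a toy sketch of my own (not from any article, just an illustration) comparing a clean sine wave to a hard-clipped copy of it:

```python
import numpy as np

def crest_factor_db(samples: np.ndarray) -> float:
    """Peak-to-RMS ratio in dB; heavily limited masters score lower."""
    peak = np.max(np.abs(samples))
    rms = np.sqrt(np.mean(samples ** 2))
    return 20 * np.log10(peak / rms)

t = np.linspace(0, 1, 44100, endpoint=False)
sine = np.sin(2 * np.pi * 440 * t)       # clean 440 Hz tone
crushed = np.clip(sine * 4, -1.0, 1.0)   # boosted then hard-clipped, "loudness war" style

print(crest_factor_db(sine))     # ~3 dB for a pure sine
print(crest_factor_db(crushed))  # much lower: clipping flattens the peaks
```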

[–] resetbypeer@lemmy.world 7 points 8 months ago (1 children)

I partially agree with you. Yes, mixing and mastering are far more important than bitrate. However, if I let my gf listen to an identical song in both the normal 16-bit/44.1 kHz version and the 24-bit version, she can hear a difference. Is it night and day? Not always, but a subtle improvement can matter when enjoying music.

[–] aleph@lemm.ee 7 points 8 months ago* (last edited 8 months ago)

Literally the only difference between 16-bit and 24-bit is that the latter has a lower noise floor, which is really only useful for sound production. It doesn't translate to any increase in meaningful detail or dynamic range when dealing with playback.

16-bit was chosen as the de facto standard for CDs and digital music precisely because it contains more than enough dynamic range for human hearing.
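For reference, the standard rule of thumb for linear PCM is roughly 6 dB of dynamic range per bit:

```python
def pcm_dynamic_range_db(bits: int) -> float:
    """Theoretical SNR of full-scale linear PCM: ~6.02 dB per bit, plus 1.76 dB."""
    return 6.02 * bits + 1.76

print(pcm_dynamic_range_db(16))  # ~98 dB
print(pcm_dynamic_range_db(24))  # ~146 dB
```

~98 dB already covers the span from a quiet room to painfully loud; the extra headroom of 24-bit only matters when recording and stacking processing, not when listening.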

Any difference your gf hears is due to the placebo effect rather than any inherent difference in the actual audio.

[–] datendefekt@lemmy.ml 2 points 8 months ago (1 children)

That write-up from Xiph is excellent. The comparison with adding ultraviolet and infrared to video makes so much sense. But you're dealing with audiophiles who seriously consider buying high-end power and Ethernet cables. I read somewhere about a listening test where the speakers were connected with coat-hanger wire, and the audiophiles couldn't tell.

In the end, it's all physics. I could never hear a quality improvement beyond normal 16-bit, 320 kbps, no matter how demanding the music.

[–] aleph@lemm.ee 2 points 8 months ago

As a recovering audiophile, I can safely say the hobby is heavily based around FOMO (the nagging doubt that something, somewhere, in your audio chain is causing a loss of audio quality), and digital audio is no exception. Not only is 320kbps more than enough, even with $1000s worth of equipment, but with codecs more efficient than MP3 (especially Opus), even 128kbps can be good enough to sound identical to lossless.

If you have plenty of local storage then 16-bit FLAC is ideal, but if you are just streaming then you really don't need a lossless service except to keep the FOMO at bay.
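If you want to test that yourself, one simple way (assuming you have an ffmpeg build with libopus on your PATH; the file names here are just placeholders) is to transcode a lossless rip to 128 kbps Opus and ABX it against the original:

```python
import subprocess

# Placeholder file names; requires an ffmpeg build with libopus.
subprocess.run(
    ["ffmpeg", "-i", "track.flac", "-c:a", "libopus", "-b:a", "128k", "track.opus"],
    check=True,
)
```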

[–] prole@sh.itjust.works 1 points 8 months ago (1 children)

Anyone who has ever heard a 128kbps mp3 side-by-side with a 320kbps (or really anything above 192kbps in my experience) version can tell you that bitrate definitely matters. The better audio equipment you play it through, the more noticeable it is.

It definitely becomes inaudible at a certain point, but back in my CD-ripping days, I'd scoff at anything below 192kbps.

[–] aleph@lemm.ee 1 points 8 months ago (1 children)

Have you ever done an actual double blind listening test? You'd be surprised. Even with good listening equipment it can be very challenging.

Have a go on the 128 kbps AAC test on this page and see how you do:

https://abx.digitalfeed.net/spotify.html
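For anyone wondering how an ABX result is judged: it's just a binomial test against guessing. A rough sketch (the trial counts are only examples):

```python
from math import comb

def guess_probability(correct: int, trials: int) -> float:
    """Chance of getting at least `correct` answers right by coin-flipping (p = 0.5)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(guess_probability(12, 16))  # ~0.038, below the usual 0.05 cutoff
print(guess_probability(9, 16))   # ~0.40, indistinguishable from guessing
```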

[–] prole@sh.itjust.works 1 points 8 months ago (1 children)
[–] aleph@lemm.ee 1 points 8 months ago

Presumably it was using an older or worse encoder, then. With modern encoders, especially for codecs like Opus, Vorbis, and AAC (Apple's encoder in particular), the vast majority of listeners find 128kbps to be transparent, and certainly nowhere near night-and-day when compared to lossless.

Check out the results of this public listening test here:

https://listening-test.coresv.net/results.htm