Direct Stream Digital Audio

A Music Lover Creates His Own Test for Dynamic Range -- No Surprise, DSD Wins!


Written by David Slattery of the Colorado Audiophile Society

Note: If you have questions or comments for David, please leave them in the "comment" section below. Comments may take 24 hours to post.

After reading many articles about Dynamic Range (DR), I wanted to take a look at my own collection and see which format had the best dynamic range. I then wanted to look inside that format and see where the best dynamic ranges come from.

Before I get started, here is how I did it. JRiver has a function that will analyze every track you select; however, I wanted an average for each album, and this is where things get tricky. At the end of this article I give complete instructions on how to do it. Once I had the album calculations, I could analyze my own collection.
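As a rough illustration of the album-averaging step (this is not JRiver's internal code; the album names and DR values are made up), here is how per-album averages can be computed from per-track DR numbers:

```python
from collections import defaultdict

# Hypothetical per-track data: (album, track DR) pairs as an
# audio-analysis tool would report them.
tracks = [
    ("Kind of Blue", 14), ("Kind of Blue", 13), ("Kind of Blue", 15),
    ("Brothers in Arms", 11), ("Brothers in Arms", 12),
]

# Accumulate a running sum and a track count per album, then divide.
sums = defaultdict(int)
counts = defaultdict(int)
for album, dr in tracks:
    sums[album] += dr
    counts[album] += 1

album_dr = {album: sums[album] / counts[album] for album in sums}
print(album_dr)  # {'Kind of Blue': 14.0, 'Brothers in Arms': 11.5}
```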

I first compared Redbook, Hi-Res PCM, and DSD. The results clearly show that the DSD albums in my collection have the best dynamic range.

I then wanted to see whether certain albums had higher dynamic range. I first looked at the releases I had from Blue Coast Records and was very happy to see that every release fell into the Good range: the minimum dynamic range was 12, with the vast majority falling above 13.

I then looked at different decades to see if there were any patterns. I was actually very surprised to see that the results were very consistent in every decade.

Finally, I looked at the different genres. I do not have many albums that would be classified as Classical or Country, or from many other genres, so I really looked only at Jazz and Rock, and grouped the rest as everything else. I guess the results should not be surprising: the numbers for Jazz were extremely good, with 95% in the Good range. I should not have been surprised that Rock/Pop Rock had only 60% in the Good range. And I was very happy to see that the values for everything else were also very good.

For me this analysis clearly shows that DSD has the best DR values. I was not surprised, as I enjoy listening to DSD releases the most.
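The Bad/Transition/Good buckets used in the tables below can be reproduced with a simple classifier. The thresholds (1-7, 8-10, 11-13, 14-18) are taken from the tables that follow; the sample album DR values are invented:

```python
from collections import Counter

def dr_bucket(dr):
    """Map an album DR value to the categories used in this article."""
    if dr <= 7:
        return "Bad"
    if dr <= 10:
        return "Transition Bad"
    if dr <= 13:
        return "Transition Good"
    return "Good"

# Tally a (made-up) list of album DR values and report percentages,
# the same way each table below reports counts and percentages.
albums = [6, 9, 9, 12, 12, 13, 14, 15]
tally = Counter(dr_bucket(dr) for dr in albums)
for bucket, n in tally.items():
    print(f"{bucket}: {n} ({n / len(albums):.2%})")
```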

RedBook (1982 CDs)

1-7   Bad               227 (11.45%)
8-10  Transition Bad    780 (39.35%)
0-10  Bad (total)      1007 (50.76%)

11-13 Transition Good   782 (39.46%)
14-18 Good              194 (9.79%)
11-18 Good (total)      976 (49.24%)



Hi-Res PCM (1313 albums)

1-7   Bad                79 (6.02%)
8-10  Transition Bad    407 (31.00%)
0-10  Bad (total)       486 (37.01%)

11-13 Transition Good   668 (50.88%)
14-18 Good              160 (12.19%)
11-18 Good (total)      828 (63.06%)



Hi-Res DSD (887 albums)

1-7   Bad                 7 (0.79%)
8-10  Transition Bad    185 (20.86%)
0-10  Bad (total)       192 (21.65%)

11-13 Transition Good   474 (53.44%)
14-18 Good              222 (25.03%)
11-18 Good (total)      696 (78.47%)




Looking at just the DSD releases from Blue Coast Records:


Hi-Res Blue Coast DSD (68 albums)

1-7   Bad                 0 (0%)
8-10  Transition Bad      0 (0%)
0-10  Bad (total)         0 (0%)

11-13 Transition Good    26 (38.24%)
14-18 Good               42 (61.76%)
11-18 Good (total)       68 (100%)





Looking at the different decades:


1-10 Bad  4/78 (5%)

11-18 Good 74/78 (95%)




1-10 Bad 33/167 (20%)

11-18 Good 134/167 (80%)




1-10 Bad  65/226 (29%)

11-18 Good 161/226 (71%)




1-10 Bad 24/91 (26%)

11-18 Good 67/91 (74%)




1-10 Bad 13/72 (18%)

11-18 Good 59/72 (82%)




1-10 Bad 48/173 (28%)

11-18 Good  125/173 (72%)



1-10 Bad 5/80 (6%)

11-18 Good 75/80 (94%)





Looking at the different genres:


Jazz + Fusion

1-10 Bad 17/338 (5%)

11-18 Good  321/338 (95%)



Rock/Progressive Rock + Pop/Rock

1-10 Bad 124/309 (40%)

11-18 Good 185/309 (60%)



Everything Else

1-10 Bad 51/240 (21%)

11-18 Good 190/240 (79%)





If you are interested in having album-based calculations in JRiver, here are the instructions I used to make it happen. I copied these instructions from a post by Mark H on the JRiver forums.


Just copy and paste the code below into a smartlist and you're good to go:


[=save(0,var_number_of_tracks[album artist (auto)][album])1]=1
[=save(0,var_album_dynamic_range_sum[album artist (auto)][album])1]=1
[=save(math(1+load(var_number_of_tracks[album artist (auto)][album])),var_number_of_tracks[album artist (auto)][album])1]=1
[=save(math([Dynamic Range (DR)]+load(var_album_dynamic_range_sum[album artist (auto)][album])),var_album_dynamic_range_sum[album artist (auto)][album])1]=1
[=save(math(load(var_album_dynamic_range_sum[album artist (auto)][album]) / load(var_number_of_tracks[album artist (auto)][album])),var_album_dr[album artist (auto)][album])1]=1


The Album DR ends up in the variable: var_album_dr[album artist (auto)][album]

You could create a user field, e.g. Album DR, and set it as a calculated field to:

load(var_album_dr[album artist (auto)][album])

This would then allow you to use the field anywhere inside MC. But you would have to ensure you run the smartlist first to populate the field values; otherwise they would just be empty.

You'll notice I use [album artist (auto)][album] in the variables. This helps keep albums with the same title but different artists from clashing with each other, though it won't keep compilation albums with the same title apart. But that's another level of problem to solve.
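For readers who don't use JRiver, the same running-sum technique the smartlist implements with save() and load() can be sketched in Python. Here a (album artist, album) tuple plays the role of the [album artist (auto)][album] suffix in the variable names, and all library data is invented for illustration:

```python
# Emulate the smartlist's saved variables with dicts keyed by
# (album artist, album), so that same-titled albums by different
# artists don't clash -- the same reason the smartlist keys on both.
library = [
    # (album artist, album, track DR) -- hypothetical values
    ("Miles Davis", "Kind of Blue", 14),
    ("Miles Davis", "Kind of Blue", 15),
    ("Dire Straits", "Brothers in Arms", 12),
]

number_of_tracks = {}    # mirrors var_number_of_tracks...
dynamic_range_sum = {}   # mirrors var_album_dynamic_range_sum...
for artist, album, dr in library:
    key = (artist, album)
    number_of_tracks[key] = number_of_tracks.get(key, 0) + 1
    dynamic_range_sum[key] = dynamic_range_sum.get(key, 0) + dr

# mirrors var_album_dr: running sum divided by track count
album_dr = {k: dynamic_range_sum[k] / number_of_tracks[k]
            for k in number_of_tracks}
print(album_dr[("Miles Davis", "Kind of Blue")])  # 14.5
```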