Line-in bit depth for external phono stage?

DamagedGoods

I have an external phono stage connected to the line-in of my Ultra. The phono stage has variable gain, and I had to set it to the lowest possible setting or I would occasionally get a lot of distortion and nasty clicks and pops while playing records, which is not the case when connecting the phono stage directly to my amplifier. So I assumed there was too little headroom on the Ultra's line-in, but then I played around with the audio settings and changed the line-in input resolution (bit depth) from 16 bits to 24 bits, and all the distortion disappeared. I can now increase the gain on the phono stage by 6 dB without any audible distortion.

Why do I need 24-bit depth for a relatively low-resolution analogue source such as phono?
 
Have you tried reverting to 16 bits to check whether the distortion reappears?

It might just have been a temporary glitch in the config that got resolved by flipping the switch. Just a wild guess. I see no reason why 24 bit should change the handling of the input signal in that way.
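
For what it's worth, here is a minimal sketch (plain Python, nothing device-specific, assuming ordinary integer PCM conventions) of why bit depth on its own shouldn't move the clipping point: 0 dBFS full scale is the same ceiling in 16-bit and 24-bit capture, extra bits only lower the quantization noise floor, and 6 dB of analogue gain roughly doubles the amplitude either way.

import math

def db_to_linear(db: float) -> float:
    # Convert a gain change in dB to a linear amplitude factor.
    return 10 ** (db / 20)

# Full scale (0 dBFS) is the clipping ceiling at any bit depth;
# extra bits only push the quantization noise floor further down.
for bits in (16, 24):
    full_scale = 2 ** (bits - 1) - 1               # largest positive sample value
    lsb_dbfs = 20 * math.log10(1 / full_scale)     # one LSB relative to full scale
    print(f"{bits}-bit: full scale = {full_scale}, 1 LSB ≈ {lsb_dbfs:.1f} dBFS")

# 6 dB more analogue gain roughly doubles the amplitude hitting the ADC,
# regardless of how many bits the result is stored in.
print(f"+6 dB ≈ x{db_to_linear(6):.2f} in amplitude")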
 
I did try and the distortion came back. Very peculiar… Sample rate makes no difference, only bit depth.
 
Weird, indeed. But good you found out about it. Giving up 6 dB of gain on your phono stage for no good reason is not something you want.

Out of curiosity: have you tried compensating for the lost gain with positive digital pre-gain in the audio input settings?
 
I can compensate, and it does not create distortion, but I fear that it's more than just a headroom issue. 43 dB for MM or 63 dB for MC causes distortion. Lowering to 40 and 60 respectively solves the issue, but 3 dB is a very small margin, and I think I still get slightly more pops even then, compared to when I use 24 bits and higher gain.

I suppose there's no obvious reason NOT to use 24-bit depth, though. I would just like to understand why this issue occurs at 16-bit depth and whether it is a bug.
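
As a back-of-the-envelope illustration of how thin that 3 dB margin is in linear terms (just arithmetic, nothing measured on the device), continuing the same kind of Python sketch:

def db_to_linear(db: float) -> float:
    # Convert a gain change in dB to a linear amplitude factor.
    return 10 ** (db / 20)

# A 3 dB margin corresponds to peaks sitting only ~29% below the clipping
# ceiling in amplitude.
margin_db = 3.0
print(f"{margin_db} dB margin -> peaks only "
      f"{(1 - 1 / db_to_linear(margin_db)) * 100:.0f}% below full scale")

# Compensating the lost analogue gain with digital pre-gain simply multiplies
# every captured sample after the ADC, so it restores loudness but cannot
# recover detail already lost to clipping at the input.
print(f"+3 dB digital pre-gain ≈ x{db_to_linear(margin_db):.2f} per sample")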
 
Report it through the app feedback.
 