DamagedGoods
I have an external phono stage connected to the line-in of my Ultra. The phono stage has variable gain, and I had to set it to the lowest possible setting or I would occasionally get heavy distortion and nasty clicks and pops while playing records, which never happens when I connect the phono stage directly to my amplifier. So I assumed there was too little headroom on the Ultra's line-in, but then I played around with the audio settings and changed the line-in input resolution from 16-bit to 24-bit, and all the distortion disappeared. I can now increase the gain on the phono stage by 6 dB without any audible distortion.
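For reference on that 6 dB figure: each bit of resolution buys roughly 6 dB of dynamic range (the textbook figure for a full-scale sine is SNR ≈ 6.02·N + 1.76 dB). Below is a minimal Python sketch, purely illustrative and not specific to the Ultra's ADC or firmware, that requantizes the same test tone at 16 and 24 bits and prints the resulting quantization noise floor. The quantize() helper is hypothetical, just enough to show the relationship.

```python
import numpy as np

def quantize(signal, bits):
    """Requantize a float signal in [-1, 1) to the given bit depth,
    clipping anything beyond digital full scale."""
    levels = 2 ** (bits - 1)                  # 32768 steps for 16-bit
    clipped = np.clip(signal, -1.0, 1.0 - 1.0 / levels)
    return np.round(clipped * levels) / levels

fs = 48_000                                   # sample rate in Hz
t = np.arange(fs) / fs
tone = 0.5 * np.sin(2 * np.pi * 1000 * t)     # 1 kHz test tone at -6 dBFS

for bits in (16, 24):
    error = quantize(tone, bits) - tone       # quantization error only
    snr = 10 * np.log10(np.mean(tone ** 2) / np.mean(error ** 2))
    print(f"{bits}-bit quantization SNR: {snr:.1f} dB")
```

Run as-is this should print roughly 92 dB for the 16-bit case and 140 dB for the 24-bit case on the -6 dBFS tone, i.e. about 6 dB more range per extra bit. Note the sketch only illustrates the noise-floor side; both bit depths clip at the same digital full scale, so why the 24-bit setting fixed the clipping on this particular device is exactly the open question below.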
Why do I need 24-bit depth for a relatively low-resolution analogue source such as phono?