Confirming this Ultra developer flub

Dude

Active Member
Joined
Dec 10, 2025
Messages
78
I was shocked to see a bit depth no higher than 16 on my Ultra, so I ran this by AI:

“I play Apple Music in the Lysoniq app on an iPhone XR, out over USB to a ‘Cubilux USB A to TOSLINK Optical Audio Adapter’, into a WiiM Ultra. But the bit depth never goes higher than 16. The Cubilux adapter is rated at 24/192, yet songs played at that hi-res show as 16/192. What is happening?”

It gave this answer — is it correct?

“The issue is a cosmetic display limitation within the WiiM interface, not a hardware bottleneck. You are successfully playing 24-bit/192kHz audio.

The WiiM Optical Display Limitation

The TOSLINK (S/PDIF) protocol does not carry bit-depth metadata in a way that the WiiM Ultra can accurately translate to its display interface.

Previous Behavior: WiiM software originally defaulted to displaying "24-bit" for all optical connections. Users complained that standard 16-bit sources (like CD players and older TVs) were incorrectly labeled as high-res 24-bit.

Current Behavior: To resolve those complaints, WiiM updated their firmware to hardcode the optical input display to "16-bit" for all optical inputs, regardless of the actual incoming bit depth.

Hardware Verification

Your physical chain passes a bit-perfect 24/192 signal without downsampling:

iPhone XR (Apple Music): Outputs up to 24-bit/192kHz natively.

Lightning to USB / Cubilux Adapter: Converts and outputs the full 24/192 signal over optical.

WiiM Ultra: Receives and processes the 24/192 signal correctly, even though the UI only displays the 16-bit placeholder.

No configuration changes are necessary. Your system is operating at maximum resolution, and the 16-bit readout is strictly a visual quirk of how WiiM handles optical inputs.”

But what about the app? (actually what I was asking about)

“The WiiM app acts as a direct monitor for the WiiM Ultra's internal status. Because the WiiM Ultra's firmware defaults to labeling all incoming optical signals as 16-bit, the app simply mirrors that hardcoded label.”

I would suggest the WiiM engineers change this flub to show “16/24 bit” instead of either number, with “/” meaning “or”.
 
As has been mentioned in the forum, WiiM said they can't reliably extract the bit depth from the headers of an incoming digital signal. So originally it always displayed 24, until they changed it to 16 after some users complained that 24 was incorrect for input from a CD player, IIRC. If they changed it to "16/24", I think that would confuse matters more, as that format is normally used to show bit depth/sample rate. Even if they changed it to "-" or similar, users will still question what that means. But now that you know the background, does it still matter what it shows?
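For what it's worth, the ambiguity can be sketched in a few lines of Python. This is a toy illustration of the usual S/PDIF framing, not anything from WiiM's firmware: every S/PDIF (TOSLINK) subframe carries a fixed 24-bit audio field, and a 16-bit source is sent left-justified with the low byte zeroed, so the bits on the wire can look identical either way.

```python
# Toy sketch (assumption: standard IEC 60958-style framing, not WiiM code):
# why bit depth is ambiguous on S/PDIF/TOSLINK.

def pack_spdif_audio_field(sample: int, bit_depth: int) -> int:
    """Left-justify a sample of the given depth into the fixed 24-bit
    S/PDIF audio field; unused low bits are zero-padded."""
    return (sample & ((1 << bit_depth) - 1)) << (24 - bit_depth)

# A 16-bit sample, and a 24-bit sample whose low byte happens to be zero,
# produce exactly the same 24 bits on the wire:
wire_16 = pack_spdif_audio_field(0x1234, 16)
wire_24 = pack_spdif_audio_field(0x123400, 24)
assert wire_16 == wire_24 == 0x123400
```

So a receiver that sees zero low bits can't prove the source was 16-bit, and one that sees nonzero low bits only knows the depth is more than 16, which is presumably why any single number the display picks is a guess.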
 
As has been mentioned in the forum, WiiM said they can't reliably extract the bit depth from the headers of an incoming digital signal. So originally it always displayed 24, until they changed it to 16 after some users complained that 24 was incorrect for input from a CD player, IIRC. If they changed it to "16/24", I think that would confuse matters more, as that format is normally used to show bit depth/sample rate. Even if they changed it to "-" or similar, users will still question what that means. But now that you know the background, does it still matter what it shows?
Even though it’s early in the morning, that’s a spot-on beautiful answer. 😄
 
I was quite surprised that the AI response described the situation reasonably accurately, but I had to look up what “flub” meant, because it's not a slang word that I am familiar with. 🤔
 
As has been mentioned in the forum, WiiM said they can't reliably extract the bit depth from the headers of an incoming digital signal. So originally it always displayed 24, until they changed it to 16 after some users complained that 24 was incorrect for input from a CD player, IIRC. If they changed it to "16/24", I think that would confuse matters more, as that format is normally used to show bit depth/sample rate. Even if they changed it to "-" or similar, users will still question what that means. But now that you know the background, does it still matter what it shows?
TBH, personally I don't care much what it shows. But I am a strong proponent of 'if you do it, do it right'. If in doubt, it shouldn't show anything at all.
 
I was quite surprised that the AI response described the situation reasonably accurately, but I had to look up what “flub” meant, because it's not a slang word that I am familiar with. 🤔
I can't say how common it is, but the word obviously does exist.

flub.jpg
 
I was quite surprised that the AI response described the situation reasonably accurately, but I had to look up what “flub” meant, because it's not a slang word that I am familiar with. 🤔
Probably used the WiiM forum as its source...

And re "flub", I thought you as moderator would have translated non-English terms as you normally do 😜🤣
 
Probably used the WiiM forum as its source...

And re "flub", I thought you as moderator would have translated non-English terms as you normally do 😜🤣
AI tells me it's your (our) job...

A curator of words is someone who carefully selects, organizes, and presents language, literature, or information to make it meaningful for an audience, often acting as a specialized editor, content strategist, or language expert.
 
AI tells me it's your (our) job...

A curator of words is someone who carefully selects, organizes, and presents language, literature, or information to make it meaningful for an audience, often acting as a specialized editor, content strategist, or language expert.
It appears though that we are curators without portfolio as we can't edit posts ;)
 
TBH, personally I don't care much what it shows. But I am a strong proponent of 'if you do it, do it right'. If in doubt, it shouldn't show anything at all.
To be fair, that is what they do with Chromecast tracks: they show nothing, as again they can't accurately extract that info from such a stream. So maybe my suggestion of "-" should be upvoted ;)
 
How come it's not able to know the bit depth of the incoming optical signal? Then how the hell does the DSP work on the signal, and how does the DAC convert it to analogue, if they don't take the samples with the right number of bits?
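(For context: a receiver typically doesn't need the nominal bit depth at all. The sketch below is my assumption of the usual S/PDIF handling, not WiiM's actual code: it can treat every incoming audio field as 24-bit, because a 16-bit sample padded with zero low bits decodes to exactly the same value it had at 16 bits.)

```python
# Toy sketch (assumption: generic S/PDIF receive path, not WiiM code):
# the DSP/DAC can always treat the 24-bit audio field as a 24-bit sample.

def field_to_float(field24: int) -> float:
    """Interpret the 24-bit audio field as two's-complement and scale to [-1, 1)."""
    if field24 & 0x800000:        # sign bit of a 24-bit value
        field24 -= 1 << 24
    return field24 / float(1 << 23)

# A 16-bit sample arrives left-justified (padded with a zero low byte),
# and decodes to the same value it had as a 16-bit sample:
sample16 = 0x1234                 # positive 16-bit sample
padded = sample16 << 8            # as it sits in the 24-bit field
assert abs(field_to_float(padded) - sample16 / float(1 << 15)) < 1e-12
```

So the zero padding is numerically harmless; the unit can process and convert everything as 24-bit without ever knowing what the source's nominal depth was. Only the *display* of that depth is a guess.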
 
Or maybe it should say "how the hell you don't know your own source's bit depth, stupid".

A tad long, but it could be shortened to "hthydkyosbds".
 
I was quite surprised that the AI response described the situation reasonably accurately, but I had to look up what “flub” meant, because it's not a slang word that I am familiar with.
Cough, human in control, cough. Don’t make me flabbergasted!
 