I am trying to play AAC audio from my PC over HDMI, with the HDMI output connected to my receiver. My receiver is capable of decoding an AAC stream.
I have tried the MPC-HC and MPC-BE players without success.
I also tried Kodi with the audio passthrough option enabled, but it still outputs the AAC as PCM.
I suspect those players are decoding AAC to PCM. I don't want the AAC stream decoded to any other format (e.g. PCM, AC3, etc.); I just want the output to remain AAC.
Please help me with this issue, guys.
What is your HDMI card?
Does the HDMI spec even support AAC?
I can't find any info regarding this.
AFAIK HDMI supports bitstream or LPCM up to 8 channels, but there's no mention of AAC.
It ultimately makes no difference whether HDMI carries AAC or not. The path still has to be AAC -> LPCM -> analog -> speakers -> ears. Moving the decode to LPCM later in the chain only buys you a lower data rate down the HDMI pipe, and those differences are minuscule and noteworthLESS in the overall data-rate scheme of things. Who cares whether you use 75(v)+1(a)=76% or 75+0.1=75.1% of the bandwidth?
The only other difference is the decoder algorithm, and it is very likely that the AAC decoder on your computer is noticeably better quality than the one in your TV.
Last edited by Cornucopia; 3rd Nov 2017 at 22:44.
Last edited by pandy; 4th Nov 2017 at 05:21.
From a pure data-rate standpoint, that's BS and you know it. 24-bit LPCM × 192 kHz ≈ 4.6 Mbps per channel. Even with 200 channels, that doesn't come close to the bandwidth of HDMI. Long before it hits that point, other constraints would kick in elsewhere.
Embedded decoders might be more compliant, but just like scalers, what you can do in software can far surpass what you can do in embedded firmware, which often has memory, computation, and update/improvement constraints.
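The arithmetic behind that claim can be checked quickly. A minimal sketch, assuming raw 24-bit/192 kHz LPCM with no packing overhead and taking HDMI 1.4's roughly 10.2 Gbps total TMDS bandwidth as the reference figure:

```python
# Back-of-the-envelope check of the LPCM bandwidth claim.
# Assumptions: 24-bit samples, 192 kHz sample rate, raw LPCM (no packing
# overhead); HDMI 1.4 total TMDS bandwidth taken as ~10.2 Gbps.

BITS_PER_SAMPLE = 24
SAMPLE_RATE_HZ = 192_000
HDMI_1_4_BANDWIDTH_BPS = 10_200_000_000

per_channel_bps = BITS_PER_SAMPLE * SAMPLE_RATE_HZ   # 4,608,000 bps ~ 4.6 Mbps

channels = 200
total_audio_bps = channels * per_channel_bps          # ~0.92 Gbps

print(per_channel_bps / 1e6)                          # ~4.6 Mbps per channel
print(total_audio_bps / HDMI_1_4_BANDWIDTH_BPS)       # ~0.09, under 10% of the link
```

So even the deliberately absurd 200-channel case uses under a tenth of the link, which is the point being made: audio data rate is never the limiting factor here.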
Still curious as to why the OP even considers software decoding an issue...
You can't transmit more than 8 channels of 192 kHz LPCM over HDMI - any higher sample rate or channel count is against the standard - and compressed audio such as AAC must be encapsulated within IEC 61937.
Embedded decoders are real-time: they provide predictable behaviour. Besides that, they are better tested and usually more compliant (and frequently more flexible and capable).
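For reference, IEC 61937 framing wraps each compressed frame in a data burst with four 16-bit preamble words. A rough sketch, not a full implementation: the sync words Pa/Pb are from the spec, but the MPEG-2 AAC data-type code (7) and the word-alignment padding shown here are simplified assumptions - consult IEC 61937 for the exact per-format values:

```python
# Illustrative sketch of IEC 61937 burst framing for a compressed audio frame.
# Pa/Pb are the spec's sync preamble; the AAC data-type code is an assumption.
import struct

PA, PB = 0xF872, 0x4E1F          # sync preamble words (from IEC 61937)
DATA_TYPE_MPEG2_AAC = 7          # Pc data-type field (assumed value for AAC)

def make_burst(payload: bytes) -> bytes:
    pc = DATA_TYPE_MPEG2_AAC     # burst-info word carrying the data type
    pd = len(payload) * 8        # burst length in bits
    header = struct.pack('<4H', PA, PB, pc, pd)
    # Pad the payload to a 16-bit boundary, since bursts are word-aligned.
    if len(payload) % 2:
        payload += b'\x00'
    return header + payload

burst = make_burst(b'\xff\xf1' + b'\x00' * 100)  # fake ADTS-like AAC frame
```

The receiver detects the Pa/Pb sync pattern in what otherwise looks like an LPCM stream, reads the data type from Pc, and hands the payload to the matching decoder - which is exactly what "passthrough" means on the player side.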
Yes, those are some of the constraints I was referring to. But the OP was talking about simple, normal-samplerate, normal-bitrate AAC vs LPCM, and both of those are quite easily doable over HDMI without any hacks whatsoever, so the location of the decoder in the chain shouldn't have any appreciable effect WRT the data burden. We are trying to discuss the OP, right?
Not all hardware is updatable, or field updatable, and not all field-updatable hardware has enough room (memory). Tech isn't as thoroughly & universally advanced as you might want it to be.
I already mentioned they are more compliant, so not sure why you're reiterating it except to one-up me.
And you are putting more weight on "realtime OS" than is warranted. Part of the nature of digital audio and video is buffering, which is the great leveller of smoothness in data output. Except in realtime "live/interactive" situations (videoconferencing, etc.), small amounts of total latency are unnoticeable.
So can we conclude that it is possible to bitstream AAC through hdmi to a capable receiver?
And that it doesn't make much difference where in the chain the decoding happens?
If so, the OP doesn't need to worry that the AAC is decoded in the player.