VideoHelp Forum
  1. Hi All,

    I am trying to play AAC audio from my PC over HDMI, with the HDMI output connected to my receiver. My receiver is capable of decoding AAC streams.

    I have tried the MPC-HC and MPC-BE players and didn't succeed.

    I also tried Kodi with the audio passthrough option enabled, but it still plays the AAC as PCM data.

    I suspect those players are converting AAC to PCM. I don't want the AAC stream decoded to any other format (e.g. PCM, AC3, etc.). I just want the output to stay AAC.

    Please help me out on this issue, guys.
  2. Member
    Join Date
    Aug 2010
    Location
    San Francisco, California
    What is your HDMI card?
  3. Member
    Join Date
    Mar 2008
    Location
    Netherlands
    Does the HDMI spec even support AAC?
    I can't find any info regarding this.

    AFAIK HDMI supports bitstream or LPCM up to 8 channels, but there's no mention of AAC
  4. Member
    Join Date
    Aug 2010
    Location
    San Francisco, California
    "HDMI can carry any currently available flavor of compressed audio format."
    https://www.hdmi.org/learningcenter/kb.aspx?c=11#38
  5. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    It ultimately makes no difference whether HDMI carries AAC or not. The path still has to be AAC -> LPCM -> analog -> speakers -> ears. Moving the decode to LPCM later in the chain only buys you a lower data bitrate down the HDMI pipe, and those differences are minuscule and noteworthLESS in the overall datarate scheme of things. Who cares if you use 75(v)+1(a) = 76% or 75+0.1 = 75.1% of the bandwidth?
    The only other difference is the decoder algorithm, and it is very likely that the AAC decoder in your computer is noticeably better quality than the one in your TV.

    Scott
    Last edited by Cornucopia; 3rd Nov 2017 at 21:44.
  6. Originally Posted by jan5678 View Post
    Does the HDMI spec even support AAC?
    I can't find any info regarding this.

    AFAIK HDMI supports bitstream or LPCM up to 8 channels, but there's no mention of AAC
    Excerpt from HDMI specification:

    [Attachment 43610 - excerpt from the HDMI specification]



    Originally Posted by Cornucopia View Post
    It ultimately makes no difference whether hdmi carries aac or not.
    Maybe for this particular case, but certainly not in the overall context - a compressed stream may have more than 8 channels.

    Originally Posted by Cornucopia View Post
    Only other difference is decoder algorithm, and it is very likely that the aac decoder in your computer is noticeably better quality than the one in your tv.
    Sorry, but it is very unlikely that a PC AAC decoder is better than one embedded in a TV (or in consumer electronics generally).
    Last edited by pandy; 4th Nov 2017 at 04:21.
  7. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    From a straight datarate standpoint, that's BS and you know it. 24-bit LPCM × 192 kHz ≈ 4.6 Mbit/s per channel. Even with 200 channels, that doesn't come close to the bandwidth of HDMI. Long before it hit that point, other constraints would kick in elsewhere.
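    As a back-of-the-envelope check of the numbers above (the ~10.2 Gbit/s HDMI 1.4 throughput figure is my assumption, not from this thread):

```python
# Rough datarate check: raw LPCM per channel vs. total HDMI bandwidth.
BITS_PER_SAMPLE = 24
SAMPLE_RATE_HZ = 192_000
HDMI_1_4_MBPS = 10.2 * 1000  # approximate total HDMI 1.4 TMDS throughput

per_channel_mbps = BITS_PER_SAMPLE * SAMPLE_RATE_HZ / 1e6
print(f"one LPCM channel: {per_channel_mbps:.3f} Mbit/s")  # 4.608 Mbit/s

for channels in (2, 8, 200):
    total_mbps = channels * per_channel_mbps
    share = total_mbps / HDMI_1_4_MBPS * 100
    print(f"{channels:3d} ch: {total_mbps:7.1f} Mbit/s "
          f"({share:.2f}% of assumed HDMI 1.4 bandwidth)")
```

    Even the absurd 200-channel case stays under a tenth of the link's capacity.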

    Embedded decoders might be more compliant, but just as with scalers, what you can do in software can far surpass what you can do in embedded firmware, which often has memory, computation, and update/improvement constraints.

    Still curious as to why the OP even considers software decoding an issue...

    Scott
  8. Originally Posted by Cornucopia View Post
    From a straight datarate standpoint, that's BS and you know it. 24-bit LPCM × 192 kHz ≈ 4.6 Mbit/s per channel. Even with 200 channels, that doesn't come close to the bandwidth of HDMI. Long before it hit that point, other constraints would kick in elsewhere.
    Audio is sent only in very limited time slots (during the H and V blanking intervals) - to send some audio you have to force weird HDMI modes (like pixel-repetition modes), which is a dirty hack. Besides this, the HDMI standard defines particular limitations, and any HDMI-compliant sink or source must follow them.
    You can't transmit more than 8 channels of 192 kHz LPCM over HDMI - any higher sample rate or channel count is against the standard; such audio must be encapsulated within IEC 61937.
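    To make "encapsulated within IEC 61937" concrete, here is a minimal sketch of the data-burst header that wraps each compressed frame before it rides the 16-bit PCM carrier. The Pa/Pb sync words are the standard's published constants; the AAC data-type code and the byte order are my assumptions, not taken from this thread:

```python
import struct

PA, PB = 0xF872, 0x4E1F   # IEC 61937 burst-preamble sync words
DATA_TYPE_AAC = 7         # assumed burst-info data-type code for MPEG-2 AAC

def burst_header(payload: bytes) -> bytes:
    """Build the 4-word IEC 61937 preamble for one compressed frame."""
    pc = DATA_TYPE_AAC       # Pc: burst-info (data type in the low bits)
    pd = len(payload) * 8    # Pd: payload length in bits
    return struct.pack("<4H", PA, PB, pc, pd)

frame = bytes(768)           # stand-in for one compressed AAC frame
hdr = burst_header(frame)
print(hdr.hex())
```

    A real bitstream path would also pad each burst with zero samples out to the format's repetition period before handing it to the HDMI audio packetizer.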

    Originally Posted by Cornucopia View Post
    Embedded decoders might be more compliant, but just as with scalers, what you can do in software can far surpass what you can do in embedded firmware, which often has memory, computation, and update/improvement constraints.
    Funny... there is no hard-wired AAC decoder (AFAIK) - all "HW" decoders are DSP-based, quite commonly with the possibility of updating the code. The problem is that Windows (the most popular OS), and other popular OSes too, are not real-time OSes; they suffer from unavoidable limitations related to their nature.
    Embedded decoders are real-time: they provide predictable behaviour, and besides this they are better tested and usually more compliant (frequently also more flexible and capable).
  9. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    Yes, those are some of the constraints I was referring to. But the OP was talking about simple, normal-samplerate, normal-bitrate AAC vs LPCM, and both of those are quite easily doable over HDMI without any hacks whatsoever, so the location of the decoder in the chain shouldn't have any appreciable effect WRT data burden. We are trying to discuss the OP's case, right?

    Not all hardware is updatable, or field-updatable, and not all field-updatable hardware has enough room (memory). Tech isn't as thoroughly and universally advanced as you might want it to be.
    I already mentioned they are more compliant, so I'm not sure why you're reiterating that except to one-up me.

    And you are putting more weight on "real-time OS" than is warranted. Part of the nature of digital audio & video is buffering, which is the great leveller of smoothness in data output. Except in realtime "live/interactive" situations (videoconferencing, etc.), small amounts of total latency are unnoticeable.
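    The buffering point is easy to put numbers on. The buffer sizes below are illustrative assumptions, not measurements from any particular player:

```python
# Added latency from an audio playback buffer: samples / sample rate.
SAMPLE_RATE_HZ = 48_000

for buffer_samples in (512, 2048, 8192):
    latency_ms = buffer_samples / SAMPLE_RATE_HZ * 1000
    print(f"{buffer_samples:5d}-sample buffer -> {latency_ms:6.1f} ms")
```

    Even the generous 8192-sample buffer adds well under a fifth of a second, which is irrelevant for one-way playback.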

    Scott
  11. Member
    Join Date
    Mar 2008
    Location
    Netherlands
    So can we conclude that it is possible to bitstream AAC through HDMI to a capable receiver?
    And that it doesn't make much difference where in the chain the decoding happens?

    If so, the OP doesn't need to worry that the AAC is decoded in the player.


