Topic: Cross-platform AAC encoding: Nero vs Max/XLD (Read 14413 times)

Cross-platform AAC encoding: Nero vs Max/XLD

It seems that every time I stumble across an online discussion over AAC encoding, I read about Nero's AAC engine. And I'm not sure whether to attribute the frequent discussion of it to the fact that Windows is the dominant platform and Nero provides one of the best solutions for it, or the fact that Nero is actually better for AAC encoding than anything available on OS X.

So what I'm wondering is if somebody could settle for me whether or not Nero actually does a better job with VBR AAC than XLD or Max under OS X does. If so, setting up a virtual environment on my MacBook to run Windows may be worth it.

Thanks.
Altec UHP336s/Sennheiser eH 150s/AKG K171s & q113 tVBR AAC...


Reply #1
Search for Quicktime AAC. This has been discussed in the past.


Reply #2
Quote
Search for Quicktime AAC. This has been discussed in the past.


Ah, thanks. 


Reply #3
Nero does a great job too, and I noticed that at 192 kbps Nero produced smaller files than both QuickTime AAC and LAME 3.99's latest builds.

I hear no audible difference between any of them at 192 kbps, so since they all sound pretty much the same, the only way left to separate the encoders is how efficiently they do it.

Oh yeah, and I have QuickTime AAC... I bought it.


Reply #4
Quote
Nero does a great job too, and I noticed that at 192 kbps Nero produced smaller files than both QuickTime AAC and LAME 3.99's latest builds. I hear no audible difference between any of them at 192 kbps, so the only way left to separate the encoders is how efficiently they do it.


Sorry, that's irrelevant. It would be quite easy to make your codec always output smaller files than your competitor's at any given bitrate!

But yes, these are both indeed excellent codecs. Which one to use should depend on your environment: QuickTime is much easier to employ on a Mac, Nero on Windows.

BTW, the best quality-per-byte setting in QuickTime is the 127 setting of the true VBR encoder. That will also output ~192 kbit/s on average (over your whole collection), but it has much more freedom to scale for very easy and very complex passages. It is not available in iTunes.


Reply #5
Quote
It would be quite easy to make your codec always output smaller files than your competitor's at any given bitrate!


Brilliant idea!

Has anyone ever checked how many bits per second neroAacEnc actually uses for a given -br/-cbr setting?


Reply #6
And why would anyone care?

A 128 kbit/s CBR setting will give very close to 128 kbit/s on average. It's much more important to tune an encoder for quality than to patch it at all ends for a pinpoint landing on the target bitrate with byte precision. Nobody needs that. Slight over- and undershooting depends on the content.

There's no reason nowadays to care about exact bitrates for AAC anyway. VBR will give you much more quality per byte on average, and compatibility is great. That is much more relevant than whether an encoder in CBR mode outputs 129 kbit/s on average for track A and 127 kbit/s for track B.



Reply #8
Don't you think that the encoder's setting can be biased? Here is an example: I just encoded all the files from the BS.1387-1 CD, and there are few tracks of type "B"!

neroaacenc -lc -br 48000 -if infile.wav -of outfile.mp4

filename        bit/s
arefsna.mp4     60756
breftri.mp4     52356
crefsax.mp4     55630
erefsmg.mp4     49476
frefsb1.mp4     50209
freftr1.mp4     51587
freftr2.mp4     51587
freftr3.mp4     51587
grefcla.mp4     54819
irefsna.mp4     60807
krefsme.mp4     49432
lrefhrp.mp4     57226
lrefpip.mp4     50860
mrefcla.mp4     56625
nrefsfe.mp4     49266
srefclv.mp4     59752

The second column is the raw AAC bitrate, with MP4 headers excluded; the mean is 53873 bit/s. ADTS framing would add about 2625 bit/s on top.
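The quoted mean can be reproduced directly from the per-file bitrates in the table (a quick check, not from the original poster):

```shell
# Reproduce the quoted mean from the per-file raw-AAC bitrates above.
bitrates="60756 52356 55630 49476 50209 51587 51587 51587 \
54819 60807 49432 57226 50860 56625 49266 59752"
mean=$(printf '%s\n' $bitrates | awk '{s += $1; n++} END {printf "%d", s/n}')
echo "$mean"   # 53873, matching the quoted mean (~12% above the -br 48000 target)
```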


Reply #9
  • This test looks a little far-fetched. The clips seem to be quite short and you force the encoder into LC mode for a bitrate where HE/SBR would be the natural choice. Both factors can skew the results and don't seem necessary to make a point.
  • You are still declining to explain why all this should be important, even if an encoder had a slight bias in one direction or the other.
  • Strict CBR is the worst mode of operation one can imagine for AAC and MP3. Keeping the bitrate constant means a constant fluctuation between bit shortage (degraded quality) and surplus (waste of space). Letting the bitrate fluctuate slightly (or even better freely as in VBR), instead of enforcing strict limits for no obvious benefit, can considerably increase quality.


If you are interested in codec quality evaluation, sit down with a set of killer samples and try to ABX them. That is going to tell you much more than comparing bitrate deviations.

Choosing a target quality instead of a target bitrate will always give you more quality per byte, whether for storage or streaming.
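The allocation trade-off described in the last bullet point can be sketched with a toy model (purely illustrative; the complexity figures are made up and real encoders use psychoacoustic models, not this arithmetic): spend the same total number of bits either in equal shares per frame (CBR) or in proportion to each frame's difficulty (VBR-like).

```shell
# Toy sketch of the CBR-vs-VBR allocation trade-off (illustrative only, not a
# real codec model): per-frame "distortion" = complexity / bits spent on it.
read -r cbr_worst vbr_worst <<EOF
$(awk 'BEGIN {
  n = split("2 1 6 1 2", c, " ")         # hypothetical per-frame difficulty
  budget = 50                            # same total bits in both modes
  for (i = 1; i <= n; i++) total += c[i]
  for (i = 1; i <= n; i++) {
    cbr = c[i] / (budget / n)            # CBR: equal share per frame
    vbr = c[i] / (budget * c[i] / total) # VBR: bits follow difficulty
    if (cbr > cbr_max) cbr_max = cbr
    if (vbr > vbr_max) vbr_max = vbr
  }
  printf "%.2f %.2f", cbr_max, vbr_max
}')
EOF
echo "CBR worst-frame distortion: $cbr_worst"   # 0.60
echo "VBR worst-frame distortion: $vbr_worst"   # 0.24
```

At the same total bitrate, the equal-share split lets the hardest frame suffer, while the proportional split equalizes the distortion, which is the "bit shortage vs. surplus" fluctuation described above.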


Reply #10
-br is ABR, where the final bitrate will depend a lot on the content being encoded; with -cbr you will get very close to the requested bitrate.
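The three rate-control modes can be sketched as command lines (a sketch only: the -q switch is Nero's VBR quality control with a 0.0 to 1.0 range, the filenames are placeholders, and the exact switches should be verified against `neroAacEnc -help`):

```shell
# Sketch of neroAacEnc's three rate-control modes; input names are placeholders.
# Verify the switches with `neroAacEnc -help` before relying on them.
nero_vbr() { neroAacEnc -q 0.5      -if "$1" -of "${1%.wav}.m4a"; }  # target quality (VBR)
nero_abr() { neroAacEnc -br 192000  -if "$1" -of "${1%.wav}.m4a"; }  # average bitrate (ABR)
nero_cbr() { neroAacEnc -cbr 192000 -if "$1" -of "${1%.wav}.m4a"; }  # strict CBR
```

With -br the encoder aims for 192 kbit/s on average but lets the actual figure follow the content; with -cbr it holds the stream at the requested rate.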


Reply #11
Quote
This test looks a little far-fetched. The clips seem to be quite short and you force the encoder into LC mode for a bitrate where HE/SBR would be the natural choice. Both factors can skew the results and don't seem necessary to make a point.


The clips are not much shorter than those used in listening tests. Using HEv1 gives a mean bitrate of 52596 bit/s.

Quote
You are still refraining to explain why all this should be important, even if an encoder had a slight bias in one or the other direction.


I never claimed this is important. But do please note that BITRATE is rather a bitstream parameter (not an encoder parameter), so the words "output smaller files than your competitor at any given bitrate" sound funny.

Though you believe it's not important, people often compare codecs operating at the same bitrate, and I noticed a possible bias.

Quote
Strict CBR is the worst mode of operation one can imagine ....


It seems like you have a personal enmity towards CBR. If CBR mode were that bad, the CT encoder would probably have received bad scores in listening tests.

Quote
sit down with a set of killer samples and try to ABX them


Thanks. Would you mind if I hang up for a while?

Reply #12
Quote
But do please note that BITRATE is rather a bitstream parameter (not an encoder parameter), so the words "output smaller files than your competitor at any given bitrate" sound funny.


I'm glad that I could make you laugh!  But bitrate is all of these:
  • A parameter in each MP3 frame header.
  • A property of the actual bitstream (the actual bits per second over an interval i <= stream length).
  • An encoder parameter (the target bitrate).

1 and 2 are not identical, not even if you look at single frames. 3 is also distinct, since the encoder has to actually hit the set bitrate when allocating bytes, which is not trivial. Very slight over- and undershooting may apply.
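The gap between meanings 1 and 2 is easy to make concrete: the actual stream bitrate is just payload bits over duration, and it need not equal what the headers declare. The numbers below are made up for illustration:

```shell
# Actual average bitrate of a stream (meaning 2 above): payload bits / duration.
# Made-up example: a nominal "CBR 128" file whose measured rate differs
# slightly from the header-declared 128000 bit/s.
stream_bitrate() {  # args: payload size in bytes, duration in seconds
  awk -v bytes="$1" -v secs="$2" 'BEGIN { printf "%d", bytes * 8 / secs }'
}
declared=128000
measured=$(stream_bitrate 3225600 200)
echo "$declared vs $measured"   # 128000 vs 129024 -- close, but not identical
```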

Quote
Though you believe it's not important, people often compare codecs operating at the same bitrate, and I noticed a possible bias.


People also often color the edges of their CDs with a pen to improve sound quality. I also believe that's not important.

Quote
Strict CBR is the worst mode of operation one can imagine ....


Quote
It seems like you have a personal enmity towards CBR.


I just stated a fact, where did you see personal enmity?

 


Reply #13
Quote
Strict CBR is the worst mode of operation one can imagine ....

Quote
It seems like you have a personal enmity towards CBR.

Quote
I just stated a fact, where did you see personal enmity?


It's not a fact at all. While CBR might not give enough bits to some parts of a song, which could hurt quality, VBR can just as easily give too few bits and hurt quality compared to CBR that way. Doing VBR is complicated and, like most psychoacoustics-related things, can easily be done wrong or fail on certain input data; CBR does not have this problem. So CBR is not the worst mode imaginable: badly implemented VBR and ABR can be worse in general, and even well-implemented VBR and ABR can do worse on specific input.


Reply #14
I did not say the worst implementation imaginable, but the worst mode. Any implementation can be done in a sub-optimal way. When looking at two comparably well-done implementations, you'll obviously get more quality per byte if you let the bitrate fluctuate (ABR, VBR) instead of letting the quality fluctuate (CBR), or not?

Your own product defaults to VBR and setting the bitrate with "-br" (not "-abr") defaults to ABR, only "-cbr" sets strict CBR mode.

If CBR has so many advantages in your opinion, why isn't it the default?


Reply #15
Quote
It's not a fact at all. While CBR might not give enough bits to some parts of a song which could hurt quality, VBR can easily give too little bits and hurt quality compared to CBR that way.

That may be why Apple constrained its VBR algorithm.


Reply #16
Quote
That may be why Apple constrained its VBR algorithm.


From Apple's developer documentation:

Quote
Variable Bit Rate (VBR) kAudioCodecBitRateControlMode_Variable - Recommended for controlling audio quality.

The audio signal is encoded with constant (and settable) quality and virtually no bit rate constraints. This is the best mode to achieve consistent audio quality across many files and the smallest file size to achieve that quality. It also has the lowest complexity of all the encoding modes.

Variable Bit Rate But Constrained (VBR Constrained) kAudioCodecBitRateControlMode_VariableConstrained - Recommended as a compromise between VBR and ABR.

This mode is similar to VBR but limits the average bit rate variation. The lower limit is the user-selected bit rate. Higher bit rate is adapted for difficult tracks and can generate larger files than the ABR mode.


iTunes' restriction to constrained VBR probably has mainly economic reasons, i.e. they don't want to waste support costs on people asking why the highest-quality VBR setting can sometimes result in <92 kbit/s files (I have some bandwidth-limited recordings from the '50s that scale that low).

Anybody who selects 192 kbit/s VBR in iTunes will get a bitrate close to or above 192 kbit/s for that file. The unconstrained VBR setting, accessible through QuickTime, CoreAudio, or afconvert, will deliver 192 kbit/s on average over a whole collection at the highest setting of q127.
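That unconstrained setting can be reached from the command line with afconvert (a sketch: the -s 3 strategy number and the vbrq property follow afconvert's usage text on macOS; verify with `afconvert -h` on your machine, and the filename is a placeholder):

```shell
# True-VBR AAC via CoreAudio's afconvert (macOS only; check `afconvert -h`).
# -f m4af: MPEG-4 container   -d aac: AAC-LC   -s 3: "true VBR" strategy
# -u vbrq N: quality index 0..127, where 127 is the highest
encode_tvbr() {
  afconvert -f m4af -d aac -s 3 -u vbrq "${2:-127}" "$1" "${1%.wav}.m4a"
}
# encode_tvbr track.wav      # q127 by default
```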


Reply #17
Quote
I did not say the worst implementation imaginable, but the worst mode. Any implementation can be done in a sub-optimal way. When looking at two comparably well-done implementations, you'll obviously get more quality per byte if you let the bitrate fluctuate (ABR, VBR) instead of letting the quality fluctuate (CBR), or not?


Sure, that's a fact. But how do you know whether an implementation is well done? That needs to be objectively measured somehow (TOS #8). So, just because of your theoretical fact, you can't advise someone not to use CBR.

Quote
Your own product defaults to VBR and setting the bitrate with "-br" (not "-abr") defaults to ABR, only "-cbr" sets strict CBR mode.

If CBR has so many advantages in your opinion, why isn't it the default?


Because we believe that our VBR implementation is very good.


Reply #18
Quote
Sure, that's a fact. But how do you know whether an implementation is well done? That needs to be objectively measured somehow (TOS #8). So, just because of your theoretical fact, you can't advise someone not to use CBR.


Sorry, I consider your argument much more theoretical than mine. For all major MP3 & AAC codecs (I have only tested LAME, QT, and Nero; FhG anyone?), VBR is much better at identical bitrates for most if not all content. Hasn't that been common sense for a long time? Just take the following quick sample:

EBU SQAM 17a:

Original: [attachment=5172:tec_sqam...lossless.flac]
LAME 3.98.2, CBR 96 kbps, q2: [attachment=5174:tec_sqam...6kbps_q2.flac]Sounds like shit, heavy tremolo over the whole range.

LAME 3.98.2, New VBR (6.9) 96.46 kbps, q2: [attachment=5173:tec_sqam...R_6.9_q2.flac]Excellent quality, almost transparent!

Both samples are time-synchronized; the VBR and CBR encoders produced different delays. ABX logs should be expendable (I have them); both can be distinguished without much effort. It's just laughably easy for the CBR encoding anywhere in the file, but I had to listen carefully at the key-release click positions to ABX the VBR encoding.

Here are the original MP3s (including delay):
[attachment=5175:tec_sqam...6kbps_q2.mp3][attachment=5176:tec_sqam...R_6.9_q2.mp3]

I couldn't adjust the VBR encoder to output exactly 96 kbps; the only other option would have been 86 kbps (7.0).


Reply #19
Yes, VBR has been recommended for a long time, but mostly based on the theoretical quality advantage in the case of an optimal implementation, as far as I can see. I've never noticed any carefully performed comparisons on this; I'd love to see them, though. I don't think the sample you used is very good for this: it's 11 seconds long, with almost 4 seconds of silence. VBR will be able to add around 30% of bitrate to the non-silent parts, which will usually not happen in real-life samples.


Reply #20
It's funny, because I was just thinking about this the other day. CBR has been the red-headed stepchild for so long. Maybe it's time to give it some respect.

And maybe this thread should be split, since it's gotten way off track.


Reply #21
Give it respect because of the suggestion* that some VBR implementations might not be very good?

That's nonsense!

(*) coming from a person of authority who recommends VBR, no less.

If someone is willing to demonstrate the superiority of CBR over VBR at similar bitrates with everyday samples, I'll split the thread; otherwise I think this senseless quibbling is probably best kept here.

Maybe it's best that I just close this thread?


Reply #22
Whoa, Nelly!  I just said maybe give it some respect.

And if you closed threads for "senseless quibbling" there'd be no forum.