WOW, Monkey's Audio is still the best, Ya better believe it
Fandango
post Sep 24 2006, 16:18
Post #51





Group: Members
Posts: 1546
Joined: 13-August 03
Member No.: 8353



QUOTE (guruboolez @ Sep 24 2006, 16:45) *
EDIT3: fandango> I was completely ignorant about this interesting component smile.gif I had to recover 260 GB of lossless music this summer due to HDD corruption; I was able to test the recovered FLAC files with the VUplayer app, but I'm still looking for a way to check the integrity of the WavPack files. I dreamt about such a foobar2000 component. I will look for it. Thanks!!!


I think it only appeared recently on the official components page. It might have been around for some time, but I only discovered it when I updated fb2k. Btw, I posted a bug report on the fb2k forum about this matter, so hopefully this will no longer be a problem in fb2k v0.9.4.1 or (more likely, I tend to agree with haregoo) in the next version of the integrity checker.
greynol
post Sep 24 2006, 19:16
Post #52





Group: Super Moderator
Posts: 10000
Joined: 1-April 04
From: San Francisco
Member No.: 13167



I tried the MAC sample and verified the results.

I encoded the sample using the extra high compression setting, altered a few bytes and was able to get only a small portion of audio to decode.

I also encoded the sample using the high compression setting, altered a few bytes and was able to get quite a bit more audio to decode.

To me this suggests that the amount of recoverable data is dependent on the compression used.
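
For anyone who wants to reproduce this kind of test, a minimal sketch of the procedure (a Python sketch; the filename, offset and byte count are arbitrary placeholders, not the exact values I used):

CODE
import shutil

# Corrupt a copy of an encoded file, then feed the copy to the
# decoder and see how much audio survives past the damaged region.
SRC, DST = "sample.ape", "sample_corrupt.ape"
OFFSET, COUNT = 500_000, 16        # arbitrary spot in the audio stream

shutil.copyfile(SRC, DST)
with open(DST, "r+b") as f:
    f.seek(OFFSET)
    f.write(bytes(COUNT))          # overwrite COUNT bytes with zeros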


--------------------
Your eyes cannot hear.
TBeck
post Sep 24 2006, 19:31
Post #53


TAK Developer


Group: Developer
Posts: 1095
Joined: 1-April 06
Member No.: 29051



QUOTE (greynol @ Sep 24 2006, 20:16) *
To me this suggests that the amount of recoverable data is dependent on the compression used.

Most encoders partition the whole audio data into smaller frames. If the frames are independent of each other, an error within one frame will in the worst case make the samples of that frame undecodable.

If I remember right, at least earlier versions of Monkey's (I didn't look at newer versions) were using increasingly bigger frames for the stronger presets, which would explain your findings.

Furthermore, even the faster presets of Monkey's seem to use considerably bigger frames than, for instance, FLAC (default: 4608 samples = 104 ms at a 44.1 kHz sampling rate). Therefore it isn't surprising that Monkey's will lose far more samples than FLAC if a frame is damaged.
pest
post Sep 24 2006, 19:47
Post #54





Group: Members
Posts: 208
Joined: 12-March 04
From: Germany
Member No.: 12686



The reason why Monkey's uses large frames (up to 4 s at 44.1 kHz) lies in its architecture.
OptimFROG suffers from the same problem: the adaptive predictors need some data to catch up...
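
To put those frame sizes in perspective, a quick back-of-the-envelope calculation (a Python sketch; CD-rate audio assumed):

CODE
RATE = 44100                  # samples per second (CD audio)

flac_frame = 4608 / RATE      # FLAC's default frame: ~0.104 s
mac_frame = 4.0               # Monkey's frames: reportedly up to ~4 s

print(f"FLAC frame:    {flac_frame * 1000:.0f} ms")
print(f"Monkey frame:  {mac_frame:.0f} s "
      f"(~{mac_frame / flac_frame:.0f}x more audio at risk per damaged frame)")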
guruboolez
post Sep 24 2006, 19:53
Post #55





Group: Members (Donating)
Posts: 3474
Joined: 7-November 01
From: Strasbourg (France)
Member No.: 420



QUOTE (greynol @ Sep 24 2006, 20:16) *
To me this suggests that the amount of recoverable data is dependent on the compression used.

Yeah... It was already said in a previous post rolleyes.gif Bullshit, isn't it?
Leo 69
post Sep 24 2006, 20:01
Post #56





Group: Members
Posts: 121
Joined: 16-May 04
From: UK - Russia
Member No.: 14117



guruboolez, how would your experiment with manually corrupting the files apply to normal conditions (without modifying them intentionally)? What conditions are necessary to make an .ape file non-playable? I've never seen anyone reporting such problems in real life. Any links?

Thank you

This post has been edited by Leo 69: Sep 24 2006, 20:03
guruboolez
post Sep 24 2006, 20:16
Post #57





Group: Members (Donating)
Posts: 3474
Joined: 7-November 01
From: Strasbourg (France)
Member No.: 420



QUOTE (Leo 69 @ Sep 24 2006, 21:01) *
guruboolez, how would your experiment with manually corrupting the files apply to normal conditions (without modifying them intentionally)? What conditions are necessary to make an .ape file non-playable? I've never seen anyone reporting such problems in real life. Any links?

Thank you

First of all, don't expect me (or anyone sane) to take a hammer and partially destroy my hard disks in order to get corrupted files corresponding to "real life" problems. Most of us have no choice but to simulate "real life" with artificial conditions - and their validity is in essence questionable. I don't have any answer to your question. It's still better than nothing, I believe, and I'm not the first one to use such an artificial way of corruption to test the error handling of various audio formats. This experiment still teaches us valid things (like the average amount of lost data, or the way different tools handle the error).

About corruptions, there were several reports in the past from HA.org users (especially with Monkey's Audio, apparently more sensitive to hardware issues). You can search for them. And as I said in a previous post, I myself had two cases of corruption: in one I was unable to recover anything occurring after the corrupted part; in the second, the loss was limited to a small fragment. Our simulation is apparently showing the same phenomenon: sometimes recovery is possible (see greynol's experience); sometimes it isn't (see mine).

Note: corruptions are more likely to happen on optical media such as CD-R and DVD-R. I have a lot of .ape files burned to CD-R. Once they get corrupted, I will have real-life-compliant test files - but I'm in no hurry for that wink.gif

This post has been edited by guruboolez: Sep 24 2006, 20:19
spoon
post Sep 24 2006, 20:29
Post #58


dBpowerAMP developer


Group: Developer (Donating)
Posts: 2725
Joined: 24-March 02
Member No.: 1615



In real life, if you get a bad hard drive you will lose more than a few bytes - whatever the sector size is - so perhaps multiples of 4 KB to 32 KB of missing / corrupted data. It could be that all lossless encoders would barf on such a large corruption; not sure.

>(especially with Monkey's Audio, apparently more sensitive to hardware issues).

Which is why, when encoding to lossless, you should always verify the written file against the original source MD5; this not only checks the encoder but also your hard disk.
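
As an illustration, a minimal sketch of such a verification (Python with the standard wave module; the filenames are placeholders, and the hashes cover the raw PCM frames so differing headers don't interfere):

CODE
import hashlib, wave

def pcm_md5(path):
    # MD5 over the raw PCM frames of a WAV file (header excluded).
    with wave.open(path, "rb") as w:
        return hashlib.md5(w.readframes(w.getnframes())).hexdigest()

# "decoded.wav" is the freshly written lossless file decoded back to WAV.
if pcm_md5("original.wav") == pcm_md5("decoded.wav"):
    print("OK: encode round-trips to bit-identical PCM")
else:
    print("MISMATCH: encoder or disk problem")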


--------------------
Spoon http://www.dbpoweramp.com
guruboolez
post Sep 24 2006, 20:37
Post #59





Group: Members (Donating)
Posts: 3474
Joined: 7-November 01
From: Strasbourg (France)
Member No.: 420



QUOTE (spoon @ Sep 24 2006, 21:29) *
In real life, if you get a bad hard drive you will lose more than a few bytes - whatever the sector size is

I know... I lost 20 full albums this summer due to hardware corruption on a *NEW* hard drive crying.gif

On optical media I suppose the corruption could be much smaller, when the media just starts to become unreadable in some places.
Leo 69
post Sep 24 2006, 21:38
Post #60





Group: Members
Posts: 121
Joined: 16-May 04
From: UK - Russia
Member No.: 14117



Well, the reasons for avoiding Monkey's Audio over corruption due to losing a few bytes seem pretty vague to me, since this is an extremely rare and irreproducible case. I'm sure that if you have reliable media and a robust burner there's nothing to worry about, let alone if you keep the files on a hard disk, which I consider the safest storage medium of all.
guruboolez
post Sep 24 2006, 21:49
Post #61





Group: Members (Donating)
Posts: 3474
Joined: 7-November 01
From: Strasbourg (France)
Member No.: 420



QUOTE (Leo 69 @ Sep 24 2006, 22:38) *
Well, the reasons for avoiding Monkey's Audio over corruption due to losing a few bytes seem pretty vague to me

I agree (EDIT: more or less, though. The ability to keep playing music after a stream problem is important, and that possibility is implemented in all efficient A/V containers for good reasons). But did anyone talk about leaving this format for that reason? In this topic? I don't think so. BTW, the situation isn't really different for the ratio argument: who would seriously choose a format for the sole reason that it saves 1.5% of disk space? Isn't that a bit ridiculous? Could a few kbps pertinently make MAC "the best" lossless format (to quote the title of this topic), or wouldn't it be better to take several more factors into account (such as error handling in rare situations, plus features like seeking speed, hardware support, development, impact on decoding speed, etc.)?


2nd EDIT after bryant's message: minor clarification.

This post has been edited by guruboolez: Sep 24 2006, 23:05
bryant
post Sep 24 2006, 22:58
Post #62


WavPack Developer


Group: Developer (Donating)
Posts: 1287
Joined: 3-January 02
From: San Francisco CA
Member No.: 900



There are a few things I'd like to add to this topic. First, thanks to Guru for doing these tests. I suspect that the authors of the various encoders have also done tests like these, but it's nice to have someone outside who doesn't already know the best (and worst) places to corrupt the file.

This brings me to a caveat, and that is that there is a certain amount of luck to this procedure. It's kind of like poking someone with a stick; most of the time you'd get them in the leg or the arm, but if you were really lucky you would get them in the eye and cause real trouble. Guru sent me a file some time ago that had the first block corrupt. It played fine (minus the first block), but because I had the overall length encoded there, and didn't properly handle the case of it being missing, I reported an absurd runtime. I recently got a WavPack file with a corruption that causes a GPF.

The only way a test like this could be truly fair is to introduce hundreds of random errors and see what percentage cause various levels of damage. Obviously a decoder should be written in such a way that no error can cause the whole file to be corrupt, but there can always be an unforeseen problem that trips the decoder up so badly (like my GPF above) that continuing is impossible. I agree with greynol that this entry in the wiki should probably not be a binary option.

As for WavPack, I am actually in the process of improving the robustness of the decoder, which is what has delayed version 4.4 somewhat. The decoding of hybrid lossless files is currently pretty fragile, and there are some problems with regular files like what Guru and I found.

Also, WavPack currently has a CRC in each block for the decoded audio data, but I have been considering adding a CRC to cover the entire block. This would improve the robustness because I could ignore bad blocks straight away rather than try to parse through them and possibly get tripped up, and it would also allow a "quick verify" option that Guru requested long ago.

Finally, in fairness to Matt, I believe that the Monkey's Audio format (and the decoder) were designed long before this hysteria with "error robustness" started. The WavPack format back then would not tolerate a single bit error, and would in fact sometimes play full volume white noise until the end of the track! This kind of back and forth competition and learning is why the lossless encoders are as good as they are today, and why we're not all still using Shorten... smile.gif
pest
post Sep 24 2006, 23:28
Post #63





Group: Members
Posts: 208
Joined: 12-March 04
From: Germany
Member No.: 12686



QUOTE
Also, WavPack currently has a CRC in each block for the decoded audio data, but I have been considering adding a CRC to cover the entire block.


In my opinion the following is optimal: a CRC32 for every entire block to check for I/O errors,
and a forced MD5 on the whole file to evaluate possible decoding errors.
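
A minimal sketch of that scheme (Python; fixed-size blocks assumed - in a real format the per-block CRCs would live in the block headers, and the MD5 would cover the decoded audio rather than the raw stream):

CODE
import hashlib, zlib

BLOCK = 65536                 # arbitrary block size for the sketch

def checksums(path):
    md5 = hashlib.md5()
    block_crcs = []
    with open(path, "rb") as f:
        while chunk := f.read(BLOCK):
            block_crcs.append(zlib.crc32(chunk))  # per block: catches I/O errors
            md5.update(chunk)                     # whole file: catches decode errors
    return block_crcs, md5.hexdigest()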
TBeck
post Sep 24 2006, 23:35
Post #64


TAK Developer


Group: Developer
Posts: 1095
Joined: 1-April 06
Member No.: 29051



QUOTE (bryant @ Sep 24 2006, 23:58) *
This brings me to a caveat, and that is that there is a certain amount of luck to this procedure. It's kind of like poking someone with a stick; most of the time you'd get them in the leg or the arm, but if you were really lucky you would get them in the eye and cause real trouble. Guru sent me a file some time ago
...

I totally agree. While, for example, the (limited) error-robustness test for Yalac could not get it into trouble, I later found some cases which the decoder could not handle. If someone had damaged the right part...

QUOTE (bryant @ Sep 24 2006, 23:58) *
The only way a test like this could be truly fair is to introduce hundreds of random errors and see what percentage cause various levels of damage. Obviously a decoder should be written in such a way that no
..

*Promotion on* If someone would like to try my Damage tool for this, please tell me... *Promotion off*

QUOTE (bryant @ Sep 24 2006, 23:58) *
Finally, in fairness to Matt, I believe that the Monkey's Audio format (and the decoder) were designed long before this hysteria with "error robustness" started. The WavPack format back then would not tolerate a single bit error, and would in fact sometimes play full volume white noise until the end of the track! This kind of back and forth competition and learning is why the lossless encoders are as good as they are today, and why we're not all still using Shorten... smile.gif

To be honest, without the requests from the Hydrogenaudio members I myself would have paid very little attention to error robustness. And yes, the standards are much higher today - who would have known earlier? In 1997 (when I started my work on audio compression) I never would have used CRCs, because my i486-25 MHz was simply too slow...
adlai
post Sep 25 2006, 01:18
Post #65





Group: Members
Posts: 317
Joined: 29-November 03
Member No.: 10090



You know, now that you speak of it...


Last year, when 3.97 alpha was made the recommended version, I decided to do a complete reencode. At the time I had about 120 DVD-Rs full of .APE files.

I ran into about 5 DVD-Rs that were corrupted in some manner. I lost the audio data on them.

Now, in some cases there were actually holes in the metal layer, i.e. a manufacturing error; in other cases it was, I think, from scratches on the disc.

I'm not sure if FLAC could have done better, but in most cases I'd wager that a simple disc doctor could have solved it.
CyberFoxx
post Sep 25 2006, 02:31
Post #66





Group: Members
Posts: 62
Joined: 15-August 02
Member No.: 3062



Just thought I'd toss in my $0.02CDN.

Monkey's Audio, as a format, is great. Great compression, decent playback speed, etc.
Monkey's Audio, as a supported format, IMHO isn't so great. I would gladly convert all my FLACs to APEs, but not many, if any, media players for Linux actually support Monkey's Audio.

Then again, WavPack and OptimFROG suffer from the same problem. It just seems that OSS loves Ogg Vorbis and FLAC for some reason...


--------------------
"It's the panties fault! The panties made me a pervert!"
jcoalson
post Sep 25 2006, 06:55
Post #67


FLAC Developer


Group: Developer
Posts: 1526
Joined: 27-February 02
Member No.: 1408



one more note is that a thorough test should also include deletions, including big ones, like when you lose a bunch of sectors on a disk. a decoder might survive errors contained within a frame but still fail on deletions because of inability to resync.
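
for illustration, a sketch of such a deletion (python; filenames, offset and length are arbitrary - 4096 bytes mimics a single lost sector):

CODE
def delete_span(src, dst, offset, length=4096):
    # cut a run of bytes out of the middle, as if a span of sectors
    # were lost; a robust decoder must resync on the next frame
    # header after the gap
    with open(src, "rb") as f:
        data = f.read()
    with open(dst, "wb") as f:
        f.write(data[:offset] + data[offset + length:])

delete_span("sample.flac", "sample_cut.flac", 300_000)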

tbeck has a tool called damage that would be good for automating some of this.

this thread and the other one linked should be a footnote for the robustness part of the wiki table.

Josh
greynol
post Sep 25 2006, 07:06
Post #68





Group: Super Moderator
Posts: 10000
Joined: 1-April 04
From: San Francisco
Member No.: 13167



QUOTE (guruboolez @ Sep 24 2006, 11:53) *
rolleyes.gif Bullshit, isn't it?
Touché? biggrin.gif

So considering that MAC beats out flac when it comes to compression by well over 1.5% (try 5+% for my eclectic collection), how much data is lost when it comes to files with comparable compression?

I do appreciate your demonstration of one of MAC's shortcomings. Will you be doing any tests comparing apples to apples when it comes to data corruption? I mean you did prove a valid point, but I don't think it should serve as a basis for making truly comparative claims (not that I'm accusing you of making any such claims) regarding error tolerance.

flac has many, many virtues. One worth mentioning here is that it keeps an md5sum handy for the raw PCM data. This has enabled me to verify all of the files I've transcoded to Monkey's Audio using the high profile. I have converted well over 800 titles, and for each 100 that I transcode, 2 GB of drive space is freed up. Maybe this isn't worthwhile to some, but it has been for me.

Bryant, I'm glad that you commented about the wiki. I made no mention of your codec earlier because I'm only vaguely familiar with it, but I have been very impressed by what little exposure I have had.

EDIT: Josh, thanks for taking the time to comment also. You wrote it while I was drafting this.

A specific explanation and some sort of graded scale for the error entry in the wiki chart would hopefully make it more clear. I know that it has been confusing for me! I also think it is fair to give MAC a lower score than flac or include some sort of warning regarding what profile is used when it comes to this entry.

This has been very enlightening for me and I thank all of you for helping me better understand this issue that I wouldn't let go.

This post has been edited by greynol: Sep 25 2006, 07:21


--------------------
Your eyes cannot hear.
spoon
post Sep 25 2006, 08:16
Post #69


dBpowerAMP developer


Group: Developer (Donating)
Posts: 2725
Joined: 24-March 02
Member No.: 1615



>One worth mentioning here is that it keeps a md5sum handy for the raw pcm data

Monkey's Audio after 3.99 has an MD5 (but it is up to the application to check it manually, unlike FLAC, which can check automatically at the end if asked).


--------------------
Spoon http://www.dbpoweramp.com
guruboolez
post Sep 25 2006, 17:02
Post #70





Group: Members (Donating)
Posts: 3474
Joined: 7-November 01
From: Strasbourg (France)
Member No.: 420



QUOTE (greynol @ Sep 25 2006, 08:06) *
I do appreciate your demonstration of one of MAC's shortcomings. Will you be doing any tests comparing apples to apples when it comes to data corruption? I mean you did prove a valid point, but I don't think it should serve as a basis for making truly comparative claims (not that I'm accusing you of making any such claims) regarding error tolerance.

The test I did was more an example than a real, complete, or even simply valid test. It's not meant to support any definite conclusions, but rather to illustrate easily that Monkey's error handling can be more annoying than WavPack's or FLAC's. A very basic approach (corrupting 16 consecutive bytes) is apparently enough to provisionally refute some of your past claims. I'm of course open to further experiments, even (and especially) if they invalidate my own results. I'm not defending any particular position. My recent experience simply confirms what several people have noticed and published on the board over the last years.

Performing a full test would be much more time consuming: it should IMO be based on several samples of different lengths, different degrees of corruption, different kinds of corruption (example: destroying the header instead of a part of the audio stream), etc... TBeck's tool may help anyone interested in taking on this big task (I haven't tried it yet).
greynol
post Sep 25 2006, 18:00
Post #71





Group: Super Moderator
Posts: 10000
Joined: 1-April 04
From: San Francisco
Member No.: 13167



Yes, I made a blanket statement that MAC could be decoded despite errors, as did rjamorim that it couldn't.

My point, however, is that at comparable levels of compression MAC doesn't fail like your example shows.


--------------------
Your eyes cannot hear.
guruboolez
post Sep 25 2006, 18:52
Post #72





Group: Members (Donating)
Posts: 3474
Joined: 7-November 01
From: Strasbourg (France)
Member No.: 420



QUOTE (greynol @ Sep 25 2006, 19:00) *
My point, however, is that at comparable levels of compression MAC doesn't fail like your example shows.

I used the most powerful settings for WavPack (-hx6), FLAC (-8), flake (-12) and Monkey's (-c5000), so there is at least some coherency in what I tested.

Don't forget that there's no comparable level of compression between FLAC and Monkey's Audio, because the latter compresses better than FLAC even with -c1000. Moreover, the comparison could be made on other points, like decoding speed (there, MAC's fastest setting is much slower than FLAC and WavPack). Therefore, any comparison would be partial.
greynol
post Sep 25 2006, 19:15
Post #73





Group: Super Moderator
Posts: 10000
Joined: 1-April 04
From: San Francisco
Member No.: 13167



QUOTE
Don't forget that there's no comparable level of compression between FLAC and Monkey's Audio, because the latter compresses better than FLAC even with -c1000.
...both in compression level and encoding speed (when shooting for similar file sizes), though you still lose quite a bit more data upon corruption with MAC, even at -c1000.

So it's apples and oranges no matter which way you look at it.

I guess it gets back to how one weights features when choosing a lossless codec.


--------------------
Your eyes cannot hear.
guruboolez
post Sep 25 2006, 19:22
Post #74





Group: Members (Donating)
Posts: 3474
Joined: 7-November 01
From: Strasbourg (France)
Member No.: 420



QUOTE (greynol @ Sep 25 2006, 20:15) *
I guess it gets back to how one weights features when choosing a lossless codec.

Always smile.gif
That's why this topic ("the best") didn't start very well.
greynol
post Sep 25 2006, 19:26
Post #75





Group: Super Moderator
Posts: 10000
Joined: 1-April 04
From: San Francisco
Member No.: 13167



I remember looking at it when it first appeared and thinking, "Oh no, here we go again!"

laugh.gif

EDIT: I must admit my conspiracy-mindedness did factor into my initial reaction to the topic. wink.gif

This post has been edited by greynol: Sep 25 2006, 19:30


--------------------
Your eyes cannot hear.
