Topic: Andre's EAC Offset Calculation (Read 98808 times)

Andre's EAC Offset Calculation

Reply #75
So, who is thinking of adopting this new reference?

I think that I will: assuming that my understanding of earlier posts is correct, I will be changing the offset for my Lite-On LTR-52327S from +6 to -24. Am I correct to assume that the write offset will also require changing - in this case, from -6 to +24?
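If I have followed the thread correctly, the conversion between references is simple arithmetic: the new reference shifts every drive's read offset by -30 samples relative to Andre's, and the write offset correction is the negation of the read offset for the same drive. A rough sketch (the -30 shift and the sign convention are my reading of the earlier posts, not an authoritative statement):

```python
# New reference relative to Andre's reference, in samples.
# This value is an assumption taken from the thread's discussion.
REFERENCE_SHIFT = -30

def new_offsets(old_read_offset):
    """Given a read offset correction under Andre's reference, return the
    (read, write) offset corrections under the new reference."""
    new_read = old_read_offset + REFERENCE_SHIFT
    new_write = -new_read  # write correction is the negated read correction
    return new_read, new_write

print(new_offsets(+6))  # the Lite-On LTR-52327S example: (-24, 24)
```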


Reply #76
[...] I can just confirm that his method is right, while Andre Wiethoff's one was just an approximation.

Hi Pio2001

I have been thinking about this for some time now, and I must confess that, based on your quoted comment above, it seems I do not understand this subject at all.

Could you please tell me how Andre's results could ever be just an approximation? My understanding was that, since Andre looked at discs with non-null samples (i.e. background noise) right up to the very last sample before the lead-out began, he actually had the ability to calculate the precise CD offset values of the discs he tested, and not just an approximation of them.

To me, the new reference result and Andre's result simply witness that two different CD offsets were used in the mastering. In fact, Andre may also have found the new CD offset we are discussing here when he did his tests: since he settled on the one specific value that repeated itself most often (6 times out of all the discs tested), he could, for example, have found the new reference 2 times and the established reference 6 times. As you know, Andre said that he found many different CD offsets on the tested discs, but simply selected the most frequently occurring one.

So Pio2001, I would be very happy if you would enlighten me on these thoughts of mine, please.

Thanks in advance.

@dv1989: Yes, the write offset will of course also be affected when changing from one reference to the other.

CU, Martin.



Reply #78
Martin, I agree; any more light shed by such knowledgeable people as Pio2001 is always welcome! And thanks for the clarification with regard to the write offset (not that I ever use it, but let's be consistent!)


Reply #79
I never understood this obsession with having a "bit perfect" audio CD copy.
Same here.


I agree. I do want the audio part that I can hear to be bit-perfect. With FLAC files of each track, I can then recreate a copy of the original CD that for all practical intents will be the same as the original. I can put it in a CD player and it will play each track as it did from the original CD. Maybe it would not hit freecddb as it used to, but I don't really care. After all, the FLACs are just backup copies to be used if some of my CDs break.

So why should I care about bit-perfect copies?


Reply #80
Quote
So why should I care about bit-perfect copies?

Some ppl are perfectionists and others aren't. That's it.

@dv1989: Cheers mate


Reply #81
This new reference isn't going to give you copies that are any more bit-perfect than Andre's reference, except for discs that begin with 30 non-null audio samples, which are becoming increasingly rare.

edited my bad grammar (and spelling, sheesh!)


Reply #82
As if that is audible anyway.

HA upholds the audible-difference standard (re: TOS #8), explicitly demanding that all posters prove an audible difference using ABX... and all this high standard gets thrown out of the window when you're ripping.

Let me think of a proper term... not audiophile... bitperfectionophile



Reply #84
So, who is thinking of adopting this new reference?

Not me. The AccurateRip database is too important a reference IMO, and I agree with Spoon's decision not to change it. It's not worth it, for all the reasons cited already.

The ability to ABX a track that is offset by 30 samples can be easier than you think.

ABX requires time aligned samples. 

I know, I know. The purpose of the test would be to see if you can tell whether or not your music is offset by 30 samples. Pointless.
daefeatures.co.uk


Reply #85
I know, I know. The purpose of the test would be to see if you can tell whether or not your music is offset by 30 samples. Pointless.

...as opposed to 29 or 31?

How about a blind test to determine the exact size of the offset between two samples?

Yeah, it's absolutely pointless, but that's not what I meant originally. 


Reply #86
But what if those 30 samples are null? Can you ABX the extra silence?


Reply #87
Quote

So why should I care about bit-perfect copies?

Some ppl are perfectionists and others aren't. That's it.


Ok, I can buy that. Perfectly acceptable reason in my book. And thank you for your answer, because you confirmed to me that I don't care.

Now, please go back to your regular bit copy talk and I will shut up.


Reply #88
This is beginning to remind me of Aquarium's posts in that thread at Digital-Inn . . . and that's not a good thing!


Reply #89
In case that was directed at me, here are my ABX results:

foo_abx 1.3.1 report
foobar2000 v0.9.4.2
2006/12/11 18:40:05

File A: C:\Untitled (1).mp3
File B: C:\Untitled (2).mp3

18:40:05 : Test started.
18:40:41 : 01/01  50.0%
18:40:45 : 02/02  25.0%
18:40:50 : 03/03  12.5%
18:40:53 : 04/04  6.3%
18:40:57 : 05/05  3.1%
18:40:59 : 06/06  1.6%
18:41:01 : 07/07  0.8%
18:41:14 : 08/08  0.4%
18:41:21 : 09/09  0.2%
18:41:26 : 10/10  0.1%
18:41:33 : 11/11  0.0%
18:41:43 : 12/12  0.0%
18:41:58 : Test finished.

----------
Total: 12/12 (0.0%)

I will gladly upload my samples if anyone is interested.



PS:  I can repeat the test for the original lossless files and upload them instead if you think this will make a difference.

 


Reply #90
What did you test though? 30 samples of null or a song offset by 30 samples?



Reply #92
You do realise that the audiophile crowd (those who can tell the difference between WAV and lossless compression, i.e. numpties) are going to see this ABX test as being quite significant.
daefeatures.co.uk


Reply #93
So, who is thinking of adopting this new reference?

Not at all; it's futile.
First, this is not a standard (to me at least); it's just an interesting result.
Second, I'm still sceptical: this result might be (Plextor) hardware-specific (although that might not be very important).

Also, as long as the offset of your drive is not exactly zero, you will still lose a few samples (at the beginning or the end).
And with write offset correction you burn the samples in the right spot, even with the current EAC read offset correction.

Oh, and 30 samples is not the thing to worry about; getting similar results with different drives, that is the point.
In theory, there is no difference between theory and practice. In practice there is.


Reply #94
Personally I will continue to use Andre's reference as well. I am also not obsessed with "getting the perfect rip", but I just think the issue is interesting to discuss.

Another thing I would like to say is that I think we need to begin using a consistent terminology for the word bit-perfect. To me, bit-perfect means that all the bits of the original CD-DA are copied perfectly (but only the main-channel data, not the subchannel data, the lead-in (with the TOC), the lead-out, or the parity bytes of the CIRC system). Many people use the term to mean that just all the audio bits are copied perfectly, but IMHO that is not bit-perfect. Since all bits need to be copied perfectly, we cannot take just the audio bits into account, but also the null samples before and after the first and last track. This is the reason I often say it isn't possible to make bit-perfect rips, since CDs themselves have their own built-in offsets too. Sure, all the audio bits can be copied without a problem, but to me bit-perfect should also stand for getting precisely the number of mastered null samples before/after the first/last track of the album copied perfectly. Please understand that this isn't something I am personally concerned about doing (I'm happy if just the audio bits have been copied error-free), but this is what the term bit-perfect means to me.

CU, Martin.


Reply #95
All of this reminds me of something I wanted to try a long time ago. I was always annoyed when people used the wrong offset, so I got the idea that, if the samples at the beginning and end were null, couldn't I just delete the extra bytes at the beginning and add the same number of null samples to the end to fix the offset?

So here's what I did,

I ripped the same track with both Andre's offset and the new one.
Andre's (old) = +97 (CRC=B8954265)
New = +67 (CRC=6E1701B7)

Luckily for me the first track began and ended with plenty of null samples.

I was going to be messing with the raw audio data, so I used wvunpack to decode my WavPacks to raw audio with the command (-mr).

Then I opened the old rip (Andre's offset, .raw) in my hex editor (XVI32). I added 120 null bytes (= 30 samples) to the beginning of the file and removed 120 bytes from the end. (I knew I was on the right track when a CRC hash in XVI32 matched the CRC from the new rip (6E1701B7).)

After saving the new file, I encoded it to FLAC with the command (-V -8  --endian=little --sign=signed --channels=2 --bps=16 --sample-rate=44100).  Then I decoded it to wav, and then encoded it back to WavPack.  The original MD5 stored in both this new file and the one made from the new offset were exactly the same (2E4284CD29FEF1E0C19DD3D02658B289).

So, by using this method you can fix someone's rip with a wrong offset simply with a hex editor. Although, if the samples at the end of each track are not null, you would have to do this with the album as one large WAV. There is also the chance that some bytes may be missing from their wrong-offset rip, so it's always important to check the beginning and end for null samples.

The funny thing is, it's probably easier just to burn a CD-R with the proper offset and rip from there.

Anyway, I doubt anyone will seriously do this just to fix someone's offset correction to match the new calculation, but I thought it'd be funny simply to show that it was possible in some cases.
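For anyone who would rather not do the byte shuffle above by hand in XVI32, it can be scripted. A minimal sketch, assuming raw 16-bit stereo PCM (4 bytes per sample, so 30 samples = 120 bytes) and the same direction of shift described in the post; the function name and file handling are illustrative, not from any tool mentioned here:

```python
# Shift raw 16-bit stereo PCM by +30 samples: prepend 120 null bytes and
# drop 120 bytes from the end. Only safe when the trimmed bytes are null.
SHIFT_SAMPLES = 30
BYTES_PER_SAMPLE = 4  # 2 channels x 16 bits
SHIFT_BYTES = SHIFT_SAMPLES * BYTES_PER_SAMPLE  # 120

def shift_offset(raw: bytes, shift: int = SHIFT_BYTES) -> bytes:
    """Prepend `shift` null bytes and trim the same amount from the end."""
    if any(raw[-shift:]):
        raise ValueError("non-null samples at the end; shifting would lose audio")
    return b"\x00" * shift + raw[:-shift]

# e.g. with open("track.raw", "rb") as f: fixed = shift_offset(f.read())
```

As the post notes, the resulting raw data should hash to the same CRC as a rip made with the other reference, which is an easy sanity check before re-encoding.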


Reply #96
You do realise that the audiophile crowd (those who can tell the difference between WAV and lossless compression, i.e. numpties) are going to see this ABX test as being quite significant.
No kidding!  There's a big difference between saying I can easily ABX a track and saying I can easily ABX any track.

So, by using this method you can fix someone's rip with a wrong offset simply with a hex editor.  Although, if the samples in the end of each track are not null, you would have to do this with the album as one large wav.
You may find that it is much, much easier to mount the tracks as an image in a virtual drive and configure that virtual drive with a read-samples offset of -30.

I'll be sticking with the classic reference so long as it is the one being used in AccurateRip.


Reply #97
I didn't think you could mount the images because of the non-compliant cues, but editing the cue is definitely faster and safer than burning a CD. I still think it's important to check the files, just in case it looks like there are missing samples.

I think I want to use the new reference, but it doesn't look like it'll be too popular.  Still, isn't having it bit perfect one of the main reasons for lossless?  I've been an absolute perfectionist before, so why should I stop now when I can apparently make my rips more accurate?


Reply #98
I didn't think you could mount the images because of the non-compliant cues, but editing the cue is definitely faster and safer than burning a cd.
You can't with non-compliant sheets, but with EAC it's really easy to make a compliant cue sheet. I've also created a batch file that will make a compliant sheet out of a non-compliant one.
It can be found here:
http://www.hydrogenaudio.org/forums/index....st&p=452891

I still think it's important to check the files just in case it looks like there are missing samples.
Besides nonsilent samples at the beginning of track 1, which are so incredibly rare, what are you talking about?

Still, isn't having it bit perfect one of the main reasons for lossless?
I think this is an illusion and have already gone out of my way in this thread to say why. To me, lossless is a compression format which is completely independent of DAE.

I've been an absolute perfectionist before, so why should I stop now when I can apparently make my rips more accurate?
Go back and read what I said to SpareTire regarding frame boundaries and different pressings.

Also, don't take my comment regarding the ABXing of two tracks that differ by an offset too seriously.  If a track boundary is so close to the beginning of a song that 30 samples is going to make a difference I would consider the disc as not being mastered well.  I've seen a few discs where the track boundary noticeably starts into a song.  If you don't agree with me on my previous example hopefully you'll agree that these other discs have legitimate problems.


Reply #99
I didn't think you could mount the images because of the non-compliant cues

The "non-compliant" cue type is only for rips of separate tracks with gaps (if present) appended to previous tracks.

I think I want to use the new reference, but it doesn't look like it'll be too popular. Still, isn't having it bit perfect one of the main reasons for lossless? I've been an absolute perfectionist before, so why should I stop now when I can apparently make my rips more accurate?

But, as was said above, not all discs are manufactured in conformance with that "new" zero offset reference. If you use the new reference, what percent of your rips will be more accurate vs. less accurate?