
Big difference between PlexTools and KProbe result

Hi there!

Yesterday I burnt a Verbatim CD-R with my LITE-ON LTR-52246S and did a C1/C2 check with both PlexTools (using a Plextor PX-712A) and KProbe; there was a huge difference between the results: KProbe reported 5000 C1 errors and 0 C2 errors, while PlexTools reported 15000 C1 errors and 1 C2 error. Which one is right?

Edit: Here are the scan results.

Regards,
Sebastian

Big difference between PlexTools and KProbe result

Reply #1
I read somewhere that they compared KProbe and Plextools to a standalone professional device, and the result was that Plextools was closest to the professional device. I don't remember what the professional device was or where I read it, but I hope this helps anyway.

Big difference between PlexTools and KProbe result

Reply #2
I posted about it in this and this post. It shows that KProbe results can be very misleading, depending on the drive. Many people swear by it, but it just can't replace a professional hardware analyzer.

Big difference between PlexTools and KProbe result

Reply #3
Both of them are "right", and your test (or the one from c't) has not proven anything
regarding the precision of Kprobe versus Plextools. The most important premise of disc
testing is to understand that C1/C2/PI/PO values you measure are not errors on the
disc. Your plots show the errors your particular drive saw at that particular time on that
particular disc, and they have no meaning outside of this context.

Big difference between PlexTools and KProbe result

Reply #4
Quote
Both of them are "right", and your test (or the one from c't) has not proven anything
regarding the precision of Kprobe versus Plextools. The most important premise of disc
testing is to understand that C1/C2/PI/PO values you measure are not errors on the
disc. Your plots show the errors your particular drive saw at that particular time on that
particular disc, and they have no meaning outside of this context.


Sure they have a meaning outside of that context. You just said it yourself. KProbe is pretty useless for measuring burned media quality. And of course KProbe is showing the performance of the drive/media combination (what else could it do), but AFAIK it shows it before error correction, thus in theory revealing erroneous regions on the media. If it showed the performance after error correction, it would be almost useless for media tests altogether.

It also showed that KProbe is inconsistent with different Lite-On drives, as you said yourself. With "at that particular time" you're probably hinting at the continuous deterioration of any media with an organic dye. But we're usually looking at media fresh from the oven, which is often already out of spec; no need to wait for aging.

Big difference between PlexTools and KProbe result

Reply #5
@Sebastian

If you change the PI/PO sum you will get different results.

As far as the precision of KProbe goes, I believe it depends on the drive; later Lite-On drives seem to report too-low PI/PO values even on low-quality burns, for some reason. By performing a transfer rate test with something like Nero CD Speed, you can see whether the drive has problems reading the media.

Big difference between PlexTools and KProbe result

Reply #6
The short story: spath is right.


The long story:

1) Error rates are indirect measures, not true disc quality measures

Error rate scans are indirect measures of disc quality.

Indirect means that they do NOT measure true disc characteristics.

Error rates are always a combination of the disc and the READING drive. That's why they are called indirect (and high level indirect for that matter).

It's like a reading test on badly printed paper.

Some of the text is slightly skewed and harder for some people to read, so they make mistakes.

Others will read those parts without any problems and produce no errors.

So, Sebastian's results only show that on that particular burn, the Plextor drive sees AND reports more errors than the LiteOn drive.

Which is more correct? They both are, as they only show how well the drive has read back the information.

Again, scans are NOT a true measure of disc quality alone. They show how well the scanning drive read back the disc.

Hence, scan result = scanning drive reading capability + disc readability.

Now, for CD discs there is a "metric standard" reader against which all readers should be compared. This is the Philips Rodan development unit. AudioDev CATS analyzers are calibrated against this standard (or at least used to be).

So, in theory one could say that the drive that produces scans closest to the Philips Rodan "reference standard" analyzer would be the most correct.

However, in practice that theoretical accuracy has very little practical value.

For practical purposes, a good burn is a disc that reads back problem free with low error rates on ALL tested drives. That is the closest practical measure of burn quality and resulting disc compatibility that one can get.

I have personally scanned 10 really problematic discs with kProbe (LTR-52327S), Plextools Pro (Plextor Premium) and a CATS SA300 (AudioDev professional analyzer).

The results are not consistent.

Sometimes the kProbe scan implies the disc is crap, while Plextools flies through it. Sometimes vice versa.

Sometimes the CATS device refuses to even scan a disc (it just aborts testing) that remains readable on the LiteOn and the Plextor.

I think it is fair to say that the best of modern cd-rom/cdrw readers have already surpassed the CATS SA300 on some levels of reading capability.

That is, they are able to read back some discs (without uncorrectable errors) that the analyzer refuses even to analyze for low-level, indirect measures.

2) Raw error rates and kprobe/Plextools Pro results

Now, as for the raw error rates: it has not been proven that, for example, the Mediatek chipset (used in LiteOn drives) or the Sanyo chipset (used in Plextor drives) really reports raw error rates.

In fact, it is debatable whether it is even possible to count C2 errors at a raw level before error correction, since C2 errors are defined as the errors that remain after the C1-level error correction has been applied. To confuse things even more, the implementation of the C1 stage can vary, so the number of errors that can be detected AND corrected at either stage is not constant.
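
To make that layering concrete, here is a rough toy sketch (Python, with made-up per-frame outcomes; this is not how any drive firmware actually works) of why C2 counts are not "raw" but depend on what the C1 stage managed to correct:

Code:
# Toy illustration: each frame first passes the C1 decoder; only frames the
# C1 stage could not fully correct contribute errors at the C2 stage.
from dataclasses import dataclass

@dataclass
class FrameOutcome:
    c1_errors: int        # symbol errors seen at the C1 stage (hypothetical)
    c1_correctable: bool  # whether this implementation's C1 stage fixed them

def tally(frames):
    """Count C1 and C2 events the way a scanning tool would report them."""
    c1_total = 0
    c2_total = 0
    for f in frames:
        if f.c1_errors:
            c1_total += 1          # any C1 activity counts as a C1 error
            if not f.c1_correctable:
                c2_total += 1      # only uncorrected frames show up at C2
    return c1_total, c2_total

# Two drives seeing the same disc: drive B's C1 stage corrects more,
# so it reports fewer C2 errors even though the disc is identical.
drive_a = [FrameOutcome(2, True), FrameOutcome(5, False), FrameOutcome(0, True)]
drive_b = [FrameOutcome(2, True), FrameOutcome(5, True),  FrameOutcome(0, True)]
print(tally(drive_a))  # (2, 1)
print(tally(drive_b))  # (2, 0)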

3) Reliability of LiteOn and kProbe for scanning CD discs

It has been shown that LiteOn combo (DVD/CDRW) and dvd-rom drives are not accurate for scanning cd discs for error rates. CDRW drives should be 'reliable' (remember, it's an indirect measure of the reader and the disc).

This has been found out by scanning problem-free discs on combo/dvd-rom drives and getting really high error rates (implying unreadable discs). However, the discs remain perfectly readable in various tested drives, including the ones that did the scanning.

The kProbe author himself has stated that cd scanning on combo/dvdrom drives is not reliable.

The same applies to other combo/dvd-rom drives using Mediatek chipsets (regardless of which program is used for measurement, kProbe or CD Speed).

DVD scanning (with PI/PIF) is another thing altogether, with different (non-fixed) standards, different problems and different limitations.

That's a topic for another forum thread.

cheers,
halcyon

PS: Don't worry about an error rate difference that is this "small". The Plextor drive will almost always report higher error rates than the LiteOn drive (based on my experience of having scanned several dozen cd discs of different brands from various factories). Both drives can read the disc back well.

EDIT: some typos

Big difference between PlexTools and KProbe result

Reply #7
Halcyon said it well, so I'll just add a few things:

> And of course KProbe is showing the performance of the combination drive/media
> (what else could it do), but AFAIK, it shows it before error correction, thus
> in theory revealing erroneous regions on the media. If it would show the performance
> after error correction, it would be almost useless for media tests altogether.

There's no such thing as before and after; all these reported errors are
the result of calculations done by the CIRC/RSPC blocks as data passes through
them to be corrected. And the PI/PO errors your hardware sees depend on
plenty of changing mechanical, optical, electrical, etc. parameters.

For instance, just by slightly changing the lens focus your PI/PO figures can
be 10x higher. And lens focus is just one of the many parameters which are
calibrated every time you play a disc or which can change between firmware
versions. Of course different drives can also have different focus behaviours.

> It also showed that KProbe is inconsistent with different Lite-On drives,
> as you said yourself.

I never said this; on the contrary. Different drives are not supposed to
report the same PI/PO errors; it's a fundamental premise of disc testing.

Big difference between PlexTools and KProbe result

Reply #8
Quote
There's no such thing as before and after; all these reported errors are
the result of calculations done by the CIRC/RSPC blocks as data passes through
them to be corrected. And the PI/PO errors your hardware sees depend on
plenty of changing mechanical, optical, electrical, etc. parameters.


Fair enough. But what I meant is this: it shows "before" in the sense that you can supposedly see the unprocessed output of the error protection layers. If it gives inconsistent results compared to the CATS analyzer, you may say it's the drive's fault, or KProbe's fault, or both. All I'm saying is: if results are this far off from the reference, you may call it "right" or "valid for this purpose"; for me it's not. I also don't agree with this:

Quote
For practical purposes, a good burn is a disc that reads back problem free with low error rates on ALL tested drives. That is the closest practical measure of burn quality and resulting disc compatibility that one can get.


I want a disc (especially a DVD) that not only reads problem-free, but really has error rates as low as possible in the CATS test. This will give some headroom for further unavoidable media deterioration, and I might be able to read it for longer than one that just "plays well" with my Lite-On at this particular moment but really has higher error rates.

Quote
> It also showed that KProbe is inconsistent with different Lite-On drives,
> as you said yourself.

I never said this; on the contrary. Different drives are not supposed to
report the same PI/PO errors; it's a fundamental premise of disc testing.


Nobody said they should show exactly the same. But such grave inconsistency cannot simply come from differences within the batches/models of drives. It has to do with KProbe itself, as Halcyon suggested, and only indirectly with the drives.

Big difference between PlexTools and KProbe result

Reply #9
Quote
Fair enough. But what I meant is this: it shows "before" in the sense that you can supposedly see the unprocessed output of the error protection layers. If it gives inconsistent results compared to the CATS analyzer, you may say it's the drive's fault, or KProbe's fault, or both. All I'm saying is: if results are this far off from the reference, you may call it "right" or "valid for this purpose"; for me it's not. I also don't agree with this:


There are several things here, that one must take into account:

1) There isn't a practical way to show 'raw unprocessed error counts' for either cd or dvd readings, because the numbers depend on the implementation and earlier stage error correction.

It is possible to show the number of correctable and uncorrectable errors at various 'stages', but not 'raw' per se. Maybe this is what you want? CATS analyzers report the same non-raw error counts (and lots of other, more important measures as well). Again, these errors are not on the disc; they are in the reading process (disc + analyzer).

2) CD and DVD error rate scanning are completely different. Not just because cd and dvd error correction mechanisms differ. The more important distinction is that for cd reading there is a 'reference standard' to which all readings can be compared. If a reader deviates from this standard enough, it can be considered 'wrong' in some ways.

For the dvd standard there is no such 'reference standard'. DVD player designers are free to choose various parameters in the reading process as they themselves see fit. One drive will pass problems of type A (on the disc) without any rise in errors, but fail on problem type B and report a rise in errors. Another drive might do well on B, but worse on A. Both drives can be equally correct. As such, it's not always meaningful to even compare LiteOn/kprobe (or any other drive) to a Pulstec/CATS combo, because both can be 'right'.

The acid test of a reader is how well it can read the various problems that are actually on the disc itself. These are called true disc characteristics, and error rates and jitter are NOT among them. Jitter and error rates are caused by a mismatch between the problems on the disc and the capabilities of the reading dvd drive.

There is not (to my knowledge) a superior implementation of a dvd reader that can pass ALL problematic disc types (true disc characteristic problems) with flying colours.

Some readers will excel in some areas, while others will fail in those but work well on something else.

Quote
I want a disc (especially a DVD) that not only reads problem-free, but really has error rates as low as possible in the CATS test. This will give some headroom for further unavoidable media deterioration, and I might be able to read it for longer than one that just "plays well" with my Lite-On at this particular moment but really has higher error rates.


This is a laudable goal: low initial error measures to ensure improved longevity of the discs.

I'm after the same goal myself.

But the question is: low error measures in WHICH drive?

The CATS SA300 (using Pulstec drives) is already an accomplished reader.

Having low error measures in CATS does not guarantee readability in various other dvd-roms, dvd burners, set-top dvd players, etc.

It's just one measure from a calibrated unit that test companies use as a relatively stable reference to compare against. It's not _the_ 'reference standard' for dvd measurements; it's just one standard. It's useful because the drives are calibrated to be identical (unit-to-unit variance is low) and they are kept calibrated throughout the use of the drives (scan-to-scan variance is low).

Also, as I haven't been able to find a drive that is a bad reader in all possible ways, I haven't found one single test drive that would really give me useful error scans.

If I had such an all-encompassing bad drive, I could be content that if the disc reads well (and with low error counts) in that 'bad' drive, it's going to read well in (almost) any other drive in the world.

So the solution is to use several different drives from various manufacturers, all of which have different kinds of 'weak spots' in the way they read dvd discs.

Personally I currently use: Optorite (Sanyo), Plextor (Sanyo), LiteOn (Mediatek), AOpen (ALI), Toshiba (Via) and BenQ (Philips). Of course, not just the manufacturer and the chipset matter, but also the selection of the transport (including PUH, calibration and sometimes firmware). Unfortunately, there is scant information about these for an interested layman.

A disc that scans with low error measures in all of those drives is one that I could relatively safely trust as a disc that probably has very good true disc characteristics.

Hence, it would be archival-safe in terms of its low error count in most drives that do the reading. However, there are other factors to get into when we talk about dvd archival problems, and low error count is not always the most important of them.
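
As an aside, here is a minimal sketch (Python) of how one could apply that multi-drive cross-check in practice. The thresholds and summary format are my own assumptions for illustration only, not figures from any standard:

Code:
MAX_AVG_PI = 20    # assumed per-drive limits for an "acceptable" DVD scan
MAX_PEAK_PIF = 4   # (illustrative numbers, not from any specification)

scans = {
    # drive: (average PI per 8 ECC blocks, peak PIF) -- hypothetical summaries
    "Plextor (Sanyo)":   (6.1, 2),
    "LiteOn (Mediatek)": (11.4, 3),
    "BenQ (Philips)":    (8.7, 2),
}

def looks_archival(scans):
    """True only if every drive's summary stays under both limits."""
    return all(avg_pi <= MAX_AVG_PI and peak_pif <= MAX_PEAK_PIF
               for avg_pi, peak_pif in scans.values())

for drive, (avg_pi, peak_pif) in scans.items():
    print(f"{drive:20s} avg PI {avg_pi:5.1f}  peak PIF {peak_pif}")
print("probably a good burn" if looks_archival(scans) else "re-test or re-burn")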

Quote
Nobody said they should show exactly the same. But such grave inconsistency cannot simply come from differences within the batches/models of drives. It has to do with KProbe itself, as Halcyon suggested, and only indirectly with the drives.


There is a possibility we are talking (writing?) about different things, but let me try to elucidate my view, which I think is close to what Spath is saying:

With CD scanning there will be inconsistencies in error counts from one drive unit to another (same manufacturer/model/firmware). However, if the drives are reliable, these differences should be relatively small. In practice, the number of errors should not differ by a big multiple from one unit to another (perhaps 2-3 times as many errors at most; I'm just guessing here, I haven't analysed the actual multiples statistically).

With CD scanning using two different models/manufacturers there will be even bigger differences, even if neither drive is broken. Still, if both drives are relatively good readers and have relatively good error detection and correction, the error counts should not be too far off from each other. I've seen up to a 10-times difference in error counts with discs that remain readable on both drives (this is when one of the drives has really low error counts and the other has 10 times as many; both remain within readability/correctability limits).

With DVD scanning the differences can be much bigger, because the implementations of the drives can vary so much (due to the lack of a reference standard). All this without the drives being broken or the software utilities being badly designed. I'm now excluding Mediatek-chipset-based dvd-rom/combo drives, which have been proven to be inconsistent and unreliable for dvd scanning.
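
To put those rough multiples in perspective, here is a trivial Python snippet for comparing two scans of the same disc (using the 5000 vs. 15000 C1 totals from the first post; the helper name is just for illustration):

Code:
def error_multiple(total_a: int, total_b: int) -> float:
    """Ratio of the larger error total to the smaller (same disc, two drives)."""
    lo, hi = sorted((total_a, total_b))
    return hi / lo if lo else float("inf")

kprobe_c1_total = 5000      # LiteOn/KProbe scan from the first post
plextools_c1_total = 15000  # Plextor/PlexTools scan from the first post

m = error_multiple(kprobe_c1_total, plextools_c1_total)
print(f"one drive reports {m:.1f}x the C1 errors of the other")  # 3.0x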

I'm not sure if that clarified things any further, but unfortunately the situation is a little bit more complex than what you and I wish for.

I also wish there were a simple, fast and reliable way to test dvd burn quality, in order to judge the archival quality of the discs.

However, at least I haven't found such a test yet. I'm still looking though.

Perhaps a CATS dvd scan could be such a test, but that has not been proven yet. And it sure isn't cheap or easy to do.

friendly regards,
halcyon

Big difference between PlexTools and KProbe result

Reply #10
Thanks for your elaborate reply, Halcyon.

I was under the impression that the CATS DVD Pro or similar was the standard for DVD quality, as for instance Lite-On et al. seem to use it themselves to develop working burn strategies (and abandon KProbe..? just a guess). But okay, if it's just one standard and not "the" standard, I still think it's the best way to get a glimpse of the real quality (burner/media combination), and the other methods available to the end user can't even come close, especially KProbe. This is why I said KProbe is inconsistent; forgive me for simplifying things, I didn't know the whole background of it. Therefore I tend to believe the c't tests (CATS) much more than any of the hundreds of KProbe scans that can be found on the net.

But I think DVD quality in general should be the most pressing issue for the industry. Many users lose trust in certain media, burners and even formats (+/-) because of some disastrous performance with certain combinations. The industry should do everything to offer and advertise simple tools with which the user can at least halfway reliably check burn quality himself, or that at least give a rough and realistic indication...

Big difference between PlexTools and KProbe result

Reply #11
No problem.

These things are complicated to me too, and I don't claim to have understood it all.

Also, I agree that CATS scans are more useful than kProbe scans. Two reasons for this:

1) LiteOn drives are particularly good readers (more so than most brands). This skews the results in favour of incompatible discs, which can be read with low errors by LiteOns.

2) CATS scans also reveal much more useful information than just error counts, namely asymmetry and DC jitter, or even lower-level measures like push-pull, radial values, crosstalk, etc. Sometimes these are more telling than the error values alone (according to my contact at the place where I scan my discs). I'm not good at deciphering those values myself.

As for your last comment (dvd quality being important), I couldn't agree more.

And I think it should be not just initial burn quality (which is _very_ important), but also disc quality (average/variance), longevity, storage, compatibility, etc.

The big picture is missed by many, and I can just imagine what sorts of information will be lost in years to come, when people who stored their data on DVDs thought it was digitally perfect forever... and it wasn't.

It shouldn't be the problem of consumers to find this out. They should be able to buy media and burners that they can trust for years to come.

Currently, it is not so and I hope this will change.

Friendly regards,
halcyon

Big difference between PlexTools and KProbe result

Reply #12
> Nobody said they should show exactly the same. But such grave inconsistency cannot
> simply come from differences within the batches/models of drives. It has to do with
> KProbe itself, as Halcyon suggested, and only indirectly with the drives.

It seems you still think that all drives should report the same PI/PO errors for a given
disc: it's not the case, and this is what Halcyon has experienced too and reported in
his first post. Results can be very different due to the hardware, and there's nothing
wrong with that. Also, nobody said that Kprobe is buggy. It could be, but one cannot
prove it by comparing one's PI/PO plots with other KProbe/CATS/Plextools plots from
another drive.

Many people seem to consider CATS an absolute PI/PO reference without understanding
what these devices are about. Error scanning on a CATS is done by reading a disc in
a particular drive and collecting statistics on error correction, just like Kprobe
or Plextools does with your home drive. So a CATS gives you the PI/PO values measured
by a Pulstec drive, nothing more.

Finally, error scanning is the least useful feature of a CATS (especially for drive
manufacturers), so don't think that LiteON will spend their days making PI/PO plots
from these machines. You should not think either that c't are experts because they
made CATS PI/PO plots; the way they interpreted these plots to criticize Kprobe
would instead suggest the contrary.

Big difference between PlexTools and KProbe result

Reply #13
I totally agree with Halcyon and spath ...

And I'd still vote for more "harmonized" procedures of media testing ... that means testing media for low-level (physically existent and thus reproducible) disc errors like radial noise etc.

K-Probe, Nero CD Speed and Plextools Pro are handy for determining whether a disc is readable in the scanning device after writing, but they will give you no hard evidence of whether a written disc will be good in your standalone.
The name was Plex The Ripper, not Jack The Ripper

Big difference between PlexTools and KProbe result

Reply #14
Just wondering... Are there any companies offering professional CD scans for private customers? If so, could any of you living in Germany (CiTay / JeanLuc / others) give me an example of a company which does these tests?

Another thing... Let's say I buy some Verbatim CD-Rs which are stated to be manufactured by Mitsubishi Chemicals and have that particular ATIP code. Is it possible that other CD-Rs made by Verbatim, with the same ATIP, are produced by CMC, or does the ATIP code have to change in that case?

Big difference between PlexTools and KProbe result

Reply #15
Quote
Another thing... Let's say I buy some Verbatim CD-Rs which are stated to be manufactured by Mitsubishi Chemicals and have that particular ATIP code. Is it possible that other CD-Rs made by Verbatim, with the same ATIP, are produced by CMC, or does the ATIP code have to change in that case?


I recently bought (at my local Saturn store) a 50-piece spindle of Verbatim DLP SuperAzo (the blue AZO ones; the ATIP identifies them as MCC) and they smell like CMC (don't laugh, they have that characteristic CMC smell) ... I do think CMC produces media for MCC using licensed MCC technology.
The name was Plex The Ripper, not Jack The Ripper