Bit-perfect AAC/MP3/etc. decoding? Is it an issue?
mavere
post Jan 5 2013, 00:43
Post #1

Group: Members
Posts: 8
Joined: 30-May 05
Member No.: 22401

Sorry if this is easily answered elsewhere.

My question is: in a comparison of lossy-format decoders, let's say iTunes, Foobar, and Winamp, will the output (edit: of a single source file) vary between decoding implementations?

Or is it that once a decoder follows a specified format, all output is equal assuming there are no bugs?

This post has been edited by mavere: Jan 5 2013, 00:51
saratoga
post Jan 5 2013, 01:17
Post #2

Group: Members
Posts: 4715
Joined: 2-September 02
Member No.: 3264

QUOTE (mavere @ Jan 4 2013, 18:43)
My question is: in a comparison of lossy-format decoders, let's say iTunes, Foobar, and Winamp, will the output vary between decoding implementations?


Yes. Lossy formats rarely specify bit-perfect output. Instead, a series of test tracks is usually provided with a maximum allowable error for each track, and all compliant implementations should produce output within that tolerance.

In practice, most software decoders running on a PC use 32-bit floating point and so tend to have much greater accuracy than is required. Some embedded implementations may be less accurate, particularly on 16- or 24-bit processors and DSPs.
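
To put rough numbers on that, here is a minimal Python sketch (mine, not from the thread) for measuring how far apart two decoders' outputs are for the same source file. It assumes you have already decoded the file to 16-bit PCM WAV with each decoder; the file names are hypothetical, and real decoders may prepend different amounts of delay/padding, which you would need to align before a fair comparison.

import wave
import numpy as np

def read_pcm16(path):
    # Read a 16-bit PCM WAV file and scale samples to the range [-1, 1).
    # Stereo files come back interleaved, which is fine for a sample-wise diff.
    with wave.open(path, "rb") as w:
        frames = w.readframes(w.getnframes())
    return np.frombuffer(frames, dtype=np.int16).astype(np.float64) / 32768.0

# Hypothetical file names: the same AAC/MP3 file decoded by two different programs.
a = read_pcm16("decoded_itunes.wav")
b = read_pcm16("decoded_foobar.wav")

n = min(len(a), len(b))        # truncate to the shorter output
diff = a[:n] - b[:n]

print("max abs difference:", np.max(np.abs(diff)))
print("RMS difference:", np.sqrt(np.mean(diff ** 2)))

If both decoders are compliant, the differences should be tiny (around the 16-bit quantization step or below); they just won't necessarily be exactly zero.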
