
Topic: u-law

u-law

How come u-law was not used with high sample rates in the old days of computing? A quick search here says u-law is not good for sampling above 16 kHz because of high-frequency distortion, but when I encoded a 320 kbps, 44.1 kHz MP3 into 8-bit u-law at 44.1 kHz it actually sounded quite OK, especially compared to 8-bit linear PCM and even 4-bit ADPCM. So how come it was not used in games and multimedia at decent sample rates before MP3 came along? I understand that it distorts the sound slightly, but this distortion is much less audible in my samples created with Audacity than in the same material encoded to either 8-bit linear PCM or 4-bit ADPCM. And the encoded music was a 1980s OMD track with high dynamic range, not a modern-day overcompressed/clipped sample. I am interested in old formats and computing, which is why I'm asking. I would insert short sound samples, but I don't know how to add them on this forum.

EDIT - I uploaded some sample files here http://www.hydrogenaudio.org/forums/index....showtopic=98358 .

u-law

Reply #1
You should probably take yourself back to the sound cards used back then. The SNR of sound cards in the '90s was between 50 and 70 dB. (Nearing the year 2000 it was probably around 90 dB, but by that time MP3s were already spreading.)

That said, mu-law is not a compression format (in the common sense), but a way to transmit a wide-dynamic-range signal over a transmission channel whose SNR doesn't allow that much dynamic range.

So, by definition, if you start with a 16-bit signal and use mu-law to store it in an 8-bit WAV file, it can sound better (read: have a higher SNR, and so be less noisy) than that signal stored as 8-bit linear PCM.
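
For reference, here is a minimal C sketch of the classic G.711 mu-law encoder (the function name and layout are mine, not taken from any particular library): the magnitude is biased, the position of its leading bit selects one of eight segments, and four mantissa bits are kept, so the 8-bit code behaves like a tiny floating-point representation of the 16-bit input.

Code:
#include <stdint.h>

/* Illustrative sketch of the classic G.711 mu-law encoder (names are mine,
   not from any particular library). */
static uint8_t mulaw_encode(int16_t pcm)
{
    const int BIAS = 0x84;             /* 132: standard G.711 encoder bias     */
    const int CLIP = 32635;            /* keep magnitude + BIAS inside 15 bits */

    int sign = (pcm < 0) ? 0x80 : 0x00;
    int mag  = (pcm < 0) ? -pcm : pcm;
    if (mag > CLIP) mag = CLIP;
    mag += BIAS;

    /* Segment = position of the highest set bit above bit 7 (0..7). */
    int seg = 0;
    for (int m = mag >> 8; m; m >>= 1) seg++;

    /* Four mantissa bits taken just below the leading bit of the segment. */
    int mantissa = (mag >> (seg + 3)) & 0x0F;

    /* Codes are transmitted bit-inverted, as in G.711. */
    return (uint8_t)~(sign | (seg << 4) | mantissa);
}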


u-law

Reply #2
Well, that would be a good reason to use u-law. So why was it not used outside of telephony and some '80s samplers/digital drum machines? ADPCM has high-frequency distortion that is quite obvious even on modest 1990s sound cards. Plus, the native background noise of old 16-bit sound cards generally has a very different character from rough ADPCM/8-bit LPCM noise.

I know u-law is not a regular compression format, but it effectively packs a 16-bit sample into 8 bits with roughly 10-12-bit resolution, so it halves the storage space needed and does not add as much noise as ADPCM. All my u-law samples in that thread sound better than their 8-bit LPCM counterparts.
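
To put a rough number on that resolution claim, here is an illustrative C program (mine, not from the thread) that runs the standard G.711 expansion and prints the quantization step in each of the eight segments: the step grows from 8 near silence to 1024 near full scale on the 16-bit scale, which is why u-law's effective resolution is usually quoted as around 12-13 bits.

Code:
#include <stdint.h>
#include <stdio.h>

/* Illustrative sketch (not from the thread): standard G.711 mu-law expansion.
   Each 8-bit code carries a sign, a 3-bit segment and a 4-bit mantissa, so
   the step size doubles with every segment. */
static int16_t mulaw_decode(uint8_t code)
{
    code = ~code;                              /* codes are stored inverted */
    int sign     = code & 0x80;
    int segment  = (code >> 4) & 0x07;
    int mantissa = code & 0x0F;

    /* Rebuild the magnitude, then remove the 0x84 encoder bias. */
    int magnitude = (((mantissa << 3) + 0x84) << segment) - 0x84;
    return (int16_t)(sign ? -magnitude : magnitude);
}

int main(void)
{
    /* Print the distance between two adjacent codes in each segment. */
    for (int seg = 0; seg < 8; seg++) {
        uint8_t a = (uint8_t)~((seg << 4) | 0);
        uint8_t b = (uint8_t)~((seg << 4) | 1);
        printf("segment %d: step = %d\n", seg, mulaw_decode(b) - mulaw_decode(a));
    }
    return 0;
}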

u-law

Reply #3
Hm, very interesting question, Neuron. I don't know the answer, but I will speculate that maybe the μ-law format would have taken too much CPU power to play back? Otherwise I also don't see why it wasn't used more often back in the day...
lame -V 0

u-law

Reply #4
u-law takes the same amount of CPU time to play as ordinary 8-bit audio (meaning practically zero; even the 1985 7 MHz Amiga 1000 could play four voices of 8-bit stereo audio at 28 kHz via DMA, entirely through the sound chip, without taking CPU time). It is used in telephony, after all; they would not use anything that required much processing power. The question is: why was it not used at higher sample rates? It sounds better than regular 8-bit audio at any sample rate; just listen to the samples.

u-law

Reply #5
As a game audio developer who lived through 8-bit -> ADPCM -> MP3, here's my take.

The simple answer is ADPCM = 4:1, u-law = 2:1. Sound designers are quite happy to be able to put twice as much audio in the game - that is a really big deal for them; there wasn't much memory to go around back then. It's still a huge deal today: it's very rare to find 16-bit or FLAC samples in a game - it would be a waste of memory even if it does sound better. Programmers and producers like the sound of better compression too; better technical numbers just sound nicer, right?

Back then nobody was performing super-critical listening tests like you see on here, either. A guy like me would put ADPCM in the audio engine, the sound artists would hear the result and be happy with it. We wouldn't even have bothered trying u-law, as it only gives half as much compression.

It's only when you start to listen to the audio critically and put all sorts of killer samples through it that you realize ADPCM creates some pretty bad noise in some cases. But for almost all audio in a game, it was good enough. It's kind of similar to what happens when you dissect MP3: it's often not until you start looking for problems that you notice them. The average person will never notice or care.

That said, u-law did get the rare bit of use in video games way back then, granted nowhere near as much as ADPCM.

u-law

Reply #6
Well, I don't think that super-critical listening is required... Both mu-law and IMA-ADPCM sound terrible at these settings.
The attachment is generated by the following commands:
[attachment=7253:sweep.zip]
Code:
# Generate a 10-second 1 Hz - 24 kHz sine sweep at 48 kHz in four formats:
sox -r 48000 -c 1 -n -b 16 16bit.wav synth 10 sine 1+24000
sox -r 48000 -c 1 -n -b 8 8bit.wav synth 10 sine 1+24000
sox -r 48000 -c 1 -n -e ima-adpcm ima.wav synth 10 sine 1+24000
sox -r 48000 -c 1 -n -e mu-law mu-law.wav synth 10 sine 1+24000

(edit: typo on last command)

u-law

Reply #7
A sine wave sweep is the worst possible sample you could put through ADPCM, especially at the high frequencies (I didn't even listen to the samples; I know how bad sine sweeps get). It will be dead obvious how badly ADPCM performs on those. The purer the tone, the more noticeable the ADPCM prediction error.

This is what I meant by listening too critically... video games (and most music) aren't filled with sine sweeps. Put some rock music and crunchy sound effects through it and see how it does. Back in the day most sound designers compressed the hell out of everything with L2 plugins (probably a holdover from 8-bit samples, where it made a big difference). Then everything got nearest-neighbour resampling (actually not even that good) and lots of clipping on the final downmix on top of all that. There was crunchy audio all around, so the ADPCM noise itself did not stand out as much as you would think. The funny thing is that when games finally started mixing at 16-bit 44.1 kHz they were marketed as having "CD QUALITY SOUND!"... which is pretty laughable in retrospect, but there is some very remote truth in that statement.

I'm not trying to defend ADPCM as some great thing, just giving some context to what it was like back then and why u-law was not common. I was very happy when we moved on to MP3 and the like.

PS - u-law wasn't free of CPU usage (barring hardware decode support, which no computer had). There is a table look-up required to decode each sample. IMA ADPCM wasn't much more complicated in comparison. An Amiga 1000 is still going to chew up quite a bit of CPU decoding u-law (now I'm curious how much). Plus, the Amiga only outputs 8-bit audio, so the 12-bit u-law result isn't immediately useful... unless you gang two channels together to create 14-bit output, but then there's extra CPU needed for shuffling the data around. Hmmm, I've got an Amiga 1000 sitting behind me; maybe I'll try that one day...
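
To make the table look-up point concrete, here is a hypothetical sketch (my own, not from any shipping engine): the 256-entry expansion table is built once, and per-sample decoding is then a single indexed load, cheap but, as noted, still not free the way raw 8-bit DMA playback is.

Code:
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical sketch: build the 256-entry mu-law expansion table once, then
   decoding each sample is a single indexed load. mulaw_expand() is the
   standard G.711 expansion. */
static int16_t mulaw_table[256];

static int16_t mulaw_expand(uint8_t code)
{
    code = ~code;                             /* codes are stored inverted */
    int seg = (code >> 4) & 0x07;
    int mag = ((((code & 0x0F) << 3) + 0x84) << seg) - 0x84;
    return (int16_t)((code & 0x80) ? -mag : mag);
}

static void build_mulaw_table(void)
{
    for (int i = 0; i < 256; i++)
        mulaw_table[i] = mulaw_expand((uint8_t)i);
}

/* Expand a block of mu-law bytes into 16-bit samples before output
   (shift right for 8-bit hardware, or split into two ganged channels). */
static void expand_block(const uint8_t *in, int16_t *out, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = mulaw_table[in[i]];
}

int main(void)
{
    build_mulaw_table();
    const uint8_t coded[] = { 0xFF, 0xCE, 0x8F };   /* arbitrary codes */
    int16_t pcm[3];
    expand_block(coded, pcm, 3);
    for (int i = 0; i < 3; i++)
        printf("0x%02X -> %d\n", coded[i], pcm[i]);
    return 0;
}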


u-law

Reply #8
The Amiga could do 4-bit ADPCM, so it definitely could do u-law. Telephones do u-law. There was even a NES DDR pirate game that used 4-bit ADPCM instead of the usual 1-bit delta PCM or 7-bit PCM, though the quality was poor due to the roughly 8000 Hz sample rate without any lowpass filtering: http://www.youtube.com/watch?v=ivD6DYmKgMI . And the NES was many, many times slower than any Amiga.

Anyway, I am not saying ADPCM sucks; it definitely sounds better to me than, say, 64 kbps MP3. I prefer noise to heavy psychoacoustic artifacting. It is just that u-law has even less noise than ADPCM while still reducing the file size.

Sine wave sweeps are "killer samples" for waveform coding, just as the sound of handclaps would be for an MP3 encoded with the old Xing encoder. As Kujibo said, most music is not going to sound like that at all.

About CD-quality sound, I would say that many games which advertised this actually did not lie, as many old games had a full Red Book CD audio soundtrack. For mid-1990s games it was actually usual for the game itself to take up 10-100 MB, with the rest of the disc being the CD-DA soundtrack. Late-1990s games had the most lossy soundtracks, because the game data were big and not much space was left for music, so those games usually had 4-bit ADPCM soundtracks (this is not to say they had low musical quality - for example, the Tomb Raider 3 and System Shock 2 soundtracks are awesome despite their modest technical quality). Later games had MP3 soundtracks, and today lossless sound is making a comeback thanks to the capacity of DVD and Blu-ray media.

Here is a good example of an ADPCM soundtrack from the late 1990s: http://www.youtube.com/watch?v=KZrE6k5BNko . Sine sweeps make the codec look much worse than it actually is (and it sounds even better in the actual game, since YouTube compresses it into low-bitrate AAC).

About dynamic compression... I would say that was applied more to individual samples than to whole tracks. Most '90s game music has much more dynamics than any recently made clipped, overcompressed CD.

Also, I am not sure how dynamic compression would help with ADPCM or u-law samples, as quieter sections in ADPCM or u-law actually have less noise than the loud sections. Noise in 8-bit linear PCM can distort quieter sounds (without dithering), but in ADPCM and u-law the noise "rides" the waveform, so loud sections have more noise than quiet ones.
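
That behaviour is easy to show numerically. The following illustrative C sketch (mine, not from the thread) quantizes a 997 Hz sine at full scale and at -40 dBFS, once with crude 8-bit linear truncation and once with a G.711 mu-law round trip, and prints the SNR: the linear version loses roughly 40 dB of SNR on the quiet tone, while the mu-law version stays roughly constant because its quantization noise scales with the signal.

Code:
#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* Illustrative sketch (not from the thread): compare the SNR of crude 8-bit
   linear quantization with a G.711 mu-law round trip at two signal levels. */

static uint8_t mulaw_encode(int16_t pcm)
{
    const int BIAS = 0x84, CLIP = 32635;
    int sign = (pcm < 0) ? 0x80 : 0x00;
    int mag  = (pcm < 0) ? -pcm : pcm;
    if (mag > CLIP) mag = CLIP;
    mag += BIAS;
    int seg = 0;
    for (int m = mag >> 8; m; m >>= 1) seg++;
    return (uint8_t)~(sign | (seg << 4) | ((mag >> (seg + 3)) & 0x0F));
}

static int16_t mulaw_decode(uint8_t code)
{
    code = ~code;
    int mag = ((((code & 0x0F) << 3) + 0x84) << ((code >> 4) & 0x07)) - 0x84;
    return (int16_t)((code & 0x80) ? -mag : mag);
}

/* SNR of one second of a 997 Hz sine at the given peak level (0..1). */
static double snr_db(double level, int use_mulaw)
{
    const double PI = 3.14159265358979323846;
    double sig = 0.0, err = 0.0;
    for (int n = 0; n < 48000; n++) {
        double x = level * 32767.0 * sin(2.0 * PI * 997.0 * n / 48000.0);
        int16_t s = (int16_t)lrint(x);
        double y = use_mulaw ? (double)mulaw_decode(mulaw_encode(s))
                             : (double)((s >> 8) << 8);  /* crude 8-bit trunc */
        sig += x * x;
        err += (x - y) * (x - y);
    }
    return 10.0 * log10(sig / err);
}

int main(void)
{
    printf("0 dBFS:    8-bit linear %5.1f dB   mu-law %5.1f dB\n",
           snr_db(1.0, 0), snr_db(1.0, 1));
    printf("-40 dBFS:  8-bit linear %5.1f dB   mu-law %5.1f dB\n",
           snr_db(0.01, 0), snr_db(0.01, 1));
    return 0;
}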


u-law

Reply #10
A little something more to read regarding CPU usage (namely cache). I'm not saying this means that μ-law was too much for CPUs back in the day; I'm just linking to something I found that might be interesting for this discussion.


Thanks, but I think the CPU usage would be high only on primitive microcontrollers, considering that televisions since the early 1990s were able to decode NICAM http://en.wikipedia.org/wiki/NICAM easily, and NICAM is basically a more capable form of what u-law does (10-to-14-bit companding instead of 8-to-12, with adaptive scaling, etc.). Most 8-bit samplers of the 1980s also used companding. On the PC, it would really have helped sample storage in the Sound Blaster 16 era, as many games back then used raw 8-bit samples that could have sounded a lot better with u-law or other companding algorithms.