
[WAVC V1.0]


Ascension64


Instead of:

0x0010	  12 (bytes) Constant value (0x1c000000010010002256e777)

 

This is better referred to as:

0x0010 4 (dword) Pointer to ACM data (? invariably 0x1c)
0x0014 2 (word) Number of sound channels (? invariably 1)
0x0016 2 (word) Number of bits per sample (? invariably 16-bit)
0x0018 2 (word) Sample rate (? invariably 22050 Hz)
0x001a 2 (word) Unused (? invariably 0x777e)
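
To make the layout concrete, the same 12 bytes written as a struct. This is only a sketch with field names of my own invention, assuming a little-endian, byte-packed file:

#include <cstdint>

#pragma pack(push, 1)            // match the on-disk layout exactly
struct WavcHeaderTail {          // bytes 0x10..0x1B of a WAVC file
    uint32_t acmOffset;          // 0x10: pointer to ACM data (0x1c?)
    uint16_t channels;           // 0x14: number of sound channels
    uint16_t bitsPerSample;      // 0x16: bits per sample
    uint16_t sampleRate;         // 0x18: sample rate in Hz
    uint16_t unknown1A;          // 0x1a: unused? (see below)
};
#pragma pack(pop)

static_assert(sizeof(WavcHeaderTail) == 12, "must be exactly 12 bytes");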

 

Is 0x1a really invariable?


Ascension64, are you saying that is what the Infinity Engine actually reads, or are you just speculating? Have you seen the notes I made here? I'm still not sure if the IE itself can actually read 8-bit files, but ABel's converter (the base for all known wav to acm/wavc converters) always outputs as 16-bit, so it doesn't matter much anyway.

 

Downloads:

ACM Header Information

WAVC Header Information


Not speculating.

 

This information is unmarshalled into a WAVEFORMATEX struct, which is saved in a CResWav class. Note that 0x1A and 0x1B are not similarly unmarshalled.
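
For illustration, that unmarshalling step would look something like this. The helper name and the byte reads are my own reconstruction from the header table above; only the WAVEFORMATEX destination and the skipped word at 0x1A are confirmed:

#include <windows.h>   // WAVEFORMATEX, WAVE_FORMAT_PCM
#include <cstdint>
#include <cstring>

// Copy the WAVC header fields into a WAVEFORMATEX. Bytes 0x1A-0x1B are
// deliberately skipped, matching the observation that the engine never
// unmarshals them.
WAVEFORMATEX FromWavcHeader(const uint8_t* file)   // start of the WAVC file
{
    uint16_t channels, bits, rate;
    std::memcpy(&channels, file + 0x14, 2);
    std::memcpy(&bits,     file + 0x16, 2);
    std::memcpy(&rate,     file + 0x18, 2);

    WAVEFORMATEX wfx = {};
    wfx.wFormatTag      = WAVE_FORMAT_PCM;
    wfx.nChannels       = channels;
    wfx.wBitsPerSample  = bits;
    wfx.nSamplesPerSec  = rate;
    wfx.nBlockAlign     = channels * bits / 8;
    wfx.nAvgBytesPerSec = wfx.nSamplesPerSec * wfx.nBlockAlign;
    return wfx;
}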

 

This struct is then used in:

1. BOOL CSound::SetFrequency(nFrequency) - when creating the sound buffer

2. DWORD CSound::GetSoundLength() - used in multiple places (see the sketch after this list)

3. bool CA3d::Play(pCSound) - I don't think the game gives you an option to use Aureal3d, but all its components are present and the WAVEFORMATEX struct is directly used in SetWaveFormat()
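
For item 2, the usual way to derive a length from the WAVEFORMATEX fields is shown below. This is my guess at the kind of arithmetic involved, not a disassembly of GetSoundLength():

#include <windows.h>   // WAVEFORMATEX, DWORD

// Duration in milliseconds of a buffer of decompressed PCM data, where
// nAvgBytesPerSec = sampleRate * channels * bitsPerSample / 8.
DWORD SoundLengthMs(DWORD pcmBytes, const WAVEFORMATEX& wfx)
{
    return pcmBytes * 1000 / wfx.nAvgBytesPerSec;
}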

 

I had not seen the notes until now. Whether this information must correspond to the embedded ACM data, I can only speculate. If you plan to read 8-bit files, then you do need to consider multiple parts of the pathway, such as:

1. Does the game read the 8-bit ACM format?

2. Does the sound system support playback of 8-bit ACM format?


I'm not sure I understand. Almost all of the vanilla game's effect sounds (EFF_***.wav) claim to have 2 channels in both the WAVC and ACM parts of the header (as seen in WAVC Header Information.xls above), but you are saying the number of channels is invariably 1. Does this mean the headers are lying and the file data is actually mono, in which case this part of the header is ignored and the IE plays the sounds for what they really are? The same goes for the sample rate: WAVC files whose headers identify them as 44100 Hz play fine in ToB. Is the header lying and the data actually 22050 Hz, or what? The WAVC2WAV converter correctly outputs them as 44100 Hz.

 

Trying to play an 8-bit ACM or WAVC crashes my game, so I agree it really must be 16-bit.

 

Edit: I just did a test where I took a stereo wave file, silenced the right channel, and left the left channel alone. I converted it to a WAVC file and played it in ToB. I got sound from my left speaker but not my right. This means that WAVC files really can be stereo, and are played as stereo. The number of sound channels does not have to be 1.
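
For reference, the silencing step of that test amounts to this on raw interleaved 16-bit stereo PCM (a sketch; any sound editor does the same job, and the function name is mine):

#include <cstdint>
#include <cstddef>

// Interleaved 16-bit stereo PCM is stored as L R L R ...
// Zeroing every second sample silences the right channel only.
void SilenceRightChannel(int16_t* samples, std::size_t frames)
{
    for (std::size_t i = 0; i < frames; ++i)
        samples[2 * i + 1] = 0;   // right-channel sample of frame i
}

If conversion forced the file to mono, the two channels would be mixed and the silence would leak into both speakers; hearing it on one side only shows the stereo layout survives conversion and playback.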

Ah, I see where the confusion lies.

 

I assumed the fields were invariable because the IESDP reports the entire 12-byte region as a constant value. It may not be.

To test, you could see whether a mono file, or one at a different sample rate, and so on, still plays.

Okay, that clears things up. I've been saying from the beginning that there is NO WAY that the IESDP is correct here. It is obvious by looking at the files that the "constant value" is far from constant. I personally have tested both mono and stereo files, and files with a very wide range of sample rates (in BG2: ToB). All play fine in the game. DLTCEP always uses the "constant value" specified in the IESDP when converting WAV files to WAVCs. The WAV to WAVC conversion done by my utility (PS gui) is better than DLTCEP's because it writes the actual values (for the number of channels, sample rate, etc.) to the header.
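
To make the "far from constant" point concrete, here is a sketch that builds the bytes at 0x10-0x1B for given parameters (little-endian; the function name is mine, and the word at 0x1A is a placeholder since its meaning is unknown):

#include <cstdint>
#include <cstdio>

// Emit the 12 bytes at 0x10..0x1B for the given sound parameters.
void PrintHeaderTail(uint16_t channels, uint16_t bits, uint16_t rate)
{
    const uint32_t acmOffset = 0x1C;
    const uint16_t word1A    = 0x777E;   // placeholder; real files vary here
    const uint8_t b[12] = {
        uint8_t(acmOffset), uint8_t(acmOffset >> 8),
        uint8_t(acmOffset >> 16), uint8_t(acmOffset >> 24),
        uint8_t(channels), uint8_t(channels >> 8),
        uint8_t(bits),     uint8_t(bits >> 8),
        uint8_t(rate),     uint8_t(rate >> 8),
        uint8_t(word1A),   uint8_t(word1A >> 8),
    };
    for (uint8_t x : b) std::printf("%02x", static_cast<unsigned>(x));
    std::printf("\n");
}

PrintHeaderTail(1, 16, 22050) reproduces the first ten bytes of the IESDP "constant" (1c 00 00 00 01 00 10 00 22 56), while PrintHeaderTail(2, 16, 44100) yields 1c 00 00 00 02 00 10 00 44 ac instead: a stereo 44100 Hz file simply cannot carry the "constant".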

 

It would be really great if you could figure out what the two-byte word at 0x001a actually is. The values are in fact dynamic, but when I manually change them in a file and play it in the game, I don't hear any difference. I suspect it has something to do with the compression of the sound data (which is what the "Levels" relate to), but I'm not sure.

 

 

Edit:

 

Starting here, in posts #19 through #25, some relevant things are mentioned. Concerning the "Levels", here is an excerpt from the source of ABel's Sound to ACM converter:

II. -l levels. Determines the count of levels in the transform scheme. The more this value is, the more efficient redundancy elimination can be performed and the greater compression can be achieved. But each level is very computationally expensive, so the more levels specified, the longer the compression (and the decompression as well) takes.

The Interplay ACM files use a 7- or 8-level transform, so the default "levels" value is 7.

The "levels" parameter can be zero; in this case no transformation is made at all. The output coefficients are then equal to the input signal samples, and lossless (though not very efficient) compression can be made if no quantization is applied (-q0 parameter, see below).

It would be really great if you could figure out what the two-byte word at 0x001a actually is. The values are in fact dynamic, but when I manually change them in a file and play it in the game, I don't hear any difference. I suspect it has something to do with the compression of the sound data (which is what the "Levels" relate to), but I'm not sure.
Considering that this field isn't even loaded into the CResWav class, I would say it is redundant. Perhaps it meant something when BioWare compressed the ACM data, but not anymore.