How to do USB audio
Colin (478) 2433 posts |
Re: Task window performance on newer machines – while newer machines are faster, they would also have to play and load the buffer via USB. That will be interesting.
Developing ROSS is a waste of time. We need the ROSS API and a new API available at the same time:
1) You can't make legacy programs choose a device to use, so the only thing you can do with ROSS is add the ability to configure the device it uses outside of programs. Can we agree?
2) New programs, or programs still being developed, can use a completely different API.
3) A completely different API can be used by the existing ROSS to do the device → ROSS connection.
So you have
Anyway, that's the theory :-) |
jim lesurf (2082) 1438 posts |
Yes. :-) I'm happy to agree with the approach you describe. It would suit me fine! However, I wonder what the ROOL developers think of it.

To me it looks like the main code for SoundControl and SoundDMA will need to change to suit. e.g. at present the endpoint of the existing ROSS is to throw the data stream through SoundDMA and regard that as a doorway to the HAL/hardware beyond. Here that is changed to go via the new API.

In my view we need changes like this anyway. e.g. at present we have the awkward situation where the ARMiniX hardware is inherently 24bit but the existing ROSS is limited to 16bit, so we have to bodge that up in the HAL. Ditto for situations like having hardware which currently runs at high rates, requiring upsampling, which the existing ROSS, etc, only caters for by fairly crude (in audio terms) linear interpolation of 16bit values.

Jim |
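For illustration, a minimal sketch (not the actual ROSS or SoundDMA code) of what the "crude linear interpolation of 16bit values" mentioned above amounts to: upsampling a mono 16-bit stream by an integer factor by drawing straight lines between neighbouring samples. The function name and the integer-factor restriction are assumptions made for brevity.

```c
#include <stddef.h>
#include <stdint.h>

/* Upsample a mono 16-bit stream by an integer factor using linear
 * interpolation.  "out" must hold (in_len - 1) * factor + 1 samples.
 * The detail between original samples is simply a straight line,
 * which is why this is crude in audio terms.                        */
static size_t upsample_linear_16(const int16_t *in, size_t in_len,
                                 int16_t *out, unsigned factor)
{
    size_t n = 0;
    for (size_t i = 0; i + 1 < in_len; i++) {
        int32_t a = in[i], b = in[i + 1];
        for (unsigned k = 0; k < factor; k++)
            out[n++] = (int16_t)(a + ((b - a) * (int32_t)k) / (int32_t)factor);
    }
    out[n++] = in[in_len - 1];   /* keep the final sample */
    return n;
}
```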
Jeffrey Lee (213) 6048 posts |
No objections from me. |
jim lesurf (2082) 1438 posts |
I've now put up a new version of the RO app !WAV_Cleaner. The program is meant to generate a 'cleaned' copy of a Wave file with just the basic 44-byte LPCM header. The old version failed to change the 'AudioFormat' value to set it to '1'. The new version should do this. I'd be grateful if someone could check and confirm I've now done this OK. It can be downloaded from http://www.audiomisc.co.uk/software/index.html

I'd thought that I'd simply missed doing this. But looking at the code and thinking back, I now recall I was confused by the inconsistencies I found at the time in wave files and documents on this. The 'extended' values in files seem to be used quite often as soon as the content goes above 48k sample rate or 16bits per sample. And in general players, etc, seem to ignore this anyway! So I suspect it may be best in general to have programs accept both the 'extended' and 'ordinary' LPCM values in common use.

That said, the new version should now clean up LPCM wave files that use additional header metadata chunks and give the basic header.

Jim |
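As a reference for anyone checking the output, here is a sketch (not WAV_Cleaner's actual code) of the basic 44-byte canonical LPCM WAV header being discussed, with a check that accepts both the 'ordinary' AudioFormat value 1 and the 'extended' (WAVE_FORMAT_EXTENSIBLE, 0xFFFE) marker that players tend to tolerate. The struct and function names are illustrative only.

```c
#include <stdint.h>
#include <string.h>

#pragma pack(push, 1)
typedef struct {
    char     riff[4];        /* "RIFF"                                 */
    uint32_t riff_size;      /* file size - 8                          */
    char     wave[4];        /* "WAVE"                                 */
    char     fmt[4];         /* "fmt "                                 */
    uint32_t fmt_size;       /* 16 for plain PCM                       */
    uint16_t audio_format;   /* 1 = LPCM, 0xFFFE = extensible          */
    uint16_t num_channels;
    uint32_t sample_rate;
    uint32_t byte_rate;      /* sample_rate * num_channels * bits / 8  */
    uint16_t block_align;    /* num_channels * bits / 8                */
    uint16_t bits_per_sample;
    char     data[4];        /* "data"                                 */
    uint32_t data_size;
} WavHeader44;               /* exactly 44 bytes when packed           */
#pragma pack(pop)

/* Accept both the ordinary and 'extended' PCM markers. */
static int is_lpcm(const WavHeader44 *h)
{
    return memcmp(h->riff, "RIFF", 4) == 0 &&
           memcmp(h->wave, "WAVE", 4) == 0 &&
           (h->audio_format == 1 || h->audio_format == 0xFFFE);
}
```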
jim lesurf (2082) 1438 posts |
Excellent. :-) Jim |
Dave Higton (1515) 3526 posts |
My point is about layering; any higher API should make use of the USB Audio API and any similar lower level API. |
jim lesurf (2082) 1438 posts |
I just tried this and got an error that "USBAudio_OpenOut" failed. Accepting that error box, I then get "division by zero at line 2210". I tried it with various wave files of different rates, some 16bit, some 24bit. I also had to specifically double-click !Boot before the app could find the RunImage file.

I used it with the Halide Bridge, as I've been focussing on this with recent tests of Colin's modules and test prog. They worked OK. The window field for the name of the dragged file showed its name, but the SR and other fields stayed blank. Was I doing something wrong?

Jim |
Dave Higton (1515) 3526 posts |
It's difficult to know at this range. I tested the app and module last night on two pairs of speakers, an iMic, two headset adaptors and a (very silly) mouse/handset. All of them worked, in replay and record as applicable.

There is an outstanding question of what to do when the resolution of the replay source material and destination device don't match exactly. The example a few posts back in this thread was playing back 16 bit source material through a 24 bit DAC. That requires some intermediate processing to pad the samples out. Another interesting one is 24 bit source through a 20 bit DAC (like the iMic), where the resolutions differ but the sample lengths are the same. Right now the module would fail that, but I need to put in more intelligent tests.

If OpenOut failed, it's probably because there wasn't an exact match between source and destination. I need to put a small test app together to extract more useful info from a device. Later this evening, all being well. |
Dave Higton (1515) 3526 posts |
All you need to do is pad out all the samples with the same value. The value itself makes no difference (it’s analogous to a DC shift or a change in barometric pressure). The value must be the same for both signs of sample. It’s really simple. |
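A minimal sketch of the padding Dave describes: each 16-bit sample becomes the top two bytes of a packed little-endian 24-bit sample, with a constant low byte appended (zero here, matching the suggestion below to pad the zero value with zero). The function name is illustrative.

```c
#include <stddef.h>
#include <stdint.h>

/* Widen 16-bit samples to packed little-endian 24-bit samples.
 * The same pad byte is used for every sample, so the worst case
 * is a tiny, constant DC offset.                                */
static void pad_16_to_24(const int16_t *in, size_t count, uint8_t *out)
{
    for (size_t i = 0; i < count; i++) {
        uint16_t s = (uint16_t)in[i];   /* keep the bit pattern unchanged */
        *out++ = 0x00;                  /* new least significant byte     */
        *out++ = (uint8_t)(s & 0xFF);
        *out++ = (uint8_t)(s >> 8);
    }
}
```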
jim lesurf (2082) 1438 posts |
The Halide Bridge is like most of the USB DACs I use which expect 24bit values and – so far as I can tell – can’t accept 16bit. The only exception I know I have to this is an ‘original’ DAC Magic which only accepts 16bit and is limited to the lower rates. Jim |
Colin (478) 2433 posts |
That's not an API, it's a free-for-all tidied up by a higher level API. If the source for your module was not made available, for example, you would need another module to bridge between your module and the new API module so that you could plug your code into the new API module. If the code for your module was available you could avoid a bridge module by modifying your module directly.

From the application programmer's point of view they don't want to care whether it's a USB device or a motherboard device, so that makes any SWIs you have which do not conform to the new Audio DPI (Device Programmers' Interface – just made that acronym up :-)) specification irrelevant.

There are 2 APIs in a 'generic' interface: one used by application programmers and another used by device programmers. USB works the same way. USBDriver has an API that you use to create USBAudio, but it has another API that you use to write controller driver modules like EHCIDriver. An Audio module would be the equivalent of USBDriver, and USBAudio would be the equivalent of EHCIDriver.

We could well end up with your USBAudio API being the same as the new Audio API, but that is an API for the programmer, not for the plug-in modules of which USBAudio is one. It is useful at this stage for your USBAudio to prototype an API to get a feel for any problems, but when we have it about right the API would be moved to a separate module and your module changed to plug into it. It may be useful to call your module Audio at this stage, then when USB is removed from it into a separate module later any prototyping programs will still work.

That's how I see it anyway. |
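To make the two-sided split concrete, here is a purely hypothetical C header sketching it: one interface for application programmers, and a separate "DPI" that device modules (USB, motherboard, and so on) would register themselves through. Every name below is invented for illustration and is not an existing RISC OS API.

```c
#ifndef AUDIO_API_SKETCH_H
#define AUDIO_API_SKETCH_H

#include <stddef.h>

/* Application programmer's side: callers don't care what the device is. */
typedef struct audio_stream audio_stream;            /* opaque handle */
audio_stream *audio_open(const char *device, unsigned rate,
                         unsigned channels, unsigned bits);
size_t        audio_write(audio_stream *s, const void *data, size_t bytes);
void          audio_close(audio_stream *s);

/* Device programmer's side: what a plug-in such as a USB audio module
 * would implement and register with the central Audio module.         */
typedef struct {
    const char *name;                                 /* e.g. "USB1" */
    int  (*open)  (unsigned rate, unsigned channels, unsigned bits);
    int  (*write) (const void *data, size_t bytes);
    void (*close) (void);
} audio_device_ops;

int audio_register_device(const audio_device_ops *ops);

#endif /* AUDIO_API_SKETCH_H */
```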
Colin (478) 2433 posts |
Jim: all your DACs except the original DACMagic are 24bit only; the original DACMagic is 16bit only. |
Colin (478) 2433 posts |
Dave: have you any devices which have alternate interfaces with different bit rates? I figure that alternates are the only place to change bit rates, i.e. you'd need to choose a different alternate – if that makes sense. |
jim lesurf (2082) 1438 posts |
Agreed. I tend to convince myself of this by remembering two things: A) that 0 is one of the allowed values and is 'all zeros' in both 16bit and 24bit format; B) when turning 16bit into 24bit the values are now all 256 increments apart. So the only distinction I'd add to what you said is to presume the zero value will be padded with a zero byte – hence using this for all the others as well.

But my confusion tends to arise when I start off thinking about negative values like -1 actually being represented by a series of '1' bit values… which is true, but I then have to remind myself of what duly happens when multiplied by 256 to convert. The problem is that I then forget and have to work it out again next time! :-)

Jim |
jim lesurf (2082) 1438 posts |
Yes, that’s as I assumed. So if Dave’s program is sending 16bit that may be why it didn’t work. However I did try playing 24bit material as well, I think. So, Dave, is your program then cutting this to 16bit before sending? Jim |
Dave Higton (1515) 3526 posts |
The UAPlayer app is unaware of the resolution when it comes to transferring bytes. It simply looks at how many bytes are free in the output buffer, and tries to fill the buffer in reasonable-sized chunks. Look at PROCdo_iso (a sketch of that loop follows after this post). Opening a stream, on the other hand, is absolutely sensitive to resolution; perhaps excessively so.
Not at all; it assumes the source and destination sample sizes are identical and makes no change to the data. |
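A hypothetical C rendering of the fill loop described above (the real app is BBC BASIC; see PROCdo_iso): check how much space the output buffer has free and top it up in reasonable-sized chunks, without ever looking at the sample resolution. The functions usbaudio_buffer_free() and usbaudio_write() are invented names, not the module's real SWI interface.

```c
#include <stddef.h>
#include <stdint.h>

#define CHUNK_BYTES 4096   /* a "reasonable-sized" chunk */

extern size_t usbaudio_buffer_free(void);                   /* hypothetical */
extern void   usbaudio_write(const uint8_t *data, size_t n);/* hypothetical */

/* Queue as much of the source as the output buffer will take right now. */
static size_t fill_output(const uint8_t *src, size_t remaining)
{
    size_t sent  = 0;
    size_t space = usbaudio_buffer_free();

    while (space >= CHUNK_BYTES && remaining > 0) {
        size_t n = remaining < CHUNK_BYTES ? remaining : CHUNK_BYTES;
        usbaudio_write(src + sent, n);   /* assume the module queues all n */
        sent      += n;
        remaining -= n;
        space     -= n;
    }
    return sent;   /* bytes queued this time round */
}
```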
Dave Higton (1515) 3526 posts |
One other thing: if you’re using devices that conform to the USB Device Class Definition for Audio Devices Release 2.0, I really have no idea whether the USBAudio module will work with them. I only have Release 1.0 conformant devices to test with. |
Dave Higton (1515) 3526 posts |
If you’re having trouble with my app and module, can you please do the following:
Thanks! |
jim lesurf (2082) 1438 posts |
All the USB DACs I have are set to Class 1, or are Class 1 only. The Halide Bridge I’m using for the Iyonix at present is Class 1. I’ll do the test later this morning and report by email. Jim |
Dave Higton (1515) 3526 posts |
I’ve updated the archive this evening with a new module that I hope will solve the issue seen by Colin and Jim, also priv-mailed to them. |
Dave Higton (1515) 3526 posts |
Regardless of the problems we're still having…

How should we handle endpoints that have a synchronising endpoint? Jim and Colin both have DACs with a single isochronous OUT endpoint for audio, and a single isochronous IN endpoint for synchronisation, with a packet length of 3. At least I can say that 3 is the packet length I expected.

I don't profess to understand synchronisation yet. I hope I will do, to an adequate level, before too long. In principle, though, synchronisation seems to aim to modify the packet length to suit the clock frequency of the DAC and/or ADC, which is not coupled to the USB clock frequency. The packet lengths are entirely handled within the USB stack, so it seems to me that the open command for an isochronous audio endpoint has to be given a reference to the synchronisation subsystem; logically, I think, to the stream number. That would seem to me to be enough information to make the link.

What light can anyone else shed on the question of synchronisation? |
Colin (478) 2433 posts |
I've taken care of the synchronisation endpoint in the USBDriver. Its address is read from the asynchronous endpoint's endpoint descriptor. You just need to open the OUT endpoint.

Synchronisation works like this (I'll use decimal for clarity). For a 44100 sample rate you need to send 44.100 samples per millisec. You can't send part samples, so you send 44 and accumulate the error. When the error adds up to 1 you send 45 samples and subtract 1 from the error total.

For an adaptive OUT endpoint the 44.100 is fixed. For an asynchronous OUT endpoint the 44.100 is constantly changed by the synchronisation (feedback) endpoint, so it may become 44.120 or 44.076; this changes the error accumulation and so the frequency with which you send 45 samples.

For audio class 1 the 44.100 is returned by the feedback endpoint as a 10.14 fixed point number – so 3 bytes. For audio class 2 it is a 16.16 fixed point number – so 4 bytes. |
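A minimal C sketch of the per-packet accumulation Colin describes, keeping the rate as 16.16 fixed point. For an adaptive endpoint the rate stays at its nominal value; for an asynchronous endpoint it would be replaced by the value read from the feedback endpoint (10.14 format for audio class 1, hence the conversion helper; 16.16 for class 2). The type and function names are illustrative.

```c
#include <stdint.h>

typedef struct {
    uint32_t rate_fp;   /* samples per 1 ms packet, 16.16 fixed point */
    uint32_t acc_fp;    /* accumulated fraction, 16.16                */
} iso_rate;

/* e.g. 44100 Hz -> 44.1 samples per millisecond in 16.16 form */
static void iso_rate_init(iso_rate *r, uint32_t samples_per_sec)
{
    r->rate_fp = (uint32_t)(((uint64_t)samples_per_sec << 16) / 1000);
    r->acc_fp  = 0;
}

/* Convert a class 1 feedback value (10.14 in 3 bytes) to 16.16. */
static uint32_t feedback_10_14_to_16_16(uint32_t fb)
{
    return fb << 2;
}

/* Whole samples to put in the next packet: usually 44, occasionally 45
 * when the accumulated fraction carries over.                          */
static unsigned samples_this_packet(iso_rate *r)
{
    r->acc_fp += r->rate_fp;
    unsigned n = r->acc_fp >> 16;   /* whole samples now due */
    r->acc_fp &= 0xFFFF;            /* keep the fraction     */
    return n;
}
```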
Dave Higton (1515) 3526 posts |
Wow, you read and parse USB descriptors inside the USB stack? |
Colin (478) 2433 posts |
It's nothing so glamorous :-) When you open an endpoint (pipe) in NetBSD code you have access to the pipe's interface and endpoint descriptors; that's enough to work out the feedback endpoint.

I've been looking at parsing the descriptors to auto-detect a device for my demo – just to get a feel for the problems. It's a bit like patting your head and rubbing your stomach at the same time. I'm reading through the Class 2 spec, which complicates things further. Problems I see so far are:

1) Auto-detection of the first device matching format requirements doesn't work, because I (the user) want the sound to come out of a specific device if I have 2 plugged in. So the user needs to choose a device.

2) Choosing a device. What criteria can the user use to match the device? I've seen units with no device name. More than choosing a device, I'd want to choose the device's output terminal if there was more than 1. I had thought of a list of text identifiers like "device name-terminal type", but as I said the device name may be missing. The best option I thought of was "hex_vendor-hex_product-hex_version-terminal-type", but no-one would recognise it other than by plugging the device in and out and seeing what changed.

3) So having chosen the device, what happens if it doesn't support the required format? You want the nearest better. How do you choose the nearest better, or even the best, from 1 channel, 20bit, 44100 and 2 channel, 16bit, 48000 when you want 1 channel, 20bit, 8000? |
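For point 2, a sketch of how the suggested identifier string might be built: use the product name where one exists, otherwise fall back to the "hex_vendor-hex_product-hex_version-terminal-type" form. The struct and its fields are invented for illustration, not the actual descriptor-parsing code.

```c
#include <stdio.h>
#include <stdint.h>

typedef struct {
    uint16_t    vendor_id;
    uint16_t    product_id;
    uint16_t    version_bcd;
    uint16_t    terminal_type;   /* e.g. 0x0301 = speaker             */
    const char *product_name;    /* may be NULL: some units have none */
} audio_device_info;

/* Build a human-choosable identifier for a device's output terminal. */
static void make_identifier(const audio_device_info *d, char *buf, size_t len)
{
    if (d->product_name && d->product_name[0])
        snprintf(buf, len, "%s-%04x", d->product_name, d->terminal_type);
    else
        snprintf(buf, len, "%04x-%04x-%04x-%04x",
                 d->vendor_id, d->product_id, d->version_bcd,
                 d->terminal_type);
}
```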
jim lesurf (2082) 1438 posts |
The ALSA approach tends to default to the 'first found'. Beyond that you can specify either by the numbering or by name via a script file in a defined location. So you'd either use something like "hw:1,2" to mean the second port of device 1 as found, or "hw:fred" for a device called "fred". Having adopted a similar strategy, the user either accepts the default or needs to put the correct description in the relevant file – which in the case of RO could be used at bootup to create a system variable if that is more convenient. Similarly, you could have a 'scan' routine the user ran (via !Boot?) that showed them the list found and invited them to choose, then saved the appropriate details.

Again, the ALSA approach is that you either send to the device and get a "Format not accepted" sort of error if it isn't native, or you specify that some format(s) are required. e.g. my ALSA files all have a descriptor to specify 24bit to tell the OS this is what the hardware must be given to work. Of course ALSA also allows for "plughw:" 'devices' which are actually extra bits of code that can do various conversions and processes.

OK, I realise we're not going to promptly duplicate that and have it working. :-) But something akin to that seems to be the right direction of travel. It starts with a default 'first found' choice and 'send what you like', but then gives options which can be used to be more specific. I tend to specify the device by number as I only use one USB DAC at a time, but can change DAC. Other people may find 'by name' better.

Jim |
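A sketch (not ALSA code) of parsing the kind of device spec described above: "hw:1,2" selects device 1, port 2; "hw:fred" selects a device named "fred"; an empty spec falls back to the first device found. The struct and function are illustrative only.

```c
#include <stdio.h>
#include <string.h>

typedef struct {
    int  by_name;      /* nonzero if name[] is the selector */
    char name[32];     /* device name, e.g. "fred"          */
    int  device;       /* device number when by_name == 0   */
    int  port;         /* port number, 0 if not given       */
} device_spec;

static void parse_spec(const char *spec, device_spec *out)
{
    memset(out, 0, sizeof *out);
    if (!spec || !*spec)                    /* no spec: first found */
        return;
    if (strncmp(spec, "hw:", 3) == 0)
        spec += 3;
    if (sscanf(spec, "%d,%d", &out->device, &out->port) >= 1)
        return;                             /* numeric "1" or "1,2" */
    out->by_name = 1;                       /* otherwise a name     */
    snprintf(out->name, sizeof out->name, "%s", spec);
}
```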