Discussion:
[fluid-dev] Akai EWI-USB, Raspberry-Pi, and FluidSynth
Ben Gonzales
10 years ago
Permalink
Hi all.

I'm running my EWI-USB through a Raspberry-Pi 2 using FluidSynth. It was
a challenging project, and it now works well.

Here's the web page: http://projects.gonzos.net/ewi-pi/

Interesting features:

- I configure the running FluidSynth using a smartphone accessing the
R-Pi via wifi (no need for a screen/keyboard/buttons)
- The PHP code talks to FluidSynth using the telnet port 9800
- The PHP code talks to the EWI using ALSA commands via a bash script
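
(For the curious: the telnet link is just FluidSynth's command shell listening on a TCP socket, i.e. FluidSynth started with `-s` and `shell.port=9800`. A rough Python sketch of what the PHP side does — the host, port, and example command are illustrative, and `format_command`/`send_fluid_command` are made-up helper names, not part of any library:)

```python
import socket

def format_command(cmd):
    """FluidSynth's shell protocol is line-based plain text."""
    return (cmd.strip() + "\n").encode("ascii")

def send_fluid_command(cmd, host="localhost", port=9800):
    """Send one command to a running FluidSynth server, return its reply."""
    with socket.create_connection((host, port), timeout=2) as s:
        s.sendall(format_command(cmd))
        s.settimeout(0.5)
        chunks = []
        try:
            while data := s.recv(4096):
                chunks.append(data)
        except socket.timeout:
            pass  # no more reply text; FluidSynth doesn't close the socket
    return b"".join(chunks).decode("ascii", errors="replace")

# e.g. switch MIDI channel 0 to General MIDI program 73 (Flute):
# send_fluid_command("prog 0 73")
```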

Ben
laalaa
10 years ago
Permalink
Your project inspires me a lot. I am also making a similar project with similar tools, but talking to a digital piano and using a somewhat more powerful Intel Compute Stick.

> On 31 Oct 2015, at 5:00 AM, Ben Gonzales <***@gonzos.net> wrote:
> _______________________________________________
> fluid-dev mailing list
> fluid-***@nongnu.org
> https://lists.nongnu.org/mailman/listinfo/fluid-dev
Marcus Weseloh
10 years ago
Permalink
Hi Ben,

Very interesting project, thanks for sharing! I'm also working on a
(commercial) project with Fluidsynth on ARM hardware, but I'm using an
Allwinner A20 SOM board. I'm producing it commercially, because I'm
also developing the controller hardware (the instrument itself, with
all the keys etc). But the whole software stack will be released as
open-source. More details on http://www.midigurdy.com

Amazing that you get a good latency response with a stock Raspbian
kernel. I'm using a preempt-rt enabled kernel with hand-optimized IRQ
priorities and that gives me a latency (from key press to start of
sound) of about 12-15ms, which is acceptable. But then again, my
system not only runs FluidSynth to produce the sound but also handles
all the sensor inputs, modelling of sensor values to MIDI events and
other stuff.
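
For reference, the audio-buffer part of a figure like that follows directly from the period size, period count, and sample rate; a quick sketch in Python (the numbers below are illustrative defaults, not measurements from this system, and exclude MIDI, sensor, and DAC delays):

```python
def buffer_latency_ms(period_size, n_periods, sample_rate):
    """Worst-case buffering delay: frames queued ahead of playback."""
    return 1000.0 * period_size * n_periods / sample_rate

# Illustrative: 64-frame periods x 16 periods at 44100 Hz
print(round(buffer_latency_ms(64, 16, 44100), 1))  # -> 23.2
```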

Have you made any measurements of the actual latency?

Cheers,

Marcus

2015-10-30 22:00 GMT+01:00 Ben Gonzales <***@gonzos.net>:
Ben Gonzales
10 years ago
Permalink
Hi Marcus.

Gee, that looks like fun! I hope you get a great response. You've
obviously put a lot of work into the project.

I haven't actually measured the latency. All I know is that the sound
comes out without enough delay to annoy me! I take it you'd need an
oscilloscope or similar to measure it?

The R-Pi is a model 2, so it's more powerful than the original Pi, and
since I play only one note at a time, I guess that keeps the load down.
There's no GUI, of course. The system load is usually <15%.


Ben

On 31/10/15 22:16, Marcus Weseloh wrote:
Peter Billam
10 years ago
Permalink
Marcus Weseloh wrote:

> I'm also working on a (commercial) project with Fluidsynth on ARM
> hardware, but I'm using an Allwinner A20 SOM board. I'm producing it
> commercially, because I'm also developing the controller hardware
> (the instrument itself, all the keys etc). But the whole software stack
> will be released as open-source. Details on http://www.midigurdy.com

Another great project :-)

There are more niche ARM projects I can think of, like an affordable
30-note (or 32-note) organ-style pedalboard; or a midi-pedal-board
with 6..8 pedals (switches and potentiometers) and a neat UI (web-page?)
to set them to particular channels and controllers (no fluidsynth
needed for that one :-( ); or plain old midi keyboards but built for
stackability (i.e. the lower keyboards have a flat unpopulated top-panel
and the upper keyboards a front-undercut of matching size, so there's
no wasted space between the keyboards).

At the OSDC conference https://2015.osdc.com.au/
I've just given a talk http://www.pjb.com.au/midi/osdc/index.html
which mentions (towards the end) exactly this stuff.
http://www.pjb.com.au/midi/osdc/index.html#80 and onwards
Also:
> I'm using a preempt-rt enabled kernel with hand-optimized IRQ
> priorities and that gives me a latency (from key press to start
> of sound) of about 12-15ms, which is acceptable.
On an ARM: yay! well done. At
http://www.pjb.com.au/midi/osdc/index.html#03
I reckon
About 10 milliseconds latency is acceptable. Of the linux synths,
TiMidity doesn't meet this; fluidsynth maybe just meets it,
on a fast CPU.

All the best with midigurdy.

Peter Billam

http://www.pjb.com.au ***@pjb.com.au (03) 6278 9410
"Follow the charge, not the particle." -- Richard Feynman
from The Theory of Positrons, Physical Review, 1949
Ben Gonzales
10 years ago
Permalink
Hi all.

How do you measure latency with a wind controller? I decided to try the
"record the actual sound and analyse" approach.

I tried using a mic next to the mouthpiece to record my "pfft" (leaking
out the side), and the synth-ed sound that followed. It was difficult to
distinguish the sounds. I then tried tapping with my free hand on the
recording PC's mic in sync with my "pfft", i.e. trying to tap and blow
at precisely the same time. After a bit of practice I measured about
80ms delay.

Then I got a real recorder and a clarinet and did the same "tap and
blow". I got about 40ms for that.

I know it's not very scientific, but I concluded that the traditional
instruments have a delay before the sound is produced (maybe 40ms), and
my synth increases that delay by about another 40ms.

When I play the EWI, however, I can't say that I notice a delay. Maybe
it is because my brain is used to the inherent delay in the traditional
instruments???

So, some questions:

1. How does one measure latency for a wind controller?
2. Does anyone have a HOWTO for a low latency implementation on a R-Pi?

Ben

On 01/11/15 10:52, Peter Billam wrote:
Marcus Weseloh
10 years ago
Permalink
Hi Ben,

2015-11-14 21:19 GMT+01:00 Ben Gonzales <***@gonzos.net>:
> 1. How does one measure latency for a wind controller?

An easier and more accurate way to measure the overall system latency
could be to measure it while changing from one note to another:

Use a sound card with a two-channel (i.e. stereo) input. Attach one
channel directly to the output of your RPi system, and the other
channel to a microphone placed very close to the fingers on your wind
controller. The idea is that you tap very hard when pressing a key on
your controller, so the tap is picked up by the microphone. If you
record the stereo signal and then measure the time between the tap and
the note changing, you get a fairly good idea of the latency in your
system. It might be good to use an interval with very different
harmonics, so that you can easily see the change in frequency (in
Audacity you could use the "Spectrogram" view).
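
If the stereo recording is saved to a file, the tap-to-note gap can also be pulled out programmatically rather than by eye. A rough onset-detection sketch in Python (the function names, threshold, and toy samples are made up for illustration; real recordings need noise-floor tuning):

```python
def first_onset(samples, threshold):
    """Index of the first sample whose absolute value exceeds threshold."""
    for i, s in enumerate(samples):
        if abs(s) >= threshold:
            return i
    return None

def latency_ms(mic_channel, line_channel, sample_rate, threshold=0.1):
    """Tap picked up by the mic vs. note appearing on the direct line-in."""
    tap = first_onset(mic_channel, threshold)
    note = first_onset(line_channel, threshold)
    if tap is None or note is None:
        return None
    return 1000.0 * (note - tap) / sample_rate

# Toy example at 44.1 kHz: tap at sample 1000, note at sample 1662
mic = [0.0] * 1000 + [0.5] * 10
line = [0.0] * 1662 + [0.5] * 10
print(round(latency_ms(mic, line, 44100), 1))  # -> 15.0
```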

If you want to measure system internal latencies, for example the ALSA
latency introduced by your buffer and period size, you could use the
"latency" tool from the alsa-utils.

> 2. Does anyone have a HOWTO for a low latency implementation on a R-Pi?

For more information about low latency audio on the RPi, maybe this
page will help: http://wiki.linuxaudio.org/wiki/raspberrypi

If you set up a Preempt-RT kernel, then tuning the IRQ priorities
takes some time and experimentation. This page helped me a lot (from
archive.org, as the main site is currently down):
https://web.archive.org/web/20150223092644/http://subversion.ffado.org/wiki/IrqPriorities

Best regards,

Marcus
laalaa
10 years ago
Permalink
Hi,

By definition, latency is the time difference between "action begin" and "sound begin". In this case, "action begin" is "begin blowing" or the clap of a hand.

As a result, a precise way to measure the latency of a wind controller would be:

1. You need a stereo recorder, with one channel (mic) close to the mouthpiece and another channel (mic) close to the speaker.
2. When you begin blowing, one mic records the "very little wind" noise.
3. Some time later, the other mic records the speaker output.
4. Use an audio editing tool to read the latency off the waveforms.

This is what I attempted to do with my digital piano (recording the keyboard instead of the mouthpiece, of course) and the FluidSynth output, though without success, since I only have an iPhone, which records in mono.

Alan

Brad Stewart
10 years ago
Permalink
If you are using an EWI 4000, you can trigger the EWI's audio output on
an oscilloscope on one channel, then attach the other channel to your PC
audio output. It should then be very easy to see the latency.
Brad

On 11/14/2015 12:19 PM, Ben Gonzales wrote: