LEGO MIDI Percussion Toolkit

Searching for “MIDI” + “percussion” + “LEGO” is training my search engine cookies. A few weeks ago I found this video of the “Dada Machines” Kickstarter campaign:

Such a great idea! I already knew about commercial MIDI adapters for playing just one drum and about several “maker” or DIY projects using Arduinos and servo motors, but the Dada Machines idea of selling pre-made “modules” makes things much cleaner and more interesting – each module can work as a solenoid or a mallet and can even be adapted to fit LEGO elements with studs, so it really looks “plug and play”.

Now let’s think more about it:

  • a MIDI controller with several outputs is something I have already been doing; a Technic Hub has 4 ports, which looks like a nice start… and it can scale out quite well, as I already found out with the LEGO Bagpiper
  • there are so many ways of mounting LEGO Technic… a few “actuator modules” look very feasible
  • and some of those percussion instruments in the video look really achievable with LEGO parts (yes, that table with marbles, I am looking at you!)

But I was on holidays and didn’t have enough Technic parts with me for a proof of concept (“I knew I should have brought more LEGO!”) so I had to wait until I was back home.

I decided to use a BOOST motor for a solenoid-like module and copied Yoshihito Isogawa’s ‘Reciprocating Mechanism #63’ from his excellent “The LEGO MINDSTORMS EV3 Idea Book”. And since I had 2 of these motors available and also a City Hub…
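The strike itself is just a quick swing of the motor forward and back. Something like this minimal Pybricks sketch of one ‘bang’ (the port, speed and angles here are my assumptions, not the values from the actual build):

from pybricks.pupdevices import Motor
from pybricks.parameters import Port

mallet = Motor(Port.A)

def bang():
    # swing the striker out and immediately back
    mallet.run_angle(1000, 90)    # speed in deg/s, 90 degrees forward
    mallet.run_angle(1000, -90)   # and return to rest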

Then I found myself picking up all kinds of objects at home to see how they sound when “banged” with those solenoids (“boy, this is addictive!”). I went to my #2 kid and took back the IKEA BYGGLEK boxes he was no longer using… nice sound, but for a drum-like sound a mallet actuator is better… and a stronger and faster motor is also better… so the second version of the Percussion Toolkit needed a more powerful hub for all those power bangs… and I also found enough GBC balls to make that percussion table from Dada Machines:

Then I focused a little more on the software side. I was already parsing a MIDI stream and playing each percussion instrument according to the MIDI event I wanted (like “Note On 9 35”, i.e. “Acoustic Bass Drum”, because on the MIDI percussion channel each note is mapped to a specific instrument) but without the remaining instruments most songs I played sounded very strange. So I wanted to split the MIDI stream in two – the percussion instruments would be parsed by the current setup and the remaining instruments would be forwarded to a synth so I could listen to them on a speaker.
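Just to illustrate the parsing idea (this is not the repo code – the note-to-module mapping here is made up), something like this with the ‘mido’ library:

import mido

# MIDI channel 10 (index 9) is the percussion channel: each note number
# is a fixed instrument, e.g. 35 = Acoustic Bass Drum, 38 = Acoustic Snare
PERCUSSION_CHANNEL = 9
MODULES = {35: 'bass drum', 38: 'snare'}    # hypothetical note-to-module map

with mido.open_input() as port:             # default MIDI input port
    for msg in port:
        if (msg.type == 'note_on' and msg.channel == PERCUSSION_CHANNEL
                and msg.velocity > 0 and msg.note in MODULES):
            print('bang:', MODULES[msg.note])   # here: trigger that module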

Turns out this is very easy to achieve in linux with ‘qmidiroute’.

So after some minor adjustments and the addition of a fourth instrument, I now have a 4-instrument 100% LEGO MIDI Percussion Toolkit:

And all the micropython and python and bash scripts are available: https://github.com/JorgePe/randomideas/tree/master/MIDI-Percussion-Kit

As soon as I get more hubs I will try to scale it out (U2’s “Sunday Bloody Sunday” uses 5 percussion instruments, and I now have this crazy idea that a basic bass would sound great with Queen’s “Another One Bites the Dust”).

Then I will try to settle on a few standard modules so this can be adapted easily to other instruments.

Then… I know… the software needs some attention.

LEGO Air Drums

This is a quick holiday project.

I had some LEGO Move Hubs with me, a new Pybricks firmware image and a new version of the ‘pybricksdev’ python library that works with them.

Move Hub has an internal IMU sensor. And batteries. And wireless communication. So I could make a kind of wireless game controller.

But Facebook keeps hitting me with ads for wireless MIDI drumsticks. I can do this with LEGO:

It wasn’t difficult – the Pybricks script is very short, inspired by an Arduino project from Andrea Richetta (‘Air Drum with Arduino Nano 33 IoT‘) – I just calculate the size of the acceleration vector (in fact its square, since the Move Hub doesn’t support floating point numbers, so no complex math functions) and print something out whenever it changes ‘suddenly’:

from pybricks.hubs import MoveHub
from pybricks.tools import wait

ACC_THRESHOLD = 170   # squared acceleration above this starts a stroke
ACC_MARGIN = 50       # falling this much below the peak counts as a hit
TIME = 40             # pause after a hit (ms), to avoid double triggers
KEY = '1'

hub = MoveHub()
max_v = 0

while True:
    x, y, z = hub.imu.acceleration()
    # squared size of the acceleration vector (no floats on the Move Hub,
    # so no square root)
    v = x*x + y*y + z*z

    if v > ACC_THRESHOLD:
        max_v = v
    elif v < max_v - ACC_MARGIN:
        print(KEY)
        max_v = 0
        wait(TIME)

Then I just used the ‘pybricksdev’ library in a Python script on my laptop to capture the output of the Move Hub and redirect it with ‘xdotool’ to a MIDI controller (VMPK) that generates MIDI events usable by Hydrogen (a MIDI drum machine).
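The laptop side boils down to reading the lines the hub prints and turning each one into a keystroke. A rough sketch, assuming the hub script is launched with the ‘pybricksdev run ble’ command so its print() output arrives on standard output (the real script differs):

import subprocess

# run the air-drum script on the Move Hub; its print() output is echoed here
# (the script name is just an example)
proc = subprocess.Popen(['pybricksdev', 'run', 'ble', 'airdrum.py'],
                        stdout=subprocess.PIPE, text=True)

for line in proc.stdout:
    if line.strip() == '1':
        # type '1' into the focused window (VMPK), which maps it to a note
        subprocess.run(['xdotool', 'key', '1'])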

I could have used something in Python to directly generate the MIDI events instead of using VMPK… but I’m a lazy guy on holidays (or call it ‘Agile’ 😀 )

So I finally made a kind of tutorial:

All the source code is available on GitHub, as usual.

LEGO ipMIDI


3 weeks since COVID-19 lockdown.

Bored.

Let’s go back to MIDI and the never completed LEGO Laser Harp idea – instead of using MQTT to send codes from the EV3 to my laptop and converting them there to MIDI… how can I send pure MIDI?

More than 2 years have passed. Python MIDI libraries are better. ev3dev is better. I am bored. Let’s search.

TouchDAW supports two network MIDI protocols: multicast (ipMIDI) and RTP-MIDI. The first one is not exactly a MIDI standard, although lots of products seem to support it. But RTP-MIDI is, so I’ll try it first.

Found David Moreno’s rtpmidi. It installs a daemon on my Ubuntu laptop. I can use TouchDAW to play music on my laptop through it. Very nice!

But it is not available for ev3dev so I would have to build it… I’m always afraid of that. So maybe there is a python library that can send MIDI notes through rtpmidi? Found none 🙁

OK, ipMIDI then.

‘qmidinet’ on my laptop works. Had to disable jack and sometimes notes get stuck (usually the last note) but it works.

It also works on ev3dev (without GUI, of course):

qmidinet -g -j off

And I can play a MIDI file directly to the laptop:

aplaymidi --port 128:0 happy_birthday.midi

Playing heavy MIDI files seemed to stress the EV3 (or maybe just the Wi-Fi) but small files worked very well – and a single MIDI instrument like my Laser Harp will generate just a few notes per second (at best).

While I was fiddling with different MIDI files I decided to try ‘multimidicast’, the Linux father of ‘qmidinet’. It also works, I just needed to compile it from source (not as slow as I expected). Since it is command-line only and doesn’t support jack, it uses fewer resources, so I’ll use it instead of qmidinet.

I can send a full MIDI file… but what I really need is to send notes, in real time. So I need to generate those notes in python.

‘python-rtmidi’ seems to be the best choice and I remember trying it with the Harp when I was playing MIDI locally on the EV3 (running timidity as a soft synth). At that time, I managed to install ‘python-rtmidi’ and also ‘mido’, which uses it as a backend.

But this time I couldn’t install it.

There is a ‘python3-rtmidi’ package for armel but not for stretch. I tried the buster package but it requires Python 3.7 (at this moment, ev3dev includes 3.5.3).
So I tried ‘pip’… and after a while I lost network connectivity.
Then I tried downloading the source code and installing it… and after a while I also lost the network. I even tried installing it without jack support to make it a bit lighter… same thing.

Argh!

So I needed a plan B: playing notes without a python library. I found a post on a Raspberry Pi forum where someone played notes from Ruby using system calls to ‘amidi’:

amidi -p hw:1,0 -S "90 3C 7F"
amidi -p hw:1,0 -S "90 3C 00"

This turns a ‘C’ on and then off on MIDI channel #0 (a Note On with velocity 0 works as a Note Off). But it only works with sound cards, not with MIDI connections.

So… there is a kernel module, ‘snd-virmidi’, that creates a virtual MIDI sound card… and it is available in ev3dev!!

sudo modprobe snd-virmidi

This creates 4 MIDI clients:

client 20: 'Virtual Raw MIDI 1-0' [type=kernel,card=1]
    0 'VirMIDI 1-0     '
client 21: 'Virtual Raw MIDI 1-1' [type=kernel,card=1]
    0 'VirMIDI 1-1     '
client 22: 'Virtual Raw MIDI 1-2' [type=kernel,card=1]
    0 'VirMIDI 1-2     '
client 23: 'Virtual Raw MIDI 1-3' [type=kernel,card=1]
    0 'VirMIDI 1-3     '

so I connect one of them to multimidicast:

aconnect 20:0 128:0

and now amidi works!

From python I can now play a ‘C’:

import os
import time

os.system('amidi -p hw:1,0 -S "90 3C 7F"')
time.sleep(0.1)
os.system('amidi -p hw:1,0 -S "90 3C 00"')

Even better: I can do it also from MicroPython because ‘os.system’ is available there too, including in the new ‘pybricks-micropython’ library.

Now I need to learn a few hexadecimal MIDI codes and prepare a Raspberry Pi to work as an ipMIDI synth to free my laptop from that role.
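Those hexadecimal codes are just the raw MIDI bytes: the status byte is 0x90 (Note On) plus the channel number, followed by the note number and the velocity. A tiny helper (the function names are mine) makes them readable:

import os

def note_on(note, velocity=0x7F, channel=0):
    # note_on(0x3C) -> '90 3C 7F': middle C, full velocity, channel 0
    return '{:02X} {:02X} {:02X}'.format(0x90 | channel, note, velocity)

def note_off(note, channel=0):
    # a Note On with velocity 0 is commonly used as a Note Off
    return note_on(note, 0, channel)

os.system('amidi -p hw:1,0 -S "{}"'.format(note_on(0x3C)))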

LEGO laser harp – part II


About 10 years ago I gave my wife an M-Audio USB MIDI keyboard and installed Ubuntu Studio on a computer so she could play some piano at home. She was so amazed with the possibility of generating music sheets while playing that she almost accepted the idea of using Linux… almost 🙂

I remember that at that time I used timidity++ as a software MIDI synthesizer and tuned ALSA (one of the several Linux sound systems, perhaps the most widely used) and the preemptive kernel to work almost flawlessly with the Creative Labs sound card. My wife didn’t enjoy the KDE experience, Firefox was OK for her but OpenOffice was terrible with the spreadsheets she used and finally, when our first kid was born, she attended some English lessons at Wall Street Institute and we found out that the online lessons required an odd combination of an old version of Java, ActiveX and IE… so she returned to Windows XP and never looked back.

10 years is a LOT of time in computer systems but ALSA is still around, even on ev3dev. So I installed timidity++ and tried to play a MIDI file… only to find that MIDI requires an ALSA module that is not currently available in the ev3dev kernel.

I googled for alternatives and found fluidsynth, with an unexpected bonus: there is a quite interesting python library, mingus, that works with fluidsynth. So I installed it on my Ubuntu laptop and in a few minutes I was playing harp – amazing!

sudo apt-get install fluidsynth
sudo easy_install mingus
python
>>> from mingus.midi import fluidsynth
>>> from mingus.containers.note import Note
>>> fluidsynth.init("/usr/share/sounds/sf2/FluidR3_GM.sf2", "alsa")
>>> fluidsynth.set_instrument(1, 46)
>>> fluidsynth.play_Note(Note("C-3"))

In the previous example I just import the fluidsynth and Note parts of the library, initialize fluidsynth to work with ALSA loading the soundfont that comes with it, choose harp (instrument number 46) and play C3.

Well, and polyphony? The correct way is to use a NoteContainer:

from mingus.containers import NoteContainer
fluidsynth.play_NoteContainer(NoteContainer(["B-3", "C-3", "F-3"]))

but the lazy way is… just play several notes in a fast sequence.
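With the pieces already imported, the lazy way would look like this – several play_Note calls with no pause between them:

for name in ["B-3", "C-3", "F-3"]:
    fluidsynth.play_Note(Note(name))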

So, let’s do it on the ev3dev!

Oops, fluidsynth also needs an ALSA module not available in current ev3dev kernel.

I’m not a linux music expert. Not even a linux expert! So after some more googling I gave up and asked for help in the ev3dev GitHub project. And once again David agreed to include ALSA MIDI support in future kernels, so I’ll just wait a bit.

Oh, but I can’t wait…

What if I read the color sensors on ev3dev and play the music on my laptop?

ALSA, once again, supports something like client/server MIDI communication with the “aseqnet” and “aconnect” commands, and some people are already using it with the Raspberry Pi!

Yeah, I should have guessed… “aconnect” requires an ALSA MIDI module that is not available in current ev3dev kernel.

OK, let’s use MQTT: configure my EV3 as a publisher and my Ubuntu laptop as a subscriber and just send some notes as messages.

On the EV3:

sudo apt-get install mosquitto
sudo easy_install paho-mqtt

The publisher script is “harp-mqtt-pub.py”:

#!/usr/bin/env python

from ev3dev.auto import *
from time import sleep
import paho.mqtt.client as mqtt

DELAY = 0.01

# should have an auto-calibrate function
AMB_THRESHOLD = 9

sensor1 = ColorSensor('in1:i2c80:mux1')
sensor1.mode = 'COL-AMBIENT'
sensor2 = ColorSensor('in1:i2c81:mux2')
sensor2.mode = 'COL-AMBIENT'
sensor3 = ColorSensor('in1:i2c82:mux3')
sensor3.mode = 'COL-AMBIENT'
sensor4 = ColorSensor('in2')
sensor4.mode = 'COL-AMBIENT'
sensor5 = ColorSensor('in3')
sensor5.mode = 'COL-AMBIENT'
sensor6 = ColorSensor('in4')
sensor6.mode = 'COL-AMBIENT'

# there is no sensor7 yet, I need another MUX

s1 = 0
s2 = 0
s3 = 0
s4 = 0
s5 = 0
s6 = 0
s7 = 0

client = mqtt.Client()
client.connect("localhost",1883,60)

print 'Running...'

while True:
    key_touched = False
    s1 = sensor1.value(0)
    s2 = sensor2.value(0)
    s3 = sensor3.value(0)
    s4 = sensor4.value(0)
    s5 = sensor5.value(0)
    s6 = sensor6.value(0)
#    s7 = sensor7.value(0)

    if s1 < AMB_THRESHOLD:
        client.publish("topic/Harp", "C-3")
        key_touched=True
    if s2 < AMB_THRESHOLD:
        client.publish("topic/Harp", "D-3")
        key_touched=True
    if s3 < AMB_THRESHOLD:
        client.publish("topic/Harp", "E-3")
        key_touched=True
    if s4 < AMB_THRESHOLD:
        client.publish("topic/Harp", "F-3")
        key_touched=True
    if s5 < AMB_THRESHOLD:
        client.publish("topic/Harp", "G-3")
        key_touched=True
    if s6 < AMB_THRESHOLD:
        client.publish("topic/Harp", "A-3")
        key_touched=True
#    if s7 < AMB_THRESHOLD:
#        client.publish("topic/Harp", "B-3")
#        key_touched=True

    if key_touched:
        sleep(DELAY)

On the Ubuntu laptop side:

sudo easy_install paho-mqtt

The subscriber script is “harp-mqtt-sub.py”:

#!/usr/bin/env python

import paho.mqtt.client as mqtt
from mingus.midi import fluidsynth
from mingus.containers.note import Note

EV3_IP = "192.168.43.35"

SOUNDFONT = 'Concert_Harp.sf2'
INSTRUMENT = 46 # Harp

NOTES = ['C-3','D-3','E-3','F-3','G-3','A-3','B-3']

def on_connect(client, userdata, flags, rc):
    print("Connected with result code "+str(rc))
    client.subscribe("topic/Harp")

def on_message(client, userdata, msg):
    if msg.payload in NOTES:
        print msg.payload
        fluidsynth.play_Note(Note(msg.payload))
    
client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message

client.connect(EV3_IP, 1883, 60)

fluidsynth.init(SOUNDFONT, "alsa")
fluidsynth.set_instrument(1, INSTRUMENT)

client.loop_forever()

And guess what? It works!!! I just love linux and open source!

I will keep waiting for David Lechner to include ALSA MIDI support in the ev3dev kernel. I’m not so sure there is enough horsepower in the EV3 to load a soundfont and play it with acceptable latency, but if I can at least use the MIDI client/server functionality I can drop MQTT.

An interesting possibility that this client/server design allows is to scale my harp easily: with just a second EV3 (2 MUXes each) I can make a 13-string harp with almost no modification to my code.