Yoshimi Pi

This post is part 4 of 6 of LEGO ipMIDI

To free up my laptop from the MIDI synth role, I installed ‘multimidicast’ on a Raspberry Pi 3. While testing Timidity++ and searching the net for a way to select a particular instrument in the MIDI sound bank (I want a harp) I found Yoshimi:

Yoshimi is an algorithmic MIDI software synthesizer for Linux.

It looks good. It can also run without a GUI and even has its own command line console, so I can use my Raspberry Pi in headless mode with sound through the 3.5 mm jack.

sudo apt-get install yoshimi
yoshimi -a -b 1024 -C

And we are running:

Yay! We're up and running :-)
Found 710 instruments in 23 banks
Root 5. Bank set to 5 "Arpeggios"
yoshimi>

After a while I found a way to select an instrument – probably not the proper way but it works:

‘list banks’ shows available banks:

list banks
Banks in Root ID 5
/usr/share/yoshimi/banks
ID 5 Arpeggios
ID 10 Bass
ID 15 Brass
ID 20 Choir_and_Voice
ID 25 Drums
ID 30 Dual
ID 35 Fantasy
ID 40 Guitar
ID 45 Misc
ID 50 Noises
ID 55 Organ
ID 60 Pads
ID 65 Plucked
ID 70 Reed_and_Wind
ID 75 Rhodes
ID 80 Splited
ID 85 Strings
ID 90 Synth
ID 95 SynthPiano
ID 100 The_Mysterious_Bank
ID 105 Will_Godfrey_Collection
ID 110 Will_Godfrey_Companion
ID 115 chip

‘list instrument 110’ shows all instruments in bank 110:

Instruments in Root ID 5, Bank ID 110
/usr/share/yoshimi/banks/Will_Godfrey_Companion
ID 4 Muffled Bells (P)
ID 6 Tinkle Bell (A)
ID 7 Super Ethereal (A)
ID 10 Metal Sweep (A)
ID 11 Slow Steel (AS)
ID 13 Bright Metal (A)
ID 14 Metal Tines (A)
ID 16 Soft Metal (A)
ID 19 Warm Square Swell (A)
ID 21 Bubbles (A)
...
ID 84 Cathedral Pipe Organ (P)
ID 87 Sub Choir (S)
ID 92 Wind Pipes (S)
ID 106 Harpsichord (P)
ID 107 Cathedral Harp (A)
ID 108 Angel Harp (AS)
ID 110 Angel Piano (SP)
ID 112 SciFi Piano (AS)
ID 114 Space Pipes (AS)
...

So there are two harps available. Now some voodoo from the documentation:

yoshimi> set bank 110
yoshimi> s p 1 on
@ Part 1+
Part 1 Enable Value 1.000000
yoshimi> s pr 107
@ Part 1+
Loaded Cathedral Harp to Part 1

I’m not sure exactly what “s p 1 on” does – it is short for “set part 1 on” – but it is a requirement for the next command (“set program 107”).

Now when I play something on the EV3 it sounds like a harp.

Nice!

And even better: I don’t have stuck notes. At least not yet. But I did have some audio problems (the dreaded “ALSA xrun” I remember from when I made an Ubuntu Studio DAW for my wife, aeons ago) while playing with some instruments, so it is important to memorize this command:

yoshimi> st

STop everything. Panic!

To be honest, the xruns occurred mostly with the more eccentric instruments I was testing, with more complex sounds (including reverberation, something that, according to Yoshimi discussion topics, can cause xruns).

And even-even better: wife and kids said responsiveness is much better; they can now play faster.

I am this close to buying a USB MIDI synth 🙂

My first MIDI instrument

This post is part 3 of 6 of LEGO ipMIDI

And here it is, the first working MIDI instrument:

  • 2 MINDSTORMS EV3
  • multimidicast running on each EV3 and on my laptop
  • aMIDIcat running on each EV3 listening from a named pipe
  • pybricks-micropython script running on each EV3, scanning the status of the touch sensors and sending MIDI commands (note ON/OFF) to the named pipe
  • Timidity++ and Rosegarden receiving and playing the MIDI commands
  • Sound Bank: General MIDI by D. Michael McIntyre, Program 89 – Pad 1 (new age)

The source code is available on GitHub. If using more than one EV3 to extend the number of ‘keys’, just define each note here:

# notes associated to each sensor
my_notes = [midi_notes.C4, midi_notes.D4, midi_notes.E4, midi_notes.F4]

so if using 2 EV3s with a total of 8 touch sensors to produce a full octave (a requirement from my wife, who needs two different C’s to be able to play “Happy Birthday”) the second one will have:

# notes associated to each sensor
my_notes = [midi_notes.G4, midi_notes.A4, midi_notes.B4, midi_notes.C5]
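For context, here is a minimal sketch of the loop each EV3 runs – the sensor ports, polling interval and raw note numbers are my assumptions; the real script is on GitHub:

#!/usr/bin/env pybricks-micropython
from pybricks.ev3devices import TouchSensor
from pybricks.parameters import Port
from pybricks.tools import wait

my_notes = [0x3C, 0x3E, 0x40, 0x41]       # C4 D4 E4 F4 as raw MIDI note numbers
sensors = [TouchSensor(p) for p in (Port.S1, Port.S2, Port.S3, Port.S4)]
state = [False] * 4                       # last known state of each sensor

pipe = open("./midipipe", "w")            # aMIDIcat reads MIDI hex from here

while True:
    for i, ts in enumerate(sensors):
        pressed = ts.pressed()
        if pressed != state[i]:           # only react to changes
            velocity = 0x7F if pressed else 0x00
            # note on, channel 0 (velocity 0 acts as note off)
            pipe.write("90{:02X}{:02X}".format(my_notes[i], velocity))
            pipe.flush()
            state[i] = pressed
    wait(10)                              # poll every 10 ms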

While recording the above video with my Android phone I noticed that I got many more stuck notes than when not recording. It looks like having the phone too close to the “instrument” degrades the multicast experience. So I am now reading TouchDAW’s FAQ and network tips and also searching for other clues about multicast MIDI problems over Wi-Fi, as it looks like Wi-Fi routers don’t handle multicast the way wired Ethernet does. So this may help:

  • turning off Bluetooth everywhere [I have my doubts, but…]
  • turning off all unnecessary Wi-Fi devices
  • … or even better, get a dedicated Wi-Fi router
  • … or even better [if it works] get a USB to Ethernet adapter for each EV3 and ditch Wi-Fi

aMIDIcat

This post is part 2 of 6 of LEGO ipMIDI

Using a system call to ‘amidi’ for every message seemed a bit slow. While searching the net I found that ‘mido’ also supports ‘amidi’ as a backend, but the documentation clearly states that making a system call for each message is very heavy.

So I kept searching. Maybe opening ‘amidi’ just once and redirecting commands through a pipe? No, it doesn’t seem to allow that.

But… found aMIDIcat:

It hooks up standard input, and standard output, to the ALSA sequencer.
This makes it easy to pipe data around.

Yes, yes! Ubuntu says ‘amidicat’ is included in ‘sndio-tools’ but after installing this package on ev3dev the command was not found, so I downloaded the source code and compiled it (it is very short and compiles very fast, just use ‘make’).

And it works!

echo "903C7F" | ./amidicat --port 128:0 --hex

Even better: it talks directly to ‘multimidicast’, so there is no need to load the ‘snd-virmidi’ kernel module and connect it to ‘multimidicast’.

So I create a pipe:

mkfifo midipipe
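and point aMIDIcat at it – this is my guess at the exact invocation, reusing the flags from above:

./amidicat --port 128:0 --hex < midipipe &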

and in my python/micropython script I just open the pipe and write to it:

pipe = open("./midipipe", "w")
pipe.write("903C7F")  # note on: channel 0, middle C, velocity 127
pipe.flush()          # flush, or the message may sit in Python's buffer

so no need to use ‘os.system’ at all!

The gain was huge: I can now send several notes in a row and they sound like they were played at the same time (a chord); with system calls to ‘amidi’ the delay between each system call was clearly noticeable.

I still have the problem that sooner or later a note gets stuck and I need to send an ‘all notes off’ (“B0 7B 00”) MIDI command. But for now I can live with it (of course, my wife – the musician that will use the LEGO ipMIDI instrument – will not).
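Through the named pipe, that panic command is just one more write:

pipe.write("B07B00")   # CC#123 'all notes off' on channel 0
pipe.flush()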

LEGO ipMIDI

This post is part 1 of 6 of LEGO ipMIDI

3 weeks into the COVID-19 lockdown.

Bored.

Let’s go back to MIDI and the never completed LEGO Laser Harp idea – instead of using MQTT to send codes from the EV3 to my laptop and converting them there to MIDI… how can I send pure MIDI?

More than 2 years have passed. Python MIDI libraries are better. ev3dev is better. I am bored. Let’s search.

TouchDAW supports two network MIDI protocols: multicast (ipMIDI) and RTP-MIDI. The first one is not exactly a MIDI standard, although lots of products seem to support it. But RTP-MIDI is, so I’ll try it first.

Found David Moreno’s rtpmidi. It installs a daemon on my Ubuntu laptop. I can use TouchDAW to play music on my laptop through it. Very nice!

But it is not available for ev3dev so I would have to build it… I’m always afraid of that. So maybe there is a python library that can send MIDI notes through RTP-MIDI? Found none 🙁

OK, ipMIDI then.

‘qmidinet’ on my laptop works. I had to disable JACK, and sometimes notes get stuck (usually the last one), but it works.

It also works on ev3dev (without GUI, of course):

qmidinet -g -j off

And I can play a MIDI file directly to the laptop:

aplaymidi --port 128:0 happy_birthday.midi

Playing heavy MIDI files seemed to stress the EV3 (or maybe just the Wi-Fi) but small files worked very well – and a single MIDI instrument like my Laser Harp will generate just a few notes per second (at best).

While I was fiddling with different MIDI files I decided to try ‘multimidicast’, the Linux father of ‘qmidinet’. And it also works; I just needed to compile it from source (not as slow as I expected). Since it is command-line only and doesn’t support JACK, it uses fewer resources, so I’ll use it instead of qmidinet.

I can send a full MIDI file… but what I really need is to send notes, in real time. So I need to generate those notes in python.

‘python-rtmidi’ seems to be the best choice and I remember trying it with the Harp when I was playing MIDI locally on the EV3 (running timidity as a soft synth). At that time, I managed to install ‘python-rtmidi’ and also ‘mido’ that uses it as a backend.

But this time I couldn’t install it.

There is a ‘python3-rtmidi’ package for armel but not for stretch. I tried the buster package but it requires Python 3.7 (at this moment, ev3dev includes 3.5.3).
So I tried ‘pip’… and after a while I lost network connectivity.
Then I tried downloading the source code and installing it… and after a while I lost the network again. I even tried installing it without JACK support to make it a bit lighter… same thing.

Argh!

So I needed a plan B: playing notes without a python library. I found a post on a Raspberry Pi forum where someone played notes from Ruby using system calls to ‘amidi’:

amidi -p hw:1,0 -S "90 3C 7F"
amidi -p hw:1,0 -S "90 3C 00"

This turns a ‘C’ on and then off on MIDI channel #0: 0x90 is Note On for channel 0, 0x3C is note 60 (middle C) and 0x7F is velocity 127 – a Note On with velocity 0x00 is treated as Note Off. But ‘amidi’ only works with sound cards, not with MIDI connections.

So… there is a kernel module ‘snd-virmidi’ that creates a virtual MIDI sound card… and it is available in ev3dev!!

sudo modprobe snd-virmidi

This creates 4 MIDI clients:

client 20: 'Virtual Raw MIDI 1-0' [type=kernel,card=1]
    0 'VirMIDI 1-0     '
client 21: 'Virtual Raw MIDI 1-1' [type=kernel,card=1]
    0 'VirMIDI 1-1     '
client 22: 'Virtual Raw MIDI 1-2' [type=kernel,card=1]
    0 'VirMIDI 1-2     '
client 23: 'Virtual Raw MIDI 1-3' [type=kernel,card=1]
    0 'VirMIDI 1-3     '

so I connect one of them (client 20) to multimidicast (client 128):

aconnect 20:0 128:0

and now amidi works!

From python I can now play a ‘C’:

import os, time

os.system('amidi -p hw:1,0 -S "90 3C 7F"')  # note on: middle C, velocity 127
time.sleep(0.1)
os.system('amidi -p hw:1,0 -S "90 3C 00"')  # velocity 0 acts as note off

Even better: I can also do it from MicroPython because ‘os.system’ is available there too, including in the new ‘pybricks-micropython’ environment.

Now I need to learn a few hexadecimal MIDI codes and prepare a Raspberry Pi to work as an ipMIDI synth to free my laptop from that role.

First useful version

This post is part 3 of 3 of barcode

Added a few more lines to my first python script in order to achieve ‘Record and Play’ functionality, like the original LEGO Code Pilot.

Also changed the structure of the dictionary a bit, adding the code to execute to each “barcode item”. So the recording part is just adding “items” to a list, and the playing part is just a loop through the list, picking the code and executing it with the ‘exec’ function (kudos to Daniel Walton).
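A minimal sketch of the idea – the barcodes, labels and actions here are made-up placeholders; the real logic is in ‘codepilot03.py’:

actions = {
    "5600000000014": {"say": "forward", "code": "print('move forward')"},
    "5600000000021": {"say": "reverse", "code": "print('move backward')"},
}
RECORD = "5600000000090"   # hypothetical 'start recording' barcode
PLAY = "5600000000106"     # hypothetical 'play' barcode

recorded = []              # the list of recorded "barcode items"
recording = False

while True:
    ucc = input("Scan: ").strip()
    if ucc == RECORD:
        recorded, recording = [], True
    elif ucc == PLAY:
        recording = False
        for item in recorded:             # play back every step in order
            exec(actions[item]["code"])
    elif recording and ucc in actions:
        recorded.append(ucc)              # recording is just appending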

So ‘codepilot03.py’ is now able to record an undefined number of steps and execute them in a row. And since the barcode scanner is fully plug and play, we can detach and reattach it between runs (and since Wi-Fi is not needed, we can also remove the clumsy USB hub).

Next version will have an extended “language set”.

EV3 Code Pilot

This post is part 2 of 3 of barcode

The first MINDSTORMS programmable brick, the RCX, had several “incarnations”. Most people remember the yellow brick but there were several others, including the Code Pilot, which used light sequences as commands. Those VLL codes could be printed as barcodes:

From ‘http://www.elecbrick.com/lego/’

and the Code Pilot set had a companion sheet with some command codes and also musical notes, so it could be used as a barcode piano.

So let’s try a simple set of instructions:

The barcodes are just EAN-13 codes generated online with an EAN-13 barcode generator. They all start with ‘560’ (the prefix for Portugal-based companies); the next nine digits are just 1/2/…/5 left-padded with zeros, and the last digit is the checksum, generated by the tool.
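For reference, the check digit follows the standard EAN-13 rule – alternating weights of 1 and 3 over the first 12 digits – which is easy to reproduce in python:

def ean13_check_digit(first12):
    # digits at even indexes (from the left) weigh 1, odd indexes weigh 3
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(first12))
    return (10 - total % 10) % 10

print(ean13_check_digit("560000000001"))   # 4, so the full barcode is 5600000000014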

So my first program uses a nested dictionary that pairs each EAN-13 code with an action to be spoken. The program waits for a barcode to be read, speaks the matching action and executes the action we chose:

Source code:

https://github.com/JorgePe/ev3-barcode/blob/master/codepilot01.py

The code sheet I used, and further developments I might make, are also on GitHub.

EV3 barcode scanner

This post is part 1 of 3 of barcode

Should be finishing my EV3 Alexa program… but not in the mood.

So I got a USB barcode scanner in front of me… hmm, I wonder…?

My Ubuntu Linux laptop recognizes it: an HID device, a keyboard. Great, plug and play!

So does my EV3 running ev3dev:

[  538.808906] usb 1-1.2: new full-speed USB device number 6 using ohci-da8xx
[  538.972308] usb 1-1.2: New USB device found, idVendor=04b4, idProduct=0100
[  538.972383] usb 1-1.2: New USB device strings: Mfr=1, Product=2, SerialNumber=0
[  538.972425] usb 1-1.2: Product: USB Virtual PS2 Port    
[  538.972462] usb 1-1.2: Manufacturer: Future            
[  539.040788] input: Future             USB Virtual PS2 Port     as /devices/platform/soc@1c00000/ohci-da8xx/usb1/1-1/1-1.2/1-1.2:1.0/0003:04B4:0100.0001/input/input2
[  539.128526] hid-generic 0003:04B4:0100.0001: input,hidraw0: USB HID v1.00 Keyboard [Future             USB Virtual PS2 Port    ] on usb-ohci-da8xx-1.2/input0

So if this is a keyboard I can read it with input functions:

#! /usr/bin/env python3

# if you are running from ssh session invoke this script with 'brickrun'
from ev3dev2.sound import Sound

sound = Sound()

while True:
    ucc = input('Scan a UCC: ')
    ucc = ucc.lower().strip().replace('\t','').replace('\n','')
    sound.speak(ucc)

Yes, it works. I just need to disable the Brickman interface (the first time I ran it, the ‘keys’ read from the scanner disabled my Wi-Fi connection). I could use print but since the LCD is so small I prefer to use ‘speak’.

Now… what can I do with it?

Of course… identify LEGO sets!

So I found out that there are online databases of barcodes and some even offer webservice APIs. I asked for a free trial key at “https://www.barcodelookup.com/api” and copy&pasted their python demo code… and it just works.
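For reference, a sketch of that kind of lookup – the endpoint, parameters and response fields here are my assumptions; check their demo code for the real ones:

import requests

API_KEY = "your-trial-key"        # hypothetical: the free trial key
barcode = "5702016618327"         # hypothetical LEGO set EAN-13

response = requests.get("https://api.barcodelookup.com/v3/products",
                        params={"barcode": barcode, "key": API_KEY})
print(response.json()["products"][0]["title"])   # the product name, to print or speak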

Code: https://github.com/JorgePe/ev3-barcode/blob/master/barcode02.py