We finally moved to a new home. Not an apartment on the 6th floor with dozens of crazy and noisy neighbors but a real house with a garden and a garage.
I also quit my job and I am taking a break. So I finally have both the time and the opportunity to look into domotics.
I installed OpenHAB on a Raspberry Pi and started adding ‘things’ to get familiar with it. The first was a Nedis dehumidifier I bought a few months after moving – we moved to a place with its own micro-weather and lots of humidity, and my wife and the younger kid have allergy problems, so I’m trying to keep humidity levels low.
Although the dehumidifier has built-in wi-fi we were only using it in manual mode, but since it was easy to integrate with OpenHAB I started with it.
OpenHAB offers ‘bridges’ for several types of devices. This dehumidifier is a Tuya device so I added a Tuya bridge and found that I could read humidity and temperature values even when the dehumidifier was in standby mode (previously, to know the humidity level of the room, I had to turn it on so the value would show on the display – and just humidity, no temperature readings).
So I ordered two Tuya humidity+temperature sensors to monitor our bedroom and the #2 bedroom.
And then I added an older dehumidifier to OpenHAB through an equally old wi-fi power plug. That dehumidifier has a ‘mechanical’ power switch so I left it always ON and control it from the power plug.
And then the third dehumidifier. But this one has a ‘digital’ power switch so you need to press it to turn it ON/OFF. And no built-in wi-fi. So a fingerbot-like gadget was needed. I chose a SwitchBot Bot:
It is a Bluetooth LE (BLE) device, supposed to be used with the SwitchBot Hub, which works as a wi-fi to BLE gateway, but I found a description of its API so I used python to control it and created my own mqtt2switchbot gateway on a Raspberry Pi Zero 2W (using ‘bleak’ and ‘paho-mqtt’), adding a MQTT broker to my OpenHAB server.
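Just to give an idea of the gateway, here is a minimal sketch of the concept (the MQTT topic, the Bot’s address and the broker location are placeholders; the command characteristic and the ‘press’ payload are the ones described in the public SwitchBot BLE API notes):

# minimal MQTT-to-SwitchBot sketch (not the full gateway): press the Bot when a message arrives
import asyncio
from bleak import BleakClient
import paho.mqtt.client as mqtt

BOT_MAC = "XX:XX:XX:XX:XX:XX"                      # the Bot's BLE address (placeholder)
CMD_CHAR = "cba20002-224d-11e6-9fb8-0002a5d5c51b"  # SwitchBot command characteristic (public API notes)
PRESS = bytes.fromhex("570100")                    # 'press' command (public API notes)
TOPIC = "switchbot/bot1/press"                     # hypothetical topic

async def press_bot():
    async with BleakClient(BOT_MAC) as client:
        await client.write_gatt_char(CMD_CHAR, PRESS)

def on_message(client, userdata, msg):
    if msg.topic == TOPIC:
        asyncio.run(press_bot())

client = mqtt.Client()
client.on_message = on_message
client.connect("localhost")    # the MQTT broker added to the OpenHAB server
client.subscribe(TOPIC)
client.loop_forever()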
So I can control Bluetooth LE devices from my domotics central.
And LEGO Hubs are BLE devices.
So the next obvious step was controlling LEGO from my domotics central 😀
Joel is trying to use a Raspberry Pi with a LEGO Toy Pad. He asked me about the ev3dev tutorial I wrote for using the EV3 with the LEGO DIMENSIONS NFC reader.
It was written a long, long time ago in python2 so I quickly adapted it for python3:
It works on my laptop running Ubuntu 22.10 but I have not tested it yet with an RPi.
The most important part is not the script – there are two requirements: the ‘python3-usb’ library and a udev rule to allow python to access the LEGO USB device without root privileges:
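For reference, a udev rule along these lines should do it (the Toy Pad enumerates with USB IDs 0e6f:0241; the file name and the permissive mode are just my choices):

# /etc/udev/rules.d/99-toypad.rules (file name is just a suggestion)
SUBSYSTEM=="usb", ATTRS{idVendor}=="0e6f", ATTRS{idProduct}=="0241", MODE="0666"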
The previous script just flashes a red light once. Meanwhile I’ve been using the Toy Pad as a controller for my robotic band, following my wife’s suggestion of using NFC tags to select the music to play instead of running commands in the shell (my wife is very good at bringing me back to reality – at our LUG events, she is the one who demonstrates my gadgets when I am not present):
So now that I had enough LEGO MIDI robotic instruments I could finally form a band.
So The Mindy Python’s Confined Circus band was born, playing a cover of Queen’s ‘Another One Bites the Dust’ with just two instruments – a one-stringed bass and a 12-bar glockenspiel:
4 Technic Control+ hubs, 16 LEGO motors, 4 BLE sessions… but it needed more.
And I also need more space and time to properly join the band for jam sessions… luckily school is now in person again so my kids’ rooms are available during the day (as I am still working from home).
So the same cover of Queen but now with an extended Percussion Kit (a Robot Inventor hub with 6 ports allowing two more instruments than the previous version):
and also The Clash’s song “Should I Stay or Should I Go”:
So I got back to the glockenspiel, trying to use all the bars I had from PV Productions.
At first I had the idea of making a vertical glockenspiel or lyra:
but I wasn’t happy with the dynamics of the motors in this ‘solenoid-like’ configuration and decided to go with a conventional (horizontal) glockenspiel:
The sound wasn’t great but since I believed I could make it better just by adjusting the metal bars’ configuration, I moved on to the software part, mixing tools like Rosegarden and qmidiroute to find a way to properly map some MIDI songs to the limited number of notes I had:
After the holidays, back home, I got disappointed with the metal bars… I could not get all of them to vibrate as they should. So I ordered a real (but inexpensive) glockenspiel – it would sound as it should, with the bonus of having more notes so I could better map some music files.
So a 12-bar MIDI glockenspiel was now possible:
I’m very happy with this improvement. It is still the only non-100% LEGO instrument in the band but it allows me to play several kinds of music, because most songs have at least one MIDI track that can be played on it (usually a piano or an organ, but sometimes other instruments also map very well to it).
I still think I should extend the range to a 16- or even 20-bar glockenspiel but that will have to wait for the other main instruments in the band to be completed, because each extra hub I add requires an extra Bluetooth LE session and I am not sure how many my laptop can handle (I can add extra BLE dongles but the ‘pybricksdev’ tool doesn’t seem to have a way to specify the HCI device – I guess it assumes there is just one).
Yes, a simple bass would be great. And yes, the metallophone/glockenspiel could be perfected. And yes, most bands use more than 4 percussion instruments in their songs so the Percussion Kit could be extended a bit… and since some MIDI files include lyrics on a metadata track, why not try Text To Speech and have a singer? And what about last year’s funny finding of making music with pneumatic valves?
So The Mindy Python’s Confined Circus band was born.
And also yes, the software part is getting complex day after day. Thank god for open source and search engines.
So after having a basic 4-instrument Percussion Kit sorted out I decided to try a LEGO one-string instrument. Nothing fancy, just the basics: a string and a resonance box:
It worked so I kept improving it… added an arm (neck), a longer string, some gears to adjust the tension… and got something that sounded in the bass range:
Then I reinforced the whole thing so I could play without dismantling it:
At this stage I was using a 300 cm Technic string, folded in two and twisted to increase the weight (weight, tension and length determine the note we get).
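For reference, the fundamental frequency of an ideal string is given by Mersenne’s law, $f_1 = \frac{1}{2L}\sqrt{\frac{T}{\mu}}$, where $L$ is the vibrating length, $T$ the tension and $\mu$ the mass per unit length – so folding and twisting the string increases $\mu$ and lowers the pitch, just like making it longer does.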
I wanted a slightly lower pitch and also a fret board with enough precision to mark 8 to 12 tones, so I decided to lengthen the neck a bit but had to reinforce it a lot because the tension was already quite high. I ended up with a 111 cm fret length:
Another advantage of such a long neck was having enough space to add some motors. So with a Technic Control+ hub I could have 1 motor to strum the string and 3 motors to press the string against the fret board:
I chose E, G, A and C because those were the notes used in Queen’s “Another One Bites the Dust”.
The code shares most of the Percussion Kit’s code, except that for the bass I don’t just map each note to a single motor action – I need to store on the hub the current state of the “fingers” and move only the necessary ones for each new note command. For instance: if the bass is instructed to play ‘C’ it lowers the finger on the ‘C’ fret… if another ‘C’ follows there is no need to move the finger, but if a lower pitch note follows (like ‘E’) it needs to move the ‘C’ finger up and the ‘E’ finger down, wait a bit for the fingers to reach position and then strum the string.
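Just to illustrate the idea (not the actual code – ports, speeds, angles and timings below are assumptions), the hub-side logic is roughly this:

# rough sketch: one motor per fret finger plus one motor to strum
from pybricks.pupdevices import Motor
from pybricks.parameters import Port
from pybricks.tools import wait

strummer = Motor(Port.A)
fingers = {'G': Motor(Port.B), 'A': Motor(Port.C), 'C': Motor(Port.D)}  # assuming E is the open string

DOWN, UP = -45, 0    # finger angles (assumed)
current = None       # note currently fretted; None = open string

def play(note):
    global current
    if note != current:                            # only move fingers when the note changes
        if current in fingers:
            fingers[current].run_target(500, UP)   # lift the previous finger
        if note in fingers:
            fingers[note].run_target(500, DOWN)    # press the new fret
        wait(150)                                  # let the fingers settle (assumed value)
        current = note
    strummer.run_angle(800, 180)                   # pluck the string (assumed motion)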
With this robotic bass I am still facing some issues while strumming the string… it doesn’t sound the same as when using my fingers… and since I don’t know how to play a bass, that says a lot. I already tried hammering it instead but no luck… I will probably need to find a mechanical design that moves the pick in a more complex way.
Searching for “MIDI” + “percussion” + “LEGO” is training my search engine cookies. A few weeks ago I found this video of the “Dada Machines” Kickstarter campaign:
Such a great idea! I already knew about commercial MIDI adapters for playing just one drum and several “maker” or DIY projects using Arduinos and servo motors, but Dada Machines’ idea of selling pre-made “modules” makes things much more “clean” and interesting – each module can work as a solenoid or a mallet and can even be adapted to fit LEGO elements with studs, so it really looks “plug and play”.
Now let’s think more about it:
a MIDI controller with several outputs is something I have already been doing – a Technic Hub has 4 ports, which looks like a nice start… and it can scale out quite well, as I already found out with the LEGO Bagpiper
there are so many ways of mounting LEGO Technic… a few “actuator modules” look very feasible
and some of those percussion instruments on the video look really possible to achieve with LEGO parts (yes, that table with marbles, I am looking at you!)
But I was on holidays and didn’t have enough Technic parts with me for a proof of concept (“I knew I should have brought more LEGO!”) so I had to wait until I was back home.
I decided to use a BOOST motor for a solenoid-like module and copied Yoshihito Isogawa’s ‘Reciprocating Mechanism #63’ from his excellent “The LEGO MINDSTORMS EV3 Idea Book”. And since I had 2 of these motors available and also a City Hub…
Then I found myself picking up all kinds of objects at home to see how they sound when “banged” with those solenoids (“boy, this is addictive!”). I went to my #2 kid and took back those IKEA BYGGLEK boxes he was no longer using… nice sound, but for a drum-like sound a mallet actuator is better… and a stronger and faster motor is also better… so the second version of the Percussion Toolkit needed a more powerful hub for all those power bangs… and I also found enough GBC balls to make that percussion table from Dada Machines:
Then I focused a little more on the software side. I was already parsing a MIDI stream and playing each percussion instrument according to the MIDI event I wanted (like “Note on 9 35”, i.e. “Acoustic Bass Drum”, because on the MIDI Percussion Channel each note is mapped to a specific instrument) but without the remaining instruments most songs I played sounded very strange. So I wanted to split the MIDI stream in two – the percussion instruments would be parsed by the current setup and the remaining instruments would be forwarded to a synth so I could listen to them on a speaker.
Turns out this is very easy to achieve in linux with ‘qmidiroute’.
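I ended up using qmidiroute for this, but just to illustrate the idea of the split, the equivalent could be sketched in python with ‘mido’ (port names and the percussion handler below are assumptions):

# sketch only: route the percussion channel to the LEGO side, everything else to a synth
import mido

def handle_percussion(note):
    # placeholder: here the note number would become a command for the LEGO percussion hub
    print('percussion note', note)

inport = mido.open_input('LEGO Band IN', virtual=True)  # virtual input port (assumed name)
synth = mido.open_output('FluidSynth MIDI IN')          # synth port name is an assumption

for msg in inport:
    # channel 9 (shown as channel 10 in most tools) is the General MIDI percussion channel
    if msg.type in ('note_on', 'note_off') and msg.channel == 9:
        handle_percussion(msg.note)
    else:
        synth.send(msg)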
So after some minor adjustments and the addition of a fourth instrument, I now have a 4-instrument 100% LEGO MIDI Percussion Toolkit:
As soon as I get more hubs I will try to scale it out (U2’s ‘Sunday Bloody Sunday’ uses 5 percussion instruments, and I now have this crazy idea that a basic bass would sound great with Queen’s ‘Another One Bites the Dust’).
Then I will try to settle on a few standard modules so this can be easily adapted to other instruments.
I had some LEGO Move Hubs with me, with a new Pybricks firmware image and a new version of the ‘pybricksdev’ python library that works with them.
Move Hub has an internal IMU sensor. And batteries. And wireless communication. So I could make a kind of wireless game controller.
But Facebook keeps hitting me with ads for wireless MIDI drumsticks. I can do this with LEGO:
It wasn’t difficult – the Pybricks script is very short, inspired by an Arduino project from Andrea Richetta (‘Air Drum with Arduino Nano 33 IoT‘) – I just calculate the size of the acceleration vector (in fact its square, since the Move Hub doesn’t support floating point numbers, so no complex math functions) and print something out whenever it changes ‘suddenly’:
from pybricks.hubs import MoveHub
from pybricks.tools import wait

ACC_THRESHOLD = 170   # squared acceleration above which a hit starts
ACC_MARGIN = 50       # how far below the peak we must drop to register the hit
TIME = 40             # polling interval (ms)
KEY = '1'             # character printed for each hit

hub = MoveHub()
max_v = 0
while True:
    x, y, z = hub.imu.acceleration()
    v = x * x + y * y + z * z     # squared size of the acceleration vector
    if v > ACC_THRESHOLD:
        max_v = v
    elif v < max_v - ACC_MARGIN:
        print(KEY)
        max_v = 0
    wait(TIME)
Then I just used the ‘pybricksdev’ library in a python script on my laptop to capture the output of the Move Hub and redirect it with ‘xdotool’ to a MIDI controller (VMPK) that generates MIDI events usable by Hydrogen (a MIDI drum engine).
I could have used something in python to directly generate the MIDI events instead of using VMPK… but I’m a lazy guy on holidays (or call it ‘Agile’ 😀 )
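The real script uses the pybricksdev python library directly, but the gist can be sketched with just the ‘pybricksdev’ CLI and a pipe (assuming the Move Hub script above is saved as ‘drumstick.py’; the VMPK key is also an assumption):

# simplified sketch of the laptop side: read what the Move Hub prints over BLE
# and turn each '1' into a key press sent to the focused VMPK window
import subprocess

# 'pybricksdev run ble drumstick.py' starts the script on the hub and echoes its prints
proc = subprocess.Popen(['pybricksdev', 'run', 'ble', 'drumstick.py'],
                        stdout=subprocess.PIPE, text=True)

for line in proc.stdout:
    if line.strip() == '1':
        subprocess.run(['xdotool', 'key', 'q'])   # 'q' is just an assumed VMPK key binding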
I found out that every time I tested Tuxie McPython the two Technic Hubs were getting hot. Very hot.
The batteries were also draining fast. I hate changing batteries on stationary robots/automata. For my MINDSTORMS EV3 I tend to use the Li-Ion battery as a PSU, leaving it connected to a 12V/1.5A wall power adapter.
So I ordered 2x USB Power Box from PV Productions. They fit in the Technic Hub in place of the 6xAA battery holder and they can withstand 9V/2A, so it seemed a good and clean method.
Less than 3 days after they arrived I melted one 🙁
The 4 Large Angular motors were consuming too much current. To properly close the holes I was forcing the motors to strongly press the fingers against the chanter, but I never thought that would require 2A (probably even more).
So I changed the code to reduce the fingers’ strength and opted for other fingertips, also made with protective foam but a different one, apparently less “foamy” and a bit thinner. I glued it over the EVA foam fingertips I had already tried before and cut them a bit larger so the holes would be better covered (no photos, but they can be seen in the last video).
Even so, rechargeable NiMH batteries were still draining too fast (and the hubs were still getting hot). So I picked some alligator cables and 2.1 mm barrel adapters and used 2 of my wall power adapters to power the Hubs. I set them to 9V (measured 9.1V with no load) and connected them – it worked:
but while playing I got a blinking LED and very often one of the hubs just powered off. The adapters are rated 1.5A… so the motors were still demanding too much current.
As I said before… I don’t have other proper motors – the smaller version of these motors, sold with the 51515 Robot Inventor set, is not easy to find and it would be very expensive to buy 8 of them on BrickLink. I could use the BOOST external motors but I don’t have 8 of them and I am not so sure they have enough power to properly close the holes. Same for the Technic motors (except maybe the larger ones), and none of them have an internal zero reference (not a major blocker but it’s something that helps).
So I have 8 motors that demand too much current and two hubs that get too hot and shut down… I have no other proper motors… but I do have two other identical hubs.
So the latest version of Tuxie McPython uses 4 Technic Hubs, each controlling two fingers. The code is essentially the same, just with fewer “if…then…elif”s and a shorter “command vocabulary”:
So I now send just 4 different commands (‘0’, ‘1’, ‘2’ or ‘3’) to each hub instead of the previous 16, but from the controller side I have to send those messages to 4 hubs instead of just 2. My only concern was that my laptop couldn’t properly handle 4 simultaneous BLE sessions, but thankfully it works fine.
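Conceptually, each hub now runs something like this (just a sketch – the finger functions stand for the actual motor moves):

from usys import stdin
from uselect import poll

keyboard = poll()
keyboard.register(stdin)

while True:
    if keyboard.poll(0):
        char = stdin.read(1)
        if char == '0':      # 00 - both holes closed
            fingerA_down()
            fingerB_down()
        elif char == '1':    # 01 - first hole closed, second open
            fingerA_down()
            fingerB_up()
        elif char == '2':    # 10 - first hole open, second closed
            fingerA_up()
            fingerB_down()
        elif char == '3':    # 11 - both holes open
            fingerA_up()
            fingerB_up()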
With 2 motors per hub I can now use NiMH batteries without blinking LEDs. The hubs still get hot (not as hot as before but still very warm) and the batteries will probably run down after a few songs, but for that I have already ordered 9V/2A power adapters and more alligator cables.
Programming the LEGO Technic Hubs with Pybricks is not difficult (much, much easier than using assembly with PIC controllers in the 90’s). But one thing is making motors move the way you want, another is making a choreography that sounds like music.
The chanter has 8 holes – 7 are placed on top, the other one is underneath. There are several sites on the web showing the proper finger positions and more than a few adopted a notation of black and white circles (representing closed and open holes). Much like binary notation.
So I chose to use ‘0’ for a motor in the rest position (closing the hole) and ‘1’ for a motor in some other position that opens the hole. So a single byte can represent any possible note (and of course also lots of other combinations).
Using the most significant bit for the highest pitch hole (the one underneath) and the least significant bit for the lowest pitch hole (the one near the end tip of the chanter), this is the result:
As I am (was) using two hubs, it seemed logical to group each representation into two nibbles of 4 bits so the same program running on both Hubs could position its 4 fingers just by receiving a char with a value between ‘0’ (all holes closed) and ‘F’ (all holes open):
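On the controller side, splitting a fingering byte into those two chars is then trivial – a quick sketch (the example byte and the send functions are placeholders, not values from the real fingering chart):

fingering = 0b10000001      # arbitrary example: thumb hole and bottom hole open

upper = fingering >> 4      # bits 7..4 go to the first hub
lower = fingering & 0x0F    # bits 3..0 go to the second hub

send_to_hub1('%X' % upper)  # placeholder for the BLE write to hub 1 (sends '8')
send_to_hub2('%X' % lower)  # placeholder for the BLE write to hub 2 (sends '1')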
The main program loop would in fact be just a bunch of if-then-elif’s based on the key pressed (if using the Pybricks Chrome-based IDE) or the character received (if using pybricksdev or accessing the Nordic UART Service directly):
# keyboard/stdin polling setup (standard Pybricks pattern)
from usys import stdin
from uselect import poll

keyboard = poll()
keyboard.register(stdin)

while True:
    if keyboard.poll(0):
        char = stdin.read(1)
        if char == '0':    # 0000 - all four holes closed
            fingerA_down()
            fingerB_down()
            fingerC_down()
            fingerD_down()
        elif char == '1':  # 0001 - only the last hole open
            fingerA_down()
            fingerB_down()
            fingerC_down()
            fingerD_up()
        elif...
and this really worked – pressing some keys on Chrome made the fingers move.
But that’s not enough for playing music. What movements should be done and for how long for this apparently random fingering to produce something that doesn’t sound like me stepping on a cat’s tail?
XenonJohn was kind enough to include his Arduino code with the Instructable for his Bagpipe Playing Robot. He used a 3-char notation for his music, where the first char represents an embellishment (like a gracenote or a doubling), the second char represents the main note and the last char represents the duration of the note. For instance:
-GR
means playing a ‘G’ for 1/4 of the duration of a full note (I believe this means a ‘quarter note’ or a ‘crotchet’) with no embellishment.
And he also included two songs in his source code: ‘Amazing Grace’ and ‘Scotland The Brave’.
Now this is something I can use.
Instead of implementing his whole code with Pybricks I just adapted his song sequences. I opted not to represent embellishments since they can be reproduced by (fast) sequences of finger positions (assuming that there is enough bandwidth between the “controller” and each hub to send everything, and that the hub processes everything fast enough… if I find that this is not the case I might return to XenonJohn’s notation).
So this is my way of representing a song:
sequence = [
    A, R,
    D, Z,
    F, R,
    E, N,
    D, N,
    F, Z,
]
where the first character is the note and the second character is the duration (still using the Ardu McDuino notation, too lazy to change that for now):
j = 1 = whole note (semibreve)
Z = 1/2 = half note (minim)
R = 1/4 = quarter note (crotchet)
N = 1/8 = eighth note (quaver)
L = 1/16 = sixteenth note (semiquaver)
K = 1/32 = thirty-second note (demisemiquaver)
But for grace notes this doesn’t work well, since a grace note is played for 1/32 and the next note should be shortened by that value. So I added three timing notations for the 3 cases where a grace note is used in the Ardu McDuino songs (before a note with a timing of R, Z or N), reducing those timings by 1/32:
x = R – 1/32 = 1/4 – 1/32 = 1/4.57
y = N – 1/32 = 1/8 – 1/32 = 1/10.67
z = Z – 1/32 = 1/2 – 1/32 = 1/2.13
These look strange but are easy to use in my code, since I define a tempo for the song and each timing is just the tempo multiplied by these numbers.
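In code this becomes a small lookup table; a sketch with an assumed whole-note duration:

# note durations in ms, derived from an assumed whole-note duration
WHOLE = 1600                 # duration of a whole note in ms (example value)

DURATION = {
    'j': WHOLE,              # whole note (semibreve)
    'Z': WHOLE // 2,         # half note (minim)
    'R': WHOLE // 4,         # quarter note (crotchet)
    'N': WHOLE // 8,         # eighth note (quaver)
    'L': WHOLE // 16,        # sixteenth note (semiquaver)
    'K': WHOLE // 32,        # thirty-second note (demisemiquaver)
    'x': WHOLE * 7 // 32,    # quarter note minus a 1/32 grace note
    'y': WHOLE * 3 // 32,    # eighth note minus a 1/32 grace note
    'z': WHOLE * 15 // 32,   # half note minus a 1/32 grace note
}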
The first attempt looked promising but not quite right; then I found a silly math error and now I finally have something that sounds familiar:
After some attempts, I decided to give up (at least for now) on using LEGO parts as fingertips. Even the LEGO rubber pads were too hard and narrow to close the holes well enough for the chanter to play something that resembles music – for instance, I could not notice any difference between a G (all holes closed) and an A (opening just the hole near the chanter endpoint).
I tried Technic rubber links and even rubber tires… no luck.
So I started with small circles of EVA foam glued to LEGO 2×2 dishes attached to the same Technic connectors I was using with the LEGO rubber pads. They looked much like the ear pads used on headphones… and then I remembered that Neil Fraser used exactly that for his Robotic Bagpipe Chanter:
These new fingertips improved the sound quality but only for the higher notes… I was still having problems with G, A, B…
Adding adhesive felt pads didn’t make it better.
But replacing the EVA foam with packaging protective foam (not sure what material it is made of) made a big difference:
Great, I can finally produce distinguishable notes! Still far from perfection because air pressure inside the balloon has a strong influence on note pitch and sometimes the reed sounds very bad (and sometimes doesn’t sound at all)… but it’s still a major milestone: