Codatex RFID Sensor

Some time ago, when I started to use RFID tags with an automated LEGO train, I found out that there was an RFID sensor available for the MINDSTORMS NXT, the Codatex RFID Sensor for NXT:

Codatex doesn’t make them anymore so I ordered one from BrickLink but never got it working with ev3dev. I put it on the shelf, hoping that after a while, with ev3dev’s constant evolution, things would get better.

Last month Michael Brandl told me that the Codatex sensors were in fact LEGO sensors and asked if it was possible to use them with the EV3.

Well, it is possible, just not with the original LEGO firmware. At least LeJOS and RobotC support the Codatex RFID Sensor. But not ev3dev 🙁

So this holiday I put the Codatex sensor in my case and decided to give it another try.

I read the documentation from Codatex and the source code from LeJOS and RobotC, and after a while I was reading the sensor properties directly from the shell.

After connecting the sensor to Input Port 1 I need to set the port mode to “other-i2c”:

echo other-i2c > /sys/class/lego-port/port0/mode

Currently there are two I2C modes in ev3dev: “nxt-i2c” for known NXT devices and “other-i2c” for other I2C devices; the latter prevents the system from polling the I2C bus.
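
The same mode change can also be done from Python by writing to the sysfs attribute (a tiny sketch, needing the same permissions as the echo above):

# set Input Port 1 (port0 here) to "other-i2c" from Python
with open('/sys/class/lego-port/port0/mode', 'w') as f:
    f.write('other-i2c')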

To read all the Codatex registers this line is enough:

/usr/sbin/i2cset -y 3 0x02 0x00 ; /usr/sbin/i2cset -y 3 0x02 0x41 0x83; sleep 0.1; /usr/sbin/i2cdump -y 3 0x02

(first wake up the sensor with a dummy write, then initialize the firmware, wait a bit and read everything)

Error: Write failed
No size specified (using byte-data access)
     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f    0123456789abcdef
00: 56 31 2e 30 00 00 00 00 43 4f 44 41 54 45 58 00    V1.0....CODATEX.
10: 52 46 49 44 00 00 00 00 00 00 00 00 00 00 00 00    RFID............
20: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
30: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
40: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
50: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
60: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
70: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
80: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
90: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
a0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
b0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
c0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
d0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
e0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
f0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................

The “Error: Write failed” is expected because the dummy write that wakes the Codatex sensor fails.

I got excited, it seemed easy.

So to read just once (singleshot mode) this should work:

/usr/sbin/i2cset -y 3 0x02 0x00 ; /usr/sbin/i2cset -y 3 0x02 0x41 0x01; sleep 0.25; /usr/sbin/i2cdump -r 0x42-0x46 -y 3 0x02

(wake up, send a singleshot read command, wait for acquisition, read the 5 Tag ID registers)

But it didn’t work:

Error: Write failed
No size specified (using byte-data access)
     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f    0123456789abcdef
40: 00 00 00 00 00 .....

The next days I read lots of code from the web, even from Daniele Benedettelli’s old NXC library. It seemed that my timings were wrong but I could not understand why.

After asking for help at ev3dev I found the reason: the Codatex sensor needs power at pin 1 for the RFID part. So the I2C part works, but without 9V at pin 1 all readings will fail.

LeJOS and RobotC activate power on pin 1 but the two ev3dev I2C modes don’t do that. To be sure I tried LeJOS and finally I could read my tags:

package PackageCodatex;

import lejos.hardware.Brick;
import lejos.hardware.port.SensorPort;
import lejos.hardware.sensor.RFIDSensor;

public class Codatex
{
    public static void main(String[] args)
    {
        RFIDSensor rfid = new RFIDSensor(SensorPort.S1);
        System.out.println(rfid.getProductID());
        System.out.println(rfid.getVendorID());
        System.out.println(rfid.getVersion());
        try {Thread.sleep(2000);}
        catch (InterruptedException e) {}
        long transID = rfid.readTransponderAsLong(true);
        System.out.println(transID);
        try {Thread.sleep(5000);}
        catch (InterruptedException e) {}
    }
}

David Lechner gave me some hints on how to write a driver but honestly I don’t understand at least half of it. So I made a ghetto adapter with a MINDSTORMS EV3 cable and a 9V PP3 battery and it finally works – I can read the Codatex 4102 tags as well as my other 4001 tags:

I created a GitHub repository with the scripts to initialize the port and to use singleshot reads. Soon I will add a script for continuous reads.
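
For reference, the singleshot read boils down to something like this minimal Python sketch (assuming python-smbus is installed; bus 3 corresponds to Input Port 1, as in the shell commands above):

#!/usr/bin/env python

# A minimal sketch of the singleshot sequence, assuming python-smbus is
# installed, Input Port 1 is already in "other-i2c" mode and 9V is
# present on pin 1 (without it the Tag ID registers stay at zero).
from time import sleep
import smbus

BUS = 3      # I2C bus of Input Port 1 on the EV3
ADDR = 0x02  # Codatex RFID sensor

bus = smbus.SMBus(BUS)

# dummy write to wake up the sensor (it is expected to fail)
try:
    bus.write_byte(ADDR, 0x00)
except IOError:
    pass

# singleshot read command (0x01) to register 0x41, then wait for acquisition
bus.write_byte_data(ADDR, 0x41, 0x01)
sleep(0.25)

# read the 5 Tag ID registers (0x42..0x46)
tag = [bus.read_byte_data(ADDR, 0x42 + i) for i in range(5)]
print(' '.join('%02X' % b for b in tag))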

An important note for those who might try the ghetto cable: despite many pictures on the web saying that pin 2 is Ground, IT IS NOT. Use pin 1 for power and pin 3 for Ground. And preferably cut the power wire so that if you make some mistake your EV3 pin 1 is safe.

Now I really do need to learn how to write a driver instead of hacking cables. Ouch!

Using Grove devices with the EV3

After David Lechner announced ev3dev support for it, I’ve been planning to offer myself a couple of BrickPi 3 from Dexter Industries (just one is not enough since the BrickPi 3 supports daisy chaining).

While I wait for European distributors to sell it (and my budget to stabilize), and since I’m also playing with magnets, I ordered a mindsensors.com Grove adapter so I can start testing Grove devices with my EV3. I also got two Grove devices from Seeed Studio at my local robotics store and will start with the easiest one: the Grove – Electromagnet.

ev3dev doesn’t have a Grove driver yet but since the adapter is an I2C device ev3dev recognizes it and configures it as an I2C host:

[  563.590748] lego-port port0: Added new device 'in1:nxt-i2c-host'
[  563.795525] i2c-legoev3 i2c-legoev3.3: registered on input port 1

Addressing the Grove adapter is easy, we just need to follow the ev3dev documentation (Appendix C: I2C devices):

robot@ev3dev:~$ ls /dev/i2c-in*
/dev/i2c-in1

robot@ev3dev:~$ udevadm info -q path -n /dev/i2c-in1        
/devices/platform/legoev3-ports/lego-port/port0/i2c-legoev3.3/i2c-3/i2c-dev/i2c-3

So the Grove adapter is at I2C bus #3. According to the mindsensors.com User Guide, its address is 0x42. That’s the unshifted address, but for i2c-tools we need to use the shifted address (0x21 – at the end of the ev3dev Appendix C doc there is a table with both addresses).
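
The conversion between the two is just a one-bit shift:

>>> hex(0x42 >> 1)   # unshifted address -> 7-bit address used by i2c-tools
'0x21'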

robot@ev3dev:~$ sudo i2cdump 3 0x21

     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f    0123456789abcdef
00: 56 31 2e 30 32 00 00 00 6d 6e 64 73 6e 73 72 73    V1.02...mndsnsrs
10: 47 61 64 70 74 6f 72 00 00 00 00 00 00 00 00 00    Gadptor.........
20: 4a 61 6e 20 30 34 20 32 30 31 35 00 31 32 46 31    Jan 04 2015.12F1
30: 38 34 30 00 00 00 00 00 00 00 00 00 00 00 00 00    840.............
40: 00 97 03 32 00 00 00 00 00 00 00 00 00 00 00 00    .??2............
50: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
60: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
70: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
80: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
90: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
a0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
b0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
c0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
d0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
e0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
f0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................

According to the User Guide, this is the expected content of the first 24 registers:

0x00-0x07: Software version – Vx.nn
0x08-0x0f: Vendor Id – mndsnsrs
0x10-0x17: Device ID – Gadptor

So I have a v1.02 Grove adapter.

To use the Grove – Electromagnet I just need to send a “T” (0x54) to the Command Register (0x41) to set the Grove Adapter into “Transmit” mode and then set the Operation Mode, which can be “Digital_0” (sending 0x02 to the Operation Mode register at 0x42) or “Digital_1” (sending 0x03 to the same register).

So to turn the electromagnet ON:

sudo i2cset -y 3 0x21 0x41 0x54
sudo i2cset -y 3 0x21 0x42 0x03

And to turn it OFF:

sudo i2cset -y 3 0x21 0x41 0x54
sudo i2cset -y 3 0x21 0x42 0x02
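
The same two writes can be done from Python; a minimal sketch, assuming python-smbus is installed (run it with sudo, like the i2cset commands above):

#!/usr/bin/env python

# Minimal sketch: pulse the Grove - Electromagnet through the mindsensors.com
# Grove Adapter (I2C bus 3, shifted address 0x21, as found above).
from time import sleep
import smbus

bus = smbus.SMBus(3)
ADDR = 0x21

bus.write_byte_data(ADDR, 0x41, 0x54)  # "T": set the adapter to Transmit mode
bus.write_byte_data(ADDR, 0x42, 0x03)  # Digital_1: electromagnet ON
sleep(2)
bus.write_byte_data(ADDR, 0x42, 0x02)  # Digital_0: electromagnet OFF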

Just a warning: with an operating current of 400 mA the electromagnet gets hot very quickly when ON – not enough to hurt, but don’t forget to switch it OFF after use to prevent draining the EV3 batteries.

The same method (“T” + “Digital_0” / “Digital_1”) can be used with several other Grove devices, like the Grove – Water Atomization:

(a great way to add fog effects to our creations – just be careful with short circuits; if you add some kind of perfume you can also have scent effects)

Final note: you can use the mindsensors.com Grove Adapter with the native EV3 firmware (just import the available EV3-G block) but if you are using ev3dev like me, be sure to use a recent kernel (as of today, “4.4.61-20-ev3dev-ev3”) because older versions had a bug that caused some communication problems with I2C devices (and the Grove Adapter is an I2C device).

Triplex – a holonomic robot

This article is part 1 of 3 of the Triplex series

A few months ago, trying to find a use for a new LEGO brick found in NEXO Knights sets, I made my first omni wheel. It worked but it was too fragile to be used in a robot, so I decided to copy one of Isogawa’s omni wheels and keep working on a holonomic robot with 3 wheels.

Why 3 wheels?

At first I only had NEXO parts to build 3 wheels but I enjoyed the experience – my first RC experiments looked like lobsters. Controlling the motion is not easy but I found a very good post by Miguel from The Technic Gear, so it was easy to derive my own equations. But Power Functions motors don’t allow precise control of speed, so I could not make the robot move in some directions. I needed regulated motors like those used with MINDSTORMS EV3.
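
Just to give an idea of the kind of equations involved, here is a minimal sketch of the standard kinematics (my own summary, assuming three omni wheels spaced 120° apart at a distance R from the center of the robot):

from math import cos, sin, radians

# wheel i sits at angle a_i around the robot, at distance R from the center;
# vx, vy are the desired robot velocities and w the desired rotation rate
WHEEL_ANGLES = [radians(a) for a in (0, 120, 240)]
R = 0.09  # meters, just an illustrative value

def wheel_speeds(vx, vy, w):
    return [-sin(a) * vx + cos(a) * vy + R * w for a in WHEEL_ANGLES]

# "forward"/"backward" uses only two wheels (equal speeds, opposite directions)
print(wheel_speeds(0.2, 0.0, 0.0))
# rotation uses all three wheels at the same speed and direction
print(wheel_speeds(0.0, 0.0, 1.0))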

So after assembling three of Isogawa’s omniwheels and making a frame that keeps each wheel from separating from its motor, it was just a matter of making a triangular frame to join all 3 motors and support the EV3:

First tests with regulated motor control seem promising: Triplex is fast enough and doesn’t fall apart.  It drifts a bit so I’ll probably use a gyro sensor or a compass to correct it.

In this demo video I show Triplex being wireless controlled from my laptop keyboard through an SSH session. It just walks “forward” or “backward” (only two motors are used, running at the same speed in opposite directions) or rotates “left” or “right” (all motors are used, running at the same speed and the same direction).

For the code used in this demo I copied a block of code from Nigel Ward’s EV3 Python site that solved a problem I’ve been having for a long time: how do I use Python to read the keyboard without waiting for ENTER and without installing pygame or another complex/heavy library?

#!/usr/bin/env python3

# shamelessly based on
# https://sites.google.com/site/ev3python/learn_ev3_python/keyboard-control
#

import termios, tty, sys
from ev3dev.ev3 import *

TIME_ON = 250

motor_A = MediumMotor('outA')
motor_B = MediumMotor('outB')
motor_C = MediumMotor('outC')

#==============================================

def getch():
    fd = sys.stdin.fileno()
    old_settings = termios.tcgetattr(fd)
    tty.setcbreak(fd)
    ch = sys.stdin.read(1)
    termios.tcsetattr(fd, termios.TCSADRAIN, old_settings)
    
    return ch

#==============================================

def forward():
    motor_A.run_timed(speed_sp=-1200, time_sp=TIME_ON)
    motor_C.run_timed(speed_sp=1200, time_sp=TIME_ON)

#==============================================

def backward():
    motor_A.run_timed(speed_sp=1200, time_sp=TIME_ON)
    motor_C.run_timed(speed_sp=-1200, time_sp=TIME_ON)

#==============================================

def turn_left():
    motor_A.run_timed(speed_sp=1200, time_sp=TIME_ON)
    motor_B.run_timed(speed_sp=1200, time_sp=TIME_ON)
    motor_C.run_timed(speed_sp=1200, time_sp=TIME_ON)

#==============================================

def turn_right():
    motor_A.run_timed(speed_sp=-1200, time_sp=TIME_ON)
    motor_B.run_timed(speed_sp=-1200, time_sp=TIME_ON)
    motor_C.run_timed(speed_sp=-1200, time_sp=TIME_ON)

#==============================================

def stop():
    # stop all three motors (the original demo calls stop() on spacebar)
    motor_A.stop()
    motor_B.stop()
    motor_C.stop()

#==============================================

print("Running")
while True:
    k = getch()
    print(k)
    if k == 'a':
        forward()
    if k == 'z':
        backward()
    if k == 'o':
        turn_left()
    if k == 'p':
        turn_right()
    if k == ' ':
        stop()
    if k == 'q':
        exit()

Thanks for sharing Nigel!

Now let’s learn a bit of math with Python.

For those who might be interested, I also have some photos showing the evolution of the project.

 

Running LEGO LDD on Linux

I’m finally going to try the EV3DPrinter.

3D pen

Now that my 3D pen has arrived from China I downloaded Marc-André Bazergui’s LDD file to understand how to assemble it and then it struck me… dang, I need Windows to run LDD!

I still have the Windows VM I used to update the firmware of my EV3 but I don’t want to use it (yes, I’m stubborn), so I decided to try Wine. I once had LDD working with Wine but never really used it, and now that I have a new laptop I hadn’t even bothered to install Wine again.

So after a few tweaks I got LDD running – it seems that running 32-bit MS Windows programs on Wine on 64-bit Linux breaks some things, but essentially one just needs to add some 32-bit gstreamer plugins to make LDD work fine.

To show the full process I created a 64-bit virtual machine (1 CPU, 4 GB RAM, 32 GB thin provisioned disk), installed Ubuntu 16.10 (64-bit) on it (default installation, just enabled the download of updates while installing and the installation of 3rd party software).

As I’m using VirtualBox I also installed the VirtualBox Guest Additions, enabled the bi-directional clipboard to allow copy&paste of commands between the VM and my desktop and enabled a shared folder to exchange files (just the LDD 4.3.10 setup file and the EV3DPrinter .lxf file).

Then a full last update:

sudo apt update
sudo apt upgrade
sudo apt dist-upgrade

followed by a reboot and a safety snapshot (“trust no one”).

So this is the full process:

sudo dpkg --add-architecture i386 
sudo add-apt-repository ppa:wine/wine-builds
sudo apt-get update
sudo apt-get install --install-recommends winehq-devel

at this moment, I have wine 2.4 installed:

wine --version
wine-2.4

I could install LDD right now but it would not work, because on first run it tries to play some music and/or video and fails. The trick is to install some plugins for gstreamer:

sudo apt install gstreamer1.0-plugins-good:i386 gstreamer1.0-fluendo-mp3:i386

Then we install LDD by just double-clicking it. As it is the first time Wine runs, it first asks to install two dependencies: mono and gecko (which provide some .NET Framework and Internet Explorer compatibility).

LDD setup asks for a language (“English”), then asks us to accept the License Agreement and suggests creating two shortcuts (“Desktop” and “Quick launch”).

Then it asks to install Adobe Flash Player and to choose a destination folder (default is fine).

When completed, we may check the option to “Run LEGO Digital Designer” but it will not work – it just shows a black window that we need to force close.

But if we launch LDD again, it works now.

Just one last issue: when opening the EV3DPrinter .lxf file we get a request for a FLEXnet license file; it is located in the installation folder:

~/.wine32/drive_c/Program Files/LEGO Company/LEGO Digital Designer/RL278-1000.lic

Everything seems to work, even creating a Building Guide and the HTML Building Instructions.

I recorded everything in this video:

It’s a long (21 min) unedited video so you may want to skip most of it (the download and installation of the Wine components, the install of LDD and the creation of the Building Guide).

And by the way, this is nothing really new – Marc pointed me to this video with LDD running on Ubuntu 7.10 (2007!)

Virtual Mindstorms – using LEGO EV3 software on Linux

Yesterday Marc-André Bazergui encouraged me to make a video showing how to use the LEGO MINDSTORMS EV3 Software inside a virtual machine. It is a shame that a product running Linux inside can only be used on PC or Mac – and that’s one of the reasons I started using ev3dev, as I only have Linux systems (laptops, Raspberry Pis, old DIY desktops without a Windows license…).

I got my first EV3 exactly 3 years ago as a birthday gift from my wife. I don’t remember if I ever installed the Windows software in a VM before – I did install it once or twice in Ubuntu with Wine (not sure why) and I did install Microsoft Robotics Developer Studio in a VMware Workstation virtual machine, and I do remember having connected it to the EV3 through a Bluetooth USB dongle (most modern hypervisors have this nice feature that allows a local device on the host to be passed through into the guest).

I no longer have VMware Workstation but I have used Innotek VirtualBox in the past and knew that Oracle somehow managed to keep it alive after buying it (Oracle has the morbid habit of poisoning every good thing it owns – Java, Solaris, OpenOffice, MySQL…).

So I installed Oracle VM VirtualBox 5.1.4 (there is even an x64 .deb package for Ubuntu 16.04 “Xenial”) and after that the VirtualBox 5.1.4 Oracle VM VirtualBox Extension Pack.

It was quite easy and also very fast. After that I installed a licensed version of Microsoft Windows 8 Professional (x64 also) – this is my work laptop so people immediately started making fun of me – hey, he is installing Windows on his laptop, finally!

The rest of the process was also quite easy after all – like I thought, it is possible to use a Bluetooth USB dongle and also just a direct USB cable connection:

  • create a Virtual Machine
  • make sure “Enable USB Controller” is checked and the USB 2.0 (EHCI) Controller is selected – it might also work with USB 3.0
  • add a USB Device Filter for each USB device you want to pass through into the VM (the EV3 itself and/or the Bluetooth dongle)
  • install Windows
  • present the VirtualBox Guest Additions CD Image and install it
  • define a Shared Folder so you can pass drivers and binaries into the VM
  • if the Bluetooth dongle is not automatically configured, install the proper drivers
  • pair the EV3 (or plug the USB cable)
  • install LEGO Mindstorms EV3 software and run it

I made a video showing every step (I just skipped the LEGO Software as it’s pretty straightforward):

Just one note: although the USB cable connection seems to work fine, I tried to upgrade my EV3 firmware several times with no success – every single time it hangs at 0%. Perhaps it behaves better with another Windows version… who knows?

Edit: Laurens Valk and David Lechner know. So I made a second post showing how to upgrade the firmware.

LEGO laser harp – part II

This article is part 2 of 2 of the LEGO Laser Harp series

About 10 years ago I offered my wife an M-Audio USB MIDI keyboard and installed Ubuntu Studio on a computer so she could play some piano at home. She was so amazed with the possibility of generating sheet music while playing that she almost accepted the idea of using Linux… almost 🙂

I remember that at that time I used timidity++ as a software MIDI synthesizer and tuned ALSA (one of the several Linux sound systems, perhaps the most widely used) and the preemptive kernel to work almost flawlessly with the Creative Labs sound card. My wife didn’t enjoy the KDE experience, Firefox was OK for her but OpenOffice was terrible with the spreadsheets she used, and finally, when our first kid was born, she attended some English lessons at Wall Street Institute and we found out that the online lessons required an odd combination of an old version of Java, ActiveX and IE… so she returned to Windows XP and never looked back.

10 years is a LOT of time in computer systems but ALSA is still around, even on ev3dev. So I installed timidity++ and tried to play a MIDI file… only to find that an ALSA module that is not currently available in the ev3dev kernel is required just for MIDI.

I googled for alternatives and found fluidsynth, with an unexpected bonus: there is a quite interesting Python library, mingus, that works with fluidsynth. So I installed it on my Ubuntu laptop and in a few minutes I was playing harp – amazing!

sudo apt-get install fluidsynth
sudo easy_install mingus
python
>>> from mingus.midi import fluidsynth
>>> from mingus.containers.note import Note
>>> fluidsynth.init("/usr/share/sounds/sf2/FluidR3_GM.sf2", "alsa")
>>> fluidsynth.set_instrument(1, 46)
>>> fluidsynth.play_Note(Note("C-3"))

In the previous example I just import the fluidsynth and Note parts of the library, initialize fluidsynth to work with ALSA, loading the soundfont that comes with it, choose harp (instrument number 46) and play a C3.

Well, and polyphony? The correct way is to use a NoteContainer:

from mingus.containers import NoteContainer
fluidsynth.play_NoteContainer(NoteContainer(["B-3", "C-3", "F-3"]))

but the lazy way is… just play several notes in a fast sequence.
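
Something like this, in the same interactive session as above (just a quick sketch):

# the "lazy" polyphony: fire the notes one after the other, fast enough
# that they ring together
for n in ("B-3", "C-3", "F-3"):
    fluidsynth.play_Note(Note(n))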

So, let’s do it in the ev3dev!

Oops, fluidsynth also needs an ALSA module that is not available in the current ev3dev kernel.

I’m not a Linux music expert. Not even a Linux expert! So after some more googling I gave up and asked for help in the ev3dev GitHub project. And once again David agreed to include ALSA MIDI support in future kernels, so I’ll just wait a bit.

Oh, but I can’t wait…

And what if I read the color sensors in ev3dev and play the music on my laptop?

ALSA, once again, supports something like client/server MIDI communication with the “aseqnet” and “aconnect” commands, and some people are already using it with the Raspberry Pi!

Yeah, I should have guessed… “aconnect” requires an ALSA MIDI module that is not available in the current ev3dev kernel.

OK, let’s use MQTT: configure my EV3 as a publisher and my Ubuntu laptop as a subscriber and just send some notes as messages.

On the EV3:

sudo apt-get install mosquitto
sudo easy_install paho-mqtt

The publisher script is “harp-mqtt-pub.py”:

#!/usr/bin/env python

from ev3dev.auto import *
from time import sleep
import paho.mqtt.client as mqtt

DELAY = 0.01

# should have an auto-calibrate function
AMB_THRESHOLD = 9

sensor1 = ColorSensor('in1:i2c80:mux1')
sensor1.mode = 'COL-AMBIENT'
sensor2 = ColorSensor('in1:i2c81:mux2')
sensor2.mode = 'COL-AMBIENT'
sensor3 = ColorSensor('in1:i2c82:mux3')
sensor3.mode = 'COL-AMBIENT'
sensor4 = ColorSensor('in2')
sensor4.mode = 'COL-AMBIENT'
sensor5 = ColorSensor('in3')
sensor5.mode = 'COL-AMBIENT'
sensor6 = ColorSensor('in4')
sensor6.mode = 'COL-AMBIENT'

# there is no sensor7 yet, I need another MUX

s1 = 0
s2 = 0
s3 = 0
s4 = 0
s5 = 0
s6 = 0
s7 = 0

client = mqtt.Client()
client.connect("localhost",1883,60)

print 'Running...'

while True:
    key_touched = False
    s1 = sensor1.value(0)
    s2 = sensor2.value(0)
    s3 = sensor3.value(0)
    s4 = sensor4.value(0)
    s5 = sensor5.value(0)
    s6 = sensor6.value(0)
#    s7 = sensor7.value(0)

    if s1 < AMB_THRESHOLD:
        client.publish("topic/Harp", "C-3")
        key_touched=True
    if s2 < AMB_THRESHOLD:
        client.publish("topic/Harp", "D-3")
        key_touched=True
    if s3 < AMB_THRESHOLD:
        client.publish("topic/Harp", "E-3")
        key_touched=True
    if s4 < AMB_THRESHOLD:
        client.publish("topic/Harp", "F-3")
        key_touched=True
    if s5 < AMB_THRESHOLD:
        client.publish("topic/Harp", "G-3")
        key_touched=True
    if s6 < AMB_THRESHOLD:
        client.publish("topic/Harp", "A-3")
        key_touched=True
#    if s7 < AMB_THRESHOLD:
#        client.publish("topic/Harp", "B-3")
#        key_touched=True

    if key_touched == True:
        sleep(DELAY)

On the Ubuntu laptop side:

sudo easy_install paho-mqtt

The subscriber script is “harp-mqtt-sub.py”

#!/usr/bin/env python

import paho.mqtt.client as mqtt
from mingus.midi import fluidsynth
from mingus.containers.note import Note

EV3_IP = "192.168.43.35"

SOUNDFONT = 'Concert_Harp.sf2'
INSTRUMENT = 46 # Harp

NOTES = ['C-3','D-3','E-3','F-3','G-3','A-3','B-3']

def on_connect(client, userdata, flags, rc):
    print("Connected with result code "+str(rc))
    client.subscribe("topic/Harp")

def on_message(client, userdata, msg):
    # play the note only if the payload is one of the expected note names
    if (msg.payload in NOTES):
        print msg.payload
        fluidsynth.play_Note(Note(msg.payload))
    
client = mqtt.Client()
client.connect(EV3_IP,1883,60)

client.on_connect = on_connect
client.on_message = on_message

fluidsynth.init(SOUNDFONT, "alsa")
fluidsynth.set_instrument(1, INSTRUMENT)

client.loop_forever()

And guess what? It works!!! I just love Linux and open source!

I will keep waiting for David Lechner to include ALSA MIDI support in the ev3dev kernel. I’m not so sure there is enough horsepower in the EV3 to load a soundfont and play it with acceptable latency, but if I can at least use the MIDI client/server functionality I can drop MQTT.

An interesting possibility that this client/server design allows is to scale my harp easily: with just a second EV3 (2 MUX each) I can make a 13-string harp with almost no modification on my code.

LEGO laser harp – part I

This article is part 1 of 2 of the LEGO Laser Harp series

This is an idea I’ve been postponing for several months but the time has finally come: a laser harp.

After tinkering with lasers, fog, sound, color sensors and Python I found myself wondering how to give all that a proper use. Then I remembered Jean-Michel Jarre and how his laser harp made such a big impression on me in the late 80’s, when I finally decided “hey, I wanna study Electronics!”

For a first version, let’s call it “a proof of concept”, I just want a simple 7-string harp that can play the 7 basic notes. Polyphony would be great but I doubt that the EV3 sound capabilities allow it (and I cannot afford the brute-force solution of using 7 EV3s so that each one plays only a single note).

So in the last months I’ve been buying EV3 color sensors and I finally have 7. Since the EV3 only has 4 input ports I need some kind of sensor multiplexer, but thanks to mindsensors.com I already have one EV3SensorMux (and a second one is on the way, from a European distributor – Portuguese customs DO stink!)

With 2 MUX it’s possible to connect up to 8 sensors to the EV3. Since I just want 7 “strings” I am considering using the 8th sensor to control the amplitude of the notes. I’ll try an ultrasonic sensor but I’m not sure it has enough “wideness” to cover the whole harp, let’s see.

So of course I’ll be using ev3dev and python.

Using the EV3SensorMux is easy: just plug it into an input port and ev3dev immediately recognizes it:

lego-port port8: Registered 'in1:i2c80:mux1' on '3-0050'.
lego-port port8: Added new device 'in1:i2c80:mux1:lego-ev3-color'
lego-sensor sensor0: Registered 'ms-ev3-smux' on 'in1:i2c80'.
lego-port port9: Registered 'in1:i2c81:mux2' on '3-0051'.
lego-port port9: Added new device 'in1:i2c81:mux2:lego-ev3-color'
lego-sensor sensor1: Registered 'ms-ev3-smux' on 'in1:i2c81'.
lego-port port10: Registered 'in1:i2c82:mux3' on '3-0052'.
lego-port port10: Added new device 'in1:i2c82:mux3:lego-ev3-color'
lego-sensor sensor2: Registered 'ms-ev3-smux' on 'in1:i2c82'.
lego-sensor sensor3: Registered 'lego-ev3-color' on 'in1:i2c80:mux1'.
lego-sensor sensor4: Registered 'lego-ev3-color' on 'in1:i2c81:mux2'.
lego-sensor sensor5: Registered 'lego-ev3-color' on 'in1:i2c82:mux3'.

Even better: by default all 3 MUX ports are configured for the EV3 color sensor, just as I wanted!

NOTE: as of today (kernel version ‘4.4.17-14-ev3dev-ev3’) autodetection on my EV3 only works when booting with a non-default configuration:

sudo nano /etc/default/flash-kernel

 LINUX_KERNEL_CMDLINE="console=ttyS1,115200"

sudo flash-kernel
sudo reboot

This was suggested to me by David Lechner in another issue; I hope it will be fixed soon.

To use the color sensors in python I just need to know their ports. With the MUX in port ‘in1’ and 6 color sensors connected, these are the ports to use:

in1:i2c80:mux1
in1:i2c81:mux2
in1:i2c82:mux3
in2
in3
in4

And to play a note in Python I just need to know its frequency to use with the Sound.tone() function, so:

C3 = [(130.81, TONE_LENGHT)] 
D3 = [(146.83, TONE_LENGHT)] 
E3 = [(164.81, TONE_LENGHT)] 
F3 = [(174.61, TONE_LENGHT)] 
G3 = [(196.00, TONE_LENGHT)] 
A3 = [(220.00, TONE_LENGHT)] 
B3 = [(246.94, TONE_LENGHT)]

And so this was the first script for my harp:

#!/usr/bin/env python

from ev3dev.auto import *

TONE_LENGHT = 150

C4 = [(261.64, TONE_LENGHT)]   #Do4
D4 = [(293.66, TONE_LENGHT)]   #Re4
E4 = [(329.63, TONE_LENGHT)]   #Mi4
F4 = [(349.23, TONE_LENGHT)]   #Fa4
G4 = [(392.00, TONE_LENGHT)]   #Sol4
A4 = [(440.00, TONE_LENGHT)]   #La4
B4 = [(493.88, TONE_LENGHT)]   #Si4

AMB_THRESHOLD = 9

sensor1 = ColorSensor('in1:i2c80:mux1')
sensor1.mode = 'COL-AMBIENT'
sensor2 = ColorSensor('in1:i2c81:mux2')
sensor2.mode = 'COL-AMBIENT'
sensor3 = ColorSensor('in1:i2c82:mux3')
sensor3.mode = 'COL-AMBIENT'
sensor4 = ColorSensor('in2')
sensor4.mode = 'COL-AMBIENT'
sensor5 = ColorSensor('in3')
sensor5.mode = 'COL-AMBIENT'
sensor6 = ColorSensor('in4')
sensor6.mode = 'COL-AMBIENT'

# there is no sensor7 yet, I need another MUX

s1 = 0
s2 = 0
s3 = 0
s4 = 0
s5 = 0
s6 = 0
s7 = 0

while True:
    s1 = sensor1.value(0)
    s2 = sensor2.value(0)
    s3 = sensor3.value(0)
    s4 = sensor4.value(0)
    s5 = sensor5.value(0)
    s6 = sensor6.value(0)
#    s7 = sensor7.value(0)
  
    if s1 < AMB_THRESHOLD:
        Sound.tone(C4).wait()
    if s2 < AMB_THRESHOLD:
        Sound.tone(D4).wait()
    if s3 < AMB_THRESHOLD:
        Sound.tone(E4).wait()
    if s4 < AMB_THRESHOLD:
        Sound.tone(F4).wait()
    if s5 < AMB_THRESHOLD:
        Sound.tone(G4).wait()
    if s6 < AMB_THRESHOLD:
        Sound.tone(A4).wait()
#    if s7 < AMB_THRESHOLD:
#        Sound.tone(B4).wait()

So whenever the light level over one of the color sensors drops below AMB_THRESHOLD a note will play for TONE_LENGHT milliseconds.

Unfortunately the sound is monophonic (just one note can be played at a time) and it doesn’t sound like a harp at all – it sounds more like my BASIC games on the ZX Spectrum in the 80’s.

So I tried Sound.play(wave file) instead. I found some harp samples, converted them to .wav files at 44100 Hz and it really sounds much better… but the samples I found are too long, so the “artist” has to wait for the note to stop playing before moving the hand to another “string”. Not good, and also not polyphonic.
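
For reference, playing a sample from ev3dev Python looks like this (a minimal sketch; ‘harp-C4.wav’ is just a hypothetical name for one of those converted samples):

from ev3dev.ev3 import Sound

# play a pre-recorded harp sample instead of a square-wave tone
Sound.play('harp-C4.wav').wait()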

Next post I’ll show a better approach for both quality and polyphony: MIDI.

ev3dev – using IRLink with python

I got myself a HiTechnic IRLink sensor.

As of today (August 2016) ev3dev already recognizes the IRLink as a nxt-i2c sensor but there’s no language support for it. David Lechner suggested using the “direct” attribute to communicate directly with the IRLink at the I2C level.

The last time I wrote something even mildly related to I2C was about 20 years ago, for a Microchip PIC project, but well… why not?

So after lots of trial and error, reading the LEGO Power Functions RC Protocol several times and shamelessly copying code from Mike Hatton (“Parax”), Xander Soldaat and Lawrie Griffiths that I found on GitHub, the RobotC forum and the LeJOS forum, I finally managed to control a PF motor in ComboPWM mode.

In the following video, I’m increasing the motor speed (all 7 steps) then decreasing it again until it stops:

This is the Python script running on the EV3:

#!/usr/bin/python

#
# based mainly on RobotC code from Mike Hatton ("Parax") and Xander Soldaat
# but also on LeJOS code from Lawrie Griffiths
#

# assumes IRLink at Input 1 as sensor0

import sys
from time import sleep

# channel: 0..3
# motorA, motorB: 0..7

channel = 0
for motorA in (1,1,2,2,3,3,4,4,5,5,6,6,7,7,6,6,5,5,4,4,3,3,2,2,1,1,0,0):

  motorB = motorA

  iBufferSize=2
  iBuffer = bytearray(iBufferSize)

  iBuffer[0] = ((0x04 | channel) << 4) | motorB
  iBuffer[1] = motorA << 4
  check = 0xF ^ (0x04 | channel) ^ motorB ^ motorA
  iBuffer[1] = iBuffer[1] | check

  oBufferSize=14
  oBuffer = bytearray(oBufferSize)

  # clear all positions
  for i in range (0,oBufferSize):
    oBuffer[i]=0

  oBuffer[0]=0x80    # Start Bit

  oBufferIdx = 0

  for iBufferByte in range (0,2):
    for iBufferIdx in range (0,8):
      oBuffer[1 + (oBufferIdx / 8)] |= (0x80 >> (oBufferIdx % 8) )
      if ( ( ( iBuffer[iBufferByte] ) & (0x80 >> (iBufferIdx % 8) ) ) != 0 ) :
        oBufferIdx = oBufferIdx + 6
      else:
        oBufferIdx = oBufferIdx + 3

# Stop bit
  oBuffer[1+ (oBufferIdx / 8)] |= (0x80 >> (oBufferIdx % 8) )

  tailIdx = 1 + (oBufferIdx / 8) + 1

  # Tail


  if (tailIdx == 10):
    oBuffer[tailIdx]= 0x10 # IRLink message payload length
    register = 0x43
  else:
    oBuffer[tailIdx]= 0x11
    register = 0x42

  oBuffer[tailIdx+1]= 0x02 # IRLink in Power Functions Mode
  oBuffer[tailIdx+2]= 0x01 # IRLInk Start transmission 


# clear IRLink (not sure if correct but seems to improve)

  fd = open("/sys/class/lego-sensor/sensor0/direct", 'wb',0)
  fd.seek(0x41)
  fd.write(chr(0x46))
  fd.write(chr(0x44))
  fd.write(chr(0x4C))
  fd.write(chr(0x50))
  fd.close()
  sleep(0.1)

  for i in range(0,5):
    fd = open("/sys/class/lego-sensor/sensor0/direct", 'wb',0)
    fd.seek(register)
    for oBufferIdx in range (0,oBufferSize):
      fd.write(chr(oBuffer[oBufferIdx]))
    fd.close()

    # Power Functions timings (for a 5-command burst)
    if (i==1):
      sleep(0.064)
    elif (i==5):
      sleep(0.096)
    else:
      sleep(0.080)

 

Running ev3dev on a Raspberry Pi 3

A few days ago the ev3dev project launched a great feature: nightly image builds. Right after that I received a notice that they had included support for the onboard Bluetooth in the Raspberry Pi 2/3 image and it needed to be tested.

So I did test it. And found out that the onboard Bluetooth indeed works… as does the onboard Wi-Fi… as does the BrickPi, with no need to disable BT. Yeah, no more USB dongles!

The procedure is very simple – the really important step is freeing the hardware serial port for the BrickPi (both the onboard Bluetooth and the BrickPi need a UART, so a soft UART (“miniuart”) is used for BT instead of the default one).

  • get the latest nightly image build for the Pi2/Pi3 (mine was 26 July 2016) and restore it to a microSD card
  • insert the card in the Pi3
  • connect an USB keyboard and a HDMI display to the Pi3
  • power up the Pi
  • login (robot + maker) – if you cannot see the login prompt change to the proper console with Alt+F1 or Alt+F2 or Alt+F[n]
  • run ‘sudo connmanctl’ to configure BT and Wi-Fi (see this tutorial on how to configure Wi-Fi from command line; for BT just run ‘sudo connmanctl enable bluetooth’)
  • edit the ‘/boot/flash/config.txt’ and uncomment these 4 lines:
    • dtoverlay=brickpi
    • init_uart_clock=32000000
    • dtoverlay=pi3-miniuart-bt
    • core_freq=250
  • sudo reboot
  • remove the display and the keyboard and from now on just connect through Wi-Fi

To test that both Bluetooth and the BrickPi work properly I used a python script to read the NXT ultrasonic sensor (in the first input port) and change the color of my WeDo 2.0 Smart Hub from green to red:

#!/usr/bin/python

# run with sudo
# assumes NXT Ultrasonic at INPUT #1

from ev3dev.auto import *
from gattlib import GATTRequester
from time import sleep

BTdevice = "hci0"       # BlueTooth 4.0 BLE capable device

WeDo2HubAddress  = "A0:E6:F8:1E:58:57"

InputCommand_hnd = 0x3a
OutputCommand_hnd  = 0x3d

RGBAbsoluteMode_cmd = str(bytearray([01,02,06,17,01,01,00,00,00,02,01]))
RGBAbsoluteOutput_cmd = str(bytearray([06,04,03]))  # or "\x06\x04\x03"

DELAY      = 0.3

# DO NOT FORGET TO CONFIG FOR US sensor:
# sudo echo nxt-i2c > /sys/class/lego-port/port0/mode
# sudo echo "lego-nxt-us 0x01" > /sys/class/lego-port/port0/set_device
#
us = UltrasonicSensor('ttyAMA0:S1:i2c1')
assert us.connected

req = GATTRequester(WeDo2HubAddress,True,BTdevice)
sleep(DELAY)

# configure RBG LED to Absolute Mode (accepts 3 bytes for RGB instead of default Index Mode)
req.write_by_handle(InputCommand_hnd,RGBAbsoluteMode_cmd)

while(True):
  if (us.value() < 10):
    print("!")
    req.write_by_handle(OutputCommand_hnd, RGBAbsoluteOutput_cmd+chr(255)+chr(0)+chr(0))
    sleep(DELAY)
  else:
    print("-")
    req.write_by_handle(OutputCommand_hnd, RGBAbsoluteOutput_cmd+chr(0)+chr(255)+chr(0))
    sleep(DELAY)

My script needs the gattlib library to talk with Bluetooth Low Energy devices. You can install this library with ‘pip’ but first you need to install some dependencies:

sudo apt-get install pkg-config libboost-python-dev libboost-thread-dev libbluetooth-dev libglib2.0-dev python-dev

then

sudo pip install gattlib

LEGO WeDo 2.0 – playing sound

This article is part 4 of 6 of the WeDo 2.0 – reverse engineering series

Great news – LEGO Education released the WeDo 2.0 SDK today!

After digging into it, I found the information needed to control the piezo: as expected, it’s controlled by the same handle that is used for controlling the motor and the RGB LED (0x003d). The “port” is “05” and the “command” to activate the piezo is “02”, followed by a payload of “04” bytes containing:

  • the frequency in Hz (2 bytes, reversed, i.e. little-endian)
  • the duration in ms (2 bytes, also reversed)

So to play a “C” (or “Do”, 261 Hz) for 1/8 of a second (125 ms) we use this command:

[A0:E6:F8:1E:58:57][LE]> char-write-cmd 003d 05020405017D00
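
To avoid computing those hex strings by hand, here is a small Python helper (just a sketch of the encoding described above):

def piezo_command(freq_hz, duration_ms):
    # port 05, command 02, payload length 04, then frequency and
    # duration as 2 bytes each, low byte first ("reversed")
    return '050204%02X%02X%02X%02X' % (
        freq_hz & 0xFF, (freq_hz >> 8) & 0xFF,
        duration_ms & 0xFF, (duration_ms >> 8) & 0xFF)

print(piezo_command(261, 125))   # -> 05020405017D00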

So let’s hear the very first music played by a WeDo 2.0 from a Linux shell script:

#!/usr/bin/env bash

# In Ubuntu run this script with sudo
# "Imperial March on a WeDo 2.0" was inspired by https://gist.github.com/tagliati/1804108

# command: gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204

# beep(a, 500) 
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B801F401
sleep 0.5

# beep(a, 500) 
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B801F401
sleep 0.5

# beep(a, 500) 
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B801F401
sleep 0.5

# beep(f, 350)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502045D015E01
sleep 0.35

# beep(cH, 150)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502040B029600
sleep 0.15

# beep(a, 500)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B801F401
sleep 0.5

# beep(f, 350)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502045D015E01
sleep 0.35

# beep(cH, 150)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502040B029600
sleep 0.15

# beep(a, 1000)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B801E803
sleep 1.0

# beep(eH, 500)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502049302F401
sleep 0.5
    
# beep(eH, 500)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502049302F401
sleep 0.5

# beep(eH, 500)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502049302F401
sleep 0.5

# beep(fH, 350) 
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204BA025E01
sleep 0.35

# beep(cH, 150)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502040B029600
sleep 0.15

# beep(gS, 500)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502049F01F401
sleep 0.5    

# beep(f, 350)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502045D015E01
sleep 0.35

# beep(cH, 150)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502040B029600
sleep 0.15

# beep(a, 1000)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B801E803
sleep 1.0
 
# beep(aH, 500)   
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502047003F401
sleep 0.5

# beep(a, 350) 
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B8015E01
sleep 0.35

# beep(a, 150)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B8019600
sleep 0.15

# beep(aH, 500)   
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502047003F401
sleep 0.5

# beep(gSH, 250)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502043E03FA00
sleep 0.25

# beep(gH, 250)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502041003FA00
sleep 0.25

# beep(fSH, 125)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204E4027D00
sleep 0.125

# beep(fH, 125) 
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204BA027D00
sleep 0.125  
   
# beep(fSH, 250)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204E402FA00
sleep 0.25

# delay(250)
sleep 0.25

# beep(aS, 250)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204C701FA00
sleep 0.25

# beep(dSH, 500)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502046E02F401
sleep 0.5

# beep(dH, 250)  
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502044B02FA00
sleep 0.25

# beep(cSH, 250)  
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502042A02FA00
sleep 0.25

# beep(cH, 125)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502040B027D00
sleep 0.125

# beep(b, 125)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204D2017D00
sleep 0.125

# beep(cH, 250)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502040B02FA00
sleep 0.25
      
# delay(250)
sleep 0.25

# beep(f, 125)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502045D017D00
sleep 0.125

# beep(gS, 500)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502049F01F401
sleep 0.5

# beep(f, 375) 
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502045D017701
sleep 0.375

# beep(a, 125)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B8017D00
sleep 0.125
  
# beep(cH, 500)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502040B02F401
sleep 0.5

# beep(a, 375)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B8017701
sleep 0.375

# beep(cH, 125)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502040B027D00
sleep 0.125

# beep(eH, 1000)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502049302E803
sleep 1.0

# beep(aH, 500)   
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502047003F401
sleep 0.5

# beep(a, 350) 
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B8015E01
sleep 0.35

# beep(a, 150)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B8019600
sleep 0.15

# beep(aH, 500)   
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502047003F401
sleep 0.5

# beep(gSH, 250)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502043E03FA00
sleep 0.25

# beep(gH, 250)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502041003FA00
sleep 0.25

# beep(fSH, 125)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204E4027D00
sleep 0.125
    
# beep(fH, 125) 
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204BA027D00
sleep 0.125

# beep(fSH, 250)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204E402FA00
sleep 0.25

# delay(250)
sleep 0.25

# beep(aS, 250)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204C701FA00
sleep 0.25

# beep(dSH, 500)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502046E02F401
sleep 0.5

# beep(dH, 250)  
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502044B02FA00
sleep 0.25

# beep(cSH, 250)  
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502042A02FA00
sleep 0.25

# beep(cH, 125)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502040B027D00
sleep 0.125

# beep(b, 125)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204D2017D00
sleep 0.125

# beep(cH, 250)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502040B02FA00
sleep 0.25

# delay(250)
sleep 0.25

# beep(f, 250)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502045D01FA00
sleep 0.25

# beep(gS, 500)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502049F01F401
sleep 0.5 
  
# beep(f, 375)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502045D017701
sleep 0.375

# beep(cH, 125)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502040B027D00
sleep 0.125
           
# beep(a, 500)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B801F401
sleep 0.5

# beep(f, 375)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 0502045D017701
sleep 0.375

# beep(c, 125)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 05020405017D00
sleep 0.125

# beep(a, 1000)
gatttool -i hci0 -b A0:E6:F8:1E:58:57 --char-write-req -a 0x003d -n 050204B801E803
sleep 1.0