EV3 Code Pilot

This post is part 2 of 3 of  barcode

The first MINDSTORMS programmable brick, the RCX, had several “incarnations”. Most people remember the yellow brick, but there were several others, including the Code Pilot, which used light sequences as commands. Those VLL codes could be printed as barcodes:

From ‘http://www.elecbrick.com/lego/’

The Code Pilot set had a companion sheet with some codes and also key notes that could be used as a barcode piano.

So let’s try a simple set of instructions:

The barcodes are just EAN-13 codes generated online with an EAN-13 barcode generator. They all start with ‘560’ (the prefix for Portugal-based companies), the next nine digits are just 1/2/…/5 left-padded with zeros, and the last digit is the checksum, generated by the tool.
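For the record, the check digit is easy to compute ourselves – a quick Python sketch (the ‘…001’ value is just the first code of my set):

# EAN-13 check digit: weight the first 12 digits alternately 1 and 3,
# then take the distance to the next multiple of 10
def ean13_check_digit(digits12):
    s = sum(int(d) * (3 if i % 2 else 1) for i, d in enumerate(digits12))
    return (10 - s % 10) % 10

print(ean13_check_digit('560000000001'))   # -> 4, so the full code is 5600000000014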

So my first program uses a nested dictionary that pairs each EAN-13 code with an action to be spoken. The program waits for a barcode to be read, looks up the matching action, speaks it and then executes it:
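Something like this minimal sketch – the real script is linked below; the exact EAN-13 values, the phrases and the keyboard-like behaviour of the reader are my assumptions:

#!/usr/bin/python3
from ev3dev.ev3 import Sound

actions = {
    '5600000000014': {'speak': 'forward',  'do': lambda: None},   # hook real moves here
    '5600000000021': {'speak': 'backward', 'do': lambda: None},
    # ... one entry per barcode on the code sheet
}

while True:
    barcode = input().strip()               # a USB reader usually 'types' the digits + ENTER
    entry = actions.get(barcode)
    if entry:
        Sound.speak(entry['speak']).wait()  # speak the action...
        entry['do']()                       # ...then execute it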

Source code:

https://github.com/JorgePe/ev3-barcode/blob/master/codepilot01.py

The code sheet used and further developments I might make are also on GitHub.

The new LEGO Powered Up Remote Control

This post is part 1 of 2 of  LEGO Powered Up Remote Control

This week I got a question from Nathan Kunicki about using the BOOST Tacho Motor with the new Powered Up hub.

He had already succeeded with the other WeDo 2 and BOOST devices, so the motor was the last challenge. I tried a few ideas but nothing worked.

I told him that the motor was probably not supported by the current firmware… and then he told me that it works when we use the new Remote (as if it were a WeDo 2 motor, just ON or OFF).

Well… this weekend my lovely wife gave me my future birthday gift: the new LEGO Cargo Train. And it has a Powered Up remote, so… let me try it.

No, we couldn’t find a way to control the motor. But I learned a bit about the Remote (thanks Nathan!) – enough to use it as a standalone BLE controller without the LEGO Powered Up “HUB NO.4”… it’s so easy that any BLE master can use it.

And, of course, the MINDSTORMS EV3 when running ev3dev:

to be continued…


RC servo motors with linux

This post is part 1 of 2 of  LEGO-compatible RC servos

You can use RC servo motors with any Arduino. Even with a Raspberry Pi no special hardware is needed – just connect the signal wire to a free GPIO and send a short pulse every 20 ms or so, where the position of the motor is proportional to the length of the pulse (usually between 1 and 2 ms).
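For example, on a Raspberry Pi something like this should do it (a rough sketch with the RPi.GPIO library; the pin number and duty cycles are my assumptions for a typical servo):

#!/usr/bin/env python3
import RPi.GPIO as GPIO
from time import sleep

SERVO_PIN = 18                 # any free GPIO will do

GPIO.setmode(GPIO.BCM)
GPIO.setup(SERVO_PIN, GPIO.OUT)
pwm = GPIO.PWM(SERVO_PIN, 50)  # 50 Hz -> one pulse every 20 ms
pwm.start(7.5)                 # 7.5% of 20 ms = 1.5 ms -> center position
sleep(1)
pwm.ChangeDutyCycle(5.0)       # 1.0 ms -> one end of travel
sleep(1)
pwm.ChangeDutyCycle(10.0)      # 2.0 ms -> the other end
sleep(1)
pwm.stop()
GPIO.cleanup()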

A “normal” computer doesn’t have GPIO pins so some kind of controller is needed. I have a Pololu Mini Maestro that can control several RC servo motors at once through USB or UART – in my case up to 24 motors.

It works great with Linux – the Maestro Control Center is a .NET Framework application that runs fine with Mono, and after the initial setup a simple bash script is enough.

So here is the fastest (not the best) walkthrough:

Connect the Maestro with a USB cable and run ‘dmesg’ to check that it was recognized – two virtual serial ports should be created:

[ 1611.444253] usb 1-3: new full-speed USB device number 7 using xhci_hcd
[ 1611.616717] usb 1-3: New USB device found, idVendor=1ffb, idProduct=008c
[ 1611.616724] usb 1-3: New USB device strings: Mfr=1, Product=2, SerialNumber=5
[ 1611.616728] usb 1-3: Product: Pololu Mini Maestro 24-Channel USB Servo Controller
[ 1611.616732] usb 1-3: Manufacturer: Pololu Corporation
[ 1611.616735] usb 1-3: SerialNumber: 00094363
[ 1611.646820] cdc_acm 1-3:1.0: ttyACM0: USB ACM device
[ 1611.649542] cdc_acm 1-3:1.2: ttyACM1: USB ACM device
[ 1611.651109] usbcore: registered new interface driver cdc_acm
[ 1611.651112] cdc_acm: USB Abstract Control Model driver for USB modems and ISDN adapters

The first one (‘/dev/ttyACM0’) is the ‘Command Port’ and the second one (‘/dev/ttyACM1’) is the ‘TTL Serial Port’.

Download and extract the Maestro Servo Controller Linux Software from the Pololu site. To run it we need Mono and libusb; the (excellent!) User Guide gives this command:

sudo apt-get install libusb-1.0-0-dev mono-runtime libmono-winforms2.0-cil

I already had libusb and with current Ubuntu (17.04) the Mono packages are different:

Package libmono-winforms2.0-cil is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source
However the following packages replace it:
 mono-reference-assemblies-2.0 mono-devel

so I ran instead

sudo apt install mono-devel

It will work, but with an access error to the device. We can follow the manual and create a udev rule, or just use sudo:

sudo mono MaestroControlCenter

We will probably find our Maestro running in the default serial mode, i.e. ‘UART, fixed baud rate: 9600’, but we change that to ‘USB Dual Port’ so we can control our servos directly from the shell, without the Maestro Control Center. And now that we are here, let’s also look at the Status tab:

We can control our servos from here if we want.

Now back to the command line – in ‘USB Dual Port’ mode we can control our servos by sending commands to the Command Port (i.e. ‘/dev/ttyACM0’). There are several protocols available; let’s just look at the Compact Protocol:

echo -n -e "\x84\x00\x70\x2E" > /dev/ttyACM0

Four bytes are used:

  • the first byte is always 84h, meaning the “set target” command
  • the second byte is the channel number – my Maestro accepts a number between 0 and 23
  • the third and fourth bytes are the length of the pulse, in a special format:

The value is in quarters of microseconds, so for 1.5 ms (1500 µs) we use 6000. This is usually the middle (center) position of the servo.

Also, «the lower 7 bits of the third data byte represent bits 0–6 of the target (the lower 7 bits), while the lower 7 bits of the fourth data byte represent bits 7–13 of the target». Since 6000d = 1770h:

The third byte is calculated with a bitwise AND:

1770h & 7Fh = 70h

And the fourth byte is calculated with a shift and an AND:

1770h >> 7 = 2Eh

2Eh & 7Fh = 2Eh

So 1500 µs is represented as 70h 2Eh.
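The same four bytes can also be sent from Python – a sketch, assuming the pyserial package and the Command Port found above:

#!/usr/bin/python3
import serial

def set_target(port, channel, target):
    # Compact Protocol 'set target'; target is in quarter-microseconds
    port.write(bytes([0x84, channel, target & 0x7F, (target >> 7) & 0x7F]))

with serial.Serial('/dev/ttyACM0') as port:
    set_target(port, 0, 6000)   # 1500 us -> center position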

Pololu makes life easier with a bash script, ‘maestro-set-target.sh’:

#!/bin/bash
# Sends a Set Target command to a Pololu Maestro servo controller
# via its virtual serial port.
# Usage: maestro-set-target.sh DEVICE CHANNEL TARGET
# Linux example: bash maestro-set-target.sh /dev/ttyACM0 0 6000
# Mac OS X example: bash maestro-set-target.sh /dev/cu.usbmodem00234567 0 6000
# Windows example: bash maestro-set-target.sh '\\.\USBSER000' 0 6000
# Windows example: bash maestro-set-target.sh '\\.\COM6' 0 6000
# CHANNEL is the channel number
# TARGET is the target in units of quarter microseconds.
# The Maestro must be configured to be in USB Dual Port mode.
DEVICE=$1
CHANNEL=$2
TARGET=$3

byte() {
 printf "\\x$(printf "%x" $1)"
}

stty raw -F $DEVICE

{
 byte 0x84
 byte $CHANNEL
 byte $((TARGET & 0x7F))
 byte $((TARGET >> 7 & 0x7F))
} > $DEVICE

So instead of echo’ing “\x84\x00\x70\x2E” to the Command Port we can also use

./maestro-set-target.sh /dev/ttyACM0 0 6000

So now we can control a servo with common bash commands. For example, this script makes the servo sweep from one end to the other and back, in 20 increments each way, and repeats it five times:

#!/bin/bash

sleep 5
for j in `seq 1 5`;
do
  for i in `seq 4000 200 8000`;
  do
    ./maestro-set-target.sh /dev/ttyACM0 0 $i
    sleep 0.1
  done
  for i in `seq 8000 -200 4000`;
  do
    ./maestro-set-target.sh /dev/ttyACM0 0 $i
    sleep 0.1 
  done
done

So we can now control up to 24 servo motors at once. Let’s try a special servo motor, one more related to my hobby:

That’s a 4DBrix Standard Servo Motor, a motor that 4DBrix sells for controlling LEGO train or monorail models. They also sell their own USB controller, but since it is in fact a pretty common RC micro servo inside a 3D-printed LEGO-compatible ABS housing, we can also use the Maestro:

The same script:

These motors work fine with 4DBrix LEGO-compatible monorail parts:

But better yet… the Maestro also works fine with ev3dev:

I now realize it wasn’t very clever to show motors rotating slowly when the monorail switches only have two functional states, so this new video looks a little better, with the three 4DBrix servo motors and the LEGO EV3 Medium Motor changing the state of the monorail switches every second:

So I can now control 24 RC Servo motors with my MINDSTORMS EV3.

Even better: we can add several Maestros through a USB hub… so why stop at 24 motors? With 126 24-channel Mini Maestros and USB hubs we could use 3024 motors 🙂

Codatex RFID Sensor

Some time ago, when I started to use RFID tags with an automated LEGO train, I found out that there was an RFID sensor available for the MINDSTORMS NXT, the Codatex RFID Sensor for NXT:

Codatex doesn’t make them anymore, so I ordered one from BrickLink but never got it working with ev3dev. I put it on the shelf hoping that after a while, with ev3dev’s constant evolution, things would get better.

Last month Michael Brandl told me that the Codatex sensors were in fact LEGO sensors and asked if it was possible to use them with the EV3.

Well, it is possible, just not with the original LEGO firmware. At least LeJOS and RobotC have support for the Codatex RFID Sensor. But not ev3dev 🙁

So this holiday I put the Codatex sensor in my case, decided to give it another try.

I read the documentation from Codatex and the source code of LeJOS and RobotC, and after a while I was reading the sensor properties directly from the shell.

After connecting the sensor to Input Port 1, I need to set the port mode to “other-i2c”:

echo other-i2c > /sys/class/lego-port/port0/mode

Currently there are two I2C modes in ev3dev: “nxt-i2c” for known NXT devices and “other-i2c” for other I2C devices, which prevents the system from polling the I2C bus.

To read all the Codatex registers this line is enough:

/usr/sbin/i2cset -y 3 0x02 0x00 ; /usr/sbin/i2cset -y 3 0x02 0x41 0x83; sleep 0.1; /usr/sbin/i2cdump -y 3 0x02

(first wake up the sensor with a dummy write, then initialize the firmware, wait a bit and read everything)

Error: Write failed
No size specified (using byte-data access)
 0 1 2 3 4 5 6 7 8 9 a b c d e f 0123456789abcdef
00: 56 31 2e 30 00 00 00 00 43 4f 44 41 54 45 58 00 V1.0....CODATEX.
10: 52 46 49 44 00 00 00 00 00 00 00 00 00 00 00 00 RFID............
20: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
30: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
40: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
50: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
60: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
70: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
80: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
90: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
a0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
b0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
c0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
d0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
e0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
f0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................

The “Error: Write failed” is expected because the dummy write that wakes the Codatex sensor fails.

I got excited, it seemed easy.

So to read just once (singleshot mode) this should work:

/usr/sbin/i2cset -y 3 0x02 0x00 ; /usr/sbin/i2cset -y 3 0x02 0x41 0x01; sleep 0.25; /usr/sbin/i2cdump -r 0x42-0x46 -y 3 0x02

(wake up, send a singleshot read command, wait for acquisition, then read the 5 Tag ID registers)

But it didn’t work:

Error: Write failed
No size specified (using byte-data access)
 0 1 2 3 4 5 6 7 8 9 a b c d e f 0123456789abcdef
40: 00 00 00 00 00 .....

Over the next days I read lots of code from the web, even Daniele Benedettelli’s old NXC library. It seemed that my timings were wrong but I could not understand why.

After asking for help at ev3dev I found the reason: the Codatex sensor needs power on pin 1 for the RFID part. So the I2C part works, but without 9V on pin 1 all readings will fail.

LeJOS and RobotC activate power on pin 1 but the two ev3dev I2C modes don’t. To be sure, I tried LeJOS and finally I could read my tags:

package PackageCodatex;

import lejos.hardware.port.SensorPort;
import lejos.hardware.sensor.RFIDSensor;

public class Codatex
{
    public static void main(String[] args)
    {
        RFIDSensor rfid = new RFIDSensor(SensorPort.S1);
        System.out.println(rfid.getProductID());
        System.out.println(rfid.getVendorID());
        System.out.println(rfid.getVersion());
        try {Thread.sleep(2000);}
        catch (InterruptedException e) {}
        long transID = rfid.readTransponderAsLong(true);
        System.out.println(transID);
        try {Thread.sleep(5000);}
        catch (InterruptedException e) {}
    }
}

David Lechner gave some hints on how to write a driver but honestly I don’t understand at least half of it. So I made a ghetto adapter with a MINDSTORMS EV3 cable and a 9V PP3 battery, and it finally works – I can read the Codatex 4102 tags as well as my other 4001 tags:

I created a GitHub repository with the scripts to initialize the port and to use singleshot reads. Soon I will add a script for continuous reads.
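For reference, the singleshot sequence above translates to Python roughly like this (just a sketch, assuming the smbus2 package; bus 3 and address 0x02 as found before):

#!/usr/bin/python3
from smbus2 import SMBus
from time import sleep

BUS  = 3      # I2C bus of Input Port 1
ADDR = 0x02   # Codatex sensor address

with SMBus(BUS) as bus:
    try:
        bus.write_byte(ADDR, 0x00)            # dummy write to wake the sensor
    except OSError:
        pass                                  # it fails, as expected
    bus.write_byte_data(ADDR, 0x41, 0x01)     # singleshot read command
    sleep(0.25)                               # wait for tag acquisition
    tag = [bus.read_byte_data(ADDR, r) for r in range(0x42, 0x47)]
    print(':'.join('{:02x}'.format(b) for b in tag))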

An important note for those who might try the ghetto cable: despite many pictures on the web saying that pin 2 is Ground, IT IS NOT. Use pin 1 for power and pin 3 for Ground. And preferably cut the power wire, so that if you make some mistake your EV3’s pin 1 is safe.

Now I really do need to learn how to write a driver instead of hacking cables. Ouch!

Update: since yesterday (25-July-2017) the ghetto cable is no longer needed. David Lechner added the pin 1 activation feature to the “other-i2c” mode, so since kernel 4.4.78-21-ev3dev-ev3 a standard cable is enough.

Thank you David!

EV3 – minifig inventory

This post is part 2 of 2 of  EV3 and Chromecast

Now that communication with the Chromecast is working let’s make a small game with the few parts that I have in my “holiday bag”:

I’m using the LEGO Dimensions Toy Pad to read the NFC tags. I already scanned the IDs of a blue and an orange tag and saved two JPEG images – the Frenchman LEGO minifigure and the Invisible Woman (sorry, I only had one minifig with me these holidays) – in the web server folder, so each time one of these NFC tags is recognized its image is presented on the TV.

The code is based on my tutorial for using the LEGO Dimensions Toy Pad with ev3dev, I just updated it to work with Python 3:


#!/usr/bin/python3

import usb.core
import usb.util
from time import sleep
import pychromecast

TOYPAD_INIT = [0x55, 0x0f, 0xb0, 0x01, 0x28, 0x63, 0x29, 0x20, 0x4c, 0x45, 0x47, 0x4f, 0x20, 0x32, 0x30, 0x31, 0x34, 0xf7, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00]

OFF   = [0,0,0]
RED   = [255,0,0]
GREEN = [0,255,0]
BLUE  = [0,0,255]
ORANGE= [255,30,0]

ALL_PADS   = 0
CENTER_PAD = 1
LEFT_PAD   = 2
RIGHT_PAD  = 3

# Actions
TAG_INSERTED = 0
TAG_REMOVED  = 1

# UIDs can be retrieved with Android App (most probably in hexadecimal)
bluetag = (4, 77, 198, 218, 162, 64, 129)         # a Blue Tag from LEGO Dimensions
orangetag = (4, 91, 182, 122, 177, 73, 129)       # an Orange Tag from LEGO Dimensions

DELAY_PLAY = 1.75     # 1.5 NOT OK

def init_usb():
    global dev

    dev = usb.core.find(idVendor=0x0e6f, idProduct=0x0241)

    if dev is None:
        print('Device not found')
    else:
        print('Device found:')
        if dev.is_kernel_driver_active(0):
            dev.detach_kernel_driver(0)

        print(usb.util.get_string(dev, dev.iProduct))

        dev.set_configuration()
        dev.write(1,TOYPAD_INIT)

    return dev


def send_command(dev,command):

    # checksum is the sum of all bytes, modulo 256
    checksum = sum(command) % 256
    message = command + [checksum]

    # pad message to 32 bytes
    while len(message) < 32:
        message.append(0x00)

    # send message
    dev.write(1, message)

    return


def switch_pad(pad, colour):
    send_command(dev,[0x55, 0x06, 0xc0, 0x02, pad, colour[0], colour[1], colour[2],])
    return


def uid_compare(uid1, uid2):
    match = True
    for i in range(0,7):
        if (uid1[i] != uid2[i]) :
            match = False
    return match 


def main():

    # init chromecast
    chromecasts = pychromecast.get_chromecasts()
    cast = next(cc for cc in chromecasts if cc.device.friendly_name == "NOMAD")
    cast.wait()
    mc = cast.media_controller
    print("Blacking chromecast...")
    mc.play_media('http://192.168.43.104:3000/black.png', 'image/png')

    result=init_usb()
    # print(result)
    if dev != None :
        print('Running...')
        mc.stop()
        mc.play_media('http://192.168.43.104:3000/presentminifig.png', 'image/png')
        print('Present a minifig')

        while True:
            try:
                in_packet = dev.read(0x81, 32, timeout = 10)
                bytelist = list(in_packet)

                if not bytelist:
                    pass
                elif bytelist[0] != 0x56: # NFC packets start with 0x56
                    pass
                else:
                    pad_num = bytelist[2]
                    uid_bytes = bytelist[6:13]
#                    print(uid_bytes)
                    match_orange = uid_compare(uid_bytes, orangetag)
                    match_blue = uid_compare(uid_bytes, bluetag)

                    action = bytelist[5]
                    if action == TAG_INSERTED :
                        if match_orange:
                            switch_pad(pad_num, ORANGE)
                            mc.play_media('http://192.168.43.104:3000/french-man-2.png', 'image/png')
                        elif match_blue:
                            mc.play_media('http://192.168.43.104:3000/invisible-woman-2b.png', 'image/png')
                            switch_pad(pad_num, BLUE)
                        else:
                            # some other tag
                            switch_pad(pad_num, GREEN)
                    else:
                        # some tag removed
                        switch_pad(pad_num, OFF)
                        mc.stop()
                        # sleep(1)
                        mc.play_media('http://192.168.43.104:3000/presentminifig.png', 'image/png')
                        sleep(DELAY_PLAY)

            except usb.USBError:
                pass

        switch_pad(ALL_PADS,OFF)
    return

if __name__ == '__main__':
    main()
  

You probably need to install pyusb for python3. With Ubuntu I just needed

sudo apt install python3-usb

but it is not available for Debian Jessie so

sudo pip3 install pyusb --pre

(ev3dev is changing to Debian Stretch, which also has python3-usb)

Using Grove devices with the EV3

After David Lechner announced ev3dev support for it, I’ve been planning to offer myself a couple of BrickPi 3 boards from Dexter Industries (just one is not enough, since the BrickPi 3 supports daisy chaining).

While I wait for European distributors to sell it (and my budget to stabilize), and since I’m also playing with magnets, I ordered a mindsensors.com Grove adapter so I can start testing Grove devices with my EV3. I also got two Grove devices from Seeed Studio at my local robotics store and will start with the easiest one: Grove – Electromagnet.

ev3dev doesn’t have a Grove driver yet, but since the adapter is an I2C device it recognizes it and configures it as an I2C host:

[  563.590748] lego-port port0: Added new device 'in1:nxt-i2c-host'
[  563.795525] i2c-legoev3 i2c-legoev3.3: registered on input port 1

Addressing the Grove adapter is easy, we just need to follow the ev3dev documentation (Appendix C: I2C devices):

robot@ev3dev:~$ ls /dev/i2c-in*
/dev/i2c-in1

robot@ev3dev:~$ udevadm info -q path -n /dev/i2c-in1        
/devices/platform/legoev3-ports/lego-port/port0/i2c-legoev3.3/i2c-3/i2c-dev/i2c-3

So the Grove adapter is on I2C bus #3. According to the mindsensors.com User Guide its address is 0x42. That’s the unshifted address; for i2c-tools we need to use the shifted address, 0x42 >> 1 = 0x21 (at the end of the ev3dev Appendix C doc there is a table with both addresses).

robot@ev3dev:~$ sudo i2cdump 3 0x21

     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f    0123456789abcdef
00: 56 31 2e 30 32 00 00 00 6d 6e 64 73 6e 73 72 73    V1.02...mndsnsrs
10: 47 61 64 70 74 6f 72 00 00 00 00 00 00 00 00 00    Gadptor.........
20: 4a 61 6e 20 30 34 20 32 30 31 35 00 31 32 46 31    Jan 04 2015.12F1
30: 38 34 30 00 00 00 00 00 00 00 00 00 00 00 00 00    840.............
40: 00 97 03 32 00 00 00 00 00 00 00 00 00 00 00 00    .??2............
50: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
60: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
70: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
80: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
90: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
a0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
b0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
c0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
d0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
e0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................
f0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00    ................

According to the User Guide, this is the expected content of the first 24 registers:

0x00-0x07: Software version – Vx.nn
0x08-0x0f: Vendor Id – mndsnsrs
0x10-0x17: Device ID – Gadptor

So I have a v1.02 Grove adapter.

To use the Grove – Electromagnet I just need to send a “T” (0x54) to the Command Register (0x41) to set the Grove Adapter into “Transmit” mode, and then set the Operation Mode, which can be “Digital_0” (sending 0x02 to the Operation Mode register at 0x42) or “Digital_1” (sending 0x03 to the same register).

So to turn the electromagnet ON:

sudo i2cset -y 3 0x21 0x41 0x54
sudo i2cset -y 3 0x21 0x42 0x03

And to turn it OFF:

sudo i2cset -y 3 0x21 0x41 0x54
sudo i2cset -y 3 0x21 0x42 0x02
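The same sequence from Python, if we prefer – a sketch, assuming the smbus2 package and the bus/address found above:

#!/usr/bin/python3
from smbus2 import SMBus
from time import sleep

BUS, ADDR         = 3, 0x21
CMD_REG, MODE_REG = 0x41, 0x42
TRANSMIT          = 0x54          # 'T'
DIG_0, DIG_1      = 0x02, 0x03    # Digital_0 / Digital_1

def set_output(bus, on):
    bus.write_byte_data(ADDR, CMD_REG, TRANSMIT)
    bus.write_byte_data(ADDR, MODE_REG, DIG_1 if on else DIG_0)

with SMBus(BUS) as bus:
    set_output(bus, True)     # electromagnet ON
    sleep(2)
    set_output(bus, False)    # and OFF again - it gets hot quickly!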

Just a warning: with an operating current of 400 mA when ON, the electromagnet gets hot very quickly – not enough to hurt, but don’t forget to switch it OFF after use to prevent draining the EV3 batteries.

The same method (“T” + “Digital_0” / “Digital_1”) can be used with several other Grove devices, like the Grove – Water Atomization:

(a great way to add fog effects to our creations – just be careful with short circuits; and if you add some kind of perfume you can also have scent effects)

Final note: you can use the mindsensors.com Grove Adapter with the native EV3 firmware (just import the available EV3-G block), but if you are using ev3dev like me, be sure to use a recent kernel (as of today, “4.4.61-20-ev3dev-ev3”) because older versions had a bug that caused some communication problems with I2C devices (and the Grove Adapter is an I2C device).

Triplex – a holonomic robot

This post is part 1 of 3 of  Triplex

A few months ago, trying to find a use for a new LEGO brick found in the NEXO Knights sets, I made my first omni wheel. It worked, but it was too fragile to be used in a robot, so I decided to copy one of Isogawa’s omni wheels and keep working on a holonomic robot with 3 wheels.

Why 3 wheels?

At first I only had NEXO parts to build 3 wheels, but I enjoyed the experience – my first RC experiments seemed like lobsters. Controlling the motion is not easy, but I found a very good post by Miguel from The Technic Gear, so it was easy to derive my own equations – a sketch follows below. But Power Functions motors don’t allow precise control of speed, so I could not make the robot move in some directions. I needed regulated motors like those used with MINDSTORMS EV3.
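This is the usual kinematics for three omni wheels spaced 120° apart (the wheel angles and sign conventions here are my assumptions, not necessarily Miguel’s exact form):

#!/usr/bin/env python3
import math

WHEEL_ANGLES = [math.radians(a) for a in (0, 120, 240)]

def wheel_speeds(vx, vy, omega, R=1.0):
    # speed of each wheel for a desired body velocity (vx, vy) and
    # rotation rate omega; R is the wheel distance from the center
    return [-math.sin(a) * vx + math.cos(a) * vy + R * omega
            for a in WHEEL_ANGLES]

# pure rotation: all three wheels run at the same speed...
print(wheel_speeds(0, 0, 1.0))
# ...and pure translation along x: one wheel stays still while the other
# two run at the same speed in opposite directions, which is exactly the
# 'forward/backward' behaviour in the demo described below
print(wheel_speeds(1.0, 0, 0))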

So after assembling three of Isogawa’s omni wheels and making a frame that keeps each wheel from separating from its motor, it was just a matter of building a triangular frame to join all 3 motors and support the EV3:

First tests with regulated motor control seem promising: Triplex is fast enough and doesn’t fall apart. It drifts a bit, so I’ll probably use a gyro sensor or a compass to correct it.

In this demo video I show Triplex being wirelessly controlled from my laptop keyboard through an SSH session. It just walks “forward” or “backward” (only two motors are used, running at the same speed in opposite directions) or rotates “left” or “right” (all motors are used, running at the same speed in the same direction).

For the code used in this demo I copied a block of code from Nigel Ward’s EV3 Python site that solved a problem I’d been having for a long time: how do I read the keyboard in Python without waiting for ENTER and without installing pygame or another complex/heavy library?

#!/usr/bin/env python3

# shamelessly based on
# https://sites.google.com/site/ev3python/learn_ev3_python/keyboard-control
#

import termios, tty, sys
from ev3dev.ev3 import *

TIME_ON = 250

motor_A = MediumMotor('outA')
motor_B = MediumMotor('outB')
motor_C = MediumMotor('outC')

#==============================================

def getch():
    fd = sys.stdin.fileno()
    old_settings = termios.tcgetattr(fd)
    tty.setcbreak(fd)
    ch = sys.stdin.read(1)
    termios.tcsetattr(fd, termios.TCSADRAIN, old_settings)
    
    return ch

#==============================================

def forward():
    motor_A.run_timed(speed_sp=-1200, time_sp=TIME_ON)
    motor_C.run_timed(speed_sp=1200, time_sp=TIME_ON)

#==============================================

def backward():
    motor_A.run_timed(speed_sp=1200, time_sp=TIME_ON)
    motor_C.run_timed(speed_sp=-1200, time_sp=TIME_ON)

#==============================================

def turn_left():
    motor_A.run_timed(speed_sp=1200, time_sp=TIME_ON)
    motor_B.run_timed(speed_sp=1200, time_sp=TIME_ON)
    motor_C.run_timed(speed_sp=1200, time_sp=TIME_ON)

#==============================================

def turn_right():
    motor_A.run_timed(speed_sp=-1200, time_sp=TIME_ON)
    motor_B.run_timed(speed_sp=-1200, time_sp=TIME_ON)
    motor_C.run_timed(speed_sp=-1200, time_sp=TIME_ON)

#==============================================

def stop():
    # stop all motors - this function was missing in the original snippet
    motor_A.stop()
    motor_B.stop()
    motor_C.stop()

#==============================================

print("Running")
while True:
    k = getch()
    print(k)
    if k == 'a':
        forward()
    if k == 'z':
        backward()
    if k == 'o':
        turn_left()
    if k == 'p':
        turn_right()
    if k == ' ':
        stop()
    if k == 'q':
        exit()

Thanks for sharing, Nigel!

Now let’s learn a bit of math with Python.

For those who might be interested, I also have some photos showing the evolution of the project.


EV3 Chaos

I need to practice Python – not just for ev3dev/EV3, but surely that will be my main target.

So today I remembered the 80’s… chaos and fractals, and my 8088 taking hours to draw a Mandelbrot set on an amber Hercules screen.

So I found a simple Python script from Lennart Poettering and adapted it to work on the EV3:

#!/usr/bin/python3

# EV3 03m58s - video: https://youtu.be/w5aKqmXz_Wk
# based on a python script from  Lennart Poettering
# found here http://0pointer.de/blog/projects/mandelbrot.html

from PIL import Image, ImageDraw
import math, colorsys
import ev3dev.ev3 as ev3
from time import sleep

lcd = ev3.Screen()
lcd.clear()

sleep(1)

dimensions = (178, 128)
scale = 1.0/(dimensions[0]/3)
center = (2.0,1.0)
iterate_max = 15
colors_max = 2

img = Image.new("1", dimensions,"white")

d = ImageDraw.Draw(img)

# Calculate the mandelbrot sequence for the point c with start value z
def iterate_mandelbrot(c, z = 0):
    for n in range(iterate_max + 1):
        z = z*z +c
        if abs(z) > 2:
            return n
    return None

# Draw our image
for y in range(dimensions[1]):
    for x in range(dimensions[0]):
        c = complex(x * scale - center[0], y * scale - center[1])

        n = iterate_mandelbrot(c)

        if n is None:
            v = 1
        else:
            v = n/100.0

        if v > 0.5:
            d.point((x,y), fill = 0)
        else:
            d.point((x,y), fill = 1)

        # refresh the EV3 LCD after every pixel so we can watch the set being drawn
        lcd.image.paste(img, (0,0))
        lcd.update()

del d
img.save("result.png")
sleep(1)

My EV3 is running the ev3dev-jessie-2016-12-21 release. No need to install PIL or anything else – just create the script, give it execution permissions and run it.

The script takes 3m58s to run. The next video shows the result (at 4× speed):

And also the output file:

Ah, those mighty 80’s!!

Virtual Mindstorms – using LEGO EV3 software on Linux

Yesterday Marc-André Bazergui encouraged me to make a video showing how to use the LEGO MINDSTORMS EV3 Software inside a virtual machine. It is a shame that a product running Linux inside can only be used on PC or Mac – that’s one of the reasons I started using ev3dev, as I only have Linux systems (laptops, Raspberry Pis, old DIY desktops without a Windows license…).

I got my first EV3 exactly 3 years ago, as a birthday gift from my wife. I don’t remember if I ever installed the Windows software in a VM before – I did install it once or twice in Ubuntu with Wine (not sure why), and I did install Microsoft Robotics Developer Studio in a VMware Workstation virtual machine and do remember connecting it to the EV3 through a Bluetooth USB dongle (most modern hypervisors have this nice feature of allowing a local device on the host to be passed through into the guest).

I no longer have VMware Workstation, but I have used innotek VirtualBox in the past and knew that Oracle somehow managed to keep it alive after buying it (Oracle has the morbid habit of poisoning every good thing it owns – Java, Solaris, OpenOffice, MySQL…).

So I installed Oracle VM VirtualBox 5.1.4 (there is even an x64 .deb package for Ubuntu 16.04 “Xenial”) and after that the VirtualBox 5.1.4 Oracle VM VirtualBox Extension Pack.

It was quite easy and also very fast. After that I installed a licensed version of Microsoft Windows 8 Professional (also x64) – this is my work laptop, so people immediately started making fun of me: hey, he is installing Windows on his laptop, finally!

The rest of the process was also quite easy – as I thought, it is possible to use a Bluetooth USB dongle and also just a direct USB cable connection:

  • create a virtual machine
  • make sure “Enable USB Controller” is checked and the USB 2.0 (EHCI) Controller is selected – it might also work with USB 3.0
  • add a USB Device Filter for each USB device you want to pass through into the VM (the EV3 itself and/or the Bluetooth dongle)
  • install Windows
  • present the VirtualBox Guest Additions CD Image and install it
  • define a Shared Folder so you can pass drivers and binaries into the VM
  • if the Bluetooth dongle is not automatically configured, install the proper drivers
  • pair the EV3 (or plug in the USB cable)
  • install the LEGO MINDSTORMS EV3 software and run it

I made a video showing every step (I just skipped the LEGO software, as it’s pretty straightforward):

Just one note: although the USB cable connection seems to work fine, I tried to upgrade my EV3 firmware several times with no success – every single time it hangs at 0%. Perhaps it behaves better with another Windows version… who knows?

Edit: Laurens Valk and David Lechner know. So I made a second post showing how to upgrade the firmware.

LEGO laser harp – part II

This post is part 2 of 2 of  LEGO Laser Harp

About 10 years ago I offered my wife an M-Audio USB MIDI keyboard and installed Ubuntu Studio on a computer so she could play some piano at home. She was so amazed with the possibility of generating a music sheet while playing that she almost accepted the idea of using Linux… almost 🙂

I remember that at that time I used timidity++ as a software MIDI synthesizer and tuned ALSA (one of the several Linux sound systems, perhaps the most widely used) and the preemptive kernel to work almost flawlessly with the Creative Labs sound card. My wife didn’t enjoy the KDE experience; Firefox was OK for her, but OpenOffice was terrible with the spreadsheets she used. Finally, when our first kid was born, she attended some English lessons at Wall Street Institute and we found out that the online lessons required an odd combination of an old version of Java, ActiveX and IE… so she returned to Windows XP and never looked back.

10 years is a LOT of time in computer systems, but ALSA is still around, even on ev3dev. So I installed timidity++ and tried to play a MIDI file… only to find that an ALSA module not currently available in the ev3dev kernel is required just for MIDI.

I googled for alternatives and found fluidsynth, with an unexpected bonus: there is a quite interesting Python library, mingus, that works with fluidsynth. So I installed it on my Ubuntu laptop and in a few minutes I was playing harp – amazing!

sudo apt-get install fluidsynth
sudo easy_install mingus
python
>>> from mingus.midi import fluidsynth
>>> from mingus.containers.note import Note
>>> fluidsynth.init("/usr/share/sounds/sf2/FluidR3_GM.sf2", "alsa")
>>> fluidsynth.set_instrument(1, 46)
>>> fluidsynth.play_Note(Note("C-3"))

In the previous example I just import the fluidsynth and Note parts of the library, initialize fluidsynth to work with ALSA loading the soundfont that comes with it, choose harp (instrument number 46) and play C3.

And what about polyphony? The correct way is to use a NoteContainer:

from mingus.containers import NoteContainer
fluidsynth.play_NoteContainer(NoteContainer(["B-3", "C-3", "F-3"]))

but the lazy way is… just play several notes in a fast sequence.
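Something like this, reusing the names already imported above:

# 'lazy' polyphony: trigger the notes one right after another,
# so close together that they sound like a chord
for note in ["B-3", "C-3", "F-3"]:
    fluidsynth.play_Note(Note(note))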

So, let’s do it on ev3dev!

Oops, fluidsynth also needs an ALSA module that is not available in the current ev3dev kernel.

I’m not a Linux music expert – not even a Linux expert! So after some more googling I gave up and asked for help in the ev3dev GitHub project. And once again David agreed to include ALSA MIDI support in future kernels, so I’ll just wait a bit.

Oh, but I can’t wait…

And what if I read the color sensors on ev3dev and play the music on my laptop?

ALSA, once again, supports something like client/server MIDI communication with the “aseqnet” and “aconnect” commands, and some people are already using it with the Raspberry Pi!

Yeah, I should have guessed… “aconnect” requires an ALSA MIDI module that is not available in the current ev3dev kernel.

OK, let’s use MQTT: configure my EV3 as a publisher and my Ubuntu laptop as a subscriber and just send some notes as messages.

On the EV3:

sudo apt-get install mosquitto
sudo easy_install paho-mqtt

The publisher script is “harp-mqtt-pub.py”:

#!/usr/bin/env python

from ev3dev.auto import *
from time import sleep
import paho.mqtt.client as mqtt

DELAY = 0.01

# should have an auto-calibrate function
AMB_THRESHOLD = 9

sensor1 = ColorSensor('in1:i2c80:mux1')
sensor1.mode = 'COL-AMBIENT'
sensor2 = ColorSensor('in1:i2c81:mux2')
sensor2.mode = 'COL-AMBIENT'
sensor3 = ColorSensor('in1:i2c82:mux3')
sensor3.mode = 'COL-AMBIENT'
sensor4 = ColorSensor('in2')
sensor4.mode = 'COL-AMBIENT'
sensor5 = ColorSensor('in3')
sensor5.mode = 'COL-AMBIENT'
sensor6 = ColorSensor('in4')
sensor6.mode = 'COL-AMBIENT'

# there is no sensor7 yet, I need another MUX

s1 = 0
s2 = 0
s3 = 0
s4 = 0
s5 = 0
s6 = 0
s7 = 0

client = mqtt.Client()
client.connect("localhost",1883,60)

print 'Running...'

while True:
    key_touched = False
    s1 = sensor1.value(0)
    s2 = sensor2.value(0)
    s3 = sensor3.value(0)
    s4 = sensor4.value(0)
    s5 = sensor5.value(0)
    s6 = sensor6.value(0)
#    s7 = sensor7.value(0)

    if s1 < AMB_THRESHOLD:
        client.publish("topic/Harp", "C-3")
        key_touched=True
    if s2 < AMB_THRESHOLD:
        client.publish("topic/Harp", "D-3")
        key_touched=True
    if s3 < AMB_THRESHOLD:
        client.publish("topic/Harp", "E-3")
        key_touched=True
    if s4 < AMB_THRESHOLD:
        client.publish("topic/Harp", "F-3")
        key_touched=True
    if s5 < AMB_THRESHOLD:
        client.publish("topic/Harp", "G-3")
        key_touched=True
    if s6 < AMB_THRESHOLD:
        client.publish("topic/Harp", "A-3")
        key_touched=True
#    if s7 < AMB_THRESHOLD:
#        client.publish("topic/Harp", "B-3")
#        key_touched=True

    if key_touched == True:
        sleep(DELAY)

On the Ubuntu laptop side:

sudo easy_install paho-mqtt

The subscriber script is “harp-mqtt-sub.py”

#!/usr/bin/env python

import paho.mqtt.client as mqtt
from mingus.midi import fluidsynth
from mingus.containers.note import Note

EV3_IP = "192.168.43.35"

SOUNDFONT = 'Concert_Harp.sf2'
INSTRUMENT = 46 # Harp

NOTES = ['C-3','D-3','E-3','F-3','G-3','A-3','B-3']

def on_connect(client, userdata, flags, rc):
    print("Connected with result code "+str(rc))
    client.subscribe("topic/Harp")

def on_message(client, userdata, msg):
    if (msg.payload in NOTES):
        print msg.payload
        fluidsynth.play_Note(Note(msg.payload))
    
client = mqtt.Client()
client.connect(EV3_IP,1883,60)

client.on_connect = on_connect
client.on_message = on_message

fluidsynth.init(SOUNDFONT, "alsa")
fluidsynth.set_instrument(1, INSTRUMENT)

client.loop_forever()

And guess what? It works!!! I just love linux and open source!

I will keep waiting for David Lechner to include ALSA MIDI support in the ev3dev kernel. I’m not so sure there is enough horsepower in the EV3 to load a soundfont and play it with acceptable latency, but if I can at least use the MIDI client/server functionality I can drop MQTT.

An interesting possibility that this client/server design allows is to scale my harp easily: with just a second EV3 (2 MUX each) I can make a 13-string harp with almost no modification to my code.