Triplex v0.4

This post is part 2 of 3 of  Triplex

Triplex v0.3 was demoed at São João da Madeira’s BRInCKa 2017, remotely controlled from my laptop through an ssh session. As expected, some people found it similar to a lobster, and a few visitors noticed the omniwheels and asked for further details.

The best moment came after the exhibition closed – a private test drive session just for ‘Pocas’, our LUG mosaic expert:

One of these days I do have to complete my wifi-to-IR gateway so Pocas can drive Technic Power Functions models like the 42065 RC Tracked Racer.

Now one of the lessons learned at BRInCKa 2017 was that Triplex was bending under its own weight. So last weekend I redesigned it to be more solid. Version 0.4 “legs” are now a little longer and overall I think it looks more elegant:

I also managed to get the math working and tested my matrix in Python:

 0.667      0        0.333
-0.333    -0.575     0.333 
-0.333     0.575     0.333
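For the record, these coefficients fit the usual three-omniwheel kinematic pattern, assuming the wheels are mounted at 0º, 240º and 120º and the third column is the rotation input:

$$
\begin{pmatrix} f_1 \\ f_2 \\ f_3 \end{pmatrix}
=
\frac{1}{3}
\begin{pmatrix}
2\cos 0^\circ & 2\sin 0^\circ & 1 \\
2\cos 240^\circ & 2\sin 240^\circ & 1 \\
2\cos 120^\circ & 2\sin 120^\circ & 1
\end{pmatrix}
\begin{pmatrix} v_x \\ v_y \\ \omega \end{pmatrix}
$$

(Strictly, 2·sin 240º / 3 ≈ −0.577, so the −0.575 above is probably just a rounding difference.)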

To test moving it in 24 different directions (multiples of 15º) I used this Python script (with some simplifications, since I don’t want it to rotate):

#!/usr/bin/env python3
from math import cos,sin
from time import sleep
import ev3dev.ev3 as ev3

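# kinematic matrix from above (the third column is the rotation input, unused in this test)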
M11 = 0.667
M12 = 0
M13 = 0.333
M21 = -0.333
M22 = -0.575
M23 = 0.333 
M31 = -0.333
M32 = 0.575
M33 = 0.333

SPEED = 1000
TIME = 1200
PAUSE = 0.8
PI = 3.14159

m1 = ev3.MediumMotor('outA')
m2 = ev3.MediumMotor('outB')
m3 = ev3.MediumMotor('outC')

# sweep through the 24 directions, in steps of PI/12 radians (15º)

for x in range(0, 24):

    # move in direction a
    a = x*PI/12
    ax = cos(a)
    ay = sin(a)

    f1 = M11 * ax + M12 * ay
    f2 = M21 * ax + M22 * ay
    f3 = M31 * ax + M32 * ay

    s1 = f1 * SPEED
    s2 = f2 * SPEED
    s3 = f3 * SPEED

    m1.run_timed(time_sp=TIME, speed_sp=s1)
    m2.run_timed(time_sp=TIME, speed_sp=s2)
    m3.run_timed(time_sp=TIME, speed_sp=s3)

    sleep(TIME/1000)
    sleep(PAUSE)

    # move back in the opposite direction (a + PI)
    a = PI + x*PI/12
    ax = cos(a)
    ay = sin(a)

    f1 = M11 * ax + M12 * ay
    f2 = M21 * ax + M22 * ay
    f3 = M31 * ax + M32 * ay

    s1 = f1 * SPEED
    s2 = f2 * SPEED
    s3 = f3 * SPEED

    m1.run_timed(time_sp=TIME, speed_sp=s1)
    m2.run_timed(time_sp=TIME, speed_sp=s2)
    m3.run_timed(time_sp=TIME, speed_sp=s3)

    sleep(TIME/1000)
    sleep(PAUSE)

The result can be seen in this video:

The robot drifts a lot after the 48 moves and also rotates a bit. I will have to compensate for that with a gyro (and probably better wheels, I’m considering mecanum wheels). But the directions are pretty much as expected.
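A minimal sketch of the gyro idea (untested, and assuming an EV3 Gyro Sensor plugged into any input port): read the accumulated angle, turn it into a small rotation command and feed it through the matrix’s third column, which the script above leaves unused.

#!/usr/bin/env python3
# untested sketch: cancel the rotation drift with a gyro
import ev3dev.ev3 as ev3

gyro = ev3.GyroSensor()
gyro.mode = 'GYRO-ANG'    # accumulated angle, in degrees

KP = 0.05                 # proportional gain, needs tuning

def wheel_factors(ax, ay):
    # rotation command proportional to the heading error,
    # fed through the third column (0.333) of the matrix
    w = -KP * gyro.value()
    f1 = 0.667 * ax + 0.333 * w
    f2 = -0.333 * ax - 0.575 * ay + 0.333 * w
    f3 = -0.333 * ax + 0.575 * ay + 0.333 * w
    return (f1, f2, f3)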

I uploaded a couple of photos to the Triplex Project Album. Will try to add a few more later on.

LEGO Voice Control – EV3

This post is part 2 of 2 of  LEGO Voice Control

And now the big test – will it work with EV3?

So, ev3dev updated:

Linux ev3dev 4.4.47-19-ev3dev-ev3 #1 PREEMPT Wed Feb 8 14:15:28 CST 2017 armv5tejl GNU/Linux

I can’t find any microphone at the moment so I’ll use the mic of my Logitech C270 webcam – ev3dev sees it as a UVC device, as you can see with dmesg:

...
[ 1343.702215] usb 1-1.2: new full-speed USB device number 7 using ohci
[ 1343.949201] usb 1-1.2: New USB device found, idVendor=046d, idProduct=0825
[ 1343.949288] usb 1-1.2: New USB device strings: Mfr=0, Product=0, SerialNumber=2
[ 1343.949342] usb 1-1.2: SerialNumber: F1E48D60
[ 1344.106161] usb 1-1.2: set resolution quirk: cval->res = 384
[ 1344.500684] Linux video capture interface: v2.00
[ 1344.720788] uvcvideo: Found UVC 1.00 device <unnamed> (046d:0825)
[ 1344.749629] input: UVC Camera (046d:0825) as /devices/platform/ohci.0/usb1/1-1/1-1.2/1-1.2:1.0/input/input3
[ 1344.772321] usbcore: registered new interface driver uvcvideo
[ 1344.772372] USB Video Class driver (1.1.1)
[ 1352.171498] usb 1-1.2: reset full-speed USB device number 7 using ohci
...

and we can check with “alsamixer” that ALSA works fine with the webcam’s internal microphone:

First press F6 to select the sound card (for ALSA, the webcam is a sound card)

Then press F5 to view all sound devices – there is just one, the mic:

We also need to know how ALSA addresses the mic:

arecord -l
**** List of CAPTURE Hardware Devices ****
card 1: U0x46d0x825 [USB Device 0x46d:0x825], device 0: USB Audio [USB Audio]
  Subdevices: 1/1
  Subdevice #0: subdevice #0

Card 1, Device 0 means we should use ‘hw:1,0’

Now we just follow the same process we used with Ubuntu. First we install pocketsphinx:

sudo apt install pocketsphinx
...
The following extra packages will be installed:
  javascript-common libblas-common libblas3 libjs-jquery liblapack3 libpocketsphinx1 libsphinxbase1
  pocketsphinx-hmm-en-hub4wsj pocketsphinx-lm-en-hub4
Suggested packages:
  apache2 lighttpd httpd
The following NEW packages will be installed:
  javascript-common libblas-common libblas3 libjs-jquery liblapack3 libpocketsphinx1 libsphinxbase1
  pocketsphinx pocketsphinx-hmm-en-hub4wsj pocketsphinx-lm-en-hub4
0 upgraded, 10 newly installed, 0 to remove and 0 not upgraded.
Need to get 8910 kB of archives.
After this operation, 30.0 MB of additional disk space will be used.
..

Although the Ubuntu and Debian packages seem to be the same, the maintainers made some different choices: in Ubuntu, the ‘pocketsphinx-hmm-en-hub4wsj’ and ‘pocketsphinx-lm-en-hub4’ packages are missing.

So we copy 3 files from our previous work in Ubuntu:

  • keyphrase_list.txt
  • 0772.lm
  • 0772.dic

And we test it:

pocketsphinx_continuous -kws keyphrase_list.txt -adcdev hw:1,0 -lm 0772.lm -dict 0772.dic -inmic yes -logfn /dev/null

We get a “Warning: Could not find Capture element” but… yes, it works!

Of course it is slow… we see a big delay at startup until it displays “READY….”, and also a big delay between each “Listening…” cycle. But it works! Isn’t open source great?

So we install expect to use our pipe again:

sudo apt install expect
mkfifo pipe

and we rewrite our ‘transmitter.sh’ to command two EV3 motors (let’s call it “controller.sh” this time):

#!/bin/bash

while read -a words
do
case "${words[1]}" in

  move)
    if [ "${words[2]}" = "forward" ]; then
      echo "FRONT"
      echo run-timed > /sys/class/tacho-motor/motor0/command
      echo run-timed > /sys/class/tacho-motor/motor1/command
      sleep 0.2
    fi

    if [ "${words[2]}" = "backward" ]; then
      echo "BACK"
      sleep 0.2
    fi
    ;;

  turn)
    if [ "${words[2]}" = "left" ]; then
      echo "LEFT"
      echo run-timed > /sys/class/tacho-motor/motor1/command
      sleep 0.2
    fi

    if [ "${words[2]}" = "right" ]; then
      echo "RIGHT"
      echo run-timed > /sys/class/tacho-motor/motor0/command
      sleep 0.2
    fi    
    ;;

  stop)
    echo "STOP"
    ;;

  *)
    echo "?"
    echo "${words[1]}"
    echo "${words[2]}"
    ;;
esac
done

For some reason I don’t yet understand, I had to change 2 things that worked fine with Ubuntu:

  • increase the index of the arguments (“${words[1]}” and “${words[2]}” instead of “${words[0]}” and “${words[1]}”)
  • use capital letters for the keywords

This script sends “run-timed” commands to the motor file descriptors (you can read a good explanation in the ev3dev tutorial ‘Using the Tacho-Motor Class’). I didn’t write commands for “move backward” this time (it would require extra lines to change direction – not difficult, but I don’t want to make the script too long).
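By the way, these raw sysfs writes are just the shell version of what the Python library did in the Triplex posts. Assuming, say, a large motor on port B, the equivalent one-liner would be:

import ev3dev.ev3 as ev3

m = ev3.LargeMotor('outB')
# sets the speed and time setpoints, then writes 'run-timed' to the command attribute
m.run_timed(time_sp=200, speed_sp=1050)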

Before we can use this script we need to initialize the motors, so we use this other script, “init.sh”:

#!/bin/bash

echo 1050 > /sys/class/tacho-motor/motor0/speed_sp
echo 200 > /sys/class/tacho-motor/motor0/time_sp
echo 1050 > /sys/class/tacho-motor/motor1/speed_sp
echo 200 > /sys/class/tacho-motor/motor1/time_sp

(it just sets maximum speed for motor0 and motor1 and the timer to 200 ms, the duration of each “run-timed” command).

So we open a second ssh session to our EV3 and run in the first session:

unbuffer pocketsphinx_continuous -kws keyphrase_list.txt -adcdev hw:1,0 -lm 0772.lm -dict 0772.dic -inmic yes -logfn /dev/null > pipe

and in the second session:

cat pipe | ./controller.sh

And presto!
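As a side note, the pipe plus controller.sh combination could probably be folded into a single Python listener. An untested sketch, assuming the same files as above and large motors on ports B and C:

#!/usr/bin/env python3
# untested sketch: spawn pocketsphinx and drive the motors directly
import subprocess
import ev3dev.ev3 as ev3

left = ev3.LargeMotor('outB')
right = ev3.LargeMotor('outC')

cmd = ['unbuffer', 'pocketsphinx_continuous',
       '-kws', 'keyphrase_list.txt', '-adcdev', 'hw:1,0',
       '-lm', '0772.lm', '-dict', '0772.dic',
       '-inmic', 'yes', '-logfn', '/dev/null']

with subprocess.Popen(cmd, stdout=subprocess.PIPE) as proc:
    for line in proc.stdout:
        # match keywords anywhere in the line, case-insensitively,
        # to sidestep the index and capitalization quirks seen above
        words = line.decode(errors='ignore').lower().split()
        if 'forward' in words:
            left.run_timed(time_sp=200, speed_sp=1050)
            right.run_timed(time_sp=200, speed_sp=1050)
        elif 'left' in words:
            right.run_timed(time_sp=200, speed_sp=1050)
        elif 'right' in words:
            left.run_timed(time_sp=200, speed_sp=1050)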

The robot is a RileyRover, a “very quick to build” design from Damien Kee.

LEGO Voice Control

This post is part 1 of 2 of  LEGO Voice Control

This is going to be (I hope) the first of a series of posts about voice recognition.

I decided to control my LEGO RC Tracked Racer with my recent FTDI based IR transmitter. While reading some blogs I found myself thinking… hey, I can use voice control on my Ubuntu laptop, it doesn’t seem too difficult!

So, in a nutshell:

  • install pocketsphinx
  • create a keyphrase list
  • write a bash script to parse commands and control the LEGO
  • glue it all

There are a few open source speech recognition projects. I picked Sphinx from Carnegie Mellon University, mainly because it is available in Debian and Ubuntu and they have a lighter version, pocketsphinx, for less powerful devices like Android or Raspberry Pi (of course I also thought that, with some luck and sweat, it could be used with ev3dev later on).

pocketsphinx is a command line tool but it can also be used from Python through a library. I made some quick tests but gave up when the complexity started to increase – pyaudio and gstreamer may be OK on Ubuntu or a Raspberry Pi but the EV3 would most probably choke, so let’s try just shell scripts first.

I decided to have 5 commands for my LEGO (4 directions and STOP). The documentation suggests that it is best to use sentences with at least 3 syllables, so I created this keyphrase_list.txt file:

move forward /1e-12/
move backward /1e-5/
turn left /1e-12/
turn right /1e-14/
stop /1e-20/

The numbers represent detection threshold values. I started with /1e-10/ for all and then adapted them for better results by trial and error. I’m not quite happy yet and will probably use just “front” and “back” instead of “forward” and “backward”.

I also created a Sphinx knowledge base compilation with CMU’s Sphinx Knowledge Base Tool, using a file with the same keyphrases:

move forward
move backward
turn left
turn right
stop

Your Sphinx knowledge base compilation has been successfully processed!

This generated a ‘TAR0772.tgz’ file containing 5 files:

0772.dic             110    Pronunciation Dictionary
0772.lm              1.3K   Language Model
0772.log_pronounce   100    Log File
0772.sent            98     Corpus (processed)
0772.vocab           43     Word List

I made some tests with these files as parameters for the pocketsphinx_continuous command, and also with the Python library, but for the next examples they don’t seem to be required. They will be used later 🙂

Now to test it, just run this command and start speaking:

$ pocketsphinx_continuous -inmic yes -kws keyphrase_list.txt -logfn /dev/null
READY....
Listening...
READY....
Listening...
stop
READY....
Listening...
^C

So I just use the pocketsphinx_continuous command to keep listening to what I say to the microphone (“-inmic yes”) and find my keyphrases (“-kws keyphrase_list.txt”), without filling my console with log messages (“-logfn /dev/null”).

Each time a keyphrase is detected with enough confidence it is displayed, so I just need to redirect the output of this command to a shell script that parses it and sends the right IR codes to my LEGO:

#!/bin/bash

while read -a words
do

case "${words[0]}" in

  move)
    if [ "${words[1]}" = "forward" ]; then
      echo "FRONT"
      irsend -d /var/run/lirc/lircd SEND_ONCE LEGO_Combo_Direct FORWARD_BACKWARD
      sleep 0.2
      irsend -d /var/run/lirc/lircd SEND_ONCE LEGO_Combo_Direct BRAKE_BRAKE
    fi
    if [ "${words[1]}" = "backward" ]; then
      echo "BACK"
      irsend -d /var/run/lirc/lircd SEND_ONCE LEGO_Combo_Direct BACKWARD_FORWARD
      sleep 0.2
      irsend -d /var/run/lirc/lircd SEND_ONCE LEGO_Combo_Direct BRAKE_BRAKE
    fi
    ;;
  turn)
    if [ "${words[1]}" = "left" ]; then
      echo "LEFT"
      irsend -d /var/run/lirc/lircd SEND_ONCE LEGO_Combo_Direct FORWARD_FORWARD
      sleep 0.2
      irsend -d /var/run/lirc/lircd SEND_ONCE LEGO_Combo_Direct BRAKE_BRAKE
    fi
    if [ "${words[1]}" = "right" ]; then
      echo "RIGHT"
      irsend -d /var/run/lirc/lircd SEND_ONCE LEGO_Combo_Direct BACKWARD_BACKWARD
      sleep 0.2
      irsend -d /var/run/lirc/lircd SEND_ONCE LEGO_Combo_Direct BRAKE_BRAKE
    fi    
    ;;

  stop)
    echo "STOP"
    irsend -d /var/run/lirc/lircd SEND_ONCE LEGO_Combo_Direct BRAKE_BRAKE
    ;;

  *)
    echo "?"
    ;;

esac
done

Not pretty but it works – we can test in the command line like this:

$ echo "move forward" | ./transmitter.sh
FRONT

Of course, the ‘irsend’ commands only work if lircd is running and controlling an IR transmitter.

Now, to glue everything together, we need a trick: the Ubuntu version of pocketsphinx doesn’t flush stdout, so piping its output to my script wasn’t working. I found that I need to use the “unbuffer” command from the “expect” package:

$ sudo apt install expect
$ mkfifo pipe

So in one console window I send the output, unbuffered, to the pipe I created:

$ unbuffer pocketsphinx_continuous -inmic yes -kws keyphrase_list.txt -logfn /dev/null > pipe

And in another console window I read the pipe and send it to the transmitter.sh script:

$ cat pipe | ./transmitter.sh

And that’s it.


Using a FTDI adapter as an IR emitter – 4

This post is part 4 of 5 of  Using a FTDI adapter as an IR

We finally have LIRC, but if we run it now it will fail looking for “liblirc.so.0”, so we need to configure ev3dev to look for it in the right place:

sudo nano /etc/ld.so.conf.d/lirc.conf

  /usr/local/lib

sudo ldconfig

We could also build LIRC with proper prefix options in order to prevent this last step, but I’m lazy and this also helps when searching the web for common problems.

We also need to create a folder for LIRC to place a pid file:

sudo mkdir /var/run/lirc

and at least one remote control configuration file that tells LIRC how to talk with the Power Functions IR Receiver. So after two years I went back to Connor Cary’s GitHub and found that he now has 3 configuration files available:

  • Combo_Direct
  • Combo_PWM
  • Single_Output

The last one was contributed by Diomidis Spinellis, the author of a very nice post I read a few months ago, “Replace Lego’s $190 Intelligent Brick with MIT’s Scratch and a $40 Raspberry Pi” – what a small world we live in 🙂

We should save these 3 files with a “.conf” extension under the folder

/usr/local/etc/lirc/lircd.conf.d/

There is already a “devinput.lircd.conf” file there but it only works with LIRC’s default device, so we should rename it:

sudo mv /usr/local/etc/lirc/lircd.conf.d/devinput.lircd.conf /usr/local/etc/lirc/lircd.conf.d/devinput.lircd.dist

And that’s it, next post we’ll finally start LIRC!

Using a FTDI adapter as an IR emitter – 3

This post is part 3 of 5 of  Using a FTDI adapter as an IR

Now back to where we extracted LIRC:

cd lirc-0.9.4d
./configure

If all conditions are satisfied we get this at the end:

...
checking for FTDI... no
checking for FTDI... yes
...
Summary of selected options:
----------------------------------------
prefix:                         /usr/local
sysconfdir:                     ${prefix}/etc
x_progs:                        
host:                           armv5tejl-unknown-linux-gnueabi
host_os:                        linux-gnueabi
forkpty:                        -lutil
usb_libs                        -lusb -lusb-1.0
lockdir:                        /var/lock/lockdev

Conditionals:

BUILD_ALSA_SB_RC:no
BUILD_DSP:yes
BUILD_FTDI:yes
BUILD_HIDDEV:yes
BUILD_I2CUSER:yes
BUILD_LIBALSA:no
BUILD_LIBPORTAUDIO:no
BUILD_USB:yes
BUILD_XTOOLS:no
HAVE_DOXYGEN:no
HAVE_LIBUDEV:no
HAVE_MAN2HTML:no
HAVE_PYMOD_YAML:no
INSTALL_ETC:yes
NEED_PYTHON3:no
SYSTEMD_INSTALL:yes
DEVEL:no
HAVE_UINPUT:yes
DARWIN:no
LINUX_KERNEL:yes

We may now proceed with

make

and in a perfect world, or at least on my Ubuntu, it builds everything fine. But on my EV3, twice, I got this:

CDPATH="${ZSH_VERSION+.}:" && cd . && /bin/bash /home/robot/lirc-0.9.4d/missing aclocal-1.15 -I m4
/home/robot/lirc-0.9.4d/missing: line 81: aclocal-1.15: command not found
WARNING: 'aclocal-1.15' is missing on your system.
         You should only need it if you modified 'acinclude.m4' or
         'configure.ac' or m4 files included by 'configure.ac'.
         The 'aclocal' program is part of the GNU Automake package:
         <http://www.gnu.org/software/automake>
         It also requires GNU Autoconf, GNU m4 and Perl in order to run:
         <http://www.gnu.org/software/autoconf>
         <http://www.gnu.org/software/m4/>
         <http://www.perl.org/>
Makefile:479: recipe for target 'aclocal.m4' failed
make: *** [aclocal.m4] Error 127

That’s strange because my Ubuntu doesn’t have autoconf installed.

I tried installing several packages but make always failed. After some googling I found a workaround. It is rather strange and honestly I don’t know why it works, but it does:

sudo apt install automake m4 autoconf
autoreconf -i

This will take a lot of time (at least half an hour) but after that the compiling process works as expected (almost another hour):

./configure
make
sudo make install


Using a FTDI adapter as an IR emitter – 5

This post is part 5 of 5 of  Using a FTDI adapter as an IR
Time to finally start lircd:

sudo lircd -dserial=DN01DR29,output=3 -Hftdix

We gave lircd 3 parameters:

  • “DN01DR29” is the serial number of my FTDI adapter reported by dmesg
  • “output=3” is the CTS pin we use to control the LED (in the ‘hello-ftdi.c’ test we saw LED defined as 0x08; that’s because the LIRC ftdix driver computes the pin bitmask by left-shifting, so 1<<3 = 2³ = 8 – see the quick check below)
  • “ftdix” is the driver to use
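A quick way to check that shift arithmetic in Python:

print(hex(1 << 3))    # prints 0x8: pin 3 selects the 0x08 bitmask, matching hello-ftdi.c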

We should check if lircd is running. In Ubuntu it writes several messages to “/var/log/messages”, but this log doesn’t exist in ev3dev, so:

pgrep lircd
5411

We also see in dmesg that the ttyUSB device was disconnected by libftdi:

[47897.512814] ftdi_sio ttyUSB0: FTDI USB Serial Device converter now disconnected from ttyUSB0
[47897.513393] ftdi_sio 1-1.2:1.0: device disconnected

We now use ‘irsend’ to check for available transmitters:

sudo irsend -d/var/run/lirc/lircd LIST "" ""

LEGO_Combo_Direct
LEGO_Combo_PWM
LEGO_Single_Output

We can also list all the commands available for a particular transmitter:

sudo irsend -d/var/run/lirc/lircd LIST LEGO_Combo_Direct ""

000000000000010e FLOAT_FLOAT
000000000000011f FLOAT_FORWARD
000000000000012c FLOAT_BACKWARD
000000000000013d FLOAT_BRAKE
000000000000014a FORWARD_FLOAT
000000000000015b FORWARD_FORWARD
0000000000000168 FORWARD_BACKWARD
0000000000000179 FORWARD_BRAKE
0000000000000186 BACKWARD_FLOAT
0000000000000197 BACKWARD_FORWARD
00000000000001a4 BACKWARD_BACKWARD
00000000000001b5 BACKWARD_BRAKE
00000000000001c2 BRAKE_FLOAT
00000000000001d3 BRAKE_FORWARD
00000000000001e0 BRAKE_BACKWARD
00000000000001f1 BRAKE_BRAKE

For a first test we’ll just use the “FORWARD_FORWARD” command (move both motors, “Red” and “Blue”, forward):

sudo irsend -d /var/run/lirc/lircd SEND_ONCE LEGO_Combo_Direct FORWARD_FORWARD

And our motors do spin!
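Since from here on everything is just irsend calls, they are easy to wrap in Python for scripting sequences of commands – a small sketch (assuming lircd is running as above):

#!/usr/bin/env python3
# sketch: drive the Power Functions receiver by shelling out to irsend
import subprocess
import time

def send(code, remote='LEGO_Combo_Direct'):
    subprocess.run(['irsend', '-d', '/var/run/lirc/lircd',
                    'SEND_ONCE', remote, code], check=True)

send('FORWARD_FORWARD')   # both outputs forward
time.sleep(2)
send('BRAKE_BRAKE')       # stop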

So, after such a big post, what’s the point?

Well, since LIRC can handle several transmitters, and for ftdix it uses the serial number of the FTDI adapter to identify each one… we can have as many transmitters as we want, as long as our system can handle them. On a laptop or a Raspberry Pi 3 that’s probably 127 (the maximum number of USB devices we can have). Most probably the EV3 will gasp with all those USB devices, but I know it can handle at least two:

I will show how in a future post.

Using a FTDI adapter as an IR emitter – 2

This post is part 2 of 5 of  Using a FTDI adapter as an IR

We should now compile LIRC, but as I said before, I never got it working without also compiling libftdi.

I downloaded and extracted the libftdi1-1.3 source code. Then:

cd libftdi1-1.3
mkdir build
cd build
cmake -DCMAKE_INSTALL_PREFIX="/usr" ../

If everything is OK, we see:

...
-- Building unit test
-- Configuring done
-- Generating done
-- Build files have been written to: /home/robot/libftdi1-1.3/build

Then

make

If everything is OK:

...
[100%] Built target test_libftdi1

And finally:

sudo make install


Upgrading EV3 firmware from a VM

After yesterday’s post I received two precious hints from Laurens Valk and David Lechner that finally allowed me to upgrade the firmware of my EV3.

When changing to firmware upgrade mode, the EV3 changes its USB device ID (that’s why it appears as disconnected in the EV3 tool). So we just need to add a new USB filter rule in the VM settings:

0694:0006 "LEGO EV3 Firmware Update"

The original filter, for normal operating mode, is

0694:0005 "LEGO Group EV3"

And of course I made a video showing how to do it:

LEGO WeDo 2.0 with MIT App Inventor

I got a request for help today: Mr. Rocha is trying to use MIT App Inventor to control the WeDo 2.0 Smart Hub RGB LED.

I’ve never used App Inventor before but I had already installed the Companion once on my Android phone, because I read something somewhere and found it quite similar to Snap! and Scratch (and also just because it is from MIT… I have a fetish for MIT from back when I was at college and read Nicholas Negroponte’s articles in Wired). So let’s give it a try.

I just wanted to connect to the WeDo 2.0 Smart Hub and change the LED color to RED. When using gatttool that’s done with just:

char-write-cmd 3d 06040109

I just needed to add the BLE extension to start working. Getting a connection was easy, but writing to the handle took a while since the App Inventor BLE extension doesn’t use handles, just UUIDs. So I had to go back to my notes and find the Service UUID and the Characteristic UUID:

service_uuid = 00004f0e-1212-efde-1523-785feabcd123
characteristic_uuid = 00001565-1212-efde-1523-785feabcd123

Then I tried a block called “call BluetoothLE.WriteStringValue” but I couldn’t find a way to convert a hexadecimal string (“06040109”) to a proper string to send.

So I tried another block, “call BluetoothLE.WriteIntValue”. At first I made an old mistake, converting “06040109h” to “100925705”. It didn’t work.

Then I wrote it in reverse byte order (“09010406h”) and converted it to “151061510”. And now it works!
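That reversal is just the same 4 bytes read as a little-endian integer, which is easy to confirm in Python:

payload = bytes.fromhex('06040109')
print(int.from_bytes(payload, 'big'))     # 100925705 - the first attempt, didn't work
print(int.from_bytes(payload, 'little'))  # 151061510 - the value that works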

WeDoLED

Now that I’ve finally started, I think I will use App Inventor a few more times. Damn easy to create a BLE Android app!