The new LEGO Powered Up Remote Control

This post is part 1 of 2 of LEGO Powered Up Remote Control

This week I got a question from Nathan Kunicki about using the BOOST Tacho Motor with the new Powered Up hub.

He had already succeeded with the other WeDo 2 and BOOST devices, so the motor was the last challenge. I tried a few ideas but nothing worked.

I told him the motor was probably not supported by the current firmware… and then he told me that it works when using the new Remote (as if it were a WeDo 2 motor, just ON or OFF).

Well… this weekend my lovely wife gave me my future birthday gift: the new LEGO Cargo Train. It has a Powered Up remote so… let me try it.

No, we couldn’t find a way to control the motor. But I learned a bit about the Remote (thanks Nathan!), enough to use it as a standalone BLE controller without the LEGO Powered Up “HUB NO.4”… it’s so easy that any BLE master can use it.
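As a rough idea of how little is needed from any Linux BLE master with BlueZ (the address below is just a placeholder for whatever your Remote advertises), scanning and opening a GATT session is enough to start exploring it:

sudo hcitool lescan               # scan for BLE advertisements and note the Remote's address
gatttool -I -b AA:BB:CC:DD:EE:FF  # placeholder address - open an interactive GATT session
# inside gatttool:
#   connect     establish the connection (try adding '-t random' to gatttool if it is refused)
#   primary     list the GATT services the Remote exposes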

And, of course, so can the MINDSTORMS EV3 when running ev3dev:

to be continued…

 

RC servo motors with Linux

This post is part 1 of 2 of LEGO-compatible RC servos

You can use RC servo motors with any Arduino. Even with a Raspberry Pi there is no need for special hardware – just connect the signal wire to a free GPIO and send a short pulse every 20 ms or so; the position of the motor is proportional to the length of the pulse (usually between 1 and 2 ms).
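On the Raspberry Pi, one way to generate those pulses from the shell is the pigpio daemon – a minimal sketch, assuming pigpio is installed and the servo signal wire is on GPIO 18:

sudo pigpiod              # start the pigpio daemon
pigs servo 18 1500        # 1.5 ms pulse repeated every 20 ms - roughly the center position
pigs servo 18 1000        # 1.0 ms - one end of the travel
pigs servo 18 2000        # 2.0 ms - the other end
pigs servo 18 0           # stop sending pulses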

A “normal” computer doesn’t have GPIO pins so some kind of controller is needed. I have a Pololu Mini Maestro that can control several RC servo motors at once through USB or UART – in my case up to 24 motors.

It works great with Linux – the Maestro Control Center is a .NET Framework application that runs fine with Mono, but after the initial setup a simple bash script is enough.

So here is the fastest (not the best) walkthrough:

Connect the Maestro with a USB cable and run ‘dmesg’ to see if it was recognized – two virtual serial ports should be created:

[ 1611.444253] usb 1-3: new full-speed USB device number 7 using xhci_hcd
[ 1611.616717] usb 1-3: New USB device found, idVendor=1ffb, idProduct=008c
[ 1611.616724] usb 1-3: New USB device strings: Mfr=1, Product=2, SerialNumber=5
[ 1611.616728] usb 1-3: Product: Pololu Mini Maestro 24-Channel USB Servo Controller
[ 1611.616732] usb 1-3: Manufacturer: Pololu Corporation
[ 1611.616735] usb 1-3: SerialNumber: 00094363
[ 1611.646820] cdc_acm 1-3:1.0: ttyACM0: USB ACM device
[ 1611.649542] cdc_acm 1-3:1.2: ttyACM1: USB ACM device
[ 1611.651109] usbcore: registered new interface driver cdc_acm
[ 1611.651112] cdc_acm: USB Abstract Control Model driver for USB modems and ISDN adapters

The first one (‘/dev/ttyACM0’) is the ‘Command Port’ and the second one (‘/dev/ttyACM1’) is the ‘TTL Serial Port’.

We download and extract the Maestro Servo Controller Linux Software from the Pololu site. To run it we need Mono and libusb; the (excellent!) User Guide gives this command:

sudo apt-get install libusb-1.0-0-dev mono-runtime libmono-winforms2.0-cil

I already had libusb and with current Ubuntu (17.04) the Mono packages are different:

Package libmono-winforms2.0-cil is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source
However the following packages replace it:
 mono-reference-assemblies-2.0 mono-devel

so I ran instead

sudo apt install mono-devel

It will work, but it will give an access error for the device. We can follow the manual and create a udev rule (see the sketch after the next command) or just use sudo:

sudo mono MaestroControlCenter
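If you prefer the udev route, a minimal rule sketch looks like this (the file name is just an example; the vendor ID ‘1ffb’ matches the dmesg output above):

# /etc/udev/rules.d/99-pololu.rules (example file name)
SUBSYSTEM=="usb", ATTRS{idVendor}=="1ffb", MODE="0666"

# reload the rules and re-plug the Maestro
sudo udevadm control --reload-rules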

We will probably find our Maestro running in the default serial mode, i.e. ‘UART, fixed baud rate: 9600’, but we change that to ‘USB Dual Port’ so we can control our servos directly from the shell without the Maestro Control Center:

And now that we are here, let’s also have a look at Status:

We can control our servos from here if we want.

Now back to the command line – in ‘USB Dual Port Mode’ we can control our servos by sending commands to the Command Port (i.e. ‘/dev/ttyACM0’). There are several protocols available; let’s just look at the Compact Protocol:

echo -n -e "\x84\x00\x70\x2E" > /dev/ttyACM0

Four bytes are used:

  • the first byte is always “84h” and it means we are using the “set target” command
  • the second byte is the channel number – on my Maestro a number between 0 and 23
  • the third and fourth bytes are the length of the pulse, but in a special format:

The value is in quarters of microseconds. So for a 1.5 ms pulse (1500 µs) we use 6000. Usually this is the middle (center) position of the servo.

Also, «the lower 7 bits of the third data byte represent bits 0–6 of the target (the lower 7 bits), while the lower 7 bits of the fourth data byte represent bits 7–13 of the target».

So 6000d = 1770h

The third byte is calculated with a bitwise AND:

1770h & 7Fh = 70h

And the fourth byte is calculated with a shift and an AND:

1770h >> 7 = 2Eh

2Eh & 7Fh = 2Eh

So 1500 µs is represented as 70h 2Eh
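A quick way to double-check the arithmetic is to let bash do it, using the same expressions the Pololu script below uses:

TARGET=6000                                              # 1500 µs in quarter-microseconds
printf 'third byte:  %02Xh\n' $((TARGET & 0x7F))         # prints 70h
printf 'fourth byte: %02Xh\n' $((TARGET >> 7 & 0x7F))    # prints 2Eh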

Pololu makes life easier with a bash script, ‘maestro-set-target.sh’:

#!/bin/bash
# Sends a Set Target command to a Pololu Maestro servo controller
# via its virtual serial port.
# Usage: maestro-set-target.sh DEVICE CHANNEL TARGET
# Linux example: bash maestro-set-target.sh /dev/ttyACM0 0 6000
# Mac OS X example: bash maestro-set-target.sh /dev/cu.usbmodem00234567 0 6000
# Windows example: bash maestro-set-target.sh '\\.\USBSER000' 0 6000
# Windows example: bash maestro-set-target.sh '\\.\COM6' 0 6000
# CHANNEL is the channel number
# TARGET is the target in units of quarter microseconds.
# The Maestro must be configured to be in USB Dual Port mode.
DEVICE=$1
CHANNEL=$2
TARGET=$3

byte() {
 printf "\\x$(printf "%x" $1)"
}

stty raw -F $DEVICE

{
 byte 0x84
 byte $CHANNEL
 byte $((TARGET & 0x7F))
 byte $((TARGET >> 7 & 0x7F))
} > $DEVICE

So instead of echoing “\x84\x00\x70\x2E” to the Command Port we can also use

./maestro-set-target.sh /dev/ttyACM0 0 6000
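With the script we can move the servo just by changing the target value – for example (4000 and 8000 are the 1 ms and 2 ms endpoints; the usable range depends on the servo and on the channel settings in the Maestro Control Center):

./maestro-set-target.sh /dev/ttyACM0 0 4000   # ~1.0 ms, one end
./maestro-set-target.sh /dev/ttyACM0 0 6000   # ~1.5 ms, center
./maestro-set-target.sh /dev/ttyACM0 0 8000   # ~2.0 ms, other end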

So now we can control a servo with common bash commands. For example, this script makes the servo sweep from one end to the other and back, in 20 increments each way, and repeats that five times:

#!/bin/bash

sleep 5
for j in `seq 1 5`;
do
  for i in `seq 4000 200 8000`;
  do
    ./maestro-set-target.sh /dev/ttyACM0 0 $i
    sleep 0.1
  done
  for i in `seq 8000 -200 4000`;
  do
    ./maestro-set-target.sh /dev/ttyACM0 0 $i
    sleep 0.1 
  done
done

So we can now control up to 24 servo motors.

So let’s control a special servo motor, more related to my hobby:

That’s a 4DBrix Standard Servo Motor, a motor that 4DBrix sells for controlling LEGO train or monorail models. They also sell their own USB controller, but since it is in fact a pretty common RC micro servo inside a 3D-printed LEGO-compatible ABS housing, we can also use the Maestro:

The same script:

These motors work fine with 4DBrix LEGO-compatible monorail parts:

But better yet… the Maestro also works fine with ev3dev:

I now realize it wasn’t very clever to show motors rotating slowly when the monorail switches only have two functional states, so this new video looks a little better, with the three 4DBrix servo motors and the LEGO EV3 medium motor changing the state of the monorail switches every second:

So I can now control 24 RC Servo motors with my MINDSTORMS EV3.

Even better: we can add several Maestros through a USB hub… so why stop at 24 motors? With 126 Mini Maestro 24ch controllers and a USB hub we could use 3024 motors 🙂
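Each extra Maestro just shows up as its own pair of virtual serial ports (the port names below are examples and depend on the enumeration order), so the same script works – only the device changes:

./maestro-set-target.sh /dev/ttyACM0 0 6000   # first Maestro, channel 0
./maestro-set-target.sh /dev/ttyACM2 0 6000   # second Maestro, channel 0 (example port name)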

Google Cloud SDK on EV3

This post is part 1 of 1 of Google Cloud SDK

A fellow from PLUG challenged me to show a LEGO robot that translates conversation, much like the C-3PO protocol droid from Star Wars.
I only had a couple of hours, so I decided to copy the Raspberry Pi approach of using “the Cloud”. Google offers a one-year free trial, so I registered and tried a few examples on my Ubuntu laptop – it’s amazing what one can do with just a few curl commands!

So, how to use Google Cloud SDK directly from LEGO MINDSTORMS EV3?

Google has a repository for Debian but it doesn’t work with ev3dev – there are no packages for the ARM architecture. But I found someone saying that he managed to install the x86 tar.gz package on his Raspberry Pi, so… why not give it a try? And yes, it really works.

So this is the process to install the Google Cloud SDK on an EV3 running ev3dev. It was tested with a fresh installation of the latest release available today, “2017-06-09”:

robot@ev3dev:~$ uname -a
Linux ev3dev 4.4.68-20-ev3dev-ev3 #1 PREEMPT Mon May 15 12:45:40 CDT 2017 armv5tejl GNU/Linux

No dependencies are needed – just download the most recent of the “Versioned archives”:

wget https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-158.0.0-linux-x86.tar.gz

Then just extract it and run the install script:

tar -xvf google-cloud-sdk-158.0.0-linux-x86.tar.gz
./google-cloud-sdk/install.sh

The install takes 5 minutes:

Welcome to the Google Cloud SDK!

To help improve the quality of this product, we collect anonymized usage data
and anonymized stacktraces when crashes are encountered; additional information
is available at <https://cloud.google.com/sdk/usage-statistics>. You may choose
to opt out of this collection now (by choosing 'N' at the below prompt), or at
any time in the future by running the following command:

gcloud config set disable_usage_reporting true

Do you want to help improve the Google Cloud SDK (Y/n)? N

Your current Cloud SDK version is: 158.0.0
The latest available version is: 158.0.0

+------------------------------------------------------------------------------------------+
|                                        Components                                         |
+---------------+-----------------------------------+--------------------------+-----------+
|     Status    |                Name               |            ID            |    Size   |
+---------------+-----------------------------------+--------------------------+-----------+
| Not Installed | Cloud Datalab Command Line Tool   | datalab                  |   < 1 MiB |
| Not Installed | Cloud Datastore Emulator          | cloud-datastore-emulator |  15.4 MiB |
| Not Installed | Cloud Datastore Emulator (Legacy) | gcd-emulator             |  38.1 MiB |
| Not Installed | Cloud Pub/Sub Emulator            | pubsub-emulator          |  21.0 MiB |
| Not Installed | gcloud Alpha Commands             | alpha                    |   < 1 MiB |
| Not Installed | gcloud Beta Commands              | beta                     |   < 1 MiB |
| Not Installed | gcloud app Java Extensions        | app-engine-java          | 132.2 MiB |
| Not Installed | gcloud app Python Extensions      | app-engine-python        |   6.4 MiB |
| Installed     | BigQuery Command Line Tool        | bq                       |   < 1 MiB |
| Installed     | Cloud SDK Core Libraries          | core                     |   6.1 MiB |
| Installed     | Cloud Storage Command Line Tool   | gsutil                   |   2.9 MiB |
| Installed     | Default set of gcloud commands    | gcloud                   |           |
+---------------+-----------------------------------+--------------------------+-----------+
To install or remove components at your current SDK version [158.0.0], run:
 $ gcloud components install COMPONENT_ID
 $ gcloud components remove COMPONENT_ID

To update your SDK installation to the latest version [158.0.0], run:
 $ gcloud components update

Modify profile to update your $PATH and enable shell command 
completion?

Do you want to continue (Y/n)? Y

The Google Cloud SDK installer will now prompt you to update an rc 
file to bring the Google Cloud CLIs into your environment.

Enter a path to an rc file to update, or leave blank to use 
[/home/robot/.bashrc]:

Backing up [/home/robot/.bashrc] to [/home/robot/.bashrc.backup].
[/home/robot/.bashrc] has been updated.

==> Start a new shell for the changes to take effect.

For more information on how to get started, please visit:
 https://cloud.google.com/sdk/docs/quickstarts

Now exit from the SSH session and log in again. The SDK commands should be available, so let’s configure our environment:

robot@ev3dev:~$ gcloud init

This will take about 6 minutes:

Welcome! This command will take you through the configuration of gcloud.

Your current configuration has been set to: [default]

You can skip diagnostics next time by using the following flag:
 gcloud init --skip-diagnostics

Network diagnostic detects and fixes local network connection issues.
Checking network connection...done. 
Reachability Check passed.
Network diagnostic (1/1 checks) passed.

You must log in to continue. Would you like to log in (Y/n)? Y

Go to the following link in your browser:

https://accounts.google.com/o/oauth2/auth?redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&prompt=select_account&response_type=code&client_id=32555940559.apps.googleusercontent.com&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fuserinfo.email+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fappengine.admin+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcompute+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Faccounts.reauth&access_type=offline

Just copy the link above and open it in your browser. You will need to log in with a valid Google account. Mine was already associated with a project (‘ev3-pd’) because I had already started testing the APIs on the laptop, so I picked that project, but you can also create a new one.

You will get a verification code like this:

6/vzSXbihAPCTeewAazTZo0YqL49qYDFUcuIR0HBDWnvz

Just copy & paste it at the prompt to continue:

Enter verification code: 6/vzSXbihAPCTeewAazTZo0YqL49qYDFUcuIR0HBDWnvz

You are logged in as: [yourgoogleid@gmail.com].

Pick cloud project to use: 
 [1] ev3-pd
 [2] Create a new project
Please enter numeric choice or text value (must exactly match list 
item): 1

Your current project has been set to: [ev3-pd].

Do you want to configure Google Compute Engine 
(https://cloud.google.com/compute) settings (Y/n)? Y

Which Google Compute Engine zone would you like to use as project 
default?
If you do not specify a zone via a command line flag while working 
with Compute Engine resources, the default is assumed.
 [1] asia-east1-a
 [2] asia-east1-b
 [3] asia-east1-c
 [4] asia-northeast1-b
 [5] asia-northeast1-c
 [6] asia-northeast1-a
 [7] asia-southeast1-b
 [8] asia-southeast1-a
 [9] europe-west1-d
 [10] europe-west1-c
 [11] europe-west1-b
 [12] europe-west2-a
 [13] europe-west2-b
 [14] europe-west2-c
 [15] us-central1-c
 [16] us-central1-f
 [17] us-central1-a
 [18] us-central1-b
 [19] us-east1-c
 [20] us-east1-b
 [21] us-east1-d
 [22] us-east4-a
 [23] us-east4-b
 [24] us-east4-c
 [25] us-west1-a
 [26] us-west1-b
 [27] us-west1-c
 [28] Do not set default zone
Please enter numeric choice or text value (must exactly match list 
item): 9

Your project default Compute Engine zone has been set to [europe-west1-d].
You can change it by running [gcloud config set compute/zone NAME].

Your project default Compute Engine region has been set to [europe-west1].
You can change it by running [gcloud config set compute/region NAME].

Created a default .boto configuration file at [/home/robot/.boto]. See this file and
[https://cloud.google.com/storage/docs/gsutil/commands/config] for more
information about configuring Google Cloud Storage.
Your Google Cloud SDK is configured and ready to use!

* Commands that require authentication will use yourgoogleid@gmail.com by default
* Commands will reference project `ev3-pd` by default
* Compute Engine commands will use region `europe-west1` by default
* Compute Engine commands will use zone `europe-west1-d` by default

Run `gcloud help config` to learn how to change individual settings

This gcloud configuration is called [default]. You can create additional configurations if you work with multiple accounts and/or projects.
Run `gcloud topic configurations` to learn more.

Some things to try next:

* Run `gcloud --help` to see the Cloud Platform services you can interact with. And run `gcloud help COMMAND` to get help on any gcloud command.
* Run `gcloud topic -h` to learn about advanced features of the SDK like arg files and output formatting

Since I had already activated a service account for my project, I already had a JSON file with a private authorization key to use (if you don’t know how to do that, look here). I copied it from my laptop as ‘EV3-PD.json’ and defined an environment variable so the Google Cloud SDK can find it when needed:

robot@ev3dev:~$ export GOOGLE_APPLICATION_CREDENTIALS=/home/robot/EV3-PD.json

This key allows us to generate an access token that grants access to Google Cloud APIs for the next 3600 seconds:

robot@ev3dev:~$ gcloud auth application-default print-access-token

ya29.ElpnBDIm1MCsz4isiMF6NL3Hc5yzGpkoGr0iJG1sB68DX00ZvkecQaBL-fkviWYq6HVtkezRjg9Vv_lSxJ6Q7XXFRfH-2Gon_Q4H2784wYZkvZox2UfP2ncJJ0Q
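That token then goes into the Authorization header of a plain REST call – for example, listing the project’s Cloud Storage buckets (this endpoint is just an illustration and assumes the Storage API is enabled for the project):

TOKEN=$(gcloud auth application-default print-access-token)
curl -s -H "Authorization: Bearer ${TOKEN}" \
     "https://www.googleapis.com/storage/v1/b?project=ev3-pd"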

And we are now able to use Skynet The Cloud for our most CPU-intensive tasks. In the next post I will show how to transcribe voice to text through the Google Cloud Speech API.