Once I was able to digitize myself, the power of the Kinect was truly realized. I initially made a digital drum set, with different virtual zones corresponding to different sounds. However, as an Arduino lover, I wanted my next project to have a physical component. Since I like to wave, I figured: what better project than a friendly robot that can wave back?
The first step was to connect the Arduino to the computer. Since I was already using Processing and had written several prior programs that interface between the two, this was relatively straightforward for me. The hardest part was making sure that the correct COM port was selected.
The next phase of the project was to use the Kinect to get the arm positions. Luckily, there are Kinect skeleton libraries that made this relatively quick work. Transforming the 3D positions to 2D positions was also straightforward. The last step was reviewing some geometry so that I could determine the angles to send to the servos attached to the Arduino. The easiest way to send the data was through an array, so I just had to ensure that the angles sent to the Arduino matched my Processing sketch. The video below is from the test to make sure that the servos were controlled by my movements.
For my “robot” I wanted something very approachable. My goal with the Kinect platform was to have an easy way to introduce students to vision-based projects. Therefore I was reluctant to make a finished machined-metal or even 3D-printed plastic robot, and I opted for a wooden robot like the example in “Making Things See.”
The “Making Things See” book was a fantastic start and resource for the geometry review/learning and the code. However, I always try to use someone else’s project as a starting point and take it to another level to make sure that I really learned the material. Therefore I added an extra arm:
I really like how this turned out… but I really didn’t like the jerkiness of the servos. Therefore, my next step was to filter the input to the Arduino to get rid of this jerkiness. I looked into some filtering options, but I needed a quick and easy method, so I tried a rudimentary averaging filter. Lucky for me, this worked really well at reducing the noise!
Once you have the Kinect connected, you can play with the data. While there are ways to do this without point clouds, one of the coolest things is to see yourself digitized in real time. It actually isn’t too hard, and when you get it right, you can do things like wave to yourself!
After I got the regular point cloud working, I tried some more things… like point clouds with color! Sorry about the vertical format!
Well, that was easy!
I have an old Kinect and wanted to connect it to my new computer. This wasn’t exactly straightforward, since the book “Making Things See” is now 5 years old. Here’s what worked for me (Windows 10 computer):
- Installed Kinect for Windows SDK 1.8. While the book says that this won’t work with OpenNI, evidently the later versions of the SDK took care of this.
- Installed OpenNI from this website, but took the advice to use the 32-bit version:
- It was important to use the version of Processing (1.5.1) used in the book here:
- Finally (this one got me for a little bit): I had to install the libraries on my C drive, and not the network folder!
All of that settled… Voila! Depth image, cloud me, and me!
Adafruit has a cool little board called the Gemma. It is great for wearables because you can use conductive thread to sew right to it. Students have had some issues getting it going, though.
The hardest part is getting the Arduino IDE to see the board. Adafruit has a tutorial for this, but it is easily missed on the Gemma setup site. Like many things, the fix is simple once you know it. Add this URL:
in “Additional Boards Manager URLs” under File > Preferences. Then, in the Boards Manager (Tools > Board > Boards Manager), you can add Adafruit’s boards easily.
For a more in-depth tutorial, click here.
Today I had
- A little Thermal Camera sitting around my office
- Some free time.
Therefore I now have a Thermal Camera hooked up to a Raspberry Pi.
This tutorial is really helpful.
Two hints: the GitHub author has since moved the folder locations, and be sure to move your ZIP folder out of the Downloads folder.
After I was done with version 1 of my claw bot, I wanted to use what I learned to take my line following robot to the next level. This involved developing a little joystick app that could switch between joystick mode and line following mode.
I first just tried to have two scroll bars control the speed and direction of the treads, but this proved to be unsuccessful because the phone was unable to easily process two simultaneous moving fingers.
Then I simply made a circular target that controlled the speed of the motors so that when you pushed forward the bot moved forward, and likewise for right, left, etc. I did this by having the x-axis control the speed of the left motor and the y-axis the speed of the right motor. The key learning was that I needed to rotate the axes by 45 degrees counterclockwise so that the forward direction is intuitive to the user.
Once the joystick was taken care of, I made a button to switch between the two modes. The key learning on this step was to disable any “Serial.print” commands in the Arduino program so that the serial line doesn’t get overburdened.
Overall it works pretty well – check out the video below!
I recently took an edX class about MIT App Inventor, which was pretty informative. However, I always tell students that the best way to really learn something is to use it in a project that you care about.
Therefore, my first app “on my own” was to make a controller for my Claw Bot. Ever since I made the claw bot, I was thinking that I really should make my own controller, so this project was a good way to knock this out.
The first thing that I needed to do was to connect my Bluetooth Mate Silver to the phone that I was going to use for control. I read that the tools for Serial Port Protocol (SPP) were included in the app inventor, and found this tutorial to be a really good starting point. Once two phones were connected, it was pretty straightforward to have my “client” phone look for my Claw Bot instead of the other phone. Once I was connected, it was downhill all the way because I already had created an Arduino program to respond to commands sent over a Serial line. See the video below to see how it turned out!
An interesting challenge was coding the buttons to respond in a way that was intuitive for the user. I did this by making active buttons green, and disabled buttons red. I like the way that it turned out, but know that this project is just the tip of the iceberg!
It is easy to make a light orange.
It is easy to blink a light at precisely 500 Hz.
If you know AVR programming, it is pretty easy to combine the two.
But it was a little bit of a test to read a button at the same time.
The key, like a good CrossFit workout, is to keep your pull-up enabled.
While getting started with App Inventor, one of the first apps that MIT has you make is the Cat Tap app. You tap a cat, and it meows.
The hardest part so far with App Inventor is simply connecting your phone or an emulator. I had a phone connected over USB, but then the Genymotion emulator tried to access the same USB. My advice: choose one or the other and save yourself an hour of troubleshooting.
With that figured out, well the results speak for themselves:
Many students like to create projects connected with Bluetooth. MIT has created a tool that allows students to gain success quickly and easily, so I thought that I’d take a crack at it! I also wanted to see how the edX online classes are, so I signed up for this class:
TrinityX: T007x Mobile Computing with App Inventor – CS Principles
One stone. Two birds.
As I progress I will let you know how it goes!