Garden Light Ideas | Custom Globes for Solar Lamps

In this build video, I design a 3D printed globe for some inexpensive garden lights and give them a custom look. The build didn’t go exactly as planned, but check out how things turned out!

What was the other design supposed to look like? Well, I was going for a Rougier Tube Lamp style. It turns out that I miscalculated the diameter of the straws needed to completely cover the surface of the solar lamp. To make matters worse, the glue I was sure would work had some sort of chemical reaction with the straw plastic, and wouldn’t bond. It just turned into a disaster. I don’t think this will be my only attempt at making that style of lamp, but it just didn’t work out for this project.

That was my project day!

If you liked this project, check out some of my others:

DIY Halloween Decorations: Animated Skull

The ThrAxis – Our Scratch-Built CNC Mill

Instant Parade!

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.



DIY Halloween Decorations: Animated Skull

In this build video, I show you my Animated Skull project. Using a servo, some 3D printing, and some clever circuitry, this inexpensive Halloween prop can automatically move in sync with recorded audio!

Check out the photo gallery below for more details!

Project Bill of Materials :

Please Note: It’s Project Day! is a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for sites to earn advertising fees. Please consider helping us bring you awesome projects by using the affiliate links when you order from Amazon.

(1) 3D printed parts kit (link to my Etsy Page)
(1) Hi-Tec HS-311 servo (Amazon)
(1) steel wire
(1) audio jack (Amazon)
(1) power jack (Amazon)
(1) power supply (Amazon)
(1) Arduino Pro Mini (Amazon)
Audio conditioning circuit board:
(1) 2N3904 transistor
(1) 10k Ohm Resistor
(1) 100k Ohm Resistor
(1) 22k Ohm Resistor
(1) 3.3uF cap
(1) 0.01uF cap
(1) 0.047uF cap
(1) 1N4148 diode
(1) trimmed proto board
(1) 2-pin 0.1” header
(1) 0.1” jumper
(1) twisted pair from ethernet cable (3in)
(1) 22Ga Solid Copper Black wire (6in)
(1) 22Ga Solid Copper Red wire (6in)
(1) 22Ga Solid Copper Green wire (3in)

Sample Arduino Code:

/**************************************************************
This is sample code for my Animated Skull project
You can see this code in action on YouTube:
Or visit my webpage at
Written by Eric Wiemers for It's Project Day.
(C)2018 Copyright Eric Wiemers
**************************************************************/
#define Audio_pin A3
#define Servo_pin 11

#include <Servo.h>

Servo TalkServo;

int pos = 0;
int Amp = 0;
int AmpMin = 0;
int AmpMax = 60;
int AmpLow = 10;
int Ave = 0;
int Level = 0;
int ServoMax = 165; //Up
int ServoMin = 125; //Down

void setup() {
  // put your setup code here, to run once:
  pinMode(Audio_pin, INPUT);
  pinMode(Servo_pin, OUTPUT);
  TalkServo.attach(Servo_pin);
}

void loop() {
  Amp = 0;
  // Sample audio amplitude averaged over 5 samples
  for(int i = 0; i < 5; i++){
    Level = analogRead(Audio_pin);
    if(Level > AmpLow){
      if(Amp == 0){Amp = Level;}
      Amp = Level * 1/5 + Amp * 4/5;
    }
  }
  if(Amp < AmpMin){Amp = AmpMin;}
  if(Amp > AmpMax){Amp = AmpMax;}
  pos = map(Amp, AmpMin, AmpMax, ServoMax, ServoMin);
  TalkServo.write(pos);
}

When Halloween gets closer, I’ll post more pictures of the final dressing.

That was my project day!

If you liked this project, check out some of my others:

Instant Parade!

The ThrAxis – Our Scratch-Built CNC Mill

Give Aging Technology a Chance


Baby + IFTTT + Google Assistant = Awesome!

Setting up Google Assistant to do practical things is surprisingly easy. In this tutorial I set up spoken commands to track events in a spreadsheet.

If you have a newborn, then you know it’s important to make sure they eat enough and often enough. After we got home from the hospital with our little one, my wife and I tracked our baby’s food by making handwritten notes of the start and stop time, how much our baby ate, and whether it was breastmilk or formula. It was super frustrating: keeping track of the baby while holding the bottle in the right position, keeping the pen and notebook handy, finding a writing surface that was nearby but not in the way, hunting for the clock, and remembering how much food was in the bottle when you started. Clearly, some sort of hands-free solution was needed, and I immediately thought of our Google speaker.

Google Assistant on the Google speaker is really useful as a speech-to-text service: when I say “Ok Google” followed by a key phrase, it recognizes what I said, translates it into text, and allows other programs to process that text.

I’ll need a program like Google’s web-based spreadsheet program, Google Sheets, to keep track of detailed notes. A separate service, IFTTT (pronounced “ift”), which stands for “if this then that,” will act as the glue bringing these programs together.

The basic idea here is that I want to say a key phrase like “Ok Google, the baby has started a bottle”, then I want the process to create a new line in a specific spreadsheet in Google Sheets, noting the new bottle and the date and time. Then I want the speaker to acknowledge that it understood what I said; something like “num num nummy”.

The concepts here can translate to almost any job where it would be advantageous to do a repetitive, digital task hands-free. I recommend playing around with IFTTT and have some fun with it.
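To make the flow concrete, here is a minimal Python sketch of what the applet effectively does. The phrase, the sheet layout, and the spoken reply are my illustrative choices, not the actual IFTTT configuration:

```python
from datetime import datetime

sheet = []  # stands in for the Google Sheet that the applet appends to

def handle_phrase(phrase):
    # "If this": Google Assistant hears the key phrase.
    # "Then that": append a timestamped row to the spreadsheet.
    sheet.append([datetime.now().isoformat(timespec="seconds"), phrase])
    # Finally, the speaker acknowledges with a canned response.
    return "num num nummy"

reply = handle_phrase("the baby has started a bottle")
```

Each spoken command becomes one row, so the spreadsheet doubles as a feeding log with no pen or paper involved.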

Here are the links that you’ll need to get started:

That was my project day!

If you liked this project, check out some of my others:

Machine Learning Color Classifier

Fume Extractor

Installing Python 2.7 and Modules on Windows


Machine Learning Color Classifier

My line-following robot needed to differentiate between 5 colors to choose the line to follow. Instead of hard-coding and trial-and-error to figure out what the sensor is seeing, I decided to use a Machine Learning algorithm. It was surprisingly easy to implement on my Arduino Uno.

For a recent line-following robot competition at the RSSC, I was presented with a new challenge: the organizers made the path change colors along the way, forcing the robot to decide which path to take when it came to an intersection. My line-following sensor of choice is usually an array of five generic infrared reflective sensors, which work great when there’s high contrast between the lines and the field. For this challenge, though, those sensors don’t have the color-spectrum range to get the job done. Adafruit manufactures a breakout circuit board based on the TCS34725 sensor, which can detect the full RGB visible spectrum, has an on-board LED to reduce the effects of ambient light on the sensor output, and conveniently communicates via I2C.

My basic line-following algorithm uses three sensors in a line array that constantly scan the ground and report back the color they see, while the robot uses that data to keep itself centered on the right colored line. The TCS34725 chip has a fixed I2C address, so I also bought an I2C multiplexer breakout board, which lets me access all three sensors without address conflicts.

The data from the color sensors comes as one integer each for Red, Green, and Blue, plus another integer for Clear light. My algorithm needs to take in that raw data and produce a prediction of what color the sensor is actually looking at, all within the limitations of the Arduino. It would be somewhat difficult to train the algorithm and run it all on the Arduino, but the reality is: I don’t have to.

The whole process is actually two smaller processes: training and prediction. Training requires more processing power than prediction because it works through my entire collection of known data to fit the prediction model to the data provided. Prediction, on the other hand, is simple. It relies on a single calculation called the sigmoid function, which is easy to implement on the Arduino Uno and runs quickly as well. I don’t need to train the algorithm on the Arduino; that can be done on the computer. Here’s an outline of the whole process:
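To show how cheap the prediction side is, here is the sigmoid step in Python. The weights and the sensor reading below are made-up numbers for illustration, not my trained values:

```python
import math

def sigmoid(z):
    # Squash any real number into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def predict(x, theta):
    # Weighted sum of the raw sensor values, squashed to a 0..1 "match" score
    return sigmoid(sum(xi * ti for xi, ti in zip(x, theta)))

# Hypothetical reading: bias term 1, then Red, Green, Blue, Clear counts
x = [1, 220, 60, 55, 340]
theta = [-0.05, 0.08, -0.04, 0.04, -0.03]  # made-up weights
score = predict(x, theta)  # near 1 means "this looks like the trained color"
```

Training is just the search for the theta values that make these scores agree with the labeled data; prediction is the handful of multiplies and one exponential you see here.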


  1. Use the color sensors and an Arduino, connected to my computer to collect raw data.
    • This is the supervised teaching method (i.e. a person is collecting the data and doing the work of picking what color it is up front.)
    • The data will be output over the serial port and saved in a text file
    • I need to know which color the sensor is trained on for each data set, so I’ll make one text file for each color
    • To make the prediction robust, I’ll have to frequently change the position of the sensor and the lighting conditions
    • The colored lines will always be near a white background, so I’ll need to collect data near those borders also
  2. Use Octave to import the data from the text files and use the fminunc function to quickly train the algorithm using gradient descent.
    • When I import the data, I create an additional column that represents the known output from each set and assign a value; either 1 for match or 0 for not a match.
      For example, if I’m training for finding Red, I’ll set all of the known outputs for the Red data set to 1 and all of the known outputs for the Blue, Green, Black, and White data to 0.
    • Before training, I randomize the order of the data, mixing matches and non-matches, then set aside about 30% of the data points which I won’t use during training. After I’m done training, I’ll use the trained algorithm on this data to test if it’s working or not.
  3. The key part of the trained algorithm is the weights that I will use in the Arduino during real-time operation.
Training Process


  1. During real-time operation, the Arduino will only use the prediction model.
    • Unlike an unsupervised learning algorithm, this prediction model will not “learn” anything more than it already has at the time I program the Arduino
    • The Arduino takes in the raw data, multiplies it by the weights, then passes the sum through the sigmoid function to decide if the color matches.
    • I have to break the classification problem into a series of smaller problems. Instead of predicting “which color is it”, the Arduino will consider each color one-at-a-time as a yes/no question.
  2. The output isn’t guaranteed to be clear. For example, if the sensor is looking at a red/blue transition, it might produce a positive result for red and blue at the same time.
    • I consider this possibility when writing the rest of my Arduino program.
Prediction Process

If you’re new to machine learning and you want to dive deeper into using it on your own, I recommend you check out Prof. Andrew Ng’s Machine Learning course on Coursera.

The algorithm worked out well for me. When I gathered data in my garage under LED light, the trained algorithm worked in that setting, probably very close to a 99% success rate. On the day of the competition, the lights tended more toward the red end of the spectrum, so I gathered more data that day and reran my training algorithm. After that, it worked fine. In general, gathering more data under more lighting conditions would build robustness into the algorithm, but for a one-time-use project, it was good enough to get the job done. One major limitation I experienced was with the color sensor. It uses an accumulation time to gather data, and that really hurt my loop time: every call to the sensor had a built-in 154ms delay, multiply that by 3 sensors polled sequentially, and the robot was very difficult to keep on track. If I had more time, I might have tried something other than a blocking delay, perhaps letting the sensors accumulate in parallel in the background and cutting the wait to a single 154ms period.
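The timing penalty is easy to see with a little arithmetic. This sketch assumes each getRawData() call blocks for one full 154ms integration period, as in my setup:

```python
t_integration = 0.154  # seconds per TCS34725 reading at 154ms integration time
sensors = 3

# Polling one sensor after another stacks the integration delays
sequential_wait = sensors * t_integration  # 0.462 s per pass

# If all three sensors accumulated light at the same time, one period would do
parallel_wait = t_integration  # 0.154 s per pass
```

Nearly half a second between steering updates is an eternity for a line follower, which is why the sequential polling hurt so much.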

If you need a little help trying this out on your own, here is some sample code you can use to get started:

Arduino code for collecting the data to a text file:

#include <Wire.h>
#include "Adafruit_TCS34725.h"

/* Initialize with specific int time and gain values */
Adafruit_TCS34725 tcs = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_154MS, TCS34725_GAIN_1X);

// Print one sample of raw color data, comma-separated, for the classifier
void RawColorPrint(uint16_t x[5]){
  //debug code - color data for classifier
  //Serial.print(x[0],DEC); Serial.print(","); // always = 1
  Serial.print(x[1],DEC); Serial.print(",");
  Serial.print(x[2],DEC); Serial.print(",");
  Serial.print(x[3],DEC); Serial.print(",");
  Serial.println(x[4],DEC);
}

void setup() {
  Serial.begin(9600);
  while(!Serial); //wait for Serial to start

  // Initialize the sensor
  if (tcs.begin()) {
    Serial.println("Found sensor");
  } else {
    Serial.println("No TCS34725 found... check your connections");
    while (1);
  }
}

void loop() {
  //debug code - color data for classifier
  uint16_t col0[5];
  col0[0] = 1;
  tcs.getRawData(&col0[1], &col0[2], &col0[3], &col0[4]);
  RawColorPrint(col0);
}

Octave code for training the prediction algorithm:


function g = sigmoid(z)
  g = 1./(1 + exp(-z));
end

function p = predict(theta, X)
  p = round(sigmoid(X*theta));
end

function [J, grad] = costFunction(theta, X, y)
  m = length(y); % number of training examples
  J = -((1/m)*sum(y.*log(sigmoid(X*theta)) + (1.-y).*log(1.-sigmoid(X*theta))));
  grad = (1/m)*sum((sigmoid(X*theta)-y).*X);
end

Main code:

%% Initialization
clear ; close all; clc

%% Load Data
Whidata = load('White.txt');
Bladata = load('Black.txt');
Bludata = load('Blue.txt');
Reddata = load('Red.txt');
Gredata = load('Green.txt');

fprintf('Training for Red\n')

%% Assign Match(1) to Red, and No Match(0) to all the rest
data = [Reddata ones(size(Reddata)(1),1)];
data = [data; Bladata zeros(size(Bladata)(1),1)];
data = [data; Bludata zeros(size(Bludata)(1),1)];
data = [data; Whidata zeros(size(Whidata)(1),1)];
data = [data; Gredata zeros(size(Gredata)(1),1)];

%% Randomize Order
data = data(randperm(size(data,1)),:);

%% Sequester 30% of the data for testing after training
datacutoff = int16(size(data)(1)*0.7);
X = data([1:datacutoff], [1:4]); y = data([1:datacutoff], 5);
Xtest = data([datacutoff + 1:size(data)(1)], [1:4]);
ytest = data([datacutoff + 1:size(data)(1)], 5);

%% Add constant term to X and Xtest
[m,n] = size(X);
X = [ones(size(X)(1), 1) X];
Xtest = [ones(size(Xtest)(1),1) Xtest];

%% Initialize weights
initial_theta = zeros(n + 1, 1);

%% Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 10000);

%% Run fminunc to obtain the optimal weights
[theta, cost] = ...
fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

%% Print weights to screen in Arduino-compatible format
fprintf('{%f, %f, %f, %f, %f} \n', ...
theta(1), theta(2), theta(3), theta(4), theta(5));

%% Use the optimal weights to predict the color of the checking data set
ycheck = predict(theta, Xtest);
fprintf('Train Accuracy: %f\n', mean(double(ycheck == ytest)) * 100);

Arduino code for the prediction algorithm:

#include <Wire.h>
#include "Adafruit_TCS34725.h"

#define TCAADDR 0x70

/* Initialize with specific int time and gain values */
Adafruit_TCS34725 tcs0 = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_154MS, TCS34725_GAIN_1X);
Adafruit_TCS34725 tcs1 = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_154MS, TCS34725_GAIN_1X);
Adafruit_TCS34725 tcs2 = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_154MS, TCS34725_GAIN_1X);

/* Trained algorithm for 154ms delay and 1x gain */
double WhiteTh[5] = {-17.197105, -0.007517, 0.033887, -0.040166, 0.010233};
double BlackTh[5] = {13.863546, -0.011893, -0.022478, -0.013213, 0.005284};
double BlueTh[5] = {0.113073, -0.031746, -0.112839, 0.120412, 0.010445};
double RedTh[5] = {-0.051582, 0.077581, -0.036699, 0.042285, -0.027792};
double GreenTh[5] = {-12.740958, -0.187098, 0.227913, -0.174424, 0.033591};

void ClassifiedColorPrint(){
  uint16_t col0[5], col1[5], col2[5];
  col0[0] = 1;
  col1[0] = 1;
  col2[0] = 1;

  // Select each multiplexer channel before talking to its sensor
  tcaselect(0);
  tcs0.getRawData(&col0[1], &col0[2], &col0[3], &col0[4]);
  tcaselect(1);
  tcs1.getRawData(&col1[1], &col1[2], &col1[3], &col1[4]);
  tcaselect(2);
  tcs2.getRawData(&col2[1], &col2[2], &col2[3], &col2[4]);

  double B1Pred, B2Pred, WPred, RPred, GPred;

  B1Pred = predict(col0,BlackTh);
  B2Pred = predict(col0,BlueTh);
  WPred = predict(col0,WhiteTh);
  RPred = predict(col0,RedTh);
  GPred = predict(col0,GreenTh);

  //debug code - checking color classifier
  Serial.println(" ");
  Serial.println("Sensor 0");
  Serial.print("Black: "); Serial.println(B1Pred * 100.0);
  Serial.print("White: "); Serial.println(WPred * 100.0);
  Serial.print("Blue: "); Serial.println(B2Pred * 100.0);
  Serial.print("Red: "); Serial.println(RPred * 100.0);
  Serial.print("Green: "); Serial.println(GPred * 100.0);

  B1Pred = predict(col1,BlackTh);
  B2Pred = predict(col1,BlueTh);
  WPred = predict(col1,WhiteTh);
  RPred = predict(col1,RedTh);
  GPred = predict(col1,GreenTh);

  Serial.println(" ");
  Serial.println("Sensor 1");
  Serial.print("Black: "); Serial.println(B1Pred * 100.0);
  Serial.print("White: "); Serial.println(WPred * 100.0);
  Serial.print("Blue: "); Serial.println(B2Pred * 100.0);
  Serial.print("Red: "); Serial.println(RPred * 100.0);
  Serial.print("Green: "); Serial.println(GPred * 100.0);

  B1Pred = predict(col2,BlackTh);
  B2Pred = predict(col2,BlueTh);
  WPred = predict(col2,WhiteTh);
  RPred = predict(col2,RedTh);
  GPred = predict(col2,GreenTh);

  Serial.println(" ");
  Serial.println("Sensor 2");
  Serial.print("Black: "); Serial.println(B1Pred * 100.0);
  Serial.print("White: "); Serial.println(WPred * 100.0);
  Serial.print("Blue: "); Serial.println(B2Pred * 100.0);
  Serial.print("Red: "); Serial.println(RPred * 100.0);
  Serial.print("Green: "); Serial.println(GPred * 100.0);
}

// Point the TCA9548A multiplexer at channel i
void tcaselect(uint8_t i) {
  if (i > 7) return;

  Wire.beginTransmission(TCAADDR);
  Wire.write(1 << i);
  Wire.endTransmission();
}

double sigmoid(double z) {
  return 1.0/(1.0 + exp(-z));
}

double predict(uint16_t x[5], double theta[5]) {
  double answer = 0;

  answer = ((double)x[0]*theta[0] + (double)x[1]*theta[1] + (double)x[2]*theta[2] + (double)x[3]*theta[3] + (double)x[4]*theta[4]);
  answer = sigmoid(answer);
  return answer;
}

void setup() {
  Serial.begin(9600);
  while(!Serial); //wait for Serial to start

  Wire.begin();

  // Initialize the sensors, one multiplexer channel at a time
  tcaselect(0);
  if (tcs0.begin()) {
    Serial.println("Found sensor");
  } else {
    Serial.println("No TCS34725 found on channel 0... check your connections");
    while (1);
  }
  tcaselect(1);
  if (tcs1.begin()) {
    Serial.println("Found sensor");
  } else {
    Serial.println("No TCS34725 found on channel 1... check your connections");
    while (1);
  }
  tcaselect(2);
  if (tcs2.begin()) {
    Serial.println("Found sensor");
  } else {
    Serial.println("No TCS34725 found on channel 2... check your connections");
    while (1);
  }
}

void loop() {
  //Print Classifications
  ClassifiedColorPrint();
}

For my code and these examples, I used the TCS34725 library from Adafruit and leaned heavily on their examples for the TCS34725 and TCA9548A.

FTC disclosure: this article contains affiliate links

If you liked this project, check out some of my others:

Instant Parade!

The ThrAxis – Our Scratch-Built CNC Mill

Fume Extractor


Fume Extractor

If you tinker with electronics, then you’re certainly familiar with solder fumes. You probably do your best to avoid breathing them, but let’s face it: like I used to, you probably just accept them as a necessary evil. I grew weary of dealing with them, so I built this fume extractor as a way to filter the smoke and breathe a little easier.

As a prototype, I put the pair of fans in series (one in front of the other) with an activated charcoal filter on the front of both, but I found that the air throughput was about the same as with a single fan. In hindsight, that makes sense: each fan has a maximum speed, which caps its free-air flow, and stacking fans in series adds pressure capability rather than flow. The cable I used for the power cord had a thumbwheel switch that was pretty difficult to turn, so that had to go. I also learned that it would have been easier to have a handle and a shorter power cord, since my desk is small and there’s a power strip right next to my soldering workspace.

The final design needed a handle, a short power cord, a rocker switch on the body, both fans in parallel (next to each other), and a fuse for short-circuit protection. I designed the whole assembly in Fusion 360, starting with the fan housings. The two fans are sandwiched, along with a 3D printed handle, between two laser-cut 1/4″ plywood panels. The activated charcoal filter is held in front of the fans by another laser-cut plywood panel and four 3D printed brackets. The brackets have a nice snap-in action. If you’d like to make your own version of my design, the files are hosted on

That was my project! If you liked this project, check out some of my others:

Instant Parade!

Applications and Fabrication with Plastics

The ThrAxis – Our Scratch-Built CNC Mill



Installing Python 2.7 and Modules on Windows

Installing Python isn’t hard. Figuring out what to install is a nightmare. This is my experience with installing Python and some modules on Windows, and it’s good news, I promise.

I’ve done some work with Python on Raspberry Pi Raspbian and on Ubuntu Linux, but I’ve steered well clear of Python on Windows for one simple (albeit embarrassing) reason: I couldn’t figure out how to add modules (also sometimes called packages). Installing Python itself is really straightforward thanks to the ladies and gentlemen over at the Python Software Foundation, but third-party modules are extremely daunting to the novice. Enter Python hero Christoph Gohlke of the Laboratory of Fluorescence Dynamics at UCI, who has made compiled versions of popular modules available for free on his UCI website.

Check out “The Short Version” if you only want the step-by-step.

The Long Version:

For those who are unfamiliar with Python, here’s a quick tutorial: Python is a programming language that has been designed to be cross-platform and human-readable. The modules I was referring to earlier are libraries of code, written in both C and Python, that provide extra functions that vanilla Python’s developers didn’t include. For example, NumPy is a popular module which provides mathematical functions like trigonometric, exponential, and logarithmic operations.

Python and the modules I’m talking about in this post are also licensed as open-source software, meaning that anyone can download them and use them in their own software solutions, provided they abide by the conditions of the licenses. While Python the language is governed and distributed by the Python Software Foundation, the modules aren’t and can be created by anyone. You can imagine this might cause problems like “which modules can be trusted” and “how do I know it will work,” and it does. I overcome these issues by looking online for examples that do similar things to what I want to do, checking out the module documentation, and finally, trying it out. It’s a time-consuming process to vet a new module, but the price is right.

If you’ve tried to install Python and a module (I’m going to talk about Numpy, for example, but other modules are similar), then you probably found the right Python installer on, installed it without issue, then emboldened by your newly found powers, struck off for only to panic when the download option is for source code from GitHub and not an installer. I was there, friend, I feel your pain.

The expectation with supplying source code is that you, the user, will compile it yourself. Projects generally provide some instructions, but the general consensus is that their target audience is developers, and compiling code is something developers know how to do. To be fair, there is such a variety of compilers that it’s almost impossible to give a universal set of instructions. Because Linux comes with open-source compilers for C and C++, installation on Linux is easy to automate, but no such luck for Windows.

To make installation easier, Eggs (legacy) and Wheels (the modern format since 2012) were introduced as standardized formats for code and binaries (compiled code). They are essentially Python-specific installation files. Wheels seem great in theory, but at the end of the day they contain pre-compiled code, which means anyone who uses them is subject to the details of the computer that compiled them. For example, if a developer like Mr. Gohlke wants to contribute to the community and make life easier for the rest of us by compiling the source code, the wheel he produces may or may not work for someone else because of the version of source he used, the compiler and any options he selected, the OS running on his computer, the processor architecture, and the versions of Python and pip he had installed at the time. Generally speaking, this tends to be a non-issue as long as the broad strokes are the same, but sometimes the differences between his computer and someone else’s can make the compiled code misbehave. The good news here is that wheels are generally useful, and they can be a lot easier than compiling your own source if your computer science skills are limited.
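One way to see why a wheel is tied to a particular setup is the filename itself, which encodes the Python version, ABI, and platform under the PEP 427 naming scheme. The filename below is an illustrative example I made up, not a specific file from any download site:

```python
def parse_wheel_name(filename):
    """Split a wheel filename into its PEP 427 naming fields:
    distribution-version-pythontag-abitag-platformtag.whl"""
    assert filename.endswith(".whl")
    parts = filename[:-len(".whl")].split("-")
    dist, version = parts[0], parts[1]
    py_tag, abi_tag, platform_tag = parts[-3], parts[-2], parts[-1]
    return dist, version, py_tag, abi_tag, platform_tag

# Hypothetical example: a CPython 2.7 build for 64-bit Windows
fields = parse_wheel_name("numpy-1.11.3-cp27-none-win_amd64.whl")
```

If any of those tags don’t match your interpreter and OS, pip will refuse to install the wheel, which is exactly the “match your computer configuration” rule in the steps below.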

“Okay, now I know the history of the world, but what can I do with it?” you may ask…

The Short Version:

Based on my (limited) experience, I put together these steps which I’m sharing as a guide to help accelerate your python installation process. To be clear, I’m not guaranteeing that these steps will work and some interpretation may be required to get them to work.

My process involves changing the existing path variables, which, if done incorrectly, can destabilize your computer, so use extreme caution! These steps are provided for information only; your actual mileage may vary. If you aren’t comfortable with these steps, I’d suggest you find another way to install Python.

  1. Download and install Python from
  2. Add python and pip to the path variable.
    • Right click on “My Computer” in windows explorer, then click “Properties” in the menu.
    • Click “Advanced System Settings” in the System window (administrative credentials required)
    • In the “System Properties” dialog, click the “Environment Variables…” button in the “Advanced” tab.
    • Find “Path” in the System variables frame of the “Environment Variables” dialog and click “Edit…”
    • Add the python and pip paths to your path variable
      On my computer, Python installed to the default location, so the added text is (no quotes): “;C:\Python27;C:\Python27\Scripts”
    • Click OK all the way back.
  3. Download the pre-compiled pip files for the modules you want from Christoph Gohlke’s webpage.
    • The version you download has to match your computer configuration. For example, I’m installing wheels for Python 2.7 and my computer is an AMD 64 machine, so for Numpy, I picked
  4. Install the wheel files using pip through an admin command prompt.
    • When you go to launch the command prompt, right-click on it and select the “Run as administrator” option.
    • use the “cd” command to navigate to the folder where you downloaded the wheel files
      For example:
      cd C:\Users\Guest\Downloads
      navigates to the default downloads folder on my computer for username “Guest”. (Note the space between cd and C:\…)
    • invoke pip to install the wheel file by typing “pip install <wheel>” where <wheel> represents the filename for the wheel including the .whl extension.
  5. Test out the module by opening IDLE and typing “import <module>” where <module> represents the module name like “numpy” for Numpy and “cv2” for OpenCV. If something went wrong, python will return an error message.
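Step 5 can also be scripted. Here is a small helper of my own (a sketch, not part of any installer) that reports whether a module imports cleanly:

```python
import importlib

def module_ok(name):
    """Return True if the named module imports without error."""
    try:
        importlib.import_module(name)
        return True
    except ImportError:
        return False

# e.g. module_ok("numpy") should return True after installing the NumPy wheel
```

Running this for each module you installed gives a quick pass/fail list without opening IDLE by hand.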

I hope this little guide helps you understand Python a little better and helps you get up and running faster. Please share your experience by leaving a comment below.

If you liked this project, you might also like:

Entry into Machine Vision


Applications and Fabrication with Plastics

Essentially, this presentation comes down to one question: why should you use plastics? Or, phrased differently, how could you use plastics?
Is the strength suitable? Will it stretch? What does the final shape need to be? Do you have the tools? Will the part be abraded? Are there unconventional features that you need to have?



Even though they’re all lumped together as “plastics,” different plastic materials can have wildly different properties, and it’s important to understand those differences so you can pick the right material for the job. At the highest level, plastics can be separated into two groups by the methods you can use to work them. The first group, thermoplastics, soften when heated and harden when cooled, and this process is repeatable many times. Thermoplastics are often sold in blocks, sheets, tubes, and rods because they are ready to work in their solid form. Thermoset plastics, on the other hand, can’t be heated and re-worked after their initial forming. They are sold in a liquid or gel form with a matching hardener, or they harden by some other means (UV light, exposure to moisture in the air, etc.).
Each of those categories has a wide spectrum of plastics with a varied array of properties to choose from. I’ve listed some of the ones I’m going to talk about today.
I’m going to focus mostly on thermoplastics because of their availability and usefulness in robot-building.


How do you know if a plastic will do the job as a structure? To answer that question, we can use ultimate tensile strength (write on board), which is the stress at which the part will break. Tensile stress is like the opposite of pressure: when you pull the two ends of a piece apart, the force is distributed over the cross-section of the part, and if you apply enough force, the part breaks; that’s where the “ultimate” part comes in. We can use that property to compare different materials to each other. Also, in robotics, weight is your enemy, so I charted ultimate tensile strength against density to give a general idea of how these plastics might perform relative to each other.
For example, UHMW has a strength of about 20MPa and very low density, meaning it’s very lightweight. Polyethylene has about the same strength, but a much higher density. You’ll also notice that this distribution is roughly linear, where the denser the material, the stronger it is, but that’s not always the case. Some special varieties have additives that make them stronger or lighter. You can see ABS, Acrylic, Polycarbonate, and PLA.
Now, I’m sure the 3D printer users among you are looking at this chart and saying “Wait, that’s not right. ABS is stronger than PLA!” What’s happening here is that PLA IS in fact stronger than ABS under an even, slowly applied force, but PLA is BRITTLE, so it can’t take a beating like ABS can. In other words, PLA will break more easily if a sudden impulse force is applied.
Also, for comparison:
Aluminum: density 2.7g/cm³, 80MPa to 570MPa depending on alloy. If you buy aluminum at the hardware store, I wouldn’t assume higher than 80MPa. 6061-T6 (aircraft-grade aluminum) has the best properties and is very expensive.
Structural steel: density 7.85g/cm³, 400-550MPa. Steels like chromoly can be stronger than 670MPa.


You may or may not have heard about the strong axis and the weak axis of 3D printed parts. If you haven’t, the general idea is this: 3D printed parts act like composites. Even though a part is all made of the same material, it can be weaker when pulled in one direction than in another. This phenomenon comes from the bonding strength between printed layers and the difference in cross-section between slices taken in different directions.
Consider this example: a 1” cube cross-section of a print made with ¼” beads, which I've drawn as rounded squares (I realize the proportions aren't realistic). If you calculate the cross-sectional area between layers (xy) in my example, you'll find that the rounded corners leave only about half of each square's width in contact with the layer above. Multiply that by the 1” length and you end up with 0.5 in², or about 50% of the possible 1×1.
If you do the same with the cross-section at the end, where we've cut it out of the part, you'll find that the rounded corners only take (4-π)/256 from each bead, leaving a cross-sectional area of about 0.94 in², or 94%, nearly double the area between layers!
So in this example, if you pull the cube from top and bottom as shown (or side to side), the part will break under about half the force required along the axis into and out of the board.
Since FDM parts are printed as whole perimeters, the actual cross-section through a part varies significantly, with most regions having combinations of cross-sections like these and others where layers stack up, so the strength will land somewhere between the 50% and 94% figures. For certain, though, the strength along the z-axis will be less than along any other axis of the part.
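The arithmetic above is easy to check. Assuming the bead geometry in my drawing (¼” beads with a 1/16” corner radius, which is my reading of the sketch), a few lines of Python reproduce both cross-sections:

```python
from math import pi

# Geometry assumed from the sketch: 1/4" beads filling a 1" x 1"
# cross-section, each drawn as a rounded square with 1/16" corner radius.
w = 0.25      # bead width, inches
r = w / 4     # corner radius, inches (1/16")
n = 16        # beads in the 1" x 1" cross-section (4 x 4)

# End cross-section (cut across the beads): each corner loses a small
# square minus a quarter circle, r^2 - pi*r^2/4, i.e. (4 - pi)/256 per bead.
corner_loss = 4 * (r**2 - pi * r**2 / 4)
area_end = n * (w**2 - corner_loss)          # about 0.946 in^2

# Between layers (xy): layers only touch along the flat strip between the
# rounded corners, w - 2r = 1/8" wide, times four beads across, times 1".
area_layers = 4 * (w - 2 * r) * 1.0          # 0.5 in^2

print(f"end: {area_end:.3f} in^2, between layers: {area_layers:.3f} in^2")
```

That's the roughly 2:1 strength ratio between the in-plane axes and the z-axis.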


While we're on the topic of 3D printers, I wanted to write for a minute about a problem FDM printers have in the hot end, called heat creep. If you've printed with both ABS and PLA, you may have noticed that it's more of a problem with PLA than ABS, but why? The answer is the glass transition temperatures of the two plastics. The glass transition temperature is the temperature where a plastic becomes rubbery and flexible but isn't yet a melted liquid. Before I get to why that's important, let me explain the major parts of the extruder:

  • The build plate is where the plastic is deposited
  • The nozzle squeezes the plastic filament diameter down to the printed size
  • The heater block brings the filament to its melting temperature
  • The cooling fins keep the incoming filament cool enough so the pressure from the mover gets transferred all the way down to the nozzle.
  • The incoming plastic also helps keep the channel cool because it arrives at room temperature.

Generally, you’d use the same extruder for both PLA and ABS, so both plastics will be subject to the same basic principles.
For PLA, you can generally print near 180°C; with ABS, near 220°C. Most of the time, when you're printing perimeters or the first few layers, the filament remains solid in the tube because the combination of ambient cooling and incoming filament keeps everything just perfect. But when you aren't consuming much filament, the filament sitting in the tube heats up, and the rubbery filament can deform and jam. Then the mover just spins and hogs out a nice semicircle right here, and you get to spend a couple of hours cleaning up the mess. This problem is more prevalent with PLA because its glass transition temperature is so much lower, so it happens more easily than with ABS.


There are lots of other ways that plastics can be used. For example:
If you need a plain bearing material or a non-marking material for bumpers, you might use UHMW because it has high abrasion resistance, even though it's quite soft and a poor choice for structural parts.
Or if you needed a cover for your circuit boards, you might use sheets of styrene, ABS, acrylic, or polycarbonate because they can be heated and formed to any shape.
Everyone knows that if you want a part that can't otherwise be machined, you go with a 3D printer, which most commonly prints ABS or PLA plastic.
Another lesser-known use for 3D printer filament is as rivet material. Basically, you mushroom one end using a heat source like a heat gun or soldering iron, push the filament through your parts, and mushroom the other end.


Earlier, I mentioned that UHMW is a good bearing material but isn't good for structures. Why is that? The answer comes down to one number: Young's modulus. Earlier, we talked about stress, which is force distributed over the cross-section of a material. Strain is a related reaction of a material to force: it's the measure of how much the material stretches. Young's modulus is just the ratio of stress to strain. UHMW has an extremely low Young's modulus, meaning that when UHMW is under stress (pulled apart), its length changes drastically. Clearly, such a material would make flimsy structures.
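To put numbers on that, here's a small sketch. The moduli are rough figures I'm assuming (UHMW around 0.7 GPa, aluminum around 69 GPa), not datasheet values:

```python
# Strain = stress / Young's modulus. Moduli below are rough assumed
# figures: UHMW ~0.7 GPa, aluminum ~69 GPa.
def strain_percent(stress_mpa, modulus_gpa):
    """Elastic stretch, as a percent of original length."""
    return stress_mpa / (modulus_gpa * 1000.0) * 100.0

stress = 10.0  # MPa, the same pull applied to both materials
uhmw_strain = strain_percent(stress, 0.7)
alu_strain = strain_percent(stress, 69.0)
print(f"UHMW: {uhmw_strain:.2f}%  aluminum: {alu_strain:.4f}%")
```

Under the same 10 MPa pull, the UHMW part stretches about a hundred times more than the aluminum one, which is why it makes a flimsy structure even when it doesn't break.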


In general, wood-cutting tools will work with most thermoplastics and thermoset plastics. You'll need to be careful when working with brittle materials like acrylic, which may chip and shatter if the tool is too aggressive or catches. In those cases, it's best to take a little at a time with tools like razor knives, which cut slowly, or step drills, which widen a hole gradually. It also helps to back the material with a scrap piece of wood, which prevents blowout (chipping where the tool exits the material).
When drilling or cutting with power tools, there's a relationship between the speed of your tool and the amount of pressure you apply. You'll want to avoid running the tool too fast and overheating your plastic, possibly melting it, or, on the other extreme, forcing too much material into your tool and causing the plastic to chip or crack. The ideal zone grows or shrinks with different materials. For example, with acrylic, because it's so brittle, the ideal zone is very small. ABS, styrene, and polycarbonate are much more forgiving, and you can just tear through UHMW because it cuts like butter.



One technique you might not be familiar with is heat forming, where you apply heat to a specific area of the plastic, bend it into shape, and let it cool. For example, turning a strip of acrylic into a 90° angle bracket.



Vacuum forming is a more advanced version of heat forming where you heat a sheet of thermoplastic, stretch it across an object, and use a vacuum to suck the air out, forming the part. This is not as hard as it sounds, but it generally takes a few tries to get it just right.


Entry into Machine Vision

To paraphrase Helen Keller: the only thing worse than being a blind robot is to be a robot with sight, but no vision. 

Machine vision is the collection of methods for taking in video or still-image data and translating it into a meaningful interpretation of the world that a computer can understand. Some of the most popular applications include face recognition, glyph tracking, 2D barcodes, navigation, and, in recent years, artificial intelligence algorithms that can describe the objects in a picture. My thinking is that if I can understand the tools of machine vision, I can extend those tools to the robotics problems I'm working on, like object grasping, manipulation, and navigation. Machine vision algorithms are built into many sophisticated software packages like MATLAB or LabVIEW, but these are typically VERY expensive, making them completely inaccessible to me. Fortunately, the principles of machine vision are well documented outside of these expensive software packages, so at least there's hope that if I work for it, I can build up my own library of machine vision tools without spending a fortune.

Since I'm building this library for myself, I want to avoid having to rewrite the programs to adapt them to whatever hardware I might have connected to a robot, including Windows and Linux machines and system-on-a-chip computers like the Raspberry Pi, BeagleBone Black, or Intel Edison. My programming experience has been on Windows computers, so I realize the languages I'm familiar with won't be directly useful. I chose Python because it's open source, free, well supported, popular, and available across all of the hardware platforms I'm concerned with.

I chose finding a color marker as my first venture into the murky waters of machine vision. The problem is pretty simple: use the webcam to find a color, send the coordinates to a robot arm, and move the robot arm. Au contraire, mon frere. That is not all.

This looks easy, doesn't it?


The first hurdle to overcome is capturing the webcam data. Fortunately, the Python module Pygame is incredible for doing just that. It allowed me to capture and display a live feed of images from the webcam and superimpose drawing objects over the images so I could easily understand what I was seeing. Most of the code I used came from the tutorial programs in one form or another. In the picture above, you can see the webcam image shown with the superimposed green dot representing the center of the marker.

The second battle to face is lighting. When we look at an object, we are actually seeing the light it reflects. That means that when you change the light level or the color content of your light source (for example, yellow instead of white), suddenly your robot gets lost, because the color it "sees" is different from the color it expects. So now we have to add a step to calibrate the vision system with its marker every time we run the program, in case the light levels have changed between runs.
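To make the calibrate-then-search idea concrete, here's a minimal, webcam-free sketch standing in for the Pygame version. The 5×5 "frame" is made-up data, and `color_distance` and `find_marker` are hypothetical helpers of mine, not part of Pygame:

```python
# Calibrate-then-search, on fake data. color_distance and find_marker
# are hypothetical helpers; a real frame would come from the webcam.

def color_distance(c1, c2):
    """Squared RGB distance; cheap and fine for thresholding."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2))

def find_marker(frame, target, threshold=2000):
    """Centroid (x, y) of pixels near the target color, or None."""
    hits = [(x, y)
            for y, row in enumerate(frame)
            for x, pixel in enumerate(row)
            if color_distance(pixel, target) < threshold]
    if not hits:
        return None
    n = len(hits)
    return (sum(x for x, _ in hits) / n, sum(y for _, y in hits) / n)

# Fake 5x5 frame: gray background with a red marker in the lower right.
gray = (128, 128, 128)
frame = [[gray] * 5 for _ in range(5)]
for x, y in [(3, 3), (4, 3), (3, 4), (4, 4)]:
    frame[y][x] = (190, 40, 35)

# "Calibration": sample the marker's color under the current lighting
# instead of hard-coding the red we expect.
calibrated = frame[3][3]
print(find_marker(frame, calibrated))  # → (3.5, 3.5)
```

Sampling the marker at startup is exactly the calibration step: the search matches whatever the camera currently sees, not an ideal color.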

The next challenge comes in the form of coordinate systems. When we look at the image the webcam captures, we can get the position of the object in what I'll call webcam coordinates. Webcam coordinates are basically the count of pixels along the x- and y-axes from one of the corners. However, the robot arm doesn't know webcam coordinates. It knows the space around it in what I'll call robot coordinates, the distance around the base measured in inches. In order for the computer to provide the arm with something that makes sense, we have to translate webcam coordinates into robot coordinates. If your webcam and arm have parallel x- and y-axes, the conversion may be as simple as scaling the pixel count linearly. If the axes aren't parallel, a rotation is needed. I kept the axes parallel and simply put a ruler down on the surface, used it to "measure" the distance the webcam sees, then divided by the number of pixels in that axis.
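Here's what that ruler trick looks like in code, with parallel axes. The image size and tabletop dimensions are made-up numbers for illustration:

```python
# Webcam -> robot coordinate conversion with parallel axes: measure the
# visible area once (the ruler trick), then scale pixel counts linearly.
# The 640x480 image and 12" x 9" tabletop are made-up example numbers.

def make_converter(size_px, size_in):
    """Return a function mapping (px, py) to inches in the robot frame."""
    sx = size_in[0] / size_px[0]   # inches per pixel, x-axis
    sy = size_in[1] / size_px[1]   # inches per pixel, y-axis
    return lambda px, py: (px * sx, py * sy)

to_robot = make_converter((640, 480), (12.0, 9.0))
print(to_robot(320, 240))  # → (6.0, 4.5), the center of the view
```

If the webcam's corner doesn't sit at the robot's origin, you would also add a fixed offset, and if the axes weren't parallel you'd need a rotation first.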

The final roadblock comes when you go to make the arm move to the location you've given it. The solution to this problem can be as simple as geometric equations or as complicated as inverse kinematic models. Since both methods relate to the movement of the arm, I'll just call them both kinematics. Even though this will probably be the hardest challenge to overcome, you should probably take it on first, since the kinematic model of the arm will simplify many other programs you write for the same arm.

Geometric solution to the kinematic equations


The idea behind kinematic modeling is that you want to write a system of equations that tell you what angles to move the joints to so that the end of the arm reaches a particular position and orientation. In general terms, if you want a robot that can move to any position in a plane (2D), it needs at least 2 degrees of freedom (meaning 2 moving joints); to reach any position in a space (3D), it needs at least 3. In my case, my arm has 5 degrees of freedom (5 moving joints), and that makes a purely mathematical inverse kinematic model particularly complicated: I can reach any position in the space around my robot (3D), but the equations have multiple solutions. I chose to constrain the arm to 3 degrees of freedom by forcing the gripper to hold a particular orientation. Then it became easier to model geometrically.
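As an example of the geometric approach, here's the classic two-link planar case (a simpler arm than mine: two joints, one plane). The link lengths are made-up numbers; the law of cosines gives the elbow angle, and a little triangle work gives the shoulder:

```python
from math import acos, atan2, cos, sin, hypot, degrees

# Geometric inverse kinematics for a two-link planar arm (a simpler case
# than my 5-DOF arm). Link lengths below are made-up example numbers.

def ik_2link(x, y, l1, l2):
    """Return (shoulder, elbow) angles in radians that reach (x, y)."""
    d = hypot(x, y)
    if d > l1 + l2 or d < abs(l1 - l2):
        raise ValueError("target out of reach")
    # Law of cosines on the triangle formed by the two links and the
    # line from base to target gives the elbow angle...
    elbow = acos((d**2 - l1**2 - l2**2) / (2 * l1 * l2))
    # ...and the shoulder is the direction to the target minus the
    # offset introduced by the bent elbow.
    shoulder = atan2(y, x) - atan2(l2 * sin(elbow), l1 + l2 * cos(elbow))
    return shoulder, elbow

# Sanity check: reaching straight out to (l1 + l2, 0) needs both joints
# at zero.
s, e = ik_2link(10.0, 0.0, 6.0, 4.0)
print(degrees(s), degrees(e))  # → 0.0 0.0
```

Constraining my arm's gripper orientation reduces the 5-DOF problem to something this same triangle approach can handle.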

The program I wrote works fairly well. It’s able to find objects of many colors and it’s pretty entertaining to watch it knock over the marker and chase it outside the field of view. I demonstrated it at the August meeting of the Robotics Society of Southern California which meets every month on the second Saturday.

RSSC Machine Vision Demo


If you have any suggestions on an application for color marker tracking or if you’d like to know more about this project, please leave a comment below.

That was my project day!

If you liked this project, check out some of my others:

My Thanks to Mr. Clavel

Instant Parade!

The ThrAxis – Our Scratch-Built CNC Mill

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.


A Fun Recipe for Kitchen Chemistry

Chemistry is all around us every day. And now, it's all over my father-in-law's refrigerator, too.

My father-in-law is a chemical engineer and a kitchen chemist in his free time. My wife and I thought the chemistry equivalent of poetry fridge magnets would be the perfect Christmas gift for him. The concept is pretty simple: find a set of molecules that are interesting, imprint their molecular models in some aluminum flat bar, and stick magnetic strips to the back.

In honor of the kitchens they end up in, here is our recipe:

1  –   1″ x 1/16″ aluminum bar, cut to length

1  –   Sharpie marker

1  –   polymer magnetic strip, cut to length

1  –   1/8″ letter/number metal stamping set

Find a set of molecules that you find interesting and figure out their molecular models. Draw them out to scale on a sheet of paper to make sure they fit on the bar stock.

Cut each medallion to length from the aluminum bar with a hacksaw. Leave yourself a little extra around the edges for a cleaner look. Make sure the cut is square to the bar stock, and be sure to deburr the edges with either a file or a deburring tool.

Plan out the lettering on each medallion by drawing the letters out with the Sharpie marker. My wife figured out this cool trick: press down on the aluminum with the stamps and it will leave the slightest ghost of the image, then trace that with the marker.

Glucose Medallion Before Stamping


Imprint the letters on the surface of the medallion with the metal stamps and a hammer. Use a steel bar as a backing when stamping to make sure the aluminum doesn't stretch and draw (drawing is when the edges of the metal curl upward and the spot where the die strikes forms a bowl shape). Wash the marker off with rubbing alcohol or nail polish remover.

Finish the surface of the medallion whichever way looks good to you. Aluminum can be worked with fine-grit sandpaper to make a brushed finish, hit with a steel wire wheel to give a rough finish, or polished until it shines.

Stick the self-adhesive side of the polymer magnet to the back of each medallion. To keep the magnetic strips from delaminating, stack the finished medallions in width order (widest on the bottom) and clamp the stack to a table for 24 hours or more. Use cardboard to keep from marring the surfaces.

Refrigerator Magnets!


That was my project day!

If you liked this project, check out some of my others:

Featured Artist Nameplates

Wooden Time Machine

Set your Creativity Adrift

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.


My Thanks to Mr. Clavel

I spent some time working on this backburner project over my Christmas vacation. It’s nowhere near finished, but it functions, so I thought I’d share what I have.

This robot arm configuration is called a Clavel delta arm. Where you might think of a typical robot arm as a series of links, this is a parallel design. Well, strictly speaking, my robot arm is missing the parallelogram lower links that are a key feature of Reymond Clavel's design, but that's a future improvement, and I'll get there.

My goal for this project is to develop the mechanics and controls to make this arm functional. I'll be adding the parallelogram lower links and developing the kinematic equations so I can drive the arm in Cartesian (x, y, and z) or cylindrical (r, theta, and z) coordinates.

That was my project day!

If you liked this project, check out some of my others:

Instant Parade!

The ThrAxis – Our Scratch-Built CNC Mill

Give Aging Technology a Chance

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.