Robotic Arm 3 dof (RRR type)

Here is one of many “leave it for later” projects that I always wanted to finish. Although it is not finished, I will show you what I have so far.

Of course, it is a very easy project! But at the same time it is very gratifying, in line with the philosophy of this blog: Fast & Simple Projects.

The required material:

-> 1 microcontroller-based system (embedded type or not!).

-> 2 H-bridge drivers (L298N, for instance).

-> 2 DC power supplies (battery or charger): one for feeding the microcontroller and another one for feeding the motors.

-> 4 DC motors. It could be that more than one motor per joint is needed; it all depends on the size/weight of the robot parts.

-> 5 endstop sensors.

My basic design for controlling the Robotic Arm Manipulator is shown on the drawing below:

Robot_Arm_3_dof_Basic_Diagram

Important: As shown in the drawing, the ground reference (GND) of the microcontroller and the ground reference (GND) of the H-bridge drivers have to be the same.

Note: I know that a robot arm manipulator should be controlled with rotary encoders (one per motor) in order to close the feedback control loop of the system, but this will be handled in another post.

The next step is to find the “mechanical” parts that best suit your idea or project:

-> Strong metal parts for a robust configuration (like steel); but they are very heavy, which means you’ll need big motors and more power to control the system.

-> Light metal parts (like aluminium); this is the best option, but keep in mind that aluminium parts are quite expensive.

-> Plastic parts; they’re not robust, but they’re the lightest! And the cheapest! They are also very “modular”, and for fast development projects this is the best way to build the robot system.

Yes, that’s it, I went for the plastic “mechanical” assembly option, and here is the result:

dsc_0026-e1491250366234

This picture shows the different parts of the robot:

M1 -> double DC motor that rotates the base of the robot (needs a reduction gear to gain torque).

M2 -> double DC motor that pushes the arm forwards or backwards (needs a reduction gear to gain torque).

M3 -> DC motor that moves the gripper up or down (needs a reduction gear to gain torque).

Gripper -> DC motor for opening and closing the gripper.

The “very ugly” plastic flanges on the base were a temporary solution until I found a wider gear to couple the robot’s base with the rest of the robot arm (shown in the next picture).

The reduction gear coupled to the motor’s axis is very important to gain torque and also to increase accuracy when the motor is rotating. Here is the reduction gear used for M1 (the “very ugly” plastic flanges issue is solved):

dsc_0126-e1491388769781

The other hardware installed on the robot is an endstop sensor for each joint (except M1, which has two: left & right endstop sensors). The idea is that after playing with the robot, or after a number of moves, the robot’s joints could end up in an unknown position. To reset the robot’s position, many mechatronic systems have a “home” position that is known by the system itself with the help of some external sensors (outside the joint, even if we’ve got encoders), so that when the robot reaches the “home” position we are able to reset the stored joint position values.
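As a sketch of how such a homing routine might look, here is a minimal simulated version in Python; the `read_endstop` and `step_motor` callables are hypothetical stand-ins for the real sensor and driver calls on the microcontroller:

```python
# Hedged sketch: a simulated homing routine for one joint.
# read_endstop() and step_motor() are hypothetical stand-ins for
# the real sensor/driver calls on the microcontroller.

def home_joint(read_endstop, step_motor, max_steps=10000):
    """Drive the joint toward its endstop; return the steps moved,
    or raise if the sensor is never reached (wiring fault, etc.)."""
    for steps in range(max_steps):
        if read_endstop():          # sensor pressed -> "home" reached
            return steps            # caller resets its position counter to 0
        step_motor(-1)              # move one step toward the endstop
    raise RuntimeError("endstop never reached")

# Simulated joint starting 42 steps away from home, for testing the logic:
class FakeJoint:
    def __init__(self, pos):
        self.pos = pos
    def read_endstop(self):
        return self.pos <= 0
    def step_motor(self, delta):
        self.pos += delta
```

On the real hardware the same loop would run on the microcontroller, with the step/sense calls replaced by pin reads and driver pulses.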

The basic diagram for the endstop sensors follows as:

Robot_Arm_3_dof_Basic_EndStop_Sensors

Here are some pictures showing where I put the endstop sensors in order to detect the left and right stop positions of the M1 rotation:

dsc_0096-e1491307613845dsc_0097dsc_0098

The second picture in the image set above shows the left endstop sensor before being reached by the M1 rotation, and the third picture shows the moment the left endstop sensor is reached.

The rest of the endstop sensors are quite easy to place; just think about which is the best “home” position for the robot.

Concerning the gripper, it was a little bit trickier. Here are the pictures showing the endstop sensor that detects whether the gripper is open or closed:

The next step (and the best!) is to make your computer communicate with the robot. Here is the basic idea:

Robot_Arm_3_dof_Diagram_Serial_Com

The PC application could be anything (a serial console, a C++ application, Python code using pySerial, ...) that is able to open a serial port and receive/send data. It could look like my robot’s “Service/Diagnostics User Interface”:

user_interface_Robot_Service_and_Diagnostics

The microcontroller is programmed to receive ASCII characters as commands; depending on the command, the robot performs different actions. Here is the “character → robot action” mapping that I settled on:

ASCII  Decimal  Command
A      65       Turn right M1
B      66       Turn left M1
C      67       Move forward M2
D      68       Move backwards M2
E      69       Move up M3
F      70       Move down M3
G      71       Open gripper
H      72       Close gripper
I      73       Select speed M1 “FINE”
J      74       Select speed M1 “SLOW”
K      75       Select speed M1 “MEDIUM”
L      76       Select speed M1 “FAST”
M      77       Select speed M2 “FINE”
N      78       Select speed M2 “SLOW”
O      79       Select speed M2 “MEDIUM”
P      80       Select speed M2 “FAST”
Q      81       Select speed M3 “FINE”
R      82       Select speed M3 “SLOW”
S      83       Select speed M3 “MEDIUM”
T      84       Select speed M3 “FAST”
U      85       Select speed GRP “FINE”
V      86       Select speed GRP “SLOW”
W      87       Select speed GRP “MEDIUM”
X      88       Select speed GRP “FAST”

Note: Make sure that, before opening the serial port with your software, the microcontroller clock (from the internal oscillator, an external crystal oscillator, ...) is stable; otherwise you will get a lot of communication errors.
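To give an idea of the PC side, here is a minimal Python sketch of the protocol above. The action names and the `send_command` helper are my own inventions; only the ASCII codes come from the table, and the port object is assumed to be something like pySerial’s `serial.Serial`:

```python
# Hedged sketch of the PC side of the single-character protocol.
# Action names are mine; the ASCII codes follow the command table.
# The speed-select commands (I..X) follow the same one-byte pattern.
COMMANDS = {
    "M1_RIGHT": "A", "M1_LEFT": "B",
    "M2_FORWARD": "C", "M2_BACKWARD": "D",
    "M3_UP": "E", "M3_DOWN": "F",
    "GRIPPER_OPEN": "G", "GRIPPER_CLOSE": "H",
}

def encode(action):
    """Return the single ASCII byte the microcontroller expects."""
    return COMMANDS[action].encode("ascii")

def send_command(port, action):
    """Write one command byte. `port` would be e.g. a pySerial object:
    serial.Serial('/dev/ttyUSB0', 9600) -- not opened here."""
    port.write(encode(action))
```

The same dictionary can back a GUI like the Service/Diagnostics interface: each button simply calls `send_command` with its action name.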

And … It worked! (on Monday 02:00 A.M…)

dsc_0115-e1491253470611

Soon I will post some videos of the robot.

Electrical Pet (just a break for robotics…)

This project is not about robotics, but it is great fun and fast to do. The material needed:

  1. Liquid Crystal Display (LCD); here a 20×4 LCD with I2C communication is used.
  2. A microcontroller.
  3. At least 3 buttons to interact with the electrical pet.
  4. Batteries; here a 5 V @ 2000 mAh power bank is used.

The next step is to design the behaviour of the pet, and then fit the whole design into a state machine. For example, in this project, first the pet is inside the egg; after a few seconds the egg breaks and the pet finally comes out as a baby electrical pet, following this flowchart:

0_1_Electrotama_modified

The blue bubbles in the previous flowchart are states of the state machine. The condition for the transition from one state to the next is, in this case, only time. But in the next steps there could be other conditions, like health, the pet’s level of happiness, and whatever else you think would be nice to consider.
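The time-driven transitions described above can be sketched as a tiny state machine; the state names and delays below are my own illustrative choices, not the exact ones from the flowchart:

```python
# Hedged sketch of the time-driven part of the pet's state machine:
# egg -> cracked egg -> baby, each transition after a fixed delay.
# State names and delays are illustrative assumptions.

TRANSITIONS = {              # state -> (seconds to wait, next state)
    "EGG":         (5, "EGG_CRACKED"),
    "EGG_CRACKED": (3, "BABY"),
}

class Pet:
    def __init__(self):
        self.state = "EGG"
        self.entered = 0.0   # timestamp when the current state began

    def update(self, now):
        """Advance to the next state once enough time has elapsed.
        `now` would come from millis()/time.monotonic() on the device."""
        if self.state in TRANSITIONS:
            delay, nxt = TRANSITIONS[self.state]
            if now - self.entered >= delay:
                self.state = nxt
                self.entered = now
        return self.state
```

Extra conditions (health, happiness) would simply become extra fields checked alongside the elapsed time in `update`.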

For example, the next states of the tama follow the same pattern, and you will be able to access a menu in order to feed, play, study or even perform a reset:


1_to_next_tama_modified

And the final assembly has been done with a little box:

DSC_0489_modifiedDSC_0490_modified

The basic functionality of this standalone electrical pet can be seen in some videos posted on Instagram: @sasha_star_dust.


Monster Robot (not reached!)

The robot shown in this project was designed never to stop feeding itself. During the day, when there is light, the flower absorbs all the sunlight available in the environment and always searches for the spot in the room with the most light. On the other hand, when there is no light available, the robot has to feed another way. During the night the robot rests peacefully, showing a blue LED bulb, and waits until something or someone comes close to the bulb. At that moment the robot hits the detected presence in order to kill it, and then eats it.

So this Robot never sleeps…

First, I tried to visualize the robot’s best appearance by sketching the model on a sheet of paper. This was the result (I guess it should have remained on a sheet of paper…):

flower_monster_concept_modified.jpg

By this stage of the process I had run out of my favorite motors, the ones I have been using since the first project on this blog. So I gave a try to the “very cheap” little motors that I bought a long time ago and had not used yet.

Once I had built the main structure of the robot, I tried to install the minimum wiring just to move the robot with the little motors. And… of course, it did not work!

Flower_Monster_Little_Motors_for_nothing_cropped_modified

Again and again… the same lesson for me: “first think, then do!”

Until I get my favorite motors to replace the previous ones, I will post some of the flowcharts that show the robot’s different behaviours and how they switch.

First, the Mode Selector process is a very simple piece of code that uses the light detector (photoresistor) as the switching criterion between Night Mode and Day Mode:

Flower_Monster_Mode_Selector_modified
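A minimal sketch of this selector in Python; the ADC threshold values and the hysteresis band are my own assumptions (a little hysteresis keeps the robot from flickering between modes at dusk):

```python
# Hedged sketch of the Mode Selector: a photoresistor reading is
# compared against thresholds with a small hysteresis band.
# Threshold values are illustrative assumptions.
DAY_THRESHOLD = 600    # raw ADC value at or above which it is "day"
NIGHT_THRESHOLD = 400  # at or below which it is "night"

def select_mode(light, current_mode):
    """Return "DAY" or "NIGHT"; inside the dead band, keep the mode."""
    if light >= DAY_THRESHOLD:
        return "DAY"
    if light <= NIGHT_THRESHOLD:
        return "NIGHT"
    return current_mode
```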

Then, the different robot modes (or behaviours) are established:

Day Mode: The robot is always searching for the spot in the room with the most light, and at the same time it avoids any obstacles in the room:

Flower_Monster_Day_Mode_modified

Night Mode: The robot waits for someone or something to bite the bait while showing a blue light bulb. Once the prey approaches the blue light bulb and touches it, a hammer-like arm hits it. Otherwise, if no prey approaches, the robot goes for a walk in order to change the hunting/fishing spot:

Flower_Monster_Night_Mode_modified

All the code for programming the robot was already written, but another project came along and I had to leave this one. At least the flowcharts remain…

Drawing Robot (SCARA type)

Here is another project: a SCARA robot that is going to draw whatever you command (with limitations).

The basic structure of a SCARA-type robot is a 2-dof planar robot (with height) plus a prismatic joint on the same axis as the end effector. The end effector of this robot is going to be a pen or a pencil, so that it can draw on paper. Here is the robot’s assembly:

SCARA_ROBOT_STRUCTURE_modified

The picture below shows the robot from above. All the endstop sensors needed to reset the motor encoders when the sensors are reached are now installed (the same idea as in the previous robot projects):

DSC_0229_modified

The endstop sensor for the end effector (EF) is placed where the prismatic joint reaches its maximum position:

DSC_0224_cropped_modified

And when it’s time to define the home position of the robot, just think of the position in which the robot’s structure (and its joints) suffers the least.


Now that the whole structure is assembled and all the wiring is done, it is necessary to create a coordinate system as a reference, so the robot knows every position that we send to it as a setpoint.

Drawing_Robot_coordinate_System_modified

We can start by defining a World origin (displayed in blue), which can be located wherever you want. The origin of the other coordinate system is placed at the base of the SCARA robot, and it’s called the Robot’s origin (displayed in green). There are several ways to reference the robot’s position:

A). Robot’s origin: All the position coordinates start at the base of the robot. The inverse kinematics of the robot gives the position of the end effector (EF), which is located through the vector P_re. It works fine when there is only one element/robot working within the space.

B). World’s origin: Everything starts at the World origin point that you have chosen. You will need the relation between the World’s origin and the Robot’s origin; this relation is given by the vector P_wr. Then (after computing the inverse kinematics of the robot, which gives the EF position) we find the vector P_re, and finally we are able to reference the EF position within the World’s coordinate system. This is the best solution if there are other elements (obstacles, other working robots) within the robot’s workcell.
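As an illustration of option B, here is a sketch of the 2-link planar inverse kinematics together with the World-to-Robot change of origin via P_wr. The link lengths and the elbow-down solution are my own assumptions, and the two frames are assumed parallel (translation only, no rotation between them):

```python
# Hedged sketch: 2-link planar inverse kinematics for the SCARA arm,
# plus the World->Robot change of origin via the vector P_wr.
# Link lengths L1, L2 and the elbow-down choice are assumptions.
import math

L1, L2 = 10.0, 8.0   # link lengths (same units as the coordinates)

def ik(x, y):
    """Joint angles (radians) placing the end effector at (x, y)
    in the Robot's coordinate system (i.e. the vector P_re)."""
    d2 = x * x + y * y
    c2 = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)   # law of cosines
    if abs(c2) > 1:
        raise ValueError("target out of reach")
    q2 = math.acos(c2)                               # elbow-down solution
    q1 = math.atan2(y, x) - math.atan2(L2 * math.sin(q2),
                                       L1 + L2 * math.cos(q2))
    return q1, q2

def world_to_robot(p_we, p_wr):
    """P_re = P_we - P_wr: re-express a World-frame point in the
    Robot's frame (frames assumed parallel, translation only)."""
    return (p_we[0] - p_wr[0], p_we[1] - p_wr[1])
```

A quick check of a solution is the forward-kinematics round trip: `L1*cos(q1) + L2*cos(q1+q2)` and the matching sine expression should reproduce the target (x, y).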

Once the coordinate systems are settled, there is another important thing to consider. Usually we want the robot to reach a single position, given as a setpoint. But when we want the robot to draw something (letters, for instance, or whatever), things get a little bit more complicated, because there is a set of points (positions) for every letter.

Also, the positions where the letters are going to be drawn need to be referenced in the World’s coordinate system. And once the letters’ positions are defined, it is necessary to split each letter into several points in order to perform the drawing as accurately as possible. The more points per letter, the more computing time.

Drawing_Robot_letter_point_to_point_modified
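A sketch of that splitting step: each letter stroke is a list of waypoints, and every segment between consecutive waypoints is subdivided by linear interpolation. The “L”-shaped stroke below is illustrative, not the robot’s real letter data:

```python
# Hedged sketch: splitting a letter stroke into intermediate points
# for a point-to-point trajectory. More points per segment means a
# closer approximation of the line, at the cost of computing time.

def interpolate(stroke, points_per_segment=4):
    """Linear interpolation between consecutive waypoints of a stroke."""
    out = [stroke[0]]
    for (x0, y0), (x1, y1) in zip(stroke, stroke[1:]):
        for i in range(1, points_per_segment + 1):
            t = i / points_per_segment
            out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

# Illustrative stroke: an "L" drawn top -> corner -> right.
LETTER_L = [(0.0, 10.0), (0.0, 0.0), (5.0, 0.0)]
```

Each generated point would then be passed through the inverse kinematics to obtain the joint setpoints for that instant.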

Here is a picture explaining how the set of points per letter is stored as data, and then read in order to perform the robot’s point-to-point trajectory:

Drawing_Robot_Compute_point_to_point_modified

 Pending video…

Zashiki Karakuri’s Replica (Tea-Service Robot)

Here is a Zashiki Karakuri robot project. The real ones date back to the Edo period (1603-1868), and they were made as tea-service robots. Their main operating sequence was:

Zashiki_Karakuri_Real_FlowChart

The working sequence configured for the robot replica in this project will be a little bit different; it will be shown later.

Before that, let me show you the basic parts of the robot’s replica:

Zashiki_Karakuri_Parts

Now it’s time to install the wiring!

First the Motor Drivers + the Batteries:

mounting_motor_drivers_battery

Installing the endstop sensor as a Teacup detector:

DSC_0153_modified

Then all the connections:

DSC_0157_modified

Checking that the arm’s mechanism keeps the tea tray horizontal while the robot moves it up and down:

         DSC_0152_modified     DSC_0151_modified

A last modification was needed because of the “poor” stability during the robot’s movement. Now the robot has two little caster wheels:

Here is displayed the working sequence configured for the zashiki karakuri’s replica (when it works…):

 Zashiki_Karakuri_Basic_FlowChart

And here is the flowchart of the main program for when the robot has its “day off” (no tea service)… I mean, an autonomous navigation mode: just avoiding obstacles by changing speed and direction depending on what the robot finds around it:

Zashiki_Karakuri_Autonomous_FlowChart
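The decision rule behind that mode might be sketched like this; the distance thresholds and the returned actions are my own assumptions, not the values in the flowchart:

```python
# Hedged sketch of the "day off" navigation rule: pick a motion and
# a speed based on how close the nearest obstacle is.
# Thresholds (cm) and action names are illustrative assumptions.

def avoid(distance_cm):
    if distance_cm < 10:
        return ("REVERSE", "FAST")    # almost touching: back off
    if distance_cm < 30:
        return ("TURN", "SLOW")       # obstacle ahead: steer away
    return ("FORWARD", "MEDIUM")      # clear path: cruise
```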

Some videos of this robot can be seen on Instagram: @sasha_star_dust