Wednesday, 25 December 2013

H-bridge simulation in NI Multisim

The H-bridge configuration has many applications in motor control. Generally, a motor can be switched ON or OFF (unidirectional rotation) as needed, and its direction of rotation depends on the polarity of the supply. But in fields such as robotics and medical applications, a single motor must be able to rotate in both directions (clockwise and anticlockwise). For this purpose, an H-bridge configuration is preferred. As the name indicates, the circuit is shaped like the letter "H". The motor is made to rotate in either direction by reversing its polarity.

Look at the following connections.
Here, the direction of rotation of the motor changes with its polarity. A simulation of the above schematic is shown below,

A combination of these two schematics gives an H-bridge circuit, and the switching between polarities is done using transistors (electronic switches), as shown below,

From the above circuit, it can be seen that when switches SW1 and SW3 are closed, the motor is driven with one polarity, and when switches SW2 and SW4 are closed while the other switches are open, the motor is driven with the opposite polarity. This is how an H-bridge circuit works, with transistors used as the switches. Shown below is the simulation of the H-bridge circuit in NI Multisim,

Here, when the gate is driven, the MOSFET turns ON and forms a closed circuit, and the motor receives the polarity as described above.
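The switching logic can be written down as a small truth table. Below is an illustrative Python sketch of it (the pairing of switches into bridge legs is my assumption from the usual H-bridge layout; your schematic's numbering may differ):

```python
def motor_polarity(sw1, sw2, sw3, sw4):
    """Return the motor polarity for a given H-bridge switch state.

    True = switch closed. Returns +1 (one direction), -1 (the other),
    or 0 (motor coasts). Assumes SW1/SW2 form one leg of the bridge
    and SW3/SW4 the other, so closing both switches of a leg would
    short the supply (shoot-through) and is rejected.
    """
    if (sw1 and sw2) or (sw3 and sw4):
        raise ValueError("shoot-through: supply shorted")
    if sw1 and sw3:
        return +1   # diagonal pair SW1/SW3 closed: one polarity
    if sw2 and sw4:
        return -1   # diagonal pair SW2/SW4 closed: opposite polarity
    return 0        # no complete path through the motor
```

The shoot-through check is the important part in practice: real H-bridge drivers add dead time between switching the two diagonal pairs for exactly this reason.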

Sunday, 8 December 2013

Making of my Autonomous Vehicle

I built an autonomous vehicle as part of my final-year project, sponsored by Analog Devices, Inc. The vehicle can move from one point to another based on the GPS coordinates of both points, and it also plots the path it travels in Google Maps. To avoid obstacles in its path, the vehicle uses a grid of sensors that includes a laptop-based radar, a camera, and proximity sensors. Following are a few snapshots of the prototype that was built (the camera is not shown in the pictures),

Front view

Side view

Here are some videos of the making of my autonomous vehicle,

Initial obstacle avoidance algorithm testing 1 - outdoor

Initial obstacle avoidance algorithm testing 2 - outdoor

Optimized algorithm testing 1 - outdoor

Optimized algorithm testing 2 - indoor

Testing after changing vehicle dynamics 1 (vehicle turning radius test)

Testing after changing vehicle dynamics 2 (vehicle turning radius test)

Testing after changing vehicle dynamics 3 + Acquiring GPS data of start and end points

Image processing algorithm testing (webcam + MATLAB Simulink)

Plotted GPS data in Google Earth (while the vehicle is moving)

Plotted GPS data in Google Earth (while the vehicle is stationary)

Preliminary model built to test the algorithms 

Friday, 6 December 2013

Heart rate measurement in LabVIEW

This is one of my projects, in which I designed a transducer that measures the variations in oxygenated and deoxygenated hemoglobin (simply known as an oximeter). The sensor was developed on the key principle that these two parameters have different optical spectra in the range of 500 nm to 1000 nm. Hence, two light sources of different wavelengths (red, 660 nm, and infrared, 940 nm) are used. A photodiode senses the absorption, and its output was read into LabVIEW using a data acquisition card (NI 6211). The photoplethysmograph (PPG) waveform was obtained in LabVIEW by driving the LEDs with PWM signals at 25% duty cycle. From the PPG waveform, the heart rate (beats per minute) was found.
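The last step, turning the PPG waveform into a BPM value, amounts to detecting the beat peaks and averaging the peak-to-peak intervals. Here is a rough Python sketch of that idea (this is not my LabVIEW block diagram; the sampling rate and peak threshold are assumed inputs):

```python
def bpm_from_ppg(samples, fs, threshold):
    """Estimate beats per minute from a PPG trace.

    samples:   sequence of PPG amplitude values
    fs:        sampling rate in Hz (assumed known)
    threshold: amplitude above which a sample may be a beat peak
    """
    # A peak is a sample above threshold that is larger than its neighbours.
    peaks = [i for i in range(1, len(samples) - 1)
             if samples[i] > threshold
             and samples[i] > samples[i - 1]
             and samples[i] >= samples[i + 1]]
    if len(peaks) < 2:
        return 0.0
    # Average peak-to-peak interval in seconds -> beats per minute.
    intervals = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(intervals) / len(intervals))
```

With a noisy sensor like mine, some smoothing or a refractory window between peaks would be needed before this estimate stops fluctuating.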
This is a video of the PPG waveform obtained in LabVIEW. Since ordinary sensors were used, there was a lot of noise interference in the reading, and the BPM fluctuated considerably.

(the PPG waveform in the video is a recorded measurement file of the actual experiment)
This is the PWM pulse generated from LabVIEW to drive the LEDs. The LEDs are switched alternately at the same duty cycle.

Shown below is the transducing circuit, which I developed in my college laboratory,

“Simulation” is always a good idea before the actual execution of a project. I simulated the transducing circuit in NI Multisim, and it helped me a lot in the actual implementation. Following is a video of the simulation.

I used a current source since the output of a photodiode is a current, and an I-V converter to convert it into a voltage (since the input to the NI 6211 must be a voltage). I also added a sample-and-hold circuit to hold one LED's value while the other is being sampled, and vice versa. The manual switch in the simulation was replaced with a control signal from LabVIEW.

Though the acquired results had some flaws, obtaining a biological parameter was really awesome!

Thursday, 5 December 2013

The Free Radicals

The human body consists of countless feedback loops, from a single cell up to the major organs. Alteration to any of these cells, vessels or organs causes the regular biofeedback loop to collapse and, in some cases, may even result in death. These processes often act like a butterfly effect.

One of the major issues to consider is our immune system and free radicals. Our body's defense system releases free radicals (a by-product of the metabolic process of oxidation) to fight viruses and bacteria. But the excess free radicals produced by pollution, smoking and stress, along with the body's own unterminated free-radical chains, tend to steal electrons (in order to become paired) from tissue throughout the body. This chain goes on until all the free radicals are bonded.

This is a root cause of much heart disease, artery blockage and cancer, since these free radicals are especially fond of electrons in the regions of the heart and brain. Unfortunately, no effective drugs have been developed so far to address this issue. And since free radicals are essential, they cannot be eliminated entirely either. But the excess free radicals that bond with oxygen to form oxygen free radicals can be reversed or neutralized with proper intake of glutathione peroxidase and antioxidants.

One impact of these excess free radicals is their tendency to steal electrons from DNA. If this succeeds, the DNA is subject to mutation and, in the case of a pregnant woman, it could even damage the fetus. Thus, effective control of free radicals should be considered a preventive step toward good health. To do so, keep away from stress and pollution, and consume antioxidant-rich foods. Vitamins C and E and glutathione peroxidase can reverse oxygen free radicals into pure oxygen and also prevent excess free radicals and heart blockage. Hope this information helps!

(I wrote this article for my college magazine during my third year of Engineering. It reflects my own perception of free radicals and body chemistry; I am not an expert in medicine or biochemistry.)

Wednesday, 4 December 2013

Increasing the voltage levels of PWM signals

PWM signals are often used in robotics to control DC motor speed and to drive servo motors. Most microcontrollers have an output voltage of 5 V. When PWM signals are generated from these controllers, the average output voltage even at high duty cycles will be only around 3.3 to 4.2 V (approximately). This voltage level cannot drive a 5 V or 12 V motor efficiently. Hence, the voltage level must be boosted with appropriate external circuits.
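The average value of an ideal PWM signal is simply the duty cycle times the logic-high voltage, which is a quick way to check such numbers. A minimal sketch (the 5 V level matches the text; the duty-cycle figures are just examples):

```python
def pwm_average_voltage(v_high, duty_cycle):
    """Average DC value of an ideal PWM signal swinging between 0 V and v_high."""
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be between 0 and 1")
    return v_high * duty_cycle

# Example: a 5 V PWM pin at 70% duty averages 3.5 V, which is in the
# 3.3-4.2 V range mentioned above and well short of what a 12 V motor needs.
v_avg = pwm_average_voltage(5.0, 0.7)
```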

Following is a simulation that demonstrates how to increase the voltage level of PWM signals generated from a microcontroller. Here, I used the CCP module of a PIC microcontroller to generate the PWM pulses. The DC motor is driven by a MOSFET based on the PWM signals.

In the video, you can see the voltage level from the microcontroller and the boosted voltage on the oscilloscope. Note that only the amplitude is increased; the duty cycle remains unchanged.
The voltage increase can also be seen by connecting a voltmeter across the motor terminals, as shown below,

Sunday, 10 November 2013

Image processing based robot using MATLAB and Simulation using Proteus ISIS

Build a robot using MATLAB

For beginners attempting to make an image-processing-based robot, here are a few steps to guide you through the process. Remember that a robot sees only what the programmer wants it to see, using a camera as its sensor along with suitable image processing algorithms. This usually involves heavy computation, so an ordinary microcontroller alone would not be enough. So let us use MATLAB for processing the images and an ordinary microcontroller to execute the commands from MATLAB.
As I always follow the KISS principle (Keep It Simple, Stupid), I will first explain how to control the motors from MATLAB using serial communication (RS-232). In the explanation, I used the basic 8051 microcontroller to execute the commands from MATLAB (you can use any controller of your choice). To reduce the time spent setting up the hardware assembly of motors and microcontroller, I interfaced Proteus ISIS with MATLAB (since our aim is only to check whether the correct control commands are sent to the controller by the image processing algorithm). The following video shows how the commands from MATLAB are received by the controller via the serial COM port, which in turn controls the motor actions,

And now we know how to control a motor from MATLAB. All we have to do is develop a suitable image processing algorithm for our needs and call the motor command function ‘fwrite()’ at the appropriate places.

Image Processing:
Let us consider an image as an MxN matrix. Processing an image is then nothing but manipulating the values in that MxN matrix as needed. The entire picture that can be seen from a camera is the camera's field of view (FOV), and the region in which the processing has to be done (or a feature extracted) is known as the region of interest (ROI). There are plenty of resources on the internet for learning image processing, but after gaining some basic knowledge, please do try to develop your own algorithm, or combine several algorithms and check the output (I am sure it will be more fun than just implementing an existing one).
In the following video, I used the lane-detection sample video from MATLAB and applied global thresholding to it. I marked and extracted the ROI from the entire FOV of the video. Let us assume this ROI is a few meters ahead of the robot, where obstacles have to be detected. The threshold plot from the label matrix reveals the obstacles on the road (search Google for "label matrix" and "thresholding"). The pink line indicates the preset threshold value for obstacles; when the real-time value exceeds the preset value, it can be taken as an indication of an obstacle, and the appropriate motors can be activated (this is where you use the motor commands). Here, white is considered an obstacle and black indicates an obstacle-free area.
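The core of that step — extract the ROI from the frame, apply a global threshold, and flag an obstacle when too many pixels exceed it — can be sketched in a few lines. This is an illustrative plain-Python version, not the MATLAB/Simulink model used in the video, and the ROI bounds and threshold values are made-up examples:

```python
def obstacle_in_roi(frame, row_range, col_range, threshold, max_fraction):
    """Return True if the ROI of a grayscale frame looks like it holds an obstacle.

    frame:                 list of rows (the MxN image matrix), pixel values 0-255
    row_range, col_range:  (start, stop) pairs marking the ROI inside the FOV
    threshold:             global intensity threshold (here, white = obstacle)
    max_fraction:          preset fraction of bright pixels tolerated before alarming
    """
    bright = total = 0
    for r in range(*row_range):          # walk only the ROI, not the full FOV
        for c in range(*col_range):
            total += 1
            if frame[r][c] > threshold:  # global thresholding
                bright += 1
    return bright / total > max_fraction

# Example: a dark 120x160 frame with a bright patch inside the ROI.
frame = [[0] * 160 for _ in range(120)]
for r in range(60, 80):
    for c in range(40, 60):
        frame[r][c] = 255                # simulated white obstacle
```

In practice you would call this once per captured frame, and the `max_fraction` parameter plays the role of the pink preset-threshold line in the video.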

To determine the direction of the obstacle, multiple ROIs can be used (a left and a right ROI). Then, based on the direction of the obstacle, activate the respective motors. The following MATLAB commands will capture images from the selected camera,
vid = videoinput('winvideo', 1, 'YUY2_160x120');
while true
    img = getsnapshot(vid);    % grab one frame from the camera
    % Do all image processing and analysis here
end

‘img’ is the image matrix, and you can apply all your algorithms to it inside the while loop. As usual, when aiming for higher accuracy, you may need to focus more on the computation and on a better control loop for the motors (PID control is used in most cases).

Snapshots of the videos are given below,

Friday, 19 July 2013

Getting started with an Autonomous Vehicle

Every technology can be understood and recreated by anyone once its fundamental operation is clear. One such technology I am going to deal with is the autonomous vehicle/robot. Most beginners think (as I did when I was a beginner) that only brilliant engineers and scientists can develop a robot or vehicle that moves autonomously (like this). Well, that is true if one expects 99% accuracy in the vehicle's operation. But with more modest accuracy and good knowledge of vehicle control mechanisms, anyone can build their own autonomous vehicle. So here I will explain how to make your own simple autonomous vehicle.
Let us first consider an ordinary vehicle. Its control mechanism involves increasing or decreasing the vehicle's speed, braking and steering. The steering and motion are normally controlled by the driver, who makes decisions based on the vehicle's environment; in an autonomous vehicle there is no user interaction, except for marking the destination point the vehicle must reach. Therefore, suitable sensors must replace the human driver.
The following pictures depict the functioning of an ordinary vehicle with a human driver and an autonomous vehicle with sensors and a processor.

Now let's start to build an autonomous vehicle. From the above two pictures, it is clear that the designer of an autonomous vehicle must decide how to sense the obstacles around the vehicle and how to move it from the source to the destination point while choosing an appropriate path.
The following flowchart shows the simple open loop architecture of an autonomous vehicle,

To keep it simple, I am considering an autonomous vehicle with the following specifications,
·         Obstacle detection using a proximity sensor
·         Navigation using GPS
·         Battery-powered vehicle (controlled using PWM and relay drives)
·         Differential drive (skid steer)

Of course, if you are familiar with image/signal processing, you can use a camera or LIDAR/radar for obstacle sensing. Since this explanation is for beginners, I am not considering road-rule detection or path planning, as they make things more complicated.

For clarity, the vehicle construction is explained in four steps as follows,

1.Vehicle control mechanism:
Before designing the vehicle control mechanism, one must know the maximum payload of the vehicle, based on which the components can be selected.
There are two ways by which the vehicle can be controlled,
1.      Fixed speed control
2.      Variable speed control
In fixed speed control, the vehicle can either be moved (at a constant speed) or stopped. A simple relay-logic controller is enough for this type of control. You can build your own relay-based controller for your vehicle, or buy a relay shield to control the vehicle's motors from the microcontroller. Shown below are a relay-based motor shield and an Arduino MCU.

Check the motor driver's datasheet to make sure the current rating of the motors falls within the driver's range, or you will end up frying the driver with continuous use. The following picture shows a sample connection between the relay driver and the microcontroller,

A logic sequence can be sent from the microcontroller to the inputs of the relay driver to switch the motors ON/OFF as well as to control their direction of rotation.
For example, consider Arduino pins 10, 11, 12 and 13 as inputs to the motor driver,
10 - LOW
11 - HIGH
12 - HIGH
13 - LOW
With the above sequence, you switch the relays associated with the corresponding Arduino pins. Likewise, you can try different input sequences for forward and reverse motion.
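Such sequences are really just lookups from a desired motion to a set of pin states. A tiny illustrative table in Python (the pin numbers follow the example above, but which pattern produces which motion depends entirely on your wiring, so treat these entries as placeholders):

```python
# HIGH/LOW written as 1/0; keys are the Arduino pin numbers from the example.
# The motion names are hypothetical: verify them against your own relay wiring.
MOTIONS = {
    "forward": {10: 0, 11: 1, 12: 1, 13: 0},   # the sequence from the text
    "reverse": {10: 1, 11: 0, 12: 0, 13: 1},   # every relay inverted
    "stop":    {10: 0, 11: 0, 12: 0, 13: 0},   # all relays open
}

def pin_states(motion):
    """Return the pin -> logic-level map to send to the relay driver."""
    return MOTIONS[motion]
```

On the Arduino side, the equivalent is a digitalWrite() per pin using the chosen pattern.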

In variable speed control, MOSFETs can be used to control the motors by passing PWM signals from the microcontroller. The two types of MOSFETs available are TTL-driven and CMOS-driven. The former can be interfaced directly with a processor/microcontroller, since most MCUs provide TTL output. Shown below is the motor control circuit using a PWM signal from an Arduino with a TTL-driven MOSFET,
Connect the gate of the MOSFET to a PWM pin of the Arduino. By varying the duty cycle of that pin, you control the MOSFET's gate, which in turn controls the motor's speed.

However, this configuration works fine for only a few trials and causes permanent damage to the transistor with prolonged use. The reason is that the transistor stays in the linear region for a long time before it reaches saturation. Thus the MOSFET never operates at its optimum point, which leads to overheating, and you will end up frying the MOSFET (as I did) with further use.

For better understanding, the linear and saturation regions of a transistor can be compared to a human sprinting and jogging, respectively. One cannot sprint for long without running out of energy, but one can jog for a long time. That is why the MOSFET gets damaged by a direct TTL output. To avoid this problem, a buffer can be added between the TTL output and the gate to improve the MOSFET's switching performance by decreasing the rise and fall times (which provides faster current sourcing and sinking). The TTL output with a buffer is shown below,

If you are using a CMOS-driven MOSFET, the TTL output (0-5 V) from an MCU cannot be used directly to switch the gate ON/OFF, since a voltage higher than 5 V is required to turn the gate ON. In that case, the circuit shown below can be used,

Here, the optocoupler and the diode are used for protection. Resistor R3 (the resistor connected to ground) assists the transistor's turn-OFF. Resistor R4 (the resistor connected to the third terminal of the optocoupler) must be low in order to avoid positive feedback (which creates oscillations) and to obtain a better transistor switching speed. Its value can be calculated using the formula,
R = rise or fall time / (2.2 × Ciss)
Ciss is the sum of the gate-to-drain and gate-to-source capacitances (this value can be obtained from the MOSFET datasheet). Also keep in mind that, in the CMOS configuration, Vgate must be about 5 V higher than Vcc to prevent damage to the MOSFET.
Multiple motors can be controlled with the same circuit, as shown below,

So now the first part of the autonomous vehicle construction is complete. We can manually control the vehicle's acceleration, deceleration and braking from the MCU.

2.Obstacle detection:
As I mentioned already, there are several ways of detecting obstacles in robotics. To keep it as simple as possible for beginners, I consider an ultrasonic proximity sensor (for example, the HC-SR04). You can Google the Arduino interface code for the sensor module; the connection is shown below,

The obstacle-sensing range can be set as you wish, and the sensor can be mounted on a servo motor to get a 180-degree field of view. Don't forget to use a pull-up resistor to avoid garbage values in the sensor readings. Also provide enough delay time in the servo sweep so that the echo pulse can reach the receiver completely.
Now we have an obstacle-sensing mechanism, and we already know how to control the vehicle manually. All we have to do is combine the two.

3. Manipulating the vehicle based on detected obstacles:
Let's consider that the maximum range of the sensor is 4 meters, and anything within 2 meters is considered an obstacle. The following algorithm can be used to control the vehicle based on the sensor reading,

function name() {
            read sensor value;
            if (value > 2 meters) {
                        move forward;
            } else {
                        activate servo and scan for obstacles on the left and right of the vehicle;
                        if (more space on the left than on the right)
                                    move left;
                        else
                                    move right;
            }
} // continue loop

This algorithm detects obstacles and steers the vehicle based on the sensor values. The resulting vehicle is an obstacle-avoiding vehicle (with no goal, i.e., it moves randomly while avoiding obstacles).

Go to this link to see the control of DC motors based on obstacles sensed by an IR proximity sensor, and also check this link, in which the sensor is mounted on a servo motor and the above algorithm is implemented.

4. GPS based navigation:
The components required for this purpose are a GPS receiver/shield and a compass. Using the Haversine formula, the distance from the source to the destination point can be calculated. Since the distance is calculated along a straight line, waypoints must be declared when the route to the destination is not a straight line.
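The Haversine step can be sketched directly from the standard formula. An illustrative Python version (the Earth-radius constant is the usual 6371 km mean value):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes (degrees)."""
    R = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # Haversine formula: a is the square of half the chord length.
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))
```

One degree of longitude at the equator comes out to roughly 111 km, which is a handy sanity check for the implementation.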
Shown below is the schematic of the Arduino GPS shield. The TinyGPS library can be uploaded to the Arduino MCU to parse the NMEA strings; make sure the jumpers are placed in the right position.
Click here to learn more about Arduino and GPS

Shown below is the compass interfaced with the Arduino MCU,

Using the compass, determine the vehicle's heading with respect to the destination point and turn the vehicle until the two values are equal (use atan2() to get the bearing to the destination from the GPS coordinates). Then move the vehicle the calculated distance (measuring the distance traveled with an encoder) and "you will reach your destination point" (with a lot of error, obviously). While moving from the source to the destination, the obstacle detection and avoidance function acts as an interrupt.
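The atan2() bearing calculation can be sketched as follows (an illustrative Python version of the standard initial-bearing formula, not code from the actual vehicle):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlmb = math.radians(lon2 - lon1)
    y = math.sin(dlmb) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb))
    # atan2 gives -180..180; normalize to a 0..360 compass bearing.
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
```

The vehicle then turns until the compass heading matches this bearing before driving the Haversine distance toward the waypoint.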
I hope you now have an idea of how to build an autonomous vehicle. However, to increase the accuracy, you will have to add many feedback loops and control algorithms to the system (this is a basic model, after all).
For further clarifications, feel free to contact me at and