
Rajath Umesh Joshi
+91-8951241680
“Athreya”, Door # 219, 1st Floor, 3rd Main, 7th Cross,
Ramanjaneyanagar, Chikkallasandra, Bangalore - 560061,
Karnataka, India
[email protected]

OBJECTIVE: To augment my knowledge and qualifications at an institution of global repute and to make valuable contributions in the field of science and technology.

PROJECTS COMPLETED:

Project 1: “A New Approach for an Intelligent Swarm Robotic System” (Kindly refer Annexure A)
Project 2: “Drone (Quadcopter)” (Kindly refer Annexure B)
Project 3: “Driver’s Navigation Assistance System” (Kindly refer Annexure C)
Project 4: “Gas Pipeline Monitoring using Wireless Sensor Networks” (Kindly refer Annexure D)
Project 5: “Humanoid Robot with Speech Recognition” (Kindly refer Annexure E)
Project 6: “Ultra Glove for the Visually Impaired” (Kindly refer Annexure F)
Project 7: “Self-Balancing Robot” (Kindly refer Annexure G)
Project 8: “GPS Guided Autonomous Robot” (Kindly refer Annexure H)
Project 9: “Mobile Controlled Robot” (Kindly refer Annexure I)


NOTABLE ACHIEVEMENTS:

I won the 1st prize in a robotics competition organised by the IEEE.
My final year project won the Best Project of the Year award.
I won the 2nd prize in a robotics competition organised by IIT Bombay.
I won the 3rd prize in a robotics competition organised by IISc (Indian Institute of Science).
I won the 1st prize in a robotics competition held at NMIT.
I won the 2nd prize in a robotics competition held at Christ Engineering College.
I published a paper titled “A New Approach for an Intelligent Swarm Robotic System” in IJRET (International Journal of Research in Engineering and Technology).

ACADEMICS:

Education | Institution | University/Board | Year | Percentage
Bachelor of Engineering (Electronics), 4th Year | Dayananda Sagar College of Engineering | Visvesvaraya Technological University, Belgaum | 2015 | 72.46% (Sem 7: 68.66, Sem 8: 76.26)
Bachelor of Engineering (Electronics), 3rd Year | Dayananda Sagar College of Engineering | Visvesvaraya Technological University, Belgaum | 2014 | 66.5% (Sem 5: 67.77, Sem 6: 65.22)
Bachelor of Engineering (Electronics), 2nd Year | Dayananda Sagar College of Engineering | Visvesvaraya Technological University, Belgaum | 2013 | 71.94% (Sem 3: 68.66, Sem 4: 75.22)
Bachelor of Engineering (Electronics), 1st Year | Dayananda Sagar College of Engineering | Visvesvaraya Technological University, Belgaum | 2012 | 70.96% (Sem 1: 73.29, Sem 2: 68.64)
12th Grade (PUC) | Sri Kumarans Children’s Home Pre University College, Bangalore | Karnataka Pre University | 2011 | 89%
10th Grade | Jyothi Kendriya Vidhyalaya | Karnataka State Board | 2009 | 85.76%


INTERNSHIP:

I am an intern at the Indian Institute of Science (IISc), where I am working on developing a GPS-guided autonomous UAV (Unmanned Aerial Vehicle).

COMPUTER PROFICIENCY:

Programming languages familiar with: C, C++, Java, MATLAB and Python.
Hardware description languages familiar with: VHDL, Verilog and SystemVerilog; also familiar with assembly language.
Microcontrollers and boards worked with: Arduino, MSP430, 8051, Raspberry Pi, PIC.
Simulators worked with: PSpice, TINA by TI, Xilinx ISE, Keil µVision.
EDA tool worked with: VCS by Synopsys.

CO-CURRICULAR AND EXTRA-CURRICULAR ACTIVITIES:

I have been involved in various sporting events and in co-curricular and extracurricular activities.
I organised the inter-departmental technical fest “Technico”.
I was one of the organisers of the department’s cultural fest.
I was the captain of the department’s cricket team, which won several tournaments under my captaincy.
I participated in “Walk-A-Thon 2013, Walk to Educate”, organised by the Samarthanam Trust for the Disabled.
I formed a rock band in college, in which I was the lead guitarist; we have given several performances.

LANGUAGES KNOWN:

English, Kannada and Hindi.

HOBBIES AND OTHER INTERESTS: My favourite pastimes include playing the guitar, reading Electronics for You magazine, playing and following cricket, and listening to music.

Rajath Umesh Joshi


PROJECT DETAILS

ANNEXURE A

Title: “A New Approach for an Intelligent Swarm Robotic System”. Published in IJRET (International Journal of Research in Engineering and Technology), Vol. 4, Issue 11, 2015. Paper link: http://esatjournals.net/ijret/2015v04/i11/IJRET20150411043.pdf

Introduction: Swarm robotics is a branch of multi-robot systems in which a large number of robots are coordinated in a distributed and decentralised way. The multi-robot system comprises a group of robots that exhibit cognitive and coordinated behaviour, and the collective behaviour emerges from interactions between the robots and between the robots and their environment. Swarm robotics is inspired by the swarm behaviour exhibited by ants, honey bees, fish and birds. Tasks that are highly demanding and generally cannot be accomplished by a single robot can be accomplished by a multi-robot system through swarm robotics. Swarm robotics typically faces problems such as lack of coordination among the robots, the need for regular human intervention and monitoring, and lack of precision in accomplishing the task. These problems can be avoided by adopting a new approach: a multi-robot system comprising a drone (quadcopter) and a swarm of ground robots, in which the drone activates the ground robots and monitors and controls their movement until the task is accomplished.

Description: The multi-robot system, consisting of a drone (quadcopter) and 5 ground robots, is used to lift a heavy object and move it some distance. The drone is the control unit of the system: it comprises 4 brushless motors and electronic speed controllers (ESCs) that vary the motors’ speed, with Pulse Width Modulation (PWM) signals given to the ESCs by the microcontroller. An MPU-6050, a 6-axis (accelerometer + gyroscope) module, is used to maintain stability and enhance the manoeuvrability of the drone; based on its feedback, the MSP430 microcontroller sends the PWM signals to the ESCs. The drone is also equipped with a camera and a Raspberry Pi B+ (700 MHz Broadcom processor, 512 MB RAM, 40 GPIO pins), which together carry out colour object detection and tracking through image processing. The drone can be made to stick to the ceiling or roof so that the camera captures the entire arena, comprising the ground robots and the object to be shifted.

The 5 ground robots are divided into two groups of 2 robots each, plus an individual robot. Both robots of a group are lit with LED strips of the same colour: if the robots of one group are lit with blue LEDs, those of the other group are lit with red LEDs. The 5th robot is lit with a colour different from those of the two groups. The Raspberry Pi is used extensively for image processing; the program for colour object detection and tracking is written in Python using the OpenCV library. The tracking is carried out on the camera feed: each frame is taken from the video and converted to HSV (Hue, Saturation, Value), a representation in a cylindrical coordinate system.
The image is thresholded to isolate colours, with threshold values set for each colour. The moments (weighted averages of image pixel intensities) are then determined, and from them the x and y coordinates of the centre of the object, which is of a colour different from those of the robots. The object to be lifted is easily recognised by the drone because the robots’ colours are specified well in advance in the code, and the code treats any other colour as the colour of the object; this differentiates the object from the robots. From the moments, the x and y coordinates of the centre of the object are determined by dividing the first-order moments by the area, using the OpenCV function “GetSpatialMoment”. Since the two robots of a group are lit with LED strips of the same colour and are in close proximity, the camera on the drone perceives them as a single patch of light, whose centre is calculated in the same way as the centre of the object. Once the centre of the object and the common centre of the two robots of a group are known, the distance between them is computed with the Euclidean distance formula. The robots are activated based on the colour with which they are lit, and the priority given to each colour facilitates the activation of the robots.

The colour priority is set well in advance while coding the Raspberry Pi. For instance, the priority may be set so that the group of robots lit in red is activated first, then the robots lit in blue, and finally the robot lit in yellow. Once the colour at the top of the priority list is detected, the centre of its colour patch (the common centre of the two robots) is calculated, and then the distance between this centre and the centre of the object. The calculated distance is transmitted from the drone to the robots of the group through an NRF24L01, a 2.4 GHz transmitter. Since all the ground robots carry NRF24L01 receivers, a 2-bit binary value is pre-assigned to the robots to establish proper communication between the drone and a particular group: the two robots of a group share the same 2-bit value, each group’s value differs from the other’s, and the 5th robot has a value of its own. For example, the two robots of Group 1 are assigned the value 01, the two robots of Group 2 the value 11, and the 5th robot the value 00. Along with the distance, the drone transmits the 2-bit value to ensure that the appropriate group receives it. When the robots of a group receive the signal, they compare the transmitted 2-bit value with their assigned value; if the values match, the robots travel the distance calculated by the drone, towards the object to be shifted. This is done with the help of an ultrasonic sensor (HC-SR04), which continuously measures the distance between the robot and the object in front of it.
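The moment-based centroid and the distance step described above can be made concrete. OpenCV’s moment functions return these quantities directly; the NumPy sketch below shows the same arithmetic (zeroth moment = area, first-order moments divided by area = centre) on a binary mask produced by colour thresholding. The helper names are illustrative, not the project’s actual code.

```python
import numpy as np

def patch_centre(mask):
    """Centre of a thresholded colour patch from its image moments.
    mask: 2-D array, non-zero where a pixel passed the colour threshold."""
    m00 = np.count_nonzero(mask)        # zeroth moment: area of the patch
    if m00 == 0:
        return None                     # colour not present in the frame
    ys, xs = np.nonzero(mask)
    m10, m01 = xs.sum(), ys.sum()       # first-order spatial moments
    return (m10 / m00, m01 / m00)       # centre = moments / area

def euclidean_distance(p, q):
    """Distance between the object's centre and a group's common centre."""
    return float(np.hypot(p[0] - q[0], p[1] - q[1]))
```

Because the two robots of a group appear as one patch, the same `patch_centre` call serves for both the object and a group’s common centre.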
Each robot is equipped with a digital compass (HMC5883L), which provides a sense of direction, and an MSP430 microcontroller. If a robot deviates from its course or changes direction, the microcontroller corrects the offset based on feedback from the digital compass and brings it back on course. The robots’ motors are driven by a 4-channel relay. Once the first group covers the distance, the LED strips on its two robots turn off. Detecting this, the drone activates the second group, which is next in the priority list: the centre is determined, the distance calculated, and the distance value transmitted to the robots of group 2 along with their 2-bit binary value. The robots cover the distance with the help of the HC-SR04 and HMC5883L, and their LED strips turn off once the distance is covered. The 5th robot is then activated and covers its distance in the same way, after which its LEDs also turn off. Detecting this, the drone sends a signal to all the robots, whose microcontrollers then execute the code concerned with shifting the object. Shifting involves lifting the object and placing it at the desired distance. Lifting is performed by the servo motors present in all the robots; all the robots lift the object simultaneously and accomplish the task in unison. Each robot is equipped with IR (infrared) sensors to prevent collisions with one another.

The system also accomplishes automatic arrangement of the robots based on priority. Each robot is assigned a number from 1 to 5, and the robots exchange their numbers with their neighbours through IR communication. Each robot compares its neighbour’s number with its own: if the neighbour has a greater number yet occupies a superior position in the arrangement, the robot with the lower number trades places with it, thereby attaining the superior position the neighbour once occupied. For example, if the robot numbered 4 is in a superior position and its neighbour numbered 3 is below it, then on receiving the neighbour’s number the robot numbered 3 finds itself superior in priority and starts taking the neighbour’s place, while the robot numbered 4 simultaneously moves into the place vacated by robot 3. In this way all the robots arrange themselves by priority. The arrangement is facilitated by the ultrasonic sensor, digital compass, IR sensors and IR transceiver.
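The neighbour-exchange rule above behaves like a distributed bubble sort: each robot only ever compares itself with its neighbour and swaps when the ordering is wrong, and repeated local swaps yield a globally sorted arrangement. A minimal simulation of that logic (the function name is ours; the robots run this distributed, not as one loop):

```python
def arrange_by_priority(line):
    """Simulate the neighbour-exchange rule. `line` lists robot numbers
    from the most superior position downwards; a lower number means
    higher priority, so the final arrangement is sorted ascending."""
    robots = list(line)
    swapped = True
    while swapped:                      # keep exchanging until no robot moves
        swapped = False
        for i in range(len(robots) - 1):
            # the robot below (i+1) compares its number with the one above (i)
            if robots[i + 1] < robots[i]:
                robots[i], robots[i + 1] = robots[i + 1], robots[i]
                swapped = True
    return robots
```

Running the worked example from the text, a line containing robots 4 and 3 in the wrong order converges to 3 above 4, and any initial placement of the five robots converges to 1 through 5.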

Applications:

1. This system will be of great use during natural disasters, in removing debris and in identifying and rescuing survivors. Because of their small size, the ground robots can sneak into small, confined spaces; combined with the infrared sensors and cameras mounted on the drone, this helps identify humans trapped in or between debris.

2. The ground robots, when equipped with munitions, and the drone, when equipped with an infrared camera and sensors, can be used for detecting and neutralising enemies. This system prevents the loss of human life.


ANNEXURE B

Title: "Drone (Quadcopter)"

Introduction: The objective of the project is to build a quadcopter, or drone. Drones are being used to deliver goods and for surveillance and photography.

Description: The drone is built on a quadcopter frame. Four brushless motors provide sufficient thrust during take-off, for manoeuvring and for hovering at a constant altitude. The two diagonally opposite motors turn clockwise and the other diagonal pair turns anti-clockwise. The speed of the motors is controlled through ESCs (electronic speed controllers), one ESC per brushless motor. Each ESC is given a PWM (Pulse Width Modulation) signal from the microcontroller, and the speed of the motor is set by the duration of the ON pulse (Ton): the longer the ON pulse, the higher the speed of the rotor.

The microcontroller used is the MSP430, which sends the PWM signals to the ESCs to vary the speed of the motors. A 2.2 Ah, 11.1 V lithium-polymer battery powers the drone, and the drone is controlled through a 2.4 GHz remote control whose receiver outputs are connected to the microcontroller. The remote control comprises a throttle stick and a direction stick. A single throttle stick controls all 4 motors together: pushed to the maximum, all 4 motors run at maximum speed; brought to the minimum, they run at minimum speed. The range of throttle variation is determined by reading the value on the corresponding microcontroller pin at the stick's minimum and maximum positions; intermediate readings map to intermediate speeds, so any movement of the throttle stick produces a corresponding variation in speed. The direction stick's range is estimated in the same way. When the stick is moved left, the 2 left motors rotate faster than the other 2, moving the quadcopter left; when it is moved right, the 2 right motors run faster, moving the quadcopter right.

To ensure the stability of the drone and to determine roll, pitch and yaw variations, an MPU6050 6-axis (accelerometer + gyroscope) module is used. Based on feedback from the MPU6050 and on the movement of the remote-control sticks, the microcontroller sends the PWM signals to the ESCs. For example, if the drone is dragged and tilted to the right by a strong wind, the MPU6050 detects the offset and, based on its feedback to the microcontroller, the drone is stabilised by rotating the 2 left motors at a higher speed until there are no anomalies in the readings from the gyroscope and accelerometer.

The drone can be made autonomous by using LIDARs and GPS.
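The stick-to-speed mapping described above is a linear interpolation from the receiver's calibrated range to the ESC's pulse-width range. A minimal sketch follows; the calibration endpoints and pulse widths are illustrative defaults (hobby ESCs commonly accept roughly 1000 to 2000 microseconds), not the project's measured values.

```python
def throttle_to_pulse_us(reading, in_min=1000, in_max=2000,
                         out_min=1000, out_max=2000):
    """Map a raw receiver reading, calibrated between the throttle stick's
    minimum and maximum positions, to an ESC pulse width in microseconds."""
    reading = max(in_min, min(in_max, reading))   # clamp out-of-range readings
    span_in = in_max - in_min
    span_out = out_max - out_min
    # linear interpolation: minimum stick -> out_min, maximum stick -> out_max
    return out_min + (reading - in_min) * span_out / span_in
```

The same mapping, applied per side with an offset derived from the direction stick and the MPU6050 feedback, gives the left/right speed difference used for steering and stabilisation.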

Applications:

1. The drone, when equipped with a camera, can be used for surveillance and photography.

2. Equipped with IR sensors and an IR camera, the drone can detect hiding enemies during the night.

3. The drone can also be used for delivering goods, eliminating delivery delays due to traffic.


ANNEXURE C

Title: “Driver’s Navigation Assistance System”

Introduction: Roads in developing countries such as India are inconsistent due to potholes and unmarked, irregular speed bumps, and unevenly laid roads also pose a threat to commuters. Road accidents due to these inconsistencies are constantly on the rise, and most of the inconsistencies go undetected. While travelling on highways, drivers must constantly look out for sign boards giving information about diversions ahead, speed limits, speed breakers, the distance to the destination and the routes to various cities, and they lose focus in doing so. Vehicles on the highway travel at high speed; when a driver looks away to read a sign board, the loss of concentration on the road can lead to fatal accidents. At night, drivers travelling at high speed miss many sign boards carrying important information and may end up taking wrong routes. To prevent this, a multifunctional system was designed.

Description: The multifunctional system, based on wireless technology, provides information about forthcoming road inconsistencies well in advance while travelling in cities, and relays the contents of sign boards inside the car, through audio and through vibration of the steering wheel, while travelling on highways. It comprises two modules: a transmitter module placed near the sign boards and a receiver module inside the car. The transmitter module comprises a low-power RF 434 MHz transmitter with an HT12E encoder and an MSP430 microcontroller, and is solar powered.
The solar panels provide sufficient voltage and current for the microcontroller and the RF 434 MHz transmitter. Each transmitter module is assigned a unique 4-bit binary value, which it transmits. The information on the corresponding sign boards is stored in a database in the EEPROM of the receiver module inside the car, indexed by that 4-bit value: for highway travel, the information for all the sign boards is stored in the EEPROM under the 4-bit value of the respective transmitter, which acts as the address. There is no need for one transmitter module per sign board; the information for a group of 4 to 5 consecutive sign boards can be clubbed together under one transmitter's 4-bit value, with the pre-measured distances between the sign boards stored under the same address. When the car is about 100 metres away, the 4-bit value sent by the transmitter is picked up by the receiver module (an RF 434 receiver with an HT12D decoder) and passed to the microcontroller, which compares it with the 4-bit values present in the database.
Once the received 4-bit value matches a value in the database, the microcontroller fetches the information stored under it from the EEPROM and delivers it through the audio feedback system and through the movement of actuators under the steering wheel. The car's odometer reading is used by the microcontroller to indicate the distance to the next sign board and the distances between sign boards; this is what allows a single transmitter to serve 4 to 5 sign boards, since each increment of the odometer decrements the remaining distance to the next sign board. The odometer readings thus provide real-time tracking of the sign boards and of the information they carry. For city travel, changing the state of a switch on the receiver module activates the database of road inconsistencies, and the system then reports the inconsistencies in the road. A transmitter module, identical to those used for sign boards, is placed at the beginning of each road. A survey is first conducted to build the database: accelerometer readings determine whether each inconsistency is a speed bump or a pothole, and the odometer reading records the distance at which it occurs. The database is stored in the EEPROM of the receiver module, and an RF 434 transmitter with an assigned 4-bit value is placed at the beginning of the road.
Once the transmitter is detected and its 4-bit value received, the microcontroller extracts the information stored under that value from the EEPROM and delivers it as audio and vibration. The odometer reading, monitored by the microcontroller, provides real-time distance estimation: every time a small distance such as 2 metres is travelled, the distance to the forthcoming inconsistency is reduced by 2 metres, enabling real-time tracking and detection of inconsistencies. Whether the inconsistency is a speed bump or a pothole is also reported, based on the accelerometer readings from the survey.
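The address-and-countdown scheme above can be sketched as follows. The database contents, field layout and helper names here are hypothetical stand-ins, since the actual EEPROM layout is not given; only the mechanism (4-bit address lookup, then odometer-driven countdown) comes from the description.

```python
# Hypothetical in-memory stand-in for the receiver's EEPROM database:
# each 4-bit transmitter address maps to (distance_m, message) pairs,
# with distances measured from the transmitter's position.
DATABASE = {
    0b0101: [(100, "Speed limit 80 km/h"),
             (350, "Diversion ahead"),
             (600, "Speed breaker ahead")],
}

def lookup(address):
    """Compare the received 4-bit value with the stored addresses."""
    return DATABASE.get(address & 0b1111)

def remaining_distances(entries, odometer_at_receipt, odometer_now):
    """Each increment of the odometer decrements the distance to the next
    sign board, giving real-time tracking from a single transmitter."""
    travelled = odometer_now - odometer_at_receipt
    return [(d - travelled, msg) for d, msg in entries if d - travelled > 0]
```

On a match, the microcontroller would read out the first remaining entry through the audio and vibration actuators; an unknown address simply yields no entries.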


ANNEXURE D

Title: “Gas Pipeline Monitoring using Wireless Sensor Networks”

Introduction:

The project was aimed at developing a comprehensive gas pipeline monitoring system using the concept of wireless sensor networks. The main objective is to provide a low-power system that can efficiently and continually track the conditions of a gas pipeline.

Description:

The system can broadly be said to work in two modes:

Normal functioning mode: the system continuously logs important data such as pressure and temperature and in turn computes the volume. This helps keep tabs on the ideal conditions to be maintained.

Alarm mode: the normal pressure and temperature are known from prior calibration. The system senses the dynamic pressure using appropriate pressure sensors, and if the pressure is greater or less than normal, an appropriate alarm signal is actuated.

The project implements a prototype of the above specification using 2 sensor nodes and a base station. Each sensor node contains a pressure sensor (BMP085 or BMP180), which measures both temperature and pressure. The network uses a star topology, in which the two nodes send data to the base station; the protocol between the sensor nodes and the base station is XBee (IEEE 802.15.4 / ZigBee standard).
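The alarm mode reduces to a calibrated band check at each node. A minimal sketch, where the normal value and tolerance are illustrative numbers rather than the project's calibration:

```python
def check_pressure(pressure_pa, normal_pa=101_325, tolerance_pa=2_000):
    """Alarm-mode check: compare the dynamic pressure read from the
    BMP085/BMP180 against the calibrated normal value, within a band."""
    if pressure_pa > normal_pa + tolerance_pa:
        return "HIGH"   # above the calibrated band: raise the alarm
    if pressure_pa < normal_pa - tolerance_pa:
        return "LOW"    # below the calibrated band: raise the alarm
    return "OK"         # normal functioning mode: just log the reading
```

In the prototype, a node returning "HIGH" or "LOW" would send the alarm to the base station over the XBee link.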

Applications:

Gas pipelines have to be constantly monitored, as any mishap would be a catastrophe in the true sense of the word. By using WSNs, gas pipelines can be monitored smartly yet accurately. Human intervention is of limited use in monitoring gas pipes, as the gas flowing in them is most often poisonous or highly inflammable.


ANNEXURE E

Title: “Humanoid Robot with Speech Recognition”

Introduction: A humanoid robot is inspired by humans: it emulates the hand and leg movements of a real human. The humanoid is built from servo motors, and instructions or commands are given to the robot through voice.

Description: The humanoid robot is built using servo motors, chosen because the angle of rotation of a servo's shaft can be controlled. In total, 14 servo motors are used, enabling the humanoid to walk and move its hands. The servos are controlled by the microcontroller (MSP430). To help the humanoid maintain its balance, an MPU6050 (accelerometer + gyroscope) module is used, and its feedback controls the movement of the servos to maintain stability. The torque of the servos plays a vital role in determining the size of the humanoid and its load-bearing capacity. An XBee 802.15.4 module, a 2.4 GHz RF transceiver, is used for controlling the motion of the humanoid.

The humanoid is controlled through voice. For voice recognition and speech-to-text conversion, the Processing software is used; it has an extensive speech processing library, and new words can be added to it. If a word like "forward" is spoken, the software converts it into the text "Forward", which the XBee transmits to the receiver. The microcontroller is programmed to compare the received text with the pre-defined texts already stored in it; if the text matches, the code stored under that text is executed. For "Forward", the corresponding code drives the servos appropriately, resulting in forward movement of the humanoid. In the same way, the humanoid can be moved in various ways through speech processing, with its stability ensured by the microcontroller based on feedback from the MPU6050. If equipped with a camera, the humanoid can track and follow coloured objects through colour object tracking, and it can detect objects and avoid collisions using proximity or IR sensors and motion sensors.
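The match-and-execute step can be sketched as a dispatch table: the received text is compared against the pre-defined texts, and the code stored "under" the matching text is run. The command names and motion routines below are hypothetical stand-ins for the actual servo code.

```python
def make_dispatcher(handlers):
    """Mirror the microcontroller's compare-and-execute loop:
    look up the received text among the pre-defined texts and
    run the code stored under it, ignoring unknown commands."""
    def handle(received_text):
        action = handlers.get(received_text)
        if action is None:
            return "IGNORED"          # no pre-defined text matched
        return action()
    return handle

# Hypothetical motion routines standing in for the servo sequences.
handle = make_dispatcher({
    "Forward": lambda: "walking forward",
    "Backward": lambda: "walking backward",
    "Stop": lambda: "standing still",
})
```

Adding a new voice command then means adding one word to the speech library and one entry to the table.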

APPLICATIONS:

1. Humanoids will be helpful in space exploration.

2. Humanoids can be used in robotic warfare, thereby avoiding loss of human life.

3. Humanoids can assist people in their day-to-day tasks.


ANNEXURE F

Title: "Ultra Glove for the Visually Impaired"

Introduction: Visually impaired people face many problems while walking down the street. They find it hard to walk on footpaths and to cross roads: while walking on a footpath they might collide with streetlights or pedestrians, and while crossing a road they might be hit by speeding cars or trucks. To prevent such accidents, a device called the Ultra Glove is designed. The device comprises ultrasonic sensors, a motion sensor, and servos. It informs the wearer about forthcoming obstacles and also indicates whether each obstacle is stationary or in motion. The feedback is given through vibration created by the movement of the servo motors.

Description:

The device comprises two ultrasonic sensors (HC-SR04), which measure the distance between the person and an obstacle. Both sensors are mounted on the glove: one faces left and the other faces right, so the left-facing sensor reports the distance of obstacles on the left and the right-facing sensor reports the distance of obstacles on the right. An Arduino Pro Mini microcontroller is used; the echo and trigger pins of the ultrasonic sensors and the control pins of two servo motors are connected to the microcontroller's pins. Three ranges are set: if the obstacle is within 20 cm, the servo moves at high speed; if the obstacle is between 20 and 50 cm, the servo shaft rotates at medium speed; and if the obstacle is farther than 50 cm, the servo shaft rotates at slow speed. The servo motors are mounted on the glove so that their shafts are in contact with the skin and the person feels their movement. One servo is placed on the right side of the glove and the other on the left. The microcontroller is programmed so that the right ultrasonic sensor's readings control the motion of the right servo; similarly, the left servo's motion is controlled by the left sensor's readings. In this way the blind person can identify both the position and the distance of an obstacle. A PIR motion sensor (HC-SR501) is used to identify whether the obstacle is stationary or in motion; it can also help distinguish a human from an object. The PIR sensor, combined with the ultrasonic readings, is used to detect moving objects, and a buzzer informs the person that the object is moving. In the same way, the person can cross roads with the glove directed towards approaching vehicles, and the device also prevents the person from leaving the footpath and moving towards the road. The ultrasonic sensors are placed in close proximity in order to detect obstacles directly in front. To obtain accurate PIR readings, the hand wearing the glove should be held stable and firm. When an obstacle at the centre is in motion, both servos move and the buzzer produces a tone. This device helps blind people walk safely on the streets.
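The three distance ranges can be sketched as a simple threshold function. The speed labels come from the description above; the function name is illustrative.

```python
def servo_speed(distance_cm: float) -> str:
    # Map obstacle distance to servo speed, per the three ranges above:
    # within 20 cm -> high, 20-50 cm -> medium, beyond 50 cm -> slow.
    if distance_cm <= 20:
        return "high"
    elif distance_cm <= 50:
        return "medium"
    else:
        return "slow"
```

On the glove this mapping would run once per ultrasonic reading, with each sensor's result driving its own servo.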

Application:

The Ultra Glove helps blind people cross roads and walk on footpaths without getting hurt or obstructed.


ANNEXURE G

Title: "Self-balancing Robot"

Description:

A self-balancing robot is a robot that balances itself on two wheels. Two DC motors drive the robot, through an L293D motor-driver IC. The robot's stability is ensured by appropriate movement of the motors based on feedback from an MPU6050 (accelerometer + gyroscope) to the microcontroller. Stability is achieved by a PID (proportional-integral-derivative) controller, a control-loop feedback mechanism: the controller continuously calculates an error value as the difference between a measured process variable and a desired setpoint, and attempts to minimize the error over time by adjusting a control variable. The movement and direction of rotation of the two motors are controlled by the microcontroller based on the MPU6050's feedback.

If the robot leans forward, the MPU6050 detects this offset and feeds it back to the microcontroller, which drives the motors in the reverse direction through the L293D until the MPU6050 indicates that the robot is stable. Similarly, if the robot leans backward, the motors are driven forward until stability is regained. The robot can be driven in any direction provided stability is maintained; it can travel forward or backward without losing balance. Its movement is commanded through an NRF24L01 2.4 GHz transceiver, while the robot continuously corrects itself with the help of the MPU6050: the sensor senses every variation and feeds it back to the microcontroller, which compensates for the offset by controlling the motors. The robot can thus attain high speed without falling or losing stability. The direction and speed of rotation of the motors are determined by the microcontroller from the MPU6050 readings and the data received by the NRF24L01. Since the robot moves on two wheels, its power consumption is low.

Obstacle avoidance can also be incorporated using IR and ultrasonic sensors, enabling the robot to detect obstacles and avoid collisions; this feature makes the robot more intelligent and autonomous. Through colour object tracking, the robot can be made to follow a coloured object such as a ball autonomously. This can be achieved by mounting a camera on the robot and using a Raspberry Pi to program it to follow the object through image processing. With all these features added, the robot can be made fully autonomous, removing the need for human intervention.
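The PID loop described above can be sketched as follows. The gains and the tilt-angle inputs are illustrative values, not tuned parameters from the actual robot.

```python
class PID:
    # Textbook PID controller: the error is the difference between the
    # upright setpoint (0 degrees) and the measured tilt angle.
    def __init__(self, kp, ki, kd, setpoint=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured_angle, dt):
        error = self.setpoint - measured_angle
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        # The sign of the output decides the motor direction: a forward
        # lean produces a negative output (drive in reverse), a backward
        # lean a positive one (drive forward).
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

On the robot, `update` would be called at a fixed rate with the MPU6050's fused tilt angle, and the output mapped to motor speed and direction through the L293D.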


ANNEXURE H

Title: "GPS Guided Autonomous Robot"

Description:

The robot comprises four wheels, four DC motors, a motor-driver IC (L293D), an ultrasonic sensor (HC-SR04), an MSP430 microcontroller, a u-blox GPS module, and a digital compass (HMC5883L). The robot moves along a predefined path whose vertices are GPS coordinates. The microcontroller receives data from the GPS module as strings over serial communication; this data is parsed and used by the path-finding algorithm to ensure that the robot is on course and to calculate the distance to the next GPS coordinate. The distance between two coordinates is calculated using the Euclidean distance formula. The path to be traversed is computed by the path-finding algorithm from the GPS coordinates and the estimated distances between them. Latitude and longitude are handled to five decimal places, corresponding to an accuracy of about 1.1 metres. The algorithm also calculates the heading from the coordinates, and the robot continuously evaluates its position. The direction of travel is calculated from two sets of reference coordinates: the robot initially moves forward to obtain them, and the heading is computed from these coordinates. The robot then uses its own position and the next predefined coordinate to calculate both distance and direction.

To prevent the robot from drifting off course, the digital compass (HMC5883L) is used; it determines direction by measuring the Earth's magnetic field. Once the heading is determined by the path-finding algorithm, the HMC5883L continuously monitors the robot's movement. If the robot deviates from its path, the compass senses the deviation and informs the microcontroller, which then steers the robot to compensate and keep it on course. The microcontroller continuously compares the required heading with the compass output. The robot is also equipped with an ultrasonic sensor (HC-SR04) to detect obstacles in its path and measure the distance between the robot and each obstacle. Based on the ultrasonic sensor's output, the microcontroller controls the motors: if an obstacle is very close, the microcontroller, through the L293D, steers the robot around it and then returns it to its course once the obstacle is cleared. The autonomous robot does not require humans to monitor or control it.
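The distance and heading calculations can be sketched as below. Per the description, distance uses the plain Euclidean formula on latitude/longitude differences (adequate only over short ranges); the metres-per-degree scale factor, applied to both axes, is an assumption for illustration.

```python
import math

# Approx. metres per degree of latitude; assumed constant for both axes
# here, which is only reasonable near the equator and over short hops.
M_PER_DEG = 111_320.0

def euclid_distance_m(lat1, lon1, lat2, lon2):
    # Euclidean distance between two nearby GPS fixes, in metres.
    return math.hypot((lat2 - lat1) * M_PER_DEG, (lon2 - lon1) * M_PER_DEG)

def heading_deg(lat1, lon1, lat2, lon2):
    # Heading from the current fix to the next waypoint, in degrees
    # clockwise from north, to be compared with the compass reading.
    return math.degrees(math.atan2(lon2 - lon1, lat2 - lat1)) % 360.0
```

A change of one unit in the fifth decimal place of latitude is about 1.1 m, which matches the accuracy figure quoted above.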

Application:

1. If equipped with ammunition, an IR camera, and sensors, the robot can be used to detect and neutralise enemies, thereby preventing loss of human life.

2. Equipped with a camera, the robot can be used for surveillance.

3. Equipped with a water pump, a small water tank, and IR sensors, the robot can be used to extinguish fires in industries and airports.


ANNEXURE I

Title: "Mobile Controlled Robot"

Description:

The robot comprises a Bluetooth hands-free headset, a DTMF (dual-tone multi-frequency) decoder (MT8870), four DC motors, and a 4-channel relay. The Bluetooth transceiver in the mobile phone is paired with the hands-free device, which is mounted on the robot and acts as the receiver; this facilitates the transmission and reception of data. A DTMF tone-generator mobile application is used to generate a pair of tones for every key; the application provides the digits 0 to 9 plus * and #. When a key is pressed, a tone pair combining two frequencies is generated: one tone from the high DTMF frequency group and the other from the low group. The decoder IC internally consists of an operational amplifier whose output is fed to pre-filters that separate the low and high frequencies; the result is then passed through a code-detector circuit which decodes the incoming tone into 4 bits of binary data. The tone received by the Bluetooth hands-free device is fed to the MT8870 through the device's audio output. The resulting 4-bit binary code controls the robot's motors through the 4-channel relay: the code is given to the relay board as input, and the relays switch according to its bits, so this approach requires no microcontroller. The MT8870 uses a digital counting technique to detect and decode all 16 DTMF tone pairs into 4-bit binary codes. For example, if the decoded value is 1001, the 1st and 4th motors run while the other two remain static. In the same way, the relays control the motors according to the tone generated by each key press.
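The key-to-code-to-relay mapping can be sketched as below. The 4-bit output codes follow the standard MT8870 decoder table (digit value in binary, with '0' decoding to 10); the bit-to-relay assignment, one bit per motor with the most significant bit driving motor 1, is an assumption consistent with the "1001 runs motors 1 and 4" example above.

```python
# Standard MT8870 4-bit output codes for the phone keypad
# (note: key '0' decodes to binary 1010, i.e. ten, not zero).
MT8870_CODE = {
    "1": 0b0001, "2": 0b0010, "3": 0b0011, "4": 0b0100,
    "5": 0b0101, "6": 0b0110, "7": 0b0111, "8": 0b1000,
    "9": 0b1001, "0": 0b1010, "*": 0b1011, "#": 0b1100,
}

def active_motors(code: int) -> list:
    # One relay (hence one motor) per bit: bit 3 -> motor 1, bit 2 -> motor 2,
    # bit 1 -> motor 3, bit 0 -> motor 4 (assumed wiring order).
    return [i + 1 for i in range(4) if code & (1 << (3 - i))]
```

Pressing '9' on the phone thus yields the code 1001, switching the relays for motors 1 and 4, as in the example in the description.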