
GMR Institute of Technology, Rajam

Department of IT
Gesture-Controlled Physical Devices
using IoT

Project Supervisor
Mrs. Archana
Assistant Professor
Dept. of IT

Team:
1. G. Manohar Prasad (16341A1214)
2. L. Hanuman Sai (16341A1227)
3. M. S. Gayatri (16341A1229)
4. P. Durga Sai Prasad (16341A1244)

3/24/19
ABSTRACT

Nowadays we are mainly focusing on the emergence of IoT in the world of
automation. The primary driver for automating with IoT is to significantly reduce operating
expenditure when automation devices, sensors, and actuators become Internet-enabled devices.
We can also automate various connected devices using our gestures. Gestures are a form of
communication in which visible bodily actions convey important messages, either in place of
speech or together and in parallel with spoken words. Gestures include movements of the hands,
face, or other parts of the body. Recently, gesture-controlled laptops and physical devices have
become very popular; they let us control certain functions simply by waving a hand in front of
them. This technique is called leap motion. So in this project we combine leap motion
technology with the Internet of Things to build our own gesture-controlled laptops/physical
devices.
EXISTING SYSTEM
Touchscreen Systems:
 Most IoT devices are controlled through a webpage-based system in which touchscreens are
used: the user interacts directly with what is displayed on the screen, or has to
preprogram what he/she wants to do.
 A touchscreen is difficult to operate for the visually impaired; some creative way of
letting them know where to touch is needed.
 Since the display is touched directly, it may get dirty and become less visible. Direct
touch may also scratch the screen sensor, which can cause malfunctions in some
cases.

PROPOSED SYSTEM
Gesture System:
 The Leap Motion is a technology that connects to a PC or Mac and enables users to
manipulate digital objects with hand motions.
 Working with other hardware, the Leap Motion controller adds a new way to interact with the
digital world. Programs designed to interpret gesture-based computing allow the user to play
games, create designs, and learn in a 'hands-on' way.
 Sensors are used to map and track the human hand. This information is used to create, in real
time, a digital version of the hand that can manipulate digital objects; in this way we can
control various physical devices using our hand gestures.

PROJECT REQUIREMENTS

Hardware Requirements:
i. Arduino Uno board
ii. Two ultrasonic (US) sensors
Software Requirements:
i. Python: pyautogui library
Circuit Diagram:
ARDUINO SKETCH
 The Arduino should be programmed to read the distance of the hand from each US sensor.
From the distance value we decide which action to trigger with a gesture; for example,
in this program we have:
 Action 1: When both hands are held in front of the sensors at a particular far
distance, the video in VLC player should Play/Pause.
 Action 2: When the right hand is held in front of the sensor at a particular far
distance, the video should fast-forward one step.
 Action 3: When the left hand is held in front of the sensor at a particular far
distance, the video should rewind one step.


 Action 4: When the right hand is held in front of the sensor at a particular near
distance and then moved towards the sensor, the video should fast-forward; if moved
away, the video should rewind.
 Action 5: When the left hand is held in front of the sensor at a particular near
distance and then moved towards the sensor, the volume of the video should increase;
if moved away, the volume should decrease.
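The five actions above reduce to simple threshold checks on the two distance readings, plus a direction check for the near-distance gestures. A minimal host-side sketch of that mapping is shown below; the threshold and band values are illustrative assumptions, not figures from the project:

```python
# Illustrative distance bands in cm (assumed values, not from the project):
FAR_MIN, FAR_MAX = 30, 40    # the "particular far distance" band
NEAR_MIN, NEAR_MAX = 10, 25  # the "particular near distance" band

def in_band(d, lo, hi):
    """True if distance d (cm) falls inside the [lo, hi] band."""
    return lo <= d <= hi

def classify(left_cm, right_cm):
    """Map the two ultrasonic distance readings to one of the five actions.

    Returns an action name, or None when no gesture is detected.
    """
    left_far = in_band(left_cm, FAR_MIN, FAR_MAX)
    right_far = in_band(right_cm, FAR_MIN, FAR_MAX)
    if left_far and right_far:
        return "play_pause"    # Action 1: both hands at the far distance
    if right_far:
        return "forward_step"  # Action 2: right hand at the far distance
    if left_far:
        return "rewind_step"   # Action 3: left hand at the far distance
    if in_band(right_cm, NEAR_MIN, NEAR_MAX):
        return "seek_mode"     # Action 4: track right-hand movement
    if in_band(left_cm, NEAR_MIN, NEAR_MAX):
        return "volume_mode"   # Action 5: track left-hand movement
    return None

def direction(prev_cm, now_cm, delta=2):
    """For Actions 4 and 5: detect whether the hand moved towards or
    away from the sensor between two consecutive readings."""
    if prev_cm - now_cm > delta:
        return "towards"
    if now_cm - prev_cm > delta:
        return "away"
    return "steady"
```

In the real project this logic lives in the Arduino sketch; the Python version here just makes the decision rules explicit.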
 To perform actions on the computer we use the Python pyautogui library. The commands
from the Arduino are sent to the computer through the serial port (USB). This data is then
read by a Python script running on the computer and, based on the data received, an
action is performed.
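The host-side pipeline could be sketched as below. The command strings are hypothetical names for what the Arduino might send; `pyautogui.press` and `serial.Serial` (from the pyserial package) are the real library calls, while the serial port name and the VLC hotkey choices are assumptions:

```python
# Map command strings from the Arduino (assumed names) to VLC hotkeys.
VLC_KEYS = {
    "play_pause":   "space",  # VLC default: toggle play/pause
    "forward_step": "right",  # seek forward
    "rewind_step":  "left",   # seek backward
    "volume_up":    "up",
    "volume_down":  "down",
}

def key_for(command):
    """Translate one Arduino command line into the hotkey to press.

    Returns None for unrecognized commands so the main loop can ignore noise.
    """
    return VLC_KEYS.get(command.strip())

# Main loop (requires pyserial and pyautogui; the port name is an assumption):
#
# import serial
# import pyautogui
#
# with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as port:
#     while True:
#         command = port.readline().decode("ascii", errors="ignore")
#         key = key_for(command)
#         if key:
#             pyautogui.press(key)  # send the hotkey to the focused VLC window
```

Keeping the command-to-key mapping in one dictionary makes it easy to retarget the same gestures at a different player by swapping the hotkeys.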

OUTPUT
REFERENCES

1. M. A. Rashid & Xiao Han, "Gesture and Voice Control of Internet of Things", IEEE, 2016.
2. Jyoti Jadhav & Prashant Avhad, "Hand Gesture Based Home Appliances Control System",
International Research Journal of Engineering and Technology (IRJET), 2017.
3. Rupali Deshmukh, Abhishek Bange, Akshay Nerkar & Sandip Mane, "Automatic Hand
Gesture Based Remote Control for Home Appliances", International Research Journal of
Engineering and Technology (IRJET), 2016.

4. M. A. Rashid & Xiao Han, "Gesture Control of ZigBee Connected Smart Home Internet of
Things", International Conference on Informatics, Electronics and Vision (ICIEV), 2016.
5. Pomboza-Junez Gonzalo & Holgado-Terriza Juan A., "Control of Home Devices Based on
Hand Gestures", IEEE International Conference on Consumer Electronics Berlin (ICCE), 2015.

THANK YOU
