Design and Development of a Hand Gesture Recognition System for Speech Impaired People

Mrs. Neela Harish, Dr. S. Poonguzhali
Centre for Medical Electronics, Department of Electronics and Communication
College of Engineering Guindy, Anna University, Chennai, India
[email protected], [email protected]

Abstract— All over the world, deaf and dumb people struggle to express their feelings to other people. Speech and hearing impaired people experience various challenges at public places in expressing themselves to normal people. This paper addresses the problem through the use of Indian Sign Language symbols, which are common to all deaf and dumb people in India. The gestures illustrated by the Indian Sign Language symbols are captured with the support of flex sensors and an accelerometer. The movements involved in gesture representation include rotation, angle tilt and direction changes. The flex sensors and the accelerometer are fitted onto a data glove, over the fingers and the wrist respectively, to acquire their dynamics. The resulting voltage signals are processed by a microcontroller and sent to a voice module, where the voice outputs of the words are stored and played back according to each word's values, producing the appropriate spoken words through a speaker.

Keywords—Indian sign language, speech impaired, flex sensors, accelerometer, voice module.

I. INTRODUCTION

The languages used by speech and hearing impaired people to express themselves are known as sign languages. These languages vary from one country to another, as no single sign language is common to all people. Some of the main challenges experienced by speech and hearing impaired people while communicating with normal people are social interaction, communication disparity, education, behavioural problems, mental health, and safety concerns.
As a result of these obstacles, deaf and dumb people are discouraged from speaking out about themselves or their situations in public places, in emergencies, or in private conversations. Moreover, the language diversity in India is very vast from place to place, hence a common mode of communication was needed for speech and hearing impaired people. This resulted in the use of Indian Sign Language symbols among deaf and dumb people to interact with each other, but these symbols cannot be understood by other, normal people. In this paper, Indian Sign Language (ISL) has been used. ISL has its own specific syntax, grammar, alphabets, words and numerals. Hand gestures made using these symbols are an effective way for speech impaired people to express their idea or meaning. The gestures are made with the fingers, hands, wrist movements and elbow movements for different sequences of words. Two aspects are considered here: gestures with only finger positions, without change in hand position and orientation, and gestures with changes in both finger and hand positions and orientations. The main need arises because these sign language symbols are not understood by normal people, most of whom have not studied ISL. In real-time image processing methods, only a single individual can benefit, by capturing his or her image and processing it into text or speech. In this paper, by contrast, the hand gesture movements of any speech impaired person can be captured by the flex sensors and accelerometer and produced as voice output through the voice module.

Research has been carried out for many years on hand gesture interpretation systems using various sign languages. In paper [1], sign language gestures are converted into voice for a single alphabet or a complete string by concatenating each and every word, thereby forming full meaningful words, but this was done only for the American and Pakistani sign languages.
The method described in [2] aims to help patients with wrist impairments to perform some of their daily exercises. In one research method [3], American Sign Language is used, where the boundary of the gesture image depicted by the speech impaired person is approximated into a polygon; on further image processing with the Douglas-Peucker algorithm using Freeman chain code directions, the words are determined. ISL has also been used in [4], where each set of signs is represented by binary values of the 'UP' and 'DOWN' positions of the five fingers; the respective images of the symbols are dynamically loaded and converted into text. A material known as Velostat was used in [5] for making piezoresistive sensors, which were then used to detect the bending of the fingers; this data was mapped to a character set by implementing a minimum mean square error machine learning algorithm. The method used in [6] employs sensor gloves to detect hand gestures using the British Sign Language system. There, only normal hand gestures are depicted, not the sign language symbols of any particular country; the outputs are produced in text format on an LCD and in audio format using flex sensors. One study [7] presented a robust approach for recognition of bare-handed static American Sign Language using a novel combination of the Local Binary Pattern histogram and linear binary Support Vector Machine (SVM) classifiers. In [8], a device is used that detects and tracks hand and finger motions for Arabic Sign Language; data acquisition is performed with multilayer perceptron networks using a Naive Bayes classifier.

[2015 International Conference on Industrial Instrumentation and Control (ICIC), College of Engineering Pune, India, May 28-30, 2015. 978-1-4799-7165-7/15/$31.00 ©2015 IEEE]
The research approach discussed in [9] for American Sign Language uses a glove with six coloured markers and two cameras to extract the coordinate points; the alphabets are detected by the circle Hough transform and backpropagation in an artificial neural network. One of the ways [10] of detecting American Sign Language was capable of recognising hand gestures even when the forearm and its rotation were involved; it was implemented using principal component analysis to differentiate between two similar gestures.

Thus there were various limitations in the previous research in the field of sign language interpretation systems. Image processing methods are restricted to capturing and processing the images of a single individual, and must be reloaded and recalculated for each different person using the system. Only finger gestures and alphabets have been obtained from the sign language movements, and outputs were produced for other countries' languages such as the British, American and Pakistani sign languages. Also, the distance between the camera and the person may disturb the accuracy. Therefore, in this project, the gestures for words in Indian Sign Language are used and eight commonly used words are produced as voice outputs. The movements are captured with the help of flex sensors and an accelerometer, and adapt dynamically to changes in person and hand orientation.

II. MATERIALS AND METHODOLOGY

The hand gesture recognition setup presented in this paper comprises the data glove, the sensory part (flex sensors and accelerometer), an amplifier, a PIC microcontroller, a voice module and a speaker. Fig. 1 shows the block diagram of the ISL hand gesture recognition system.

A. Data glove

A data glove is an interactive device that facilitates tactile sensing and fine-motion control. It is especially suited to capturing the shape and dynamics of the hand in an effective and direct manner.
The flex sensors are fixed over each finger and the accelerometer over the wrist. These sensors are attached to the cloth data glove with cellotape or glue. Fig. 2 shows the data glove worn by the speech impaired person, which acquires the gestures with the aid of the flex sensors and accelerometer.

Fig. 1. Block diagram of the ISL hand gesture recognition system
Fig. 2. Data glove fitted with sensors (flex sensors and accelerometer)

B. Sensory part

The sensory part consists of the flex sensors, for capturing the finger arrangements, and the accelerometer, for the wrist rotations. In the flex sensor, the resistance varies in proportion to the bending of the sensor. This resistance is converted to a voltage value by a voltage divider circuit using a 10 kΩ resistor. Similarly, the voltage conversion is done for each flex sensor situated over the fingers. The accelerometer has three axes (x, y and z) and produces three different sets of values corresponding to each axis, based on the wrist movement or orientation made in the hand gesture.

C. PIC Microcontroller

The microcontroller is used to govern the handling of the signal values received from the sensors. The output voltages of the flex sensors and the accelerometer are given as inputs to ports A and E of the PIC microcontroller for further processing. The other end of the sensory part is connected to a common ground. These signal values are converted to digital form by the microcontroller's inbuilt ADC. Once all the flex sensors and the accelerometer were tested for good repeatable readings, the experimental setup shown in Fig. 3 was arranged.
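The divider conversion described above can be sketched numerically. This is an illustrative sketch, not the paper's firmware: the 5 V supply and the 10 kΩ resistor are from the text, while the flex-sensor resistance values and the 10-bit ADC width are assumptions chosen so that the no-bend voltage lands near the 3.5 V reported in the results.

```python
# Sketch of the flex-sensor voltage divider feeding the PIC's ADC.
# VCC and R_FIXED come from the text; the flex-sensor resistances and
# the 10-bit ADC width are illustrative assumptions, not measured values.

VCC = 5.0          # supply voltage (V), from the text
R_FIXED = 10_000   # fixed divider resistor (ohms), from the text

def flex_voltage(r_flex):
    """Voltage measured across the flex sensor in the divider."""
    return VCC * r_flex / (r_flex + R_FIXED)

def adc_count(voltage, bits=10, vref=5.0):
    """Digital value produced by the microcontroller's inbuilt ADC."""
    return round(voltage / vref * (2 ** bits - 1))

straight = flex_voltage(23_300)  # ~3.5 V, near the reported no-bend reading
bent = flex_voltage(34_000)      # bending raises resistance, so voltage rises
print(f"straight: {straight:.2f} V -> count {adc_count(straight)}")
print(f"bent:     {bent:.2f} V -> count {adc_count(bent)}")
```

Measuring across the flex sensor (rather than the fixed resistor) makes the voltage rise with bending, which matches the trend in the flex sensor readings of TABLE I.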
The data glove fitted with sensors was, after testing, connected to the PIC microcontroller, and then to the voice module, speaker and LCD so that the voice signals could be heard.

D. Voice Module

The signals from the microcontroller are then given to the voice module. The voice module consists of eight channels, in which eight words can be recorded. The voice module can be operated in various modes, such as parallel and serial modes. The voices are recorded when both the CE (reset sound track) and RE (record) signals are held low until the rising edge of the trigger. The same voice can then be played back when only RE is high and a high-to-low edge is applied as the trigger. The sound of the words can be heard loudly and clearly with the help of the speaker. The setup also incorporates an LCD panel to display the flex sensor and accelerometer voltages.

III. EXPERIMENTAL RESULTS

All the sensors on the data glove were first tested. The flex sensor and accelerometer readings were observed with variations in their position, rotation and bending over 5 trials. The bending of the hand is determined at three bends of the bones of the hand, known as the distal, middle and proximal phalanges, as given in TABLE I of the flex sensor readings. Fig. 3 shows the experimental setup used for the Indian Sign Language gesture interpretation system. Inferences were drawn from the sensor readings of the flex sensors in TABLE I and of the accelerometer in TABLE II, at different positions and trials. For the flex sensor, the voltage across the finger when the sensor is straight is 3.5 V, for a supplied power of 5 V. The voltage drop across the flex sensor was maximum at the middle phalanx bend and minimum at the proximal phalanx bend. For the accelerometer, the maximum values were observed on the X-axis when the hand turns right, on the Y-axis when the hand moves up, and on the Z-axis when the hand slants upward.
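As a hedged sketch (not the authors' implementation), a wrist orientation could be inferred from a measured (x, y, z) voltage triple by finding the nearest per-position mean values of TABLE II:

```python
# Nearest-mean orientation lookup using the accelerometer mean voltages
# from TABLE II. This is an illustrative sketch, not the paper's method.

TABLE_II = {                 # position: (x, y, z) mean voltages (V)
    "up":         (1.644, 1.352, 1.450),
    "down":       (1.688, 1.858, 1.394),
    "tilt right": (1.912, 1.744, 1.512),
    "tilt left":  (1.358, 1.698, 1.442),
    "slant":      (1.460, 1.386, 1.312),
}

def classify_orientation(x, y, z):
    """Return the TABLE II position whose mean voltages are closest."""
    def dist2(ref):
        rx, ry, rz = ref
        return (x - rx) ** 2 + (y - ry) ** 2 + (z - rz) ** 2
    return min(TABLE_II, key=lambda pos: dist2(TABLE_II[pos]))

print(classify_orientation(1.91, 1.74, 1.51))  # nearest the "tilt right" means
```

A nearest-mean rule is only one plausible reading of the table; the paper itself works with minimum/maximum voltage thresholds per word, as described later.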
Trial readings were taken, and from these the final readings were derived and given as mean ± standard deviation. The coefficient of variance, which is the ratio of the standard deviation to the mean for each flex sensor and accelerometer value, was calculated for these readings; it is much less than one, which indicates that the values have good repeatability and reproducibility.

Fig. 3. Experimental setup of the ISL hand gesture recognition system

TABLE I. FLEX SENSOR READINGS (mean ± std dev, V)

Finger   No bend             Distal phalanx bend   Middle phalanx bend
Thumb    3.498 ± 8.944e-4    3.536 ± 0.01073       3.662 ± 3.57e-3
Index    3.504 ± 1.788e-3    3.742 ± 3.577e-3      3.872 ± 8.944e-4
Middle   3.508 ± 3.577e-3    3.786 ± 2.68e-3       3.898 ± 8.944e-4
Ring     3.506 ± 2.68e-3     3.694 ± 1.788e-3      3.842 ± 3.577e-3
Little   3.502 ± 8.944e-4    3.536 ± 1.788e-3      3.694 ± 1.788e-3

TABLE II. ACCELEROMETER READINGS (mean ± std dev, V)

Axis     Up                  Down                Tilt right          Tilt left           Slant position
X axis   1.644 ± 1.788e-3    1.688 ± 3.577e-3    1.912 ± 8.944e-4    1.358 ± 8.944e-4    1.46 ± 4.472e-3
Y axis   1.352 ± 3.577e-3    1.858 ± 3.577e-3    1.744 ± 1.788e-3    1.698 ± 8.944e-4    1.386 ± 2.683e-3
Z axis   1.45 ± 4.472e-3     1.394 ± 2.68e-3     1.512 ± 3.577e-3    1.442 ± 3.577e-3    1.312 ± 3.577e-3

TABLE III. SENSOR READINGS FOR THE WORD "MONDAY"

Position        Voltage (mean ± std dev) (V)
Thumb finger    4.68 ± 0.014142136
Index finger    3.71 ± 0.0083666
Middle finger   4.3 ± 0.021679483
Ring finger     3.96 ± 0.008944272
Little finger   4.37 ± 0.010954451
X-axis          1.83 ± 0.01
Y-axis          1.58 ± 0.0083666
Z-axis          1.42 ± 0.008944272

Subsequently, after the full plan of the system was set up and both kinds of sensors were tested for repeatable values, the data glove, with flex sensors over each finger and the accelerometer over the wrist, was worn by the speech impaired people. Once they were ready with their gestures and started expressing them with their hands, the voltage signals corresponding to the bend and rotation were simultaneously fed to the microcontroller. The flex sensor voltage of each finger movement was noted for each word's gesticulation, depending on the bend involved in that word.
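The per-trial aggregation described above (mean ± standard deviation over repeated trials, with the coefficient of variance as a repeatability check) can be sketched as follows. The five trial voltages are illustrative values, not the paper's raw readings.

```python
import statistics

# Mean, standard deviation and coefficient of variance over repeated
# trials, as used to judge sensor repeatability. The trial values below
# are illustrative assumptions, not the paper's raw data.

trials = [3.498, 3.499, 3.497, 3.498, 3.498]  # e.g. thumb, no bend (V)

mean = statistics.mean(trials)
std = statistics.stdev(trials)        # sample standard deviation
cv = std / mean                       # coefficient of variance

print(f"{mean:.3f} ± {std:.3e} V, CV = {cv:.2e}")
assert cv < 1, "readings are repeatable when CV is well below one"
```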
Readings were likewise taken for each finger's various bend angles. In the accelerometer, the variations on all the x, y and z axes were also calculated, corresponding to each word's rotation, up and down positions. This scheme of measurements was repeated for different speech impaired people and for a number of trials. By this approach, the minimum and maximum thresholds of each finger and the wrist were calculated for the eight words.

From these measurements, the average values of the sensor readings were computed. The eight most commonly used words were listed, and for each of them the finger and wrist movements and their corresponding values were noted. The selected words were Monday, Tuesday, Thursday, What and Which. The Indian Sign Language symbols for these words are the gesture movements obtained from the speech impaired people after they wore the glove fitted with sensors. TABLE III shows the derived average values for one of the words represented through Indian Sign Language symbols. The graph in Fig. 4 represents the sensor readings for the single word 'Monday'.

At present, only gestures made by a single hand are captured, but in future this can be extended to symbols produced by both hands. Also, so far only eight words are produced by the voice module; this can be enhanced to a larger number of words as voice outputs.

The same procedure was repeated for all the other words by calculating the minimum and maximum voltages for their corresponding ISL gestures made by the speech impaired people. These values are given to the voice module after being processed by the PIC microcontroller. In the voice module, the voltages received from the sensors and microcontroller select the sound output of the appropriate word. If the match between the word and the voltage readings is satisfied, the voice of the word can be heard through the speaker.
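A minimal sketch of the min/max threshold matching described above, assuming per-word [min, max] voltage windows over the five fingers and three axes. The window values are illustrative assumptions built loosely around the TABLE III means, not calibrated thresholds, and only one of the eight words is shown.

```python
# Match a gesture's eight sensor voltages (five fingers + three axes)
# against per-word [min, max] windows, as the threshold scheme in the
# text describes. The windows below are illustrative assumptions around
# the TABLE III means, not the paper's calibrated values.

WORD_WINDOWS = {
    "Monday": {
        "thumb": (4.6, 4.8), "index": (3.6, 3.8), "middle": (4.2, 4.4),
        "ring": (3.9, 4.0), "little": (4.3, 4.4),
        "x": (1.8, 1.9), "y": (1.5, 1.6), "z": (1.4, 1.5),
    },
    # ... windows for the remaining words would follow the same shape
}

def match_word(reading):
    """Return the first word whose every channel window contains the reading."""
    for word, windows in WORD_WINDOWS.items():
        if all(lo <= reading[ch] <= hi for ch, (lo, hi) in windows.items()):
            return word
    return None  # no match: stay silent rather than speak a wrong word

monday = {"thumb": 4.68, "index": 3.71, "middle": 4.3, "ring": 3.96,
          "little": 4.37, "x": 1.83, "y": 1.58, "z": 1.42}
print(match_word(monday))  # the TABLE III means fall inside the Monday windows
```

In the real system the matched word would then select one of the voice module's eight channels for playback; returning None for an out-of-window reading mirrors the requirement that a word is only voiced when the match is satisfied.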
The same sequence of steps applies to all the other words' voltages, and their respective voice outputs can be heard in the same way.

Fig. 4. Graphical representation of the sensor readings for the word 'Monday'

IV. CONCLUSIONS

In this paper, the hand gestures made by speech and hearing impaired people have been successfully used to interpret their expression of words. The gesture for each word was acquired with the help of the flex sensors and accelerometer, and their corresponding distinct voltages were fed serially to the setup. The data, on processing by the microcontroller and voice module, generates the corresponding words, which can be heard by normal people with the help of the speaker. Thus, the communication gap between normal people and speech and hearing impaired people is reduced. Indian Sign Language has been discussed, and the symbols of eight commonly used words were captured and produced as voice output. Hence this research provides a solution to the obstacles faced by speech impaired people, from which they can be satisfied, motivated and gain the self-confidence that their feelings will also be understood by other people.

V. ACKNOWLEDGMENT

I would like to thank the Principal, teachers and students of the St. Louis Institute, Chennai, who gave me permission to trial this hand gesture system in their school. I would also like to convey my regards to Mrs. Jayanthi, who assisted me in the techniques of learning Indian sign language.

REFERENCES

[1] San Antonio.R, Shadaram.M, Nehal.S, Virk.M.A, Ahmed, Ahmedani, and Khambaty.Y, "Cost effective portable system for sign language gesture recognition," IEEE International Conference on System of Systems Engineering, 2-4 June 2008, Farmingdale, USA.
[2] Al-Osman.H, Gueaieb, El Saddik.A, and Karime.A, "E-Glove: An electronic glove with vibro-tactile feedback for wrist rehabilitation of post-stroke patients," IEEE International Conference on Multimedia and Expo, 11-15 July 2011, LaSalla University, Spain.
[3] Menon.R, Jayan.S, James.R, Janardhan and Geetha.M, "Gesture Recognition for American Sign Language with Polygon Approximation," IEEE International Conference on Technology for Education, 14-16 July 2011, Chennai.
[4] Balakrishnan.G and Rajam.P.S, "Real time Indian Sign Language Recognition System to aid deaf-dumb people," IEEE 13th International Conference on Communication Technology, 25-28 September 2011, pp. 737-742, Australia.
[5] Ramakrishnan.G, Kumar.S, Tamse.A, Krishnapura.N, and Preetham.C, "Hand Talk - Implementation of a Gesture Recognizing Glove," Texas Instruments India Educators Conference, 4-6 April 2013, NIMHANS Convention Centre, Bangalore.
[6] Vikram Sharma, M. Vinay Kumar, N. Masaguppi, S. C. Suma and M. N. Ambika, "Virtual Talk for Deaf, Mute, Blind and Normal Humans," Texas Instruments India Educators Conference, 4-6 April 2013, IEEE Bangalore Section.
[7] Kamrani.M.H and Weerasekera, "Robust ASL Fingerspelling Recognition Using Local Binary Patterns and Geometric Features," International Conference on Digital Image Computing: Techniques and Applications, 26-28 November 2013, Hobart, Australia.
[8] Mohandes.M, Aliyu.S and Deriche.M, "Arabic Sign Language recognition using the leap motion controller," IEEE 23rd International Symposium on Industrial Electronics (ISIE), 1-4 June 2014, Istanbul.
[9] Tangsuksant.W, Adhan.S and Pintavirooj.C, "American Sign Language recognition using 3D geometric invariant feature and ANN classification," Biomedical Engineering International Conference (BMEiCON), 26-28 November 2014, Fukuoka.
[10] Hussain.I, Talukdar.A.K and Sarma.K.K, "Hand Gesture recognition system with real time palm tracking," Annual IEEE India Conference (INDICON), 11-13 December 2014, Pune.

