Smart Wearable Controlling System by Hand and Fingers Gesture Recognition

In this project, a smart wearable home appliance controller based on hand and finger gesture recognition is developed. The proposed smart wearable system is built with the fewest sensors possible for gesture recognition: motion sensors are placed on two fingers, namely the thumb and index finger, to detect finger motions, and another sensor at the back of the palm measures the hand movement. A total of twenty-two gestures are used in this study by analyzing the movements of the fingers. The motion sensor data are transmitted to a mobile device through Bluetooth. An application is built with the Qt cross-platform framework and deployed onto mobile devices (various platforms) to detect the gestures and demonstrate the effectiveness of the gesture recognition using the embedded n-dimensional dynamic time warping (ND-DTW) classifier. The proposed smart wearable system was able to detect the gestures with a mean accuracy of approximately 97.55%.

Video Demo

System Design

Figure 1. System design overview.

Essentially, the system is built up with three modules:

  • sensor module
  • processing module
  • mobile application.

The sensor module consists of three IMU sensors attached to the thumb, the index finger, and the back of the palm, and communicates with the processing module. The processing module removes internal noise and motion artifacts from the received sensor data and transmits the filtered data to the mobile application via Bluetooth. Gestures are classified by the trained ND-DTW model, and the respective pre-defined command in the prototype home controller application is triggered, as shown in Figure 1. The proposed smart wearable controller system is depicted in Figure 2.

Figure 2. Proposed smart wearable controller system.
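To make the classification step concrete, here is a minimal sketch (in Python with NumPy) of how an n-dimensional DTW classifier can match a live sensor sequence against stored gesture templates. The 9-channel layout, the template names and the random placeholder data are illustrative assumptions, not the exact implementation embedded in the Qt application.

```python
import numpy as np

def nd_dtw_distance(seq_a, seq_b):
    """DTW distance between two multichannel sequences.

    seq_a: (Ta, D) array, seq_b: (Tb, D) array. Each row is one sample
    of all D sensor channels (e.g. 3 IMUs x 3 axes = 9 channels).
    """
    ta, tb = len(seq_a), len(seq_b)
    cost = np.full((ta + 1, tb + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, ta + 1):
        for j in range(1, tb + 1):
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])   # frame distance over all channels
            cost[i, j] = d + min(cost[i - 1, j],       # insertion
                                 cost[i, j - 1],       # deletion
                                 cost[i - 1, j - 1])   # match
    return cost[ta, tb]

def classify(sequence, templates):
    """Return the label of the nearest gesture template."""
    return min(templates, key=lambda label: nd_dtw_distance(sequence, templates[label]))

# Hypothetical usage: one recorded template per gesture (placeholder data here)
templates = {
    "swipe_up": np.random.randn(60, 9),
    "swipe_down": np.random.randn(55, 9),
}
live_capture = np.random.randn(58, 9)
print(classify(live_capture, templates))
```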

Our proposed system can be applied to various applications, as described in Figure 3. The Qt cross-platform framework is used for developing the prototype mobile application in this study, as it is capable of running on various software and hardware platforms, such as Android, iOS, and embedded systems, with little or no change in the underlying code base.

Figure 3. Proposed smart wearable controller system that can be applied to various applications with the Qt cross-platform framework.

Smart Home application

Figure 9. System overview of smart home application.

In order to demonstrate the effectiveness of the proposed smart wearable controller system, a smart home application is developed on the Android platform with encoded templates of 7 hand gestures and 2 finger gestures; Figure 9 shows the system overview of the smart home application. The functions triggered by the gestures include displaying the date and time, temperature, and humidity at home. Other control functions include turning the lamps on/off, controlling the brightness of the lamps, displaying the weather forecast, setting alarms, and playing songs. The gestures are summarized below (a dispatch sketch follows the list):

  1. Swipe up - turn on the lamp.
  2. Swipe down - turn off the lamp.
  3. Swipe left - display content of previous tab in application.
  4. Swipe right - display content of next tab in application.
  5. W shape - display weather forecast.
  6. M shape - play music.
  7. S shape - stop music.
  8. Rotate left (finger gesture) – increase the brightness of lamp.
  9. Rotate right (finger gesture) – decrease the brightness of lamp.
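As a rough illustration of how the recognized gesture labels could be dispatched to the controller actions listed above, here is a small Python sketch; the handler names (e.g. lamp_on) are hypothetical placeholders for the real Qt/Android callbacks.

```python
# Hypothetical handlers standing in for the real application callbacks
def lamp_on():        print("lamp on")
def lamp_off():       print("lamp off")
def previous_tab():   print("previous tab")
def next_tab():       print("next tab")
def show_weather():   print("weather forecast")
def play_music():     print("play music")
def stop_music():     print("stop music")
def brightness(step): print(f"brightness {step:+d}")

GESTURE_COMMANDS = {
    "swipe_up":     lamp_on,
    "swipe_down":   lamp_off,
    "swipe_left":   previous_tab,
    "swipe_right":  next_tab,
    "w_shape":      show_weather,
    "m_shape":      play_music,
    "s_shape":      stop_music,
    "rotate_left":  lambda: brightness(+1),
    "rotate_right": lambda: brightness(-1),
}

def on_gesture(label):
    """Trigger the pre-defined command for a recognized gesture."""
    action = GESTURE_COMMANDS.get(label)
    if action:
        action()

on_gesture("swipe_up")  # -> lamp on
```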
Figure 10. Screenshot of the smart home application.
Smart Wearable Body Equilibrium Correction System with Mobile Device

In recent decades, office employees have suffered from bone diseases and muscle stress, mainly due to improper sitting posture. This project proposes a novel body equilibrium correction system to address this issue. The system consists of four modules. A motion sensor module is placed at the center of the user’s chest to measure the sitting orientation. In addition, an electromyography (EMG) module is located on both sides of the shoulders to measure the user’s muscle tension, as an incorrect sitting position held over a long period of time tends to increase pain and tension in the shoulders; it therefore serves as a reliable reference alongside the sitting orientation. Both sensor modules transmit their raw data to a processing module, which combines the raw data into a single packet and transmits the packet to a mobile device. A mobile application is implemented to classify the correctness of the sitting position. When an incorrect position is detected, the user is alerted by a vibration motor and light-emitting diodes (LEDs) installed with the system. The proposed system is able to measure and detect abnormal sitting positions with an average true accuracy of 98% at low implementation cost.

The proposed system utilizes m-Health technology to evaluate the correctness of the subject’s sitting posture based on the analysis of muscle tension and body orientation. Features are extracted from the EMG and motion sensors and serve as inputs to a support vector machine (SVM) classifier. Information about the current sitting posture is displayed on the mobile device concurrently, and a warning is triggered if an incorrect posture is detected.

System Design

Figure 1. The proposed wearable system design, which consists of the sensor modules, the processing module and the mobile application module.

The proposed system is divided into four modules: a) an EMG sensor module, b) an Inertial Motion Unit (IMU) sensor module, c) a processing unit module, and d) a mobile application, as illustrated in Figure 1. The EMG sensors adopted in this study are surface dry-electrode EMG sensors (sEMGs), which eliminate the need for gel during measurement. The sEMGs measure the muscle strain at the back of the subjects’ shoulders; studies in [4-5] showed that improper sitting posture held for an extended period of time increases muscle strain, which is why EMGs are applied in this study. Meanwhile, an Adafruit BNO055 IMU sensor [10] computes the orientation of the body along three axes: pitch, roll and yaw. This IMU integrates a MEMS accelerometer, gyroscope and magnetometer on a single die, processed by a high-speed ARM Cortex-M0 processor. The installation and placement of the sensor modules are depicted in Figure 2. Data from both sensor modules are gathered and processed by a Cortex™-M0 embedded processor, which combines them into a single packet. A mobile application is developed to receive the data packet through Bluetooth Low Energy (BLE) wireless communication. Features are extracted from the received data and serve as input to an SVM classifier to determine the correctness of the user’s sitting posture.
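As a rough sketch of that last step, the snippet below extracts simple statistical features (mean and RMS per sEMG channel plus the mean orientation angles) from a window of samples and feeds them to an SVM. The window size, feature set and SVM parameters are placeholder assumptions and do not reproduce the exact configuration used in this work.

```python
import numpy as np
from sklearn.svm import SVC

def extract_features(emg_window, imu_window):
    """emg_window: (N, 2) left/right shoulder sEMG samples,
    imu_window: (N, 3) pitch/roll/yaw samples from the chest IMU."""
    feats = []
    for ch in range(emg_window.shape[1]):
        x = emg_window[:, ch]
        feats += [x.mean(), np.sqrt(np.mean(x ** 2))]   # mean and RMS per EMG channel
    feats += imu_window.mean(axis=0).tolist()           # mean orientation angles
    return np.array(feats)

# Placeholder training data: windows labelled 1 (correct) / 0 (incorrect posture)
rng = np.random.default_rng(0)
X = np.array([extract_features(rng.normal(size=(100, 2)), rng.normal(size=(100, 3)))
              for _ in range(40)])
y = np.array([i % 2 for i in range(40)])

clf = SVC(kernel="rbf").fit(X, y)
new_window = extract_features(rng.normal(size=(100, 2)), rng.normal(size=(100, 3)))
print("correct posture" if clf.predict([new_window])[0] == 1 else "incorrect posture")
```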

Figure 2. The installation of the proposed system on the body for detecting the normality and correctness of the user’s body posture.

Lastly, Figure 3 illustrates the sample design and prototype of the proposed mobile application to detect body posture in real-time.

Figure 3. The prototype of the developed mobile application for the proposed body equilibrium correction system.

Methods

Figure 4. The flow diagram of the body equilibrium correction system that consists of two modes: calibration mode and normal analysis mode.

Figure 4 illustrates the flow diagram of the software design of the mobile application for the proposed system. There are two modes available in the system: calibration mode and normal sensing mode. Calibration is required to set up the initial or default values of the EMG and IMU sensors, as each individual has a different default level of muscle strain. Qualitative comments indicated that the sitting posture which best matched the natural shape of the spine, and which appeared comfortable and/or relaxed without excessive muscle tone, is adopted as the standard or correct posture in this study. As the neutral sitting position varies between subjects, a tolerance of ±10 degrees of sitting posture is applied during calibration mode. The linear orientation of the body posture is computed with a complementary filter method along the three axes, where the orientation does not take gravity into account. Alerts are triggered in two ways: via a vibration motor and via LED indicators on the chest. Green LEDs indicate the normal or neutral mode, whereas red LEDs indicate incorrect or abnormal sitting postures.
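The complementary filter step mentioned above can be sketched as follows for a single axis (pitch); the blend factor and the sample data are placeholders, and in practice the BNO055 can also output fused orientation directly.

```python
import numpy as np

ALPHA = 0.98  # placeholder blend factor: trust the gyro short-term, the accelerometer long-term

def complementary_pitch(pitch_prev, gyro_rate_y, accel, dt):
    """One filter step for the pitch angle (degrees).

    gyro_rate_y: angular rate around the pitch axis (deg/s)
    accel: (ax, ay, az) in g, used as the long-term gravity reference
    """
    ax, ay, az = accel
    accel_pitch = np.degrees(np.arctan2(-ax, np.sqrt(ay ** 2 + az ** 2)))
    gyro_pitch = pitch_prev + gyro_rate_y * dt
    return ALPHA * gyro_pitch + (1.0 - ALPHA) * accel_pitch

# Placeholder stream of samples at 100 Hz
pitch = 0.0
for _ in range(100):
    pitch = complementary_pitch(pitch, gyro_rate_y=0.5, accel=(0.02, 0.0, 1.0), dt=0.01)
print(round(pitch, 2))
```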

Gesture Control Armband using Single EMG and IMU sensor

This project focuses on developing a wearable device system that recognizes hand gestures in real time, utilizing a single surface electromyogram (EMG) sensor positioned on the forearm and an inertial measurement unit (IMU - accelerometer and gyroscope) to realize user-friendly interaction between humans and computers. We analyze the EMG signals from seven different subjects using a novel integration of wavelet transforms and the K-nearest neighbors (KNN) algorithm. The efficiency of the wavelet transform for surface EMG feature extraction is investigated using 3 levels of wavelet decomposition together with common statistical features such as the mean, root mean square, and standard deviation. A KNN classifier is used to recognize 3 gestures (hand close, hand open and hand extension), and the IMU streams are utilized in a decision-fusion scheme to recognize the hand movements. Overall, the results reveal that the average true accuracy of the 3-gesture detection is greater than 95% using the KNN classifier. The performance of the interfacing system was evaluated with a camera controller application that is driven by the hand gestures and hand movements. The proposed method facilitates intelligent and natural control based on gesture interaction.
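A minimal sketch of the wavelet-plus-KNN idea is shown below: a 3-level wavelet decomposition of an EMG window, mean/RMS/standard deviation per sub-band as features, and a K-nearest-neighbour classifier. The wavelet family ('db4'), the window length and k are placeholder choices rather than the exact settings used in this project.

```python
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier

def wavelet_features(emg_window, wavelet="db4", level=3):
    """Mean, RMS and standard deviation of each sub-band of a 3-level DWT."""
    coeffs = pywt.wavedec(emg_window, wavelet, level=level)
    feats = []
    for band in coeffs:                          # approximation + 3 detail bands
        feats += [band.mean(), np.sqrt(np.mean(band ** 2)), band.std()]
    return np.array(feats)

# Placeholder training windows for the 3 gestures
rng = np.random.default_rng(1)
labels = ["hand_close", "hand_open", "hand_extension"]
X = np.array([wavelet_features(rng.normal(size=256)) for _ in range(60)])
y = [labels[i % 3] for i in range(60)]

knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print(knn.predict([wavelet_features(rng.normal(size=256))])[0])
```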

Video Demo

System Design and implementation

Here, the system overview design and experimental setup are discussed (Figure 1). The proposed system is divided into two parts: (1) a hand gesture and movement acquisition module on the wearable armband, and (2) a gesture analysis module in the end terminal application.

Figure 1. System design.

The EMG signal of the performing arm muscles is obtained by an EMG sensor (MyoWare muscle sensor [13]) attached to the forearm to capture the gestures of the hand. The MyoWare muscle sensor is a wearable design that attaches the biomedical sensor pads directly to the board itself, getting rid of cables; it runs from a single supply voltage (+2.9 V to +5.7 V), provides two outputs (EMG envelope and raw EMG), and has polarity-protected power pins. In addition, an absolute 9-degree-of-freedom inertial motion unit (IMU) orientation sensor [14] is attached to the wearable armband at the back of the wearable MCU platform. The IMU is an intelligent 9-axis absolute orientation sensor consisting of an accelerometer, a gyroscope, and a magnetometer (10 pins), operating at 3.3 V, with a footprint of 3.8 × 5.2 mm and a height of 1.1 mm, integrated with a high-speed ARM Cortex-M0 based processor that processes all the data. These sensors are connected to an Arduino-compatible Adafruit FLORA wearable electronic platform microcontroller (MCU) [15], powered by a small 3.7 V lithium-polymer battery. An HC-06 Bluetooth module [16] transmits the received signals to the end terminal application, as described in Figure 2(b). Once the signals are received by the end terminal application, features are extracted from the respective sensor readings and serve as input parameters to a pattern classifier for hand gesture evaluation.

Figure 2. (a) The proposed wearable armband, consisting of an EMG sensor, an inertial motion unit, a wearable electronic platform, a Bluetooth module and a lithium-polymer battery operated at 3.3 V, and (b) the camera controller application running on a desktop computer, controlled by the wearable armband attached to the right hand.
Basic knowledge about quadcopter (part 2)

Part 1 introduced the components needed to build a quadcopter. Today, I will talk about the basic knowledge I have gathered from the internet.

Orientation - Angles

Orientation

The quadcopter orientation can be defined by three angles: pitch, roll, and yaw. These angles determine the orientation and therefore the direction the quadcopter will take. Basically, changing the pitch makes the quadcopter go forward/backward, changing the roll banks it to the left/right, and changing the yaw makes it rotate around its vertical axis. The final parameter you need to control the attitude of the quadcopter is the throttle, which spins up all the brushless motors.

I will use these axes as reference:

Quadcopter motion

There are 2 kinds of quadcopter configuration, + and X, each with clockwise-rotating and counter-clockwise-rotating propellers.

I am using the quad X configuration.

Most commercially available quadcopters work in one of two possible modes:

  • Aerobatic Mode: This mode allows you to perform spins and flips on the quadcopter.
  • Attitude/Stable Mode: This is the preferred mode for beginners and the mode my quadcopter will run in. In this mode, data from the accelerometer and gyroscope are combined to calculate the quadcopter's angles; no spins or flips are performed.

Quadcopter motion along the different axes (a motor-mixing sketch follows the list):

  • Throttle control: The quadcopter moves up when you increase the speed of all motors and moves down when you decrease the speed of all motors.
  • Pitch control: The quadcopter moves forward when you increase the 2 front motors and decrease the 2 back motors; the reverse makes it move backward.
  • Roll control: The quadcopter banks left when you increase the speed of the 2 right motors and decrease the speed of the 2 left motors, and vice versa for banking right.
  • Yaw control: The quadcopter rotates left when you increase the speed of the front-right and back-left motors and decrease the rest; it rotates right when you speed up the front-left and back-right motors while the rest run at normal speed.
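To make this concrete, here is a minimal motor-mixing sketch for the quad X layout, following the sign conventions in the list above. The motor ordering and signs are my own assumptions and depend on the actual propeller rotation directions; on the real quadcopter this would run on the MCU, so this is only illustrative.

```python
def mix_quad_x(throttle, pitch, roll, yaw):
    """Per-motor commands for a quad-X frame, using the conventions from the list
    above (positive pitch = forward, positive roll = bank left, positive yaw =
    rotate left). All values are normalized and the output is clamped to [0, 1]."""
    motors = {
        "front_left":  throttle + pitch - roll - yaw,
        "front_right": throttle + pitch + roll + yaw,
        "back_left":   throttle - pitch - roll + yaw,
        "back_right":  throttle - pitch + roll - yaw,
    }
    return {name: min(max(value, 0.0), 1.0) for name, value in motors.items()}

# Front motors spin faster than back motors -> the quadcopter moves forward
print(mix_quad_x(throttle=0.5, pitch=0.1, roll=0.0, yaw=0.0))
```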

Quadcopter Flowchart

This is a flowchart for quadcopter programming that I will follow as a reference:

1. PWM Decoder Driver:

In order to interpret commands from a standard RC transmitter/receiver, a PWM decoder is needed for each channel of the transmitter to control the speed of the 4 motors and the flight modes.

2. Command Translator:

Depending on the flight mode, the PWM values given by the transmitter are interpreted differently. The Command Translator determines which mode the quadcopter is flying in and translates the commands accordingly.

3. Sensor Fusion:

Actually, the BNO055 contains a 32-bit ARM Cortex-M0+ microcontroller running Bosch Sensortec sensor fusion software, which means you do not need to implement the fusion part yourself. However, the BNO055 is expensive (around $17), so I may switch to a cheaper sensor (MPU-6000). For the fusion algorithm, there are, for instance, the Extended Kalman Filter (EKF) and the Complementary Filter. The reasons for implementing a fusion algorithm are:

  • Accelerometer - good over long durations (over short intervals it is noisy)
  • Gyroscope - good over short durations (over long intervals it drifts)

Therefore, the fusion algorithm combines the accelerometer and gyroscope to compensate for their individual weaknesses.

4. Proportional-Integral-Derivative (PID) Stabilization:

A PID controller is used to determine the fastest way to realize the desired flight behaviour from the transmitter commands and the sensor fusion values. A PID controller is very effective for controlling a system for which an accurate physical model is unknown. Using calculus to determine error slopes and areas, PID compensates for environmental noise and disturbances while overcoming steady-state error and oscillations.
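A minimal sketch of one PID axis (for example pitch) looks like this; the gains are placeholders that would have to be tuned on the real quadcopter, and the update would run once per control loop for each axis.

```python
class PID:
    """One-axis PID controller: output = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt                    # area under the error curve
        derivative = (error - self.prev_error) / dt    # slope of the error
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Placeholder gains: would be tuned on the real quadcopter
pitch_pid = PID(kp=1.2, ki=0.05, kd=0.4)
correction = pitch_pid.update(setpoint=0.0, measurement=3.5, dt=0.01)
print(round(correction, 3))
```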

5. PWM Encoder Driver:

Once everything has been computed, the PWM Encoder takes the control values and generates a pulse width modulated (PWM) output for each motor.

Drone Project Introduction (part 1)

When I was an undergraduate student, I saw my seniors playing with and trying to make something fly. At that time, I did not know what it was or how it could fly; they just told me it was called a drone. They talked in detail about the propellers, motors, MCU, controller, gyro and accelerometer, but of course, as a 2nd-year student I could not understand anything about these components or devices. It was just something that could fly. It has kept me curious and excited ever since, and now I finally have the chance to build one of my own.

There are a lot of drone projects that are open source, and even open hardware, available to users. However, in this project I hope to build a quadcopter by myself. Additionally, this project is related to my master's thesis, in which I am researching the use of a ring gesture-control device to control a drone.

I will update the progress in this blog.

Quadcopter parts

Here is the list of parts I have chosen for building the quadcopter:

Frame + Motors + ESC (Kits)

I chose the F450 DJI kit with:

  • Frame (282g)
  • Flame wheel integrated PCB wiring
  • 2312E 960KV Motors
  • 420E ESC (Electronic Speed Control)
  • Propellers 10 x 4.5in ; 8 x 4.5in

MCU - LPC1768

LPC1768 features:

  • High performance ARM® Cortex™-M3 Core
  • 96MHz, 32KB RAM, 512KB FLASH
  • Ethernet, USB Host/Device, 2xSPI, 2xI2C, 3xUART, CAN, 6xPWM, 6xADC, GPIO
  • 5V USB or 4.5-9V supply

IMU - BNO055

BNO055 Intelligent 9-Axis Absolute Orientation Sensor:

  • integrating a triaxial 14-bit accelerometer
  • a triaxial 16-bit gyroscope with a range of ±2000 degrees per second
  • Magnetometer typical ±1300μT (x-, y-axis); ±2500μT (z-axis)
  • a triaxial geomagnetic sensor and a 32-bit ARM Cortex M0+ microcontroller running Bosch Sensortec sensor fusion software
  • digital bidirectional I²C and UART interfaces

Transmitter and Receiver

FS-i6 Specifications:

  • Channels: 6 Channels
  • Model Type: Glider/Heli/Airplane
  • RF Range: 2.40-2.48GHz
  • Bandwidth: 500KHz
  • Band: 142
  • RF Power: Less Than 20dBm
  • Control Range: 500m
  • DSC Port: PS2; Output: PPM

LiPo Battery and LiPo Charger, balancer, and discharger

LiPo Battery (3S, 11.1V, 2200mAh)

LiPo Charger, balancer, and discharger

FPV (Receiver and Transmitter) and Camera:

First Person View - TS840 RC840:

Camera ELITE QB58 TX CAMERA COMBO:

Crazyflie

This one is my backup in case I cannot finish this project on time; then I would switch to this platform as the flight controller (MCU, IMU, controller). Crazyflie is a really interesting open-source project with a Python API for users who want to embed their own applications.
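As an illustration, here is a minimal connection sketch using the official cflib Python API; the radio URI and thrust value are placeholders, and this is the kind of hook my gesture controller could eventually drive.

```python
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = "radio://0/80/2M/E7E7E7E7E7"   # placeholder radio address

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache="./cache")) as scf:
    commander = scf.cf.commander
    commander.send_setpoint(0, 0, 0, 0)                  # unlock the safety protection
    for _ in range(50):                                   # send a modest thrust for ~1 second
        commander.send_setpoint(0.0, 0.0, 0.0, 37000)     # roll, pitch, yaw rate, thrust
        time.sleep(0.02)
    commander.send_stop_setpoint()                        # cut the motors before disconnecting
```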

GPS (optional)

If possible, I will add an auto-pilot mode with a GPS module to my quadcopter.