The Wireless Smart Gripper


Abstract In today’s world of advanced technology the best way of human machine interface is the touch screen interface. Beyond touch input one can think of a more comfortable input that is human voice. This paper features the design of a wireless robotic gripper for human assistance which is controlled by human voice commands. The robot’s vision is a camera. Image processing technique is used for the detection of the objects. The robotic arm is a four axis arm.

Keywords:

INTRODUCTION

The purpose of this project is to design a wireless gripper controlled by human voice commands; apart from the voice command, the gripper is fully autonomous. The workflow is as follows: first, the user gives a voice input; this voice signal is encoded into a digital format by a voice recognition unit, and the digital data is then transmitted wirelessly to the gripper. Once the gripper has acknowledged the data, the camera starts imaging, the processor processes the image, and it sends the corresponding control signals to a second controller. This controller drives the servo motors at each linkage of the gripper. The gripper carries sensors for feedback: tactile, pressure or force sensors are used to acknowledge whether the gripper has grabbed the object or not. Using this feedback signal, the camera and the gripper synchronize to complete the task.

The block diagram of the project consists of the following elements: the voice recognition unit, the wireless transmitter and receiver (Tx-Rx), the main processing (control) unit, the edge detection unit and the gripper unit.

LITERATURE

(A) Voice Recognition:

The objective of speech recognition is the transcription of speech into text, i.e., word strings. To accomplish this, one might wish to create word models from training data. However, in the case of large-vocabulary speech recognition, there are simply too many words to be trained in this way: several samples of every word, from several different speakers, are needed to create reasonable speaker-independent models for each word, and the process must be repeated for each new word added to the vocabulary. In this project, a bank of filters is implemented to extract features from the voice samples.

Features of the Chebyshev filter:

The primary attribute of Chebyshev filters is their speed, typically more than an order of magnitude faster than the windowed sinc. By using 8 band-pass filters, the 200 Hz to 1800 Hz range of the voice signal can be split into frequency components, and the word can then be detected using the Euclidean distance formula. Compared to a Butterworth filter, a Chebyshev filter achieves a sharper transition between the passband and the stopband with a lower-order filter, which produces smaller absolute errors and faster execution. The cut-off slope of an elliptic filter is steeper still, but its amplitude response has ripple in both the passband and the stopband, and its phase response is very nonlinear. The Chebyshev filter has only poles, while the elliptic filter has poles in the passband as well as zeros in the stopband; the elliptic filter therefore gives a faster increase in attenuation after the cut-off frequency, but it has "rebounds" in the frequency response which the Chebyshev does not have.
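As an illustration of this band-energy approach, the following MATLAB sketch splits an utterance across eight Chebyshev band-pass filters and compares the resulting energy vector with a stored template using the Euclidean distance. The band edges, the template file template.mat and the decision threshold are assumptions made only for the example and are not taken from the project code.

% Sketch only: band edges, template file and threshold are assumed values.
[y, fs] = wavread('hel.wav');                 % test utterance (same file as in the simulation code)
y   = y(:,1);                                 % use the first channel if the file is stereo
fs2 = fs/2;                                   % Nyquist frequency
bands = [200 400; 400 600; 600 800; 800 1000; 1000 1200; 1200 1400; 1400 1600; 1600 1800];
E = zeros(1, size(bands,1));
for k = 1:size(bands,1)
    [b, a] = cheby2(2, 20, bands(k,:)/fs2);   % Chebyshev type-II band-pass filter
    E(k)   = sum(filter(b, a, y).^2);         % energy captured by the k-th band
end
E = E / norm(E);                              % normalized feature vector
ref = load('template.mat', 'Eref');           % hypothetical stored reference vector
d = norm(E - ref.Eref);                       % Euclidean distance to the template
isMatch = d < 0.2;                            % hypothetical decision threshold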

(B) Wireless Communication:

Wireless communication means the transfer of information between two points that are not physically connected. Various transceiver modules are available to carry out such operations; the CC2500, HC-05 and BTM400 are a few among them. Further, communication with these modules can be established over different interfaces such as SPI and I2C.

SPI vs I2C:

SPI stands for Serial Peripheral Interface and I2C stands for Inter-Integrated Circuit. I2C is a two-wire bus used to enable communication between two or more devices that are normally on the same board; it operates in master-slave mode, in which the master is in either transmit or receive mode and the slave is in the complementary mode. SPI, on the other hand, is a synchronous serial data link standard that operates in full-duplex mode. In addition, SPI is not limited to 8-bit words; word sizes of up to 16 bits can be sent. It supports high-bandwidth links (around 1 Mbaud) between CPUs and other devices supporting SPI. A device connected to the SPI bus is classified as either a master or a slave: the master initiates an information transfer on the bus and generates the clock and control signals, while a slave is controlled by the master through a slave select (chip enable) line and is active only when selected. The SPI bus employs a simple shift-register data transfer scheme: data is clocked out of and into the active devices in a first-in, first-out fashion, and it is in this manner that SPI devices transmit and receive in full duplex. In our project, the doctor's voice signal is sampled and then transmitted wirelessly for processing at the gripper, where the voice recognition takes place. The distance from the doctor to the gripper is therefore small, but the amount of data being transmitted is large, i.e., burst data. Hence the CC2500, which allows burst transmission, is used.

Configuring CC2500:

The CC2500 is configured via a simple 4-wire SPI-compatible interface (SI, SO, SCLK and CSn) in which the CC2500 is the slave. This interface is also used to read and write buffered data. All address and data transfer on the SPI interface is done most significant bit first. All transactions on the SPI interface start with a header byte containing a read/write bit, a burst access bit and a 6-bit address. During address and data transfer, the CSn pin (chip select, active low) must be kept low; if CSn goes high during the access, the transfer is cancelled. The timing for the address and data transfer on the SPI interface is shown in Figure 7 with reference to Table 16. When CSn goes low, the MCU must wait until the CC2500 SO pin goes low before starting to transfer the header byte; this indicates that the voltage regulator has stabilized and the crystal is running. Unless the chip is in the SLEEP or XOFF state, or an SRES command strobe has been issued, the SO pin will always go low immediately after taking CSn low. The figure gives a brief overview of the different register access types possible.

Figure: CC2500 SPI interface circuit.
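As a small illustration of the header byte described above, the following MATLAB sketch assembles one, assuming the usual layout of the read/write flag in bit 7, the burst flag in bit 6 and the 6-bit register address in bits 5 to 0; the address value is only an example, and the authoritative definitions are in the CC2500 datasheet.

% Sketch only: bit positions follow the header description above; the address is an example.
rw    = uint8(1);               % 1 = read access, 0 = write access
burst = uint8(0);               % 0 = single access, 1 = burst access
addr  = uint8(hex2dec('0A'));   % example 6-bit register address
header = bitor(bitor(bitshift(rw, 7), bitshift(burst, 6)), addr);
dec2bin(header, 8)              % displays the bit layout of the assembled header byte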

(C) Image Processing:

Web Camera:

A camera is used to capture the real-time image. Our requirement is a camera whose capture speed is compatible with the processor we use, so that proper synchronization can be obtained. The speed of a camera is measured in fps, i.e., frames per second. CMOS camera modules with the required specifications are available in the market, but interfacing them is difficult, so it is more convenient to use a web camera.
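A minimal MATLAB sketch of grabbing frames from a web camera is given below; it assumes that MATLAB's USB webcam support package and the Image Processing Toolbox are installed, and the number of frames is arbitrary.

% Sketch only: requires the USB webcam support package.
cam = webcam;                  % connect to the first available web camera
for k = 1:10                   % grab a few frames (count is arbitrary)
    frame = snapshot(cam);     % capture one RGB frame
    gray  = rgb2gray(frame);   % edge detection works on the grey-scale image
    imshow(gray); drawnow;     % display the live frame
end
clear cam                      % release the camera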

Edge detection:

Digital image processing is a branch of the electronics domain in which an image is converted into an array of small integers, called pixels, each representing a physical quantity. Edges characterize boundaries, and edge detection is at the forefront of image processing. An edge is the boundary between an object and the background.

The causes of changes in intensity are as follows:

Geometric events:

Object boundary (discontinuity in depth and/or surface colour and texture)

Surface boundary (discontinuity in surface orientation and/or surface colour and texture)

Non-geometric events:

Specularity (direct reflection of light, such as a mirror)

Shadows (from other objects or from the same object)

Inter-reflections

Criteria for optimal edge detection

Good detection: the optimal detector must minimize the probability of false positives (detecting spurious edges caused by noise), as well as that of false negatives (missing real edges).

Good localization: the edges detected must be as close as possible to the true edges.

Single response constraint: the detector must return one point only for each true edge point; that is, minimize the number of local maxima around the true edge.

An edge in an image is a significant local change in the image intensity, usually associated with a discontinuity in either the image intensity or its first derivative. The four steps in the edge detection process are:

Smoothening: It suppresses as much noise as possible without destroying the true edges.

Filtering: Images are corrupted by noise such as salt-and-pepper noise, impulse noise and Gaussian noise. Because there is a trade-off between edge strength and noise reduction, filtering is performed.

Enhancement: It emphasizes pixels where there is a significant change in local intensity values and is usually performed by computing the gradient magnitude.

Detection: Many points in an image have a nonzero gradient value, but not all of these points are edges for a particular application. Thresholding is used to decide which points are edge points.
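The following MATLAB sketch illustrates the enhancement and detection steps with a Sobel gradient and a simple global threshold; the test image and the threshold value are assumptions, and the Image Processing Toolbox is required.

% Sketch only: test image and threshold are example values.
I  = im2double(rgb2gray(imread('peppers.png')));   % built-in MATLAB test image
h  = fspecial('sobel');                            % Sobel kernel for the gradient along y
Gy = imfilter(I, h,  'replicate');                 % gradient in the y direction
Gx = imfilter(I, h', 'replicate');                 % gradient in the x direction
G  = sqrt(Gx.^2 + Gy.^2);                          % gradient magnitude (enhancement)
edges = G > 0.3 * max(G(:));                       % global threshold (detection)
imshow(edges)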

Comparison between different types of edge detection techniques:

Classical (Sobel, Prewitt, etc.)
Advantages: simplicity; detection of edges and their orientation.
Disadvantages: sensitive to noise; inaccurate.

Laplacian (second directional derivative)
Advantages: detection of edges and their orientation; fixed characteristics in all directions.
Disadvantages: sensitive to noise.

Laplacian of Gaussian (LoG) (Marr-Hildreth)
Advantages: finds the correct places of edges.
Disadvantages: malfunctions at corners, curves and places where the grey-level intensity function varies.

Canny
Advantages: good localization and response; improved signal-to-noise ratio; better detection even in noisy conditions.
Disadvantages: complex computation; time-consuming.

The results observed after simulating edge detection with the different operators in MATLAB are shown in the corresponding output figures.

The Canny edge detector is carried out in the following four steps:

Image smoothing.

Gradient calculation.

Non-maximum suppression.

Thresholding with hysteresis.

1. In the first step, we smooth the image with a two-dimensional Gaussian. In most cases the computation of a full two-dimensional Gaussian convolution is costly, so it is approximated by two one-dimensional Gaussians, one in the x direction and the other in the y direction.
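A minimal MATLAB sketch of this separable approximation is shown below; the kernel size and sigma are assumed values, not those used in the project.

% Sketch only: two 1-D Gaussian passes in place of one 2-D convolution.
I  = im2double(imread('cameraman.tif'));   % built-in grey-scale test image
g  = fspecial('gaussian', [1 9], 1.5);     % 1-D Gaussian kernel, sigma = 1.5
Ix = imfilter(I,  g,  'replicate');        % smooth along the x direction
Is = imfilter(Ix, g', 'replicate');        % then along y: equivalent to the 2-D Gaussian
imshow(Is)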

2. In the second step, we take the gradient of the image. This shows changes in intensity, which indicate the presence of edges. The gradient actually gives two results: the gradient in the x direction and the gradient in the y direction.

3. The third step is non-maximum suppression. Edges occur at points where the gradient magnitude is at a local maximum, so all points that are not at a maximum should be suppressed. To do this, the magnitude and direction of the gradient are computed at each pixel. Then, for each pixel, we compare its gradient magnitude with the magnitudes one pixel away in the positive and negative gradient directions (perpendicular to the edge). If the pixel's magnitude is not greater than both of these neighbours, it is suppressed.

4. The fourth step is edge thresholding. The method of thresholding used by the Canny edge detector is referred to as "hysteresis" and makes use of both a high threshold and a low threshold. If a pixel's value is above the high threshold, it is set as an edge pixel. If a pixel's value is above the low threshold and it is a neighbour of an edge pixel, it is set as an edge pixel as well. If a pixel's value is above the low threshold but it is not a neighbour of an edge pixel, it is not set as an edge pixel. If a pixel's value is below the low threshold, it is never set as an edge pixel.
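These four steps are bundled in MATLAB's edge function; the sketch below calls it with example hysteresis thresholds and an example sigma, which are not the values used in the project.

% Sketch only: the [low high] thresholds and sigma are example values.
I  = rgb2gray(imread('peppers.png'));
BW = edge(I, 'canny', [0.05 0.20], 1.5);   % Canny with hysteresis thresholds and Gaussian sigma
imshowpair(I, BW, 'montage')               % original image next to the detected edges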

(D) GRIPPER:

The gripper mainly comprises five elements, which are as follows:

Number of axes:

The robotic arm is a four-axis robot in an articulated configuration, and all of its joints are rotary. The workspace of the robotic gripper is a quarter-sphere in the first quadrant of the 3-D axes.

Controller:

The controller used is an ATmega16 microcontroller. It is programmed to give specific PWM pulses to the joint servos so that each joint moves through the specific angle needed to complete the specified task.
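The mapping from joint angle to PWM pulse width is sketched below in MATLAB, assuming the typical hobby-servo convention of a 1 ms to 2 ms pulse over 0 to 180 degrees at a 50 Hz frame rate; the actual values depend on the servos and on the ATmega16 firmware.

% Sketch only: typical hobby-servo timing is assumed.
angle_deg = 60;                            % desired joint angle in degrees
pulse_ms  = 1 + (angle_deg / 180) * 1;     % pulse width between 1 ms and 2 ms
period_ms = 20;                            % 50 Hz servo frame period
duty      = pulse_ms / period_ms;          % duty cycle the controller must generate
fprintf('pulse = %.2f ms, duty cycle = %.1f %%\n', pulse_ms, duty*100);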

Actuator:

The actuators are electric DC servo motors, controlled using PWM pulses.

Feedback sensors:

The feedback sensors are needed to indicate whether the object has been grabbed or not. What makes the gripper smart is that, using these sensors, the gripping force can be adapted to different objects.

Workspace:

Figure: workspace sketch showing link lengths of 5 cm, a reach of R = 5 cm and a 60° joint angle.

SIMULATION & RESULT

Simulation result of Voice recognition unit:

Matlab code:

[y, fs, nbits] = wavread('hel.wav'); %read in the wav file

Ts = 1/fs;       % sampling period

fs2 = fs/2;      % Nyquist frequency, used to normalize the filter cut-offs

L = length(y);   % number of samples

sound(y,fs) %play back the wav file

t = 0:1/fs:length(y)/fs-1/fs; %create the proper time vector

subplot(331) %create a subplot

plot(t,y) %plot the original waveform

NFFT = 2^nextpow2(L);                 % FFT length: next power of two

Y = fft(y,NFFT)/L;                    % normalized FFT of the input

f = fs/2*linspace(0,1,NFFT/2+1);      % single-sided frequency axis

subplot(332)

plot(f,2*abs(Y(1:NFFT/2+1)));         % magnitude spectrum of the input

[B1, A1] = cheby2(2, 20, [50/fs2 350/fs2]);   % band-pass 50-350 Hz (cut-offs normalized to Nyquist)

filt = filter(B1, A1, y);                     % filter the input signal

subplot(333)

plot(t,filt)

xlabel('50-350');

[B2, A2] = cheby2(2, 20, [350/fs2 500/fs2]);  % band-pass 350-500 Hz

filt2 = filter(B2, A2, y);

subplot(334)

plot(t,filt2)

xlabel('350-500');

[B3, A3] = cheby2(2, 20, [500/fs2 750/fs2]);  % band-pass 500-750 Hz

filt3 = filter(B3, A3, y);

subplot(335)

plot(t,filt3)

xlabel('500-750');

[B4, A4] = cheby2(2, 20, [750/fs2 1000/fs2]); % band-pass 750-1000 Hz

filt4 = filter(B4, A4, y);

subplot(336)

plot(t,filt4)

xlabel('750-1000');

[B5, A5] = cheby2(2, 20, [1000/fs2 1500/fs2]); % band-pass 1000-1500 Hz

filt5 = filter(B5, A5, y);

subplot(337)

plot(t,filt5)

xlabel('1000-1500');

[B0, A0] = cheby2(2, 20, 50/fs2);             % low-pass below 50 Hz (cut-off normalized to Nyquist)

filt6 = filter(B0, A0, y);

subplot(338)

plot(t,filt6)

xlabel('<50');

[B6, A6] = cheby2(2, 20, 1500/fs2, 'high');   % high-pass above 1500 Hz (cut-off normalized to Nyquist)

filt7 = filter(B6, A6, y);

subplot(339)

plot(t,filt7)

xlabel('>1500');

OUTPUT:

CONCLUSION

We successfully simulated and analyzed various voice commands. The mechanical assembly of the robotic arm was implemented, and each joint was successfully controlled. We found it difficult to use a processor such as the ARM7 for voice and image processing, so for simplicity we used MATLAB to generate the specific signals for the gripper's ATmega16 microcontroller.


