Contents and References: Automatic Control of Model Helicopter Flight by Machine Vision
Table of Contents:
Chapter 1. 14
Research overview. 14
1-1- Research overview. 15
1-1-1- Defining the problem and research objectives. 16
1-2- Objectives. 16
1-3- Thesis organization. 16
Chapter 2. 18
Research background. 18
2-1- Research background. 19
2-2- Autonomous vehicles. 19
2-2-1- Internal intelligence. 19
2-2-2- External intelligence. 20
2-2-3- Combination of external intelligence with internal intelligence. 20
2-3- Image processing. 20
2-4- Tracking the object. 21
2-5- Challenges in object tracking. 22
2-6- Feature selection. 22
2-6-1- Color. 22
2-6-2- Edge. 23
2-6-3- Texture. 24
2-6-4- Object recognition. 24
2-6-5- Background subtraction. 25
2-6-6- Segmentation of images. 25
2-7- Edge detection and linking. 26
2-8- Edge-based active contour models. 26
2-9- Tracking moving objects. 27
2-10- Conclusion. 27
Chapter 3. 29
Research method, materials and methods. 29
3-1- Research method, materials and methods. 30
3-2- Software tools. 30
3-2-1- Control theory. 30
3-2-2- OpenCV computer vision library. 31
3-3- Thresholding. 32
3-3-1- Binary Thresholding. 32
3-3-2- Inverse Binary Thresholding. 32
3-3-3- Threshold to zero. 33
3-3-4- Inverted threshold to zero. 33
3-4- Arduino board. 34
3-5- Hardware tools. 34
3-5-1- Helicopter. 34
3-5-2- Kinect. 35
3-5-3- Arduino description. 37
3-5-4- Custom-built board. 37
Chapter 4. 39
System implementation. 39
4-1- System implementation. 40
4-2- Control through Arduino. 40
4-3-1- Reverse engineering of infrared signals. 40
4-3-2- PWM modulation. 41
4-3-3- Infrared packet structure. 41
4-4- Kinect depth map accuracy. 43
4-5- Results obtained. 44
4-6- Stability in the air. 46
4-7- Summary. 47
Chapter 5. 48
Discussion, conclusions and suggestions. 48
5-1- Digital potentiometer control. 49
5-2- Helicopter control through computer. 49
5-2-1- Tracking. 50
5-2-2- Extraction of helicopter position. 50
5-2-3- Edge tracking. 51
5-2-4- Camera used in the project. 51
5-2-5- Computer required for the project. 52
5-2-6- Color recognition method. 52
5-2-7- Color recognition and color filter. 53
5-2-8- Image smoothing. 54
5-3- Hardware module for communication with the computer used in the project. 54
5-4- Project programming environment
5-5- OpenCV library. 55
5-6- How the project works
5-7- Algorithm of the project. 56
5-8- Summary. 58
5-9- Proposals. 58
List of sources. 60
Appendices. 62
References:
[1] Aguilar-Ponce, R. (2007), Automated object detection and tracking based on clustered sensor networks. PhD thesis, University of Louisiana. AAI3294839.
[2] Altug, E., Ostrowski, J. P., and Taylor, C. J., (2003), Quadrotor control using dual camera visual feedback. In International Conference on Robotics & Automation, IEEE, pp. 4294-4299.
[3] Amidi, O., Mesaki, Y., and Kanade, T., (1993), Research on an autonomous vision guided helicopter.
[4] Andersen, M., Jensen, T., Lisouski, P., Mortensen, A., Hansen, M., Gregersen, T., and Ahrendt, P. (2012), Kinect depth sensor evaluation for computer vision applications. Tech. rep., Aarhus, Aarhus University.
[5] Benezeth, Y., Jodoin, P., Emile, B., Laurent, H., and Rosenberger C. (2008), Review and evaluation of commonly-implemented background subtraction algorithms. In Pattern Recognition, pp. 1-4.
[6] Fan, J., Yau, D., Elmagarmid, A., and Aref, W. (2001), Automatic image segmentation by integrating color-edge extraction and seeded region growing. IEEE Transactions on Image Processing, 1454-1466.
[7] Fieguth, P., and Terzopoulos, D. (1997), Color-based tracking of heads and other mobile objects at video frame rates. In Proc. IEEE Conf. on Computer Vision and Pattern Recognition, pp. 21-27.
[8] Heath, M., Sarkar, S., Sanocki, T., and Bowyer, K. (1996), Comparison of edge detectors: A methodology and initial study. In Computer Vision and Image Understanding, IEEE Computer Society Press, pp. 38-54.
[9] Jiang, X., and Bunke, H. (1999), Edge detection in range images based on scan line approximation. Computer Vision and Image Understanding 73, 183-199.
[10] McGillivary, P., Sousa, J., Martins, R., Rajan, K., and Leroy, F. (2012), Integrating autonomous underwater vessels, surface vessels and aircraft as persistent surveillance components of ocean observing studies. In IEEE Autonomous Underwater Vehicles, Southampton, UK.
[11] Moeslund, T. B., and Granum, E. (2001), A survey of computer vision-based human motion capture. Computer Vision and Image Understanding 81, 3, 231-268.
[12] Ning, J., Zhang, L., Zhang, D., and Wu, C. (2009), Robust object tracking using joint color texture histogram. IJPRAI, 1245-1263.
[13] Nummiaro, K., Koller-Meier, E., and Gool, L. V. (2003), Color features for tracking nonrigid objects. Special Issue on Visual Surveillance, Chinese Journal of Automation 29, 345-355.
[14] OpenCV. (2012), Basic thresholding operations.
[15] Pavlidis, T., and Liow, Y. T. (1990), Integrating region growing and edge detection. IEEE Trans. Pattern Analysis and Machine Intelligence, 225-233.
[16] Schmid, C. (2005), Introduction into system control.
[17] Solomon, C. J., and Breckon, T. P. (2010), Fundamentals of Digital Image Processing: A Practical Approach with Examples in Matlab. Wiley-Blackwell. ISBN-13: 978-0470844731.
[19] Sonka, M., Hlavac, V., and Boyle, R. (1999), Image Processing, Analysis and Machine Vision, 2 ed. Brooks/Cole.
[20] Thrun, S., Montemerlo, M., Dahlkamp, H., Stavens, D., Aron, A., Diebel, J., Fong, P., Gale, J., Halpenny, M., Hoffmann, G., Lau, K., Oakley, C., Palatucci, M., Pratt, V., and Stang, P. (2006), Stanley: The robot that won the DARPA Grand Challenge. Journal of Field Robotics 23, 661-692.
[21] Xu, R. Y. D., Allen, J. G., and Jin, J. S. (Darlinghurst, Australia, 2004), Robust real-time tracking of nonrigid objects. In Proceedings of the Pan-Sydney area workshop on Visual information processing, VIP '05, Australian Computer Society, Inc., pp. 95-98.
[22] Yilmaz, A., Javed, O., and Shah, M. (2006), Object tracking: A survey. ACM Computing Surveys (CSUR) 38, 45.
[23] Arduino Home Page, http://www.arduino.