B. Sc. in Bio-industrial Mechatronics Engineering National Taiwan University


Source: robinhsieh.com/wp-content/uploads/2013/12/Portfolio_3.pdf
Wearable Gait Analysis System

http://robinhsieh.com/?p=1115

This gait analysis system uses four force sensors and two encoders to measure plantar force and joint angles. The measured data can be uploaded to the Internet via a laptop carried in a backpack. For analysis, the software we developed can detect gait events such as initial contact and toe-off, and it also computes the center of the ground contact force. The performance of the system was validated against a VICON motion capture system and force plates. I presented this work at the 2012 IEEE EMBC.

Force sensors embedded inside shoe insole
Angle displacement sensor with wearable brace
NI DAQ card
Remote-controlled measuring process

A 4-channel circuit was built to amplify and filter the signals from the force sensors, and a 2-channel circuit was built to measure the divided voltage of the potentiometers. A USB data acquisition device transmitted all the data to the host laptop at a 1000 Hz sampling rate. During the experiment, the circuits, the DAQ device, and the laptop were all carried by the subject in a backpack. After each trial, the data were uploaded to the Internet immediately and could be downloaded by any other computer for offline analysis.

For gait analysis, a novel algorithm was developed to detect initial contact (IC) and toe-off (TO) in order to separate the stance phase from the swing phase. After the gait phases were detected, the program would further segment the stance phase into four gait periods: loading response (LR), mid-stance (MS), terminal stance (TS), and pre-swing (PS).
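
The published algorithm is more involved than this, but the core idea of detecting IC and TO from plantar force can be sketched as a simple threshold crossing. The threshold value and the synthetic force trace below are illustrative assumptions, not the parameters used in the actual system:

```python
def detect_gait_events(total_force, threshold=20.0):
    """Detect initial contact (IC) and toe-off (TO) from summed plantar
    force samples: IC when force rises above the threshold, TO when it
    falls back below. Returns lists of sample indices."""
    ic, to = [], []
    loaded = total_force[0] > threshold
    for i in range(1, len(total_force)):
        above = total_force[i] > threshold
        if above and not loaded:
            ic.append(i)   # foot just touched down
        elif not above and loaded:
            to.append(i)   # foot just left the ground
        loaded = above
    return ic, to

# Synthetic force trace: swing (0 N), stance (~400 N), swing again
trace = [0.0] * 50 + [400.0] * 60 + [0.0] * 40
ic, to = detect_gait_events(trace)
```

With IC and TO indices in hand, the stance phase is simply the span between each IC and the following TO, which is what the phase-segmentation step then subdivides.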



The gait of two healthy subjects (males, 21 and 22 years old) was measured and analyzed. Each experiment started as soon as the subject triggered the remote controller. During the experiment, each subject was guided by a computer voice generated by the laptop in the backpack. The program would ask the subject to stand still for 10 seconds and then walk for 10 seconds. The figures below show sample analysis results in which the gait events were detected and marked.

Upper: Segmentation of gait events during gait. Lower: Angles of hip joint and knee joint.

The trajectory of the center of the ground contact force was plotted in the figure below. The timing information of IC and TO was used to determine the required region.

Calculation of center of ground contact force. Each circle corresponds to a percentage of the gait cycle.


To validate the results, one subject was asked to undertake the experiment for five trials simultaneously with the proposed system, a VICON motion capture system, and a force plate. The experiments showed that the behavior of the joint angles was similar to that of the VICON system, and the average error in the timing of initial contact was less than 90 ms.

The comparison between our system and the VICON system and force plate indicated similar behavior. In addition, the low-cost, wearable, portable, and easy-to-use features of the system make it suitable for outdoor experiments. I presented this work at the 2011 Symposium of Biomechatronic Engineering (BIOME) as a poster paper during my senior year, where it won the best poster award in the student poster competition. After the symposium, we kept refining the system for better performance, and we submitted an update on the project's progress (presented in this article) to the 34th International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC 2012), where I gave a presentation.

C. W. Chang, T. H. Hsieh, A. C. Tsai, and T. T. Lin, "Development of a Portable Plantar Pressure Measurement System," in Proceedings of 2012 Symposium of Biomechatronic Engineering, Tainan, Taiwan, October 2012. (in Traditional Chinese) Best Poster Award

T. H. Hsieh, A. C. Tsai, C. W. Chang, K. H. Ho, W. L. Hsu, and T. T. Lin, "A Wearable Walking Monitoring System for Gait Analysis," 34th International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), San Diego, CA, USA, August 2012.

T. H. Hsieh, A. C. Tsai, and T. T. Lin, "Design and Analysis of a Mobile Gait Monitoring System," in Proceedings of 2011 Symposium of Biomechatronic Engineering, Chia Yi, Taiwan, October 2011. (in Traditional Chinese) Best Poster Award


Powered Exoskeleton

http://robinhsieh.com/?p=1068

After studying the biomechanics of gait and developing a wearable gait monitoring system, we began to design a powered exoskeleton for the lower extremities. My job was to help design the exoskeleton mechanism and develop a program to control it. Using the gait data from our gait monitoring system, I developed a program that can control the exoskeleton to simulate a human-like gait.

Harmonic drives
Can simulate a human-like gait
Bluetooth communication

We designed the mechanism based on anthropometric data. The lengths of the braces between the hip and knee joints and between the knee and ankle joints can be adjusted. We placed two DC motors with harmonic drives on the hip and knee joints. The reduction ratio of the harmonic drives and the torque of the motors were chosen based on biomechanical data for level walking. Two limit switches were placed on each joint to prevent hyperextension and hyperflexion, and two physical stops were also placed on each joint in case the limit switches fail. The figure below shows one of our earlier designs. At first, we actuated only the left-hand side of the operator, while several potentiometers were placed on the right-hand side. The idea behind this design was to determine whether we could use the gait data from the right-hand side for the motion control of the left-hand side.



However, we later found that this design would be difficult to apply clinically, because a person suffering from gait pathology on one side would also have impaired performance on the other side; thus, we could not rely on the gait data from one side to control the other. We then changed our strategy to something more suitable: the powered side of the exoskeleton guides the operator to step in a fixed gait pattern obtained from the previous study, while the speed of the gait pattern can be adjusted by the operator's pace on the passive side.

The control circuit contains two DC motor controllers, two relays, a remote-controlled relay, an AC-to-DC power supply, and a fuse. The power of the exoskeleton can be switched on and off either manually or by remote control. Once the power is on, the exoskeleton can be controlled by the program. A personal computer (PC) was used to control the exoskeleton via a Bluetooth connection.

I developed a program using LabVIEW to control the exoskeleton. Because each motor came with a high-level controller that has an internal closed-loop controller, I was able to figure out how to control the motors fairly quickly. First, the program loads the target gait pattern, as shown in the figure below.

The target gait patterns shown above differ from the gait patterns of level walking. As mentioned, the function of the exoskeleton is to guide the operator to make a step; therefore, the gait pattern is a step rather than a full gait cycle.
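
The control flow of loading a step pattern and streaming joint setpoints to the motors' internal closed-loop controllers can be sketched as follows. Note that `send_setpoint` is a hypothetical placeholder for the vendor controller's command interface, and the gait pattern here is synthetic rather than the recorded data used in the project:

```python
import math
import time

def load_gait_pattern(n_points=100):
    """Return a synthetic single-step pattern as (hip, knee) angle
    pairs in degrees. A real pattern would be loaded from the gait
    data recorded with the monitoring system."""
    return [(20.0 * math.sin(2 * math.pi * i / n_points),
             max(0.0, 50.0 * math.sin(2 * math.pi * i / n_points)))
            for i in range(n_points)]

def send_setpoint(joint, angle_deg):
    # Placeholder for the motor controller's position command; the
    # controller's internal closed loop then tracks the setpoint.
    pass

def play_step(pattern, period_s=1.0):
    """Stream the pattern at a fixed rate; period_s would be scaled
    by the operator's pace on the passive side."""
    dt = period_s / len(pattern)
    for hip, knee in pattern:
        send_setpoint("hip", hip)
        send_setpoint("knee", knee)
        time.sleep(dt)

pattern = load_gait_pattern()
```

Scaling `period_s` is where the passive side's measured pace would enter: a faster pace shortens the period, so the powered side follows the same trajectory at the operator's chosen speed.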

This project is still under development, with ongoing revisions of both the hardware and software. I helped develop the system through its third design version; the project is now in its fourth revision.


InMoov Bionic Arm

http://robinhsieh.com/?p=334

InMoov is an open-source project for a bio-mimetic humanoid robot whose assembly sketches can be downloaded from Thingiverse. I built this arm in cooperation with 3DMaker Inc. to demonstrate how 3D printers can be used to build robots. All of the parts of the arm were made with a 3D printer. The arm has three modes of operation: FreeDemo, RemoteControl, and PaperScissorsStone. It was exhibited at Maker Faire Taiwan 2013.

A 3D-printed robot
Can be remote controlled by a glove
Able to play Paper-Scissors-Stone with a person
Exhibited at Maker Faire Taiwan 2013

Most of the components of the arm were made with a 3D printer (Afinia H-Series), except the bottom container, which was made from acrylic sheets. The bottom container houses an Arduino Mega 2560 board, an AC-to-DC power supply, and a custom-made prototyping shield. The shield made it easier to connect the servo motors to the Arduino and allowed them to be powered by the AC-to-DC power supply, since the current from the Arduino was insufficient to drive all of the servo motors we used. The arm uses five servo motors, one per finger. We tied two fishing lines to the servo arm of every servo motor so that a single servo motor could make a finger perform both flexion and extension.

I developed the program for the arm using LabVIEW 2011. The program has three modes of operation: FreeDemo, RemoteControl, and PaperScissorsStone. The program shows the main window when it starts; I can choose which mode to run in the main window, and the corresponding window pops up. In FreeDemo mode, the arm simply performs fixed patterns of motion repeatedly, so the program is very simple; I prepared this mode in case I was not close enough to the robot arm to give instructions. For the RemoteControl mode, I designed a "MagicGlove" to control the robot arm. To measure finger motions, I used five dFlex flex sensors made by Dexter Industries. The dFlex flex sensors are variable resistors and are very easy to use.
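
Since the flex sensors are simply variable resistors, mapping a glove reading to a servo command reduces to a clamped linear interpolation. The calibration numbers below are illustrative assumptions; each dFlex sensor would be calibrated individually:

```python
def flex_to_servo_angle(raw, raw_min=300, raw_max=700,
                        servo_min=0.0, servo_max=180.0):
    """Linearly map a flex-sensor reading to a servo angle.
    raw_min/raw_max are per-sensor calibration values (illustrative
    here): a straight finger gives a low reading, a bent one a high
    reading."""
    raw = min(max(raw, raw_min), raw_max)           # clamp to range
    span = (raw - raw_min) / (raw_max - raw_min)    # 0.0 .. 1.0
    return servo_min + span * (servo_max - servo_min)

angle = flex_to_servo_angle(500)   # midpoint reading -> 90.0 degrees
```

Clamping matters in practice: sensor noise outside the calibrated range would otherwise command the servo past its travel and strain the fishing lines.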



I used a LEGO MINDSTORMS NXT to read data from the sensors and transmit it to a PC via Bluetooth. Since the NXT has only four input ports, I used a sensor multiplexer from HiTechnic to provide additional ports. I chose the dFlex sensors, the NXT, and the sensor multiplexer because I had owned them for years and therefore did not need to buy additional parts. Furthermore, because LabVIEW supports Bluetooth connections with the NXT, I could easily establish the Bluetooth connection between the NXT and the PC. Once the program starts, it asks the user to enter the name of the NXT in order to make a Bluetooth connection. After the connection is made, the NXT keeps sending data back to the PC, and the program maps the data to the rotation angles of the servo motors.

In PaperScissorsStone mode, the robot arm can literally "play" paper, scissors, stone. I used a webcam to capture the hand gesture of a person, while the robot arm simultaneously makes paper, scissors, or stone at random. The robot uses a computer voice (recorded from Google Translate) to guide participants through the game. I used edge detection to count the number of fingers in order to distinguish between hand gestures. Before performing edge detection, the raw image captured from the webcam needed pre-processing, as follows:

1. Color Plane Extraction: Projects the raw image onto the HSL plane in order to reject disturbances from luminance variation.
2. Erosion and Dilation: Eliminates small objects in the image.
3. Auto-threshold: Performs auto-thresholding (clustering method) in order to find bright objects.
4. Histogram Equalization: Improves the image contrast and readies it for edge detection.


When I tried edge detection after the image pre-processing, I found that knuckles also have edges, making it difficult to distinguish between hand gestures. Therefore, a detected object should be "long enough" to be considered a finger rather than a knuckle. I believed this could be done by comparing the detected length of an object with the total length of the palm. I calculated the linear average of the processed image and used differentiation to separate out the palm.

The last problem was how to map the analyzed result (the number of fingers) to the hand gestures. The truth table for this is presented in the table below. I then used combinational logic to realize this truth table in LabVIEW.
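
The original truth table is not reproduced here, so the mapping below is an assumption based on the usual gesture conventions (zero extended fingers for stone, two for scissors, five for paper); in LabVIEW this same table was realized with combinational logic rather than a lookup:

```python
# Assumed finger-count-to-gesture table; counts outside the table are
# treated as unrecognized so the game can ask the player to try again.
GESTURES = {0: "stone", 2: "scissors", 5: "paper"}

def classify(finger_count):
    """Map a detected finger count to a gesture name."""
    return GESTURES.get(finger_count, "unknown")
```

Treating unexpected counts (one, three, or four fingers) as "unknown" gives the vision pipeline a graceful failure mode when edge detection miscounts.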

This robot arm was a great success at Maker Faire Taiwan 2013. Visitors were fascinated by its design and functionality. RemoteControl turned out to be the most popular mode, and many people asked to try the MagicGlove. I spent plenty of time explaining the theory behind the robot arm to different people. I was exhausted after the event, but it was totally worth it!



LEGO Atomic Force Microscope

http://robinhsieh.com/?p=3150

I led a group of four students to build this LEGO AFM under Dr. Yen-Wen Lu's supervision. I designed both the hardware and software, while the other group members helped design lesson plans so that the AFM can be used as a teaching tool in the future. As the AFM is perhaps one of the most fundamental and widely used instruments in nanoscience and nanotechnology, the introduction of this LEGO AFM will benefit nanoscience education in both its theoretical and experimental aspects. We submitted this work to the International Journal of Automation and Smart Technology, where it is currently under revision. A more detailed description of this work will be available in February 2014.

Constructed based on the design of a real AFM
Adjustable scanning resolution
Intuitive graphical user interface programmed in LabVIEW
Self-made laser module for the LEGO NXT

T. H. Hsieh, Y. C. Tsai, C. J. Kao, Y. M. Chang, and Y. W. Lu, "A Conceptual Atomic Force Microscope Using LEGO for Nanoscience Education," International Journal of Automation and Smart Technology, in revision.


Muscle Inspired Actuator

http://robinhsieh.com/?p=415

When I was taking the course "Advanced Biomechanics," I proposed a hypothetical design for a new type of actuator using solenoids, along with a way to apply it in a powered ankle-foot exoskeleton, as my final term project. The design was intended as a device that can augment human performance.

Uses solenoids arranged in a bipennate form to generate force
Utilized inverse dynamics to estimate performance
This work won first prize in the class presentation

The system architecture is shown in the figure below. The power-management circuit consists of voltage regulators that stabilize the power source, providing steady 24-volt and 5-volt outputs. Two boost converters bring these two output voltages up to 160 volts and 15 volts, respectively. The driver circuit for the solenoids is a metal-oxide-semiconductor field-effect transistor (MOSFET) array. To detect gait events, force sensitive resistors (FSRs) were considered. Once the push-off phase is detected, digital signals are sent to the MOSFET array, allowing the energy stored in the capacitors to discharge through the solenoids. The whole system would be controlled by an embedded computer, a single-board RIO; the sbRIO-9642 was considered for this project because of its abundant I/O ports. The sbRIO would analyze the signals from the FSRs, control the driver circuit based on the analysis, and control the charging of the capacitors.
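
The trigger logic the sbRIO would run can be sketched as a per-iteration decision: fire the MOSFET array only when the FSR readings indicate push-off. The push-off criterion and the thresholds below are illustrative assumptions (heel unloaded while the forefoot is still loaded), not values from the original design:

```python
def fire_solenoids():
    """Placeholder for driving the MOSFET array high, discharging the
    capacitors through the solenoid packages."""
    return "FIRE"

def control_step(heel_force, toe_force, heel_th=50.0, toe_th=100.0):
    """One control-loop iteration. Push-off is assumed here to be the
    moment when the heel FSR is unloaded while the toe FSR still
    carries load; thresholds are illustrative, in newtons."""
    if heel_force < heel_th and toe_force > toe_th:
        return fire_solenoids()
    return "IDLE"
```

In the real system this decision would run on the sbRIO's FPGA side so the discharge timing is deterministic relative to the detected gait event.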


Similar to sarcomeres, the proposed actuator system consists of many cellular units made of solenoids. This novel actuator system has several advantages: 1) each solenoid has its own "ON" or "OFF" state, like muscle fibers, producing tension in the ON state and simply relaxing (generating no tension) in the OFF state, which makes the system easier to control; 2) solenoids can produce high peak forces at fast speeds; and 3) they do not interfere with the operator's movement in the OFF state. To reduce the complexity of the analysis, a uniarticular mechanism was designed with a pennation angle of 30°; it functions as an additional soleus muscle and can therefore provide an additional plantar-flexion moment at the ankle joint during push-off, as illustrated in the figure above. The packages were expected to attach to an ankle-foot orthosis (AFO) to provide the additional moment at the ankle joint. Image credit for the AFO: Carbon Express LLC.

The free body diagram can be drawn as below. We assigned values for the following parameters: d1 = 0.1 m, d2 = 0.03 m, α = 12 rad/s², I = 0.05 kg·m², and mg = 24.5 N, which gives M = 35.04 N·m. For a person weighing 75 kg, the device can provide 0.47 N·m/kg during push-off. The calculation above was based on hypothetical data; nevertheless, it suggests that the additional torque the device provides should be sufficient to overcome its own dynamics and still deliver useful assistance.
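
The body-mass normalization quoted above follows directly from the computed moment; a quick check (using only the numbers stated in the text, not re-deriving the moment balance itself):

```python
# Normalize the computed ankle moment by the wearer's mass
M = 35.04        # N·m, joint moment from the free-body analysis above
mass = 75.0      # kg, assumed wearer mass from the text
specific_torque = M / mass   # ≈ 0.467 N·m/kg, the 0.47 quoted above
```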

T. H. Hsieh, "Muscle Inspired Actuator Using Solenoids: Design of a Powered Ankle-foot Exoskeleton," 2013.


Machine for Verifying Store Receipts

http://robinhsieh.com/?p=377

In Taiwan, we have a lottery system for store receipts. Every receipt carries a unique 8-digit number, and the government announces the winning numbers every two months. The prize money ranges from NT$200 to NT$10,000,000 (US$6.67 to US$333,333), depending on the matching criteria. With my classmates, I built an automated machine that verifies whether the number on a receipt is a winner. Many people have attempted to build an automated receipt-verifying machine; however, ours is the first and only one that is fully automated.

Able to scroll receipts automatically
An algorithm that filters the number out of an image
Recognition accuracy of 91.41%
Can download the latest winning numbers from the Ministry of Finance's website

This machine was originally our final term project for the course “Principles and Applications of Microprocessors.” We modified an old printer to scroll receipts, and designed the corresponding circuits and programs in order to verify them. Although we received first prize in the class competition, I was not satisfied with the machine because it had many drawbacks.

After the class was over, we started to build a second-generation machine in our spare time, using parts taken from a bill counter. I used LabVIEW to develop the program and used an Arduino together with a TA7291P to control a DC motor. I then established a database containing 291 images of store receipts.

The biggest challenge in developing the program was that the location of the number varied between captured raw images while the machine was in operation. As a result, I needed to isolate the regions containing numbers before applying optical character recognition (OCR). After trying numerous methods, I found a suitable algorithm that could verify the winning number.



The methods I used are described and illustrated below.

1. Color Plane Extraction (HSV): Because the target number on a store receipt is printed in black and is usually the darkest color on the receipt, I first extract the value (V) color plane.

2. Threshold Based on Histogram: Although the histogram of the value color plane did not strictly follow a Gaussian distribution, I found that setting the upper threshold to three standard deviations below the mean effectively removed brighter pixels while keeping darker pixels.
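
A minimal sketch of this histogram-based threshold, using NumPy in place of the LabVIEW vision tools actually used (the synthetic image below is an assumption for illustration):

```python
import numpy as np

def dark_pixel_mask(value_plane):
    """Keep pixels darker than (mean - 3*std) of the value plane,
    i.e. set the upper threshold three standard deviations below the
    mean, as described above."""
    v = value_plane.astype(float)
    upper = v.mean() - 3.0 * v.std()
    return v <= upper

# Synthetic receipt: mostly bright paper with a few dark "ink" pixels
img = np.full((100, 100), 200.0)
img[40:42, 10:30] = 5.0        # simulated printed digits
mask = dark_pixel_mask(img)
```

Because printed digits occupy only a small fraction of the frame, the mean sits near the paper brightness and the 3σ margin pushes the threshold well below it, so only the ink survives.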

3. Logarithmic Remapping Operation: After thresholding, I applied the logarithmic remapping operation twice to extend the contrast for small pixel values and compress it for large pixel values.

4. Calculate Linear Averages along the X and Y Axes: This is the most important step in the whole algorithm. First, the linear averages along the X axis, which represent the average pixel intensity of each column, were computed so that I could cut the black areas from both the left and right sides of the image. Second, the linear averages along the Y axis, together with their mean value, were computed; the calculation of the mean value included only the positive linear averages, and it is indicated by the red line in the figure below. Based on this mean value (the red line), I filtered the regions that contained words: if the peak linear-average value of a region was lower than the mean value, the whole region was eliminated, and if the peak value was higher than the mean value, the whole region was kept.
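
The Y-axis half of this step can be sketched as follows, again with NumPy standing in for the LabVIEW vision tools; the toy image and band sizes are assumptions for illustration:

```python
import numpy as np

def filter_rows_by_profile(binary):
    """Compute the average intensity of each row (the linear average
    along the Y axis), take the mean of the positive averages (the
    'red line'), and zero out any contiguous band of rows whose peak
    average falls below that mean."""
    profile = binary.mean(axis=1)
    positive = profile[profile > 0]
    cutoff = positive.mean() if positive.size else 0.0
    out = binary.copy()
    r = 0
    while r < len(profile):
        if profile[r] > 0:
            start = r
            while r < len(profile) and profile[r] > 0:
                r += 1                      # walk the contiguous band
            if profile[start:r].max() < cutoff:
                out[start:r] = 0            # weak band: drop region
        else:
            r += 1
    return out

img = np.zeros((10, 10))
img[2:4, :] = 1.0      # strong band, like a line of printed digits
img[7, 0] = 1.0        # faint band, like a speck of noise
cleaned = filter_rows_by_profile(img)
```

The strong band's peak exceeds the cutoff and survives, while the faint band is eliminated as a whole region, mirroring the keep-or-drop rule described above.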


5. Performing OCR on Each Region: Utilizing the results from step 4, I built a character set file using LabVIEW Vision Assistant; therefore, as long as an image contained the desired number, the program could identify it immediately. The OCR was then applied to each region from top to bottom, as shown in the figure below. If a region does not contain the desired number, the program identifies it as "not a number." Once the desired number is detected and identified, the program stops processing the remaining regions.

6. Verifying the Winning Number: The table below is an example showing the winning numbers and matching criteria for May and June 2013. Based on these rules, I developed an algorithm to verify whether a number is a winner.


7. Mapping Results to the Raw Image: After the verification process is completed, the program maps the result back to the raw image.

The program can download the winning numbers from the Ministry of Finance's website, and the user can also set them manually. Using the computer in our lab (Intel i7-860 CPU with 16 GB RAM), it usually takes less than 80 ms to verify a receipt, including image capture from the webcam. Below is the UI of the program.


Robot Education

http://robinhsieh.com/?page_id=203

Believing that robots are a great tool for STEM education, I founded CAVE Education with David Tseng in order to promote robot education in Taiwan. We are now one of the best-known groups in Taiwan, and I am often invited by educational institutions ranging from elementary schools to colleges to give lectures, hold workshops, or train teachers.

The educational purpose of bringing robotics into the classroom in Taiwan is similar to that of the United States (one of the countries on which Taiwan's educational policies are modeled), although there are some marked differences. Unlike the United States, where there is a shortage of students with STEM degrees (Undergraduate STEM Education Report, President's Council of Advisors on Science and Technology, 2012), high school students in Taiwan usually opt for STEM-related departments when entering university. This trend is driven by the industrial structure of Taiwan and by traditional Taiwanese values rather than by the students' own interests. Furthermore, the educational system in Taiwan has always emphasized theory over practice, mathematics and memorization over creative thinking, and exams over group projects. The long-term result is that students learn only passively and have no idea how to apply their knowledge to real-world situations. Therefore, in promoting robot education in Taiwan, our goal is not only to encourage young students to choose STEM fields at university, but also to provide a stimulus for students who have already majored in a STEM field to learn relevant skills and cultivate their problem-solving abilities.

We published our first book, New Horizons of Robots: NXC and NXT, in 2009.

Although there are many standard resources available for the LEGO MINDSTORMS robotics set, such as custom programming languages and lesson plans that enable advanced applications and a more versatile curriculum, all of these were developed or written in English and, consequently, are difficult for the Chinese-speaking community to access. Therefore, our first task was to write a book in Chinese called New Horizons of Robots: NXC and NXT (2009), which introduced one of the most popular custom programming languages, Not eXactly C (NXC), currently maintained by John C. Hansen. Since then, because we were the first group to focus on advanced applications of LEGO robots, we have been frequently invited by educational institutions ranging from elementary schools to colleges to give lectures, hold workshops, or train teachers. In 2009, we were invited to attend the LEGO Engineering Conference in Singapore, where David gave a keynote speech about using NXC with high school students. Besides publishing books and holding workshops, we have also established a blog where we share the latest news on the robotics industry and educational robots.


Left: At the LEGO Engineering Conference in 2009, we met Dr. Chris Rogers, the director of the Center for Engineering Education Outreach at Tufts University. Right: Conducting an NXC workshop at National Taichung Girls' Senior High School, the first school where I held a workshop.

Currently, our blog has more than 200 visitors per day. As more people have joined our group over the years, we have been able to develop more resources and have gradually become one of the best-known groups in Taiwan. Over the past few years, I have been invited to numerous places to hold workshops for students and teachers, lead summer camps, coach students preparing for robotics competitions, and give lectures at senior high school science clubs.



Since the NXC book, we have published several books on different programming languages for controlling the NXT, such as LabVIEW and leJOS (Java). We have also published a second edition of the NXC book, which has been used extensively by senior high schools and some colleges as teaching material or as a reference for their robotics courses. For instance, it was used as a guidebook by National Taiwan Normal University's Department of Applied Electronic Technology for the course "Robot Programming in C."

Although we consider the LEGO robot a great tool for STEM education, we have also used other tools as teaching materials, some of which can be combined with LEGO robots for advanced applications. For example, we dug into Android programming when the system became prevalent on the market, and in 2011 we published Android/NXT: Using Smart Phones to Control Your Robots. Furthermore, when MIT App Inventor came out, a graphical programming language powerful enough to develop polished Android programs yet easy to use, we spent a lot of time developing related teaching material. We published a two-volume book: Android Easy Programming: App Inventor for Beginners and Android Easy Programming: App Inventor for Programming Robots. The latter volume contains a foreword by Professor Hal Abelson, the director of the MIT App Inventor project and co-director of the MIT Center for Mobile Learning; he wrote it after one of his students, who knows Chinese, reviewed the book's contents for him. For even greater accessibility, we established the App Inventor TW website for the Chinese-speaking community. In recent years, we have also focused on popular prototyping platforms such as Arduino and Raspberry Pi; for example, we wrote LabVIEW for Arduino: A Perfect Combination of Control and Applications, which came out in November 2013.

In recent years, we have published several books that not only related to LEGO robots, but also other useful tools such as App

Inventor, Android and Arduino, some of which can be combined with LEGO robots to build more advanced applications.


These efforts only become truly valuable when they motivate students to start exploring the world of science. I believe students are stimulated far more by hands-on experience than by tough mathematics and theoretical learning, especially now that so many great tools, in both hardware and software, are available to bridge theory and real-world practice.

Once, while teaching a junior high school student the concept of proportional control, I asked him to design a line-following robot based on it. Although the concept itself was not difficult to grasp, he struggled for a while because he had no idea how to implement it in a program. During the process, he told me, "This is so hard, but I really love it." It is rare to hear a student say this, especially in the learning environment in Taiwan. That was the first time I truly understood the purpose of robot education, and I felt fortunate to be a part of it.
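The proportional-control idea behind such a line follower can be sketched in a few lines. The function and constants below are illustrative only, not the student's actual robot program:

```python
# Minimal sketch of a proportional line-following step.
# A single light sensor reads the floor; "target" is the reading
# at the edge of the line. All names/values here are hypothetical.

def follow_step(light_reading, target, kp, base_speed):
    """Return (left_power, right_power) for one control step."""
    error = light_reading - target      # deviation from the line edge
    correction = kp * error             # steering proportional to error
    return base_speed + correction, base_speed - correction
```

On the line edge the error is zero and the robot drives straight; off the edge, the wheel powers diverge in proportion to the error, steering the robot back.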

T. H. Hsieh and C. H. Tseng, "The Programming Software for Hands-on Robot Education," in Proceedings of the 43rd International Symposium on Robotics (ISR), Taipei, Taiwan, August 2012.

C. H. Tseng, W. H. Wu, M. Y. Lu, T. H. Hsieh, H. Y. Hsueh, and T. L. Weng, LabVIEW for Arduino: A Perfect Combination of Control and Applications, 1st ed., Taipei: Fullon, 2013. (in Traditional Chinese)

C. H. Tseng, W. M. Lai, T. H. Hsieh, H. Y. Xue, and Y. X. Lin, Android Easy Programming: App Inventor for Programming Robot, 1st ed., Taipei: Fullon, 2012. (in Traditional Chinese)

W. H. Wu, C. H. Tseng, T. H. Hsieh, J. B. Lu, Z. L. Weng, and Z. M. Huang, Handbook of LabVIEW Programming: Build Your Own Intelligent Robots, 1st ed., Taipei: GoTop, 2010.

C. H. Tseng and T. H. Hsieh, New Horizons of Robots: NXC and NXT, 2nd ed., New Taipei: BlueOcean, 2010. (in Traditional Chinese)

C. H. Tseng, T. H. Hsieh, and J. Y. Hou, New Horizons of Robots: NXC and NXT, 1st ed., New Taipei: BlueOcean, 2009. (in Traditional Chinese)


Device for Plantar Pressure Analysis

http://robinhsieh.com/?p=419

Although many tools already exist that can measure plantar pressure distribution, they are all very expensive. For some applications, such as choosing the right shoe size or designing custom insoles, such an advanced system may not be required. Therefore, I developed a low-cost device that can still provide substantial data for analyzing plantar pressure distribution.

- Relatively low cost compared with standard systems
- Self-made pressure sensors
- 2D linear interpolation to enhance the measured results

After researching commonly used force sensors on the market, I found they were all too expensive given the number of sensors I needed. Therefore, rather than buying existing sensors, I decided to design a force sensor myself. The best solution I could think of was to use springs together with linear potentiometers. Using SolidWorks, I designed a socket to hold the potentiometer and spring, and a beam to connect the spring with the potentiometer, as shown in the figure below. To test the design, I used a 3D printer to construct a model.
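The spring-plus-potentiometer sensor reduces to Hooke's law: the wiper voltage indicates how far the spring is compressed, and the spring constant converts that displacement to force. A minimal sketch, with all parameter values hypothetical:

```python
def force_newtons(v_out, v_supply, travel_m, spring_k):
    """Convert a linear potentiometer's wiper voltage to force.

    Assumes the wiper voltage is linear in spring compression over
    the potentiometer's full travel, so F = k * x (Hooke's law).
    All parameter values used here are illustrative.
    """
    compression = (v_out / v_supply) * travel_m  # metres of spring travel
    return spring_k * compression                # newtons
```

For example, with a 5 V supply, 10 mm of travel, and a 1000 N/m spring, a 2.5 V reading corresponds to 5 mm of compression and 5 N of force.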

After testing, I believed the design was feasible enough to provide sufficient plantar pressure information. I then designed a sensor array, arranging the sensors in a 6-by-9 grid, 54 in total, along with a container to hold them all.


I then sent the design to a machine workshop. The sensor sockets and beams were machined from polyoxymethylene (POM), and the container from aluminum. Because the device has 54 analog channels, I used an Arduino Mega 2560 board together with an Arduino Mux Shield and a CD74HC4067 breakout board to read them all. I also designed a wooden box to house the circuits and the sensor array; the box also gives users enough area to stand on, so they can measure the pressure distribution of each foot in turn. I am still looking for a manufacturer to make this box.

Using the Arduino IDE, I developed a program that reads the 54 channels, labels the start and end of each data frame, and sends the data to a computer. A LabVIEW program on the PC receives the raw voltages from the device and converts them to pressure (kPa). I also implemented 2D linear interpolation on the data, with a user-selectable number of iterations. Prior to measurement, the program runs a calibration step that averages 100 data frames as the baseline. The figure below shows an example result: the left image is the raw data sent from the Arduino, and the right image is the result after four iterations of 2D linear interpolation, with the baseline subtracted and the values converted to pressure.
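One iteration of the 2D linear interpolation described above can be sketched as follows: the grid resolution is doubled, with new points placed midway between measured neighbors. This is a generic sketch, not the actual LabVIEW implementation:

```python
def upsample(grid):
    """One iteration of 2D linear interpolation on a 2D list.

    A rows x cols grid becomes (2*rows-1) x (2*cols-1): original
    samples keep their values, and every new point is the average
    of its two nearest neighbors along a row or column.
    """
    rows, cols = len(grid), len(grid[0])
    out = [[0.0] * (2 * cols - 1) for _ in range(2 * rows - 1)]
    # copy original samples onto the even lattice points
    for i in range(rows):
        for j in range(cols):
            out[2 * i][2 * j] = grid[i][j]
    # fill horizontal midpoints on the original rows
    for i in range(0, 2 * rows - 1, 2):
        for j in range(1, 2 * cols - 1, 2):
            out[i][j] = (out[i][j - 1] + out[i][j + 1]) / 2
    # fill the in-between rows by averaging the rows above and below
    for i in range(1, 2 * rows - 1, 2):
        for j in range(2 * cols - 1):
            out[i][j] = (out[i - 1][j] + out[i + 1][j]) / 2
    return out
```

Applying the function repeatedly corresponds to the user-selectable number of iterations; four iterations turn the 6-by-9 raw grid into a much denser pressure map.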


Wearable IMU for Motion Analysis

http://robinhsieh.com/?p=428

Ankle motor control is crucial for balance and walking. However, few portable devices are available for the assessment and training of ankle motor control. We developed a new portable wireless device for the assessment and training of ankle tracking performance in older and clinical populations.

- Built on the InvenSense MPU-9150 and its SDK
- Wearable design
- Accuracy and reliability validated by comparison with a potentiometer
- Bluetooth communication

The ankle tracking device consisted of a wireless inertial measurement unit (IMU) (InvenSense®), interface software, and analysis software. Equipped with a 3-axis accelerometer, a 3-axis gyroscope, a 3-axis magnetometer, and a microcontroller, the IMU can precisely and reliably measure roll, yaw, and pitch angles with errors below 0.8°, as validated against a standard potentiometer encoder. The interface and analysis software, developed with C++ Builder (Borland®), produced target tracking trajectories derived from polynomial equations, calculated the recorded tracking performance, and provided visual feedback about performance errors. In a motor learning paradigm, with the IMU fastened to the dorsal-anterior aspect of the non-dominant foot, two healthy young (age = 23.0 ± 1.4 yrs) and two healthy older (age = 66.5 ± 2.1 yrs) adults performed ankle dorsiflexion/plantarflexion movements to track upward/downward target trajectories in a baseline test, a five-day training period, and a retention test.
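The device fuses accelerometer, gyroscope, and magnetometer data through the InvenSense SDK; for intuition only, here is the accelerometer-only static tilt estimate, which uses gravity as the reference. This simplified sketch is not the device's actual fusion algorithm:

```python
import math

def roll_pitch_deg(ax, ay, az):
    """Estimate roll and pitch (degrees) from a static 3-axis
    accelerometer reading, treating gravity as the only input.
    Valid only when the sensor is not accelerating."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return math.degrees(roll), math.degrees(pitch)
```

With the sensor flat (gravity on the z-axis) both angles are zero; rolling the sensor 90° onto its side moves all of gravity onto the y-axis.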

The motor sequence learning paradigm comprised a baseline test, a five-day training period, and a retention test. Each subject sat in front of the monitor running the interface and analysis software and performed ankle dorsiflexion and plantarflexion movements to track the upward and downward trajectories of the target. On each day of training, 12 blocks of ten 12-second repeated/random sequence trials were practiced. The root-mean-square errors (RMSEs) at baseline and retention, RMSEbaseline and RMSEretention, were calculated as the dependent measures.
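The dependent measure is a standard RMSE between the target trajectory and the recorded tracking trajectory, sampled at the same instants. A minimal sketch (function name illustrative):

```python
import math

def rmse(target, actual):
    """Root-mean-square error between a target trajectory and the
    recorded trajectory, given as equal-length sample sequences."""
    n = len(target)
    return math.sqrt(sum((t - a) ** 2 for t, a in zip(target, actual)) / n)
```

Computing this over the baseline and retention trials yields the RMSEbaseline and RMSEretention values used above.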


The subjects reduced tracking errors over time, both individually and as groups, as illustrated in the figures below, which show the mean performance curves of the three subject groups for repeated and random sequence learning. These preliminary results support the device's applicability to the assessment and training of ankle plantarflexion and dorsiflexion tracking movements. Further studies with larger samples from different populations will be needed.

K. Chiao, A. C. Tsai, T. H. Hsieh, M. T. Wu, T. T. Lin, and P. F. Tang, "Development of a Portable Wireless Ankle Tracking Assessment and Training Device: Preliminary Results," 6th Asia-Western Pacific Regional Congress of the World Confederation for Physical Therapy and 12th International Congress of Asian Confederation for Physical Therapy (WCPT-AWP and ACPT), Taichung, Taiwan, September 2013.


Field Robot Competition

http://robinhsieh.com/?p=480

This competition has been held annually by the Taiwan Institute of Biological Mechatronics (TIBM) since 2007, and I participated twice with my friends, during my sophomore and junior years. We used the LEGO MINDSTORMS robot set as a platform because it was the one I was most familiar with at the time. We won the most creative robot award in 2010 and the excellence award in 2011.

- Navigated the race field with a compass sensor and an IR distance sensor
- Applied a PID controller to minimize heading error
- Won the most creative robot award and the excellence award

The rules of the competition are illustrated in the figure below. An artificial farm was built as the race field, with three aisles for the robot to pass through autonomously, without remote control. Each team had two rounds, and teams were ranked by their total score over both rounds. One team member could follow the robot in the field; if an error occurred and the robot crossed the white line, the participant could stop it and restart from the nearest white line. Before the start, a participant drew a color, red or blue, at random. While passing through the last aisle, the robot had to pick up a can of the corresponding color and place it on the matching color plate at the end of the aisle. Those were the rules for the 2011 competition; the 2010 rules differed slightly: no can had to be picked up, and the aisles were longer.

We designed the robot using a LEGO MINDSTORMS robot set as a platform. However, the race field was too big for a robot built purely from LEGO parts, so for the 2010 competition we used the NXT only as a controller, along with a HiTechnic DC motor controller; we made most of the other components ourselves from polyvinyl chloride (PVC) and aluminum plates. To navigate the race field, we used a compass sensor with a proportional-integral-derivative (PID) controller to minimize the heading error, and an IR distance sensor to detect when to turn. We kept the compass sensor far from the motors to reduce interference from the magnetic field they generate, although we later found this may not have been necessary, because the interference was constant anyway. We first designed the robot in SolidWorks and made all of the components based on the blueprint, as shown in the figure below.
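The PID heading controller used for navigation can be sketched generically; the class, gains, and update interface below are illustrative, not the team's NXT code:

```python
class PID:
    """Textbook PID controller: output = kp*e + ki*∫e dt + kd*de/dt.
    For heading control, the error is the difference between the
    compass reading and the desired heading."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Advance one time step of length dt and return the output."""
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```

The output would be applied as a differential correction to the two drive motors, steering the robot back toward the target heading.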

The biggest mistake we made in the 2010 competition was our poor choice of motors. At the time we did not know where to buy a proper set of motors and wheels, so we simply took apart two toy motorcycles and used their motors and wheels directly, even though they were extremely difficult to control: the relationship between the motor controller's output voltage and the motors' rotation speed was nonlinear. As a result, even with a PID controller, the robot still needed a lot of space to turn, which caused difficulties during the competition. In addition, making almost all of the parts ourselves was very time consuming, and there were only two people on the team.



Learning from these mistakes, for the 2011 competition we made only some of the parts ourselves and constructed the robot mostly from TETRIX and LEGO parts, and we bought motors and wheels of decent quality for robot building. Because the robot needed to grab a colored can, we initially mounted an NXTCam, a camera controlled directly by the NXT, on top of the robot to distinguish the colors. However, this strategy proved unsuitable for an outdoor environment, and in the end we found it much better to simply use the motors' encoders to plan a motion for grabbing the can. For navigation, we again used a compass sensor, an infrared distance sensor to decide when to turn, and a PID controller to minimize the error. We earned the highest score in the second round because all of the tasks were accomplished. We still made one mistake in 2011, though: in the first round we used a pair of brand-new lithium polymer (Li-Po) batteries that, although fully charged, had never been used before, so their output voltages were unstable and the turning failed. Even so, we finished most of the first-round tasks, apart from grabbing the can. Despite the difficulties, these were great experiences, and I learned a lot about building larger-scale robots. Through this hands-on practice, I also realized that even relatively simple tasks involve many details that need to be dealt with.



Plant Factory

http://robinhsieh.com/?p=367

This was the final project of the course "Mechatronics and Laboratory." I led a group of 12 students, and we constructed an automatic plant factory that not only grows vegetables but also performs visual inspection and packaging. Because of our outstanding performance in the class, we also submitted and presented this work at the 2011 National Student Project Contest, held by the Ministry of Education, where it won the Remarkable Award.

- A growing environment that automatically controls light, temperature, and CO2 density
- A website that provides access to the growing environment
- An Android application that lets users control the growing environment
- A visual inspection system that uses computer vision to judge vegetable quality
- Won the Remarkable Award at the 2011 National Student Project Contest

The term "plant factory" refers to a closed growing system in which light, temperature, moisture, and carbon dioxide concentration are controlled so that the farmer can achieve constant production of vegetables all year round. The plant factory we built contains a seeding machine, a vegetable growing system, a visual inspection system, and a packaging system, illustrated in the figure below. A growing chamber, modified from a mini-refrigerator, was provided by our department, as was most of the hardware, such as the T5 lamps, humidifier, sensors, compressor, and CO2 tank. However, we were asked to develop the electrical circuit that controls all of these components.

To control all this equipment, we used an AVR microcontroller (ATmega48) and a personal computer (PC). We designed a circuit that enabled RS-232 communication between the PC and the AVR using a MAX232 dual driver/receiver. Equipment that needed to be switched on and off repeatedly, such as the T5 lamps, the CO2 inlet, and the humidifier, was connected to relay modules driven by the AVR; by sending signals to the AVR from the PC, we could control each piece of equipment. The CO2 density, temperature, and humidity sensors all came with an RS-232 interface, so their data could be transferred to the PC directly.



Using LabVIEW, I designed a program to receive all of the sensor data and control the growing environment; the firmware on the ATmega48 was developed using CodeVisionAVR. The growing environment can be controlled either manually or automatically. For manual control, the user simply operates the equipment through the graphical user interface (GUI). For automatic control, the user sets target parameters for CO2 density, humidity, and temperature on the GUI, and the program then drives the system to meet the target values. During this process, the measured sensor data are plotted on the GUI, and the user can save all the data in Microsoft Excel format. The GUI is shown in the figure below.
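The automatic mode boils down to driving each actuator toward its target value. One common way to do this for on/off actuators such as a CO2 valve is a dead-band (hysteresis) rule; the sketch below is a generic illustration with hypothetical names, not the project's LabVIEW logic:

```python
def actuator_command(reading, target, band):
    """On/off control with a dead band, e.g. for the CO2 inlet:
    turn the actuator on when the reading is well below target,
    off when well above, and hold its state in between so it does
    not chatter around the setpoint."""
    if reading < target - band:
        return "on"
    if reading > target + band:
        return "off"
    return "hold"
```

For example, with a target CO2 density of 1000 ppm and a 50 ppm band, the inlet opens below 950 ppm and closes above 1050 ppm.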


Furthermore, we created a website to which all of the data from the growing environment were uploaded. The web page shows the data as waveform charts along with the instantaneous parameters of the growing environment. If the automatic control mode is chosen, the user can log in to the website to set the target values, and the remote PC controls the environment based on these settings.

For additional convenience, I designed an Android application with which the user can either set the target parameters or query the current state. The application communicates with our website based on the settings, as shown in the figures below, which illustrate the setting procedure.


We also designed and built a system for visual inspection and packaging, able to sort vegetables by quality. Its I/O comprised a PC, a webcam, two DC motors, and six RC servos, with an AVR microcontroller and a programmable logic controller (PLC) controlling the actuators. The system is shown in the figure below.

Fully grown vegetables were taken from the growth chamber and placed on the upper conveyor for visual inspection. The program estimated the area of each vegetable and determined its quality from the result. After inspection, a packaging box was automatically dropped onto the lower conveyor; once the vegetables fell into the box, the lower conveyor started moving, and a pair of plates guided the packaged vegetables to the target place.
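The area-based quality check can be reduced to counting foreground pixels in a binary mask of the webcam frame and comparing the covered fraction against a pass threshold. This is a simplified sketch with hypothetical names and thresholds, not the actual inspection code:

```python
def quality_grade(mask, good_fraction):
    """Grade a vegetable from a binary foreground mask (rows of 0/1).

    Counts pixels classified as plant and passes the vegetable when
    the covered area fraction reaches the threshold."""
    total = sum(len(row) for row in mask)
    area = sum(sum(row) for row in mask)
    return "pass" if area / total >= good_fraction else "reject"
```

In practice the mask would come from thresholding the green channel of the camera image before counting.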

We submitted this work to the National Student Project Contest, where I gave a presentation, and we won the Remarkable Award.


EEG Controlled Robots

http://robinhsieh.com/?p=154

This was one of my undergraduate projects, funded by a College Undergraduate Research Scholarship from the National Science Council. I developed a program that used a commercial brain-computer interface (BCI) device to control both a LEGO MINDSTORMS robot and a powered upper-limb exoskeleton. Controlling the LEGO robot served as a pilot experiment to test the algorithm.

- Used the Emotiv neuroheadset with its SDK
- Up to four mental states can be trained to control the robots
- Funded by a College Undergraduate Research Scholarship

The hardware setup included a commercial BCI device (Emotiv EPOC), a LEGO MINDSTORMS NXT robot, a powered upper-limb exoskeleton, and a personal computer (PC). The Emotiv EPOC neuroheadset acquired EEG signals and transferred the data wirelessly to the PC. The neuroheadset came with a software development kit (SDK), while the LEGO MINDSTORMS NXT was used to build a two-wheeled robot for testing the program before applying it to the exoskeleton. In this study, the program ran on the computer, with the NXT brick connected to the PC wirelessly via Bluetooth. The exoskeleton has two degrees of freedom, actuated by two electrical drives whose axes are located at the elbow and shoulder joints, respectively. The exoskeleton is the property of the Biophotonics and Bioimaging Laboratory, Department of Bio-industrial Mechatronics Engineering, National Taiwan University.

The software was developed in Visual Studio 2008 using C/C++. The dynamic link libraries (DLLs) provided by Emotiv were used to analyze the EEG signals. The system architecture is illustrated below: the Emotiv EPOC sent EEG data wirelessly for real-time analysis, and the result was mapped to the action the robot would execute. For the LEGO robot, the command was sent from the PC to the NXT via Bluetooth; for the exoskeleton, the command was sent over a physical wire carrying RS-232 signals at 9600 baud.



For controlling the NXT robot, the detected mental states were used to choose the robot's behavior. In addition to the neutral state, subjects were trained on two further mental states: push and turn. In the neutral state, the two motors braked; in the push state, the robot drove forward; and in the turn state, the robot turned clockwise. The motor speeds were determined by the consistency of the state. The control strategy for the exoskeleton differed: because the exoskeleton was intended to assist rehabilitation, the motor drive speeds were held constant during the experiment and could be modified manually, and the exoskeleton actuated based on the consistency level returned by the API function.
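The state-to-behavior mapping for the NXT robot can be sketched as a small dispatch function; the names and scaling below are illustrative, not the actual C/C++ code built on the Emotiv DLLs:

```python
def nxt_command(state, consistency, max_power=100):
    """Map a detected mental state and its consistency score (0..1)
    to (left, right) motor powers for the two-wheeled NXT robot.
    State names and the linear power scaling are hypothetical."""
    power = int(max_power * consistency)
    if state == "push":
        return power, power      # drive forward
    if state == "turn":
        return power, -power     # rotate clockwise in place
    return 0, 0                  # neutral: brake both motors
```

Scaling the power by the consistency score mirrors the scheme above, where stronger, more consistent mental states drive the motors faster.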

T. H. Hsieh, "An EEG-based Approach for Exoskeleton Control," Final Report of College Undergraduate Research Project, 2011.


BIME Protractor

http://robinhsieh.com/?p=431

I wrote this software to help Dr. Wei-Li Hsu's graduate students measure the kyphotic angle in radiographs. Their research focused on changes in lumbar lordosis after minimally invasive lumbar spine fusion surgery, so they had a tremendous number of X-ray images to analyze, and the program I developed saved them a large amount of time. Dr. Hsu was one of my advisors when I was developing the wearable gait monitoring system. She is an assistant professor at the School and Graduate Institute of Physical Therapy, College of Medicine, National Taiwan University, and serves as the director of the Movement Science Laboratory.

- Provides three methods for analyzing the kyphotic angle
- User-friendly interface

I was asked to support three methods commonly used to measure the kyphotic angle in radiography: the Type C Cobb Method, the Type A Centroid Method, and the Type B Centroid Method, illustrated in the figure below.

I used LabVIEW 2011 to develop the software. The project posed two challenges. First, I had to become familiar with manipulating regions of interest (ROIs) in LabVIEW, and I also needed to review some basic vector calculations. Second, the interface had to be user-friendly so that students from the Department of Physical Therapy (PT) could easily learn to use it.

Because I already had related experience, enabling users to draw ROIs on the image did not take much time. The remaining task was the vector mathematics, such as finding the intersection point of two vectors, constructing lines, and computing the angle between two lines. Solving these basic mathematical problems with native LabVIEW commands would have made the program convoluted; fortunately, LabVIEW provides the MathScript node, which let me use MATLAB-style commands directly in the LabVIEW environment.
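The core vector calculation, the angle between two lines drawn on the radiograph, reduces to a dot-product formula. A minimal sketch (function name illustrative; the actual implementation used MathScript):

```python
import math

def angle_between_deg(u, v):
    """Angle in degrees between two 2-D direction vectors, as needed
    for the angle between the endplate lines in a Cobb-style
    measurement: theta = acos(u·v / (|u||v|))."""
    dot = u[0] * v[0] + u[1] * v[1]
    cos_t = dot / (math.hypot(*u) * math.hypot(*v))
    # clamp for floating-point safety before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))
```

Given the endpoints of two user-drawn lines, their direction vectors feed straight into this formula to yield the kyphotic angle.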


With the help of the MathScript node, I built the program with little effort. It helps users analyze X-ray images using the three methods: users draw lines and dots on the X-ray images, and the program automatically calculates the corresponding results, as shown in the figure below.

Finally, I designed a simple-to-use interface, shown in the figure below, and created an installer for the software, which I named "BIME Protractor." As a result, students from the PT department could easily start using it.



BIME Caliper

http://robinhsieh.com/?p=435

I developed this software for Szu-Hua Chen, a graduate student of Dr. Huei-Ming Chai. Szu-Hua used it to measure the length of musculotendinous units in ultrasonographic images for her master's thesis, "Extensibility of the Gastrocnemius Muscle-tendon Unit after Different Static Stretching Techniques Using Ultrasonography." Because she had numerous ultrasonography images to analyze, this software saved her a great deal of time. Dr. Chai is currently a lecturer at the School and Graduate Institute of Physical Therapy, College of Medicine, National Taiwan University, and serves as the director of the Assistive Technology Laboratory. I met Dr. Chai while taking her Kinesiology course.

- Provides an intuitive way to measure the length of an arbitrary line in an image
- Supports calibration for different scales

Initially, Dr. Chai gave me an ultrasonographic image and asked whether I could write a program to automatically segment the musculotendinous unit and calculate its length based on the scale on the right side of the image. However, this proved very difficult because the target musculotendinous unit was hard to recognize; since I could not identify it in the image myself, Szu-Hua marked it using Microsoft Paint, as shown in the figure below.

After considering the options, I decided that an excellent solution would be a program that let users draw multiple connected line segments, from which the program would calculate the total length. After Szu-Hua accepted this idea, I developed the software in LabVIEW. To let users draw multiple lines on the image, I used the region of interest (ROI) tools provided by LabVIEW; the tools I assigned, the Zoom Tool and the Polyline Tool, appear on the program's user interface (UI). The user draws a polyline with the Polyline Tool, the program receives the coordinates of its vertices, and the total length of the unit is calculated. In the end, the program successfully helped Szu-Hua measure the lengths of musculotendinous units in her ultrasonographic images.


To use the program, follow the steps below:

1. Calibrating the Scale: Because each image has a different scale, the first step is to calibrate pixels to the desired unit. The scale on the right side of the image is in centimeters; after the image is loaded, the user draws a 1 cm line on the scale and presses the "Calibrate" button. The process is illustrated in the figure below.

2. Drawing the ROI: After calibration, the user can start measuring by drawing polylines on the image, as shown in the figure below.



3. Making Adjustments: The polyline is adjustable, so the user can easily modify it after drawing the ROI, as shown in the figure below.

4. Confirming Accuracy: Before clicking "Next image," the user can repeat the measurement several times to validate accuracy; the mean and standard deviation of the measurements appear on the right side of the window, as shown in the figure below. I created an installer for this software and named it "BIME Caliper."
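The measurement steps above reduce to two operations: a calibration factor (pixels per centimeter, obtained from the 1 cm line drawn on the scale bar) and a summed polyline length. A minimal sketch, with the function name and interface purely illustrative:

```python
import math

def polyline_length_cm(points, px_per_cm):
    """Total length of a polyline, given as a list of (x, y) pixel
    vertices, converted to centimeters using the calibration factor
    obtained by drawing 1 cm on the image's scale bar."""
    px = sum(math.dist(points[i], points[i + 1])
             for i in range(len(points) - 1))
    return px / px_per_cm
```

For example, if the 1 cm calibration line spans 5 pixels, a polyline of two segments measuring 5 and 6 pixels corresponds to 2.2 cm.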