SpARC – Supplementary Assistance for Rowing Coaching
Simon Fothergill, Ph.D. student, Digital Technology Group, Computer Laboratory
DTG Monday Meeting, 9th November 2009


  • Slide 1
  • SpARC – Supplementary Assistance for Rowing Coaching. Simon Fothergill, Ph.D. student, Digital Technology Group, Computer Laboratory. DTG Monday Meeting, 9th November 2009.
  • Slide 2
  • Overview: Automated Physical Performance Assessment; Basic real-time feedback; Performance similarity; Fault recognition; Data annotation; Questions.
  • Slide 3
  • Automated Physical Performance Assessment: sense and optimise. Automatically provide feedback to athletes on their technique, including how good they are and how they could improve. Combines coaches and surrogate coaches with analysis of the kinetics of a physical performance. Focusing on (indoor) rowing; other domains are also relevant.
  • Slide 4
  • Overview: Automated Physical Performance Assessment; Basic real-time feedback (data capture, user interface, conclusions); Performance similarity; Fault recognition; Data annotation; Questions.
  • Slide 5
  • Data Capture System. Aim: record a data set, allow annotation by expert professional coaches, and provide basic local real-time and distributed post-session feedback. System requirements: portable; cheap; physically robust; an extensible platform; augmentation of existing equipment. Data set requirements: real (for a convincing and useful evaluation of algorithms, and to investigate the domain); large; segmented; labelled; synchronised; high fidelity.
  • Slide 6
  • Data Capture System: previous work. Systems: the Bat system; inertial sensors; PhaseSpace; CODA; Vicon; motion capture with Wii controllers; StrideSense; the Concept 2 Performance Monitor. Datasets: generic actions; 2D videos of sports performances.
  • Slide 7
  • Data Capture System: SpARC (0)
  • Slide 8
  • Data Capture System: WMCS. A 3D motion capture system using Wii controllers, based on the work below, which showed that 3D motion capture with reasonable local precision is possible using two Wii controllers. Additions: standalone from Matlab, with improved performance; synchronisation of the Wii controllers; robust software; a command-line tool/library. Reference: Simon Hay, Joseph Newman and Robert Harle, "Optical tracking using commodity hardware", 7th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR), 2008.
  • Slide 9
  • Data Capture System: WMCS. [Architecture diagram: two Nintendo Wii controllers, each with an IR 1024x768 camera running at 100 Hz, stream over Bluetooth (via a Bluetooth adapter) to a PC (Ubuntu). A C server uses a Bluetooth library and a Wii library interface, running one Wii controller thread per device with a LIFO buffer; a Wii controller bridge on the serial port provides power and control.]
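
As a rough illustration of the reader threads and LIFO buffer in the diagram above, here is a minimal Python sketch; the real server is written in C, and `read_ir_frame` is a hypothetical stand-in for the Wii library's blocking read call.

```python
# Minimal sketch of one WMCS reader thread (real implementation: C server).
# Each Wii controller reports up to four IR blob positions at ~100 Hz; the
# thread pushes frames onto a bounded LIFO so consumers read the newest first.
import threading
from collections import deque

class ControllerReader(threading.Thread):
    def __init__(self, controller_id, read_ir_frame, depth=64):
        super().__init__(daemon=True)
        self.controller_id = controller_id
        self.read_ir_frame = read_ir_frame    # blocking read -> [(x, y), ...]
        self.frames = deque(maxlen=depth)     # LIFO: newest at the right end
        self._stop = threading.Event()

    def run(self):
        while not self._stop.is_set():
            # Up to 4 IR points on the 1024x768 sensor grid.
            self.frames.append(self.read_ir_frame(self.controller_id))

    def latest(self):
        return self.frames[-1] if self.frames else None

    def stop(self):
        self._stop.set()
```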
  • Slide 10
  • Data Capture System: C2PM3. [Architecture diagram: the Concept 2 Performance Monitor v3 on the erg (with flywheel) connects over a USB port to a PC (Ubuntu), where a C server reads it through libUSB and a C2PM3 library interface into a buffer.]
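
For flavour, a heavily hedged PyUSB sketch of opening the monitor: the real code is a C server on libUSB, the vendor/product IDs and endpoint below are placeholders rather than documented values, and the PM3's actual framed command protocol is omitted.

```python
# Hypothetical PyUSB stand-in for the libUSB-based C2PM3 link.
import usb.core

# Placeholder IDs: substitute the real vendor/product IDs of the PM3.
dev = usb.core.find(idVendor=0x17A4, idProduct=0x0001)
if dev is None:
    raise RuntimeError("Concept 2 Performance Monitor not found")
dev.set_configuration()

# Read one raw report from a (placeholder) interrupt IN endpoint.
raw = dev.read(0x81, 64, timeout=1000)
```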
  • Slide 11
  • Data Capture System: StrideSense (http://sourceforge.net/projects/stridesense/). [Architecture diagram: FSRs feed an ADC/power board on a Crossbow iMote2 running strideSenseADCServer.c; the USB port provides power and TCP/IP (via the usb0 network interface) to a Java client on a PC (Ubuntu).]
  • Slide 12
  • Data Capture System: EMCS (1). ECS = Erg Coordinate System. The EMCS needs to track the handle (1), the seat (1) and the erg's position + orientation (4), but the WMCS is currently limited to 4 LEDs. Therefore one LED is used as a stationary point on the erg, and two LEDs attached to the seat are observed at different points in time; PCA is used to extract the ECS axes (see the sketch below). The erg is clamped to the camera rig to minimise error.
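
A minimal sketch of the PCA step, assuming `seat_points` is an (N, 3) array of triangulated seat-LED positions gathered while the seat slides, `erg_point` is the stationary erg LED used as the origin, and `up_hint` roughly disambiguates the vertical; all three names are illustrative, not from the original system.

```python
# Hedged sketch: extracting Erg Coordinate System (ECS) axes with PCA.
import numpy as np

def ecs_axes(seat_points, erg_point, up_hint=np.array([0.0, 1.0, 0.0])):
    centred = seat_points - seat_points.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    x_axis = vt[0] / np.linalg.norm(vt[0])   # 1st principal component: slide axis
    z_axis = np.cross(x_axis, up_hint)       # assumes slide axis is not vertical
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)        # completes a right-handed frame
    R = np.stack([x_axis, y_axis, z_axis])   # rows are the ECS axes
    return R, erg_point                      # world -> ECS: R @ (p - erg_point)
```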
  • Slide 13
  • Data Capture System: EMCS (2). Data from one Wii controller's IR camera is used in computing the correspondence of LEDs between cameras (end LED, handle LED, seat LEDs). Calibration: calibrate the WMCS by stereo calibration (OpenCV); calibrate the labeller; erg calibration. Live operation: the client sends 4 x 2D coordinates to the server; triangulation produces 4 x 3D coordinates; the markers are labelled, transformed to the ECS (updating the ECS if necessary) and stored. A sketch of the triangulation step follows.
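
A minimal sketch of that triangulation step using OpenCV, assuming stereo calibration has already yielded 3x4 projection matrices `P1` and `P2` for the two cameras, and that the LEDs have been put into correspondence as matched (2, N) pixel arrays.

```python
# Hedged sketch of the OpenCV triangulation named on the slide.
import cv2
import numpy as np

def triangulate(P1, P2, pts1, pts2):
    # pts1/pts2: (2, N) matched pixel coordinates in each camera's image,
    # expressed in the frame the projection matrices were calibrated in.
    homog = cv2.triangulatePoints(P1, P2,
                                  pts1.astype(np.float64),
                                  pts2.astype(np.float64))  # (4, N) homogeneous
    return (homog[:3] / homog[3]).T                         # (N, 3) 3D points
```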
  • Slide 14
  • Data Capture System: SpARC (1). [System diagram: a PC (Ubuntu) in the boathouse, with monitor and keyboard, runs the C server under a common clock, combining the WMCS (handle + seat motion), the C2PM3 (handle force), StrideSense (footplate force) and a camera on the rower via a camera interface over libdc1394 (FireWire). A Java client and a PC in the Computer Lab perform batch processing, upload to a database and create user-requested videos (ssh/scp); athletes and coaches access the results in a web browser over TCP/IP.]
  • Slide 15
  • Data Capture System: SpARC (2). Server, live operation: detect strokes (a sketch follows); log data (motion + force data, images); split the data into strokes; update the database; turn the camera on/off. Post session: create directories on the file server (CL); transmit the data (handle + seat coordinates, handle force, stroke boundaries); create user videos (augment and select, encode videos, create metadata); record the user code. Client: display the data, videos and video metadata on the GUI.
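
A minimal offline sketch of stroke detection and splitting, assuming `handle_x` is the handle's fore-aft ECS coordinate at a fixed sample rate and that stroke boundaries are taken at the catch, i.e. local minima of handle position; the boundary definition is an illustrative assumption, and the real server does this live in C.

```python
# Hedged sketch of "detect strokes" / "split data into strokes".
import numpy as np
from scipy.signal import find_peaks

def stroke_boundaries(handle_x, fs=100.0, min_stroke_s=1.0):
    # Catches = local minima of handle_x = peaks of -handle_x,
    # with a minimum stroke duration to suppress jitter.
    catches, _ = find_peaks(-np.asarray(handle_x),
                            distance=int(min_stroke_s * fs))
    return catches

def split_strokes(data, boundaries):
    # One slice of the session per stroke, catch to catch.
    return [data[a:b] for a, b in zip(boundaries[:-1], boundaries[1:])]
```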
  • Slide 16
  • Data Capture System: SpARC (3)
  • Slide 17
  • User interface: SpARC (1). Real-time feedback.
  • Slide 18
  • User interface: SpARC (2). http://www-dyn.cl.cam.ac.uk/~jsf29/
  • Slide 19
  • Conclusions (1). General: it works! It is being used, and collects on average one session of five strokes every two or three days. Users: "Cool!", "Works", "Has potential". Some people are very apprehensive about using it (reassurance is required). Being able to see what you're doing, as well as feel it, has been observed to facilitate communication and understanding of correct technique. Although rowing technique is complex, the system has a steep but short learning curve. Athletes require a very simple interface: they won't even see half the screen, and definitely won't read anything. Elite athletes will favour raw signal feedback; novices would be aided by hints and tips. Force is as important as motion in the feedback.
  • Slide 20
  • Conclusions (2). General: basic signals versus sophisticated interpretation. Manually constructing rules is hard and unhelpful. Every sport has specific requirements. The system's general fidelity does show up interesting shapes and artefacts in rowing; the sensor signals BECOME the ontology. Technical: at the limit of the WMCS's range (accuracy and precision); the WMCS won't work in bright sunlight; the hand can cover the LED on the handle. Correspondence: unnecessarily vigorous rowing upsets the algorithms, which could be improved (domain-specific, e.g. scanning; generic, e.g. epipolar constraints). The ECS is updated infrequently. More force sensors are needed at the heel of the feet. OpenCV is buggy.
  • Slide 21
  • Conclusions (3). General: developed a novel and functional system, and gained experience of deploying it and of what it is possible to achieve. It enables further useful and convincing work to be done: a useful dataset, and a platform for other systems to be built on. It contributes to the age of useful data sets by setting a benchmark for dataset scale and detail. The experience is applicable in other domains of physical performance, and a lot of the work could be reused.
  • Slide 22
  • Further work: use metrics to quantitatively measure the consistency of a performance under different levels (and modes) of real-time feedback. One possible metric is sketched below.
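
As an illustration only (the slide does not commit to a metric), one simple consistency measure resamples each stroke's handle trajectory to a common length and reports the mean pointwise standard deviation across strokes, lower meaning more consistent.

```python
# Hedged sketch of a possible stroke-consistency metric (not SpARC's own).
import numpy as np

def consistency(strokes):
    # strokes: list of 1D arrays (e.g. handle_x per stroke), varying lengths.
    n = min(len(s) for s in strokes)
    resampled = [np.interp(np.linspace(0, len(s) - 1, n),
                           np.arange(len(s)), s) for s in strokes]
    return float(np.vstack(resampled).std(axis=0).mean())
```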
  • Slide 23
  • Overview: Automated Physical Performance Assessment; Basic real-time feedback; Performance similarity; Fault recognition; Data annotation; Questions.
  • Slide 24
  • Performance similarity (1): overall quality. Quantitatively measure the difference between an ideal and a given performance. Motivation: closed sports and muscle memory, i.e. practising consistently good strokes; cueing and motivating the athlete. Problems include the definition of the ideal (currently the coach) and anthropomorphic invariance. Similarity is VERY hard to define.
  • Slide 25
  • Performance similarity (2): approach. Trajectories can differ in various ways. Collect objective or subjective (from coaches) arguments for how similar pairs of trajectories that differ in these ways are. Form hypotheses about likely algorithms; test them on a dataset labelled with similarity defined by a coach; see how necessary the various aspects of the algorithms are. Artefacts exist in the real dataset, e.g. correlations between shape and speed. One candidate measure is sketched below.
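
One classic candidate for comparing trajectories that differ in timing is dynamic time warping; the slides leave the choice of algorithm open, so this sketch is illustrative rather than the project's method. `a` and `b` are (N, d) and (M, d) arrays, e.g. handle trajectories in the ECS.

```python
# Hedged sketch: dynamic time warping distance between two trajectories.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]
```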
  • Slide 26
  • Overview: Automated Physical Performance Assessment; Basic real-time feedback; Performance similarity; Fault recognition; Data annotation; Questions.
  • Slide 27
  • Fault recognition (ongoing work). Preliminary results have been obtained using a dataset of 6 rowers and the complete trajectory of the erg handle only. Binary classification of stroke quality was performed using tempo-spatial features of the handle's trajectory and a neural network; two training methods were compared, measuring classification accuracy across a given number of performers for the quality of individual aspects of technique (a sketch follows). Progress: waiting for enough data to make it worth running more algorithms.
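
A minimal sketch of that binary classification, using scikit-learn's MLP in place of the unspecified network and training regime; `X` stands in for the tempo-spatial features per stroke and `y` for 0/1 coach labels on one aspect of technique, both hypothetical names.

```python
# Hedged sketch of binary stroke-quality classification with a small NN.
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

def evaluate(X, y):
    # X: (n_strokes, n_features) tempo-spatial features of the handle
    # trajectory; y: 0/1 quality labels for one aspect of technique.
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    # Leave-one-rower-out would better match "accuracy across performers";
    # plain 5-fold cross-validation is shown for brevity.
    return cross_val_score(clf, X, y, cv=5).mean()
```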
  • Slide 28
  • Overview: Automated Physical Performance Assessment; Basic real-time feedback; Performance similarity; Fault recognition; Data annotation; Questions.
  • Slide 29
  • Annotating data: http://www-dyn.cl.cam.ac.uk/~jsf29/
  • Slide 30
  • Concluding remarks. Three generations of machine learning (taxonomy by Prof. Chris Bishop): rules don't work; statistics is limited, especially for recognition across different people; structure (time, segment selection) is needed.
  • Slide 31
  • Acknowledgements: Rob Harle (advice); Brian Jones (EMCS, driving the van); Marcelo Pias & Salman Taherian (StrideSense); SeSAME (context); Simon Hay (Wii controllers, van driving); Andrew Lewis (C2PM3 code advice); Jesus College Boat Club, fellows and IT department (environment and support); i-Teams (investigating the commercial potential of SpARC).
  • Slide 32
  • Questions. Please come and use the system! Thank you! Any questions?