ACE: A Framework for optimizing music classification
Cory McKay, Rebecca Fiebrink, Daniel McEnnis, Beinan Li, Ichiro Fujinaga
Music Technology Area, Faculty of Music, McGill University
Goals
- Highlight limitations of existing pattern recognition software when applied to MIR
- Present solutions to these limitations
- Stress the importance of standardized classification and feature extraction software: ease of use, portability, and extensibility
- Present the ACE software framework, which uses meta-learning and classification ensembles
Existing music classification systems
- Systems are often implemented with specific tasks in mind: not extensible to general tasks, and often difficult to use for those not involved in the project
- Standardized systems are needed for a variety of MIR problems: no need to reimplement existing algorithms, more reliable code, more usable software, and easier comparison of methodologies
- Important foundations: Marsyas (Tzanetakis & Cook 1999), M2K (Downie 2004)
Existing general classification systems
- Available general-purpose systems: PRTools (van der Heijden et al. 2004), Weka (Witten & Frank 2005)
- Other meta-learning systems: AST (Lindner & Studer 1999), METAL (www.metal-kdd.org)
Problems with existing systems
- Distribution problems: proprietary software, not open source, limited licences
- Music-specific systems are often limited: none use meta-learning, classifier ensembles are rarely used, and interfaces are not oriented towards end users
- General-purpose systems are not designed to meet the particular needs of music
Special needs of music classification (1)
- Assign multiple classes to individual recordings: a recording may belong to multiple genres, for example
- Allow classification of both sub-sections and overall recordings: audio features are often windowed, which is useful for segmentation problems
- Maintain the logical grouping of multi-dimensional features: musical features often consist of vectors (e.g. MFCCs), and this relatedness can provide classification opportunities
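The grouping requirement above can be sketched with a small, hypothetical data structure (these class and method names are illustrative, not ACE's actual API): each named feature keeps all of its dimensions together instead of being flattened into unrelated scalar attributes.

```java
// Hypothetical sketch: keep the dimensions of a multi-dimensional
// feature (e.g. a vector of MFCCs) grouped under one name.
import java.util.LinkedHashMap;
import java.util.Map;

public class FeatureSet {
    // Maps a feature name to all of its dimensions, preserving grouping.
    private final Map<String, double[]> features = new LinkedHashMap<>();

    public void put(String name, double[] values) {
        features.put(name, values);
    }

    public double[] get(String name) {
        return features.get(name);
    }

    public int dimensionality(String name) {
        return features.get(name).length;
    }

    public static void main(String[] args) {
        FeatureSet fs = new FeatureSet();
        fs.put("MFCC", new double[] {0.12, -0.53, 0.08});  // one grouped vector
        fs.put("SpectralCentroid", new double[] {1523.4}); // a scalar feature
        System.out.println(fs.dimensionality("MFCC")); // 3
    }
}
```

A flat ARFF-style representation would instead expose three anonymous columns, losing the fact that they belong to one feature.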
Special needs of music classification (2)
- Maintain identifying metadata about instances: title, performer, composer, date, etc.
- Take advantage of hierarchically structured taxonomies: humans often organize music hierarchically, which can provide classification opportunities
- Provide an interface suitable for any user
Standardized file formats
- Existing formats such as Weka's ARFF cannot represent the needed information
- It is important to enable classification systems to communicate with arbitrary feature extractors
- Four XML file formats that meet the above needs are described in the proceedings
The ACE framework
- ACE (Autonomous Classification Engine) is a classification framework that can be applied to arbitrary types of music classification
- Meets all of the requirements presented above
- Java implementation makes ACE portable and easy to install
ACE and meta-learning
- Many classification methodologies are available, each with different strengths and weaknesses
- ACE uses meta-learning to experiment with a variety of approaches: it finds approaches well suited to each problem and makes powerful pattern recognition tools available to non-experts
- Also useful for benchmarking new classifiers and features
ACE
[Architecture diagram: music recordings, a taxonomy, feature settings, and model classifications feed the feature extraction system; the extracted features pass through dimensionality reduction into classification methodologies 1 … n; an experiment coordinator and a classifier evaluator produce trained classifiers and a statistical comparison of the classification methodologies]
Algorithms used by ACE
- Uses the Weka class libraries, which makes it easy to add or develop new algorithms
- Candidate classifiers: induction trees, naive Bayes, k-nearest neighbour, neural networks, support vector machines; classifier parameters are also varied automatically
- Dimensionality reduction: feature selection using genetic algorithms, principal component analysis, exhaustive searches
- Classifier ensembles: bagging, boosting
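The meta-learning loop itself can be sketched in a few dozen self-contained lines (toy classifiers stand in for the Weka schemes ACE actually uses; `MajorityClassifier`, `OneNN`, and `selectBest` are illustrative names, not ACE's API): cross-validate every candidate and keep the one with the best accuracy.

```java
// Minimal sketch of meta-learning by candidate selection, assuming
// leave-one-out cross-validation as the evaluation strategy.
import java.util.*;

public class MetaLearnerSketch {

    // A candidate "learning scheme": fit on training rows, predict a label.
    interface Classifier {
        void train(double[][] x, int[] y);
        int predict(double[] x);
    }

    // Baseline: always predicts the most frequent training label.
    static class MajorityClassifier implements Classifier {
        private int majority;
        public void train(double[][] x, int[] y) {
            Map<Integer, Integer> counts = new HashMap<>();
            for (int label : y) counts.merge(label, 1, Integer::sum);
            majority = Collections.max(counts.entrySet(),
                    Map.Entry.comparingByValue()).getKey();
        }
        public int predict(double[] x) { return majority; }
    }

    // 1-nearest-neighbour with squared Euclidean distance.
    static class OneNN implements Classifier {
        private double[][] x; private int[] y;
        public void train(double[][] x, int[] y) { this.x = x; this.y = y; }
        public int predict(double[] q) {
            int best = 0; double bestDist = Double.MAX_VALUE;
            for (int i = 0; i < x.length; i++) {
                double d = 0;
                for (int j = 0; j < q.length; j++)
                    d += (x[i][j] - q[j]) * (x[i][j] - q[j]);
                if (d < bestDist) { bestDist = d; best = y[i]; }
            }
            return best;
        }
    }

    // Leave-one-out accuracy of one candidate on the data.
    static double looAccuracy(Classifier c, double[][] x, int[] y) {
        int correct = 0;
        for (int i = 0; i < x.length; i++) {
            double[][] trainX = new double[x.length - 1][];
            int[] trainY = new int[x.length - 1];
            for (int j = 0, k = 0; j < x.length; j++)
                if (j != i) { trainX[k] = x[j]; trainY[k++] = y[j]; }
            c.train(trainX, trainY);
            if (c.predict(x[i]) == y[i]) correct++;
        }
        return (double) correct / x.length;
    }

    // The meta-learning step: evaluate every candidate, return the best.
    static Classifier selectBest(List<Classifier> candidates,
                                 double[][] x, int[] y) {
        Classifier best = null; double bestAcc = -1;
        for (Classifier c : candidates) {
            double acc = looAccuracy(c, x, y);
            if (acc > bestAcc) { bestAcc = acc; best = c; }
        }
        return best;
    }

    public static void main(String[] args) {
        // Two well-separated clusters: 1-NN should beat the baseline.
        double[][] x = {{0,0},{0,1},{1,0},{1,1},{5,5},{5,6},{6,5},{6,6}};
        int[] y =      { 0,    0,    0,    0,    1,    1,    1,    1 };
        Classifier best = selectBest(
                List.of(new MajorityClassifier(), new OneNN()), x, y);
        System.out.println(best.getClass().getSimpleName()); // OneNN
    }
}
```

ACE layers parameter variation, dimensionality reduction, and ensemble candidates on top of this same select-the-best-scheme idea.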
Classifier ensembles
- Multiple classifiers operate together to arrive at final classifications, e.g. AdaBoost (Freund & Schapire 1996)
- Success rates in many MIR areas are behaving asymptotically (Aucouturier & Pachet 2004); classifier ensembles could provide some improvement
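The core ensemble idea can be illustrated with unweighted majority voting, as in bagging (a sketch, not ACE's implementation; `VotingEnsemble` is a hypothetical name): several members each vote on a label and the ensemble returns the most common vote. AdaBoost refines this by weighting each member's vote by its training performance.

```java
// Sketch of a classifier ensemble combined by majority vote.
import java.util.HashMap;
import java.util.Map;
import java.util.function.IntUnaryOperator;

public class VotingEnsemble {
    // Each member maps an input (here a single int feature) to a label.
    private final IntUnaryOperator[] members;

    public VotingEnsemble(IntUnaryOperator... members) {
        this.members = members;
    }

    public int classify(int input) {
        // Tally one vote per member, return the label with the most votes.
        Map<Integer, Integer> votes = new HashMap<>();
        for (IntUnaryOperator m : members)
            votes.merge(m.applyAsInt(input), 1, Integer::sum);
        int best = 0, bestCount = -1;
        for (Map.Entry<Integer, Integer> e : votes.entrySet())
            if (e.getValue() > bestCount) {
                bestCount = e.getValue();
                best = e.getKey();
            }
        return best;
    }

    public static void main(String[] args) {
        // Three stump-like members; the two correct ones outvote the third.
        VotingEnsemble e = new VotingEnsemble(
                x -> x > 5 ? 1 : 0,   // reasonable threshold
                x -> x > 4 ? 1 : 0,   // reasonable threshold
                x -> 0);              // always wrong for the positive class
        System.out.println(e.classify(9)); // 1
    }
}
```

The ensemble is only as good as the diversity of its members: if all three made the same mistakes, voting would gain nothing.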
Musical evaluation experiments
- Achieved 95.6% success on a five-class beatbox recognition experiment (Sinyor et al. 2005)
- Repeated Tindale's percussion recognition experiment (2004): ACE achieved 96.3% success, compared to Tindale's best rate of 94.9%, a reduction in error rate of 27.5%
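The 27.5% figure is a relative reduction in *error* rate, not in success rate: Tindale's 94.9% success leaves a 5.1% error, ACE's 96.3% leaves 3.7%, and (5.1 − 3.7) / 5.1 ≈ 27.5%. A few lines make the arithmetic explicit (`relativeErrorReduction` is an illustrative helper, not part of ACE):

```java
// Relative error-rate reduction from a baseline success rate to a new one.
public class ErrorReduction {
    static double relativeErrorReduction(double baselineSuccess,
                                         double newSuccess) {
        double baselineError = 1.0 - baselineSuccess; // e.g. 0.051
        double newError = 1.0 - newSuccess;           // e.g. 0.037
        return (baselineError - newError) / baselineError;
    }

    public static void main(String[] args) {
        // Tindale 94.9% vs. ACE 96.3%
        System.out.printf("%.1f%%%n",
                100 * relativeErrorReduction(0.949, 0.963)); // 27.5%
    }
}
```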
General evaluation experiments
- Applied ACE to six commonly used UCI datasets
- Compared the results to a recently published algorithm (Kotsiantis & Pintelas 2004)
Results of UCI experiments (1)

[Chart: success rates (50–100%) of Kotsiantis vs. ACE on the autos, diabetes, ionosphere, iris, labor, and vote datasets]
| Dataset    | ACE's Selected Classifier | Kotsiantis' Success Rate | ACE's Success Rate |
|------------|---------------------------|--------------------------|--------------------|
| autos      | AdaBoost                  | 81.7%                    | 86.3%              |
| diabetes   | Naïve Bayes               | 76.6%                    | 78.0%              |
| ionosphere | AdaBoost                  | 90.7%                    | 94.3%              |
| iris       | FF Neural Net             | 95.6%                    | 97.3%              |
| labor      | k-NN                      | 93.4%                    | 93.0%              |
| vote       | Decision Tree             | 96.2%                    | 96.3%              |
Results of UCI experiments (2)
- ACE performed very well, although statistical uncertainty makes it difficult to say that ACE's results are inherently superior
- ACE can perform at least as well as a state-of-the-art algorithm with no tweaking
- ACE achieved these results using only one minute per learning scheme for training and testing
Results of UCI experiments (3)
- Different classifiers performed better on different datasets, which supports ACE's experimental meta-learning approach
- The effectiveness of AdaBoost (chosen 2 times out of 6) demonstrates the strength of classifier ensembles
Feature extraction
- ACE is not tied to any particular feature extraction system: it reads Weka ARFF as well as ACE XML files
- Two powerful and extensible feature extractors are nonetheless bundled with ACE; both write Weka ARFF as well as ACE XML
jAudio
Reads: .mp3, .wav, .aiff, .au, .snd
jSymbolic
Reads MIDI; uses the 111 Bodhidharma features
ACE’s interface
- Graphical interface, including an on-line manual
- Command-line interface, for batch processing and external calls
- Java API: open source, well documented, easy to extend
Current status of ACE
- In alpha release; full release scheduled for January 2006
- Finalization of the GUI
- User constraints on training, classification, and meta-learning times
- Feature weighting
- Expansion of candidate algorithms
- Long-term: distributed processing, unsupervised learning, blackboard systems, automatic cross-project optimization
Conclusions
- Standardized classification software is needed that can deal with the special needs of music
- Techniques such as meta-learning and classifier ensembles can lead to improved performance
- ACE is designed to address these issues
Web site: coltrane.music.mcgill.ca/ACE
E-mail: [email protected]