DOI: 10.1145/1569901.1570077

On the scalability of XCS(F)

Published: 08 July 2009

Abstract

Many successful applications have demonstrated the potential of Learning Classifier Systems, and of the XCS classifier system in particular, in data mining, reinforcement learning, and function approximation tasks. Recent research has shown that XCS is a highly flexible system that can be adapted to the task at hand by adjusting its condition structures, learning operators, and prediction mechanisms. However, fundamental theory on how XCS scales with these enhancements and with problem difficulty is still rather sparse and mainly restricted to Boolean function problems. In this article we develop a learning scalability theory for XCSF, the XCS system applied to real-valued function approximation problems. We identify crucial dependencies on functional properties and on the developed solution representation, and derive a theoretical scalability model from these constraints. The model is verified with empirical evidence: we show that, given a particular problem difficulty and particular representational constraints, XCSF scales optimally. In consequence, we discuss the importance of choosing prediction and condition structures appropriate to a given problem and show that scalability can be improved by polynomial orders given an appropriate, problem-suitable representation.

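To make the kind of condition and prediction structures discussed in the abstract concrete, below is a minimal sketch (in Python, assuming standard XCSF conventions) of a single classifier that combines an interval condition over a real-valued input space with a linear prediction updated by recursive least squares, one of the author keywords below. The class name, the parameter delta_rls, and the chosen interval are illustrative assumptions rather than details from the paper; a full XCSF additionally evolves a whole population of such classifiers with a genetic algorithm.

import numpy as np


class LinearClassifier:
    """A single XCSF-style classifier: an interval (hyperrectangular) condition
    over the real-valued input space and a linear prediction whose weights are
    updated by recursive least squares (RLS). Names and parameters here
    (e.g. delta_rls) are illustrative assumptions, not taken from the paper."""

    def __init__(self, lower, upper, delta_rls=1000.0):
        self.lower = np.asarray(lower, dtype=float)  # condition: lower bounds
        self.upper = np.asarray(upper, dtype=float)  # condition: upper bounds
        dim = self.lower.size
        self.w = np.zeros(dim + 1)                   # linear weights, incl. offset
        self.P = np.eye(dim + 1) * delta_rls         # RLS inverse-covariance estimate

    def matches(self, x):
        """Condition structure: does this classifier apply to input x?"""
        return bool(np.all((x >= self.lower) & (x <= self.upper)))

    def predict(self, x):
        """Prediction structure: linear model over the offset-extended input."""
        phi = np.concatenate(([1.0], x))
        return float(self.w @ phi)

    def update(self, x, y):
        """One recursive-least-squares step toward the target value y."""
        phi = np.concatenate(([1.0], x))
        gain = self.P @ phi / (1.0 + phi @ self.P @ phi)
        self.w += gain * (y - self.w @ phi)
        self.P -= np.outer(gain, phi @ self.P)


# Usage sketch: approximate f(x) = sin(2*pi*x) on the interval this classifier matches.
cl = LinearClassifier(lower=[0.0], upper=[0.25])
rng = np.random.default_rng(0)
for _ in range(500):
    x = rng.uniform(0.0, 0.25, size=1)
    if cl.matches(x):
        cl.update(x, np.sin(2.0 * np.pi * x[0]))
print(cl.predict(np.array([0.1])))  # close to sin(0.2*pi) within this niche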

Published In

GECCO '09: Proceedings of the 11th Annual Conference on Genetic and Evolutionary Computation
July 2009
2036 pages
ISBN:9781605583259
DOI:10.1145/1569901

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 08 July 2009

Author Tags

  1. function approximation
  2. learning classifier systems
  3. lwpr
  4. recursive least squares
  5. xcs

Qualifiers

  • Research-article

Conference

GECCO09
Sponsor: GECCO09: Genetic and Evolutionary Computation Conference
July 8 - 12, 2009
Montréal, Québec, Canada

Acceptance Rates

Overall Acceptance Rate 1,669 of 4,410 submissions, 38%

Cited By

  • (2019) "A survey of formal theoretical advances regarding XCS." Proceedings of the Genetic and Evolutionary Computation Conference Companion, pp. 1295-1302. DOI: 10.1145/3319619.3326848. Online publication date: 13-Jul-2019.
  • (2013) "Performance analysis of rough set ensemble of learning classifier systems with differential evolution based rule discovery." Evolutionary Intelligence 6(2), pp. 109-126. DOI: 10.1007/s12065-013-0093-z. Online publication date: 9-Oct-2013.
  • (2012) "An enhanced XCS rule discovery module using feature ranking." International Journal of Machine Learning and Cybernetics 4(3), pp. 173-187. DOI: 10.1007/s13042-012-0085-9. Online publication date: 18-Mar-2012.
  • (2010) "A multiple population XCS: Evolving condition-action rules based on feature space partitions." IEEE Congress on Evolutionary Computation, pp. 1-8. DOI: 10.1109/CEC.2010.5586521. Online publication date: Jul-2010.
