Advances in Pattern Recognition

Support Vector Machines for Pattern Classification
Shigeo Abe
Second Edition

Springer
Advances in Pattern Recognition For further volumes: http://www.springer.com/series/4205
ISSN 1617-7916
ISBN 978-1-84996-097-7
e-ISBN 978-1-84996-098-4
DOI 10.1007/978-1-84996-098-4
Springer London Dordrecht Heidelberg New York

British Library Cataloguing in Publication Data
A catalogue record for this book is available from the British Library

Library of Congress Control Number: 2010920369

© Springer-Verlag London Limited 2005, 2010

Apart from any fair dealing for the purposes of research or private study, or criticism or review, as permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms of licenses issued by the Copyright Licensing Agency. Enquiries concerning reproduction outside those terms should be sent to the publishers.

The use of registered names, trademarks, etc., in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant laws and regulations and therefore free for general use.

The publisher makes no representation, express or implied, with regard to the accuracy of the information contained in this book and cannot accept any legal responsibility or liability for any errors or omissions that may be made.

Printed on acid-free paper

Springer is part of Springer Science+Business Media (www.springer.com)

Prof. Dr. Shigeo Abe
Kobe University
Graduate School of Engineering
1-1 Rokkodai-cho, Nada-ku
Kobe 657-8501, Japan
abe@kobe-u.ac.jp

Series Editor
Prof. Sameer Singh, PhD
Research School of Informatics
Loughborough University
Loughborough, UK
Preface to the Second Edition

Since the introduction of support vector machines, we have witnessed enormous development in the theory, models, and applications of so-called kernel-based methods: advances in generalization theory, kernel classifiers and regressors and their variants, various feature selection and extraction methods, and a wide variety of applications such as pattern classification and regression in biology, medicine, and chemistry, as well as computer science.

In Support Vector Machines for Pattern Classification, Second Edition, I try to reflect the development of kernel-based methods since 2005. In addition, I have included a more intensive performance comparison of classifiers and regressors, added new references, and corrected many errors in the first edition. The major modifications of, and additions to, the first edition are as follows:

Symbols: I have changed the symbol of the mapping function to the feature space from g(x) to the more commonly used φ(x), and that of its associated kernel from H(x, x′) to K(x, x′).

1.3 Data Sets Used in the Book: I have added publicly available two-class data sets, microarray data sets, multiclass data sets, and regression data sets.

1.4 Classifier Evaluation: Evaluation criteria for classifiers and regressors are discussed.

2.3.2 Kernels: Mahalanobis kernels, graph kernels, etc., are added.

2.3.6 Empirical Feature Space: The high-dimensional feature space is treated implicitly via the kernel trick. This is an advantage but also a disadvantage, because we treat the feature space without knowing its structure. The empirical feature space is equivalent to the feature space in that it gives the same kernel values. The introduction of the empirical feature space greatly enhances the interpretability and manipulability of the feature space.
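The empirical feature space mentioned above can be illustrated with a small numerical sketch (this code is not from the book; the RBF kernel, sample sizes, and variable names are illustrative assumptions). Eigendecomposing the kernel matrix of the training set yields an explicit, finite-dimensional map whose inner products reproduce the kernel values exactly:

```python
import numpy as np

# Illustrative sketch, not the book's code: construct the empirical
# feature space for a training set and check that inner products in it
# reproduce the kernel values K(x, x').

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))           # 20 training samples, 3 inputs

def rbf_kernel(A, B, gamma=0.5):
    # K(x, x') = exp(-gamma * ||x - x'||^2)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

K = rbf_kernel(X, X)                   # Gram matrix on the training set

# Eigendecompose K = V diag(lam) V^T and keep the positive eigenvalues.
lam, V = np.linalg.eigh(K)
keep = lam > 1e-10

# Empirical feature map for the training samples: stacking
# h(x_i) = diag(lam)^{-1/2} V^T k_i over all i gives H = V diag(lam)^{1/2},
# where k_i is the i-th column of K.
H = V[:, keep] * np.sqrt(lam[keep])

# Inner products in the empirical feature space equal the kernel values.
print(np.allclose(H @ H.T, K))         # True
```

Because the map is explicit, the empirical feature space can be inspected and manipulated directly, which is exactly the interpretability advantage the preface refers to.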