SVM Application List

Support vector machines-based generalized predictive control

Reference(s):

Support vector machines-based generalized predictive control, Serdar Iplikci, INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Vol. 16, pp. 843-862, 2006



Reference link(s):

http://ietfec.oxfordjournals.org/cgi/content/abstract/E89-A/10/2787



Data link(s):





Entered by: Serdar Iplikci <iplikci@pau.edu.tr> - Monday, October 23, 2006 at 18:05:17 (GMT)

Comments:

Dynamic Reconstruction of Chaotic Systems from Inter-spike Intervals Using Least Squares Support Vector Machines

Reference(s):

Physica D, Vol. 216, pp. 282-293, 2006



Reference link(s):





Data link(s):





Entered by: Serdar Iplikci <iplikci@pau.edu.tr> - Monday, May 29, 2006 at 12:53:56 (GMT)

Comments:

Application of The Kernel Method to the Inverse Geosounding Problem

Reference(s):

"Application of the kernel method to the inverse geosounding problem", Hugo Hidalgo, Sonia Sosa and E. Gómez-Treviño, Neural Networks, vol. 16, pp. 349-353, 2003



Reference link(s):

http://cienciascomp.cicese.mx/recopat/articulos/NeuralNetworks03.pdf



Data link(s):





Entered by: Hugo Hidalgo <hugo@cicese.mx> - Wednesday, March 22, 2006 at 14:04:25 (MST)

Comments:

Support Vector Machines Based Modeling of Seismic Liquefaction Potential

Reference(s):

Goh ATC. Seismic Liquefaction Potential Assessed by Neural Networks. Journal of Geotechnical Engineering 1994; 120(9): 1467-1480.

Goh ATC. Neural-Network Modeling of CPT Seismic Liquefaction Data. Journal of Geotechnical Engineering 1996; 122(1): 70-73





Reference link(s):

Accepted for publication in International Journal for Numerical and Analytical Methods in Geomechanics.



Data link(s):





Entered by: Mahesh Pal <mpce_pal@yahoo.co.uk> - Wednesday, February 22, 2006 at 06:50:07 (GMT)

Comments:

SVM for Geo- and Environmental Sciences

Reference(s):

1. N. Gilardi, M. Kanevski, M. Maignan and E. Mayoraz. Environmental and Pollution Spatial Data Classification with Support Vector Machines and Geostatistics. Workshop W07 Intelligent techniques for Spatio-Temporal Data Analysis in Environmental Applications. ACAI99, Greece, July, 1999. pp. 43-51. www.idiap.ch

2. M Kanevski, N Gilardi, E Mayoraz, M Maignan. Spatial Data Classification with Support Vector Machines. Geostat 2000 congress. South Africa, April 2000.

3. Kanevski M., Wong P., Canu S. Spatial Data Mapping with Support Vector Regression and Geostatistics. 7th International Conference on Neural Information Processing, Taejon, Korea. Nov. 14-18, 2000. pp. 1307-1311.

4. N. Gilardi, Alex Gammerman, Mikhail Kanevski, Michel Maignan, Tom Melluish, Craig Saunders, Volodia Vovk. Application des méthodes d'apprentissage pour l'étude des risques de pollution dans le Lac Léman [Application of learning methods to the study of pollution risks in Lake Geneva]. 5e Colloque transfrontalier CLUSE. Risques majeurs: perception, globalisation et management. Université de Genève, 2000.

5. M. Kanevski. Evaluation of SVM Binary Classification with Nonparametric Stochastic Simulations. IDIAP Research Report, IDIAP-RR-01-07, 17 p. 2001. www.idiap.ch

6. M. Kanevski, A. Pozdnukhov, S. Canu, M. Maignan. Advanced Spatial Data Analysis and Modelling with Support Vector Machines. International Journal on Fuzzy Systems, 2002. pp. 606-615.

7. M. Kanevski, A. Pozdnukhov, S. Canu, M. Maignan, P.M. Wong, S.A.R. Shibli. Support Vector Machines for Classification and Mapping of Reservoir Data. In: Soft Computing for Reservoir Characterization and Modelling. P. Wong, F. Aminzadeh, M. Nikravesh (Eds.). Physica-Verlag, Heidelberg, N.Y. pp. 531-558, 2002.

8. Kanevski M., Pozdnukhov A., McKenna S., Murray Ch., Maignan M. Statistical Learning Theory for Spatial Data. In proceedings of GeoENV2002 conference. Barcelona, 2002.

9. M. Kanevski et al. Environmental data mining and modelling based on machine learning algorithms and geostatistics. Journal of Environmental Modelling and Software, 2004. vol. 19, pp. 845-855.

10. M. Kanevski, M. Maignan et al. Advanced geostatistical and machine learning models for spatial data analysis of radioactively contaminated territories. Journal of Environmental Sciences and Pollution Research, pp.137-149, 2003.

11. Kanevski M., Maignan M. and Piller G. Advanced analysis and modelling tools for spatial environmental data. Case study: indoor radon data in Switzerland. International conference EnviroInfo, 2004. http://www.enviroinfo2004.org/cdrom/Datas/Kanevski.htm

12. Kanevski M., Maignan M. and Pozdnukhov A. Active Learning of Environmental Data Using Support Vector Machines. Conference of the International Association for Mathematical Geology, Toronto 2005. http://www.iamgconference.com/

13. M. Kanevski, A. Pozdnukhov, M. Tonini, M. Motelica, E. Savelieva, M. Maignan. Statistical Learning Theory for Geospatial Data. Case study: Aral Sea. 14th European colloquium on Theoretical and Quantitative Geography. Portugal, September 2005.

14. Pozdnukhov A., Kanevski M. Monitoring network optimisation using support vector machines. In: Geostatistics for Environmental applications. (Renard Ph., Demougeot-Renard H and Froidevaux, Eds.). Springer, 2005. pp. 39-50.

15. Pozdnukhov A. and Kanevski M. Monitoring Network Optimisation for Spatial Data Classification Using Support Vector Machines. (2006). International Journal of Environment and Pollution. Vol.28. 20 pp.





Reference link(s):

www.unil.ch/igar

www.idiap.ch



Data link(s):





Entered by: Mikhail Kanevski <Mikhail.Kanevski@unil.ch> - Sunday, February 12, 2006 at 16:30:07 (GMT)

Comments:

SVM for Protein Fold and Remote Homology Detection

Reference(s):

Profile based direct kernels for remote homology detection and fold recognition by Huzefa Rangwala and George Karypis (Bioinformatics 2005)



Reference link(s):

http://bioinformatics.oxfordjournals.org/cgi/content/abstract/bti687v1



Data link(s):

http://bioinfo.cs.umn.edu/supplements/remote-homology/



Entered by: Huzefa Rangwala <rangwala@cs.umn.edu> - Sunday, November 06, 2005 at 06:02:08 (GMT)

Comments:

Content-based image retrieval

Reference(s):

Dacheng Tao, Xiaoou Tang, Xuelong Li, and Xindong Wu, Asymmetric Bagging and Random Subspacing for Support Vector Machines-based Relevance Feedback in Image Retrieval, IEEE Transactions on Pattern Analysis and Machine Intelligence, accepted, to appear.



Reference link(s):





Data link(s):





Entered by: Dacheng Tao <Dacheng Tao> - Tuesday, October 11, 2005 at 19:03:18 (GMT)

Comments:

Data Classification using SSVM

Reference(s):



[1] O. L. Mangasarian. A Finite Newton Method for Classification Problems.

[2] O. L. Mangasarian. A Smooth Support Vector Machine for Classification.

[3] K.P. Soman. XSVMs and Applications.





Reference link(s):





Data link(s):





Entered by: Aduru . Venkateswarlu <venkatsherma@yahoo.com> - Monday, September 19, 2005 at 04:35:39 (GMT)

Comments:

DTREG SVM and decision tree modeling

Reference(s):





Reference link(s):

http://www.dtreg.com/svm.htm



Data link(s):





Entered by: Phil Sherrod <phil.sherrod@sandh.com> - Saturday, September 10, 2005 at 20:32:24 (GMT)

Comments:

DTREG - SVM and Decision Tree Predictive Modeling

Reference(s):





Reference link(s):

http://www.dtreg.com/svm.htm



Data link(s):





Entered by: Phil Sherrod <phil.sherrod@sandh.com> - Friday, August 26, 2005 at 20:09:46 (GMT)

Comments: DTREG supports Linear, Polynomial, Sigmoid and Radial Basis kernel functions. It can handle problems with millions of data rows and hundreds of variables.
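The four kernel families named in the comment have standard closed forms. A minimal Python sketch for reference; the parameter names (gamma, coef0, degree) follow the common convention and are not necessarily DTREG's:

```python
import math

def dot(x, y):
    # Plain inner product over two equal-length vectors.
    return sum(a * b for a, b in zip(x, y))

def linear_kernel(x, y):
    return dot(x, y)

def polynomial_kernel(x, y, gamma=1.0, coef0=1.0, degree=3):
    return (gamma * dot(x, y) + coef0) ** degree

def sigmoid_kernel(x, y, gamma=1.0, coef0=0.0):
    return math.tanh(gamma * dot(x, y) + coef0)

def rbf_kernel(x, y, gamma=1.0):
    # Radial basis (Gaussian) kernel on the squared Euclidean distance.
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))
```

Each function maps a pair of input vectors to a scalar similarity; an SVM only ever touches the data through these values, which is why one solver can serve all four kernels.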

Facial expression classification

Reference(s):

J. Ghent and J. McDonald, "Facial Expression Classification using a One-Against-All Support Vector Machine", proceedings of the Irish Machine Vision and Image Processing Conference, Aug 2005.



J. Ghent and J. McDonald, "Holistic Facial Expression Classification", SPIE Opto-Ireland, pp 5823-18, April 2005.







Reference link(s):





Data link(s):





Entered by: John Ghent <jghent@cs.may.ie> - Tuesday, August 09, 2005 at 10:14:08 (GMT)

Comments:

End-depth and discharge prediction in semi-circular and circular shaped channels

Reference(s):

C. Cortes and V.N. Vapnik, Support vector networks, Machine Learning 20 (1995), pp. 273-297.

S. Dey, Free overfall in circular channels with flat base: a method of open channel flow measurement, Flow Meas. Instrum. 13 (2002), pp. 209-221.

S. Dey, Free overfall in open channels: state-of-the-art review, Flow Meas. Instrum. 13 (2002), pp. 247-264.

Y.B. Dibike, S. Velickov, D.P. Solomatine and M.B. Abbott, Model induction with support vector machines: Introduction and applications, J. Comput. Civil Eng. 15 (2001), pp. 208-216.

D. Luenberger, Linear and Nonlinear Programming, Addison-Wesley (1984).

H. Rouse, Discharge characteristics of the free overfall, Civil Engineering, ASCE 6 (1936) (4), pp. 257-260.

R.V. Raikar, D. Nagesh Kumar and S. Dey, End depth computation in inverted semi circular channels using ANNs, Flow Meas. Instrum. 15 (2004), pp. 285-293.

A.J. Smola, Regression estimation with support vector learning machines, Master's Thesis, Technische Universität München, Germany, 1996.

M. Sterling and D.W. Knight, The free overfall as a flow measuring device in a circular channel, Water and Maritime Engineering, Proceedings of the Institution of Civil Engineers, London 148 (December) (2001), pp. 235-243.

V.N. Vapnik, Statistical Learning Theory, John Wiley and Sons, New York (1998).







Reference link(s):

http://www.sciencedirect.com/science/journal/09555986



Data link(s):





Entered by: mahesh pal <mpce_pal@yahoo.co.uk> - Monday, August 01, 2005 at 10:20:34 (GMT)

Comments:

Identification of alternative exons using SVM

Reference(s):

Dror G., Sorek R. and Shamir R.

Accurate identification of alternatively spliced exons using Support Vector Machine

Bioinformatics. 2005 Apr 1;21(7):897-901.

Epub 2004 Nov 5.



Reference link(s):

http://www2.mta.ac.il/~gideon/nns_pub.html



Data link(s):





Entered by: Gideon Dror <gideon@mta.ac.il> - Monday, June 20, 2005 at 11:55:09 (GMT)

Comments: Two classes, 243 positive and 1753 negative instances, 228 features in total, Gaussian kernel. Baseline systems: neural networks and naive Bayes. The SVM outperformed them in terms of area under the ROC curve but, most importantly, in its ability to reach a very high true-positive rate (50%) at a very low false-positive rate (0.5%). This performance would enable an effective scan of exon databases in search of novel alternatively spliced exons, in the human or other genomes.
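The headline metric in the comment above is the true-positive rate achievable at a fixed, very low false-positive rate, rather than overall accuracy or even AUC. A minimal sketch of how that number is read off a ranked list of classifier scores (the function name and toy data are illustrative, not from the paper):

```python
def tpr_at_fpr(scores, labels, max_fpr):
    """Best true-positive rate at a false-positive rate of at most max_fpr.

    scores: decision values, higher = more likely positive.
    labels: 1 for positive, 0 for negative.
    """
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    tp = fp = 0
    best_tpr = 0.0
    # Sweep the decision threshold from strictest to loosest.
    for i in order:
        if labels[i] == 1:
            tp += 1
        else:
            fp += 1
        if fp / n_neg <= max_fpr:
            best_tpr = max(best_tpr, tp / n_pos)
    return best_tpr

scores = [0.9, 0.8, 0.7, 0.6, 0.2, 0.1]
labels = [1, 1, 0, 1, 0, 0]
print(tpr_at_fpr(scores, labels, 0.5))  # -> 1.0
```

On the exon data this corresponds to asking: of the 243 true positives, what fraction is recovered before more than 0.5% of the 1753 negatives are falsely flagged?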

Support Vector Machines For Texture Classification

Reference(s):

1. Support Vector Machines for Texture Classification, Kwang In Kim, Keechul Jung, Se Hyun Park, and Hang Joon Kim, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 24, No. 11, November 2002.

2. An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods by Nello Cristianini & John Shawe-Taylor. (http://www.support-vector.net/)



Reference link(s):





Data link(s):





Entered by: sathishkumar <sathishkumar.maddy@gmail.com> - Thursday, June 02, 2005 at 05:02:34 (GMT)

Comments:

SVM application in E-learning

Reference(s):





Reference link(s):





Data link(s):





Entered by: sandeep dixit <sandeepdixit2004@yahoo.com> - Thursday, March 31, 2005 at 15:14:17 (GMT)

Comments:

text classification with SVMs

Reference(s):





Reference link(s):





Data link(s):





Entered by: Duong DInh DUng <dungngtq8@yahoo.com> - Thursday, March 24, 2005 at 06:03:04 (GMT)

Comments:

Isolated Handwritten Jawi Characters Categorization Using Support Vector Machines (SVM).

Reference(s):





Reference link(s):





Data link(s):





Entered by: Suhaimi Abd Latif <suhaimie@iiu.edu.my> - Wednesday, January 19, 2005 at 06:02:27 (GMT)

Comments:

Image Clustering

Reference(s):





Reference link(s):





Data link(s):





Entered by: Ahmed Yousuf Saber <saber_uap@yahoo.com> - Wednesday, January 19, 2005 at 02:16:09 (GMT)

Comments:

NewsRec, a SVM-driven Personal Recommendation System for News Websites

Reference(s):

Bomhardt, C. (2004): NewsRec, a SVM-driven Personal Recommendation System for News Websites

In: Web Intelligence, IEEE/WIC/ACM International Conference on (WI'04)

Keywords: Personal Recommendation, Support-Vector-Machine, Personalization, Text Classification



Reference link(s):

http://csdl.computer.org/comp/proceedings/wi/2004/2100/00/2100toc.htm



Data link(s):





Entered by: Christian Bomhardt <christian.bomhardt@etu.uni-karlsruhe.de> - Monday, October 11, 2004 at 15:26:58 (GMT)

Comments: About 1200 training examples and about 30000 features, linear kernel. SVMs are very fast compared to other methods and can handle the large number of features.
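The comment's point, that a linear kernel stays practical with tens of thousands of features, follows from prediction and gradient steps touching only each example's non-zero features. A Pegasos-style sketch over sparse feature dicts (a generic linear SVM, not the NewsRec implementation; production code would fold the shrinkage into a single scale factor to keep every update fully sparse):

```python
import random

def train_linear_svm(docs, labels, lam=0.01, epochs=50, seed=0):
    """Stochastic subgradient descent on the primal SVM objective.

    docs: list of sparse feature dicts {feature: value}; labels: +1 / -1.
    """
    w = {}
    rng = random.Random(seed)
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(len(docs)), len(docs)):
            t += 1
            eta = 1.0 / (lam * t)  # decaying step size
            x, y = docs[i], labels[i]
            margin = y * sum(w.get(f, 0.0) * v for f, v in x.items())
            for f in w:            # L2 regularization shrinks every weight
                w[f] *= 1.0 - eta * lam
            if margin < 1:         # hinge loss active: push w toward y*x
                for f, v in x.items():
                    w[f] = w.get(f, 0.0) + eta * y * v
    return w

def predict(w, x):
    return 1 if sum(w.get(f, 0.0) * v for f, v in x.items()) >= 0 else -1
```

On a toy separable problem, e.g. positive documents containing "good"/"great" and negative ones containing "bad"/"awful", the learned weights separate the classes after a few epochs.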

Equbits Foresight

Reference(s):





Reference link(s):

www.equbits.com



Data link(s):





Entered by: Ravi Mallela <ravi@equbits.com> - Saturday, October 09, 2004 at 15:37:20 (GMT)

Comments:

Speaker/Speech Recognition

Reference(s):

X. Dong ..., Electronics Letters, Vol. 37, pp. 527-529, 2001

C. Ma, Randolph ..., IEEE Int. Conference on Acoustics, Speech, and Signal Processing, Vol. 1, pp. 381-384, 2001

V. Wan ..., IEEE Workshop on Neural Networks for Signal Processing X, Vol. 2, 2000



Reference link(s):





Data link(s):





Entered by: MEHDI GHAYOUMI <M_GHAYOUMI@YAHOO.COM> - Tuesday, March 09, 2004 at 06:25:10 (GMT)

Comments:

Student in AI

Reference(s):

X. Dong ..., Electronics Letters, Vol. 37, pp. 527-529, 2001

C. Ma, Randolph ..., IEEE Int. Conference on Acoustics, Speech, and Signal Processing, Vol. 1, pp. 381-384, 2001

V. Wan ..., IEEE Workshop on Neural Networks for Signal Processing X, Vol. 2, 2000



Reference link(s):





Data link(s):





Entered by: MEHDI GHAYOUMI <M_GHAYOUMI@YAHOO.COM> - Tuesday, March 09, 2004 at 06:23:03 (GMT)

Comments:

Analysis and Applications of Support Vector Forecasting Model Based on Chaos Theory

Reference(s):

[1] Lü Jinhu et al. Chaotic Time Series Analysis and Its Applications [M]. Wuhan: Wuhan University Press, 2001. (in Chinese)

[2] Stefania Tronci, Massimiliano Giona, Roberto Baratti, "Reconstruction of chaotic time series by neural models: a case study," Neurocomputing, vol. 55, pp. 581-591, 2003.

[3] He Taigang, Zheng Chongxun. Nonlinear prediction of chaotic sequences [J]. Chinese Journal of Nature, 19(1): 10-13, 2001. (in Chinese)

[4] Li Dongmei, Wang Zheng'ou. Modelling and multi-step prediction of chaotic time series based on RBF networks [J]. Systems Engineering and Electronics, 24(6): 81-83, 2002. (in Chinese)

[5] Jiang Tao. Research on aero-engine surge/stall prediction models and fault detection [D]. Ph.D. dissertation, Xi'an: School of Engineering, Air Force Engineering University, 2002. (in Chinese)

[6] Wang Haiyan, Sheng Zhaohan. Methods for selecting phase-space reconstruction parameters of chaotic time series [J]. Journal of Southeast University, 30(5): 113-117, 2000. (in Chinese)

[7] L.-Y. Cao, "Practical method for determining minimum embedding dimension of a scalar time series," Physica D, vol. 110, pp. 43-52, 1997.

[8] Eckmann J.P, Kamphorst S.O, "Lyapunov exponents from time series," Phys. Rev. A, vol. 34, no. 6, pp. 4971-4979, Dec. 1986.

[9] Oiwa N.N, Fiedler-Ferrara N, "A fast algorithm for estimating Lyapunov exponents from time series," Physics Letters A, vol. 246, pp. 117-121, Sep. 1998.

[10] Fabio Sattin, "Lyap: A FORTRAN 90 program to compute the Lyapunov exponents of a dynamical system from a time series," Computer Physics Communications, vol. 107, pp. 253-257, 1997.

[11] K.-R. Müller, A.J. Smola, G. Rätsch, "Predicting time series with support vector machines," in Proceedings of ICANN'97, Berlin: Springer LNCS, vol. 1327, pp. 999-1004, 1997.

[12] B.-J. Chen, "Load forecasting using support vector machines: A study on EUNITE Competition 2001," unpublished.

[13] L.-J. Cao, Q.-M. Gu, "Dynamic support vector machines for non-stationary time series forecasting," Intelligent Data Analysis, vol. 6, no. 1, pp. 67-83, 2002.

[14] F.E.H. Tay, L.-J. Cao, "Applications of support vector machines in financial forecasting," Omega, vol. 9, no. 4, pp. 309-317, Aug. 2001.

[15] K.W. Lau, Q.-H. Wu, "Local prediction of chaotic time series based on Gaussian processes," in Proceedings of the 2002 IEEE International Conference on Control Applications, Glasgow, Scotland, U.K., pp. 1309-1313, Sep. 18-20, 2002.

[16] Sayan Mukherjee, Edgar Osuna, Federico Girosi, "Nonlinear prediction of chaotic time series using support vector machines," in Proc. of IEEE NNSP'97, Amelia Island, FL, Sep. 1997.





Reference link(s):

In press

Proceedings of WCICA 2004



Data link(s):





Entered by: xunkai <skyhawkf119@163.com> - Monday, February 23, 2004 at 04:51:26 (GMT)

Comments: It seems impossible, but SVMs perform remarkably well!

A Comparison Of The Performance Of Artificial Neural Networks And Support Vector Machines For The Prediction Of Traffic Speed and Travel Time

Reference(s):

V. Kecman, Learning and Soft Computing: Support Vector Machines, Neural Networks, and Fuzzy Logic Models, The MIT Press, Cambridge, Massachusetts; London, England.



S. Haykin, Neural Networks: A Comprehensive Foundation, Prentice Hall, 1999



N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods, Cambridge University Press, 2000



S. R. Gunn, Support Vector Machines for Classification and Regression, http://www.ecs.soton.ac.uk/~srg/publications/pdf/SVM.pdf







Reference link(s):





Data link(s):





Entered by: Lelitha Vanajakshi <lelitha@yahoo.com> - Friday, January 30, 2004 at 17:39:08 (GMT)

Comments: When the training data was limited, the SVM outperformed the ANN; when enough data was available, both performed more or less the same.

none

Reference(s):





Reference link(s):





Data link(s):





Entered by: leechs <leechs@sohu.com> - Sunday, January 25, 2004 at 13:44:16 (GMT)

Comments:

svm learning

Reference(s):





Reference link(s):





Data link(s):





Entered by: burak <burakkaragoz2002@yahoo.com> - Monday, December 08, 2003 at 16:06:03 (GMT)

Comments:

Protein Structure Prediction

Reference(s):

1. Kim, H. and H. Park, "Prediction of protein relative solvent accessibility with support vector machines and long-range interaction 3D local descriptor", Proteins: Structure, Function, and Genetics, to appear.

2. Kim, H. and H. Park, "Protein secondary structure prediction by support vector machines and position-specific scoring matrices", Protein Engineering, to appear.



Reference link(s):

http://www.cs.umn.edu/~hpark/papers/surface.pdf

http://www.cs.umn.edu/~hpark/papers/protein2.pdf



Data link(s):





Entered by: Dr. Haesun Park <hpark@cs.umn.edu> - Friday, July 11, 2003 at 18:59:18 (GMT)

Comments:

Support vector classifiers for land cover classification

Reference(s):

Mahesh Pal recently finished his PhD from the University of Nottingham, UK, and is presently working as a lecturer in the Department of Civil Engineering, NIT Kurukshetra, Haryana, India.



Reference link(s):

http://www.gisdevelopment.net/technology/rs/pdf/23.pdf



Data link(s):





Entered by: Mahesh Pal <mpce_pal@yahoo.co.uk> - Wednesday, May 21, 2003 at 07:17:46 (GMT)

Comments:

Intrusion Detection

Reference(s):

Srinivas Mukkamala joined the Computer Science graduate program of New Mexico Tech in 2000 and is currently a Ph.D. student. He received his B.E. degree in computer science and engineering in 1999 from the University of Madras. His interests are information assurance, information hiding, artificial intelligence, and soft computing techniques for computer security.



Andrew H. Sung is professor and chairman of the Computer Science Department, and the Coordinator of the Information Technology Program, at New Mexico Tech. He received his Ph.D. in computer science from the State University of New York at Stony Brook in 1984. His interests are intelligent systems, soft computing, and information assurance.





Reference link(s):

www.cs.nmt.edu/~IT







Data link(s):

http://kdd.ics.uci.edu/



Entered by: Srinivas Mukkamala <srinivas@cs.nmt.edu> - Thursday, January 09, 2003 at 05:02:19 (GMT)

Comments: SVMs are superior to ANNs for intrusion detection in three critical respects: SVMs train and run an order of magnitude faster; SVMs scale much better; and SVMs give higher classification accuracy. For details on the number of classes, kernels used, input features, number of support vectors, and input feature selection and ranking methods, please read our latest versions. If you need our latest versions or any assistance, please email the author: srinivas@cs.nmt.edu. Sincerely, Srinivas Mukkamala

The Gaussian Dynamic Time Warping (GDTW) kernel for On-line Handwriting Recognition

In recent years the task of on-line handwriting recognition has gained immense importance in everyday applications, mainly due to the increasing popularity of the personal digital assistant (PDA). Currently a next generation of "smart phones" and tablet-style PCs, which also rely on handwriting input, is further targeting the consumer market. However, in the majority of these devices the handwriting input method is still not satisfactory. Current PDAs still use input methods that abstract from the natural writing style, e.g. the widespread Graffiti.

Thus there is demand for a handwriting recognition system that is accurate, efficient, and able to deal with the natural handwriting of a wide range of different writers.

Entered by: Claus Bahlmann <bahlmann@informatik.uni-freiburg.de> - Monday, September 09, 2002 at 11:52:27 (GMT)

Comments:

Usual SVM kernels are designed for data of fixed dimension. On-line handwriting data, however, is not of fixed dimension but takes a variable-length sequential form, so SVMs cannot be applied to HWR straightforwardly.

We have addressed this issue by developing an appropriate SVM kernel for sequential data, the Gaussian dynamic time warping (GDTW) kernel. The basic idea of the GDTW kernel is that it replaces the squared Euclidean distance in the usual Gaussian kernel with the dynamic time warping distance. Beyond on-line handwriting recognition, the GDTW kernel applies straightforwardly to any classification problem where DTW gives a reasonable distance measure, e.g. speech recognition or genome processing.

Experiments have shown superior recognition rates compared to an HMM-based classifier for relatively small training sets (~6,000 samples) and comparable rates for larger training sets.
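The kernel construction described above — the DTW distance standing in for the squared Euclidean distance of a Gaussian kernel — can be sketched in a few lines of Python. This is an illustrative sketch only, not the authors' implementation; the sequence format (time steps × features) and the value of gamma are assumptions.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two sequences of
    feature vectors (each a 2-D array: time steps x features)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # local cost: squared Euclidean distance between frames
            cost = np.sum((a[i - 1] - b[j - 1]) ** 2)
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def gdtw_kernel(a, b, gamma=0.1):
    """Gaussian DTW kernel: the DTW distance replaces the squared
    Euclidean distance of the usual Gaussian (RBF) kernel."""
    return np.exp(-gamma * dtw_distance(a, b))

# Two variable-length "pen trajectories" (time steps x 2 coordinates)
x = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
y = np.array([[0.0, 0.0], [0.5, 0.5], [1.0, 1.0], [2.0, 2.0]])
print(gdtw_kernel(x, x))  # identical sequences -> 1.0
print(gdtw_kernel(x, y))  # a value in (0, 1)
```

Because the two sequences may differ in length, this kernel can compare handwriting samples directly, which a fixed-dimension kernel cannot.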


Forecast

Reference(s):





Reference link(s):





Data link(s):





Entered by: shen <shen0204@yahoo.com.tw> - Thursday, September 05, 2002 at 07:24:00 (GMT)

Comments:

Detecting Steganography in digital images

Reference(s):

Detecting Hidden Messages Using Higher-Order Statistics and Support Vector Machines



S. Lyu and H. Farid

5th International Workshop on Information Hiding, Noordwijkerhout, The Netherlands, 2002







Reference link(s):

http://www.cs.dartmouth.edu/~farid/publications/ih02.html



Data link(s):

http://www.cs.dartmouth.edu/~farid/publications/ih02.html



Entered by: Siwei Lyu <lsw@cs.dartmouth.edu> - Thursday, August 22, 2002 at 15:58:54 (GMT)

Comments: 2 classes; 3,600 training examples; over 18,000 testing samples; 1,100 SVs; RBF kernel; LibSVM.
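A two-class RBF-kernel SVM of the kind described here can be set up in a few lines. The sketch below uses scikit-learn's SVC (which is built on LIBSVM); the features are a synthetic stand-in — the paper's actual features are higher-order image statistics, not reproduced here.

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic stand-in for higher-order image statistics: two Gaussian
# classes ("clean" vs "stego") for illustration only.
rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, size=(200, 8))
stego = rng.normal(0.8, 1.0, size=(200, 8))
X = np.vstack([clean, stego])
y = np.array([0] * 200 + [1] * 200)

# Two-class RBF-kernel SVM; scikit-learn's SVC wraps LIBSVM.
clf = SVC(kernel="rbf", gamma="scale", C=1.0).fit(X, y)
print("support vectors:", clf.n_support_.sum())
print("training accuracy:", clf.score(X, y))
```

In the real experiment the split was 3,600 training examples against over 18,000 test samples, so performance would be judged on the held-out set rather than the training accuracy printed here.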


Fast Fuzzy Cluster

Reference(s):





Reference link(s):

members.aol.com/awareai



Data link(s):





Entered by: Michael Bickel <awareai@aol.com> - Tuesday, July 23, 2002 at 01:11:11 (GMT)

Comments:

Breast Cancer Prognosis: Chemotherapy Effect on Survival Rate

Reference(s):

Yuh-Jye Lee, O. L. Mangasarian and W. H. Wolberg: "Survival-Time Classification of Breast Cancer Patients", Data Mining Institute Technical Report 01-03, March 2001.



Yuh-Jye Lee, O. L. Mangasarian and W. H. Wolberg: "Breast Cancer Survival and Chemotherapy: A Support Vector Machine Analysis", DIMACS Series in Discrete Mathematics and Theoretical Computer Science, Vol. 55 (2000), pp. 1-10.



Yuh-Jye Lee and O. L. Mangasarian: "SSVM: Smooth Support Vector Machine for Classification", Computational Optimization and Applications 20(1): pp. 5-22.





Reference link(s):





Data link(s):

WPBCC: Wisconsin Prognostic Breast Cancer Chemotherapy Database.

ftp://ftp.cs.wisc.edu/math-prog/cpo-dataset/machine-learn/WPBCC/





Entered by: Yuh-Jye Lee <yjlee@cs.ccu.edu.tw> - Wednesday, October 24, 2001 at 19:38:50 (MDT)

Comments:

Underground Cable Temperature Prediction

Reference(s):





Reference link(s):





Data link(s):





Entered by: Robin Willis <rew198@soton.ac.uk> - Friday, May 04, 2001 at 08:31:41 (PDT)

Comments:

Image classification

Reference(s):





Reference link(s):

www.ens-lyon.fr/~ochapell/tnn99.ps.gz



Data link(s):





Entered by: Olivier Chapelle <chapelle@research.att.com> - Tuesday, April 04, 2000 at 13:50:39 (PDT)

Comments: Number of classes = 6 or 14; dimension of the input features = 4096; kernel = RBF with various distances. SVM outperforms KNN. The choice of the distance in the RBF kernel is critical.
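The "RBF with various distances" idea — a generalized kernel k(x, y) = exp(-gamma · d(x, y)) where d need not be the squared Euclidean distance — can be sketched as follows. This is an illustrative sketch; the specific distances shown (chi-squared and L1, both common choices for histogram features) and the gamma value are assumptions, not necessarily those used in the cited paper.

```python
import numpy as np

def rbf_kernel_with_distance(x, y, dist, gamma=1.0):
    """Generalized RBF kernel k(x, y) = exp(-gamma * d(x, y)),
    where d can be any suitable distance, not only squared Euclidean."""
    return np.exp(-gamma * dist(x, y))

def chi2_distance(x, y):
    # chi-squared distance, a common choice for histogram features
    eps = 1e-10
    return np.sum((x - y) ** 2 / (x + y + eps))

def l1_distance(x, y):
    return np.sum(np.abs(x - y))

# Two normalized color histograms (toy example)
h1 = np.array([0.2, 0.3, 0.5])
h2 = np.array([0.1, 0.4, 0.5])
print(rbf_kernel_with_distance(h1, h2, chi2_distance))
print(rbf_kernel_with_distance(h1, h2, l1_distance))
```

Swapping the distance function changes how similarity between image histograms is measured, which is exactly the choice the comment flags as critical.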

Particle and Quark-Flavour Identification in High Energy Physics

Entered by: Philippe Vannerem <philippe.vannerem@cern.ch> - Tuesday, October 19, 1999 at 16:17:56 (PDT)

Comments: We saw only small differences in performance between NNs and SVMs.

Object Detection

Combustion Engine Knock Detection

Reference(s):



M. Rychetsky, S. Ortmann, M. Glesner: Construction of a Support Vector Machine with Local Experts. Workshop on Support Vector Machines at the International Joint Conference on Artificial Intelligence (IJCAI 99), August 1999, Stockholm, Sweden.

M. Rychetsky, S. Ortmann, M. Glesner: Support Vector Approaches for Engine Knock Detection. International Joint Conference on Neural Networks (IJCNN 99), July 1999, Washington, USA.

Reference link(s):



Data link(s):



Engineering Support Vector Machine Kernels That Recognize Translation Initiation Sites

Reference(s):



A. Zien and G. Rätsch and S. Mika and B. Schölkopf and C. Lemmen and A. Smola and T. Lengauer and K.-R. Müller

Engineering Support Vector Machine Kernels That Recognize Translation Initiation Sites

German Conference on Bioinformatics 1999

Reference link(s):



http://www.bioinfo.de/isb/gcb99/talks/zien/

Data link(s):

The data sets we used were kindly supplied by Pedersen and Nielsen

(Center for Biological Sequence Analysis, Denmark; http://www.cbs.dtu.dk/).

Detection of Remote Protein Homologies

Reference(s):



A discriminative framework for detecting remote protein homologies.

Tommi Jaakkola, Mark Diekhans, and David Haussler.

Reference link(s):



http://www.cse.ucsc.edu/research/compbio/discriminative/Jaakola2-1998.ps

Data link(s):

Dataset (12Mb compressed)

Function Approximation and Regression

Reference(s):

Support Vector Regression Machines.
Drucker, H.; Burges, C.; Kaufman, L.; Smola, A.; Vapnik, V.
In: M. Mozer, M. Jordan, and T. Petsche (eds.): Neural Information Processing Systems, Vol. 9. MIT Press, Cambridge, MA, 1997.

Support Vector Regression with ANOVA Decomposition Kernels.
Mark O. Stitson, Alex Gammerman, Vladimir Vapnik, Volodya Vovk, Chris Watkins, and Jason Weston.
In: Advances in Kernel Methods, B. Schölkopf, C.J.C. Burges, and A.J. Smola (eds.), pages 285-291, MIT Press, 1999. ISBN 0-262-19416-3.

Reference link(s):



Drucker-97,

Data link(s):

ftp://ftp.ics.uci.edu/pub/machine-learning-databases/housing

3-D Object Recognition Problems

Text Categorization

Time Series Prediction and Dynamic Reconstruction of Chaotic Systems

Support Vector Machine Classification of Microarray Gene Expression Data

Reference(s):



Support Vector Machine Classification of Microarray Gene Expression Data

M. Brown, W. Grundy, D. Lin, N. Cristianini C. Sugnet, M. Ares Jr., D. Haussler

University of California, Santa Cruz,

technical report UCSC-CRL-99-09.

Reference link(s):



http://www.cse.ucsc.edu/research/compbio/genex/genex.tech.html

Data link(s):

http://www.cse.ucsc.edu/research/compbio/genex/

Handwritten digit recognition problems

Breast cancer diagnosis and prognosis

Reference(s):



O. L. Mangasarian, W. Nick Street and W. H. Wolberg: "Breast cancer diagnosis and prognosis via linear programming", Operations Research, 43(4), July-August 1995, 570-577.

P. S. Bradley, O. L. Mangasarian and W. Nick Street: "Feature selection via mathematical programming", INFORMS Journal on Computing 10, 1998, 209-217.

P. S. Bradley, O. L. Mangasarian and W. Nick Street: "Clustering via concave minimization", in Advances in Neural Information Processing Systems 9 (NIPS*96), M. C. Mozer, M. I. Jordan and T. Petsche, editors, MIT Press, Cambridge, MA, 1997, 368-374.

T.-T. Friess, N. Cristianini and C. Campbell: "The kernel adatron algorithm: a fast and simple learning procedure for support vector machines", 15th Intl. Conf. Machine Learning, Morgan Kaufmann Publishers, 1998.

Reference link(s):



ftp://ftp.cs.wisc.edu/math-prog/tech-reports/94-10.ps

ftp://ftp.cs.wisc.edu/math-prog/tech-reports/95-21.ps

ftp://ftp.cs.wisc.edu/math-prog/tech-reports/96-03.ps

http://svm.first.gmd.de/papers/FriCriCam98.ps.gz

Data link(s):

WDBC: Wisconsin Diagnostic Breast Cancer Database

WPBC: Wisconsin Prognostic Breast Cancer Database

Support Vector Decision Tree Methods for Database Marketing

Reference(s):



On Support Vector Decision Trees for Database Marketing. K. P. Bennett, D. Wu, and L. Auslender. Report No. 98-100, Rensselaer Polytechnic Institute, Troy, NY, 1998.

Reference link(s):



http://www.rpi.edu/~bennek/mr98100.ps

Data link(s):

Data is proprietary.