XGBoost
{{Infobox software
| name = XGBoost
| logo =
| screenshot =
| caption =
| developer = The XGBoost Contributors
| status = Active
| released = {{Start date and age|2014|03|27}}
| latest release version = 0.70
| latest release date = {{release date|2017|12|30}}
| operating system = Linux, macOS, Windows
| programming language = C++
| genre = Machine learning
| license = Apache License 2.0
| website = {{URL|https://xgboost.ai/}}
}}

XGBoost[1] is an open-source software library which provides a gradient boosting framework for C++, Java, Python,[2] R,[3] and Julia.[4] It works on Linux, Windows,[5] and macOS.[6] From the project description, it aims to provide a "Scalable, Portable and Distributed Gradient Boosting (GBM, GBRT, GBDT) Library". Besides running on a single machine, it also supports the distributed processing frameworks Apache Hadoop, Apache Spark, and Apache Flink. It has recently gained much popularity and attention as the algorithm of choice for many winning teams of machine learning competitions.[7]

== History ==

XGBoost started as a research project by Tianqi Chen[8] as part of the Distributed (Deep) Machine Learning Community (DMLC) group. Initially, it was a terminal application that could be configured with a libsvm configuration file. It became well known in machine learning competition circles after its use in the winning solution of the Higgs Machine Learning Challenge. Soon after, the Python and R packages were built, and XGBoost now has package implementations for Julia, Scala, Java, and other languages. This brought the library to more developers and contributed to its popularity in the Kaggle community, where it has been used in a large number of competitions.[7]

XGBoost was soon integrated with a number of other packages, making it easier to use in the respective communities. It now has integrations with scikit-learn for Python users and with the caret package for R users (a minimal usage sketch is shown below). It can also be integrated into data-flow frameworks such as Apache Spark, Apache Hadoop, and Apache Flink using the abstracted Rabit[9] and XGBoost4J[10] layers. The design of XGBoost has been described in a paper by Tianqi Chen and Carlos Guestrin.[11]

== Awards ==

* John Chambers Award (2016)[12]
* High Energy Physics meets Machine Learning award (HEP meets ML)[13]
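A minimal sketch of how the library is commonly used through its scikit-learn-compatible Python interface (the XGBClassifier estimator) is shown below; the dataset and hyperparameter values are illustrative assumptions rather than recommendations from the project.

<syntaxhighlight lang="python">
# Minimal sketch: gradient-boosted decision trees via XGBoost's
# scikit-learn-compatible API. Dataset and hyperparameters are illustrative.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 100 boosting rounds, each adding a depth-3 tree; learning_rate shrinks each
# tree's contribution, as in standard gradient boosting.
model = XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
</syntaxhighlight>

Because XGBClassifier follows the scikit-learn estimator interface, the same object can also be used with scikit-learn utilities such as pipelines and cross-validated grid search.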
== References ==

1. {{cite web |url=https://github.com/dmlc/xgboost |title=GitHub project webpage}}
2. {{cite web |url=https://pypi.python.org/pypi/xgboost/ |title=Python Package Index PYPI: xgboost |access-date=2016-08-01}}
3. {{cite web |url=https://cran.r-project.org/web/packages/xgboost/index.html |title=CRAN package xgboost |access-date=2016-08-01}}
4. {{cite web |url=http://pkg.julialang.org/?pkg=XGBoost#XGBoost |title=Julia package listing xgboost |access-date=2016-08-01}}
5. {{cite web |url=https://www.ibm.com/developerworks/community/blogs/jfp/entry/Installing_XGBoost_For_Anaconda_on_Windows?lang=en |title=Installing XGBoost for Anaconda in Windows |access-date=2016-08-01}}
6. {{cite web |url=https://www.ibm.com/developerworks/community/blogs/jfp/entry/Installing_XGBoost_on_Mac_OSX?lang=en |title=Installing XGBoost on Mac OSX |access-date=2016-08-01}}
7. {{cite web |url=https://github.com/dmlc/xgboost/tree/master/demo#machine-learning-challenge-winning-solutions |title=XGBoost - ML winning solutions (incomplete list) |access-date=2016-08-01}}
8. {{cite web |url=http://homes.cs.washington.edu/~tqchen/2016/03/10/story-and-lessons-behind-the-evolution-of-xgboost.html |title=Story and Lessons behind the evolution of XGBoost |access-date=2016-08-01}}
9. {{cite web |url=https://github.com/dmlc/rabit |title=Rabit - Reliable Allreduce and Broadcast Interface |access-date=2016-08-01}}
10. {{cite web |url=https://xgboost.readthedocs.io/en/latest/jvm/index.html |title=XGBoost4J |access-date=2016-08-01}}
11. {{cite conference |last1=Chen |first1=Tianqi |last2=Guestrin |first2=Carlos |editor1-last=Krishnapuram |editor1-first=Balaji |editor2-last=Shah |editor2-first=Mohak |editor3-last=Smola |editor3-first=Alexander J. |editor4-last=Aggarwal |editor4-first=Charu C. |editor5-last=Shen |editor5-first=Dou |editor6-last=Rastogi |editor6-first=Rajeev |arxiv=1603.02754 |contribution=XGBoost: A Scalable Tree Boosting System |doi=10.1145/2939672.2939785 |pages=785–794 |publisher=ACM |title=Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, August 13-17, 2016 |year=2016}}
12. {{cite web |url=http://stat-computing.org/awards/jmc/winners.html |title=John Chambers Award Previous Winners |access-date=2016-08-01}}
13. {{cite web |url=https://higgsml.lal.in2p3.fr/prizes-and-award/award/ |title=HEP meets ML Award |access-date=2016-08-01}}

{{artificial-intelligence-stub}}

Categories: Data mining and machine learning software | Free data analysis software | Software using the Apache license | Big data products | Free software programmed in C++ | 2014 software