We've added a citation format for referencing the KDD Cup 2010 datasets. Please remember to cite any KDD Cup datasets you use in your research using this format.
The log-in and registration pages were not working from 5/24/2012 until today. They are now working again.
During the KDD Cup Workshop, some participants suggested that we change the way the leaderboard works so that we display the same type of scores that were used to determine the competition winners (by validating most of the predictions instead of a small portion). We've made this change and added a toggle at the top of the leaderboard and submission pages that lets you switch back to the way the leaderboard worked during the competition. Try it out, or read the FAQ for more info.
The KDD Cup Workshop page is now up. The workshop, which will be held on July 25, 2010 as part of the KDD conference in Washington, DC, will include a discussion of the KDD Cup 2010 competition, and the winning teams will present their work.
Fact sheets submitted by this year's competitors are now available, and are linked from the full results table. Learn more about the competitors and their methods by reading their fact sheets.
The KDD Cup 2010 site is now open for post-competition submissions. If you would like to continue working on the challenge task and receive feedback from the online submission process and leaderboard, you can now do so.
How generally or narrowly do students learn? How quickly or slowly? Will the rate of improvement vary between students? What does it mean for one problem to be similar to another? It might depend on whether the knowledge required for one problem is the same as the knowledge required for another. But is it possible to infer the knowledge requirements of problems directly from student performance data, without human analysis of the tasks?
This year's challenge asks you to predict student performance on mathematical problems from logs of student interaction with Intelligent Tutoring Systems. This task poses challenging technical problems, has practical importance, and is scientifically interesting.
At the start of the competition, we will provide 5 data sets: 3 development data sets and 2 challenge data sets. Each data set will be divided into a training portion and a test portion. Student performance labels will be withheld for the test portion of the challenge data sets but available for the development data sets. The competition task is to develop a learning model based on the challenge and/or development data sets, use that model to learn from the training portion of the challenge data sets, and then accurately predict student performance in the test portions. At the end of the competition, the winner will be determined by each model's performance on an unseen portion of the challenge test sets. We will evaluate only each team's last submission for the challenge data sets.
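For teams getting oriented, here is a minimal sketch of the train-then-predict workflow described above. The file names and column names ("Row", "Anon Student Id", "Correct First Attempt") are illustrative assumptions, not a specification of the released files; consult the official data documentation for the actual format.

```python
# Minimal sketch: train on the training portion of a challenge data set,
# then predict student performance on its test portion.
# File and column names below are assumptions for illustration only.
import pandas as pd

# Load the training and test portions of one (hypothetical) challenge data set.
train = pd.read_csv("challenge_train.txt", sep="\t")
test = pd.read_csv("challenge_test.txt", sep="\t")

# Baseline model: predict each student's mean rate of answering correctly on
# the first attempt, falling back to the global mean for unseen students.
global_mean = train["Correct First Attempt"].mean()
student_mean = train.groupby("Anon Student Id")["Correct First Attempt"].mean()

predictions = test["Anon Student Id"].map(student_mean).fillna(global_mean)

# Write predictions as a row identifier plus a predicted value, one per test row.
submission = pd.DataFrame({
    "Row": test["Row"],
    "Correct First Attempt": predictions,
})
submission.to_csv("submission.txt", sep="\t", index=False)
```

A per-student mean is only a starting point; the intent is to show the shape of the workflow (fit on the training portion, predict on the test portion, submit predictions), not a competitive method.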
Date | Event
---|---
March 15 | Call for participants
April 1 | Registration opens at 2pm EDT, development data sets available
April 19 | Competition starts at 2pm EDT, challenge data sets available
June 8 | Competition ends at 11:59pm EDT
June 14 | Fact sheet and team composition info due by 11:59pm EDT
June 21 | Winners announced
July 25 | KDD Cup Workshop
*Cup Score shown (validation against the withheld contest portion of the test set, which is a majority of the data).
KDD Cup is the annual Data Mining and Knowledge Discovery competition organized by the ACM Special Interest Group on Knowledge Discovery and Data Mining (KDD), the leading professional organization of data miners.
This year's competition is hosted by PSLC DataShop. Learn more about the organizers and sponsors.
Contact us via email.