Olympiad Programming: Methodology of Preparation for IOI (2007 - ...)
Mikhail Dolinsky:
Olympiads in Informatics (IOI Journal 2007 - ...)
Old link

My notes

Volume 9 (2015)
Volume 8 (2014)
Volume 7 (2013)
Volume 6 (2012)
Volume 5 (2011)
Volume 4 (2010)
Volume 3 (2009) - not available
Volume 2 (2008)
Volume 1 (2007) - not available
Mikhail Dolinsky:
During IOI 2008 an international conference devoted to the preparation and running of olympiads in informatics will take place:

IOI Conference - August 18 and 20, 2008, Cairo
Olympiads in Informatics: Tasks and Training

http://www.ioi2008.org/index.php?option=com_content&task=view&id=82&Itemid=60

Here is its programme:

18 August 2008
10.00-10.30 Opening
10.30-11.00 M. Revilla (Spain)
Competitive Learning in Informatics: The UVa Online Judge Experience

11.00-11.30 B. A. Burton (Australia), M. Hiron (France)
Creating Informatics Olympiad Tasks: Exploring the Black Art

11.30-11.45 P. Ribeiro, P. Guerreiro (Portugal)
Early Introduction of Competitive Programming

11.45-12.00 Coffee break
12.00-12.30 K. Diks, M. Kubica, J. Radoszewski, K. Stencel (Poland)
A Proposal for a Task Preparation Process

12.30-12.45 A. Chargueraud, M. Hiron (France)
Teaching Algorithmics for Informatics Olympiads: The French Method

12.45-13.00 E. Cerchez, M. I. Andreica (Romania)
Romanian National Olympiads in Informatics

13.00-13.15 E. Kelevedjiev, Z. Dzhenkova (Bulgaria)
Tasks and Training the Younger Beginners for Informatics Competitions

13.15-13.30 P. Pankov (Kyrgyzstan)
Naturalness in Tasks for Olympiads in Informatics

20 August 2008
10.00-10.30 K. Manev (Bulgaria)
Tasks on Graphs

10.30-11.00 B. Merry, M. Gallotta, C. Hultquist (South Africa)
Challenges in Running a Computer Olympiad in South Africa

11.00-11.15 W. Pohl (Germany)
Manual Grading in an Informatics Contest

11.15-11.30 B. A. Burton (Australia)
Breaking the Routine: Events to Complement Informatics Olympiad Training

11.30-11.45 S. Tani, E. Moriya (Japan)
Japanese Olympiad in Informatics

11.45-12.00 Coffee break
12.00-12.15 A. Truu, H. Ivanov (Estonia)
On Using Testing-related Tasks on the IOI

12.15-12.45 T. Vasiga, G. Cormack, G. Klemkes (Canada)
What Do Olympiad Tasks Measure?

12.45-13.00 T. Verhoeff (The Netherlands)
Programming Task Packages: Peach Exchanging Format

13.00-13.30 Discussion

All the papers have already been published in a dedicated journal:
OLYMPIADS IN INFORMATICS
International Journal
http://www.mii.lt/olympiads%5Fin%5Finformatics/

I intend to read these papers and to quote the most interesting material in replies to this post.
Foreword

The journal is substantially connected with the conference organized during International Olympiads in Informatics (IOI). Papers are requested to be presented during the conference. The main goals of the conference (and journal) are:
– to improve the quality of olympiads,
– to improve the skills of contestants and educators, and
– to discuss developing tasks and topics for olympiads.

This volume concentrates on training and task types.
Mikhail Dolinsky:
Breaking the Routine: Events to Complement
Informatics Olympiad Training

http://www.mii.lt/olympiads_in_informatics/pdf/INFOL025.pdf

Benjamin A. BURTON
Department of Mathematics, SMGS, RMIT University
GPO Box 2476V, Melbourne, VIC 3001, Australia

Here we discuss some of the more interesting events that complement the usual routine of lectures, problems and exams. In particular we focus on three events:
(i) codebreakers, aimed at designing tests and finding counterexamples;
(ii) “B-sessions”, aimed at implementation and rigour; and
(iii) team events, which use challenges in cryptography and information security to encourage teamwork in a competitive setting.

The Australian training schools run for approximately ten days. A typical day consists of lectures and laboratories in the morning (or possibly a programming contest instead), more lectures and laboratories in the afternoon, and then an evening session at a whiteboard where students present and analyse their solutions to problems.

Codebreakers

... it is critical that students be able to identify incorrect algorithms. In order to identify incorrect algorithms, students must be willing to spend time searching for counterexamples, i.e., specific test cases for which an algorithm gives an incorrect answer.

The codebreaker runs as a live (and sometimes noisy!) competition, with a running scoreboard at the front of the room and a contest website that evaluates submissions on the fly (illustrated in Fig. 1). The competition is relatively informal, and is structured as follows:
• Students are given three or four problems. These are all problems that the students have seen and thought about before (typically exact copies of problems from the most recent national contest).
• Students are given several “solutions” to each of these problems. Each solution is given as source code (e.g., C++ or Pascal), and each contains an error, possibly in the implementation of the algorithm or possibly in the algorithm itself.
• While the event is running, students attempt to break these solutions. To break a solution, a student must submit an input file to the contest website. This input file
is scored as follows:
- if the input file is valid (conforms to the problem specification) and the solution does not solve it correctly within the time and memory limits, the solution is considered broken and the student gains ten points;
– if the input file is invalid (does not conform to the problem specification), the student loses two points;
– if the input file is valid but the solution solves it correctly within the time and memory limits, the student loses one point.
• Students may make as many attempts as they wish to break each solution. At the end of the competition, the student with the most points is declared the winner!
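
For illustration only (our sketch, not code from the paper), the scoring rule above could be expressed in C++ roughly as follows; the Verdict type and function names are assumptions:

#include <iostream>

// Possible outcomes of checking one submitted input file against a target "solution".
enum class Verdict {
    InvalidInput,   // file does not conform to the problem specification
    SolutionBroken, // solution gives a wrong answer or exceeds the time/memory limits
    SolutionPassed  // solution answers correctly within the limits
};

// Score change for a single submission, following the rules quoted above:
// +10 for breaking a solution, -2 for an invalid input file, -1 for a valid
// input file that the solution still handles correctly.
int scoreDelta(Verdict v) {
    switch (v) {
        case Verdict::SolutionBroken: return +10;
        case Verdict::InvalidInput:   return -2;
        case Verdict::SolutionPassed: return -1;
    }
    return 0;
}

int main() {
    int total = 0;
    total += scoreDelta(Verdict::SolutionPassed); // unlucky first attempt: -1
    total += scoreDelta(Verdict::SolutionBroken); // counterexample found: +10
    std::cout << "current score: " << total << "\n"; // prints 9
    return 0;
}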

A typical codebreaker contains around 15 solutions to break, and runs for about two hours. Ideally a few (but not too many) students will have successfully broken every solution by the end of the contest, and so the winner is determined not only by the ten
point rewards but also by the 1–2 point penalties. Indeed, the duration of the contest is often changed on the fly, and at least once it was declared to be “until three people have broken every solution”.

Selecting Problems and Solutions
- Problems from the latest national contest work well
- we do not use real erroneous submissions from the national contest

There are several types of errors that can be introduced into these “solutions”, including:
• implementation errors, where the algorithm is correct but the code is buggy, such as using < instead of <= or missing a boundary case;
• algorithmic errors, where the implementation is correct but the underlying algorithm is wrong, such as a greedy solution to a shortest path problem;
• comprehension errors, where the program solves the wrong problem, such as a program that omits one of the conditions in the problem statement.
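
To make the first category concrete, here is a hypothetical buggy fragment of our own (not taken from any real contest) in which < is used instead of <=, together with the kind of counterexample a student would submit:

#include <iostream>

// Intended behaviour: count how many values in a[0..n-1] lie in the
// inclusive range [lo, hi].
int countInRangeBuggy(const int a[], int n, int lo, int hi) {
    int count = 0;
    for (int i = 0; i < n; ++i)
        if (lo <= a[i] && a[i] < hi)   // BUG: should be a[i] <= hi
            ++count;
    return count;
}

int main() {
    // Counterexample an attentive student might submit: a value equal to hi.
    int a[] = {5};
    std::cout << countInRangeBuggy(a, 1, 1, 5) << "\n"; // prints 0, expected 1
    return 0;
}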

As a related event, it is worth noting the TopCoder contests (www.topcoder.com), which incorporate codebreaking into their regular programming competitions. After writing their code, TopCoder contestants enter a brief 15-minute "challenge phase" in which they attempt to break each other's solutions. This allows for a codebreaking environment that is more realistic and competitive, but where the difficulty, variety and readability of the solutions are not controlled.

B-Sessions

A typical B-session runs as follows:

• A single difficult problem is chosen by the organisers. This problem is handed out to students the day before the event. Students are encouraged to think about the problem, either individually or in groups, but they are instructed not to write any
code. The chosen problem should be too hard to give in a regular contest, but the stronger students should be able to solve it given enough time.
• The B-session itself runs for an entire morning or an entire afternoon (usually 3.5 hours). The session begins in the lecture room, where the students discuss their ideas as a group. A staff member chairs the discussion, but ideally the students
should be doing most of the talking.

Once the discussion has finished, students move into the laboratory and enter exam conditions. The remainder of the B-session runs as a programming contest with just one problem (which is the problem they have been working on). For the remaining
2.5 hours, the students must implement the solution that has been discussed and test it thoroughly, with an aim to score 100%.

In general it is nice to use IOI problems for B-sessions.

Team Events

The team event is a deliberate break from IOI-style problem solving. The goals of the team event are:
• to encourage teamwork, which traditional informatics olympiads for secondary school students do not do;
• to encourage students to look outside the narrow focus of informatics olympiads and explore the larger world of computer science;
• to spend an afternoon doing something noisy and fun!
Mikhail Dolinsky:
During their 10-day training camps the Australians hold three sessions a day (before lunch, before dinner, and after dinner). These are mostly lectures and occasionally contests. They also run sessions of a different kind:

- Codebreakers: breaking source code. Incorrect solutions to known problems, or incorrect implementations of known algorithms, are handed out; participants must submit tests (input data) on which the given solutions produce wrong answers.
Each such test earns 10 points.
A test that does not conform to the input/output format costs 2 points.
A test on which the given program produces the correct answer costs 1 point.

- B-sessions: the problem statement is handed out a day in advance. Thinking it over individually and discussing it in groups is encouraged. On the following day a correct algorithm is discussed and worked out for about an hour, and the participants then have 2.5 hours to code it. The problem is taken from a previous IOI.
- Team events: not ACM-style, but simply applied tasks on deciphering texts (reflecting the author's interests). Whoever deciphers them first can work out where the prize is hidden.
Mikhail Dolinsky:
Creating Informatics Olympiad Tasks:
Exploring the Black Art

http://www.mii.lt/olympiads_in_informatics/pdf/INFOL031.pdf

Benjamin A. BURTON
Department of Mathematics, SMGS, RMIT University
GPO Box 2476V, Melbourne, VIC 3001, Australia

Mathias HIRON
France-IOI 5, Villa Deloder, 75013 Paris, France

we explore some of the different techniques that problem setters can use to find new ideas for tasks and refine these ideas into problems suitable for an informatics olympiad. These techniques are illustrated through concrete examples from a variety of contests.

It should be possible to express the task using a problem statement that is (relatively) short and easy to understand. The focus of the task should be on problem solving, not comprehending the problem statement. It is also nice to include some tasks that relate to real-world problems, though this is by no means necessary.
• Ideally the algorithm that solves the task should not directly resemble a classic algorithm or a known problem. It might be a completely original algorithm, it might be a modified version of a classic algorithm, or it might resemble a classic algorithm
after some transformation of the input data.
• The task should support several solutions of varying difficulty and efficiency. This allows weaker students to gain partial marks by submitting simple but inefficient solutions, while stronger students can aim for a more difficult algorithm that scores 100%.
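
As an illustration of that last point (our example, not one from the paper), a task such as "maximum sum of a contiguous subarray" admits an obvious O(n^2) solution worth partial marks and an O(n) solution for full marks:

#include <algorithm>
#include <iostream>
#include <vector>

// Simple O(n^2) solution: try every subarray; weaker students can submit
// this and still score on the smaller test cases. Assumes a is non-empty.
long long maxSubarrayQuadratic(const std::vector<int>& a) {
    long long best = a[0];
    for (size_t i = 0; i < a.size(); ++i) {
        long long sum = 0;
        for (size_t j = i; j < a.size(); ++j) {
            sum += a[j];
            best = std::max(best, sum);
        }
    }
    return best;
}

// O(n) solution (Kadane's algorithm): the kind of solution expected for 100%.
long long maxSubarrayLinear(const std::vector<int>& a) {
    long long best = a[0], cur = a[0];
    for (size_t i = 1; i < a.size(); ++i) {
        cur = std::max<long long>(a[i], cur + a[i]);
        best = std::max(best, cur);
    }
    return best;
}

int main() {
    std::vector<int> a = {-2, 1, -3, 4, -1, 2, 1, -5, 4};
    std::cout << maxSubarrayQuadratic(a) << " " << maxSubarrayLinear(a) << "\n"; // 6 6
    return 0;
}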

Many of the examples in this section are taken from the French-Australian Regional Informatics Olympiad (FARIO), the France-IOI training site, and the Australian Informatics Olympiad (AIO). Full texts for all of these problems are available on the Internet:
the FARIO problems from http://www.fario.org, the France-IOI problems from http://www.france-ioi.org (in French), and the AIO problems from the Australian training site at http://orac.amt.edu.au. The last site requires a login, which can be obtained through a simple (and free) registration.
Mikhail Dolinsky:
Well, as far as how tasks are created goes, they have not exactly broken new ground.
However, one of the authors of the paper mentioned that he has prepared tasks for APIO.
I "consulted Google" and found the site
http://www.apio.olympiad.org
Asia-Pacific Informatics Olympiad
They started in 2007 and have already held the 2008 edition.
They publish the problem statements, tests and results, and the CHINESE take part!!!
I think it would be highly sensible to use their tasks at the second training camp. For now, we will try to solve the 2008 problems on August 10, if Vova Timoshkov manages to set them up in time.
Mikhail Dolinsky:
Romanian National Olympiads in Informatics and Training
http://www.mii.lt/olympiads_in_informatics/pdf/INFOL027.pdf

Emanuela CERCHEZ
Informatics College “Grigore Moisil”
Petre Andrei 9, Iasi, 700495 Romania

Mugurel Ionuţ ANDREICA
Politehnica University
Splaiul Independentei 313, Bucharest, 060032, Romania

http://campion.edu.ro
In 2002 we started .campion, an online training program for performance in informatics, supported by teachers with remarkable results and brilliant students, former winners of international informatics olympiads.

The main goal of .campion training program is to offer all students an equal chance to participate in a high level training in computer science. The specific objectives of our training are:
• to develop algorithmic thinking,
• to develop programming skills,
• to develop competitive spirit.

Students are divided into 3 training groups, according to their level/age:
• group S (Small) – beginners in computer science, the first year in programming; age should not exceed 16 years (9th grade or primary school students);
• group M (Medium) – intermediate level, one or two years experience in programming; age should not exceed 17 years (10th grade);
• group L (Large) – advanced in programming (11th and 12th grade).

The training program consists of rounds: alternating training rounds and contest rounds. For each round, students receive 2 problems, to be solved in 10 days for a training round, and in 3 hours for a contest round.

Solutions are graded using an automatic online grading system. After grading, the following are published on the website:
• for each student: personal grading sheets;
• for each group: rankings for the current round and also total rankings over all the rounds;
• for each problem: solutions, solution descriptions, grading tests.
All past problems are organized in an archive (task descriptions, tests, solutions).

Each year the best 50–60 students participate in an on-site final round consisting of a single contest day. The main goal of the final round is to offer students the opportunity to get to know each other, to compete in a real rather than virtual environment, and to be awarded and acknowledged.

http://infoarena.ro
Infoarena is a training website made by students for students. On this website, students organize programming contests, publish educational materials, and exchange ideas.
A problem archive is available on the website, including problems from Infoarena contests as well as problems from different stages of the National Olympiad in Informatics. Online grading is available for Infoarena contests and also for the archive problems. The Infoarena team has implemented a rating system reflecting the performance of Infoarena users.

The Romanian National Olympiad in Informatics has three stages and two divisions. The first division is for gymnasium students (5th to 8th grade), while the second division is for high-school students (9th to 12th grade). For each division three stages are organized: local, regional and national.

The National Olympiad in Informatics gathers about 300 high-school students and about 160 gymnasium students. The contest is held over two consecutive days and consists of 3 problems for each contest day. After a day's break, half of the students may participate in two further contests, whose purpose is to select the national informatics teams (10 students for the Junior Team, 24 students for the Senior Team).

Two training camps are organized for the national teams. Training consists of theoretical courses and also of problem-solving sessions. During the training camps students take selection contests (3 for each camp), in order to select the teams for IOI, BOI, CEOI
and JBOI.
Mikhail Dolinsky:
Romania has a well-established training system.
Every 10 days there is a homework round and a 3-hour contest.
The top 50-60 students by these results come together for the national olympiad.
Roughly half of them are then invited twice more to a "training camp", each with 3 selection contests.
The Romanian teams take part in the IOI, the Balkan OI, CEOI and JBOI.
And quite successfully: over 1997-2007 their IOI results put them in 4th place, after China, Russia and Poland. In that period they won 43 medals for 44 participants, including 14 gold and 19 silver.
Mikhail Dolinsky:
Teaching Algorithmics for Informatics Olympiads: The French Method
http://www.mii.lt/olympiads_in_informatics/pdf/INFOL018.pdf

Arthur CHARGUERAUD, Mathias HIRON
France-IOI
5, Villa Deloder, 75013 Paris, France

All the teaching materials that have been produced are made available to the public on the training website (http://www.france-ioi.org), where the students can study all year long. Intensive live training sessions are held several times a year for the top
students.

The training curriculum we present differs from those that can be found in many algorithmic courses. Most curriculums follow a common pattern:
1. Provide basic knowledge. Teach algorithms and data-structure through lectures, providing pseudo-code and complexity analysis.
2. Show examples. Illustrate how to apply this knowledge to practical situations.
3. Give exercises. Solidify what was taught, and try to develop solving skills.
This approach counts mostly on a student's innate abilities when faced with new, more difficult problems.

Our thesis is that improving skills to solve new, difficult problems is what really matters in algorithmics. Therefore the true aim of a teaching curriculum should be to improve those skills. Providing the student with knowledge and experience
is necessary, but should not be the first priority.

The interactive training is structured around sets of exercises with each set focusing on one particular area of the field. Within a set, each exercise is a step towards the discovery of a new algorithm or data-structure. To solve an exercise the students submit their source-code and the server automatically evaluates the code, in a fashion similar to grading systems that are used in many contests. If the students fail to solve an exercise, they
may ask for automated hints or new intermediate problems, which in turn ease the progression. Once the students succeed, they get access to a detailed solution that describes the expected algorithm and presents a reasonable path to its discovery. At the end of each sequence of exercises, a synthesis of what has been learned is provided.

Overall the curriculum can be seen as a guided step-by-step discovery complemented with instructional material, as opposed to more standard curriculums, which are typically based on instructional material and very little discovery learning.

The training website aims at being an entirely self-contained course in algorithmics. The curriculum is divided into three levels of difficulty: beginner, intermediate, and advanced.
The only requirement for the “beginner” level is to have some experience of programming in one of the languages supported by the server. At the other end, the “advanced” section contains techniques and problems corresponding to the level of the IOI. Note that the training website also includes instructional programming courses and exercises for the C and OCaml languages.

Each of the three levels contains several series of exercises of the following kinds:

1. Step-by-step discovery.

A sequence of problems designed to learn about a set of classical algorithms on one particular topic. Each problem comes with a detailed solution, as well as optional automated hints and intermediate exercises.

2. Application problems.

A non-ordered set of problems of various topics involving variations on algorithms and techniques previously learned through step-by-step discovery.

3. Rehearsal exercises.

Students are asked to write their best implementation of each standard algorithm studied in a given domain. Reference implementations are then provided.

4. Practice contests.

Students are trained to deal with limited time and to get used to zero tolerance for bugs.
Mikhail Dolinsky:
The paper describes the "French" method of teaching informatics.
Since 1997 the French have won 1 gold, 4 silver and 13 bronze medals (18 in total - not that many; for comparison, Belarus took 35 medals over the same period, including 3 gold and 16 silver).
Nevertheless, the approach is interesting and has something in common with our own training.

The main ideas are:

1. The goal is to develop a training website that teaches AUTONOMOUSLY (without teachers):
http://www.france-ioi.org
unfortunately not in English (it is in French).

2. In each topic, teaching proceeds through numerous hints and lead-in problems. In this way every student is given the chance to "make the discovery" - to invent the algorithm (or whatever part of it they are capable of).

3. Assessment tasks and competitions are woven into the training process.

4. Prior ability to program in one of the languages supported (for testing) by the server is required.
Mikhail Dolinsky:
Tasks and Training the Youngest Beginners for Informatics Competitions
http://www.mii.lt/olympiads%5Fin%5Finformatics/htm/INFOL028.htm

Emil KELEVEDJIEV, Zornitsa DZHENKOVA
Institute of Mathematics and Informatics, Bulgarian Academy of Sciences, Akad. G. Bonchev str., block 8, 1113 Sofia, Bulgaria

Mathematical High School, 2 Elin Pelin str., 5300 Gabrovo, Bulgaria

In Bulgaria children are taught from the 4th grade up to the 12th, divided into 5 age groups: 4-5, 6, 7-8, 9-10, 11-12.

To classify the tasks with which training should begin, the authors analysed tasks from real contests for beginners from three points of view:
a) basic concepts of the programming language (C/C++) together with simple data types: numbers, characters, strings, texts (as a collection of lines and delimiters), one- and two-dimensional arrays, arrays of strings, sequences of input elements;
b) basic control structures that make up a program: simple computations by a given formula, the conditional statement (if), the conditional loop (while), combinations of loops and conditions, nested loops, recursion, use of procedures (functions in C/C++);
c) algorithms (with respect to their objects): integers and divisibility, digits of a number, long numbers, combinatorics, sorting, recursion, geometry (rectangles with sides parallel to the coordinate axes), implementation (including date and time intervals, text processing, etc.).
Olympiads for pupils in grades 4-7 have been held in Bulgaria since 2001; here is the distribution of their tasks by topic:

Sequential processing            17       Combinatorial analysis  2
Digits from a number             16       Dynamic programming     2
Print out a figure of characters 12       Games and strategies    2
Counting                         11       Geometry                2
Divisibility                     10       Number systems          2
Text processing                  10       Palindrome              2
Optimal elements                  9       Rectangular figures     2
Logical                           7       Recursion               2
Dates                             6       Decomposing numbers     1
Long numbers                      6       Exhaustive search       1
Sorting                           4       Fractional numbers      1
Modeling                          3       Parity                  1
String of digits                  3       Raising to a power      1


Accordingly, the proposed training scheme is as follows:

Group E (grades 4-5)

Programming: Environment for C/C++, Branch and loop operators, Integers and Characters. One-dimensional array. Standard input and output. Algorithms: Whole numbers arithmetic. Dates. Geometry: Straight line coordinates.

Group D (grade 6)

Programming: Extended study of the programming language. Introduction to pointers. Data structures: Arrays and Strings. Multi-dimensional arrays. Stacks and Queues. Methods for algorithm design: Simple exhaustive search. Recursion. Introduction to dynamic programming. Binary search in a sorted array. Arithmetic: Divisibility. Euclid's algorithm. Long integers. Number systems. Sequences: Searching, sorting, Merging, Polynomials. Combinatorics: Counting, Generating combinatorial configurations. Graphs: Representations, Grid of squares. Geometry: Coordinates in the plane. Rectangles with sides parallel to the axes. Games: Strategies, Parity, Symmetry.
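
As a reminder of the level assumed here, a minimal C++ sketch of Euclid's algorithm, one of the Group D arithmetic topics (our illustration, not code from the paper):

#include <iostream>

// Euclid's algorithm: greatest common divisor of two non-negative integers.
int gcd(int a, int b) {
    while (b != 0) {
        int r = a % b;  // replace (a, b) with (b, a mod b)
        a = b;
        b = r;
    }
    return a;
}

int main() {
    std::cout << gcd(48, 36) << "\n";  // prints 12
    return 0;
}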
6th grade
       Themes                  Study hours
 1 Functions in C language                      2
 2 One-dimensional array                        2
 3 Sorting                                      4
 4 Strings                                      4
 5 Divisibility. Prime numbers                  4
 6 Euclid's algorithm. Common Fractions         4
 7 Strings in C++ style.                        4
 8 Two-dimensional arrays                       6
 9 Rectangles with sides parallel to the axes   4
10 Structures in C language                     4
11 Recursion.                                   2
12 Number systems                               6
13 Long integers                                6
14 Backtracking                                 7
15 Grid of squares                              6
Total                                          65

7th grade

    Themes                                Study hours
1 Parameters of the functions in C              3
2 Introduction to the standard library          2
3 Sorting – fast algorithms                     2
4 Searching – binary search                     2
5 Introduction to complexity of algorithms      2
6 Introduction to object-oriented programming   2
7 Combinatorial configurations                  2
8 Extended Euclid's algorithms                  3
9 Roman numerals                                2
10 Polynomials                                  4
11 Pointers in C                                2
12 Stack and Queue                              3
13 Linked Lists                                 2
14 Searching substrings in strings              3
15 Games with numbers – using symmetry and parity 4
16 Rectangles                                   3
17 Bitwise operations                           2
18 Long integers                                3
19 Backtracking                                 4
20 Introduction to Dynamic Programming          5
21 Introduction to Graphs                       5
Total                                          60

Mikhail Dolinsky:
Frankly speaking, I do not understand the hours listed. Are these lecture hours? Fine, a lecture was delivered - but what did the children actually absorb? How many hours of practice do they need for that? Or are Bulgarian children such super-kids?
Mikhail Dolinsky:
Tasks on Graphs
http://www.mii.lt/olympiads%5Fin%5Finformatics/pdf/INFOL029.pdf

Krassimir MANEV
Department of Mathematics and Informatics, Sofia University and
Institute of Mathematics and Informatics, Bulgarian Academy of Sciences
5, J. Bourchier blvd., 1164 Sofia, Bulgaria

A classification of graph tasks:

1. Breadth first search and related (including connectivity checking, identifying connected components, shortest path in graphs without cost function, etc.)
2. Depth first search and related (including topological sorting + some optimization, strongly connected components, etc.).
3. Euler traversals
4. Minimum spanning tree
5. Shortest path (both single-source and all-sources) and related
6. Matching and flows in networks
7. Games on graphs (of Nim type)
8. Exhaustive search
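
To anchor the first category, a minimal BFS sketch of our own (not from the paper) that computes shortest paths, in number of edges, in a graph without a cost function:

#include <iostream>
#include <queue>
#include <vector>

// Category 1 above: breadth-first search gives shortest paths (in edges)
// from a source vertex in an unweighted graph.
std::vector<int> bfsDistances(const std::vector<std::vector<int>>& adj, int src) {
    std::vector<int> dist(adj.size(), -1);
    std::queue<int> q;
    dist[src] = 0;
    q.push(src);
    while (!q.empty()) {
        int u = q.front(); q.pop();
        for (int v : adj[u])
            if (dist[v] == -1) {
                dist[v] = dist[u] + 1;
                q.push(v);
            }
    }
    return dist; // -1 marks vertices unreachable from src (connectivity check)
}

int main() {
    // Edges 0-1, 1-2, 0-3; vertex 4 is isolated.
    std::vector<std::vector<int>> adj = {{1, 3}, {0, 2}, {1}, {0}, {}};
    for (int d : bfsDistances(adj, 0)) std::cout << d << " "; // 0 1 2 1 -1
    std::cout << "\n";
    return 0;
}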
Mikhail Dolinsky:
A classification of tasks is useful for knowing which topics need to be mastered. It seems to me that the classification given in our "Methods of Algorithmization" is considerably more detailed and, accordingly, more useful.
Mikhail Dolinsky:
Competitive Learning in Informatics: The UVa Online Judge Experience
http://www.mii.lt/olympiads%5Fin%5Finformatics/pdf/INFOL035.pdf

Miguel A. REVILLA
Applied Mathematics Department, University of Valladolid
Prado de la Magdalena s/n, 47011-Valladolid, Spain

Shahriar MANZOOR
Computer Science and Engineering Department, Southeast University
24, Kemal Ataturk Avenue, Dhaka, Bangladesh

Rujia LIU
Department of Computer Science and Technology, Tsinghua University
Qinghua Yuan, Haidian District, 100084 Beijing, China


The EduJudge European Project

The project has been funded with support from the European Commission within the framework of the Lifelong Learning Programme of the European Union. Besides the University of Valladolid, there are three more partners from different European countries: the University of Porto (Portugal), the KTH Royal Institute of Technology (Stockholm, Sweden) and the Institute of Mathematics and Informatics (Vilnius, Lithuania).

The main goal of the project is to give a greater pedagogical character to the UVa Online Judge and to adapt it into an effective educational environment for higher and secondary education. We want to give the Online Judge a pedagogical character by means of a redesign, improvement of its contents, and integration into an e-learning platform. All of this will contribute to the development of quality lifelong learning, and will also promote innovation by providing new methods of teaching through contests, instead of focusing exclusively on competition.

The different tasks are:

• Solution quality evaluation: the user will receive more complete feedback from the system, not only indicating whether the problem is solved but also grading the quality of the solution. This can range from an insignificant solution (less than 50% correctness) to a completely correct solution (100%).

• Generic Judge Engine: the system will support several problem formats, allowing different kinds of learning approaches. By allowing different formats of problems the system is not limited to a right/wrong evaluation method. There can be problems in which the challenge is not only solving a problem, but solving it in the
most efficient way. Also there can be cases where a student's solution must ‘compete’ against another student's solution in an interactive way. Having a generic judge engine that can easily be extended to support more problem formats will make this possible.

• Automatic Test Case Generation: the automatic generation of test cases will allow different levels of difficulty in solving the problems. Having good-quality, heavily checked test cases is essential for a good evaluation of a solution, and creating such cases by hand is a difficult task. The automatic system should be able to generate good test cases based on a given set of rules describing the format of a test case. There is another interesting use of automatic test case generation; we give just a trivial example here. Suppose a user is trying to solve “The closest pair Problem” – given n points, find the distance between the closest two points. The user finds that he is struggling to solve the problem with n <= 10000. Using the generator, he can generate test cases with smaller values of n and check whether his program works for them. We could even propose a WIKI system where users submit their own generators, and our input tester checks whether a submitted generator produces data according to the specified rule before the input is passed to other users' solutions. In other words, if the problem's input specification is editable by the user, he can generate his own test cases with different values of the parameters and effectively solve a rather different problem from the one the original problem setter intended. Of course, all these changes will stay within certain limits so that the original author's solution can still solve it. All of this will help teachers to create easier problems for their weaker and/or younger students. The teacher can also specify the format in which he wants the test cases to be: IOI, ICPC or any other new format. But to implement all of this, a good number of dedicated people need to be involved.
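
Staying with the closest-pair example from the text, a hypothetical sketch of what a rule-driven test case generator might look like; the parameters n, maxCoord and the output format are our assumptions, not part of the EduJudge specification:

#include <cstdlib>
#include <iostream>

// Illustrative generator for a "closest pair of points" task:
// prints n random points with integer coordinates in [0, maxCoord].
// A student struggling with n <= 10000 could call it with a smaller n.
void generateClosestPairCase(int n, int maxCoord, unsigned seed) {
    std::srand(seed);
    std::cout << n << "\n";
    for (int i = 0; i < n; ++i) {
        int x = std::rand() % (maxCoord + 1);
        int y = std::rand() % (maxCoord + 1);
        std::cout << x << " " << y << "\n";
    }
}

int main() {
    generateClosestPairCase(10, 1000, 42); // small case for step-by-step debugging
    return 0;
}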

7. About the Categorization of Tasks

Regarding classification, serious solvers are not interested in having a classification of the UVa archive. They think that making it public would take away part of the fun: often the most important step in solving a problem is to decide which design technique to use. So some of our previous attempts have failed.

Of course, grouping problems into specific categories is very useful, especially for beginners, teachers, or those who want practice on a particular problem type. Users can then try to solve a variety of problems of the same type in order to master the technique. And in the frequent case when a problem can be assigned to several classes, it is also very useful for learning the algorithmic concepts behind the technique itself.
However, maintaining such a list is a really hard task, especially when the number of problems is as large as 2500+, and doing it well is very troublesome. First of all, we need a prototype controlled list into which to classify the problems. Even though there is an almost standard, universally accepted list, experience shows that the contributions of the users must be managed if we want to prevent real chaos. The first version of our judge let users fill in a field describing the algorithm they had used in their code; many people did not use it, or made undesirable use of it, which rendered the field useless. There are too many details to settle before going ahead with such helping tools, because they could be negative and confusing if we are not careful enough.
As for difficulty level, the problem is even worse, since for most problems it is a very subjective opinion that depends on the expertise of the person making the decision. Probably most people agree about the trivial and very hard classes, but opinions about the intermediate levels can be almost impossible to fix. And this is a very important detail for the learning efficiency of the site. A user trying an “easy” problem again and again without getting a positive answer probably comes away with a traumatic experience. So it must be made very clear that difficulty level is a relative concept, and a good idea is to complement this kind of information with additional data, mainly heuristics derived from the statistics. In fact, the only published classifications we have made are included in the book that the first author wrote with Professor Skiena. After arguing for a long time and working a lot, we decided that popularity was a better measure than difficulty, and that success rate was better than difficulty.
 