Posing Interactive Contest Problems in the ICPC Scoring Model
Ed Skochinski, Jefery Roberts, and Marc Furon
ACM ICPC Competitive Learning Symposium, April 2008
Banff, Alberta, Canada



Maze Problem from the 2004-05 Southern California Regional ICPC Contest

The Maze problem was the Southern California Region's first attempt at an interactive problem. It was an experimental problem chosen for its familiar nature and ease of implementation of the server process. The only variation on the classical maze was the hexagonal rather than orthogonal grid. The problem introduced some key concepts: the use of Linux named pipes as the interprocess communications (IPC) method, the ability to isolate the client and server for debugging purposes, mapping client (mis)behavior to standard judging responses, detailed scrutiny of the configuration file, and a transcript file.
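The named-pipe IPC arrangement can be sketched in a few lines. The following is a minimal illustration only, with hypothetical pipe names and a made-up PROBE/OPEN exchange; the actual Maze protocol and pipe naming scheme are not reproduced here.

```python
import os
import tempfile
import threading

# Two Linux named pipes: one per direction (hypothetical names; the real
# contest used its own naming scheme).
tmp = tempfile.mkdtemp()
to_server = os.path.join(tmp, "to_server")   # client -> server
to_client = os.path.join(tmp, "to_client")   # server -> client
os.mkfifo(to_server)
os.mkfifo(to_client)

def server():
    # open() on a FIFO blocks until the other end is opened as well,
    # so the server must run concurrently with the client.
    with open(to_server) as rx, open(to_client, "w") as tx:
        move = rx.readline().strip()
        # Made-up response logic standing in for the maze layout lookup.
        tx.write("OPEN\n" if move == "PROBE 2 3" else "ERROR\n")

t = threading.Thread(target=server)
t.start()

# The client side of one probe/response round trip.
with open(to_server, "w") as tx, open(to_client) as rx:
    tx.write("PROBE 2 3\n")
    tx.flush()                      # push the line out before blocking on the reply
    reply = rx.readline().strip()

t.join()
print(reply)
```

Because a FIFO open blocks until both ends are attached, either process can be run standalone against a terminal or a file in place of its peer, which is what made the isolate-for-debugging workflow possible.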

The server process was not very sophisticated. It did not reason or play against the client process; it was a neutral responder, exposing the layout of the maze move by move as the client probed neighboring grid spaces. Although secure and robust, the Maze server lacked the console (interactive terminal) prompting of UniClue, the 2006-07 interactive problem. Many protocol and handshaking errors were judged as RUNTIME ERROR, which allowed the judges to add explanatory text to their responses. Additionally, the transcript was not a true transcript; rather, it reported the client and server conversation along with maze coordinate information. This early transcript was useful, but it could not easily be filtered for feedback into the client or server.

Contestants had full access to the server executable during the contest, but the judges' test configuration files were hidden, just as the judges' test cases for traditional "batch" problems were exposed only to submitted runs. Contestants used the sample configuration file just as they would use sample input files for traditional batch problems. To further test their client programs, contestants were instructed how to form their own configuration files. The server had to read these new configuration files, diagnosing and rejecting configuration errors so that it neither aborted nor behaved in a misleading fashion.
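The shape of that validation step can be sketched as follows, under an assumed key=value configuration format; the actual Maze configuration syntax is not reproduced in this excerpt, so the keys and rules here are illustrative only.

```python
# Sketch of server-side configuration validation (hypothetical key=value
# format with made-up required keys; not the real Maze syntax).
def validate_config(text):
    """Return a list of diagnostics; an empty list means the file is acceptable."""
    errors = []
    seen = {}
    for lineno, line in enumerate(text.splitlines(), 1):
        line = line.strip()
        if not line or line.startswith("#"):
            continue                      # skip blanks and comments
        if "=" not in line:
            errors.append(f"line {lineno}: expected key=value")
            continue
        key, _, value = line.partition("=")
        key, value = key.strip(), value.strip()
        if key in seen:
            errors.append(f"line {lineno}: duplicate key '{key}'")
        seen[key] = value
    if "rows" not in seen or "cols" not in seen:
        errors.append("missing required keys 'rows' and/or 'cols'")
    for k in ("rows", "cols"):
        if k in seen and not seen[k].isdigit():
            errors.append(f"'{k}' must be a positive integer")
    return errors

print(validate_config("rows=5\ncols=7\n"))       # no diagnostics
print(validate_config("rows=5\nrows=6\nbad\n"))  # diagnostics, not a crash
```

The point of collecting diagnostics rather than raising on the first problem is exactly the behavior described above: the server reports what is wrong and refuses to run, instead of aborting mid-game or silently misbehaving on a bad file.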

The server gave quite detailed error messages, which benefited the judges during server and solution (client) development. Contestants saw these same messages during their own client development, but submitted runs returned only the standard judging responses. This level of detail guided the contestants past handshaking and protocol errors, letting them concentrate on the interactive nature of the problems.

Transcripts from submitted runs were never returned to the contestants.

UniClue Problem from the 2006-07 Southern California Regional ICPC Contest

UniClue implemented a subset of the Hasbro "Clue" (Cluedo) game. Using the experience gained from the earlier Maze problem, console (terminal) I/O prompting was added so that a human acting as the client could tell when it was their turn to talk. UniClue's protocol was still simple, but more complicated than Maze's. Maze's cautious response of RUNTIME ERROR to protocol violations was replaced with PRESENTATION ERROR (see the proceedings excerpt for an explanation). In addition to prompting, the reporting of protocol errors was enhanced, and enforcement of protocol violations was relaxed for console I/O.

Transcripts became true records of the dialog between client and server. A few simple text editing commands could be used to separate client messages for subsequent feedback to the standalone server.
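That filtering step can be sketched as follows, assuming hypothetical "C>"/"S>" speaker tags on each transcript line; the real UniClue transcript format is not reproduced in this excerpt.

```python
# Sketch: extract the client's side of a transcript for replay against the
# standalone server. The "C>"/"S>" speaker tags are an assumption; the
# paper does not show the actual transcript format.
transcript = """\
C> PROBE 2 3
S> OPEN
C> MOVE 2 3
S> OK
"""

client_only = [line[len("C> "):] for line in transcript.splitlines()
               if line.startswith("C> ")]
print("\n".join(client_only))
```

This is the same operation the paper describes doing with "a few simple text editing commands" (e.g. a one-line stream-editor filter); the only requirement is that the transcript tag each line with its speaker so the two sides of the dialog can be separated mechanically.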

The server itself became more sophisticated: it played against the client with a limited defensive strategy. The configuration file was correspondingly more complex, and the code to validate contestants' configurations grew accordingly.

The same development policies applied as for Maze: contestants had full access to the server executable while the judges' test configuration files remained hidden; contestants could form their own configuration files for testing, which the server had to diagnose and reject when malformed; detailed error messages were available during client development, while submitted runs returned only the standard judging responses; and transcripts from submitted runs were never returned to the contestants.


Posted on Thu, 23-Apr-2015 18:32:21 MST
Copyright © 2008 ACM Southern California Regional Programming Contest <webmaster@socalcontest.org>