Monday, May 22

Zombology AI Competition

There's been some interest in the AI competition, so I'm going ahead with it - thanks very much for your offers to help me test the balance of Zombology.

Purpose

I'm trying to test the balance of my card game Zombology after some rules changes in the second edition, which I'll be releasing later this year. To test the game properly I need to play it hundreds of times, which I'm struggling to do in the flesh around a full-time job, a young family and an imminent second child! So I've written a game harness that plays the game on a computer, and I want to test it with reasonably realistic, human-mimicking AIs. The aim is to write an AI to play Zombology and have it be the best at winning the game across all player counts. Your AI will not be used for any purpose other than the competition and the testing of the game's balance.

Rules

The test will be based on the Zombology second edition rules.

You will need to download the game solution and then implement, in C#, a subclass of the Player class in the solution. You can use the ConservativeSciencePlayer and RiskySciencePlayer examples for inspiration. Your class must obey the following rules:

  1. You must implement all of the abstract methods on the Player class which describe your choices and return your chosen option.
  2. You cannot change any game state (the base Player class will do this on your behalf).
  3. You cannot share state between multiple instances of your player in a game (i.e. no static members).
  4. You can only use information that would be available to a human player in your position (this should be enforced by the GameState and PlayerState classes - feel free to remember things from previous rounds, but no looking at cards you haven't seen, etc.).
  5. All entrants must provide their Player-derived class to me by the end of 30th June 2017 BST (UTC+1); entries received after this time will not be included.
  6. In the event that a bug in your class results in the game harness crashing, I will give you as much information as I can about the crash and give you one week to fix it before re-running the test.
If you need any changes or bug fixes to my existing classes, please detail them in the comments to this post; I'll incorporate them and re-share the solution so everyone has the same information. Please comment below and subscribe to the comments on this post if you are entering, so that I know who you are and how many entrants there are, and so that you get notified if I have to change anything. I will copy your player class (and any ancillary classes you've created) into my solution for the test; any changes you've made to the existing classes will not be preserved.
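
To make the rules concrete, here's a sketch of what an entrant's class might look like. This is entirely hypothetical: the real abstract Player class lives in the downloadable solution, and the method name and signature below are invented for illustration.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical stand-in for the real abstract Player class in the solution;
// the actual class defines one decision hook per choice in the game.
public abstract class Player
{
    public abstract int ChooseCardToPlay(IReadOnlyList<int> hand);
}

public class MyEntrantPlayer : Player
{
    // Per-instance memory is allowed; static members are not (rule 3).
    private readonly List<int> cardsSeen = new List<int>();

    public override int ChooseCardToPlay(IReadOnlyList<int> hand)
    {
        cardsSeen.AddRange(hand);   // remembering cards you've seen is fine (rule 4)
        // Return a choice; never mutate game state yourself (rule 2).
        int best = 0;
        for (int i = 1; i < hand.Count; i++)
            if (hand[i] > hand[best]) best = i;
        return best;
    }
}
```

The decision logic here (play the highest card) is deliberately naive; the point is the shape of the class, not the strategy.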

The Test

I will have the harness play 60,000 games: 10,000 each with 3, 4, 5, 6, 7 and 8 players. For each slot in each game, the harness will pick a player at random from my ConservativeSciencePlayer, my RiskySciencePlayer and all the entrants' players. I will record each player class's total games played and games won, summed across all game sizes. The class with the best win/play ratio across all games will be declared the winner. If the top two or more entrants are within 0.01% of each other, I will repeat the test using just the 'tied' entrants; the winner of that second test will be declared the winner, even if it is still within 0.01%.
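
For concreteness, the scoring might be implemented like this (my own sketch; I'm reading the 0.01% tie threshold as an absolute difference of 0.0001 in win/play ratio):

```csharp
using System;

// Sketch of the scoring rule: win/play ratio summed over all game sizes,
// with a 0.01% (0.0001 absolute) tie window. Names are illustrative.
public static class Scoring
{
    public static double WinRatio(int gamesWon, int gamesPlayed) =>
        gamesPlayed == 0 ? 0.0 : (double)gamesWon / gamesPlayed;

    public static bool IsTied(double ratioA, double ratioB) =>
        Math.Abs(ratioA - ratioB) <= 0.0001;
}
```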

Remember it's a semi-co-op game: the best result is being the sole winner, the second best is being a shared winner, third best is everyone losing, and the worst outcome is that you lose while others win.

Prize

The winning AI gets huge kudos, plus I'll give you a free, signed and numbered copy of the second edition of Zombology, including free postage worldwide.

Monday, May 15

Bot-Loads of Data!

As I've mentioned over the last couple of weeks, I've been working on a framework to play Zombology automatically, with AI players making all the decisions. On Friday night, I finally got a basic version of the application working.

The whole point of this is that I need more data about the relative difficulty of the new rules of the game and how well they scale by number of players. I drew a graph a couple of months ago of how I wanted it to look, with the chance of winning in each round being twice as high as the previous round, and the chance of all players losing being 50% (when the game is won there are winners and losers - it's semi-co-operative):

To build a graph that looks like that I need to play at least 128 games with each number of players (3-8), so 768 in total. So far I've managed 33:

While this is looking alright, there's nowhere near enough data yet - I've not lost a 7- or 8-player game with the new rules, but we've only played twice at each of those player counts. Clearly, waiting around for me to play it enough with real people is a non-starter.
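
As a quick sanity check, the target curve described above works out to concrete numbers. Assuming the game can be won in any of its eight rounds, the win chance doubles each round and the overall loss chance is 50%, so the round weights 1, 2, 4, ..., 128 (summing to 255) get scaled to a 50% total:

```csharp
// Quick arithmetic sketch of the target distribution: 50% of games lost
// overall, with the per-round win chance doubling each round across
// eight winnable rounds (my assumption from the description above).
public static class TargetCurve
{
    public static double[] WinChanceByRound()
    {
        const int rounds = 8;
        double totalWeight = (1 << rounds) - 1;        // 1+2+...+128 = 255
        var chances = new double[rounds];
        for (int r = 0; r < rounds; r++)
            chances[r] = 0.5 * (1 << r) / totalWeight; // wins total 50%
        return chances;
    }
}
```

On those assumptions a round-1 win is roughly a 0.2% event and a round-8 win roughly 25%, which shows why the rarest outcomes need so many games to measure.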

The AI idea was a way to get more data, but with very little free time (and soon to be even less with Daughter The Second at most 10 days away) I'm not going to be able to come up with a hyper-realistic player AI. So I've opened it up as a competition, which I'll elaborate more on next week.

For the moment, I've started collecting data with an entirely random player - it figuratively rolls a die for every decision it's faced with. I've run the simulation 60,000 times, 10,000 times for each of the numbers of players from three up to eight.
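The outer loop of that run is simple; a sketch of the shape (the harness API here is invented, with the game itself stubbed out):

```csharp
// Sketch of the simulation loop: 10,000 games at each player count
// from 3 to 8 inclusive, 60,000 games in total.
public static class Harness
{
    public static int TotalGames(int gamesPerCount = 10000,
                                 int minPlayers = 3, int maxPlayers = 8)
    {
        int total = 0;
        for (int players = minPlayers; players <= maxPlayers; players++)
            for (int g = 0; g < gamesPerCount; g++)
                total++;   // a real harness would play a game here and record the result
        return total;
    }
}
```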

The good news is that Random McRandomface is pretty crap at Zombology. The chance of a single instance of him winning a game varies between 0.2% (in a 3-player game) and 1.5% (in an 8-player game). The chance of the game being won at all varies between 0.5% (3-player again) and 11% (8-player again). Both of these are way south of the 50% I'm aiming for, but I'm very happy that the game is usually lost when played at random. Out of 60,000 games, only one was won in round 3 (of 8). Also, if you exclude the loss bar (it's so large you can't see the others!), the wins-by-round graph looks pretty good too:

The bad news is the variation by player count: 3-player is hard, 4- and 5-player are pretty hard, 6- and 7-player are much easier, and 8-player is twice as easy as 6- and 7-player. This might disappear once I have a more realistic AI, but it's a potential concern.

My (and your?) next task is to write a more realistic AI...

Monday, May 8

AI Competition Early Info

I had a few takers for the Zombology AI competition, so here's an early cut of what I'm thinking. I will write a game harness that plays the game, calling into the Player class periodically for state changes and decisions. I will write an abstract Player class that does all of the state-change work; entrants will need to subclass Player and implement the decisions however they see fit. I will try to make it so that you only have the information a human player would have at that point: it's fine to remember cards you've seen earlier in the draft, but you shouldn't be able to see cards you've not yet received. If you wish to remember any state between decisions and turns, that's fine, but you are NOT allowed to make any state changes at all (I've implemented all of that in the base class).
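
This split (the base class owns every state change, the subclass only answers questions) is the classic template method pattern. A minimal sketch, with all names invented; the real Player class will look different:

```csharp
using System;
using System.Collections.Generic;

// The base class applies all state changes itself and only delegates
// the decision to the subclass. Names here are invented for illustration.
public abstract class PlayerBase
{
    private readonly List<string> seen = new List<string>();

    // The harness calls this; the base class records what the player
    // has legitimately seen, then asks the subclass for a decision.
    public string TakeTurn(IReadOnlyList<string> hand)
    {
        seen.AddRange(hand);        // remembering seen cards is allowed
        return ChooseCard(hand);    // the base class applies the result
    }

    // The only thing an entrant writes: a pure decision, no state changes.
    protected abstract string ChooseCard(IReadOnlyList<string> hand);
}

// Example subclass: always plays the first card in hand.
public class FirstCardPlayer : PlayerBase
{
    protected override string ChooseCard(IReadOnlyList<string> hand) => hand[0];
}
```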

Here's a link to the latest rules of the game, and a link to a zip file containing all the classes you need to implement your subclass of Player, including an example subclass RandomPlayer that will just act entirely at random! In a couple of weeks I hope to share the full code of the solution (once I've written it!) including unit tests for the methods you have to implement in the derived class.

That should be enough to get started with, and is enough for you to get thinking about it, and ask any questions you have (and potentially spot all my bugs!).

Thanks to everyone who has shown an interest in this!

Monday, May 1

Interest in an AI Competition?

As I mentioned a couple of months ago, I need to play Zombology a lot to work out whether I have the balance right for each different number of players. At least 768 times, probably many times that number. At the moment I'm at 29. Clearly, there's some work to be done!

This week I've started another programming project - I'm going to make a very simple app which can play thousands of games of Zombology and record the win/loss ratio and the round in which the game was won.

It will let me quickly find out whether the game is balanced for each number of players, and potentially tweak the rules to make it more balanced. To play thousands of games it's going to need some AI - I don't want to play thousands of games by hand (especially not solo, trying to pretend I don't know what's in the other hands and what the 'other' player is going to do). So I'm going to need some AI that can play Zombology.

Thankfully it's a pretty easy game - there are only four decision points per round:
  1. If I have a Guru, and there are matching Science cards in the central display, do I want to swap a Science card from my hand for one of them, and if so, which for which;
  2. Choose a card from my hand to play this turn;
  3. If I chose an Overrun, which player do I want to overrun;
  4. If I chose a Guru, which Guru do I want to claim.
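
Those four decision points map naturally onto four abstract methods. A sketch with invented signatures (the real solution will define its own), plus the entirely random baseline player:

```csharp
using System;
using System.Collections.Generic;

// The four per-round decisions as abstract methods; signatures invented.
public abstract class ZombologyAi
{
    // 1. Guru swap: which hand card for which display card, or null for no swap.
    public abstract (int handIndex, int displayIndex)? ChooseGuruSwap(
        IReadOnlyList<string> hand, IReadOnlyList<string> display);

    // 2. Which card from my hand to play this turn.
    public abstract int ChooseCardToPlay(IReadOnlyList<string> hand);

    // 3. If I chose an Overrun, which player to overrun.
    public abstract int ChooseOverrunTarget(IReadOnlyList<int> otherPlayers);

    // 4. If I chose a Guru, which Guru to claim.
    public abstract int ChooseGuruToClaim(IReadOnlyList<string> gurus);
}

// The 'entirely random' player: a die roll for every decision.
public class RandomAi : ZombologyAi
{
    private readonly Random rng = new Random();

    public override (int handIndex, int displayIndex)? ChooseGuruSwap(
        IReadOnlyList<string> hand, IReadOnlyList<string> display) =>
        hand.Count == 0 || display.Count == 0
            ? null
            : ((int handIndex, int displayIndex)?)(rng.Next(hand.Count), rng.Next(display.Count));

    public override int ChooseCardToPlay(IReadOnlyList<string> hand) =>
        rng.Next(hand.Count);

    public override int ChooseOverrunTarget(IReadOnlyList<int> otherPlayers) =>
        otherPlayers[rng.Next(otherPlayers.Count)];

    public override int ChooseGuruToClaim(IReadOnlyList<string> gurus) =>
        rng.Next(gurus.Count);
}
```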

So, the first thing I'm doing is writing all the plumbing that runs the game - then I need to write some AIs that can play the game. My current thought is that I'll write an entirely random one first that just lets me test that the plumbing is working. Then I need a few different AIs that play the game with different styles to see how the game works out.

Seeing as I know there are a bunch of programmers who read this blog, I'm wondering if there's any interest in a competition to see who can write the most successful Zombology AI. Let me know in the comments if you'd be interested!