Topic: BiM Contest Judging
I'm a bit late to this because I was at a robotics demo until 1 AM. I'm a bit disappointed by the results of BRAWL 2014. I'm shocked that 40% of the score is based on interpretation of the theme. That seems outrageous to me given how broad the theme was this year; you might as well just roll the dice. I'm not sure if the judging criteria were released before the contest. If they were, I must have missed it and would love to be corrected. I think for future contests it should be standard to explain the judging criteria. That way the results seem less pulled out of thin air, and it also tells participants what the contest is looking for in a film.
This is not to pick on BRAWL but rather contest judging on BiM as a whole; for the most part I am using BRAWL because it is the most recent. I know these contests are mainly for fun and making a film is its own reward. However, people still care about the rankings, mainly for bragging rights. Since this is a contest I didn't enter, I can finally talk about contest judging in general in an unbiased manner.
I feel like every BiM contest happens, the judging is released, and the results always seem random to me. It's never "oh yeah, that was clearly the best film." I put the judging sheet into a spreadsheet and started playing around with it. What I discovered is that the four categories don't make much of a real difference: if a film had good presentation, it scored equally well on theme. On average, each film had an 8.29 percent standard deviation between all the categories, which means the categories were rather meaningless.
The spreadsheet is viewable here: https://docs.google.com/spreadsheets/d/vrX9EVRRJY
I understand people have different values on what makes a good film, and that is why we have multiple judges. I would really like to see the breakdown by judge. I know in THAC X the BuilderBrothers' film was at the top of the list for two of the three judges, but the third judge single-handedly brought it down to a ranking of 6. From a math standpoint, most BiM contests give the win to the most mediocre film: if a film has a weak point or is disliked by one judge, its chances of winning are killed. I think this is why every time results are released there are always people unhappy.
So what can be done to make them better? As I pointed out before, the judging criteria should be outlined before the start of the contest. If you look at competitions that judge subjective work, most of the time there are at least five judges, and the high and low scores for each event are removed. Say, for example, the five judges give scores of 90, 47, 85, 99, and 79; the 47 and 99 are removed from the calculation. I think this would improve the results.
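The drop-the-extremes scheme above is just a trimmed mean, and it only takes a few lines to implement (a sketch, not any official BiM scoring code):

```python
# Proposed judging scheme: with five judges, drop the single highest
# and single lowest score, then average the rest (a trimmed mean).
def trimmed_mean(scores):
    if len(scores) < 3:
        raise ValueError("need at least 3 judges to trim high and low")
    trimmed = sorted(scores)[1:-1]  # drop one lowest and one highest
    return sum(trimmed) / len(trimmed)

# The example from the post: 47 and 99 are discarded,
# leaving (90 + 85 + 79) / 3.
print(trimmed_mean([90, 47, 85, 99, 79]))  # -> 84.666...
```

This way a single outlier judge, high or low, can no longer single-handedly sink or carry a film.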
I really don't want this to be viewed as a diss toward the people who created films, the judges, or FunSucker. Instead I would like to see some discussion about contest judging and ranking.
Edit: It appears the scoring system was laid out from the beginning. I missed that, and I'm glad that was the case for this contest.
Last edited by AquaMorph (August 1, 2014 (12:26pm))