It is that time of year again, when the various big agile conferences send out acceptance and rejection notifications to those who submitted proposals. In Australia, this is generally followed by some Twitter chatter from those who didn't make the cut. This time last year I was one of the many who received a "thanks but no thanks" email from Agile Australia. To be frank, I was surprised. I had submitted two talks that had been accepted by the Agile 2015 conference but were rejected by Agile Australia. What was up with that? After much soul searching I decided that the only course of action was to volunteer some of my time and get involved in the process next time around. So it was with the best of intentions and high hopes that I became an Advisor to Agile Australia 2016.
As anticipated, my first challenge was carving out time to contribute. Having been involved with Agile 20xx as a track chair for the past three years, I knew these things were time consuming, but I never realised what a blessing it was that most of the demands on my time were outside business hours. As a partner in a small consulting firm, participating in meetings during core business hours was a significant challenge. This resulted in me contributing less than I would have liked to. Despite my limited involvement, it has been a very informative and insight-filled experience, and it is with this lens that I thought it might be of value to the community if I were to share some of my observations about the process this year.
I’m not sure if it has always been the case, but certainly in recent years, Agile Australia has used an anonymous submission and review process to build the short list for the program. This approach has both pros and cons. It protects submitters from the cognitive biases of the review team, but it also means the review team is selecting sessions with only a subset of the information. This is a stark contrast to the way the Agile 20xx submission process is run, whereby the review team has access to information about each speaker's experience and often video footage of past presentations.
The second part of this equation is the review teams themselves. This is a group of volunteers who, to the best of my knowledge, receive nothing in exchange for giving up their time to review and provide feedback to submitters. I can’t help but wonder how many “rejected speakers” volunteer their time to contribute to the review process. My challenge to those who feel the process isn’t working is to get involved.
The third influence on the final program is the Advisors. This group is given visibility of who submitted each proposal. I had always wondered how this information was used. I was pleasantly surprised to find that speaker information was only used to validate that the speaker was credible and to ensure that the program was balanced: in terms of the number of speakers from the same company, the number of repeat speakers from previous years, and the number of speakers who are also track chairs or advisors.
With my newfound insider knowledge, I suspect the weakness in the selection process for the Agile Australia conference is not so much in the process itself but in the lack of understanding about how the process works. This year I learnt that you are more likely to be selected to speak if your proposal:
Of course, sometimes it is just the luck of the draw: two great talks on the same topic get shortlisted, and the program only has room for one.