This post was written by Reba Hull Campbell
I submitted my first Mercury Award entry almost 20 years ago. It was a single entry in the Silver Wing category for a project I was really proud of when I worked at SCETV. The project had been very successful in meeting its goals. But it didn’t win.
I fretted awhile over what was wrong with our project and eventually realized the project was great – it was the award entry that wasn’t so great. Over the next year, I talked with other SCPRSA members who frequently won awards. I read articles in PRSA publications and revisited what I knew about research/planning/implementation/evaluation. I studied the award criteria carefully when the call for entries arrived. The next year, we walked out with not just one, but two awards. That was 2000.
In the years that followed, my teams' projects at ETV, the Governor's School Foundation and the Municipal Association won awards almost every year. Over time, we fine-tuned our approach to award submissions, making sure that, throughout the year, we were keeping track of the items we would need for the entries. We designated one person on our staff to file and track these items.
Not only did tracking this throughout the year give us what we needed to craft winning entry packets, but it also made our projects stronger by ensuring on the front end we were actively employing all the steps in the research/planning/implementation/evaluation process.
After several SCPRSA members volunteered as judges for the San Diego awards this summer, I asked them for their insights. Here's a compilation of their advice and comments:
- Keep your entry packet simple. An ineffective project or program won’t win no matter how well you dress up the entry. Fancy notebooks and reams of clips are no substitute for quality research/planning/implementation/evaluation.
- Describe the specific research methods you used to determine why the campaign was needed in the first place. Some of the award entries lacked well-organized descriptions of research (primary and secondary). Instead, many opted to say something like … "we were hired to shoot and edit a commercial because the organization felt not enough people knew about their goods or services offered."
- Discuss in detail what the research suggested and why – goals, strategies, timeline, etc. Most of the entries I judged offered only a brief plan and did not communicate how (strategies, tools, tactics), when (timeline) and why (based on research) they implemented the project.
- Remember to set measurable goals. Evaluation was a huge weakness in most of the entries. A recurring theme: no clear, quantifiable goals were stated, leaving little to measure against.
- Follow the instructions precisely. It made the judging so much easier when I didn't have to hunt for items that were either missing or buried in the entry.
- Make sure to include budget details. That was the biggest weakness in most of the entries I judged. Either a general number was mentioned or a term such as "sparse budget" was used.
- Tie evaluation to quantifiable goals. In many entries I judged, the evaluation included numbers such as clicks, hits, visitors and clips, but those numbers weren't tied to anything in the goals.
- Keep in mind the judges will know nothing about your project, company or challenge. Use a paragraph or two to help them understand the context of your entry.