In Direct Marketing, marketers send marketing pieces in the mail…or post ads in print media…or post online on websites…or send emails. And sales will immediately show the success or failure of the ad. In direct mail campaigns, the number of sales per number of items sent is an important metric to track.
This is a very competitive space, and if the ad does not catch the reader’s eye immediately, it goes straight into the recycling bin. People sort their new mail over the recycle bin these days, so you have to stop them or it is over.
Some of these ads have been so effective that they sold millions of dollars in products, but they are rarely that effective straight out of the gate. Direct Marketing is about works in progress and testing. To save on costs, marketers will run smaller tests of multiple variations of the copy to see which ones draw the most results (i.e. sales).
This is known as split testing, multivariate testing, or A/B testing. I will generically say split test to mean any combination of two-variant or multi-variant testing.
You never know whether one call to action or another will resonate better with the target audience. Some companies split test pricing to see which price point yields the most profit. Others test the photos or the specific words used within the ad. Small elements sometimes have huge impacts.
For example, in “The Psychology of Persuasion,” Kevin Hogan describes an experiment done in 1977 by Ellen Langer, a Harvard social psychologist. She asked people waiting in line for a library’s copy machine for a favor. Half the time she would say, “Excuse me, I have five pages. May I use the machine because I’m in a rush?” and the other half of the time she left out the word “because.” When she used the word “because,” 93% let her cut ahead. When she did not, only 60% let her cut ahead. That is a 33-percentage-point difference from changing a single word.
I think we could use such testing on our job posts in recruiting. Not all of them…obviously. But it would be advantageous to test the job post copy you use over and over again. Say, if your company is recruiting a lot of call center people (for example).
It is not so difficult to do. I used to use Google Website Optimizer to create automated A/B tests and split tests of my website content on a few of my old websites. Now you can do the same in Google Analytics with Content Experiments (under Reporting, Behavior, Experiments). I have not used it yet, but if it is like Website Optimizer, it will give you a script to put in your website code, and it will evenly split traffic between (for example) two different paragraphs within a job post. It will then give you the number of times each variation was viewed and the number of clicks (to the application button, for example) each received, and thereby the click-through rate (clicks divided by impressions, i.e. number of views, times 100).
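The click-through-rate arithmetic itself is trivial. As a minimal sketch, here is how you might compare two variants in Python, using made-up view and click counts purely for illustration:

```python
# Hypothetical view and click counts for two job-post variants.
# The numbers are invented for illustration only.
variants = {
    "A": {"views": 1200, "clicks": 54},
    "B": {"views": 1180, "clicks": 71},
}

for name, stats in variants.items():
    # Click-through rate = clicks / impressions (views) * 100
    ctr = stats["clicks"] / stats["views"] * 100
    print(f"Variant {name}: CTR = {ctr:.2f}%")
```

The variant with the higher click-through rate is the one drawing more readers toward the application button per view, regardless of which one got shown more often.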
Even if you cannot get access to the website code to insert the script, you can use the website analytics to do a slower, manual version for jobs you recruit for repeatedly or constantly. You can see the number of visitors a job page gets, and in your ATS you can see the number of applications (completed applications in this case…not clicks to the application button as in the automated version). The tracked metric would be the number of applications divided by the number of webpage views, times 100. Instead of reposting the exact same job post every time, you could make minor changes to the copy each time you post the job again and again…and steadily improve this number by testing different content.
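The manual version amounts to keeping a simple log. A sketch of what that log and the application-rate metric might look like, with hypothetical postings and invented numbers:

```python
# Hypothetical manual log for one recurring job: each posting's page views
# (from web analytics) and completed applications (from the ATS).
# All values are made up for illustration.
postings = [
    {"posted": "month 1", "copy_change": "original post", "views": 430, "applications": 12},
    {"posted": "month 2", "copy_change": "shorter intro", "views": 410, "applications": 18},
    {"posted": "month 3", "copy_change": "benefits listed first", "views": 455, "applications": 25},
]

for p in postings:
    # Application rate = completed applications / page views * 100
    rate = p["applications"] / p["views"] * 100
    print(f'{p["posted"]} ({p["copy_change"]}): {rate:.1f}% application rate')
```

Because only one posting runs at a time, this is slower and noisier than a true split test, but tracking the rate across repostings still shows which copy changes move the number.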
Over time, you would learn which content gets you more and/or better candidates. This does not take a lot of time once you learn how to conduct the tests and log them all for comparison. And over time, it will not just improve the effectiveness of your job posts for the tested job…it will also influence how you post all your other jobs.