The mantra of ‘believe the data’ has served email marketing well over the years and remains one of the most useful tools in our box. We test a single factor, the data comes back far quicker than from any other form of marketing, and we can make an evidenced decision to an accuracy of a percent or two. The returns can be believed.
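As a sketch of what believing the data means in practice, the standard way to check whether a percent or two of difference between two split lists is real, rather than noise, is a two-proportion z-test. The figures below are hypothetical, and the function name is ours, not from any particular email platform:

```python
from math import sqrt, erf

def split_test_significance(conv_a, sent_a, conv_b, sent_b):
    """Compare two split-list results: was variant B genuinely
    better than A, or is the gap within random noise?"""
    p_a = conv_a / sent_a
    p_b = conv_b / sent_b
    # Pooled rate, assuming (for the test) both variants perform the same
    p = (conv_a + conv_b) / (sent_a + sent_b)
    se = sqrt(p * (1 - p) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    # One-sided p-value from the normal CDF: small means B really won
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))
    return p_b - p_a, p_value

# Hypothetical split: 10,000 emails each, 18% vs 19.5% open rates
lift, p = split_test_significance(1800, 10000, 1950, 10000)
```

With lists of that size, a 1.5-point lift comes out comfortably significant; with a few hundred emails per side it would not, which is why small lists make ‘believing the data’ harder.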
One of the temptations of testing via split email marketing lists is that the idea we put forward might have been a favourite. Logically it should have been an improvement. It seemed obvious. So when the data comes back suggesting we were wrong, the temptation is to try it again. We’ve all done that, and it’s safe to say we’ve all learned the lesson too. Believe the data.
The results have an eat-by date. How long a result remains effective will vary, but we must remember that we tested just one factor, leaving everything else alone, perhaps to be tested later. We’d proved something. Let’s stick with it for the time being.
We have no idea how long a specific test result will remain valid. After all, circumstances change, and since we regard a percent or two as something of a success, even a minor alteration to a single factor can change things substantially.
The saying ‘flogging a dead horse’ is rather graphic, but it applies where people test and test again trying to get the result they want. That said, there is nothing stopping us, apart from lethargy, from checking that a specific test result is still current.
One idea we should remove from our minds is that a split test demonstrates that one of the two approaches did not work. What it actually proved is that one was more effective than the other. That’s not a failure for the one we rejected; it just wasn’t quite as good as the one that replaced it.
We’ve had an example of this quite recently. It’s a bit of a stretch comparing it to email marketing, but as a general point it’s valid. Take the election results in the recent Chesham and Amersham by-election. Ignoring the reasons for the change in voting behaviour, what everyone seems to agree on is that certain circumstances have changed. The effect has been a shock to many.
Similar changing habits might well affect email marketing. There has been a distinct social change for most of the UK. Whether or not it is a new norm, it might well alter the results of our testing. The way we can discover if it has or not is to retest.
We mustn’t see this as a way of vindicating our past rejected ideas. What we need to do is get an idea of how our customer base has changed, and test to see whether our assumptions are still correct. Testing is all about a percent or two. Covid and lockdowns are hardly subtle changes, and nothing in email marketing is forever.