A/B split campaigns allow you to test two alternative versions of a campaign, Version A and Version B. Version A is sent to one group of subscribers and Version B to another; each group is only a small percentage (usually 10%) of the campaign's total recipients. The test runs for a designated amount of time, after which the better-performing version is sent to the remaining subscribers.
There is also the comfort of knowing that no subscriber will receive the campaign twice: the test pool receives either the Version A or Version B email, and the remaining subscribers receive only the winning campaign. Keep in mind that these are tests; not every one will improve results or produce a clear winner, but each will give you better insight into what works best with your customers. It is still up to you to write great subject lines that your target audience will respond to. This is something you can implement, try, and then improve with every campaign. After all, life is all about learning something new every day.
There are a number of tests you can run; here are three of the main ones to consider (we are here to help continually improve your results, so please feel free to contact us for a hand with this):
1. Subject Line
Testing two different subject lines is the most common A/B test, letting you vary the approach of the campaign and see what interests subscribers most. You could also add personalisation to otherwise identical subject lines, such as a name greeting, and see whether that gets a better response. We ran a test campaign using this method, with the winner determined by open rate: 5% per group for 1 hour, Version A using a direct subject line and Version B a more personal one.
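To make the mechanics concrete, here is a minimal Python sketch of how a split like the one above could work. The helper names `ab_split` and `pick_winner` are hypothetical, not part of any real email platform's API, and a production system would track opens per recipient rather than take them as plain counts:

```python
import random

def ab_split(subscribers, test_fraction=0.05, seed=None):
    """Split a subscriber list into two test groups and a remainder.

    test_fraction is the share of subscribers in EACH test group
    (e.g. 0.05 for the 5%-per-group test described above).
    """
    rng = random.Random(seed)
    shuffled = subscribers[:]
    rng.shuffle(shuffled)
    n = max(1, int(len(shuffled) * test_fraction))
    group_a = shuffled[:n]        # receives the Version A email
    group_b = shuffled[n:2 * n]   # receives the Version B email
    remainder = shuffled[2 * n:]  # later receives the winning version
    return group_a, group_b, remainder

def pick_winner(opens_a, sent_a, opens_b, sent_b):
    """Return 'A' or 'B', whichever version had the higher open rate."""
    return "A" if opens_a / sent_a >= opens_b / sent_b else "B"

# Example: 1,000 subscribers, 5% per test group.
group_a, group_b, remainder = ab_split(list(range(1000)), 0.05, seed=1)
# group_a and group_b each hold 50 subscribers, remainder holds 900,
# and no subscriber appears in more than one group.
winner = pick_winner(opens_a=120, sent_a=50, opens_b=90, sent_b=50)
```

Shuffling before slicing keeps the two test groups random and disjoint, which is what guarantees no subscriber sees both versions.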
2. From Name
If people don't recognise who an email is from, chances are they won't open it, which is why sender information is important. Testing two versions of your sender details shows which one subscribers respond to best and can help build the relationship, since they are more likely to recognise and trust your company or product name in your campaigns.
3. Email content
This tests different elements of the campaign itself, for example section titles, article length, calls-to-action, header images and more. You might even test two completely different designs to see which one gets the most clicks.
This is just the beginning of what A/B testing can do to improve your email campaign results. If you would like to learn more about A/B testing and how we can help improve your campaigns, get in touch with us.