Last month’s eMetrics event in San Francisco, CA, was one of the best conferences I’ve ever attended, with several lessons that were truly important to me, to Dyn, and to anyone charged with improving online conversions and sales.
Motivation, Ability and Trigger
Motivation, ability and the call to action (trigger) combine to determine whether a conversion happens. For example, if you are collecting email subscribers in the hope of converting them later, the easier you make the signup process, the more likely you are to succeed.
You could potentially increase motivation by adding discounts and coupons, but the best path to more conversions is almost always to increase ability, that is, to reduce the effort required to complete the conversion. The graph is an illustration of when triggers work and when they don’t.
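The interaction above can be sketched as a toy version of the Fogg behavior model (B = MAT): a trigger only works when motivation and ability together clear an action threshold. The 0–1 scoring scale and the threshold value here are illustrative assumptions, not figures from the talk.

```python
# Toy sketch of the motivation/ability/trigger model described above.
# Scores and the threshold are made-up illustrative values.

def conversion_likely(motivation, ability, trigger_present, threshold=0.5):
    """A conversion fires only when a trigger arrives while
    motivation x ability is above the action threshold."""
    return trigger_present and (motivation * ability) >= threshold

# A trigger on an easy (high-ability) signup flow converts...
print(conversion_likely(motivation=0.6, ability=0.9, trigger_present=True))   # True
# ...while the same trigger on a hard flow does not.
print(conversion_likely(motivation=0.6, ability=0.4, trigger_present=True))   # False
```

Note that raising ability (simplifying the flow) moves you above the threshold at constant motivation, which is the point of the section above.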
We are all trying to hit a golf ball. Each millimeter makes a difference.
I am a huge believer in this point, which was reiterated at eMetrics. Small improvements driven by data can make the difference between a successful marketing campaign and a break-even one. Let’s say you invested $1,000 in an online marketing campaign and recovered $900 through online sales. That campaign is roughly break-even at best.
On the other hand, say you had tested multiple landing pages, leading to a 20% improvement in conversion rates; tested variations of the ad copy, gaining another 10%; and optimized your bidding strategy across geographies and ad groups, cutting costs by 20%, perhaps by no longer bidding in areas that weren’t converting.
Suddenly you have a campaign on which you spent only $800 and generated roughly $1,200 in sales — a 50% ROI. With that knowledge in hand, we know we can scale this up, drive more revenue, and continuously optimize to further improve ROI.
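The arithmetic behind those numbers is worth making explicit, because the lifts compound. Using the figures from the example above (the article rounds $1,188 up to roughly $1,200):

```python
# Recomputing the campaign numbers above to show how small,
# data-driven improvements compound.

base_spend, base_revenue = 1000, 900        # the roughly break-even starting point

landing_page_lift = 1.20                    # +20% conversion from landing-page tests
ad_copy_lift = 1.10                         # +10% from ad-copy variations
cost_reduction = 0.80                       # -20% spend from smarter bidding

spend = base_spend * cost_reduction                          # $800
revenue = base_revenue * landing_page_lift * ad_copy_lift    # $1,188, ~ $1,200
roi = (revenue - spend) / spend                              # ~ 48.5%, ~ 50%

print(f"spend=${spend:.0f} revenue=${revenue:.0f} ROI={roi:.1%}")
```

The key design point: each lift multiplies the others, so three modest single-digit or double-digit wins turn a money-losing campaign into one worth scaling.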
Tying this back to the golf ball analogy, the small changes might just make the difference between a golf ball (read marketing campaign here) that ends up in the sand or on the fairway.
Means are meaningless. Always segment.
An average time of six minutes spent on a website means absolutely nothing if 50% of the traffic spends less than 10 seconds there and another 20% spends more than 30 minutes there. It’s even worse if that 20% is your existing user base having trouble using your product and camping on the support section.
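A quick sketch makes the point concrete. The session durations below are made-up illustrative data, shaped like the example above: a large bounce segment, a small heavy-usage segment, and everyone else.

```python
# Illustrative (made-up) session durations in seconds: the overall mean
# describes none of the actual segments.
from statistics import mean

bounces = [5, 6, 8, 4, 7]          # half the traffic, under 10 seconds
heavy_users = [1900, 2100]         # a fifth of the traffic, over 30 minutes
everyone_else = [120, 90, 150]

sessions = bounces + heavy_users + everyone_else
print(f"overall mean:    {mean(sessions):.0f}s")     # ~7 minutes, describes nobody
print(f"bounce mean:     {mean(bounces):.0f}s")
print(f"heavy-user mean: {mean(heavy_users):.0f}s")
```

The overall mean lands around seven minutes, a duration no real visitor exhibits. Only the segmented means tell you that half your traffic bounces and a small group is stuck for half an hour.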
Move from no-harm A/B tests to controversial A/B tests
This advice came from the Director of Analytics at walmart.com, and I think there is a lot of truth here. All too often, A/B testing is used to test very small things on websites, like fonts, colors, or call-to-action wording. Though some of these tests do win and add some value, the big wins come out of controversial tests. That is where A/B testing can really be unleashed.
Mobile: the biggest gap between ad money spent and customer time spent
As the graph below illustrates, 23% of customer time is spent on mobile while only 1% of ad money is spent on that channel: a pretty clear indicator of where marketing dollars are headed over the next couple of years.
Digital marketing for B2B lead gen can be successful if tracked end to end
Tracking B2B marketing can be difficult, but it is also the most important facet of such a campaign. All too often, B2B digital marketing is either left untracked in the ‘hope’ that it is doing well, or measured only by metrics such as ‘leads generated’.
While leads generated is a decent metric, and better than not measuring at all, it is extremely important to measure the conversion rate of those leads and the value of the deals closed for each digital marketing campaign (and for each individual segment and geography too). Such end-to-end tracking gives much deeper insight and tells us where the next dollar spent on digital marketing should go.
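As a minimal sketch of what that end-to-end view looks like, the snippet below attributes closed-deal value (not just lead counts) back to each campaign. The campaign names, lead IDs, and deal values are hypothetical, invented for illustration.

```python
# Hypothetical end-to-end attribution: join leads to closed-deal value
# per campaign, instead of stopping at 'leads generated'.

leads = [  # (campaign, lead_id)
    ("adwords-us", "L1"), ("adwords-us", "L2"),
    ("linkedin-eu", "L3"), ("linkedin-eu", "L4"), ("linkedin-eu", "L5"),
]
closed_deals = {"L2": 12000, "L5": 3000}   # lead_id -> deal value in $

report = {}
for campaign, lead_id in leads:
    stats = report.setdefault(campaign, {"leads": 0, "closed": 0, "value": 0})
    stats["leads"] += 1
    if lead_id in closed_deals:
        stats["closed"] += 1
        stats["value"] += closed_deals[lead_id]

for campaign, s in report.items():
    rate = s["closed"] / s["leads"]
    print(f"{campaign}: {s['leads']} leads, {rate:.0%} close rate, ${s['value']} closed")
```

In this made-up data, the campaign generating fewer leads actually closes four times the deal value, which is exactly the insight a leads-only metric would hide.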
The conference left me with the feeling that the analytics industry is where computer science was in the late 90s, with lots of potential still to uncover!