Conversion Rate Optimization: What to Expect in 2022

Al Rowe
Published in Experience Stack · Dec 20, 2021 · 9 min read


It has been another crazy year. We have seen, and continue to see, more Covid mayhem, as well as worldwide supply chain issues, made more difficult in the UK by Brexit.

Nevertheless, businesses find a way to survive and, in some cases, thrive.

Lockdowns, a lack of consumer confidence in the high street and changing work patterns have driven a huge shift online. This means that online competition has never been higher, as can be seen in the continuing rise of worldwide digital advertising spend.

In 2017, worldwide spend on platforms such as Google, Bing, Facebook and Amazon was around $250 billion. eMarketer predicts that for 2021 this has risen to $491.70 billion and will certainly zoom past the half-a-trillion mark next year. Indeed, Forbes predicts 2022 will see over $700 billion spent.

In the US, this has been partly fuelled by huge growth in Amazon advertising, but Google still captures 38% of worldwide spend.

This means there has never been such pressure on website owners to compete for custom and, in the case of ecommerce, probably to sell on Amazon as well. That said, “pure play” ecommerce growth is expected to reach 35% by the end of 2021, according to Forbes. It is worth noting, though, that a number of famous brands still do not sell on Amazon, with many more only listing a restricted selection of products. Notable brands not on Amazon include Nike, Birkenstock and Ikea.

There is also a growing number of smaller boutique brands who prefer to try to grow their own audiences. Environmentally friendly companies in particular seem to fit this category, with brands such as Lush and UpCircle Beauty.

So what else will happen?

Site Optimization Predictions and Tips for 2022

1) Dealing with Cookies

Although Google pushed back its deadline for deprecating third-party cookies to 2023, the fact that in the UK Chrome has only a 49% market share while Safari has 33% means we should all be taking this very seriously. If you haven’t acted so far, it is likely you are already seeing quite a lot of inaccuracy in your data, particularly around new versus repeat customers.

With this in mind, it makes complete sense for pure play ecommerce brands to focus on growing their communities through site stickiness, social and email campaigns, as well as other, perhaps more traditional methods. It is really important for businesses to own their own first-party cookie data. If you can create a logged-in area, you can leverage this; many off-the-shelf ecommerce packages already have the capability. Perhaps 2022 is the year to discourage those guest baskets and encourage account creation.

I see several sites doing this by encouraging users to sign up to email lists with the reward of a first time purchase discount. Likewise, you can offer similar incentives on account creation.

The sites I have worked with still find email to be a strongly converting channel. The key thing is getting the balance right on frequency.

If you run a lead generation or B2B site, you can still grow your repeat audience this way by collecting email addresses in return for premium downloadable content. If you have a lot of premium content for download, you can extend this through account creation (that logged-in area). This gives you the ability to know, and market to, your customers. It deals with GDPR as well as going some way to address the demise of third-party cookies.

Beware of Safari with your A/B tests

Because of Safari’s 33% browser share, if you are running A/B testing software you need to understand that Safari’s Intelligent Tracking Prevention (ITP) caps cookies set by JavaScript at 7 days, so visitors who return to your site after 7 days are treated as new visitors and any campaign cookie dropped by software such as VWO will have been removed. This means that if I visit your site and your A/B testing software serves me the variation, when I return 8 days later I will be treated as a new visitor, with a 50% chance of seeing the control (original version) and a 50% chance of seeing the new variation. This creates user consistency issues, which can sow seeds of doubt and negatively impact conversion rates, as well as skewing the data by mis-counting repeat visits.

To get round this issue, you either need to exclude Safari traffic from your tests or create a server-side cookie sync endpoint. To get direct help on this for VWO, read this.
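The consistency half of the problem can also be tackled with deterministic bucketing: if variant assignment is a pure function of a stable visitor ID, ideally one delivered via a Set-Cookie header from your own server (Safari’s Intelligent Tracking Prevention caps JavaScript-set cookies at 7 days, not server-set ones), a returning visitor always lands in the same bucket even after any client-side cookie is gone. A minimal sketch of the idea, with illustrative names:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str,
                   variants=("control", "variation")) -> str:
    """Deterministically bucket a visitor: the same visitor ID and
    experiment always map to the same variant, with no reliance on a
    short-lived client-side cookie."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]
```

Because the hash is stable, the visitor who saw the variation on day one still sees it on day eight, however their cookies have been treated in between.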

Beware of 100% serving of test variations

Several of the clients we work with have pretty hefty systems, and sometimes that makes their development cycles expensive or slow. In some cases, we have completed a split test with a clear winner and found, for whatever reason, that it is too difficult for the client to get the winning variation implemented. When this happens there is a reluctant fall-back position: send 100% of the test traffic to the winning variation. This is something of a last resort, but in my experience it happens way too often.

Why is this a problem? For three main reasons:

  1. It means that VWO, or whatever you use, always has to run an experiment at run-time. This slows down the page load and rendered experience.
  2. It means you keep using testing credits. Most platforms (certainly enterprise ones) charge for testing credits, so sending 100% of your traffic to a winning variation can get quite costly.
  3. Potentially the most problematic reason of all is that anyone running a cookie blocker will not see the winning variation at all.
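If the winning variation genuinely cannot be rebuilt properly, a more robust stopgap than 100% client-side serving is to bake the change into the server-rendered page behind a simple flag, so it loads instantly and visitors running cookie blockers still see it. A hypothetical sketch (the flag name and button copy are invented for illustration):

```python
# Hypothetical flag, flipped once the experiment has concluded.
WINNER_LIVE = True

def render_cta() -> str:
    """Render the call-to-action server-side: no runtime experiment,
    no testing credits, and visible to visitors running cookie blockers."""
    if WINNER_LIVE:
        return '<button class="cta">Start your free trial</button>'  # winning copy
    return '<button class="cta">Sign up</button>'  # original copy
```

The point is that the winning experience becomes part of the page itself rather than a runtime rewrite that a blocked script can silently skip.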

I have not been able to find any reliable statistics on cookie blocker usage: Wired.com suggests that leading cookie blocker Ghostery has 7 million users, whilst levelingup.com suggests there have been 50 million downloads. Of course, Ghostery is just one of many cookie blockers on the market, so taking them all into account suggests a pretty decent number that is growing all the time. Because of this, I would advise against sending 100% of your traffic to a winning variation if at all possible. In one case, for one of our clients, this means some traffic may be experiencing a website that is 18 months old because they cannot see the runtime render of the test. Crazy!

So what else might we see more of in 2022?

There are several blogs out there already stating the obvious, so I am just going to mention a few here.

2) Artificial Intelligence and Machine Learning tools

Over the last few years, we have seen a large rise in the use of a growing number of big data APIs, as well as increasingly available machine learning tools like Google’s natural language processing (NLP) model BERT (Bidirectional Encoder Representations from Transformers). The beauty of these APIs is that they make very complex processing available to us digital marketers. This is sometimes referred to as BPaaS (Business Process as a Service). I have used ScaleSERP myself, for example, to build https://www.searchsnapshots.com/, a site that takes Google screenshots for a handful of keywords so I can see how the SERPs change over time. It has a way to go before it is that useful, but it didn’t take long to make.

Many CRO vendors are embedding these capabilities into their tools to add further insights or automate processing. Many of us will be familiar with screen video session capture tools such as Hotjar. One variant, SessionCam, uses machine learning to look for red flags in recorded sessions, helping users spot which videos are actually worth watching. This can save a lot of time trawling through recordings.

Next year, I would love to see much more automation in heatmapping-type tools. It would be great to see them start to make insightful suggestions, based on known best practice design.

For example, there is no reason a heatmapping service could not detect where call-to-action buttons are and then measure how much engagement each button receives. Potentially, it could also analyse the CSS and point out that it should be no surprise that a transparent call-to-action button gets few or no clicks, particularly when placed over a background image. I expect CRO tools to start heading in this direction, if not next year then soon.
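As a toy illustration of the kind of rule such a tool could apply, here is a hypothetical heuristic that flags call-to-action styles likely to suppress clicks (the property names follow CSS, but the rule and thresholds are invented):

```python
def cta_visibility_flags(style: dict) -> list:
    """Flag computed-style properties that commonly hurt CTA engagement.
    `style` is a simplified computed-style mapping for one button."""
    flags = []
    if style.get("background-color") in ("transparent", "rgba(0, 0, 0, 0)"):
        flags.append("transparent background")
    if float(style.get("opacity", "1")) < 0.5:
        flags.append("low opacity")
    return flags

cta_visibility_flags({"background-color": "transparent", "opacity": "0.3"})
# → ['transparent background', 'low opacity']
```

A real tool would pair checks like these with the measured click data, so it only flags buttons that are both badly styled and under-performing.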

Potentially, it may even be possible for A/B testing software to run automated design variations, testing many more CSS variations than a human would want to set up. This could work particularly well for enterprise sites with millions of visitors. The software could then use the multi-armed bandit method to adjust variant traffic according to the results in real time.
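To make the multi-armed bandit idea concrete, here is a minimal Thompson-sampling sketch: each variant keeps a Beta posterior over its conversion rate, and every visitor is routed to the variant with the highest sampled rate, so better performers gradually absorb more traffic. This is an illustration of the general method, not any vendor’s implementation.

```python
import random

class Variant:
    def __init__(self, name: str):
        self.name = name
        self.conversions = 0  # converting visits
        self.misses = 0       # non-converting visits

    def sample(self) -> float:
        # Draw a plausible conversion rate from the Beta posterior
        # (uniform prior: Beta(1, 1)).
        return random.betavariate(self.conversions + 1, self.misses + 1)

def route(variants: list) -> Variant:
    # Thompson sampling: send this visitor to the variant whose
    # sampled conversion rate is highest right now.
    return max(variants, key=lambda v: v.sample())

def record(variant: Variant, converted: bool) -> None:
    # Update the variant's posterior after the visit resolves.
    if converted:
        variant.conversions += 1
    else:
        variant.misses += 1
```

With uniform priors, early traffic is split roughly evenly; as evidence accumulates, the stronger variant wins most draws, which is exactly the real-time traffic adjustment described above.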

This brings us nicely onto personalisation.

3) AI-driven personalisation

Many people think that personalisation is just about flattery: knowing who your customer is and making them feel special. Of course, this is important and can have a positive effect on conversion rates. However, it can be just as effective to use site data to tailor the user journey and increase conversions. This is a USP of a platform like Sitecore.

We are currently working on our own piece of AI-driven tech which will give us much more insight into users’ journeys.

Through using it, we have learned that, for one particular site, blocking users who arrive via the paid channel from seeing the blog could provide an uplift of around 20% in conversions for that channel.
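In practice a journey rule like that can be as simple as a channel check at render time. A trivial, hypothetical sketch (the channel labels and the rule itself are made up for illustration):

```python
def show_blog_links(utm_medium: str) -> bool:
    """Hypothetical journey rule: keep paid-channel visitors on the
    conversion path rather than routing them off to the blog."""
    paid_channels = {"cpc", "ppc", "paidsocial"}
    return utm_medium.lower() not in paid_channels
```

The value comes not from the rule's complexity but from having the data to know which segments a rule should apply to.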

The real power of personalisation is through recognising user intent and tailoring their experience to keep them focussed and give them more of what they want.

4) Intelligent chat bots

I don’t know if you have ever returned an item to Amazon? I have done so a few times this year and have always been amazed at how effective the AI chat bot is: pulling in my user data, asking which order I want to return and suggesting common reasons why I might want to do so. It saved me a lot of time and really made me think they have their stuff together.

I spoke at the IRX at the NEC this year and saw a few interesting pieces of tech. One was an intelligent site chat system. It allowed much more functionality to be driven through the chat interface such as booking meetings with sales team calendars. I thought that was impressive.

Buying a Christmas present from ethical skincare company UpCircle this year, I was offered the chance to exchange my email address for a 10% discount. Because I couldn’t get the cookie-driven popup to work, I found this a much better way to get my discount, and it worked. The system driving it is called Tidio, and it looks very powerful as well as reasonably priced.

I expect to see more and more AI-driven capability in online chat. After all, it is the closest thing to a shop assistant. In a physical shop, a hovering member of staff can be off-putting if you are just browsing, but online is different and much more standardised: we can test what works and do more of it. If you have questions or are serious about buying, the ability to get those questions answered before purchasing can really drive conversions. Although more expensive, the AI-driven systems can work well for smaller businesses, since they lessen the need for human moderation.

5) Site speed

I often cite poor performance as one of the main conversion killers. With Google’s page experience algorithm introducing us to Core Web Vitals in 2021, I have to mention this one. It is not new and will probably always be a concern and a battle: as we pile on more engaging user experience, a site often slows down. I don’t think this is going away in 2022. If anything, it will only become more important.

The main difference, for me now, is that there are more and more tools out there to help diagnose issues. Really, anyone managing a website these days needs to have a reasonable technical understanding of the components that help drive their site performance.

Some of the biggest gains I have seen this year have come from converting images to newer formats like WebP. Others have come from understanding that not every page needs all of the site’s JavaScript and CSS. I would recommend really understanding how your CSS and JS are organised and moving to a more page-specific model. This cuts down on serving code to pages that do not need it, significantly helping performance. It needs to be balanced against the benefits of caching, though, so the key is understanding which code is needed site-wide and serving that centrally.
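On the image side, the widely supported pattern is the HTML `<picture>` element, which offers WebP to browsers that can decode it and falls back to a JPEG for those that cannot (the file names here are placeholders):

```html
<picture>
  <source srcset="hero.webp" type="image/webp">
  <img src="hero.jpg" alt="Homepage hero" width="1200" height="600">
</picture>
```

Setting explicit width and height attributes also helps the browser reserve space and avoid layout shift, which feeds directly into Core Web Vitals scores.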

Whatever happens next year, we can expect more chaos, more competition and more tech. There has never been a more important time than now to take this seriously and get in the race.


Al has over 20 years of experience in web development, design, optimisation and conversion and has helped businesses of all sizes to achieve digital growth.