Tracking Web Visitors to Increase Conversions
Tracking web visitors is still underused when teams iterate on their websites to increase conversions. Far too often, people rely on gut instinct rather than quantitative data when changing websites and landing pages. Ignoring the data increases the odds that your modifications will be insignificant. Tools like Google Analytics give you quantitative data on your website’s performance, but applications that give you visual data on visitor behavior deserve a place in your marketing technology stack as well. I’ve had success with tracking web visitors, and it’s an integral part of my A/B testing checklist. Detailed below is an experiment on tracking web visitors that I conducted with the help of a co-worker, and how it increased conversions on our website.
Creating a hypothesis
Before you begin iterating on your landing page, you should first create a hypothesis. You can list the pros and cons, but ultimately two questions must be answered: why are you looking to make changes, and what do you believe the outcome will be? Answering them gives you and everyone involved much-needed clarity about your goals, because making changes to your website can be a daunting task.
Tracking web visitors
My tool of choice for tracking web visitors is Hotjar. It offers several features for enhancing user experience, but visitor recordings and heatmaps are the two most widely used. I’ve also used Crazy Egg in the past and consider it a great alternative to Hotjar.
By tracking web visitors in Hotjar, you get visual data on the user flow. This insight will aid your decision making.
By using the heatmap feature, you’ll see what web visitors are clicking and how far they’re scrolling. This is very useful for placing calls to action on the landing page.
The visual data you get from Hotjar is great for improving user experience. I once added a subscribe top bar to a landing page and set up tracking to see how people responded to it. While analyzing the user flow, it was clear that the subscribe top bar was hurting the user experience on mobile, so I changed it to be visible only on desktop. This is just one example of how UX can be overlooked when you have tunnel vision for increasing conversions. Having visual data at your disposal is key to avoiding these mistakes.
My little experiment
I once ran a small landing page experiment with a former co-worker with the goal of improving user engagement. My hypothesis was that simply making the buttons on the homepage clickable would make web visitors more active. This would decrease our bounce rate, increase our engagement, and increase our conversions.
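For reference, bounce rate is the share of sessions that view only one page before leaving. A minimal sketch of how it’s computed, using made-up session data:

```python
# Bounce rate = single-page sessions / total sessions.
# The session records below are hypothetical examples.
sessions = [
    {"id": 1, "pages_viewed": 1},  # bounced: left after the homepage
    {"id": 2, "pages_viewed": 4},
    {"id": 3, "pages_viewed": 1},  # bounced
    {"id": 4, "pages_viewed": 2},
]

bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)
bounce_rate = bounces / len(sessions)
print(f"Bounce rate: {bounce_rate:.0%}")  # 2 of 4 sessions -> 50%
```

Analytics tools report this for you; the point is simply that every visitor who clicks a dead button and leaves counts against this number.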
While analyzing our web visitors in Hotjar, I noticed a lot of interaction on our homepage. People were clicking on elements they assumed were buttons. Rather than venturing to another page, some of these visitors would simply leave the site without the information they were seeking. This was during a period when we were sending a ton of targeted traffic to our site via email and Facebook. It was also a lesson in UX: design has to “make sense” and be instinctive for people.
The graphic below is what website visitors saw on our homepage. You can see why people thought the elements were clickable.
We then added content to these buttons, and this is what people saw after clicking one.
Now we had nine buttons that people clicked for further information about our product. These simple changes made a world of difference to our web engagement. Aside from improving our conversion rate, our visitors learned about our product by reading the content within these buttons.
Small experiments aimed at minor improvements benefit you long-term. Even a 1-2 percent increase in conversions makes a difference when you break down your yearly numbers.
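A quick back-of-the-envelope calculation shows why. The traffic and rate figures below are hypothetical, but the arithmetic holds for any site:

```python
# What a small lift in conversion rate means over a year.
# All traffic and rate figures below are hypothetical.
monthly_visitors = 50_000
baseline_rate = 0.020   # 2.0% conversion rate
improved_rate = 0.022   # after a modest lift from a small experiment

baseline_yearly = monthly_visitors * 12 * baseline_rate
improved_yearly = monthly_visitors * 12 * improved_rate
extra = improved_yearly - baseline_yearly
print(f"Extra conversions per year: {extra:.0f}")  # 1200
```

A lift that looks trivial in a weekly report adds up to over a thousand extra conversions a year at this traffic level.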
How we tracked events
We set up Mixpanel to track clicks on the newly added buttons. This added credence to our hypothesis and gave us data to continue making improvements. Mixpanel is a great tool for A/B testing, funnel building, and segmentation, and it’s currently my tool of choice for tracking events. The analytics are easy to decipher, you get real-time data, and it pairs well with tracking web visitors.
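Conceptually, event tracking boils down to recording a named event per click and aggregating the counts. A minimal sketch of that idea in plain Python (the event names are hypothetical, and the `track` function here is a stand-in for an analytics library’s tracking call, not Mixpanel’s actual API):

```python
from collections import Counter

# Each button click is recorded as a named event; counts are then
# aggregated the way an analytics dashboard would display them.
events = Counter()

def track(event_name: str) -> None:
    """Record one occurrence of an event (stand-in for a real tracker)."""
    events[event_name] += 1

# Simulated clicks on three hypothetical homepage buttons.
for name in ["clicked_reporting", "clicked_reporting", "clicked_automation",
             "clicked_reporting", "clicked_integrations"]:
    track(name)

# Rank buttons by interaction, most-clicked first.
for name, count in events.most_common():
    print(name, count)
```

Once events are named consistently, ranking buttons by clicks is just a sorted count, which is exactly the view we used to decide what to change next.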
This is what the interaction data revealed after we began tracking the homepage buttons.
This provided invaluable insight into which aspects of our software our target market was most interested in. We already had data on which features our customers used the most, and now we could compare that with the results of this experiment. The data was also available for anyone who wanted to tailor their pitch to potential customers.
What we did from there
After running these experiments, we had useful data to put to use. By tracking website visitors, we could rank our features from most to least important. My co-worker and I decided to keep iterating on the buttons with the fewest interactions. We never ran out of substitute buttons, as our software had many features.
We used Optimizely to continually swap out the least interactive button, and the traffic was distributed evenly so we could gather statistically significant data. Optimizely is a great tool because it allows non-technical individuals to run experiments without the help of a developer. With basic HTML skills, making changes in Optimizely comes easily.
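Optimizely reports statistical significance for you, but the underlying check for a conversion experiment is commonly a two-proportion z-test. A rough sketch of that test, with hypothetical visitor and conversion counts for an evenly split experiment:

```python
from math import sqrt, erf

def two_proportion_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a two-proportion z-test between variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical evenly split traffic: 5,000 visitors per variant.
p = two_proportion_p_value(conv_a=100, n_a=5000, conv_b=140, n_b=5000)
print(f"p-value: {p:.4f}")  # below 0.05 -> the difference is unlikely to be noise
```

This is why even traffic distribution matters: uneven or undersized samples inflate the standard error, and an apparent winner can evaporate on a re-run.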
If you aren’t familiar with Optimizely, below is a random example from one of our experiments showing what the results could look like.
I first came across Optimizely when my former co-worker used it for a simple experiment to determine which phrase drew the most clicks on our CTA button. He also used Mixpanel to track the events and distributed the traffic evenly. These were the results from his test.
I would suggest using both Mixpanel and Optimizely as your tools of choice for A/B testing unless you have in-house tools with more features. They’re both easy to use and powerful enough for most experiments.
My goal for this post was to provide insight into A/B testing for those who haven’t had the opportunity to try it. I thoroughly enjoy running experiments, and I recognize the value they bring. This was a small experiment that I conducted with my co-worker, and the numbers show it was successful. A data-driven, A/B testing mindset keeps you from becoming complacent; it challenges you to keep innovating and looking for improvements. With more and more companies putting emphasis on analytics, teaching yourself analytical skills will only benefit you in the future.