Making Sales Momentum From a Trade Show Last Throughout the Year
A powerful trade association representing more than 1,000 companies in the construction and agriculture industries.
Every three years, this client supports one of the largest construction trade shows in North America. They wanted to translate the trade show’s real-world success into a robust online platform for buyers and sellers. Their first attempt was not generating the engagement they wanted, so our job was to identify the friction points in adopting the platform and use behavioral science to increase buyer and seller engagement.
Using Behavioral Science to Identify, Test, and Correct Issues
Behavioral Assessment: What’s not working?
Our assessment surfaced issues in two core areas:
1. Design. The website wasn’t optimized for behavior: the platform was hard to navigate and was missing important functionality, such as clear instructions on what to do next and the ability to compare products.
2. Messaging. The core value propositions were off the mark. Previous research had not measured baselines or real-life trade-offs, so when the client asked what people wanted, the answer was “everything.” They then built a website that tried to offer everything, but it was overwhelming and hard to use effectively: a typical case of “trying to be everything to everyone and becoming nothing to nobody.”
We presented these findings to the client, and they decided we should investigate the second core area because if the value props were wrong, no amount of optimizing the site would help.
Qualitative Research: Understanding the Audience Through Phone Interviews
We spoke with 10 contractors (buyers) and 3 manufacturers (sellers) to understand which of the client’s original ~15 unique value props (UVPs) they valued most. We also wanted to understand how people researched equipment and their level of tech-savviness.
Quantitative Test: What UVP Would Drive Behavior for Both Buyers and Sellers?
The qualitative interviews gave us evidence that people found the 3 UVPs below unique, valuable, and interesting. But people often say one thing when there are no consequences or real-world choices, and act quite differently when those ideas are put out into the real world.
Now we needed to know if people would actually take action on these value props:
1. Social Proof: The ability to leave reviews about equipment.
2. Relativity: The ability to compare multiple products side by side in a standardized way.
3. Talk to Techs and Engineers: Contractors want to talk to the people who know the machines and can honestly tell them what ownership is like: techs, not sales and marketing people.
We did not yet have a product to test this hypothesis on. To solve this, we ran a test with 5 different ads that looked like ads for a real platform:
- One for each value prop (See above)
- One that combined all 3
- One that was a control
We then directed people to a landing page that recalibrated their expectations: we asked them to sign up and to give us feedback on the product in the future.
These ads performed remarkably well. Every ad, including our control, outperformed the client’s average click rate. Ranked against the control:
1. Relativity: outperformed the control by 146%
2. All 3 Combined and Talk to Techs: each outperformed the control by 66%
3. Social Proof: outperformed the control by 35%
4. Control: outperformed the client’s average click rate by 75%
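The relative lifts above follow directly from each ad’s click-through rate (CTR) versus the control’s. As a minimal sketch with hypothetical CTR values (the campaign’s actual rates are not shown in this write-up), the lift calculation is:

```python
# Hypothetical click-through rates for illustration only --
# these are NOT the campaign's real numbers.
control_ctr = 0.010  # assumed control CTR of 1.0%

variant_ctrs = {
    "Relativity": 0.0246,
    "All 3 Combined": 0.0166,
    "Talk to Techs": 0.0166,
    "Social Proof": 0.0135,
}

def lift_over_control(variant_ctr: float, control_ctr: float) -> float:
    """Relative lift: how much better a variant's CTR is than the control's."""
    return (variant_ctr - control_ctr) / control_ctr

for name, ctr in variant_ctrs.items():
    print(f"{name}: +{lift_over_control(ctr, control_ctr):.0%} vs. control")
```

With these assumed inputs, the sketch reproduces the same relative lifts reported above (+146%, +66%, +66%, +35%), illustrating how the rankings were derived.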
The client is using our research results to give contractors a much better experience, rethinking their program around these 3 core value propositions. People not only reported finding them valuable; more importantly, they were willing to go out of their way to take action (i.e., click on an ad) because they found these features attractive. With these unambiguous results, the client now has a clear way forward to radically improve the program for their users.