
So-called "growth hackers" will hate me for this, but I find the results from analytics tools absolutely useless, whether they come from some drop-in tracking script or a custom-built backend. I don't subscribe to this whole "data-driven" way of doing things, because when you dig down, the data is almost always wrong. We once removed a table view in favor of a tile overview because the majority seemed to use the tiles. Small detail: the tiles were the default (bias!), and the table didn't render well on mobile. When we actually spoke to users, they told us they liked the table better; we just had to fix it. Did people really like page B, or was A's design better except for the incessant crashing? Did you really have zero Android/Firefox users, or do those users run blocking extensions? It's the classic feature-phone story: the analytics showed that people loved solid, dependable feature phones and hated the slow-as-fuck smartphones with bad touchscreens, when the reality was that people hated the details of early smartphones but loved the concept.
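
That last question is answerable with a few lines of client code. Here is a minimal sketch, assuming a hypothetical first-party /ping endpoint and a tracker URL of the kind blocklists typically filter (both names are invented for illustration): if the tracker script fails to load, count the visit anyway, so you can compare real traffic against what the dashboard claims.

    // Hypothetical sketch: estimate how many visitors block your tracker.
    // The tracker URL and the /ping endpoint are made up here; ad blockers
    // rarely filter same-origin requests, so the beacon usually gets
    // through even when the third-party tracker does not.
    const tracker = document.createElement("script");
    tracker.src = "https://analytics.example.com/tracker.js"; // typically blocklisted
    tracker.onload = () =>
      navigator.sendBeacon("/ping", JSON.stringify({ trackerBlocked: false }));
    tracker.onerror = () =>
      navigator.sendBeacon("/ping", JSON.stringify({ trackerBlocked: true }));
    document.head.appendChild(tracker);

If the blocked count is a sizable share of visits, your "zero Android/Firefox users" number says more about extension adoption than about your audience.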

Instead, invite ten random people to use your product in exchange for a gift coupon, and film them interacting with it and commenting on usability. I promise you, those ten people will give you better data than your JS snippet can drag out of a million users. The opinions and feelings of users are not objective and easily classifiable; they're fuzzy and detailed, with lots of asterisks. If a feature increased signups, did you also look at churn? Or did you just run a bait marketing campaign that produced a sudden peak while scaring away loyal customers?
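
The signups-versus-churn check is cheap to run. A hedged sketch, assuming a made-up User record and a 30-day inactivity window (neither comes from the post):

    // Hypothetical sketch: judge a campaign cohort by retention, not just
    // the signup spike. The User shape and the window are assumptions.
    interface User {
      signedUpAt: Date;
      lastActiveAt: Date;
    }

    // A user counts as churned if inactive for `windowDays` as of `asOf`.
    function churnRate(cohort: User[], asOf: Date, windowDays = 30): number {
      if (cohort.length === 0) return 0;
      const cutoff = asOf.getTime() - windowDays * 86_400_000; // ms per day
      const churned = cohort.filter(u => u.lastActiveAt.getTime() < cutoff);
      return churned.length / cohort.length;
    }

Compare the campaign cohort against a pre-campaign baseline: a signup peak followed by noticeably higher churn is bait, not growth.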
