The overwhelming majority of respondents (73% in the UK, 81% in the US and 82% in Australia) criticised companies for collecting too much of their personal or financial data, and nine out of ten consumers are concerned that AI will affect how companies keep customer data secure. Meeting customers' growing expectations for better protection of their sensitive information is now a business imperative: consumers are prepared to punish any loss of trust by switching providers.
The survey of more than 6,000 consumers worldwide gauged attitudes towards the digital industry's data practices. Alongside the criticism of companies’ hunger for data, the findings also spell out an unmet expectation that organisations will exercise greater diligence in protecting consumers’ personal information once it has been obtained, a view held by 73% of respondents in the UK, 86% in the US, and 87% in Australia.
Respondents aren’t just asking for change: they are willing to switch providers if they fall victim to a cyberattack and their data is compromised. More than 90% of users in all three countries said they might stop doing business with a company that suffered a cyberattack.
"Consumers clearly understand that companies have a lot of catching up to do in the area of data governance and security," explains James Blake, Global Cyber Security Strategist, Cohesity. "The hunger for AI is causing some businesses to skip threat modelling and due diligence on how their data will be exposed. Companies looking to use AI in-house must invest in the security and hygiene of their data to maintain cyber resilience in order to satisfy these consumers that are willing to vote with their purchases. Those looking to leverage the AI capabilities of suppliers must adopt a strong and proactive approach to third-party risk. Consumer trust is quickly lost, and competitors are always just a click away, so ensuring AI strategies don’t introduce additional risk to customer data is crucial.”
Most common consumer fears surrounding unregulated AI data collection
Companies around the world expect miracles from AI, but large amounts of data must be collected for these AI models to learn from, and this need for data is often prioritised over responsible data collection and handling. Consumers, in turn, are concerned about the lack of transparency from companies regarding their AI practices:
• Nearly all consumers (87% in the UK, 92% in the US, and 93% in Australia) are concerned that AI will make securing and managing their data much more challenging.
• Most even go a step further, classifying AI as a risk to data protection and security (64% in the UK, 72% in the US, and 83% in Australia).
• Compounding these fears about AI's implications for their data, consumers (70% in the UK, 81% in the US, and 83% in Australia) are seriously concerned about the unrestricted or unpoliced use of AI on their data, with the vast majority demanding greater transparency and regulation.
• At a minimum, consumers (74% in the UK, 85% in the US, and 88% in Australia) want to be asked for permission before their personal or financial data is fed into AI models.
The expectation of greater transparency also applies to the common practice of sharing data with third-party providers:
• The vast majority of respondents (79% in the UK, 87% in the US, and 90% in Australia) want to know who their data is being shared with.
• Most respondents (77% in the UK, 85% in the US, and 90% in Australia) also call for companies to vet the data security and management practices of third-party providers that have access to customer data.
Sanctions in the event consumer data is compromised
This clear call for more control, transparency, and protection around their data is largely motivated by respondents' negative experiences. Unsurprisingly, most respondents are highly critical of companies buying back access to compromised data, a practice that essentially fuels cyber criminals' business model with fresh capital.
• A significant share of those surveyed (46% in the UK, 75% in the US, and 62% in Australia) had been personally affected by a cyberattack.
• More than half of those surveyed (56% in the UK, 52% in the US, and 58% in Australia) do not agree that companies should pay ransoms, condemning the common practice of buying a way out of ransomware attacks.
“Paying a ransom rarely results in the recovery of all data. It brings its own logistical challenges and potential criminal liability for paying sanctioned entities, not to mention rewarding criminals,” explains James Blake, Global Cyber Security Strategist, Cohesity. “It’s time for companies to really focus on aligning themselves with the best cyber resiliency vendors and end the cycle.”