Digital Technologies in CX: Privacy, Ethics and the User Experience
- Kristine Aitchison

- Oct 9

With the ongoing drive for better, faster, more capable technologies, organisations are increasingly investing in digital technologies.
In 2025, worldwide IT spending is expected to reach $5.74 trillion, a 9.3% increase from 2024, with cybersecurity spending alone forecast to rise by 15% to $212 billion. On average, companies are investing about 7.5% of their revenue in digital transformation initiatives. With this in mind, what risks are our tech leaders mitigating to ensure this substantial investment pays off?
This month, at our upcoming Design Safari, we’ll hear from Auror, a retail crime intelligence and loss prevention platform. They’ll discuss how commercial organisations can embed privacy and ethics into their design decisions, and how a people-led culture can directly influence the user experience and product solutions.
Designing with Integrity
With the increased use of technology and automated systems, customers are becoming increasingly wary about how their data and security are safeguarded. They expect transparent data handling and clear guidelines for how their information will be used.
As an example, LinkedIn recently updated their privacy terms and conditions. Effective November 2025, they have expanded their data sharing and cross-platform tracking. This means LinkedIn will be sharing more information with Microsoft, including profile details, feed activity and ad interactions. LinkedIn have advised that this data will be used to shape what you see in your newsfeed and to personalise ads.
While we’ve all known for a while that social media platforms collect information on our scrolling habits, it raises the question: What are the ethical implications of these actions?
How organisations handle a customer's data can directly influence their experience. Transparency and ethical data practices are not just tick boxes; they’re essential to creating meaningful, respectful digital experiences and maintaining customer trust.
For CX designers, this means complying with all privacy standards and laws when collecting customer data, prioritising privacy, and avoiding misuse.
Designing for the End-User
When designing new technology, it’s easy for companies to get swept up in innovation for innovation’s sake. Many organisations rush to develop the next big app, platform, or AI integration to prove they’re ahead of the curve. But by doing so, they can lose sight of the very people they’re building it for.
A great example is Google’s integration of AI into its search function. While it was a major technological leap, according to Google Trends the search term “how to turn AI Overview off on Google” has seen a 350% increase in the past 12 months. This suggests a disconnect between innovation and user needs, and raises the question: ‘Did anyone think to ask if we actually wanted this?’
It’s a reminder that technology should enhance, not overwhelm, the customer journey. It is our role as CX designers to educate organisations about their responsibility when it comes to these initiatives, and to ensure every new tool or feature genuinely improves the experience for the end user instead of simply following a trend.
How to Design CX With the Customer in Mind
We recently ran an online session with our community to discuss how we are using technologies like AI, and their implications for ethics and privacy. Many of our participants raised concerns about a lack of trust in AI and a desire to retain human agency and control.
As AI and technology become more interwoven into how products learn, adapt, and automate experiences, the risks become more tangible. For CX professionals, trust erosion, data harm, reputational damage, and societal bias become more challenging to manage.
As organisations integrate more tech into their operations, it is essential to put in place clear guidelines that ensure data privacy, fairness, and transparency. Adopting an ethical approach to AI not only helps protect against legal risks but also strengthens trust and safeguards a company’s reputation.
In essence, a CX designer's role is to ensure every interaction a customer has with an organisation is frictionless, trustworthy, and positive, while also driving loyalty and satisfaction. Designing CX for the user means keeping people at the centre of every decision across the customer journey. It means designing with empathy and intention, and creating technology that feels empowering rather than invasive, ultimately strengthening the relationship between people and the products they use.
Auror: What They Do
Auror is a tech platform focused on retail crime intelligence and loss prevention. They help retailers, security teams, and law enforcement partners share intelligence, identify patterns, prevent theft, and reduce harm to staff, stores, and communities.
Because Auror deals with sensitive incident data — surveillance, patterns of behaviour, associations between actors — the stakes for privacy, ethics, and the end-user are high:
- A slip in security or a data mishandling could expose vulnerable individuals.
- Decisions made by analytics models could affect real lives (e.g. suspects, businesses).
- The trust of retail partners, law enforcement, and the public hinges on strong ethical guardrails.
At their upcoming Design Safari, the team will reveal how they weave privacy, ethics, and human-centred design into every layer of their work, from internal culture to product strategy.
Auror’s approach shows that for tech companies with a real-world footprint, privacy and ethics aren’t optional extras. They are core to design thinking, decision-making, and organisational culture.
Grab a ticket to the upcoming Design Safari with Auror and join the conversation!