
Annabelle Frohn
Lead UX Designer
5 Mar 2025, 6 min
Introduction
Reflecting on my time in a Master's program for Interaction Design, I recall a pivotal debate: "Should designers know how to code?" I was tasked with arguing against it—ironically, given how foundational a basic grasp of coding later proved to be in my own UX practice. That debate underscored a fundamental truth: design isn't just about aesthetics; it's about understanding the systems that shape user experiences. Today, the same principle applies to AI. As designers, we can't afford to treat AI as a black box—we must engage with its mechanics to create experiences that are not only functional but also ethical, inclusive, and truly user-centric.
Every day, artificial intelligence shapes the digital experiences of billions of users—from personalized recommendations to automated customer service. According to Gartner [1], by 2025, over 70% of enterprises will have operationalized AI architectures. This shift towards AI-driven design isn't just trendy—it's driven by compelling business outcomes. Companies are increasingly turning to AI for design decisions because it can process vast amounts of user behavior data, identify patterns humans might miss, and adapt interfaces in real-time based on user interactions. McKinsey's research [2] shows that companies that fully absorb AI tools across their workflows and business functions can potentially double their cash flow by 2030.
However, despite these efficiencies, AI lacks the human intuition, ethical reasoning, and emotional intelligence needed to craft truly meaningful and inclusive experiences. Designers provide the critical human perspective—ensuring that AI-driven decisions align with user needs, cultural nuances, and ethical considerations that algorithms alone might overlook.
Yet, as AI’s influence grows, many UX designers are building experiences around technologies they don’t fully understand. This gap is no longer sustainable. To design AI-powered products that are responsible, inclusive, and user-centric, we must go beyond the interface and understand the systems driving them. Otherwise, we risk designing experiences around a “black box,” unable to anticipate its consequences—or control its impact.

UX designers often treat AI systems as obscure black boxes. Original illustration by Simon Prades
When designers lack understanding of AI fundamentals, seemingly minor design decisions can cascade into significant user problems. Research from Microsoft [3] shows that designing appropriate user interfaces for AI systems remains one of the biggest challenges in creating effective human-AI interactions. Consider Netflix's recommendation system evolution, documented in their technical publications [4, 5]. Their initial recommendation approach relied heavily on explicit user ratings, which proved limiting as users often rated what they thought they should like rather than what they actually enjoyed watching.
After enhancing their UX with an AI-informed approach, Netflix implemented a more nuanced preference system that incorporated viewing history, browsing patterns, and time-of-day context. This redesign—created by designers who understood the algorithmic capabilities and limitations—significantly improved content discovery and reduced browsing time. As Gomez-Uribe and Hunt from Netflix explain, their recommendation system "enables us to get the right content in front of our members at the right time" [5], showcasing how UX designers with strong AI literacy can create interfaces that effectively translate user inputs into algorithm-ready data, resulting in measurably improved experiences.
The implications extend far beyond entertainment. AI now powers critical systems in healthcare, finance, and public services. According to research published in the Journal of the American Medical Informatics Association [6], effective design of AI interfaces in healthcare settings is critical for preventing misinterpretation of AI recommendations. Poor design choices in these contexts can reinforce biases, exclude vulnerable users, or even cause direct harm. UX designers serve as the bridge between user needs and technical implementation—we have both the opportunity and responsibility to shape how AI systems interact with humans.
Rather than viewing AI as a mystifying black box, designers should understand these key principles that directly impact user experience:
The AI systems we design are only as good as the data they learn from. Research from Google [7] has demonstrated that thoughtfully designed data collection interfaces are crucial for producing effective AI systems. This means:
Creating intuitive interfaces for data collection that encourage accurate input: e.g., Airbnb's calendar selection tool that automatically highlights available dates and grays out unavailable ones, reducing errors compared to manual date entry. As documented in a case study by Nielsen Norman Group [8], such thoughtful design patterns can significantly improve data accuracy.
Designing transparent consent mechanisms: e.g., Clear, contextual explanations of how personal data will be used at the moment of collection. Research by Cranor [9] shows that transparent, user-friendly consent mechanisms increase both user trust and willingness to share accurate information.
Implementing progressive disclosure: e.g., Prompt new mobile app users to fill in personal details at relevant steps, such as inputting their location when selecting shipping preferences, rather than front-loading profile creation as an onboarding task. This approach has been shown to increase completion rates while improving data quality [10].
Building inclusive data collection methods: e.g., Voice recognition systems that account for different accents and speech patterns. Microsoft's inclusive design guidelines [11] emphasize the importance of collecting diverse data sets to ensure AI systems work for all users.
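The progressive disclosure pattern above can be sketched in code. This is a minimal, hypothetical example (the step names, field names, and mapping are illustrative assumptions, not a real product's schema): each step in a flow declares only the data it genuinely needs, and the UI asks only for fields the user hasn't already provided.

```typescript
// Hypothetical sketch of progressive disclosure: instead of
// front-loading a long onboarding form, each step requests only
// the profile data it actually depends on.

type Field = "email" | "name" | "location" | "paymentMethod";

interface Profile {
  email?: string;
  name?: string;
  location?: string;
  paymentMethod?: string;
}

// Illustrative mapping of which fields each step genuinely needs.
const fieldsByStep: Record<string, Field[]> = {
  signup: ["email"],
  shipping: ["name", "location"],
  checkout: ["paymentMethod"],
};

// Return only the fields still missing at this step, so the UI
// never asks for data the user has already provided.
function fieldsToRequest(step: string, profile: Profile): Field[] {
  return (fieldsByStep[step] ?? []).filter((f) => profile[f] === undefined);
}

const profile: Profile = { email: "ada@example.com" };
console.log(fieldsToRequest("signup", profile));   // []
console.log(fieldsToRequest("shipping", profile)); // ["name", "location"]
```

Because each prompt arrives in context (asking for a location while choosing shipping, for example), users understand why the data is needed, which supports both completion rates and data quality.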
To create ethical AI experiences, these principles must translate into concrete design practice. Research from the University of Minnesota [12] found that recommender systems that give users more control over their data and preferences lead to higher user satisfaction and engagement. When designing chatbots or AI assistants, Microsoft's guidelines for conversational AI [3] emphasize setting clear expectations about AI capabilities and providing smooth handoffs to human agents when needed.
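A handoff rule of this kind can be expressed very simply. The sketch below is a hypothetical illustration in the spirit of those guidelines, not Microsoft's implementation; the threshold and the two-attempts rule are assumptions chosen for the example.

```typescript
// Hypothetical sketch: hand a conversation off to a human agent
// when the bot is unsure of the user's intent, or when the user
// has been misunderstood repeatedly.

interface BotTurn {
  intentConfidence: number; // intent classifier confidence in [0, 1]
  failedAttempts: number;   // consecutive misunderstood turns
}

// Escalate when confidence is low or the user is visibly stuck.
function shouldHandOff(turn: BotTurn, threshold = 0.4): boolean {
  return turn.intentConfidence < threshold || turn.failedAttempts >= 2;
}
```

The design point is that the escalation rule is explicit and tunable, so the team can decide deliberately, rather than by accident, how long a user struggles before reaching a human.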
Personalized experiences demand a delicate balance between algorithmic precision and user autonomy. Effective AI-driven recommendation systems should display confidence levels alongside their suggestions, helping users understand the reliability of each recommendation. By explaining the rationale behind AI decisions, we empower users to make informed choices while building trust in the system. Furthermore, allowing users to adjust algorithm parameters gives them agency over their experience. Perhaps most importantly, recommendation systems should break free from the echo chamber effect by presenting diverse options beyond primary recommendations, encouraging discovery and serendipity.
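To make the ideas in this paragraph concrete, here is a minimal sketch of a presentation layer that labels confidence, carries a rationale, and deliberately mixes in one lower-ranked item to counter the echo-chamber effect. All names, scores, and thresholds are illustrative assumptions, not a real recommender's API.

```typescript
// Hypothetical sketch: surfacing confidence and rationale with
// AI recommendations, plus one "wildcard" pick for diversity.

interface Recommendation {
  title: string;
  score: number;      // model confidence in [0, 1]
  rationale: string;  // why this item was suggested
}

// Map a raw score to a label users can actually interpret.
function confidenceLabel(score: number): "high" | "medium" | "low" {
  if (score >= 0.8) return "high";
  if (score >= 0.5) return "medium";
  return "low";
}

// Present the top picks with confidence and rationale, and append
// a lower-ranked item so users see options beyond the obvious.
function present(recs: Recommendation[], topN = 3): string[] {
  const sorted = [...recs].sort((a, b) => b.score - a.score);
  const picks = sorted.slice(0, topN);
  const wildcard = sorted.slice(topN).find((r) => r.score >= 0.3);
  if (wildcard) picks.push(wildcard);
  return picks.map(
    (r) => `${r.title} (confidence: ${confidenceLabel(r.score)}) - ${r.rationale}`
  );
}
```

Showing the label and rationale together is what turns an opaque ranking into something a user can weigh, question, or override.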
As AI continues to evolve, UX designers must position themselves as advocates for responsible innovation. This evolution requires a fundamental shift in how designers approach their craft. Rather than viewing AI as a separate technical component, designers need to actively immerse themselves in understanding AI capabilities and limitations. This knowledge enables more meaningful collaboration with data scientists and engineers, creating a shared language that bridges the gap between technical possibilities and user needs.
Thorough user research becomes even more critical in AI-driven experiences, as it reveals not just usability issues but also potential biases and ethical concerns. By incorporating ethical considerations into the design process from the outset, designers can help shape AI systems that respect user privacy, promote fairness, and maintain transparency. This proactive approach to ethical design isn't just about avoiding harm—it's about creating AI experiences that actively benefit users and society.
The future of digital experiences will be increasingly AI-driven. By understanding and thoughtfully designing these systems, we can ensure they enhance rather than diminish human experience. The challenge for UX designers isn't just to make AI interfaces usable—it's to make them trustworthy, inclusive, and genuinely beneficial for all users. As we navigate this complex landscape, our success will be measured not by the sophistication of our AI systems, but by how well they serve and empower the humans who use them.
At Hypersolid, we view the intersection of technology and creativity as a powerful space for innovation. Our team of data and AI experts works alongside designers to explore how AI can drive impactful solutions across many aspects of business, as we've done with clients like Lotus. Beyond the technical side, we design meaningful brand experiences, such as Polestar's D2C commerce environment, a website unlike any other automotive brand's, which seamlessly blends rich brand experience with conversion. We don't believe in the traditional advertising model; instead, we deliver differentiated, targeted campaigns for brands like IMC, Under Armour, and Heineken, leveraging AI both to optimize assets and to surface quick, deep consumer insights that drive creativity.
As AI continues to evolve, we look forward to collaborating with businesses to address specific challenges and explore new possibilities. Through an initial engagement, we establish an integrated way of working with internal IT and marketing teams and begin to demonstrate impact. Our experience with other clients shows that this is usually the start of a longer partnership in which we help internal organizations become more ready for disruption, speed up experimentation, and build a solid technical and brand foundation.
Contact us and let's get started.