Thoughts on AI-Assisted UX
This year has been the moment of AI. It has been steadily adopted into our personal tasks and office workflows, taking on minor jobs like writing an email, helping us choose the right words for a CTA, suggesting recipes and travel destinations, delivering robust weather forecasts, and more.
As the general public, the AI tools we have access to are Artificial Narrow Intelligence: built to complete a targeted task by imitating human behaviour. ChatGPT, Alexa, Google AI, and Adobe generative AI all fall into this segment. For now, these AIs are more like assistants to us humans.
AI for Designers
For designers, these AI models are becoming a third arm that helps us with our tasks. They can fetch information from the far corners of the internet, summarize lengthy material, and generate visualizations based on our prompts.
If I were to categorize them, these tools fall into three buckets:
1. Summarization
2. Image Generation
3. Desk research
1. Summarization
Its algorithms distil vast amounts of information into concise, actionable insights. In research notes, it extracts key findings, making complex data more accessible and easier to review. For call transcripts, it highlights essential discussion points, streamlining follow-up actions. In project documentation, it can synthesize key pointers and highlight areas of our workflow that need attention.
2. Image Generation
Given a prompt, the tool generates images based on its understanding, or modifies a reference image according to the prompt. Since we cannot yet generate brand-aligned imagery, its use is limited to storyboarding and internal presentations.
3. Desk Research
For market analysis, AI can quickly scan websites and reports to identify trends and opportunities. It can also analyse customer reviews and surveys to reveal patterns and preferences that might otherwise be missed amid information overload.
The Catch
The processed information we get back is what the AI thinks the answer should look like. For instance, when I asked an AI to find good UX practices in care-management digital products, it returned a list of websites, one of which belonged to a civil engineering and real-estate firm. The results also vary from one conversation to the next.
AI also falls into common pitfalls:
1. Potential bias: On specific topics, AI can draw on outdated sources and produce biased findings. It cannot reliably distinguish credible sources from unreliable ones.
2. Data quality: AI is at the mercy of the data available to it and will sometimes struggle to translate that data into what a designer needs. Asking follow-up questions in such situations often produces hallucinations.
These are problems that we as designers also face. As the users of these tools, it falls on us to use them responsibly. We have to be critical of AI's output and mindful that our prompts dictate its suggestions.
Leveraging the tool
Axure, released in 2002, was among the first tools to introduce digital prototyping. Sketch, released in 2010, became the de facto tool that took over the entire UI workflow. Then came a plethora of tools like Figma, Framer, and Miro, each building on its core strength to help designers in their workflow. Now generative AI tools have joined the fray.
As designers, we should leverage any tool that helps us work efficiently and save time. In time, generative AI models will mature and penetrate more parts of our workflow. Adopting them early gives us a head start in understanding how these tools behave and will reshape the future of working in a digital-product environment.
Note: This article is written based on my experiences with generative AI tools, and the opinions are subject to change as the generative algorithms mature. Except for ChatGPT and Meta AI, the tools mentioned in this article are available for use across Elevance and its sister companies with the default software licenses shared with designers.
This article was also published in the Elevance Health UX Digest, Oct 2024.