
SQUERL AI
• Generative AI Companion •
Overview:
The Task
AI integration is a core part of this product, enhancing and personalizing the content delivery experience. My task was to research and explore how best to integrate AI-driven functionality into the app in a way that felt intuitive, helpful, and aligned with the overall experience.
Impact
Designed a scalable content delivery experience that supports both real-time and asynchronous interactions, laying the groundwork for future AI integration.
Anticipated future AI-driven features by designing interaction models and interface components that could seamlessly evolve without needing to rework the core UX.
Explored and defined early interaction patterns for AI-driven features, setting a clear vision for how AI could enhance content delivery and user engagement in future versions of the app.
Reduced cognitive load and typing expectations by designing AI-supportive interactions that fit naturally within the user’s viewing context and mobile behavior.
Informed future development by researching mobile AI heuristics and best practices, producing a reference-able set of design principles tailored for AI on mobile.
What didn't work
To keep the UI from looking cluttered, I started with the suggestion pills at the top of the screen. This proved problematic because it placed them far from the prime thumb touch zones. After multiple iterations, I arrived at a solution that resolved the issue.
The Accomplishments
The research conducted on AI for mobile provided valuable insights to integrate into the design explorations. The resulting solutions reduced the amount of typing required of users and allowed for flexibility without clutter, giving us a solid design direction and foundation ready to be tested and implemented.
The Challenges
Reducing Typing & Cognitive Load
One of the key challenges was designing for users who would be interacting with the app both during and outside of viewing experiences. During viewing, attention is divided, so interactions needed to be quick, low-effort, and reduce cognitive load. Outside of those moments, users often engaged out of curiosity or in response to new content. This made it essential to minimize typing expectations, especially during active viewing.
AI at the PoC Stage
At the Proof of Concept stage, AI functionality was still in early development and existed only on paper. It wasn’t being implemented in the app yet, which meant we couldn’t observe it in action or validate assumptions through user testing. This required us to think ahead and design speculative features that could align with future goals.
Balancing Flexibility with Simplicity
We also faced the challenge of offering flexible interactions without introducing clutter. Users needed access to options and features when relevant, but the interface had to remain clean and focused. We needed a design that surfaced functionality only when the user asked for it, maintaining simplicity while still offering depth.
Design Process



Research
For this feature, my research focused on understanding the core requirements, identifying best practices for AI integration on mobile platforms, and exploring whether any established heuristics existed specifically for AI-driven experiences. This helped inform how AI could be introduced in a way that felt natural, supportive, and aligned with user expectations.



Comparison:
As part of the research process, I reviewed other products that incorporated AI to identify patterns that could be adapted for this app, and to avoid those that might hinder usability. Since one of the main goals was to reduce typing input, I focused on alternative interaction models that minimized friction. I also used ChatGPT to explore emerging design strategies and gather insights on how AI can support mobile experiences in context-sensitive ways.










Documentation:
The research process led to a working list of best practices, heuristics, and design patterns that could be applied to this feature. These guidelines helped inform decisions throughout the design phase and ensured the AI-driven functionality aligned with user expectations and mobile usability standards.



Ideate & Design: Early Explorations
This stage involved multiple rounds of iteration, each helping to surface new insights and refine the direction of the feature.
Some of the key tools and patterns explored included:
- A floating AI assistant button for quick access
- Suggestion pills to reduce input effort
- Auto-fill options to streamline user interactions
These elements were designed to support a more fluid, low-friction experience, particularly in contexts where the user’s attention might be divided.
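To illustrate how suggestion pills could reduce typing, the pattern can be sketched as a simple mapping from the user's viewing context to tappable, prefilled prompts. This is a hypothetical sketch only; the names, contexts, and prompts below are illustrative, since the AI functionality was not implemented at this stage.

```typescript
// Hypothetical sketch: context-aware suggestion pills that prefill a prompt,
// so the user taps instead of typing. All names here are illustrative.
type ViewingContext = "watching" | "browsing" | "post-viewing";

interface SuggestionPill {
  label: string;  // short text shown on the pill
  prompt: string; // full prompt sent to the AI when the pill is tapped
}

function suggestionsFor(context: ViewingContext): SuggestionPill[] {
  switch (context) {
    case "watching":
      // Attention is divided during viewing: keep pills short and low-effort.
      return [
        { label: "Who is this?", prompt: "Who is the person on screen right now?" },
        { label: "Summarize", prompt: "Summarize what has happened so far." },
      ];
    case "browsing":
      return [
        { label: "Recommend", prompt: "Recommend something similar to my recent views." },
      ];
    case "post-viewing":
      return [
        { label: "Recap", prompt: "Give me a recap of what I just watched." },
      ];
  }
}

// Tapping a pill sends pill.prompt directly, with no typing required.
```

The key design idea is that the pill's visible label stays short while the underlying prompt carries the full intent, so a single tap replaces a typed question.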
Design & Iterate
As designs evolved, they needed to support both AI-driven interactions and mobile design best practices. Balancing these two priorities was an interesting challenge.
For example, to avoid visual clutter and prevent the UI from feeling bottom-heavy, I initially placed suggestion pills at the top of the screen. However, this conflicted with mobile thumb zone best practices. After further iteration, I moved the pills to the bottom of the screen, striking a better balance between visual hierarchy and ergonomic usability.

Ideate & Design: Color and Content
Hi-Fi Wires:
Once the core direction was defined and the design foundation felt solid, we moved into high-fidelity wires, applying color and refining the visual style to align with the broader product identity.
Although this feature wasn’t being implemented at this stage, I documented functionality and UX expectations directly within the wireframes to capture intent and provide clarity for future development.






Final Designs
As the UI Designer on the project, I took the UX designs through to full-color deliverables. This included creating the iconography, selecting and applying typography, designing the logo, defining the color scheme, and producing all UI art to ensure a cohesive visual language across the product.
Next Steps
Since this was a Proof of Concept for the MVP and not slated for implementation at this stage, several areas still need further development. Upcoming work includes conducting usability testing, fleshing out sub-screens, and finalizing the full list of requirements and stakeholder expectations.



