Screens from the MVP product, which launched in September 2024.
Problem
This project started as a proof-of-concept: how might the Lowe's Visual Commerce team use generative AI capabilities to increase customer conversions? Engineers created a demo of the now-core Style Your Space functionality, using Stable Diffusion to restyle photos and our internal visual search capabilities to find similar items in the Lowe's catalog.
I partnered with research to design a test of this proof-of-concept: it was part evaluative (could customers use and understand the POC?) and part generative (did this feature appeal to customers; how might they use it?). The directional findings we heard were:
- Customers were very interested in the functionality of the POC. They thought it would help them with home improvement inspiration and finding items to bring that inspiration to life.
- They were confused by prompting, especially outside of a conversational affordance. They wanted a simpler way to style photos.
- They were most interested in pre-defined styles; adjusting color was less valuable to them.
Proof-of-concept. This is one of the screens used in the proof-of-concept test conducted with users.
Approach
After discovery, the team and I moved into iterative design. Because this was a zero-to-one project, there was a great deal to define and explore. Over the course of this work, we conducted four additional evaluative tests with users.
Early Style Your Space customer flow, used to define the customer experience and to communicate this experience to stakeholders.
Technology changes drove iteration alongside research: our engineering partners were improving the models over the course of the project. Capabilities such as the time to load generated images, which styles could be generated, how many could be generated, and the fidelity of the generated images themselves all changed throughout the product development lifecycle. I worked directly with engineering to understand how these capabilities were evolving, and with my team to help them design within the resulting constraints.
Solution
Key insights from our four evaluative research studies shaped the final design:
- Despite being a net-new feature, testers found the user interface very intuitive.
- Users could not reliably distinguish between AI, AR, and VR; they mostly knew AI as "the cool new thing."
- Testers also did not care much about those distinctions; the technology we were using was not the value proposition. The outcome was what mattered to them.
- Testers liked Style Your Space and understood its value, but they nearly unanimously asked for two features that were not technically feasible: object replacement (e.g., swapping a couch in a generated photo for a different couch) and the ability to shop the exact items shown in the image (rather than products "inspired by" it).
Examples of some of the screens used in testing. (Style Your Space's working title was "Inspiration Engine.")
Outcomes
After multiple user tests, dozens of concept iterations, and close partnership with PMs and engineering, Style Your Space launched to the public in September 2024. At launch, we saw initial customer engagement with the feature and an early 16% conversion lift among users who tried it.
Style Your Space is currently live in the Lowe's iOS and Android apps as the company's first generative AI consumer feature.
A prototype of Style Your Space from its September 2024 launch.
Reflection
Style Your Space was Lowe's first generative AI consumer feature and established foundational patterns for AI visual commerce at scale. The project taught me how to design for emerging ML capabilities, translate messy constraints into consumer-ready experiences, and validate new technology through iterative user testing.