Conversations in a fashion metaverse
Imagine: social distancing restrictions are over. It’s safe(r) to go out again! After a long, grueling era of pandemic stress, you make plans to attend a special public event, the first in what feels like forever. Clearly, this is cause for celebration, and what better way to mark the occasion than to dress yourself up a little?
You reach for your phone and virtually try on some shoes to match the outfit you’ve been eagerly waiting to wear. It’s so much easier to slip on that little number and swipe through AR filters that cover your feet with different shoe styles from the comfort of your own home rather than make the long trek to the store. Is that the future of fashion? No, in fact, that’s the present.
Luxury retailers are no strangers to high tech. From virtual reality fashion shows where it seems like the clothes are walking themselves down the runway to digital clothes you can buy but never actually touch, fashion has the potential to bring far off tech closer into an everyday reality.
At present, voice commerce has yet to really take off. The technology is a product of a past revolution, but its implementation is still waiting on a revolution of the future. Clothing purchases account for only 8% of all voice-assistant orders. Which means, if you’d like to talk to someone about the kind of clothes you’re buying without doing the in-person retail experience, your only option is to message a chatbot.
The first notable chatbot of its kind was the 2016 Facebook Messenger bot by the American brand Tommy Hilfiger (which appears to no longer be in operation, see image). The TMY.GRL chatbot experience provided a way for customers to directly interact with the brand and receive personalized recommendations based on their style preferences, handling slot values like “sporty” and “classic cool”. Many other brands followed suit, shedding the skin of bland customer-service-oriented bots in favor of multi-service automation, emojis included.
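To make the slot-filling idea concrete, here is a minimal sketch of how a bot like that might map a style-preference slot to recommendations. The catalog entries and function names are invented for illustration; a production bot would use a proper NLU service rather than substring matching.

```python
# Hypothetical sketch: turning a style-preference "slot" into
# product picks. The slot values "sporty" and "classic cool" come
# from the article; the catalog itself is invented.

CATALOG = {
    "sporty": ["logo hoodie", "track pants", "hi-top sneakers"],
    "classic cool": ["denim jacket", "white tee", "chino shorts"],
}

def recommend(utterance: str) -> list[str]:
    """Match a known style slot in the user's message and return picks."""
    text = utterance.lower()
    for style, items in CATALOG.items():
        if style in text:
            return items
    # A real bot would respond with a clarifying question here.
    return []

print(recommend("I'm feeling sporty today"))
```

The point is less the string matching than the shape of the exchange: the user volunteers a preference, the bot resolves it to a known slot value, and the brand gets a structured signal it can act on.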
In all this, what seems to be missing is a truly tailored shopping experience, which is where true opportunity lies. What if you didn’t have to imagine how a piece of clothing hung on your body? What if you could pair a new item with things you already have in your closet? What if we raise the bar a little and think bigger?
But before we do that, let’s explore a bit of fashion tech “history”.
Nothing screams innovation in fashion more than Cher’s iconic closet in the ’90s teen film Clueless (see the scene here).
The imaginary interface was a godsend for all the fashionistas out there who don’t want to go through the motions of trying something on, slipping it off, and mixing and matching while making a mess of their hair, and who instead want to experiment with new pairings in their closet, conveniently. Cher’s wardrobe represents what all fashion lovers want: a tailored experience that keeps the closet owner in the driver’s seat of decision-making, and informed decision-making at that.
In the interaction, we see a “Dress Me” feature, which populates a photo of Cher so that an approved outfit choice may be previewed on the screen, eliminating the need for the final sign-off step in front of the mirror. When two items are not suitable to be worn together, a big “Mis-match” error toast appears (presumably programmed with some authority’s logic on what counts as fashionable or in season), forcing Cher to choose something else. Taking the “expert opinion” input one step further, once Cher selects a yellow two-piece set, matching accessories and shoes are also displayed on screen, simplifying the “what to wear” process even more.
It’s a 30-second interaction with a technology that doesn’t exist, but it’s a UX that could benefit many customers and meet them where they already are: stuck at home for the past two years.
As a die-hard fan of fashion-related competitive reality shows (think Project Runway and ANTM), I was drawn to Stylish with Jenna Lyons. What particularly stood out to me was the final episode, where the promising contestants, a.k.a. “job candidates”, prepared the ultimate final-round presentation to prove they deserved a place at Jenna’s company. One of the contestants, Alon Cameron, dazzled the judges with something that combined her backgrounds in digital arts and fashion styling: a VR-immersive runway fashion show.
The VR experience transports you right in the middle of a luxury runway show on the streets of New York City (see video below). The models walk toward you, and detailed video close-ups of each outfit component surround you on both sides. Behind you, there’s the behind-the-scenes action of Alon styling her models before they leave for the runway.
The experience brings fashion directly to you, the consumer. The camerawork itself, with the models walking toward you and only you, makes it feel exclusive, as if the designer wants your approval and admiration alone.
But luxury isn’t where the average person lives. While viewing an exclusive fashion show at home could be an enjoyable experience for anyone, at the end of the day, the models are models and the clothes might be out of reach for the average customer. Okay, money aside, how would you know those styles would sit right on your body?
What if, instead of a luxury fashion house pushing its latest season of designs, this were an average retailer wanting to show off its most popular or highest rated items? Regardless of whether you’re at home or in person at a boutique: there’s a “live” VR fashion show ready for you to watch. The show begins with models and ends with an optional “See how it looks” experience where you get to design your own 3D avatar (think Skyrim style), based either on how your body shape is or that of the person you’re shopping for, and see the clothing you just watched walk the runway hang on this virtual model. Instead of just guessing if something could work, you now have a virtual 3D body for reference, not the typical mannequin at the store.
Why stop there? Fashion shows aren’t the only thing in XR. Adding a seasoned virtual fashion “consultant” to a browsing experience could give a user even greater freedom to explore and allow for more personalization.
Going back to Cher’s closet, there are two main things that could be improved in the UX. The first: using a static photo as a model. If, for example, you’ve gained or lost a few pounds or dyed your hair a new color, the photo you use for every “Dress Me” interaction becomes outdated and might lead you toward (gasp) a fashion faux pas. The second? The lack of complex information exchange with the outfit picker. Other than the big, screaming letters of “Mis-match”, there’s no real explanation of why an outfit could be ill-fitting, and no indication of whether a set has been worn recently (to avoid reusing outfits too often).
I propose a multimodal (AI & AR) solution to “Cher’s Wardrobe”. Moving the interface from a touch-screen computer to a smart blended-reality mirror would give the fashionista the ability to try out their clothes as they are, in that moment. The clothes would be virtual representations of what they already own, and feedback could be communicated through an AR fashion consultant: a consistently available assistant that can understand your requests and control what’s displayed on the mirror.
The visual part of the experience would look something like this:
In the example, a man tries on different color variations of the same Tommy Jeans puffer jacket, switching from color to color by gesturing at a floating jacket icon on a specific part of his phone’s screen. Likewise, in our smart-mirror interaction, users would have the ability to “tell” the assistant, either through gesture or through voice, to skip to the next outfit or change the color of a certain clothing item. The same assistant could also handle utterances like “Show me what I wore last Wednesday” or “What can I wear to a formal dinner party?” and change the on-screen outfit to clothing items that match the description.
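The two utterances above hint at two distinct intents: recalling a logged outfit and recommending one for an occasion. A toy sketch of how the mirror assistant might route them could look like this. The wardrobe “database”, handler logic, and pattern matching are all invented for illustration; a real assistant would use a trained NLU model, not regexes.

```python
import re

# Hypothetical intent routing for the smart-mirror assistant.
# The utterance patterns come from the article; the wardrobe log
# and occasion lookup are invented placeholder data.

WORN_LOG = {"wednesday": "yellow two-piece set"}
OCCASION_LOOKUP = {"formal dinner party": "black slip dress + heels"}

def handle(utterance: str) -> str:
    text = utterance.lower()
    # Intent 1: recall what was worn on a given day.
    if m := re.search(r"what i wore last (\w+)", text):
        outfit = WORN_LOG.get(m.group(1), "nothing logged for that day")
        return f"Displaying: {outfit}"
    # Intent 2: suggest an outfit for an occasion.
    if m := re.search(r"wear to an? (.+?)\?", text):
        outfit = OCCASION_LOOKUP.get(m.group(1), "no match in your closet")
        return f"Suggesting: {outfit}"
    return "Sorry, I didn't catch that."

print(handle("Show me what I wore last Wednesday"))
print(handle("What can I wear to a formal dinner party?"))
```

Either branch would end the same way: the assistant resolves the request to concrete wardrobe items and updates what the mirror displays.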
With this kind of technology at home, it would be easy to bring e-Commerce into the equation. It would be an extremely advantageous situation for retailers and fashion businesses. Not only would users already have an established relationship with their AR fashion consultant and therefore trust in the product, they would also have the ability to try things on themselves directly at home, reducing some of the friction of online shopping. The assistant would come even more into play, handling questions about pricing, material, and availability (e.g. “Are these boots still in stock in a size 7 1/2?”).
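On the commerce side, a stock question like the one above ultimately reduces to an inventory lookup keyed by item and size. A minimal sketch, with entirely invented inventory data and function names:

```python
# Hypothetical sketch of the assistant answering an availability
# question such as "Are these boots still in stock in a size 7 1/2?".
# The inventory table is invented; a real system would query the
# retailer's backend.

INVENTORY = {("chelsea boot", "7.5"): 3, ("chelsea boot", "8"): 0}

def in_stock(item: str, size: str) -> str:
    """Answer an availability question for a given item and size."""
    count = INVENTORY.get((item, size), 0)
    if count > 0:
        return f"Yes, size {size} is in stock ({count} left)."
    return f"Sorry, size {size} is sold out."

print(in_stock("chelsea boot", "7.5"))
```

The interesting work, of course, is upstream: resolving “these boots” to a product ID from whatever the mirror is currently showing.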
There’s so much to explore here. Users could save their styling preferences and be shown clothing relevant to their taste first. Or each retailer could have different visual markers on the AR assistant’s persona to indicate you’ve entered a new “store”. If you think about it, this is only the beginning.
There was a lot of uproar on the internet in the past month about an old demo video, presented to Walmart in 2017, of what a branded virtual reality shopping experience could look like. Described simply, the experience is as if you copied and pasted the IRL Walmart shopping experience into the metaverse.
The reality is the same: you still have a “physical” shopping cart, and you’re still making the same rounds through the store, even having to pull items off the shelves. The only real difference is that you’re followed by a persistent Walmart customer-service digital attendant that speaks to you, unprompted, in the hopes of being helpful (e.g. “Looks like you already have a full gallon of milk at home. Should we put this one back?”). I don’t know about y’all, but I only want to know whether I already have a full jug of milk at home if I’ve asked first whether I have a full jug of milk at home. I could be making a special one-time recipe that calls for buying more milk; does the assistant know that?
Unfortunately, this is where a lot of brands get it wrong about mixed or virtual realities. The aim should not be to fully mimic an existing experience; the goal should be to make an existing experience better by translating it into the digital world, or if not better, at least different. That way, people could decide on their own how they’d like to have their coffee that morning (so to speak).
While it’s nice to see the work put into building these beautifully detailed virtual stores (I see you H&M, adidas 👀), they may not be entirely necessary, as each reality has its own pros and cons. For example, people with high social anxiety or who are just not “feeling it” that day may want to shop at their own personal store in the metaverse. But if people want an excuse to get out of the house and be pampered at an in-person boutique, they should be encouraged to do that as well. People are drawn to fashion in the first place for the concept of choice and for the chance to express their own unique personality through their choices. Don’t take that away from them.