Microsoft has been a technology provider for the Fashion Innovation Agency at London College of Fashion since 2015. Together they have consistently leveraged emerging technologies to disrupt the fashion and retail industry. Launched on 6 October 2020, the Digital Human Stylist is their latest project, in collaboration with Reactive Reality, a leading Austria-based provider of virtual try-on and virtual fashion image generation technology. LCF Newsroom content creator Simran Sanghavi recently spoke to Moin Roberts-Islam, technology development manager at the Fashion Innovation Agency, about the inspiration, process, and scope of the project.
Digital Human Stylist | Microsoft, Fashion Innovation Agency, Reactive Reality
The Digital Human Stylist, as the name suggests, is a personalised digital avatar created by the user which then becomes their own in-house personal stylist. The stylist learns and predicts the user's style preferences based on multiple factors, such as the breadth of their wardrobe and their behaviour. The AI draws inferences from each interaction and improves the more the user engages with it. As a result, the Digital Human Stylist can provide meaningful, personalised outfit suggestions that the user can instantly view on the 3D avatar. It is also possible to integrate the user's calendar and the weather to receive more nuanced styling suggestions.
The project started as an attempt to showcase a scanning technology by Reactive Reality. We had worked on mobile scanning with them once before, turning a two-minute video of a person into a 3D model using a technique called photogrammetry, which creates a photo-realistic representation of that person with a fine degree of accuracy. The avatar could then be used for virtual try-ons, animations, or whatever else you want to use it for.
Microsoft liked the technology, so the team brainstormed ways we could use it meaningfully in the fashion industry. We also wanted to integrate some AI into the project in the form of recommendations. We were just starting to execute it when lockdown hit in 2020, so we built the whole thing at home. We took scans of ourselves and of garments from our wardrobes, and the production company sent equipment to our homes to shoot a product video. This was a Covid project: a lot of work was undertaken, and it is a real testament to virtual working.
You could use it in many ways: on a mobile phone, a tablet, or a VR headset. Mobile phones are obviously the most used. You take your scans on the phone by going around the user three times, once at head height, once at chest height and once at waist height, then upload the imagery, get your avatar sent back to you, and interact with it through your phone. You hold the phone up to see the avatar in the room with you, and likewise with a tablet.
You then 3D scan the garments from your wardrobe, much as you scan your body, by putting each garment on a mannequin or wearing it yourself. It only takes a minute or two per garment, so it is quick, and the Reactive Reality software is user-friendly. Then you simply enjoy having a personal in-house stylist.
Research conducted by Fashion Business Research looked at the psychology of using an avatar as a stylist. They surveyed people about their experiences and asked whether they wanted an avatar that looked just like them. Interestingly, many respondents did not, because they did not want to be confronted with their own perceived imperfections. When it comes to styling recommendations, you want to feel slightly glamorous and a little removed from it all. It can be unusual to see yourself recommending things to yourself, because it is you talking to you. I think removing that slight dissonance, for example by having an avatar with the same body type but a different face, helps.
Since this was a proof of concept, we rolled it out with the avatars we had, the exact replicas. But you could easily change those avatars, because all you have to do is keep the body measurements and put a different head on them.
We worked with the Microsoft AI team to first put all the outfit recommendations together. We also wanted to integrate calendars and weather so the tool could make more nuanced style suggestions. For example, if you are going to work on Friday, have two meetings and are then meeting friends for dinner, the tech would look at your calendar and give you contextual recommendations. The project also has a sustainability theme: it is more about using different combinations from within your wardrobe than about buying new garments.
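To make that idea concrete, here is a minimal, hypothetical sketch of how a stylist might rank outfits already in a wardrobe against the day's calendar and the weather. None of this is the project's actual code: the Outfit fields, the formality and warmth scales, the scoring weights and the example data are all illustrative assumptions.

```python
# Hypothetical sketch: score outfits against the day's most formal event,
# the forecast temperature, and how often they have been worn recently.
from dataclasses import dataclass

@dataclass
class Outfit:
    name: str
    formality: int          # assumed scale: 1 = casual ... 5 = formal
    warmth: int              # assumed scale: 1 = light ... 5 = heavy
    wears_this_season: int   # how often it has been worn recently

def score(outfit: Outfit, max_formality: int, temp_c: float) -> float:
    # Prefer outfits close to the formality of the day's most formal event.
    formality_fit = -abs(outfit.formality - max_formality)
    # Rough mapping from temperature to desired warmth.
    wanted_warmth = 5 if temp_c < 5 else 3 if temp_c < 15 else 1
    weather_fit = -abs(outfit.warmth - wanted_warmth)
    # Nudge towards under-used garments, echoing the sustainability theme.
    novelty = -0.5 * outfit.wears_this_season
    return formality_fit + weather_fit + novelty

def suggest(wardrobe: list[Outfit], calendar: list[tuple[str, int]], temp_c: float) -> Outfit:
    max_formality = max(f for _, f in calendar) if calendar else 2
    return max(wardrobe, key=lambda o: score(o, max_formality, temp_c))

if __name__ == "__main__":
    wardrobe = [
        Outfit("blazer and trousers", formality=4, warmth=3, wears_this_season=1),
        Outfit("hoodie and jeans", formality=1, warmth=3, wears_this_season=6),
        Outfit("wool coat over shirt", formality=3, warmth=5, wears_this_season=0),
    ]
    friday = [("team meeting", 3), ("client call", 4), ("dinner with friends", 2)]
    print(suggest(wardrobe, friday, temp_c=7.0).name)
```

Running the script prints the outfit with the best combined fit; in the real system, these preferences would be learned from the user's interactions rather than hard-coded.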
We also integrated natural language processing, which takes the user's speech as input, and the avatar responds through short, conversational video snippets. So, if it suggests something you do not like, you can ask the avatar to show you a different option, which it will understand and do.
The sustainability conversation in fashion has been around for a while and is certainly gathering pace. It is also an especially important part of everything we do at LCF, so we wanted to embrace it in this project as well. Most people have more clothes than they need and forget about them. There is always something sitting at the back of your wardrobe that you never think about, maybe because you are not sure how, when, or where to wear it. A tool like this can make suggestions and help you use the full breadth of your wardrobe. And if, in the process, you find there is something you are simply not wearing, you can get rid of it ethically. All of this increases the longevity of each garment in your wardrobe; if you are wearing garments more often, in different ways and across different seasons, that is a good thing.
It can be extremely helpful to retailers, because this tool sits in your home and learns about you. It already understands your preferences, so retailers do not have to work them out themselves. The biggest advantage to retailers is the data and knowledge of each person that comes with the tool. There is also the factor of trust, because the tool is in your home and the rapport is already built. When you walk into a store to make a new purchase, something slightly outside your comfort zone or something you are spending more on, the interactions and how the salesperson makes you feel are especially important. Through the rapport with your avatar, that in-store experience of building a relationship with the brand can be brought into shoppers' homes.
In terms of shopping, you could have a different stylist for each brand. For example, if you are shopping for sportswear you could have one avatar for that, and if you are shopping for an evening gown you could choose an avatar accordingly. Equally, it could be the same avatar changing its appearance to suit each of those. We wanted to extend the project to personalised recommendations when buying new garments, based on purchase history and more, but for now this is a proof of concept.
Written by LCF Newsroom Content Creator, Simran Sanghavi, MA Strategic Fashion Marketing.