Fashion at the intersection of AI, Robotics and Spatial Computing
Thought this week
The following article was originally published in the January issue of SPUR magazine in my Tomorrow column. I write these pieces two months in advance, as the Tomorrow column is exclusively in print and needs lead time, something that always encourages me to work on topics that are not hype.
Is this statement prescient? When it comes to large language models, generative AI, robotics, haptics, and mixed realities, we are just scratching the surface. Do not fear: this is not about a robot uprising or machines taking control of our freedoms. It is about a future where we live more intimately with machines and mixed realities. In that sense, I would say it is perhaps more Star Wars than Terminator. Why does this matter? A few reasons:
The Betas, the generation coming after the Alphas (born from 2008 to 2014), are coming. Eventually, they will control the workplace, the market, and culture. They will be a smaller cohort, due to declining birth rates, and the first generation to grow up in an artificially intelligent world: ‘generative intelligence natives’, just as Gen Z were ‘digital natives’. The Betas will embrace a future of AI civil rights, cohabitating with robots, and co-creating with AI. They will type and design using spatial computing and mixed-reality interfaces. In some cases, they will even have brain implants or wear headgear that controls their tasks directly, without typing or even speech, purely through brain activity.
The fashion industry’s new interfaces may go beyond what we can imagine as possible today, judging by what is emerging in spatial computing, AI, mixed reality, brain-to-machine haptics and robotics. The most innovative brands will lead this shift, pushing for a top-down propagation of new design interfaces. Think of the leap from Myspace to Facebook, or from Napster to iTunes.
What I am about to share are the precursors, early iterations of what’s to come, indicative of the technological shift about to impact the way we work, design, and communicate in the fashion industry.
Here are the types of emerging projects we should all be paying attention to if we work in the fashion industry because, as the saying goes: ‘times are changing’.
Mills Fabrica showcased the touring FASHION X AI exhibit in London, a collaboration with the Laboratory for Artificial Intelligence in Design (AiDLab), the Royal College of Art, the Hong Kong Polytechnic University and Create Hong Kong. Three projects on display stood out:
AiDA, by AiDLab, is a first-to-market technology that lets fashion designers work with AI to create original designs in just a few clicks. Designers can choose or refine options to develop fashion collections, bringing agility, efficiency, and flexibility to conventional, labour-intensive design team processes. How could this be used in the future? It points to a future where fashion designers may no longer have to toil over a tech pack. Instead, they will collaborate closely with a machine to co-design, based on a set of prompts and a database used by a large language model. This type of tool will eventually be part of the suite a fashion design team uses.
The Sensory Materials Library AILoupe, by the RCA, is an experimental AI design tool for materials selection in the product design process. Conventional digital materials libraries contain information about technical properties yet lack the sensory, human experience of materials. How could this be used in the future? AILoupe allows designers and material developers to discover and assess textiles by identifying each one with a material data card that captures subjective sensory data, translating the tactile and physical experience of touching the material into digital form.
The Neo Couture Atelier, by the RCA, was created to document and protect fashion artisanship and couture embroidery through AI. How could AI help preserve and protect embroidery, and teach future generations? The research team reflected on AI within the future of the highest form of luxury, the rarest of positions within fashion: Haute Couture. How could this be used in the future? The project is still in development, but it demonstrates the ability, using a digital textile surface, to record an embroiderer’s hand movements and thereby capture the craft and train the next generation of artisans, with more uses envisaged.
At the London Design Festival were three more projects I want to draw your attention to:
EveryHuman is an Algorithmic Perfumery and scent creation platform. Once a user has completed a highly detailed and personal questionnaire, a robot-like machine creates a personalized scent. Their collaboration with Moo helps individuals create their own unique home fragrance. How could this be used in the future? Rather than going to a luxury or department store to buy a perfume, you could co-create a scent with a brand to suit your unique preferences.
As part of the Digital Design Weekend at the London Design Festival and the Computational Arts University Showcase, Data Eco-Domain is an installation concept and methodology created from Haiwen Zhu’s uninhibited thoughts on the relationship between humans and ecology, using robotics and human brain activity as a tool. The installation connects a user’s brain activity directly to robotic movements, without speech or physical movement. How could this be used in the future? You could control a computer or machine through mental visualisation alone, with no speech or movement needed.
Sensing Patterns, by Ninon Ardisson, invited visitors to hear and feel knitwear patterns through sound vibrations via a wearable audio system. The piece investigates new ways of recording artefacts digitally and seeks to underline the relationship between the construction of knitted fabric and code. How could this be used in the future? Your sweater could vibrate and communicate with you.
Finally, when it comes to marketing and fashion campaigns, the future of video-making is changing fast.
YouTube mega-influencers Marques Brownlee and Cleo Abram recently spent time at zerospace to unpack the future of movies, packed with motion sensing, AI and mixed reality. They tested three movie-making tools that are changing how movies and shows are made, and perhaps how online creators make content as well.
Move.ai helps creators bring human motion into digital worlds at scale. Future Deluxe, in a developmental collaboration with Disguise and Move.ai, explored AI-driven, real-time markerless motion capture in Unreal Engine.
To conclude, an interdependent blend of technologies will lead to more complex forms of creation, design and manufacturing, impacting every phase of fashion. The robotics revolution, AI, haptics, and mixed realities create an even more intimate relationship between the biosphere and the technosphere. In a future society, fashion will maintain its tremendous contribution to economic and creative growth, with jobs and interfaces we cannot yet fully imagine but that are being prototyped today.
By Geraldine Wharry