It was up to the computer to imagine the three-dimensionality of the garments in the images it generated. For this it made use of the PIFuHD algorithm.

a project by cris mollee in collaboration with her laptop


how do i extract a clear instruction from an implicit dataset without overruling the computer's free interpretation?

I wanted to co-create something with my computer. Something I could touch and wear, something my computer could dream about. As it turned out, this something is a T-shirt.

At the start, I already had a notion of what a T-shirt was - my computer didn't. I took it upon myself to show him. For this, I created two datasets: one of images of me wearing all the T-shirts that I own, and one of images of the textiles these shirts were made of. My computer went away to think and, using StyleGAN2, returned with two new models that show me how he understands the things that I wear.
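For anyone curious what "showing" a dataset to StyleGAN2 involves: the network expects square images at a single power-of-two resolution, so every photo first has to be center-cropped and scaled. Below is a minimal sketch of just that cropping math, in plain Python; the function name and the 1024-pixel cap are my own assumptions for illustration, not part of the project.

```python
def square_crop_box(width, height, max_res=1024):
    """Return (left, top, right, bottom, target_res) for a center crop.

    StyleGAN2 training images must be square with a power-of-two side,
    so we crop to the largest centered square, then pick the largest
    power of two that fits inside it (capped at max_res).
    """
    side = min(width, height)
    left = (width - side) // 2
    top = (height - side) // 2
    # largest power of two that still fits in the cropped square
    res = 1
    while res * 2 <= side:
        res *= 2
    res = min(res, max_res)
    return (left, top, left + side, top + side, res)

# e.g. a 3000x4000 portrait photo of me in a T-shirt
print(square_crop_box(3000, 4000))  # -> (0, 500, 3000, 3500, 1024)
```

The same box would then be passed to whatever image library does the actual cropping before the training run.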

images from latent space - the computer's interpretation of me wearing my clothes

images from latent space - the computer's interpretation of the textiles of my clothes
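"Latent space" here is the space of random vectors that StyleGAN2 maps to images; every point corresponds to one dreamed-up shirt. Two habits of that space can be sketched without the network itself: the truncation trick (pulling a latent toward the average to trade variety for image quality) and linear interpolation between two latents (morphing one shirt into another). A toy illustration with plain Python lists; the 512-dimensional size matches StyleGAN2, but the names and the zero average are stand-ins.

```python
import random

DIM = 512  # StyleGAN2's latent vectors are 512-dimensional

def truncate(w, w_avg, psi):
    """Truncation trick: pull latent w toward the average latent w_avg.

    psi = 1 keeps w unchanged (maximum variety);
    psi = 0 collapses to w_avg (the 'average' shirt).
    """
    return [a + psi * (x - a) for x, a in zip(w, w_avg)]

def lerp(w0, w1, t):
    """Linear interpolation between two latents: t=0 -> w0, t=1 -> w1."""
    return [(1 - t) * x0 + t * x1 for x0, x1 in zip(w0, w1)]

random.seed(0)
w0 = [random.gauss(0, 1) for _ in range(DIM)]  # one dreamed shirt
w1 = [random.gauss(0, 1) for _ in range(DIM)]  # another
w_avg = [0.0] * DIM  # stand-in for the network's learned average

assert truncate(w0, w_avg, 0.0) == w_avg  # fully truncated
assert lerp(w0, w1, 0.0) == w0            # interpolation endpoints
assert lerp(w0, w1, 1.0) == w1
```

Feeding the points along such an interpolation path through the generator is what produces the smooth morphing sequences these latent-space images come from.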

The images the computer dreamed up were great, but not yet something I could actually wear. I wanted something real, in three dimensions! Surely I could have guessed at how these clothes might be turned into sewing patterns, but I didn't want to fill in too much. So why not let the computer dream a little further?

I asked for three dimensions. Using PIFuHD, my computer looked at its designs once again and, sure enough, generated their 3D counterparts. I was now facing 100 little digital puppet interpretations of myself, wearing all sorts of different clothes on all sorts of different bodies.

The 3D models were nicely dressed, yet I was still wearing the clothes we started with. How could we make these garments real? I needed two things: the fabric and the sewing patterns.

The computer had created these beautiful designs; now it was my turn to interpret them by making them real. I went to the tex-space, asked for some advice on how to recreate something like this, and soon enough I had a screen-printed fabric carrying different parts of the computer's design, with some 3D effects created in puff ink to visually match the texture of the images.

I didn't want all of me, just the upper-body garments. To continue, I had to brutally cut off all the parts that didn't look like they were connected to the garment. Next, I fed the edited models to the program Slicer, which simplified the mesh and generated a pattern I could work with. On to the sewing machine!