The StyleCLIP neural network sets picture style based on a text description

9 April 2021

StyleCLIP combines the CLIP and StyleGAN models to manipulate image style with text prompts. The open-source code is available, including Google Colab notebooks. Why is it needed? StyleGAN…

NVIDIA Research Proposed New Style-Based Generator Architecture for GANs

18 December 2018
stylegan

Recently, we wrote about a research paper that studies and tries to understand the internal representations of GANs. Despite such efforts, the understanding of various aspects of the image…

A Style-Aware Content Loss for Real-time HD Style Transfer

14 August 2018

A picture may be worth a thousand words, but at the very least it contains a great deal of diverse information. This comprises not only what is portrayed, e.g., a composition of…

Real-time Video Style Transfer: Fast, Accurate and Temporally Consistent

25 July 2018
video style transfer

Developers all over the world deploy convolutional neural networks to recompose images in the style of other pictures, or, simply put, for image style transfer. After existing methods achieved high enough processing…

Twin-GAN: Cross-Domain Translation of Human Portraits

25 June 2018
Neural style transfer

It comes as no surprise that many discoveries and inventions in any domain grow out of researchers’ personal interests. This new approach to the translation of human portraits is also…