Interactive Style Transfer: Towards Styling User-Specified Object

John Jethro Virtusio, Arces Talavera, Daniel Stanley Tan, Kai-Lung Hua, Arnulfo Azcarraga

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Academic › peer-reviewed


Research on style transfer has focused on transferring the style of a reference image to the entirety of another image. Since this domain is beneficial to digital art, it would be preferable for such algorithms to support more flexible style transfer, in which the style is applied only to a specific portion of an image. We propose a framework that selectively applies a given style to an object using only four user-defined points. Our approach combines a style transfer module and an object segmentation module to synthesize the stylized image. As the ultimate goal is to develop an artistic application tool, we also introduce a method that uses specific filters to preserve certain characteristics of an image, such as its high-frequency components. Experiments show that our proposed method produces pleasing results with minimal effort while also handling more complicated tasks, such as applying multiple reference style images to different objects in an image.
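The abstract describes compositing the output of a style-transfer module onto a user-selected object region, with filtering to preserve the image's high-frequency components. A minimal sketch of that compositing step is below; the style-transfer and segmentation models are stubbed out, the four points are reduced to a bounding-box mask, and all function names and the box-blur frequency split are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rect_mask_from_points(shape, points):
    """Build a binary object mask from four user-defined points.
    Here we simply take their bounding box; the paper seeds a
    segmentation module with the points instead (assumption)."""
    h, w = shape[:2]
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    mask = np.zeros((h, w), dtype=np.float32)
    mask[min(ys):max(ys) + 1, min(xs):max(xs) + 1] = 1.0
    return mask

def box_blur(img, k=5):
    """Simple box blur used to split an image into low/high-frequency bands."""
    pad = k // 2
    padded = np.pad(img, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    out = np.zeros_like(img, dtype=np.float32)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def composite(content, stylized, mask, keep_high_freq=True):
    """Apply stylized pixels only inside the mask; optionally re-inject
    the content image's high-frequency residual to preserve detail."""
    m = mask[..., None]
    out = m * stylized + (1.0 - m) * content
    if keep_high_freq:
        high = content - box_blur(content)  # high-frequency residual
        out = out + m * high                # preserve edges/texture in the region
    return np.clip(out, 0.0, 1.0)

# Toy demo: random arrays stand in for the real content and stylized images.
rng = np.random.default_rng(0)
content = rng.random((64, 64, 3)).astype(np.float32)
stylized = rng.random((64, 64, 3)).astype(np.float32)
mask = rect_mask_from_points(content.shape, [(10, 10), (40, 12), (38, 44), (12, 42)])
result = composite(content, stylized, mask)
print(result.shape)  # (64, 64, 3)
```

Pixels outside the mask pass through unchanged, which is what makes the transfer object-specific; applying several (mask, stylized) pairs in sequence would handle the multi-style, multi-object case mentioned in the abstract.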
Original language: English
Title of host publication: 2018 IEEE Visual Communications and Image Processing (VCIP)
ISBN (Electronic): 978-1-5386-4458-4
ISBN (Print): 978-1-5386-4459-1
Publication status: Published - 2 Jul 2018
Externally published: Yes
Event: IEEE Visual Communications and Image Processing - Taichung, Taiwan, Province of China
Duration: 9 Dec 2018 - 12 Dec 2018


Conference: IEEE Visual Communications and Image Processing
Abbreviated title: VCIP 2018
Country/Territory: Taiwan, Province of China


  • Object Segmentation
  • Style Transfer


