A framework for interactive image color editing

Handle URI:
http://hdl.handle.net/10754/562407
Title:
A framework for interactive image color editing
Authors:
Musialski, Przemyslaw; Cui, Ming; Ye, Jieping; Razdan, Anshuman; Wonka, Peter ( 0000-0003-0627-9746 )
Abstract:
We propose a new method for interactive image color replacement that creates smooth and natural-looking results with minimal user interaction. Our system takes as input a source image and roughly scribbled target color values, and generates high-quality results at interactive rates. To achieve this goal, we introduce an algorithm that preserves pairwise distances of the signatures in the original image while simultaneously mapping the colors to the user-defined target values. We propose efficient sub-sampling to reduce the computational load and adapt semi-supervised locally linear embedding to optimize the constraints in one objective function. We show the application of the algorithm on typical photographs and compare the results to other color replacement methods. © 2012 Springer-Verlag Berlin Heidelberg.
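The abstract's core idea — keep each pixel's local neighborhood relations while pinning scribbled pixels to target colors — matches the standard semi-supervised locally linear embedding formulation. Below is a minimal NumPy sketch of that general technique, not the paper's implementation: the function names, the neighborhood size `k`, and the regularization are illustrative assumptions, and the paper's signature features and sub-sampling are omitted.

```python
import numpy as np

def lle_weights(X, k=4, reg=1e-3):
    # Locally linear reconstruction weights: row i of W expresses X[i]
    # as an affine combination of its k nearest neighbors.
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf
        nbrs = np.argsort(d)[:k]
        Z = X[nbrs] - X[i]                      # neighbors centered on X[i]
        C = Z @ Z.T
        C += (reg * np.trace(C) + 1e-10) * np.eye(k)  # regularize singular C
        w = np.linalg.solve(C, np.ones(k))
        W[i, nbrs] = w / w.sum()                # affine constraint: sum to 1
    return W

def propagate_colors(X, labeled_idx, labeled_vals, k=4):
    # Semi-supervised step: find values at unlabeled points that best
    # preserve the reconstruction weights, with scribbled points held fixed.
    n = X.shape[0]
    W = lle_weights(X, k)
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    unlabeled = np.setdiff1d(np.arange(n), labeled_idx)
    A = M[np.ix_(unlabeled, unlabeled)]
    b = -M[np.ix_(unlabeled, labeled_idx)] @ labeled_vals
    Y = np.zeros((n, labeled_vals.shape[1]))
    Y[labeled_idx] = labeled_vals
    Y[unlabeled] = np.linalg.solve(A, b)        # quadratic minimizer
    return Y
```

On a 1-D toy example (evenly spaced points with the two endpoints scribbled 0 and 1), the propagated values interpolate near-linearly between the constraints, which is the smooth, constraint-respecting behavior the abstract describes.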
KAUST Department:
Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division; Computer Science Program; Visual Computing Center (VCC)
Publisher:
Springer Nature
Journal:
The Visual Computer
Issue Date:
9-Nov-2012
DOI:
10.1007/s00371-012-0761-5
Type:
Article
ISSN:
01782789
Sponsors:
This research was financially supported by Science Foundation Arizona, US Navy, and NSF. We would like to thank Tom Ang (Fig. 14) and Norman Koren (Figs. 1, 5) for permission to use their outstanding photographs.
Appears in Collections:
Articles; Computer Science Program; Visual Computing Center (VCC); Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division

Full metadata record

DC Field	Value	Language
dc.contributor.author	Musialski, Przemyslaw	en
dc.contributor.author	Cui, Ming	en
dc.contributor.author	Ye, Jieping	en
dc.contributor.author	Razdan, Anshuman	en
dc.contributor.author	Wonka, Peter	en
dc.date.accessioned	2015-08-03T10:37:06Z	en
dc.date.available	2015-08-03T10:37:06Z	en
dc.date.issued	2012-11-09	en
dc.identifier.issn	01782789	en
dc.identifier.doi	10.1007/s00371-012-0761-5	en
dc.identifier.uri	http://hdl.handle.net/10754/562407	en
dc.description.abstract	We propose a new method for interactive image color replacement that creates smooth and natural-looking results with minimal user interaction. Our system takes as input a source image and roughly scribbled target color values, and generates high-quality results at interactive rates. To achieve this goal, we introduce an algorithm that preserves pairwise distances of the signatures in the original image while simultaneously mapping the colors to the user-defined target values. We propose efficient sub-sampling to reduce the computational load and adapt semi-supervised locally linear embedding to optimize the constraints in one objective function. We show the application of the algorithm on typical photographs and compare the results to other color replacement methods. © 2012 Springer-Verlag Berlin Heidelberg.	en
dc.description.sponsorship	This research was financially supported by Science Foundation Arizona, US Navy, and NSF. We would like to thank Tom Ang (Fig. 14) and Norman Koren (Figs. 1, 5) for permission to use their outstanding photographs.	en
dc.publisher	Springer Nature	en
dc.subject	Color manipulation	en
dc.subject	Computational photography	en
dc.subject	Image processing	en
dc.subject	Interactive image editing	en
dc.subject	Recoloring	en
dc.title	A framework for interactive image color editing	en
dc.type	Article	en
dc.contributor.department	Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division	en
dc.contributor.department	Computer Science Program	en
dc.contributor.department	Visual Computing Center (VCC)	en
dc.identifier.journal	The Visual Computer	en
dc.contributor.institution	Vienna University of Technology, Vienna, Austria	en
dc.contributor.institution	Arizona State University, Tempe, AZ, United States	en
kaust.author	Wonka, Peter	en
All Items in KAUST are protected by copyright, with all rights reserved, unless otherwise indicated.