
Research articles

ScienceAsia (): 423-435 | doi: 10.2306/scienceasia1513-1874...423


Grid-line watermarking: A novel method for creating a high-performance text-image watermark


Wiyada Yawai*, Nualsawat Hiransakolwong

 
ABSTRACT:     This study aims to develop an effective method for watermarking text images in any language. The ultimate goals of this work are to generate invisible, more robust watermarks, to increase the hiding capacity, and to detect any change to the original text; these are, in general, the limitations of most text-image watermarking methods. Using a grid of horizontal and vertical reference lines is an effective way to overcome them. The grid lines run across the skeleton lines of the text-image characters, so that watermarks can be finely marked and detected at the line-character intersection points, with each intersection corresponding to one specific zero-watermark pixel. Because each intersection point carries one hiding bit of the watermark, plotting more horizontal and vertical reference lines in the grid pattern yields more intersection points. This increases the hiding-bit capacity for watermark embedding, overcoming a shortcoming shared by many text-image watermarking methods. In addition, if the positions of all intersection points are collected, these points serve as identity reference points for the original characters, allowing their integrity to be verified and any modification to be detected.
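To illustrate the core idea, the sketch below overlays a hypothetical reference grid on a binary character-skeleton image and collects the skeleton pixels that fall on grid lines as candidate embedding points. This is only a minimal illustration of the intersection-collection step, not the authors' actual embedding algorithm; the function name, the toy skeleton, and the grid spacing `step` are assumptions for demonstration.

```python
import numpy as np

def grid_intersections(skeleton: np.ndarray, step: int):
    """Return (row, col) points where grid lines cross skeleton pixels.

    skeleton : binary 2-D array (1 = character-skeleton pixel).
    step     : spacing of the horizontal/vertical reference grid lines.
    Each returned point could host one hiding bit, so a finer grid
    (smaller step) yields more candidate embedding positions.
    """
    rows, cols = np.nonzero(skeleton)
    # A skeleton pixel lies on the grid if it sits on a horizontal
    # line (row divisible by step) or a vertical line (col divisible).
    on_grid = (rows % step == 0) | (cols % step == 0)
    return list(zip(rows[on_grid].tolist(), cols[on_grid].tolist()))

# Toy skeleton: one vertical stroke (column 4) crossing one
# horizontal stroke (row 2), standing in for a character skeleton.
img = np.zeros((8, 8), dtype=np.uint8)
img[:, 4] = 1
img[2, :] = 1

coarse = grid_intersections(img, step=3)  # fewer intersection points
fine = grid_intersections(img, step=1)    # every skeleton pixel qualifies
```

Comparing `coarse` and `fine` reflects the capacity claim in the abstract: densifying the reference grid strictly increases the number of line-skeleton intersections available for hiding bits.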



Department of Computer Science, Faculty of Science, King Mongkut's Institute of Technology Ladkrabang, Chalongkrung Road, Ladkrabang, Bangkok 10520, Thailand

* Corresponding author, E-mail: wyd.yawai@gmail.com

Received 4 Jun 2012, Accepted 18 Jan 2013