Article

Additional Information Delivery to Image Content via Improved Unseen–Visible Watermarking

1
Facultad de Ingenieria, Universidad Nacional Autónoma de Mexico (UNAM), Av. Universidad No. 3000, Ciudad Universitaria, Coyoacán, Mexico City 04510, Mexico
2
Instituto Politecnico Nacional (IPN), Escuela Superior de Ingenieria Mecanica y Electrica, Unidad Culhuacan, Av. Santa Ana No. 1000, San Francisco Culhuacan, Coyoacán, Mexico City 04430, Mexico
3
College of Engineering, San Jose State University, San Jose, CA 95192, USA
*
Authors to whom correspondence should be addressed.
Electronics 2021, 10(18), 2186; https://doi.org/10.3390/electronics10182186
Submission received: 13 August 2021 / Revised: 30 August 2021 / Accepted: 1 September 2021 / Published: 7 September 2021
(This article belongs to the Special Issue Theory and Applications in Digital Signal Processing)

Abstract

In practical scenarios, watermarks are used to provide auxiliary information; following this idea, an analogous digital approach called unseen–visible watermarking has been introduced to deliver auxiliary information. In this algorithm, the embedding stage takes advantage of both visible and invisible watermarking to embed an owner logotype or barcode as the watermark; in the exhibition stage, the functions built into display devices are used to reveal the watermark to the naked eye, eliminating the need for a dedicated watermark exhibition algorithm. In this paper, a watermark complement strategy for unseen–visible watermarking is proposed to improve the embedding stage, reducing the histogram distortion and the visual degradation of the watermarked image. The presented algorithm offers the following contributions: first, it can be applied to any class of image with large smooth regions of low or high intensity; second, a watermark complement strategy is introduced to reduce the visual degradation and histogram distortion of the watermarked image; and third, an embedding error measurement is proposed. Evaluation results show that the proposed strategy performs well in comparison with other algorithms, providing high visual quality of the exhibited watermark and preserving its robustness in terms of readability and imperceptibility against geometric and image processing attacks.


1. Introduction

The current pace of technological development may in the future only be compared with the period known as the industrial revolution. This development gives people easy access to electronic devices capable of capturing and displaying different multimedia files, especially images and videos, which can be easily copied, edited, and distributed without any protection, turning these practices into a problem of copyright and intellectual property protection. To address this problem, several digital watermarking algorithms have been proposed in the last decade [1,2,3,4,5]. However, in real-life scenarios, watermarks are also used to deliver auxiliary information, for example in banknotes and official documents, where the watermarked document reveals information about its authenticity when its visualization is enhanced by viewing it against the light. In this context, several watermarking algorithms have been proposed to deliver auxiliary information about the visual content of images [6,7,8,9,10,11], opening a new research field denoted invisible–visible watermarking, which can be divided, according to the exhibition stage, into unseen–visible watermarking (UVW) [6,7,8] and imperceptible–visible watermarking (IVW) [9,11]. The main idea of invisible–visible watermarking algorithms applied to digital color images is the digital reproduction of the real-life watermark scenario: in addition to providing copyright and intellectual property protection, they are capable of delivering information to end-users via an owner logotype, 2D barcode or quick response (QR) code [12]. To make this possible, these algorithms build on the observation that some digital image information that is invisible to the naked eye becomes visible after applying image enhancement operations such as the Gamma Correction Function (GCF) [6,7,8,13], Histogram Modulation (HM) [9] or a Binarization Function (BF) [10,11].
In the IVW algorithms [9,10,11], the image processing capabilities of display and capture devices such as cameras, laptops or mobile phones are used to embed and exhibit a binary watermark pattern such as an owner logotype, 2D barcode or QR code. In the embedding stage, the algorithm takes advantage of the imperceptibility and visibility of invisible and visible watermarking, respectively, to imperceptibly embed a binary watermark pattern into the largest homogeneous region of the host image. Until now, two ways have been proposed in the literature to exhibit the watermark pattern and deliver information to end-users. In the first [9], the watermark pattern is exhibited in color after applying a histogram modulation to the watermarked image; in the second [10,11], a binarization function is used to exhibit the watermark to end-users in black and white. In contrast with the IVW algorithms, the embedding and exhibition stages of UVW algorithms work independently of each other. In a general scenario, the entity that wants to provide additional information about the visual content imperceptibly embeds a logotype, 2D barcode or QR code into the image or video; after transmission, when end-users require extra information about the visual content, they execute an exhibition stage that takes advantage of the functions built into display devices, such as GCF [6,7,8,13], contrast or brightness [14].
In this proposal, additional information delivery to the image content refers to applying the general UVW scenario in daily life; when end-users exhibit the QR watermark, the decoded information could be a company website, information about singers, a dress designer, a location or any other kind of information related or unrelated to the image content. Moreover, the applicability can be extended to information security tasks such as medical imaging [15,16], deepfake prevention [17], 3D-video protection [18], face recognition authentication [19] and other related fields [20]. However, to obtain better results in all those applications, several drawbacks of the UVW algorithms must be addressed: first, the algorithms require large image regions of low intensity [6,7,8,14]; second, the embedding strategies increase the histogram distortion and visual degradation of the watermarked image, affecting the legibility of the visual content and the watermark imperceptibility. To improve UVW algorithms in terms of imperceptibility and visibility of the watermark pattern, this paper proposes a strategy distinguished by three main goals:
  • First, the strategy to select the embedding region considers large smooth regions of low or high intensity, extending the applicability of the proposed algorithms to any class of images;
  • Second, the watermark complement strategy is capable of reducing the histogram distortion and the visual degradation of the watermarked image;
  • Third, a methodology to quantify the embedding error induced by the watermark embedding strategy is introduced.
To evaluate the performance of the proposed algorithm, the experimental results are compared with previously reported algorithms [6,7,8], showing that the proposal can be implemented efficiently in images with different characteristics and that the histogram distortion and visual degradation of the watermarked image are reduced.
The rest of this paper is organized as follows. A brief description of the previously reported works and the motivation is given in Section 2. Then, in Section 3, the materials and methods of the proposed watermarking algorithm are detailed. In Section 4, the experimental results, including a performance comparison with the previously reported algorithms [6,7,8], are shown. Finally, Section 5 concludes this work.

2. Literature Survey

In this section, works related to UVW algorithms [6,7,8] and the motivation of this work are described briefly. Throughout this work, the binary watermark pattern is defined as W = {W(i, j) ∈ {0, 1} | i = 1, …, w_w, j = 1, …, h_w}, where w_w and h_w are its width and height, respectively, and the host image is defined as I_h = {I_h^k(x, y) | 1 ≤ x ≤ M, 1 ≤ y ≤ N}, where M and N are the width and height of the host image, respectively, and 0 ≤ I_h^k(x, y) ≤ 255 is the pixel at position (x, y) of the color channel k ∈ {R, G, B}.

2.1. Unseen–Visible Watermarking

In [6], Huang et al., inspired by the real-life watermark scenario, propose a pioneering UVW algorithm based on [21,22] and on the observation that several pieces of image information that are invisible to the naked eye become visible after applying an image enhancement function built into display devices, such as the Gamma Correction Function [13], as shown in Figure 1.
In this algorithm, the best intensity value i* is obtained as the intensity at which the gradient of the GCF is maximum, the GCF being considered the watermark exhibition strategy. Once the best intensity value i* is obtained, an adequate embedding region is selected from the host image according to the following equation:
(x^{*}, y^{*}) = \arg\max_{(x_0, y_0)} \sum_{x=x_0}^{x_0+w_w-1} \sum_{y=y_0}^{y_0+h_w-1} \left| I_h(x, y) - i^{*} \right|,   (1)
where (x*, y*) is the upper-left corner position of the adequate embedding region R, selected from the host image as the dark plain region, and (x_0, y_0) is a candidate region position in I_h. To prevent quality degradation of the exhibited watermark, a denoising operation is applied iteratively to the adequate embedding region before it is watermarked. The denoising operation is controlled in each iteration n by a threshold δ_n ∈ [0,1], which represents the percentage of pixels of the embedding region with a value similar to the intensity i*.
Finally, once the denoised host image I_h′ is obtained, the watermark W is embedded by slightly adjusting the pixel intensity values, where, according to [6], the degree of adjustment has the empirical value Δ = 3. The watermarked image is then obtained as follows:
I_w(x, y) = \begin{cases} I_h'(x, y) + \Delta, & \text{if } W(x - x^{*} + 1,\, y - y^{*} + 1) = 1, \\ I_h'(x, y), & \text{otherwise}. \end{cases}   (2)
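To make the embedding rule of (2) concrete, a minimal sketch in Python is given below; it assumes the denoised host luminance and the binary watermark are NumPy arrays, uses the empirical strength Δ = 3 reported in [6], and clips the result to the 8-bit range. The function name and array conventions are illustrative, not the authors' implementation.

```python
import numpy as np

def embed_uvw(host_denoised, w, x_star, y_star, delta=3):
    """Apply Equation (2): add delta where W = 1, leave all other pixels unchanged."""
    iw = host_denoised.astype(np.float64).copy()
    hw, ww = w.shape                         # watermark height and width
    region = iw[x_star:x_star + hw, y_star:y_star + ww]
    iw[x_star:x_star + hw, y_star:y_star + ww] = np.clip(region + delta * w, 0, 255)
    return iw.astype(np.uint8)
```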

2.2. Improved Unseen–Visible Watermarking

In [7], Juarez et al. propose an improved version of [6], where two methodologies are adopted to improve the reported UVW algorithm. To extend the applicability to images without any dark plain region, the Total-Variation L1-Norm [23] is introduced, which decomposes the input image luminance into texture and cartoon components, so the adequate embedding region can be selected from the cartoon image, which contains several dark plain regions. To embed the binary watermark, the same methodology is used, with the difference that the embedding strength Δ is calculated according to the human visual system (HVS) through a methodology called Just Noticeable Distortion (JND) [24,25,26,27,28]. In the exhibition stage, a shifted GCF is used to reveal the watermark to the naked eye.

2.3. Camouflaged Unseen–Visible Watermarking

In [8], a camouflaged unseen–visible watermarking algorithm was proposed to deliver information while claiming copyright protection and ownership authentication of color images under two exhibition strategies: a composite strategy of logarithmic and negative transformations, and the image enhancement functions built into mobile display devices, as shown in Figure 2. In this algorithm, an adequate embedding region R with minimum mean value is selected from the color image luminance according to (3) and (4).
(\hat{x}, \hat{y}) = \arg\min_{(x, y)} \left( \frac{1}{w_w \times h_w} \sum_{i, j \in \Omega} Y(i, j) \right),   (3)
\Omega = \left\{ i \in \left[ x - \frac{w_w}{2},\, x + \frac{w_w}{2} \right],\; j \in \left[ y - \frac{h_w}{2},\, y + \frac{h_w}{2} \right] \right\}, \quad x = 1, \ldots, M;\; y = 1, \ldots, N,   (4)
where (x, y) denotes a candidate position in the image luminance Y and (x̂, ŷ) is the central pixel of R. Then, an image texture classification process in the DCT domain [29] is used to classify each non-overlapping 8 × 8 block of R, where dark and weak-texture blocks correspond to class 1, semi-dark and strong-texture blocks correspond to class 3, and the remaining blocks correspond to class 2.
Once R is obtained and its blocks are classified, a binary watermark of the same size as R is embedded adaptively according to the block classification. If the pixel b_ς of the ς-th block corresponds to class 1, the embedding process follows (5); otherwise, if it corresponds to class 2 or 3, the embedding process is given by (6).
b_{\varsigma}^{*}(x, y) = \begin{cases} b_{\varsigma}(x, y) + \beta, & \text{if } W(i, j) = 0, \\ b_{\varsigma}(x, y) - \beta, & \text{otherwise}, \end{cases}   (5)
b_{\varsigma}^{*}(x, y) = \begin{cases} b_{\varsigma}(x, y) + 0.5\, b_{\varsigma}(x, y) \cdot \alpha, & \text{if } W(i, j) = 0, \\ b_{\varsigma}(x, y) - 0.5\, b_{\varsigma}(x, y) \cdot \alpha, & \text{otherwise}, \end{cases}   (6)
where b_ς*(x, y) is the watermarked pixel, and β and α are the strength factors corresponding to class 1 and to classes 2 or 3, respectively. After the embedding process, the watermarked image I_w is reconstructed. Finally, to exhibit the watermark to the HVS, two strategies are adopted: in the first, a logarithmic transformation given by c log(1 + I_w) is applied to the watermarked image, where c is a constant; subsequently, a negative transformation is applied to the logarithmic representation of the watermarked image. In the second strategy, the watermarked image is shown on a mobile display device, and only by varying the brightness and contrast controls built into the device is the watermark exhibited to the HVS; more details about the exhibition strategies can be found in [8].
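As a rough illustration of the first exhibition strategy of [8], the sketch below applies a logarithmic transformation followed by a negative transformation to a watermarked image. The scaling constant c and the 8-bit normalization are assumptions made for illustration; they are not values taken from [8].

```python
import numpy as np

def log_negative_exhibition(iw, c=255.0 / np.log(256.0)):
    """Composite exhibition: c*log(1 + Iw) followed by a negative (inversion) transform."""
    log_img = c * np.log1p(iw.astype(np.float64))   # c * log(1 + Iw), scaled to [0, 255]
    negative = 255.0 - log_img                      # negative transformation
    return np.clip(negative, 0, 255).astype(np.uint8)
```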

2.4. Motivation

Both UVW and IVW were developed to deliver information to end-users via an owner logotype, 2D barcode or QR code. In the IVW algorithms [9,10], the embedding stage is based on a histogram accumulation strategy that modifies the histogram of the embedding region at the upper and lower boundaries with respect to its mean value [11]; the reported imperceptibility is in the range of 40 dB–60 dB. To exhibit the watermark, the processing capabilities built into capture and display devices are used. The UVW algorithms [6,7,8] embed a binary watermark by a slight adjustment of the embedding region pixels and guarantee a watermark imperceptibility above 40 dB; however, they do not perform well on images without a dark plain region [15,16]. To exhibit the watermark, these algorithms take advantage of the image enhancement functions built into display devices, such as GCF [13], brightness [30], contrast [14] and the angle of vision [21,22], so that a complex exhibition algorithm is not required. According to this analysis, in both cases the embedding stage increases the histogram distortion and visual degradation of the watermarked image, harming watermark imperceptibility. These alterations are a consequence of embedding strategies that do not consider the watermark characteristics. To solve these drawbacks, the watermark features are considered in the proposed algorithm, which can be applied to any image with large smooth regions, adopting a watermark complement strategy to reduce the visual degradation and the histogram distortion, thereby increasing the imperceptibility of the watermark and the watermarked image quality. Additionally, a measurement strategy is introduced to quantify the embedding error.

3. Materials and Methods

In this section, the main requirement and the key contribution of the proposed algorithm are described in detail.

3.1. Just Noticeable Distortion (JND)

The luminance-based JND [24] used in the proposed algorithm is a numerical representation of the human eye's ability to perceive intensity variations in an image, and it has been employed in several applications of video [24], image processing [25], data hiding [26], image compression [27] and image watermarking [28]. According to the proposed methodology, if the embedding region ensures a minimum variance, its mean value can be used to obtain an embedding strength based on HVS perception; this JND is calculated in the spatial domain by (7). As shown in Figure 3, the HVS has a weak perception of intensity variations in smooth regions whose mean value lies in the range (0, 32), whereas a strong perception occurs in the range (33, 208) and a middle perception in the range (209, 255).
JND(\mu_R) = \begin{cases} -\frac{1}{8}\mu_R + 6, & 0 \le \mu_R \le 32, \\ -\frac{1}{32}\mu_R + 3, & 33 \le \mu_R \le 64, \\ \frac{1}{96}\mu_R + \frac{1}{3}, & 65 \le \mu_R \le 255. \end{cases}   (7)
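As a concrete reference, the following minimal sketch evaluates the piecewise JND of (7), with the signs as reconstructed above; the function name is illustrative. For example, jnd(0) returns 6 and jnd(64) returns 1, which agrees with the embedding strength range (1, 6) used in Section 3.2.

```python
def jnd(mu_r):
    """Luminance-based JND of Equation (7) for the mean value mu_R of a smooth region."""
    if mu_r <= 32:
        return -mu_r / 8.0 + 6.0      # weak HVS perception in dark smooth regions
    if mu_r <= 64:
        return -mu_r / 32.0 + 3.0     # strongest HVS perception around mid-dark values
    return mu_r / 96.0 + 1.0 / 3.0    # 65 <= mu_R <= 255
```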

3.2. Watermark Complement Strategy (WCS)

Inspired by [31,32] and how several applications can scan inverted QR codes, a watermark complement strategy based on the watermark features is introduced to reduce the histogram distortion and the visual degradation of the UVW algorithms.
In addition to the owner logotype and 2D barcode as watermarks, the UVW algorithms also propose the use of QR codes, which can be obtained in a binary representation [12] composed of '0s' and '1s', such that the absolute watermark length is given by:
|W| = |W_0| + |W_1|,   (8)
\begin{cases} |W_0| = \sum_{i=1}^{w_w} \sum_{j=1}^{h_w} \left( W(i, j) + 1 \right), & \text{if } W(i, j) = 0, \\ |W_1| = \sum_{i=1}^{w_w} \sum_{j=1}^{h_w} W(i, j), & \text{if } W(i, j) = 1, \end{cases}   (9)
where, according to Figure 4a, |W_0| = 7029 and |W_1| = 15,471 are the absolute numbers of '0s' and '1s' of the binary watermark W, respectively, and |W̄_0| = 15,471 and |W̄_1| = 7029 correspond to the watermark complement of Figure 4b. To embed the watermark or its corresponding complement, the adequate embedding region R has an absolute length given by:
|R| = |R_1| + |R_0|,   (10)
\begin{cases} |R_0| = \sum_{x=1}^{w_r} \sum_{y=1}^{h_r} \left( R(x, y) + 1 \right) \;\big|\; R(x, y) : W(i, j) = 0, \\ |R_1| = \sum_{x=1}^{w_r} \sum_{y=1}^{h_r} R(x, y) \;\big|\; R(x, y) : W(i, j) = 1, \end{cases}   (11)
where |R_0| = 7029 and |R_1| = 15,471 are the absolute numbers of pixels of the embedding region that match the '0s' and '1s' of the watermark, respectively. Considering the embedding strategies mentioned in the previously reported UVW algorithms, a general embedding strategy is obtained, formulated by (12).
R_w(x, y) = \begin{cases} R(x, y) + \Delta, & \text{if } W(i, j) = 1, \\ R(x, y), & \text{otherwise}, \end{cases}   (12)
where R_w is the watermarked region and Δ is the embedding strength obtained from the JND in the range (1, 6), where Δ = 1 and Δ = 6 represent the minimum and the maximum histogram distortion, respectively. According to the above, the binary watermark of Figure 4a is embedded into the image lena.bmp by the general embedding strategy with a calculated embedding strength Δ = 1; as a result, Figure 5a,b show a visual degradation with PSNR = 34.1075 dB and a histogram distortion with 15,471 modified pixels, respectively.
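In practice, |W_0| and |W_1| in (8)–(11) are simply the numbers of '0' and '1' entries of the binary watermark, and the number of pixels modified by the general strategy (12) equals |W_1|. A small sketch, with illustrative names, is given below.

```python
import numpy as np

def watermark_lengths(w):
    """Return (|W0|, |W1|) for a binary watermark array W."""
    w0 = int(np.count_nonzero(w == 0))   # |W0|
    w1 = int(np.count_nonzero(w == 1))   # |W1| = number of pixels modified by (12)
    return w0, w1

# For the 150 x 150 QR code of Figure 4a this returns (7029, 15471),
# while its complement 1 - W returns (15471, 7029).
```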
With the aim of reducing the drawbacks shown in Figure 5a,b, the proposed watermark complement strategy is based on the evaluation of the pixels within the embedding region with respect to its mean value. This evaluation determines whether the original watermark W or its corresponding complement W̄ = 1 − W is the suitable watermark W′ to be embedded; the evaluation is given by:
W = { W ¯ ,   i f   | x = 1 w w y = 1 h w Γ ( R 1 ( x , y ) < μ R ) | >   | x = 1 w w y = 1 h w Γ ( R 0 ( x , y ) μ R ) | , W ,   i f   | x = 1 w w y = 1 h w Γ ( R 1 ( x , y ) μ R ) |     | x = 1 w w y = 1 h w Γ ( R 0 ( x , y ) > μ R ) | ,
\Gamma(s) = \begin{cases} 1, & \text{if } s \ne 0, \\ 0, & \text{otherwise}, \end{cases}   (14)
where R_0 and R_1 are the pixels of the embedding region that match the '0s' and '1s' of the watermark, respectively, and Γ(s) is a conditional (indicator) function. Finally, based on the previous evaluation, the binary watermark must be embedded into R according to its mean value µ_R; in this context, the watermark's '0s' are embedded into the pixels of the embedding region in the range (0, µ_R − 1), f: W = 0 → (0, µ_R − 1), and the '1s' into the range (µ_R, 255), f: W = 1 → (µ_R, 255), where f corresponds to the general embedding strategy.
According to the proposed strategy, an embedding error (EE) measurement is obtained; it considers all the pixels in the ranges (0, µ_R − 1) and (µ_R, 255) where the '0s' and '1s' of the watermark were embedded, respectively, as given by (15).
EE = \left( \frac{ \sum_{x=1}^{w_w} \sum_{y=1}^{h_w} \left( \Gamma\!\left( R_1(x, y) < \mu_R \right) + \Gamma\!\left( R_0(x, y) \ge \mu_R \right) \right) }{ w_w \times h_w } \right) \times 100   (15)
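A minimal sketch of the watermark complement selection (13)–(14) and the embedding error (15), as reconstructed above, is given next; the array conventions and function names are assumptions, with W, R and µ_R taken as a binary NumPy array, the embedding region and its mean value, respectively.

```python
import numpy as np

def choose_watermark(w, region):
    """Equation (13): return the complement 1 - W or the original W."""
    mu_r = region.mean()
    r1 = region[w == 1]                  # region pixels matched with the '1s'
    r0 = region[w == 0]                  # region pixels matched with the '0s'
    if np.count_nonzero(r1 < mu_r) > np.count_nonzero(r0 >= mu_r):
        return 1 - w                     # embed the watermark complement
    return w                             # embed the original watermark

def embedding_error(w_prime, region):
    """Equation (15): percentage of watermark pixels embedded outside their target range."""
    mu_r = region.mean()
    r1 = region[w_prime == 1]
    r0 = region[w_prime == 0]
    mismatches = np.count_nonzero(r1 < mu_r) + np.count_nonzero(r0 >= mu_r)
    return 100.0 * mismatches / w_prime.size
```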
To observe the feasibility of the proposed strategy, the watermark complement of Figure 4b is embedded into the image lena.bmp. The resulting Figure 5c,d show a visual degradation with PSNR = 42.23 dB and a histogram distortion with 7029 modified pixels, respectively. Finally, the watermark complement strategy generates an EE = 35.64%, in comparison with the EE = 44.68% obtained with the original watermark. In addition to the previous analysis, the relation |W′_0| >> |W′_1| must be satisfied during the embedding stage to ensure a reduction of the visual degradation and histogram distortion.

3.3. Proposed Watermarking Algorithm

In this section, the unseen–visible watermarking algorithm, in combination with the proposed strategy, is described in two stages: embedding and exhibition [33].

3.3.1. Embedding Stage

1.
The host image
Ih in the RGB color model is the input to the embedding stage, as listed in Algorithm 1 and shown in Figure 6. The major problem in UVW algorithms is the selection of the color model and of the embedding region where the watermark will be embedded. The RGB color model has strongly correlated components; however, an alteration in one or more components is not perceived by the naked eye because the image colors depend on each other. Conversely, the YCbCr color model has several advantages that should be considered, the most important being the low correlation between the luminance and the chrominance components, the concentration of the image information in the luminance Y, and the fact that any alteration in Y is observed in each RGB color channel [34,35]. In this way, the host image is converted to the YCbCr color model [30], and the luminance Y is isolated.
2.
Adequate embedding region
To make this algorithm suitable for any class of image with large smooth regions of low or high intensity, the adequate embedding region R = {R(x, y) | x = 1, 2, …, w_R, y = 1, 2, …, h_R}, where w_R and h_R are its width and height, respectively, is obtained from the luminance Y of the host image Ih by satisfying the minimum variance criterion given by (16); a code sketch of the complete embedding stage is given after Algorithm 1. An empirical analysis indicates that, to obtain better results, the intensity range in which the adequate embedding region is selected should correspond to (0, 50) or (200, 255), i.e., low or high intensity, respectively. In this way, the mean value of the region satisfies 0 ≤ µ_R ≤ 50 for images with low intensity and 200 ≤ µ_R ≤ 255 for images with high intensity.
\min_{R \subset Y} \left( \frac{1}{(w_R \times h_R) - 1} \sum_{x=1}^{w_R} \sum_{y=1}^{h_R} \left| R(x, y) - \mu_R \right| \right)^{2}.   (16)
3.
Embedding Strength
As mentioned in Section 3.1, to obtain an embedding strength in terms of the JND [7,8,10,11,33] and the mean value µ_R that guarantees high watermark imperceptibility to the HVS, the embedding region must ensure a minimum variance. However, when µ_R ≥ 253 and JND(µ_R) = 3, an overflow can be generated by the embedding strategy in the pixels of the watermarked region R_w. To avoid this overflow, the embedding strength is finally obtained by a soft adjustment, as formulated in (17).
\Delta = \begin{cases} JND(\mu_R) - 1, & \text{if } \mu_R \ge 253, \\ JND(\mu_R), & \text{otherwise}. \end{cases}   (17)
4.
Embedding Strategy
As shown in Figure 6, once the embedding region R and the embedding strength Δ are obtained, the watermark is evaluated by (13) to determine the suitable watermark W′, which is embedded by (18).
R_w(x, y) = \begin{cases} R(x, y) + \Delta, & \text{if } W'(i, j) = 1, \\ R(x, y), & \text{otherwise}, \end{cases}   (18)
where R_w corresponds to the watermarked region. Algorithm 1 summarizes and details each step of the embedding stage used to embed a watermark W′ into the adequate embedding region R of the host image Ih.
Algorithm 1. Embedding stage.
Input: Host Image Ih, watermark W.
Step 1: 
The host image Ih is converted from the RGB color model to YCbCr, and the luminance Y is isolated.
Step 2: 
According to (16), the adequate embedding region R is obtained from the luminance Y.
Step 3: 
The watermark is evaluated by (13) to select the suitable watermark W′, ensuring the minimum embedding error given by (15).
Step 4: 
The JND of the mean value µR and the embedding strength are obtained according to (7) and (17), respectively.
Step 5: 
The watermark is embedded into the adequate embedding region R, satisfying (18).
Step 6: 
Once the watermarked region Rw is obtained, the watermarked luminance YM is rebuilt.
Step 7: 
Finally, the watermarked image is obtained by adding the chrominance components to the watermarked luminance YM and converting from the YCbCr color model back to RGB.
Output: Watermarked image Iw.
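The following end-to-end sketch ties Steps 1–7 of Algorithm 1 together for an 8-bit RGB host image and a binary watermark stored as NumPy arrays. The BT.601 color conversion, the exhaustive block search used to evaluate (16) and all names are illustrative assumptions rather than the authors' reference implementation.

```python
import numpy as np

def rgb_to_ycbcr(img):
    """Approximate ITU-R BT.601 RGB -> YCbCr (full range)."""
    r, g, b = (img[..., k].astype(np.float64) for k in range(3))
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128.0)
    g = y - 0.344136 * (cb - 128.0) - 0.714136 * (cr - 128.0)
    b = y + 1.772 * (cb - 128.0)
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255).astype(np.uint8)

def jnd(mu_r):
    """Equation (7), signs as reconstructed in Section 3.1."""
    if mu_r <= 32:
        return -mu_r / 8.0 + 6.0
    if mu_r <= 64:
        return -mu_r / 32.0 + 3.0
    return mu_r / 96.0 + 1.0 / 3.0

def find_embedding_region(y, wr, hr, step=8):
    """Step 2: region of size hr x wr minimizing the dispersion of (16),
    restricted to low (mu <= 50) or high (mu >= 200) mean intensity."""
    best, best_pos = np.inf, (0, 0)
    for i in range(0, y.shape[0] - hr, step):
        for j in range(0, y.shape[1] - wr, step):
            block = y[i:i + hr, j:j + wr]
            mu = block.mean()
            if not (mu <= 50 or mu >= 200):
                continue
            score = (np.abs(block - mu).sum() / (wr * hr - 1)) ** 2
            if score < best:
                best, best_pos = score, (i, j)
    return best_pos

def embed(host_rgb, watermark):
    hr, wr = watermark.shape
    y, cb, cr = rgb_to_ycbcr(host_rgb)                       # Step 1
    i0, j0 = find_embedding_region(y, wr, hr)                # Step 2
    region = y[i0:i0 + hr, j0:j0 + wr]
    mu_r = region.mean()
    # Step 3: watermark complement strategy, Equation (13)
    r1, r0 = region[watermark == 1], region[watermark == 0]
    w_prime = 1 - watermark if np.count_nonzero(r1 < mu_r) > np.count_nonzero(r0 >= mu_r) else watermark
    # Step 4: embedding strength with overflow guard, Equations (7) and (17)
    delta = jnd(mu_r) - 1 if mu_r >= 253 else jnd(mu_r)
    # Steps 5-6: embed by Equation (18) and rebuild the luminance
    y[i0:i0 + hr, j0:j0 + wr] = np.clip(region + delta * w_prime, 0, 255)
    # Step 7: recompose the watermarked RGB image
    return ycbcr_to_rgb(y, cb, cr)
```

The brute-force scan in find_embedding_region is only meant to illustrate criterion (16); in a practical setting it could be replaced by a more efficient search (for example, using integral images) without changing the embedding rule.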

3.3.2. Exhibition Stage

To exhibit the watermark to the HVS, no additional complex computational algorithm is required [33]; the algorithm takes advantage of the image enhancement functions built into display devices, mainly the GCF. In this section, the GCF and other feasible exhibition strategies based on the GCF, contrast, brightness and the angle of vision are described in detail.
1.
Gamma Correction Function (GCF)
The GCF has recently been incorporated into display devices as a function to improve the visual content of images; it is formulated by:
P_{out} = C\, P_{in}^{\gamma},   (19)
where P_in and P_out are the intensity value of the input pixel and its corresponding mapped output pixel, respectively, and γ and C are defined by the system in which the function is implemented.
In this way, if γ = 1, the intensity values of the output pixels are mapped to the same values as the corresponding input pixels, as shown in Figure 7a; when γ < 1, input pixels of low intensity are mapped to high-intensity values, and input pixels of high intensity are saturated, as shown in Figure 7b. Finally, when γ > 1, the input pixels are mapped conversely to the γ < 1 case, as shown in Figure 7c.
In addition to the GCF as an exhibition strategy, another option capable of improving the readability of the exhibited watermark in large smooth regions is the shift operation of the GCF, which can be applied to the watermarked image with respect to the mean value of the adequate embedding region, as shown in Figure 8; a code sketch of these exhibition operations is given at the end of this section.
2.
Other feasible exhibition strategies
As mentioned above, the most common image enhancement function used by display devices to exhibit the watermark to the HVS is the GCF; however, it is not the only image enhancement function integrated into display devices that is capable of exhibiting the watermark. In this context, other feasible exhibition strategies based on pixel intensity variation are the contrast, the brightness and the angle of vision.
The most popular contrast specification adopted by display device manufacturers [14] defines the contrast as the absolute difference between the white peak and the black luminance levels, and the brightness as the way the HVS perceives light intensity. In an image processing context, however, both functions are pixel operations where the output pixel depends on the input pixel. The combination of the contrast and brightness functions, which can be applied to the watermarked image through display devices, is given by:
I_R(x, y) = a\, I_w(x, y) + b,   (20)
where a and b are the corresponding parameters to adjust the contrast and the brightness, respectively.
The algorithms reported in [21,22] are based on LCD technology [36], where the angle of vision is used to show multiple images through display devices. The idea of taking advantage of the angle of vision as another feasible exhibition strategy is based on how display devices modify the contrast and brightness perceived by the HVS when the angle changes in azimuth and elevation, as shown in Figure 9.
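To fix ideas, the sketch below gives simple software counterparts of the exhibition operations discussed in this section: the GCF of (19), a shifted GCF centered on the region mean (one possible reading of the shift operation of Figure 8), and the contrast and brightness combination of (20). Default parameter values follow the figure captions (γ = 0.2, γ = 0.3, a = 2, b = −50); everything else is an illustrative assumption.

```python
import numpy as np

def gcf(img, gamma=0.2, c=1.0):
    """Gamma Correction Function, Equation (19), applied to a normalized 8-bit image."""
    x = img.astype(np.float64) / 255.0
    return np.clip(255.0 * c * np.power(x, gamma), 0, 255).astype(np.uint8)

def shifted_gcf(img, mu_r, gamma=0.3, c=1.0):
    """Shift GCF operation: apply the GCF to the deviation above the region mean mu_R."""
    x = np.clip(img.astype(np.float64) - mu_r, 0, 255) / 255.0
    return np.clip(255.0 * c * np.power(x, gamma), 0, 255).astype(np.uint8)

def contrast_brightness(img, a=2.0, b=-50.0):
    """Contrast and brightness combination, Equation (20): IR = a * Iw + b."""
    return np.clip(a * img.astype(np.float64) + b, 0, 255).astype(np.uint8)
```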

4. Experimental Results and Analysis

In this section, the watermarked image quality is evaluated by the Peak Signal-to-Noise Ratio (PSNR) given by (21), and the performance of the proposed algorithm is evaluated in terms of imperceptibility by the Structural Similarity Index Measure (SSIM) [37] and the Normalized Color Difference (NCD) [38], formulated by (22) and (23), respectively. A PSNR > 37 dB indicates high quality between the original host image and its corresponding watermarked image. Besides imperceptibility, the SSIM and NCD metrics are also used to evaluate the similarity between the host and watermarked images: an SSIM value nearest to '1' corresponds to identical images, whereas the CIELAB-based NCD metric measures the color difference between the host and watermarked images, and an NCD value nearest to '0' corresponds to identical images. The robustness of the proposed algorithm is evaluated against the common geometric and image processing attacks listed in Table 1 and Table 2, respectively. Finally, the performance of the proposed algorithm is contrasted mainly with three previously reported algorithms [6,7,8]. To present a fair comparison, the host images are obtained from the uncompressed color image dataset (UCID) [39], which contains 1338 uncompressed TIFF color images of 512 × 384 size. The watermark patterns are based on two binary images, an owner logotype and a QR code, with sizes 117 × 117 and 150 × 150, respectively.
PSNR\;(\text{dB}) = 10 \log_{10} \frac{ MPV^{2} }{ \frac{1}{N \times M} \sum_{x=1}^{M} \sum_{y=1}^{N} \left( I_h(x, y) - I_W(x, y) \right)^{2} },   (21)
SSIM(I_h, I_W) = \frac{ \left( 2 \mu_{I_h} \mu_{I_W} + C_1 \right) \left( 2 \sigma_{I_h I_W} + C_2 \right) }{ \left( \mu_{I_h}^{2} + \mu_{I_W}^{2} + C_1 \right) \left( \sigma_{I_h}^{2} + \sigma_{I_W}^{2} + C_2 \right) },   (22)
NCD = \frac{ \sum_{x=1}^{h} \sum_{y=1}^{w} \sqrt{ \left( \Delta L(x, y) \right)^{2} + \left( \Delta a(x, y) \right)^{2} + \left( \Delta b(x, y) \right)^{2} } }{ \sum_{x=1}^{h} \sum_{y=1}^{w} \sqrt{ \left( L(x, y) \right)^{2} + \left( a(x, y) \right)^{2} + \left( b(x, y) \right)^{2} } }.   (23)
In the PSNR expression, MPV represents the maximum pixel value of the host image. In the SSIM, µ_X, σ_X² and σ_XY are the mean value, the variance and the covariance of the corresponding I_h and I_W, respectively, and C1 and C2 are two constants defined in [37]. Finally, in the NCD, L represents the luminance, and a and b are the color variations between green–red and blue–yellow, respectively.
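For reference, minimal sketches of the PSNR of (21) and the NCD of (23) are given below; the CIELAB conversion relies on scikit-image (whose skimage.metrics.structural_similarity can likewise be used for the SSIM of (22)). The function names and the use of scikit-image are assumptions for illustration.

```python
import numpy as np
from skimage.color import rgb2lab

def psnr(host, watermarked, mpv=255.0):
    """Peak Signal-to-Noise Ratio, Equation (21)."""
    mse = np.mean((host.astype(np.float64) - watermarked.astype(np.float64)) ** 2)
    return 10.0 * np.log10(mpv ** 2 / mse)

def ncd(host_rgb, watermarked_rgb):
    """Normalized Color Difference, Equation (23), computed in the CIELAB space."""
    lab_h = rgb2lab(host_rgb)
    lab_w = rgb2lab(watermarked_rgb)
    diff = np.sqrt(np.sum((lab_h - lab_w) ** 2, axis=-1)).sum()
    norm = np.sqrt(np.sum(lab_h ** 2, axis=-1)).sum()
    return diff / norm
```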

4.1. General Performance Validation

In addition to the use of 2D barcodes and the owner logotype as watermark patterns, the UVW algorithms also propose the incorporation of the QR code in a binary representation as a novel watermark pattern to deliver auxiliary information. This is based on the observation that the QR code is the most popular mechanism to store and share a large amount of information among end-users and a wide variety of industry sectors. The QR code used in the performance evaluation of the proposed algorithm corresponds to classification L with 7% data restoration. Table 3 shows the average quality and imperceptibility of the proposed algorithm applied to images with large smooth regions of low and high intensity. To appreciate the performance of the proposed algorithm, the results are obtained with and without the proposed watermark complement strategy, which also reduces the histogram distortion and visual degradation and is highlighted by its suitability for any class of images; more than 74% of the images used in this test were watermarked using the proposed watermark complement strategy.
Figure 10, Figure 11, Figure 12 and Figure 13 show a sample set of images marked by a QR code. Under normal viewing conditions, the watermarked images show the same visual quality as their corresponding original images. At the same time, this set of images shows the QR code exhibited by the GCF, the Shift GCF operation, the contrast and brightness combination and the angle of vision, where the watermark is recognized by the HVS in high quality in terms of readability and imperceptibility. Figure 10b shows a watermarked image where a QR code is embedded into an embedding region located on a smooth area of low intensity. As shown in Figure 10c, after enhancing the watermarked image by the GCF, the QR code is exhibited with enough quality to be decoded by any QR code reader application. Figure 11b and Figure 12b show watermarked images where the QR code is embedded into a region of middle intensity. Although the proposed algorithm does not consider middle intensities to embed and exhibit a watermark pattern, it is able to exhibit the watermark to the HVS with high visual quality after enhancing the watermarked image by the proposed exhibition strategies. In Figure 11c and Figure 12c, the watermark is exhibited by the Shift GCF and the combination of contrast and brightness, respectively. In Figure 13a–c, the watermark is exhibited to the HVS by the angle of vision, the GCF and the combination of contrast and brightness, respectively, which are functions integrated into a Samsung display device model UN32J4300AF; moreover, the obtained results could be improved using another display device with better resources.

4.2. Watermark Robustness

To evaluate the robustness of the proposed algorithm against common geometric and image processing attacks listed in Table 1 and Table 2, respectively, two sample images, ucid00327 and ucid00539, obtained from the UCID dataset, were marked by a QR code and the owner logotype.

4.2.1. Robustness to Image Geometric Attacks

Figure 14 shows a sample image set where, to demonstrate that the proposed algorithm is not limited to embedding only one watermark pattern, Figure 14a is used to embed a complement representation of the QR code and the owner logotype. Figure 14b shows the resulting double-watermarked image, which, in terms of imperceptibility, maintains a high visual quality to the HVS; additionally, Figure 14c shows both watermarks exhibited to the HVS after being enhanced by the Shift GCF operation.
To evaluate the performance of the proposed algorithm against common geometric attacks, the double-watermarked image Figure 14b is processed by the attacks listed in Table 1; after that, the watermark is exhibited by the proposed image enhancement operation. Figure 15 shows several images with good robustness against the attacks listed in Table 1, including rotation 45°, affine transformation, an aggressive aspect ratio, shearing in x and y directions and barrel transformation. In all cases, the watermarks are exhibited with high visual quality to the HVS; moreover, the quality of the exhibited QR code is sufficient to decode the information by any QR code reader application.

4.2.2. Robustness to Image Processing Attacks

Figure 16 shows a sample image set used to evaluate the performance of the proposed algorithm against image processing attacks. Figure 16a shows the original image, whereas Figure 16b corresponds to the watermarked image, where a QR code is embedded into the smooth region with minimum variance and low intensity. Finally, Figure 16c corresponds to the enhanced representation of Figure 16b after being treated by the Shift GCF exhibition strategy, where the exhibited QR code has enough quality to be read by any application focused on reading QR codes. Figure 17 shows the robustness against several image processing attacks listed in Table 2. To obtain the experimental results, first, the watermarked image of Figure 16b is attacked with the maximum attack factors that still allow the QR code to be read and decoded; second, the attacked representation of the watermarked image is treated by the Shift GCF to exhibit the watermark pattern. The attacks include two of the most popular compression formats, JPEG and JPEG2000. The robustness of the proposed algorithm to different compression factors of the JPEG and JPEG2000 formats is shown in Figure 18a,b, respectively; in both cases, the compression factor and the ability to read and decode the information of the QR code are compared in terms of the data restoration, which corresponds to 7%, 15%, 25% and 30% for the QR code classifications L, M, Q and H, respectively. From the experimental results, two main observations are obtained. First, under JPEG and JPEG2000 compression, the proposed algorithm can still exhibit the QR code with enough visual quality to be read by any application. Second, in the case of using an owner logotype as a watermark, the proposed algorithm can overcome a compression factor QF = 40, as shown in Figure 19a for JPEG and Figure 19b for JPEG2000.

4.2.3. Performance Comparison and Discussion

In Table 4, the performance in terms of visual quality and imperceptibility of the proposed algorithm is compared with the previously reported algorithms [6,7,8] under the same conditions, such as watermark image, hardware capabilities and image dataset. To obtain trustworthy results, a set of 800 randomly selected images from the UCID dataset was split into 465 and 335 images with embedding regions located in low and high intensity, respectively. Due to the incorporation of the watermark complement strategy to reduce the histogram distortion and the visual degradation, the applicability of the proposed algorithm is not limited to images with low-intensity regions as reported in [6,7,8], and it requires neither an extra denoising operation [6,7] nor complex computational algorithms to embed or exhibit the watermark [8,9,10].
The experimental results in terms of embedding error, visual quality and imperceptibility with EE ≤ 35.42%, PSNR ≥ 50.64 dB and SSIM ≥ 0.9890, NCD ≤ 0.0158, respectively, show that the proposed algorithm improves the performance of the previously reported algorithms [6,7,8]; in addition, the results in [6,7] are nearest to the proposed algorithm since their embedding strategy is similar to the proposed one. In contrast, the reported results in [8] have weaker performance than the proposal, as shown in Table 4, because the embedding strategy increases the histogram distortion and visual degradation. Table 5 shows a global comparison of the proposed algorithm against previously reported UVW algorithms [6,7,8]; additionally, for a broader overview of the invisible–visible watermarking algorithms, the most relevant IVW algorithms [9,10] are considered in this table.
Considering Table 4 and several aspects of each algorithm shown in Table 5, the proposal is more suitable to be used in practical applications of additional information delivery, information security tasks [15,16,17,18,19] and other related fields [20], such as is mentioned in Section 1.
Regarding the mechanism used to exhibit the watermark pattern, modern electronic devices, intended to give end-users a comfortable visual experience via high quality and resolution [40,41], are nowadays developed with several image and video processing capabilities [41,42,43,44]; however, exhibiting watermarks aimed at additional information delivery may be conditioned by manufacturers through dedicated functions or applications.
The angle of vision as another strategy to exhibit the watermark pattern can be considered a new research subject based on imperfect systems [45,46], because this strategy can be regarded as a consequence of an imperfect physical process.
In comparison with the IVW algorithms, the proposed algorithm shows results similar to [9], but with minimum computational complexity and a larger number of exhibition strategies integrated into common display devices; in comparison with [10], the proposed algorithm does not require an additional DCT-based algorithm in the embedding and exhibition stages.

5. Conclusions

In this paper, a novel complement unseen–visible watermarking algorithm is proposed to deliver information via color images and display devices. The conclusions of the proposed algorithm are highlighted according to its three main contributions: first, the minimum variance criterion formulated in Equation (16), with the mean value in the range μR ∈ (0, 50) or μR ∈ (200, 255), does not prevent the exhibition of watermarks embedded in regions of middle intensity, as shown in Figure 11 and Figure 12. Second, the watermark complement strategy given by (13) reduces the histogram distortion and the visual degradation of the watermarked image using the embedding strategy proposed in (18), which satisfies the relation |W′0| >> |W′1|. The third contribution is the strategy to quantify the embedding error by using (15), a parameter that ensures minimal distortion during the embedding stage.
The experimental results show that the proposed algorithm improves several aspects of the previously reported algorithms [6,7,8]; with an SSIM value close to '1' and an NCD value close to '0', the watermarked images maintain a high imperceptibility of the watermark pattern, which can be an owner logotype, 2D barcode or QR code. The high visual quality of the watermarked image is obtained with a PSNR ≥ 50.64 dB and a mean value of EE = 35.42% for any class of images. Additionally, the proposed algorithm can exhibit the watermark pattern via several exhibition strategies incorporated within common display devices; in this way, empirical non-reported results show that histogram modulation can be used as another feasible exhibition strategy; however, this function is not incorporated within display devices.
The addition of histogram modulation as another exhibition strategy, in combination with the several aspects listed in Table 5, makes possible a general comparison between the proposed algorithm and the two most common IVW algorithms [9,10]. The proposed algorithm provides high robustness against the common geometric and image processing attacks listed in Table 1 and Table 2, respectively. The most relevant results are obtained for JPEG and JPEG2000 compression, where, after applying the compression to the watermarked image, the QR code used as watermark can be decoded by common reader applications, as shown for different QR classifications in Figure 18. In this way, the features of an owner logotype as a watermark increase the robustness to compression attacks, as shown in Figure 19.
In general, the robustness of the proposed algorithm and the several watermark exhibition strategies incorporated into common display devices make it suitable in practical multimedia applications. In the future, the presented work could be extended to video processing and information security and other applications related to electronic display devices.

Author Contributions

Conceptualization, O.U.J.-S., L.J.R.-R. and F.G.-U.; methodology, F.G.-U., M.C.-H. and R.M.-Z.; software, O.U.J.-S., L.J.R.-R. and J.R.-H.; validation, O.U.J.-S., L.J.R.-R. and F.G.-U.; formal analysis, O.U.J.-S., F.G.-U., M.C.-H. and R.M.-Z.; investigation, O.U.J.-S., F.G.-U., M.C.-H. and R.M.-Z.; resources, L.J.R.-R., F.G.-U. and R.M.-Z.; data curation, O.U.J.-S. and J.R.-H.; writing, original draft preparation, O.U.J.-S., J.R.-H. and L.J.R.-R.; writing, review and editing, O.U.J.-S., J.R.-H. and L.J.R.-R.; visualization, O.U.J.-S., F.G.-U., M.C.-H. and J.R.-H.; supervision, O.U.J.-S., F.G.-U., M.C.-H. and R.M.-Z.; project administration, O.U.J.-S., F.G.-U. and R.M.-Z.; funding acquisition, O.U.J.-S., F.G.-U., M.C.-H. and J.R.-H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Universidad Nacional Autonoma de Mexico (UNAM) under the DGAPA Postdoctoral Scholarship Program, the PAPIIT-IT101119 Research Project and by the Instituto Politecnico Nacional (IPN).

Acknowledgments

The authors thank the Universidad Nacional Autonoma de Mexico (UNAM) under the DGAPA Postdoctoral Scholarship Program, the PAPIIT-IT101119 Research Project and the Instituto Politecnico Nacional (IPN) for the support provided during the realization of this research. The authors of the article appreciate the valuable suggestions of the referees, which contributed to improving the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cox, I.J.; Kilian, J.; Leighton, F.T.; Shamoon, T. Secure spread spectrum watermarking for multimedia. IEEE Trans. Image Process. 1997, 6, 1673–1687.
  2. Barni, M.; Bartolini, F. Watermarking Systems Engineering, 1st ed.; Marcel Dekker: New York, NY, USA, 2004; pp. 23–43.
  3. Hu, Y.; Kwong, S.; Huang, J. An algorithm for removable visible watermarking. IEEE Trans. Circuits Syst. Video Technol. 2006, 16, 129–133.
  4. Rosales-Roldan, L.; Cedillo-Hernandez, M.; Chao, J.; Nakano-Miyatake, M.; Perez-Meana, H. Watermarking-based Color Image Authentication with Detection and Recovery Capability. IEEE Lat. Am. Trans. 2016, 14, 1050–1057.
  5. Muñoz-Ramirez, D.O.; Ponomaryov, V.; Reyes-Reyes, R.; Cruz-Ramos, C.; Sadovnychiy, S. Embedding a Color Watermark into DC coefficients of DCT from Digital Images. IEEE Lat. Am. Trans. 2019, 17, 1326–1334.
  6. Huang, C.; Chuang, S.; Huang, Y.; Wu, J. Unseen Visible Watermarking: A Novel Methodology for Auxiliary Information Delivery via Visual Contents. IEEE Trans. Inf. Forensics Secur. 2009, 4, 193–206.
  7. Juarez-Sandoval, O.; Fragoso-Navarro, E.; Cedillo-Hernandez, M.; Nakano, M.; Perez-Meana, H.; Cedillo-Hernandez, A. Improved unseen-visible watermarking for copyright protection of digital image. In Proceedings of the 5th International Workshop on Biometrics and Forensics (IWBF), Coventry, UK, 4–5 April 2017; pp. 1–5.
  8. Juarez-Sandoval, O.; Cedillo-Hernandez, M.; Nakano, M.; Cedillo-Hernandez, A.; Perez-Meana, H. Digital image ownership authentication via camouflaged unseen-visible watermarking. Springer Multimed. Tools Appl. 2018, 77, 26601–26634.
  9. Lin, P.Y. Imperceptible Visible Watermarking Based on Postcamera Histogram Operation. Elsevier J. Syst. Softw. 2014, 95, 194–208.
  10. Juarez-Sandoval, O.; Fragoso-Navarro, E.; Cedillo-Hernandez, M.; Cedillo-Hernandez, A.; Nakano, M.; Perez-Meana, H. Improved imperceptible visible watermarking algorithm for auxiliary information delivery. IET Biom. 2018, 7, 305–313.
  11. Juarez-Sandoval, O.U.; Garcia-Ugalde, F.; Cedillo-Hernandez, M.; Ramirez-Hernandez, J. Imperceptible visible watermarking with watermark readability improved. In Proceedings of the IEEE International Autumn Meeting on Power, Electronics and Computing (ROPEC), Guerrero, Mexico, 4–6 November 2020; pp. 1–6.
  12. Denso Wave Incorporated. Quick Response Code. Available online: https://www.qrcode.com/ (accessed on 22 January 2021).
  13. Gamma Correction Function. Available online: https://www.mathworks.com/help/images/ref/imadjust.html?s_tid=srchtitle (accessed on 1 July 2021).
  14. ANSI/INFOCOMM 3M-2011, Projected Image System Contrast Ratio. Available online: https://www.avixa.org/standards/projected-image-system-contrast-ratio (accessed on 15 December 2020).
  15. Mendoza-Mata, D.; Cedillo-Hernandez, M.; Garcia-Ugalde, F.; Cedillo-Hernandez, A.; Nakano-Miyatake, M.; Perez-Meana, H. Secured telemedicine of medical imaging based on dual robust watermarking. Int. J. Comput. Graph. Vis. Comput. 2021, 1432–2315.
  16. Nuñez-Ramirez, D.; Cedillo-Hernandez, M.; Nakano-Miyatake, M.; Perez-Meana, H. Efficient Management of Ultrasound Images using Digital Watermarking. IEEE Lat. Am. Trans. 2020, 18, 1398–1406.
  17. Park, S.-W.; Ko, J.-S.; Huh, J.-H.; Kim, J.-C. Review on Generative Adversarial Networks: Focusing on Computer Vision and Its Applications. Electronics 2021, 10, 1216.
  18. Abdelhedi, K.; Chaabane, F.; Amar, C.-B. A SVM-based zero-watermarking technique for 3D videos traitor tracing. In Proceedings of the Springer 20th International Conference, Advance Concepts for Intelligence Vision Systems 2020, Auckland, New Zealand, 10–14 February 2020; pp. 373–383.
  19. Lee, H.; Park, S.-H.; Yoo, J.-H.; Jung, S.-H.; Huh, J.-H. Face Recognition at a Distance for a Stand-Alone Access Control System. Sensors 2020, 20, 785.
  20. Ray, A.; Roy, S. Recent trends in image watermarking techniques for copyright protection: A survey. Int. J. Multimed. Inf. Retr. 2020, 9, 249–270.
  21. Wright, S.L.; Greier, P.F. Low-cost method to improve viewing angle characteristics of twisted-nematic mode liquid-crystal displays. SID Symposium Digest Tech. Papers 2002, 33, 717–719.
  22. Wu, C.W.; Thompson, G.; Wright, S.L. Multiple images viewable on twisted-nematic mode liquid-crystal displays. IEEE Signal Process. Lett. 2003, 10, 225–227.
  23. Yin, W.; Goldfarb, D.; Osher, S. Total Variation Based Image Cartoon-Texture Decomposition; CORC Report TR-2005-01, UCLA CAM Report 05-27; Columbia University: New York, NY, USA, 2005.
  24. Yang, X.K.; Lin, W.S.; Lu, Z.K.; Ong, E.P.; Yao, S.S. Just noticeable distortion model and its applications in video coding. Elsevier Signal Process. Image Commun. 2005, 20, 662–680.
  25. Yu, P.; Shang, Y.; Li, C. A new visible watermarking technique applied to CMOS image sensor. In Proceedings of the SPIE 8917, MIPPR 2013: Multispectral Image Acquisition, Processing, and Analysis, Wuhan, China, 26–27 October 2013; pp. 891719-1–891719-7.
  26. Jung, S.; Ha, L.T.; Ko, S. A New Histogram Modification Based Reversible Data Hiding Algorithm Considering the Human Visual System. IEEE Signal Process. Lett. 2011, 18, 95–98.
  27. Liu, H.; Zhang, Y.; Zhang, H.; Fan, C.; Kwong, S.; Jay-Kuo, C.-C.; Fan, X. Deep Learning-Based Picture Wise Just Noticeable Distortion Prediction Model for Image Compression. IEEE Trans. Image Process. 2020, 29, 641–656.
  28. Fragoso-Navarro, E.; Cedillo-Hernández, M.; Nakano-Miyatake, M.; Cedillo-Hernández, A.; Pérez-Meana, H. Visible Watermarking Assessment Metrics Based on Just Noticeable Distortion. IEEE Access 2018, 6, 75767–75788.
  29. Huang, J.; Shi, Y.Q. Adaptive image watermarking scheme based on visual masking. IET Electron. Lett. 1998, 34, 748–750.
  30. Perez-Daniel, K.R.; Garcia-Ugalde, F.; Sanchez, V. Watermarking of HDR Images in the Spatial Domain With HVS-Imperceptibility. IEEE Access 2020, 8, 156801–156817.
  31. Ni, Z.; Shi, Y.Q.; Ansari, N.; Su, W. Reversible data hiding. IEEE Trans. Circuits Syst. Video Technol. 2006, 16, 354–362.
  32. Chung, K.-L.; Huang, Y.-H.; Yan, W.-M.; Teng, W.-C. Distortion reduction for histogram modification-based reversible data hiding. Elsevier Appl. Math. Comput. 2012, 218, 5819–5826.
  33. Singh, L.; Singh, A.K.; Singh, P.K. Secure data hiding techniques: A survey. Springer Multimed. Tools Appl. 2020, 79, 15901–15921.
  34. Wan, W.; Zhou, K.; Zhang, K.; Zhan, Y.; Li, J. JND-Guided Perceptually Color Image Watermarking in Spatial Domain. IEEE Access 2020, 8, 164504–164520.
  35. Zhang, F.; Luo, T.; Jiang, G.; Yu, M.; Xu, H.; Zhou, W. A novel robust color image watermarking method using RGB correlations. Springer Multimed. Tools Appl. 2019, 78, 20133–20155.
  36. Li, B.; Zhang, Y. Design of digital watermark detection system based on handheld devices. In Proceedings of the IEEE International Conference on Computer Science and Electronics Engineering (ICCSEE), Hangzhou, China, 23–25 March 2012; pp. 52–55.
  37. Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612.
  38. Chang, H.; Chen, H.H. Stochastic Color Interpolation for Digital Cameras. IEEE Trans. Circuits Syst. Video Technol. 2007, 17, 964–973.
  39. Schaefer, G.; Stich, M. UCID: An uncompressed color image database. In Proceedings of the SPIE Storage and Retrieval Methods and Applications for Multimedia, San Jose, CA, USA, 18–22 January 2004; Volume 5307, pp. 472–480.
  40. Park, Y.; Kim, J.; Kim, M.; Lee, W.; Lee, S. Programmable multimedia platform based on reconfigurable processor for 8K UHD TV. IEEE Trans. Consum. Electron. 2015, 61, 516–523.
  41. Wang, H.; Zhang, X.; Wang, T.; Li, W.; Chen, Q.; Ren, P.; Wu, X.; Sun, H. A 4K × 2K@60fps Multifunctional Video Display Processor for High Perceptual Image Quality. IEEE Trans. Circuits Syst. I Regul. Pap. 2020, 67, 451–463.
  42. Caviedes, J.E. The Evolution of Video Processing Technology and Its Main Drivers. Proc. IEEE 2012, 100, 872–877.
  43. Xu, C.; Peng, Z.; Hu, X.; Zhang, W.; Chen, L.; An, F. FPGA-Based Low-Visibility Enhancement Accelerator for Video Sequence by Adaptive Histogram Equalization With Dynamic Clip-Threshold. IEEE Trans. Circuits Syst. I Regul. Pap. 2020, 67, 3954–3964.
  44. Li, Z.; Wang, J.; Sylvester, D.; Blaauw, D.; Kim, H.S. A 1920 × 1080 25-Frames/s 2.4-TOPS/W Low-Power 6-D Vision Processor for Unified Optical Flow and Stereo Depth with Semi-Global Matching. IEEE J. Solid-State Circuits 2019, 54, 1048–1058.
  45. Bucolo, M.; Buscarino, A.; Fortuna, L.; Famoso, C. Stochastic resonance in imperfect electromechanical systems. In Proceedings of the IEEE 29th International Symposium on Industrial Electronics (ISIE), Delft, The Netherlands, 17–19 June 2020; pp. 210–214.
  46. Bucolo, M.; Buscarino, A.; Famoso, C.; Fortuna, L.; Gagliano, S. Imperfections in Integrated Devices Allow the Emergence of Unexpected Strange Attractors in Electronic Circuits. IEEE Access 2021, 9, 29573–29583.
Figure 1. General diagram of the UVW.
Figure 1. General diagram of the UVW.
Electronics 10 02186 g001
Figure 2. General diagram of the CUVW.
Figure 2. General diagram of the CUVW.
Electronics 10 02186 g002
Figure 3. JND response.
Figure 3. JND response.
Electronics 10 02186 g003
Figure 4. Binary watermark pattern of 150 × 150 size, |W| = 22,500. (a) Original watermark W; with |W0| = 7029, |W1| = 15,471, (b) corresponding watermark complement W ¯ of (a); with | W 0 ¯ | = 15,471 , | W ¯ 1 | = 7029 .
Figure 4. Binary watermark pattern of 150 × 150 size, |W| = 22,500. (a) Original watermark W; with |W0| = 7029, |W1| = 15,471, (b) corresponding watermark complement W ¯ of (a); with | W 0 ¯ | = 15,471 , | W ¯ 1 | = 7029 .
Electronics 10 02186 g004
Figure 5. Histogram distortion and visual degradation, (a) watermarked image by original watermark, (b) histogram of (a), (c) watermarked image by complement watermark, (d) histogram of (c).
Figure 5. Histogram distortion and visual degradation, (a) watermarked image by original watermark, (b) histogram of (a), (c) watermarked image by complement watermark, (d) histogram of (c).
Electronics 10 02186 g005
Figure 6. General diagram of the embedding stage.
Figure 6. General diagram of the embedding stage.
Electronics 10 02186 g006
Figure 7. Gamma Correction Function Response, (a) GCF response with γ = 1, (b) GCF response with γ < 1, (c) GCF with γ > 1.
Figure 7. Gamma Correction Function Response, (a) GCF response with γ = 1, (b) GCF response with γ < 1, (c) GCF with γ > 1.
Electronics 10 02186 g007
Figure 8. Shift GCF operation: (a) Shift GCF response for γ < 1, (b) Shift GCF response for γ > 1.
Figure 9. Angle of vision.
Figure 10. Sample set of images with the QR code exhibited by the GCF, γ = 0.2: (a) original image, (b) watermarked image with PSNR = 50.16 dB, SSIM = 0.9919, NCD = 0.279, (c) exhibited watermark.
Figure 11. Sample set of images with the QR code exhibited by the Shift GCF operation, γ = 0.3: (a) original image, (b) watermarked image with PSNR = 50.41 dB, SSIM = 0.9964, NCD = 0.0113, (c) exhibited watermark.
Figure 12. Sample set of images with the QR code exhibited by the contrast and brightness combination, a = 2, b = −50: (a) original image, (b) watermarked image with PSNR = 51.63 dB, SSIM = 0.9992, NCD = 0.0096, (c) exhibited watermark.
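The contrast and brightness combination cited in Figure 12 (a = 2, b = −50) is presumably the usual linear point operation; a minimal sketch under that assumption follows.

```python
import numpy as np

def contrast_brightness(img: np.ndarray, a: float = 2.0, b: float = -50.0) -> np.ndarray:
    """Linear point operation out = a * in + b, clipped to the 8-bit range.
    a controls contrast and b controls brightness (a = 2, b = -50 in Figure 12)."""
    out = a * img.astype(np.float64) + b
    return np.clip(out, 0, 255).astype(np.uint8)
```

Applied to the watermarked image, this mapping stretches the small intensity offsets introduced by the embedding until the QR code becomes visible to the naked eye.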
Figure 13. Sample set of images with the QR code exhibited by the angle of vision: (a) negative azimuth adjustment, (b) positive elevation adjustment, (c) positive azimuth adjustment.
Figure 14. Double-watermarked image used to evaluate the robustness of the proposed algorithm against common geometric attacks: (a) original image, (b) watermarked image by QR code and owner logotype, PSNR = 47.33 dB, SSIM = 0.9926, NCD = 0.0072, (c) exhibited watermarks.
Figure 15. Robustness against common geometric attacks, (a) rotation, (b) translation with cropping, (c) affine transformation, (d) rescaling, (e) flip—horizontal transformation, (f) flip—vertical transformation, (g) shearing—x-direction, (h) shearing—y-direction, (i) barrel transformation, (j) aspect ratio.
Figure 16. Test image used to evaluate the robustness of the proposed algorithm against image processing attacks. (a) Original image, (b) watermarked image, PSNR = 46.98 dB, SSIM = 0.9947, NCD = 0.0101, (c) exhibited watermark.
Figure 17. Robustness against image processing attacks, (a) impulsive noise, (b) Gaussian noise, (c) sharpening, (d) blurring, (e) average filter, (f) JPEG compression QF = 75, (g) JPEG compression QF = 50, (h) JPEG2000 CR = 20, (i) JPEG2000 CR = 10.
Figure 18. Ability to read and decode the information of the exhibited QR code under (a) JPEG and (b) JPEG2000 compression.
Figure 19. Robustness of the owner logotype as a watermark against (a) JPEG QF = 40 and (b) JPEG2000 CR = 40.
Table 1. Common geometric attacks used in the experiments.

Attack | Specification
Rotation | Angle of 45°
Translation with cropping | x = 15, y = 50
Affine transformation | (1, 0.1, 0; 0.1, 1, 0; 0, 0, 1)
Aspect ratio | (2, 0, 0; 0, 1.0, 0; 0, 0, 1)
Flip transformation | Horizontal direction; vertical direction
Shearing | x-direction (1, 0, 0; 0.5, 1, 0; 0, 0, 1); y-direction (1, 0.5, 0; 0, 1, 0; 0, 0, 1)
Rescaling | Rescaling to 50%
Barrel | -------
Table 2. Image processing attacks.

Attack | Specification
Noise contamination | Impulsive, density = 0.4; Gaussian, µ = 0, σ² = 0.001
Sharpening | Radius = 2; amount = 1
Blurring | Radius = 3
Average filter | 3 × 3
Compression | JPEG QF = 75; JPEG QF = 50; JPEG2000 CR = 20; JPEG2000 CR = 10
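As an aid to reproducing the first two attacks in Table 2, the following sketch (assuming NumPy and 8-bit images; the exact attack generator used by the authors is not specified here) adds impulsive and Gaussian noise with the listed parameters.

```python
import numpy as np

def gaussian_noise(img: np.ndarray, mean: float = 0.0, var: float = 0.001) -> np.ndarray:
    """Additive Gaussian noise with the Table 2 parameters (mu = 0, sigma^2 = 0.001),
    applied on the image normalized to [0, 1]."""
    x = img.astype(np.float64) / 255.0
    noisy = x + np.random.normal(mean, np.sqrt(var), x.shape)
    return np.clip(noisy * 255.0, 0, 255).astype(np.uint8)

def impulsive_noise(img: np.ndarray, density: float = 0.4) -> np.ndarray:
    """Salt-and-pepper (impulsive) noise with the Table 2 density."""
    out = img.copy()
    mask = np.random.rand(*img.shape[:2])   # one decision per pixel position
    out[mask < density / 2] = 0             # pepper pixels
    out[mask > 1 - density / 2] = 255       # salt pixels
    return out
```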
Table 3. Average quality and imperceptibility of the proposed algorithm. Each region column reports PSNR (dB) / SSIM / NCD.

Watermark Complement Strategy | Watermark | Low-Intensity Embedding Region | High-Intensity Embedding Region
Incorporated | QR code | 51.52 / 0.9975 / 0.0158 | 50.64 / 0.9890 / 0.0109
Incorporated | Logotype | 52.45 / 0.9985 / 0.0105 | 51.58 / 0.9984 / 0.0097
Non-incorporated | QR code | 46.59 / 0.9932 / 0.2010 | 47.67 / 0.9857 / 0.0981
Non-incorporated | Logotype | 47.18 / 0.9950 / 0.1508 | 47.12 / 0.9938 / 0.1548
Table 4. Comparison of embedding error, quality, and imperceptibility. Each cell reports the value for the high-intensity embedding region followed by the value for the low-intensity embedding region (High / Low).

Algorithm | Watermark | EE (%) High / Low | PSNR (dB) High / Low | SSIM High / Low | NCD High / Low
Unseen–Visible Watermarking [6] | QR code | 33.54 / 37.86 | 46.95 / 45.05 | 0.9850 / 0.9810 | 0.0154 / 0.0102
Unseen–Visible Watermarking [6] | Logotype | 32.98 / 36.45 | 49.08 / 49.14 | 0.9945 / 0.9975 | 0.0160 / 0.0198
Improved Unseen–Visible Watermarking [7] | QR code | 34.01 / 39.47 | 48.91 / 45.78 | 0.9881 / 0.9881 | 0.0237 / 0.0192
Improved Unseen–Visible Watermarking [7] | Logotype | 38.46 / 37.15 | 51.83 / 50.30 | 0.9986 / 0.9990 | 0.0285 / 0.0159
Camouflaged Unseen–Visible Watermarking [8] | QR code | 36.12 / 38.80 | 45.19 / 42.94 | 0.9613 / 0.9564 | 0.0178 / 0.0189
Camouflaged Unseen–Visible Watermarking [8] | Logotype | 39.04 / 37.69 | 41.40 / 43.58 | 0.9489 / 0.9452 | 0.0204 / 0.0177
Proposed | QR code | 32.12 / 35.42 | 50.64 / 51.52 | 0.9890 / 0.9975 | 0.0102 / 0.0158
Proposed | Logotype | 32.25 / 35.02 | 51.58 / 52.45 | 0.9984 / 0.9985 | 0.0097 / 0.0105
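For reference, the PSNR figures compared in Tables 3 and 4 follow the standard definition for 8-bit images; a minimal sketch of that formula is given below (SSIM and NCD are not reproduced here).

```python
import numpy as np

def psnr(original: np.ndarray, distorted: np.ndarray) -> float:
    """Peak signal-to-noise ratio (dB) between two 8-bit images of equal size."""
    mse = np.mean((original.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")   # identical images
    return 10.0 * np.log10(255.0 ** 2 / mse)
```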
Table 5. Global comparison between the proposed algorithm and previously reported algorithms.

Parameter | Proposed | UVW [6] | Improved UVW [7] | Camouflaged UVW [8] | IVW [9] | Improved IVW [10]
Watermark | Owner logotype, 2D barcode, and QR code | Owner logotype, 2D barcode, and QR code | Owner logotype, 2D barcode, and QR code | Owner logotype | Owner logotype and QR code | Owner logotype, 2D barcode, and QR code
Universality | Images with large smooth regions of low or high intensity | Images with large smooth regions of low intensity | Images with large smooth regions of low intensity | Images with large smooth regions of low intensity | Images with large smooth regions | Images with large smooth regions
Invisible watermark | Yes | Yes | Yes | Yes | Yes | Yes
Visible watermark | No | No | No | No | No | No
Embedding strength | JND-based | Empirical | JND-based | JND-based | Empirical | JND-based
Visual degradation | Low | High | High | High | High | High
Histogram distortion | Low | High | High | High | High | High
Watermarked image quality | High | High | High | High | High | High
Extra exhibition information | Not required | Gamma/shift value | Shift value | Not required | Mean value | k-th mean value and color channel
Exhibition procedure | Gamma/shift gamma/contrast and brightness combination/angle of vision/histogram modulation | Gamma | Shift gamma | Logarithmic transformation and negative function/image enhancement by a mobile device | Histogram modulation | Binarization function
Quality of the exhibited watermark | High | Medium | Medium | Low | High | High
Exhibited watermark nature | Grayscale/color | Grayscale | Grayscale | Grayscale | Grayscale/color | Binary
Multiple watermarks | Yes | No | No | Yes | Yes | Yes
Provides auxiliary information | Yes | Yes | Yes | Yes | Yes | Yes
Computational complexity | Low | Low | High | High | High | High
Intellectual property protection | Yes | Yes | Yes | Yes | Yes | Yes
Copyright protection | Yes | Yes | Yes | Yes | Yes | Yes
Prevention of unauthorized duplication | Yes | Yes | Yes | Yes | Yes | Yes
Additional DCT algorithm | Not required | Not required | Not required | Required | Not required | Required
Robustness to JPEG | High | Low | Low | Medium | Low | High
Robustness to JPEG2000 | High | Low | Low | Medium | Low | High
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
