Article

Reconstructing Binary Signals from Local Histograms

Jon Sporring and Sune Darkner

1 Department of Computer Science, University of Copenhagen, 2100 Copenhagen, Denmark
2 Center for Quantifying Images, MAX IV (QIM), 2800 Kgs. Lyngby, Denmark
* Author to whom correspondence should be addressed.
Entropy 2022, 24(3), 433; https://doi.org/10.3390/e24030433
Submission received: 25 December 2021 / Revised: 12 March 2022 / Accepted: 15 March 2022 / Published: 21 March 2022
(This article belongs to the Special Issue Application of Entropy to Computer Vision and Medical Imaging)

Abstract: In this paper, we considered the representation power of local, overlapping histograms for discrete binary signals. We give an algorithm, linear in signal size and factorial in window size, for producing the set of signals that share a given sequence of densely overlapping histograms; we state the number of unique signals for a given set of histograms; and we give bounds on the number of metameric classes, where a metameric class is a set of two or more signals sharing the same set of densely overlapping histograms.

1. Introduction

A histogram is a central tool for analyzing the content of signals while disregarding positional relations. It is useful for tasks such as setting thresholds for detecting extremal events and for designing codes in communication. In [1], the three fundamental scales of histograms for discrete signals (and images) were presented: the intensity resolution or bin-width, the spatial resolution, and the local extent over which a histogram is evaluated. Even when fixing these scale parameters, it is still essential to consider the sampling phase, since in general, we do not know the location of the interesting signal parts, and thus, we must consider all phases, or equivalently, all overlapping histograms and all histograms for different positions of the first left bin-edge.
A natural question when using local histograms for signal and image analysis is: How many signals share a given set of overlapping, local histograms (illustrated in Figure 1)? Out of pure theoretical interest, in this paper, we took a first step in answering this question by considering densely overlapping histograms of binary signals.
As an example, consider the signal $[0;0;1]$. Its global histogram is $(2,1)$, i.e., there are two "0"-values and one "1"-value:
$$[0;0;1] \xrightarrow{\text{histogram}} (2,1). \qquad (1)$$
Given the histogram, the only possible signals are $[0;0;1]$, $[0;1;0]$, and $[1;0;0]$, i.e.,
$$(2,1) \xrightarrow{\text{histogram}^{-1}} \{[0;0;1],\ [0;1;0],\ [1;0;0]\}.$$
This is a much smaller number than the $2^3$ possible signals, but the relation is not bijective. The representation power of the histogram may be quantified as the conditional entropy of signals given their histogram. For binary signals of length three, the histogram of a signal may be summarized by its count of "1"-values, since the number of "0"-values will be three minus this count. For length-three binary signals, there are eight different signals, $[0;0;0], [0;0;1], \dots, [1;1;1]$, which have four different histograms, whose counts of "1"-values are 0, 1, 2, and 3, respectively, and the corresponding numbers of signals counted by their "1"-values are 1, 3, 3, and 1. Given a histogram, the conditional entropy of the corresponding signals is thus $0$, $\log_2 3$, $\log_2 3$, and $0$ bits, respectively, and the expected conditional entropy may thus be found to be approximately 1.2 bits.
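Written out, the averaging behind the 1.2-bit figure is the following short computation, using the signal counts just listed and weighting each histogram by the fraction of the eight signals mapping to it:
$$H(S \mid H_S) = \frac{1}{8} \cdot 0 + \frac{3}{8} \log_2 3 + \frac{3}{8} \log_2 3 + \frac{1}{8} \cdot 0 = \frac{3}{4} \log_2 3 \approx 1.19 \text{ bits}.$$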
In this paper, we did not focus on coding schemes for signals, but on the expressive power of local overlapping histograms. Thus, consider again the signal in (1), but now with a set of local histograms of extent two, in which case the histograms are calculated for the overlapping sub-signals:
$$[0;0;1] \xrightarrow{\text{sliding window}} \begin{cases} [0;0] \xrightarrow{\text{histogram}} (2,0) \\ [0;1] \xrightarrow{\text{histogram}} (1,1). \end{cases}$$
In this case, there is only one signal that has this sequence of overlapping histograms, since, by the first histogram, we know that the first two values are "0", and in combination with the second histogram, we conclude that the last value must be "1"; thus, the signal must be $[0;0;1]$. In contrast, the signals $[0;1;0]$ and $[1;0;1]$ have the same local histograms,
$$[0;1;0] \xrightarrow{\text{sliding window}} \begin{cases} [0;1] \xrightarrow{\text{histogram}} (1,1) \\ [1;0] \xrightarrow{\text{histogram}} (1,1), \end{cases}$$
and:
$$[1;0;1] \xrightarrow{\text{sliding window}} \begin{cases} [1;0] \xrightarrow{\text{histogram}} (1,1) \\ [0;1] \xrightarrow{\text{histogram}} (1,1), \end{cases}$$
in which case the signal↔histograms relation is not bijective. However, the local histograms and the global histogram together uniquely identify the signals. As these examples show, the relation between local histograms and signals is non-trivial, and in this paper, we considered the space of possible overlapping local histograms and the number of signals sharing a given set of local histograms.
In the early 20th Century, much attention was given to the lossless reconstruction of signals, in particular using error-correcting codes, where the original signal is sent together with added information [2]. Such additional information could be related to the histogram of the original signal. Later, the reconstruction of signals and images became a more pressing problem, primarily as a way to compress images without losing essential content, and this resulted in still widely used image representation standards such as MPEG, TIFF, and JPEG.
While signal representation has remained of some concern in the 21st Century, advances in hardware mean that more attention has been given to image representation and, in particular, to qualifying the information content of image features. In [3], the authors introduced the concept of metameric classes for local features. The authors considered scale-space features and investigated the space of images that share these features. They further presented several algorithms for picking a single reconstruction. An extension of this approach was presented in [4], where patches at interest points of an original image were matched with patches from a database by a feature descriptor such as SIFT [5]. The database patches were then warped and stitched to form an approximation of the original image. In [6], the authors presented a reconstruction algorithm based on binarized local Gaussian weighted averages and convex optimization. The theoretical properties of that reconstruction algorithm are still an open research question. In [7], images were reconstructed from a histogram of a densely sampled dictionary of local image descriptors (bag-of-visual-words) as a jigsaw puzzle with overlaps. The authors showed that their method resulted in a quadratic assignment problem and used heuristics to find a good reconstruction. In [8], the authors investigated the reconstruction of images from a simplified SIFT transform. The reconstruction was performed based on the SIFT key points and their discretized local histograms of gradient orientations, and several models were presented for choosing a single reconstruction from the possible candidates. In [9], a convolutional neural network was presented that reconstructs images from a regularly but sparsely sampled set of image descriptors. The network was able to learn image priors and could reconstruct images from both classical features such as SIFT and representations found in AlexNet [10]. This was later extended in [11], where an adversarial network was investigated for reconstruction from local SIFT features.
Our work is closely related to [12], which discussed the relation between FRAME [13] and Julesz's model for human perception of textures [14]. In [13], the authors defined a Julesz ensemble as a set of images that share identical values of basic feature statistics. Although not considered in their works, histogram bin-values can be considered feature statistics, and hence, the metameric classes presented in this paper are Julesz ensembles in the sense of [13]. In [12], the authors considered normalized histograms of images filtered with Gabor kernels [15] in the limit of the spatial sampling domain converging to $\mathbb{Z}^2$. Their perspective may be generalized to local histograms; however, their results only hold in the limit.
This paper is organized as follows: First, we define the problem in Section 2. Section 3 describes an algorithm for finding the signal(s) that share a specific set of local histograms. In Section 4, constraints on possible local histograms and the size of metameric classes are discussed, and finally, in Section 5, we present our conclusions.

2. Metameric Signal Classes

We were interested in the number of signals that have the same set of local histograms. In case there is more than one, we call this a metameric signal class (or just a metameric class) defined by their shared set of local histograms. We define signals and their local histograms as follows: Consider an alphabet $A = \{0, 1\}$, $l = |A| = 2$, and a one-dimensional signal $S \in A^n$, $n > 1$, which we denote $S = [s_0; s_1; \dots; s_{n-1}]$, where $s_i \in A$ is the value of $S$ at position $i$. For a given window size $1 < m \le n$, we considered all local windows $S_j = [s_j; s_{j+1}; \dots; s_{j+m-1}]$, $0 \le j \le n-m$, and their histograms $h_j: A \to \mathbb{Z}_+$,
$$h_j = (h_j(0), h_j(1)),$$
$$h_j(a) = \sum_{i=j}^{j+m-1} \delta(s_i - a),$$
where δ is the Kronecker delta function, defined as:
$$\delta(x) = \begin{cases} 1, & \text{when } x = 0, \\ 0, & \text{otherwise}. \end{cases}$$
All local histograms for the signal $S$ are $H_S = [h_0; h_1; \dots; h_{n-m}]$.
As an example, consider the signal
$$S = [0;1;1;1;0],$$
in which case $n = 5$. For $m = 3$, the windows are:
$$S_0 = [0;1;1],$$
$$S_1 = [1;1;1],$$
$$S_2 = [1;1;0],$$
and the corresponding histograms are:
$$h_0 = (1,2),$$
$$h_1 = (0,3),$$
$$h_2 = (1,2),$$
or equivalently, in short form,
$$H_S = [(1,2); (0,3); (1,2)].$$
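In code, computing $H_S$ is a single sliding-window pass. The following is a minimal F# sketch (our illustration, not the repository code of Section 3; it shares only the representation of a signal as an int list):

// Local histograms h_j = (h_j(0), h_j(1)) over all length-m windows.
let localHistograms (m: int) (s: int list) =
    s
    |> List.windowed m              // all windows S_j
    |> List.map (fun w ->
        let ones = List.sum w       // h_j(1)
        (m - ones, ones))           // (h_j(0), h_j(1))

// The example above: localHistograms 3 [0; 1; 1; 1; 0]
// evaluates to [(1, 2); (0, 3); (1, 2)], i.e., H_S.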
In some cases, two different signals will have the same set of histograms, and we call these signals metameric, i.e., they appear identical w.r.t. their histograms. We say that they belong to the same metameric class given by their common histogram sequence. For example, when $n = 5$ and $m = 2$, the signals
$$S = [0;1;0;1;0],$$
$$S' = [1;0;1;0;1],$$
have the same sequence of $n - m + 1 = 4$ histograms,
$$H_S = H_{S'} = [(1,1); (1,1); (1,1); (1,1)],$$
and thus, $S$ and $S'$ belong to the same metameric class, denoted by $[(1,1); (1,1); (1,1); (1,1)]$.
We were interested in the ability of local histograms to represent signals. Hence, for given signal and window sizes, we sought to calculate $\mu$, the number of signals $S$ that are uniquely identified by $H_S$, and $\kappa$, the number of metameric classes. The values of $\mu$ and $\kappa$ for small values of $n$ are shown in Table 1. These values were counted by considering all $2^n$ different possible signals, an approach that is only possible for small values of $n$. From the table, we observe that no tested combination of signal length and window size is without a metameric class; hence, none of the tested combinations yielded a unique relation between the local histograms and the signal. Further, the number of unique signals $\mu$ appears to grow with $n/m$, and the number of metameric classes $\kappa$ appears to be convex in $m$ for large values of $n$.
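The exhaustive counting just described is straightforward to express in code. The following F# sketch (our illustration; all identifiers are our own) enumerates all $2^n$ signals, groups them by their histogram sequences, each histogram represented by its window's count of "1"-values as in Section 1, and tallies singleton groups ($\mu$) and larger groups ($\kappa$):

// Brute-force computation of mu and kappa by exhaustive enumeration.

/// All 2^n binary signals of length n.
let rec allSignals n =
    if n = 0 then [ [] ]
    else allSignals (n - 1) |> List.collect (fun s -> [ 0 :: s; 1 :: s ])

/// The local histogram sequence for window size m, with each histogram
/// represented by its count of "1"-values.
let histogramSums m (s: int list) =
    s |> List.windowed m |> List.map List.sum

/// (mu, kappa) for signal length n and window size m.
let muKappa n m =
    allSignals n
    |> List.groupBy (histogramSums m)
    |> List.fold
        (fun (mu, kappa) (_, cls) ->
            if List.length cls = 1 then (mu + 1, kappa) else (mu, kappa + 1))
        (0, 0)

// Example: muKappa 5 3 evaluates to (18, 6), matching Table 1.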

3. An Algorithm for Reconstructing the Complete Set of Signals from a Sequence of Histograms

We constructed an algorithm for reconstructing the one or more signals that share a given sequence of histograms. It was constructed using the following facts:
Fact 1
There is a non-empty and finite set of signals of size $m$ that share the same histogram $h$. These can be produced as all the distinct permutations of the signal
$$S = [\underbrace{0; \dots; 0}_{h(0)}; \underbrace{1; \dots; 1}_{h(1)}],$$
and the number of distinct signals is given by the binomial coefficient
$$\binom{h(0) + h(1)}{h(0)} = \frac{(h(0) + h(1))!}{h(0)!\, h(1)!}; \qquad (21)$$
Fact 2
Consider the windows $S_{i-1}$ and $S_i$ and their corresponding histograms $h_{i-1}$ and $h_i$ for $i = 1, \dots, n-m$. If $s_{i-1} = s_{i+m-1}$, the histograms are identical; otherwise, the histograms differ by a count of one at $s_{i-1}$ and at $s_{i+m-1}$, with $s_{i+m-1} = \overline{s_{i-1}}$, where $\overline{\,\cdot\,}$ is the Boolean "not" operator;
Fact 3
From Fact 2, it follows that the histogram of $[s_i; s_{i+1}; \dots; s_{i+m-2}]$ is equal to $h_i$, but with $h_i(s_{i+m-1})$ reduced by one. We call this $h_i'$;
Fact 4
The difference
$$d_i(a) = h_{i-1}(a) - h_i'(a) = \begin{cases} 1, & \text{when } a = s_{i-1}, \\ 0, & \text{otherwise}. \end{cases} \qquad (22)$$
As a consequence, for candidate signals that have histogram $h_i$ but have the wrong value placed at $s_{i+m-1}$, the difference will have both negative and positive values.
Thus, we constructed the following algorithm:
Step 1
Produce a candidate set of all the distinct signals of size $m$ that have the histogram $h_{n-m}$;
Step 2
For $i = n-m, \dots, 1$ and for each candidate in the set:
Step 2.1
Calculate $d_i$;
Step 2.2
If $d_i$ does not have the form of (22), then discard the candidate;
Step 2.3
Else, derive $s_{i-1}$ from $d_i$, and prepend this value to the candidate.
The computational complexity of our algorithm is $O(K_m (n-m))$, where $K_m$ is the maximum value of (21), since initially, all signals with the histogram $h_{n-m}$ must be considered, and this set can only shrink when considering earlier values.
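To make Steps 1–2.3 concrete, here is a compact F# sketch of the same sifting procedure (our illustration, not the repository code referenced below). Each histogram is represented by its count of "1"-values, which determines a binary histogram completely; in this representation, checking the form (22) reduces to checking that the count difference is 0 or 1, and that difference is exactly the value to prepend:

// Fact 1: all distinct binary signals of length m containing `ones` 1s.
let rec signalsWith m ones =
    if m = 0 then (if ones = 0 then [ [] ] else [])
    else
        (if ones <= m - 1 then signalsWith (m - 1) ones |> List.map (fun s -> 0 :: s) else [])
        @ (if ones >= 1 then signalsWith (m - 1) (ones - 1) |> List.map (fun s -> 1 :: s) else [])

/// All signals sharing the histogram sequence [sigma_0; ...; sigma_(n-m)].
let reconstruct m (sigmas: int list) =
    match List.rev sigmas with
    | [] -> []
    | last :: earlier ->
        // Step 1: all candidates matching the rightmost histogram h_(n-m).
        let init = signalsWith m last
        // Step 2: extend the candidates leftward, one histogram at a time.
        let step candidates sigmaPrev =
            candidates
            |> List.choose (fun s ->
                // sigma of the current window with its last value removed (Fact 3).
                let sigma' = s |> List.truncate (m - 1) |> List.sum
                let d = sigmaPrev - sigma'      // the difference of Fact 4
                // By (22), d must be 0 or 1, and it is the value of s_(i-1).
                if d = 0 || d = 1 then Some (d :: s) else None)
        List.fold step init earlier

// Example: reconstruct 3 [1; 2; 2] evaluates to
// [[0; 1; 0; 1; 1]; [0; 0; 1; 1; 0]], the metameric class listed in the
// solutions below (in a different order).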
A working implementation in F# is given at https://github.com/sporring/reconstructionFromHistograms (see also the Data Availability Statement).
The code does not give any output when run, but running it in F# Interactive mode allows the user to inspect the key values after the program has run, which are:
signal: int list = [0; 1; 0; 1; 1],
histogramList: Map<int,int> list = [map [(0, 2); (1, 1)]; map [(0, 1); (1, 2)]; map [(0, 1); (1, 2)]],
solutions: int list list = [[0; 0; 1; 1; 0]; [0; 1; 0; 1; 1]].
The signal is a sequence of binary digits, and the histogram sequence is represented as a sequence of maps, where each map entry is an (intensity, count) pair, i.e., map [(0, 2); (1, 1)] above is equal to the histogram $(2,1)$. Finally, solutions is represented as a sequence of sequences of binary digits. In this case, there is, as we can see, a metameric class of two signals that share a sequence of histograms. We verified that the algorithm is able to correctly reconstruct all the signals considered in Table 1, including all the members of the metameric classes.

4. Theoretical Considerations on μ and κ

In the following, we consider classes of histogram sequences and relate them to the number of metameric classes for a given family of signals and their sizes.
As a preliminary fact, note that for a window size $m$, all the histograms must satisfy
$$h_j(0) + h_j(1) = m, \qquad (23)$$
since every entry of $S_j$ is counted exactly once.

4.1. Constant Sequence of Histograms ($h_0 = h_j$)

There are two different constant signals of length $n$: $[0;0;\dots;0]$ and $[1;1;\dots;1]$. All neighborhoods and histograms of a constant signal are identical, and each histogram has one non-zero element with value $m$. These signals cannot belong to a metameric class, since permuting the positions of the values in $S_0$ does not give a new signal, and they are trivially unique.
In general, for signals with a constant histogram sequence, $h_0 = h_j$, the signal must be periodic: the only difference between the counts of $h_j$ and $h_{j+1}$ is that $h_{j+1}$ includes $s_{j+m}$ but not $s_j$. Hence, if $h_j = h_{j+1}$, then $s_j = s_{j+m}$. For example, $[1;0;1;0;1]$ is a periodic signal for $m = 2$ with histograms $h_j = (1,1)$, $j = 0, \dots, 3$. Any constant sequence of histograms describes a periodic signal, and for non-constant signals ($\forall a: h(a) < m$) and $m > 1$, these histograms describe a metameric class, since some permutations of $S = [S_0 | S_0 | \dots]$ will produce new signals without changing the histograms, due to periodicity. For any $n > m$, there are $2^m - 2$ such periodic binary signals.
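For example, for $m = 2$, there are $2^2 - 2 = 2$ such signals, the two alternating signals starting with "0" and with "1", which together form exactly the single metameric class found for $m = 2$ in Table 1.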

4.2. Global Histogram ($m = n$)

For a particular $h = h_0$, all permutations of the signal belong to the same metameric class. Thus, the number of metameric classes is equal to the number of different histograms with sum $m$, except those for constant signals. This corresponds to picking $m$ values from $A$, where repetition is allowed and order does not matter.
Following the standard derivation of unordered sampling with replacement, we visually rewrite the terms in (23) as a list of "·"s, where each "·" represents a count of one in a given bin. For example, for $m = 3$, we may have the histogram $(2,1)$, which implies that $3 = 2 + 1 = {\cdot\cdot} + {\cdot}$. A different histogram could be $(0,3)$, implying that $3 = 0 + 3 = {} + {\cdot\cdot\cdot}$. Hence, any permutation of three "·"s and one "+" will in this representation give the sum of three, and the number of permutations is equal to the number of ways we can choose $m$ out of $m+1$ positions. Thus, the number of different histograms is given by the binomial coefficient:
$$\binom{m+1}{m} = \binom{m+1}{1} = \frac{(m+1)!}{1!\, m!} = m + 1.$$
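For $m = 3$, for instance, this gives $\binom{4}{3} = 4$ possible histograms: $(3,0)$, $(2,1)$, $(1,2)$, and $(0,3)$.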
Out of these, two histograms stem from the constant signals. The remaining histograms have $\forall a: h(a) < m$, and each of these histograms defines a metameric class, since by (21) there will always be more than one signal with such a histogram. Hence, the number of metameric classes is
$$\kappa(m, m) = m - 1.$$
This equation confirms the values in Table 1 where n = m .

4.3. Smallest Histogram ($m = 2$)

For the case of $m = 2$, we now show that
$$\kappa(n, 2) = 1. \qquad (26)$$
Consider the sequence of histograms $[h_0; h_1; \dots]$. For $h_0$, there are three different histograms corresponding to $2^2$ different signals. These signals fall into two classes: $s_0 = s_1$ is trivially solvable; however, for $s_0 \ne s_1$, the values of $s_0$ and $s_1$ are easily identifiable from the histogram $h_0$, but their positions are not. Write $h_0 = [(s_0, 1); (s_1, 1)]$. Now, consider $h_1$. Again, if $s_1 = s_2$, then their values are trivially solvable, and since $s_1$ is known, $s_0$ can be deduced from $h_0$. Therefore, assume that $s_1 \ne s_2$, and write $h_1 = [(s_1, 1); (s_2, 1)]$. Now, consider $h_2$; as before, if $s_2 = s_3$, then their values are trivially solvable by $h_2$; hence, $s_1$ can be deduced from $h_1$ and $s_2$, and in turn, $s_0$ can be deduced from $s_1$ and $h_0$. The general structure of the problem is illustrated in Figure 2, and by induction, we see that only the constant sequence of histograms $h_j = (1,1)$, $j = 0, \dots, n-m$, is a metameric class; hence, $\kappa(n, 2) = 1$.

4.4. The General Case ($n > m > 2$)

Since the sum of a histogram is $m$ (see (23)) and since the histograms for binary signals only have two bins, we can identify each histogram by
$$\sigma_j = h_j(1) = m - h_j(0) = \sum_{i=j}^{j+m-1} s_i,$$
i.e., by the number of one-values in $S_j$. Thus, in the following, we identify $h_j$ by $\sigma_j$, and we consider consecutive pairs of histograms for signals of varying lengths $n > m$.
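For the earlier example $S = [0;1;1;1;0]$ with $m = 3$, this gives $(\sigma_0, \sigma_1, \sigma_2) = (2, 3, 2)$.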
Firstly, consider the case $n = 3$ and $m = 2$ and all possible combinations of histograms $h_0$ and $h_1$, i.e., $\sigma_0, \sigma_1 \in \{0, 1, 2\}$. The organization of all $2^3$ signals in terms of $\sigma_0$ and $\sigma_1$ is shown in Table 2. We call such tables transition tables, and we say that each table cell contains a set of signal pieces. For $n = 3$ and $m = 2$, the table illustrates that there is one metameric class, shown in the cell $\sigma_0 = \sigma_1 = 1$, since this table cell contains two elements. This case is also discussed in relation to (26).
Now, consider the case $n = 4$ and $m = 2$. The transition table for $(\sigma_1, \sigma_2)$ is identical to Table 2. Further, an element in $(\sigma_0, \sigma_1)$ is related to an element in $(\sigma_1, \sigma_2)$ by the arrows in the table. For example, if $[s_0; s_1; s_2] = [0;0;1]$, then $\sigma_0 = 0$ and $\sigma_1 = 1$. If $s_3 = 0$, then $[s_1; s_2; s_3] = [0;1;0]$, $\sigma_1 = 1$, and $\sigma_2 = 1$, while if $s_3 = 1$, then $[s_1; s_2; s_3] = [0;1;1]$, $\sigma_1 = 1$, and $\sigma_2 = 2$.
Transition tables contain zero or more elements $[s_j; s_{j+1}; \dots; s_{j+m}]$ that have histograms $\sigma_j$ and $\sigma_{j+1}$. The tables have a particular structure:
Fact 5
$(\sigma_j, \sigma_{j+1})$ is a tridiagonal table: since $\sigma_j$ and $\sigma_{j+1}$ only differ by the values $s_j$ and $s_{j+m}$, the difference between $\sigma_j$ and $\sigma_{j+1}$ can be at most one. Hence, the table has a tridiagonal structure;
Fact 6
Elements on the main diagonal have $s_j = s_{j+m}$: on the diagonal, $\sigma_{j+1} = \sigma_j$; hence,
$$\sigma_j = \sum_{i=0}^{m-1} s_{j+i} = s_j + \sum_{i=1}^{m-1} s_{j+i},$$
$$\sigma_{j+1} = \sum_{i=0}^{m-1} s_{j+1+i} = s_{j+m} + \sum_{i=0}^{m-2} s_{j+1+i} = s_{j+m} + \sum_{i=1}^{m-1} s_{j+i} = \sigma_j.$$
Thus, $s_j = s_{j+m}$;
Fact 7
Elements on the first diagonal above have $s_j = 0 \wedge s_{j+m} = 1$: on the first diagonal above, $\sigma_{j+1} = \sigma_j + 1$, and thus,
$$\sigma_j = s_j + \sum_{i=1}^{m-1} s_{j+i},$$
$$\sigma_{j+1} = s_{j+m} + \sum_{i=1}^{m-1} s_{j+i} = \sigma_j + 1.$$
Thus, $s_j = s_{j+m} - 1 \Rightarrow s_j = 0 \wedge s_{j+m} = 1$;
Fact 8
Elements on the first diagonal below have $s_j = 1 \wedge s_{j+m} = 0$: on the first diagonal below, $\sigma_{j+1} = \sigma_j - 1$, and thus,
$$\sigma_j = s_j + \sum_{i=1}^{m-1} s_{j+i},$$
$$\sigma_{j+1} = s_{j+m} + \sum_{i=1}^{m-1} s_{j+i} = \sigma_j - 1.$$
Thus, $s_j = s_{j+m} + 1 \Rightarrow s_j = 1 \wedge s_{j+m} = 0$.
For counting the number of elements in the table, let $\gamma(\sigma_i, \sigma_j)$ be the number of elements in cell $(\sigma_i, \sigma_j)$:
Fact 9
For $(\sigma_j, \sigma_{j+1}) \in \{(0,0), (1,0), (0,1), (m,m), (m-1,m), (m,m-1)\}$,
$$\gamma(\sigma_j, \sigma_{j+1}) = 1:$$
In all six cases, the histograms are from signals where either or both $S_j$ and $S_{j+1}$ are constant, and hence, we can trivially reconstruct the corresponding $m+1$ values from the histograms. We call such a histogram pair a two-trivial pair;
Fact 10
On the main diagonal, except $\sigma_j = \sigma_{j+1} = 0$ and $\sigma_j = \sigma_{j+1} = m$,
$$\gamma(\sigma_j, \sigma_j) = \binom{m-1}{\sigma_j} + \binom{m-1}{\sigma_j - 1}:$$
By Fact 6, $s_j = s_{j+m}$. For $s_j = 0$, the possible values of $s_{j+k}$, $1 \le k \le m-1$, are signals summing to $\sigma_j$, of which there are $\binom{m-1}{\sigma_j}$, and for $s_j = 1$, there are $\binom{m-1}{\sigma_j - 1}$. Since $0 < \sigma_j < m$, it follows that $\gamma(\sigma_j, \sigma_j) \ge 2$;
Fact 11
On the first diagonal above,
$$\gamma(\sigma_j, \sigma_{j+1}) = \binom{m-1}{\sigma_j}:$$
By Fact 7, $s_j = 0 \wedge s_{j+m} = 1$. Hence, the possible values of $s_{j+i}$, $1 \le i \le m-1$, are signals summing to $\sigma_j$. Further, since $\sigma_{j+1} = \sigma_j + 1$, the cells with $\sigma_j = 0$ or $\sigma_j = m-1$ are the two-trivial cells with $\gamma(0, 1) = \gamma(m-1, m) = 1$, and $\gamma(\sigma_j, \sigma_{j+1}) \ge 2$ for all other cases;
Fact 12
On the first diagonal below,
$$\gamma(\sigma_j, \sigma_j - 1) = \binom{m-1}{\sigma_j - 1}:$$
By Fact 8, $s_j = 1 \wedge s_{j+m} = 0$. Hence, the possible values of $s_{j+i}$, $1 \le i \le m-1$, are signals summing to $\sigma_j - 1 = \sigma_{j+1}$. Further, since $\sigma_{j+1} = \sigma_j - 1$, the cells with $\sigma_j = 1$ or $\sigma_j = m$ are the two-trivial cells with $\gamma(1, 0) = \gamma(m, m-1) = 1$, and $\gamma(\sigma_j, \sigma_j - 1) \ge 2$ in all other cases.
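Facts 9–12 combine into a single cell-count function. The following F# sketch (our addition, under the binomial formulas above) evaluates $\gamma$ for any cell; the two-trivial cells of Fact 9 fall out as the cases where the binomial coefficients reduce to one:

// Cell counts gamma(sigma_j, sigma_j+1) assembled from Facts 9-12.
let rec binom n k =
    if k < 0 || k > n then 0
    elif k = 0 || k = n then 1
    else binom (n - 1) (k - 1) + binom (n - 1) k

let gamma m sj sj1 =
    if abs (sj - sj1) > 1 then 0                                  // Fact 5
    elif sj1 = sj then binom (m - 1) sj + binom (m - 1) (sj - 1)  // Fact 10
    elif sj1 = sj + 1 then binom (m - 1) sj                       // Fact 11
    else binom (m - 1) (sj - 1)                                   // Fact 12

// For m = 3, the cell counts sum to 2^(m+1) = 16, the number of elements
// [s_j; ...; s_(j+m)], with 6 singleton cells and 4 cells of two or more
// elements, matching mu = 6 and kappa = 4 for n = 4, m = 3 in Table 1.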
For transitions, the following facts hold:
Fact 13
Any signal of any length $n > 2$, $m = 2$, can be described as a route following the arrows in the table;
Fact 14
An element in column $j$ transitions to an element in row $j$, and as a consequence,
$$(\sigma_j, \sigma_j) \to (\sigma_j, \sigma_k) \quad \text{(horizontal motion)},$$
$$(\sigma_j, \sigma_{j+1}) \to (\sigma_{j+1}, \sigma_k) \quad \text{(down motion)},$$
$$(\sigma_{j+1}, \sigma_j) \to (\sigma_j, \sigma_k) \quad \text{(up motion)},$$
where, by Fact 5, the next value $\sigma_k$ differs by at most one from the preceding one and lies in $[0, m]$. Hence, only cells on the diagonal can contain intracellular paths;
Fact 15
Any entry is maximally $m-1$ steps away from a two-trivial element, since, starting at an element $[s_0; \dots; s_{m-1}; s_m]$ and repeatedly appending the value $s_m$, there is an $(m-1)$-step path leading to $[s_{m-1}; s_m; \dots; s_m]$, which lies in a two-trivial cell.
W.r.t. the number of metameric classes:
Fact 16
Intracellular paths in cells $(\sigma_i, \sigma_j)$ with $0 < \sigma_i, \sigma_j < m$ and $|\sigma_j - \sigma_i| \le 1$ are ambiguous, since these cells contain several indistinguishable elements, and we cannot determine the path's starting point from its histogram sequence;
Fact 17
Cell pairs connected by more than one arrow in the same direction give rise to ambiguous pairs, and paths that only contain such crossings or intracellular paths are ambiguous, since the paths cannot be distinguished by their histograms;
Fact 18
For $m = n-1$, the number of metameric classes is equal to the number of non-empty cells in the tridiagonal table minus the six two-trivial cells:
$$\kappa(n, n-1) = n + 2(n-1) - 6 = 3n - 8.$$
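For example, for $n = 4$, this gives $3 \cdot 4 - 8 = 4$ metameric classes, in agreement with Table 1 for $m = 3$.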
For $2 \le m < n-1$, an upper bound on the number of metameric classes is given by the number of ambiguous paths. We have yet to find a closed form for $\kappa(n, m)$.
We verified the above facts by considering the transition table for $m = 3$, as shown in Table 3. For $n = m+1 = 4$, we see that there are six uniquely identifiable signals and four metameric classes, as also confirmed by the algorithm in Table 1. For $n = m+2 = 5$, we identified two intracellular cycles, in $(\sigma_i, \sigma_j) = (1,1)$ and $(2,2)$. Further, there are four ambiguous cell pairs: $(1,1) \to (1,2)$, $(1,2) \to (2,2)$, $(2,2) \to (2,1)$, and $(2,1) \to (1,1)$. Hence, in total, there are six metameric classes. All other arrows correspond to non-metameric paths, of which there are 18. These numbers also correspond to the results of the algorithm shown in Table 1. For $n = m+3 = 6$, and leaving out the arrows for brevity, we identified the following ambiguous paths: $(1,1)^3$, $(1,1)^2 (1,2)$, $(1,1)(1,2)(2,2)$, $(1,2)(2,2)^2$, and $(1,2)(2,2)(2,1)$, and a similar set of paths starting in $(2,2)$. Hence, we have an upper bound of 10 metameric classes, and after a careful study, we realized that of these, there are two unique paths, $(1,2)(2,2)(2,1)$ and likewise $(2,1)(1,1)(1,2)$; hence, the number of metameric classes is eight, as confirmed by our algorithm; see Table 1. We have yet to identify an efficient algorithm to count all the ambiguous paths in such tables.

5. Conclusions

Inspired by the concept of locally orderless images [1] in image processing, we set out to characterize the metameric classes for a given set of local histograms. In this article, we took a first step by studying binary signals. We gave a sifting algorithm with a computational complexity that is factorial in the window size and linear in the signal size. We further identified all unique signals and an upper bound on the number of metameric classes for all signal and window sizes. While the transition tables illuminate important aspects of identifying metameric classes, we have yet to discover an efficient algorithm for this purpose. Future work includes extending our work to signals of more complex types, sets of histograms with varying window sizes, and signals of higher dimensions such as images.

Author Contributions

Conceptualization, J.S. and S.D.; Formal analysis, J.S.; Methodology, J.S.; Software, J.S.; Writing—original draft, J.S.; Writing—review & editing, J.S. and S.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Code discussed in this paper is available at https://github.com/sporring/reconstructionFromHistograms (accessed on 24 December 2021). For the specific version discussed in this paper see https://github.com/sporring/reconstructionFromHistograms/commit/333506e72fd8bea2132857869aca15b54392aa75 (accessed on 24 December 2021).

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Koenderink, J.J.; van Doorn, A.J. The Structure of Locally Orderless Images. Int. J. Comput. Vis. 1999, 31, 159–168.
2. MacWilliams, F.; Sloane, N. The Theory of Error-Correcting Codes; Bell Laboratories: Holmdel, NJ, USA, 1977.
3. Lillholm, M.; Nielsen, M.; Griffin, L.D. Feature-Based Image Analysis. Int. J. Comput. Vis. 2003, 52, 73–95.
4. Weinzaepfel, P.; Jégou, H.; Pérez, P. Reconstructing an image from its local descriptors. In Proceedings of the CVPR 2011, Colorado Springs, CO, USA, 20–25 June 2011; pp. 337–344.
5. Lowe, D.G. Object Recognition from Local Scale-Invariant Features. In Proceedings of the Seventh IEEE International Conference on Computer Vision, Kerkyra, Greece, 20–27 September 1999.
6. D'Angelo, E.; Jacques, L.; Alahi, A.; Vandergheynst, P. From bits to images: Inversion of local binary descriptors. IEEE Trans. Pattern Anal. Mach. Intell. 2014, 36, 874–887.
7. Kato, H.; Harada, T. Image Reconstruction from Bag-of-Visual-Words. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Columbus, OH, USA, 23–28 June 2014.
8. Desolneux, A.; Leclaire, A. Stochastic Image Reconstruction from Local Histograms of Gradient Orientation. In Proceedings of the International Conference on Scale Space and Variational Methods in Computer Vision, Kolding, Denmark, 4–8 June 2017; Lauze, F., Dong, Y., Dahl, A.B., Eds.; Springer International Publishing: Cham, Switzerland, 2017; Volume 10302, pp. 133–145.
9. Dosovitskiy, A.; Brox, T. Inverting Visual Representations with Convolutional Networks. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 4829–4837.
10. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. Commun. ACM 2017, 60, 84–90.
11. Wu, H.; Zhou, J.; Li, Y. Image Reconstruction from Local Descriptors Using Conditional Adversarial Networks. In Proceedings of the 2019 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), Lanzhou, China, 18–21 November 2019.
12. Wu, Y.N.; Zhu, S.C.; Liu, X. Equivalence of Julesz Ensembles and FRAME Models. Int. J. Comput. Vis. 2000, 38, 247–265.
13. Zhu, S.C.; Wu, Y.; Mumford, D. Filters, Random Fields and Maximum Entropy (FRAME): Towards a Unified Theory for Texture Modeling. Int. J. Comput. Vis. 1998, 27, 107–126.
14. Julesz, B. Visual Pattern Discrimination. IRE Trans. Inf. Theory 1962, 8, 84–92.
15. Gabor, D. Theory of communication. Part 1: The analysis of information. J. Inst. Electr. Eng. Part III Radio Commun. Eng. 1946, 93, 429–441.
Figure 1. Local, overlapping histograms are calculated from a signal. Does the set of histograms uniquely identify a signal?
Figure 2. The reconstruction of the signal from a stream of local histograms. If the current point in the sequence can be resolved, the complete signal up until this point can be resolved.
Table 1. For different signal (n) and window (m) sizes, we calculated the total number of different signals ($2^n$), the number of signals not belonging to a metameric class, i.e., invertible ($\mu$), and the number of different metameric classes ($\kappa$).

n    | 2  3  3  4  4  4  5  5  5  5  6  6  6  6  6
m    | 2  2  3  2  3  4  2  3  4  5  2  3  4  5  6
2^n  | 4  8  8  16 16 16 32 32 32 32 64 64 64 64 64
μ    | 2  6  2  14 6  2  30 18 6  2  62 46 18 6  2
κ    | 1  1  2  1  4  3  1  6  7  4  1  8  15 10 5
Table 2. All signals grouped by the sum of their local histograms when $n = 3$ and $m = 2$. Arrows show the relations between the $(\sigma_j, \sigma_{j+1})$ and $(\sigma_{j+1}, \sigma_{j+2})$ tables.
Table 3. All signals grouped by the sum of their local histograms when $n = 4$ and $m = 3$.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
