3.1. One-Dimensional Network
First, we describe the behavior of a Boltzmann machine with five neurons arranged in a row. We set various connection patterns between the neurons.
Figure 4 shows the global entropy and global mutual information of a specific Boltzmann machine consisting of five neurons, plotted against the noise strength; the global entropy is the monotonically increasing curve, and the other curve is the global mutual information. The weight of each connection is determined subject to a normal distribution, which is called the normal condition. The color of the curves indicates the weight strength. The probability of a binary sequence is obtained from the frequency of that sequence in a time development of 10,000 steps.
It is clear that the global mutual information increases in the low range of the noise strength. The range in which the global mutual information increases suggests explicit behaviour based on attractors in the neural network under the normal condition, which indicates recurrence resonance. As the neurons interact with one another under the normal condition, in which each neuron is basically connected to all neurons in the network, the effect of the external noise is weakened by the interaction and spreads only slowly. Therefore, the global entropy increases more slowly as the weight strength increases. For high weight strength, the global mutual information also increases slowly over the lower range of the noise strength. It is easy to see that the global mutual information gradually increases and reaches its maximal point by the time the global entropy reaches a value of about 3.6, independent of the connection strength. The global entropy reaches 3.6 at a noise strength of 1.3 if the weight strength is 1, and at a noise strength of 10.0 if the weight strength is 10. This implies that the recurrence resonance is explicit if the weight strength is high.
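To make the procedure concrete, the following is a minimal Python sketch of how such curves can be produced: it simulates a small stochastic binary network and estimates the global entropy and global mutual information from state frequencies. The sigmoid update rule with the noise strength acting as a temperature, and the choice of taking mutual information between successive global states, are illustrative assumptions; the model actually used is the one defined earlier in the paper.

```python
import numpy as np
from collections import Counter

def simulate(weights, noise_strength, steps=10_000, rng=None):
    """Run a stochastic binary network and return the visited global states.

    Assumed update rule (illustration only): each neuron switches to 1 with
    probability sigmoid(input / noise_strength), i.e. the noise strength acts
    like a temperature.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = weights.shape[0]
    x = rng.integers(0, 2, size=n)           # random initial binary state
    states = []
    for _ in range(steps):
        h = weights @ x                       # net input to every neuron
        p = 1.0 / (1.0 + np.exp(-h / noise_strength))
        x = (rng.random(n) < p).astype(int)   # stochastic parallel update
        states.append(tuple(x))
    return states

def entropy(probs):
    probs = np.asarray([p for p in probs if p > 0])
    return -np.sum(probs * np.log2(probs))

def global_entropy_and_mi(states):
    """Entropy of the global state and mutual information between successive
    global states, both estimated from visit frequencies."""
    n_samples = len(states) - 1
    p_now = Counter(states[:-1])
    p_next = Counter(states[1:])
    p_joint = Counter(zip(states[:-1], states[1:]))
    h_now = entropy([c / n_samples for c in p_now.values()])
    h_next = entropy([c / n_samples for c in p_next.values()])
    h_joint = entropy([c / n_samples for c in p_joint.values()])
    return h_now, h_now + h_next - h_joint    # H(S_t), I(S_t; S_{t+1})

# Example: five neurons with weights drawn from a normal distribution.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 1.0, size=(5, 5))
for sigma in (0.5, 1.0, 2.0, 5.0):
    h, mi = global_entropy_and_mi(simulate(w, sigma, rng=rng))
    print(f"noise={sigma:4.1f}  H={h:5.2f} bits  I={mi:5.2f} bits")
```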
Figure 5 also shows global entropy and global mutual information pairs of a specific Boltzmann machine consisting of five neurons, plotted against the noise strength, where the weight of the connection is determined under the diagonal distribution (left) and under the quantum distribution (right); these are called the diagonal condition and the quantum condition, respectively. The diagonal distribution of the connections implies that each diagonal weight, such as $w_{ii}$, is stochastically determined in the range between 0.7 and 1.0, and any other weight, such as $w_{ij}$ with $i \neq j$, is stochastically determined in the range between 0.0 and 0.05. The quantum distribution of the connections implies that each diagonal weight is stochastically determined in the range between 0.7 and 1.0, each of twelve designated off-diagonal weights is stochastically determined in the range between 0.2 and 0.3, and any other weight is stochastically determined in the range between 0.0 and 0.05. Global entropy and global mutual information pairs were obtained from a time series of 1000 steps (above) and 10,000 steps (below).
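As an illustration of how such connection conditions can be generated, the following Python sketch draws weight matrices for the diagonal and quantum conditions; which twelve off-diagonal weights receive the intermediate values in the quantum condition is an assumption here, since the specific connections are fixed in the paper.

```python
import numpy as np

def diagonal_weights(n, rng):
    """Diagonal condition: self-connections in [0.7, 1.0], all other weights in [0.0, 0.05]."""
    w = rng.uniform(0.0, 0.05, size=(n, n))
    np.fill_diagonal(w, rng.uniform(0.7, 1.0, size=n))
    return w

def quantum_weights(n, blocks, rng):
    """Quantum condition (sketch): self-connections in [0.7, 1.0], off-diagonal
    entries inside the given index blocks in [0.2, 0.3], all other weights in
    [0.0, 0.05]. Which off-diagonal connections receive the intermediate weights
    is an illustrative assumption."""
    w = rng.uniform(0.0, 0.05, size=(n, n))
    for block in blocks:
        for i in block:
            for j in block:
                if i != j:
                    w[i, j] = rng.uniform(0.2, 0.3)
    np.fill_diagonal(w, rng.uniform(0.7, 1.0, size=n))
    return w

rng = np.random.default_rng(1)
w_diagonal = diagonal_weights(5, rng)
w_quantum = quantum_weights(5, blocks=[(1, 2, 3, 4)], rng=rng)  # assumed block of four neurons
```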
The global entropy under the diagonal condition rapidly increases even for small noise strength, as the connections between neurons are so weak that each neuron behaves independently of the others; the effect of external noise is not weakened by interactions between neurons. While the global entropy increases until it reaches about 4.0, the global mutual information also increases until it reaches its maximal point. This tendency is found both when the probability of a binary sequence was obtained from a time series of 1000 steps and when it was obtained from 10,000 steps. Although the range showing the increase in global mutual information is very narrow, the maximal value of global mutual information under the diagonal condition is very high. This shows that the spatial and temporal patterns under the diagonal condition are very complex, lying between order and chaos.
Compared to other conditions, in the diagonal condition, the maximum value of global mutual information increases rapidly as the weight strength increases. In the normal condition in
Figure 4 and the quantum condition in
Figure 5, the maximum value of global mutual information hardly changes when the weight strength is changed from 1 to 2, 5, and then 10. However, in the diagonal condition, when the weight strength is changed from 1 to 10, the maximum value of global mutual information changes by a factor of about three. In the diagonal condition, there is no interaction with other neurons, but as the weight strength increases, the autocatalytic effect becomes stronger, which is thought to enhance the tendency of recurrence resonance.
Compared to the behaviour of the Boltzmann machine under the diagonal condition, the global entropy and global mutual information under the quantum condition increase slowly, since the interaction among neurons contributes to weakening the effect of external noise. The global mutual information increases and reaches its maximal point by the time the global entropy reaches a value of about 3.5, whether the probability was obtained from a time series of 1000 steps or 10,000 steps. The maximal value of the global mutual information is rather small compared to the diagonal condition.
If the probabilities of binary sequences are obtained from a time series of 1000 steps, the global mutual information converges not to 0.0 but to 1.0, regardless of whether the connection of neurons is subject to the diagonal or the quantum condition. However, if the probability is obtained from a time series of 10,000 steps, it converges to about 0.0. This implies that if the time development proceeds for a long enough time, explicit complex patterns disappear because of the effect of external noise, and that recurrence resonance appears in the transient stage before the behavior of the neural nets converges to a noisy steady state.
Figure 6 also shows global entropy and global mutual information against the noise strength for a Boltzmann machine consisting of seven neurons. The probability of the binary sequence was obtained from a time series calculated over 10,000 time steps. It is easy to see that the patterns of the connection weights reveal Boolean algebra and quantum logic. As mentioned in the section above, the diagonal relation reveals Boolean logic. Since the weight pattern in the diagonal condition yields the diagonal relation when only the weights exceeding 0.7 are taken to belong to the relation, the weight pattern in the diagonal condition reveals Boolean algebra. The weight pattern in the quantum condition yields a relation in which 3 × 3 and 4 × 4 diagonal sub-relations are surrounded by related cells, if a weight exceeding 0.2 is determined to be an element of the relation; otherwise, the weight is determined to be no relation. This pattern leads to quantum logic, which is expressed as a disjoint union of Boolean algebras, corresponding to the two diagonal sub-relations, whose greatest and least elements are common. We also calculated global entropy and mutual information for a weight pattern revealing a disjoint union of three Boolean algebras under the quantum condition. However, they are omitted because the tendency of the entropy and mutual information is almost the same as that in the graph for the quantum condition in Figure 6.
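The thresholding step described above, which turns a weight matrix into a binary relation (weights above 0.7 for the Boolean/diagonal relation, above 0.2 for the quantum relation), can be sketched as follows; the block indices used for the toy seven-neuron matrix are illustrative assumptions.

```python
import numpy as np

def weight_relation(w, threshold):
    """Binary relation obtained from a weight matrix: cell (i, j) belongs to
    the relation iff the weight w[i, j] exceeds the threshold."""
    return (w > threshold).astype(int)

# Toy seven-neuron weight matrix: strong diagonal plus two diagonal sub-blocks
# (the block indices are illustrative assumptions).
rng = np.random.default_rng(2)
w = rng.uniform(0.0, 0.05, size=(7, 7))
for block in [(0, 1, 2), (3, 4, 5, 6)]:
    for i in block:
        for j in block:
            w[i, j] = rng.uniform(0.2, 0.3)
np.fill_diagonal(w, rng.uniform(0.7, 1.0, size=7))

print(weight_relation(w, 0.7))   # keeps only the diagonal -> Boolean (diagonal) relation
print(weight_relation(w, 0.2))   # keeps the sub-blocks too -> quantum-logic-like relation
```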
The trends of global entropy and global mutual information do not change fundamentally even when the number of neurons is increased. In the normal condition and quantum condition, the increase in global entropy and global mutual information in response to noise strength is gentle, and the maximum value of global mutual information hardly changes with changes in connection strength. In contrast, in the diagonal condition, both global entropy and global mutual information rise rapidly in response to an increase in noise strength, and as the connection strength increases, the maximum value of global mutual information also increases accordingly. This is thought to be because, as mentioned above, the interaction between neurons is weak, and the autocatalytic effect due to the connection strength is accentuated by the increase in connection strength.
Although the number of calculation steps is 10,000, the larger number of neurons means that the state does not converge to a sufficiently stable steady state. Therefore, the entropy and mutual information curves are not stable in the normal condition, and the mutual information converges to 1.0 instead of 0.0 in the diagonal and quantum conditions.
3.2. Local Estimation in Two-Dimensional Network
It was reported that recurrence resonance could be observed not only in Boltzmann machines but also in various networks such as Hopfield [
38] networks if the connections between neurons are dense [
7]. This is certainly true. However, the way in which it manifests itself clearly differs depending on the pattern of connections between neurons. As we have seen so far, when the connections are dense, as in the normal condition, the interactions between neurons effectively regulate the external noise and emphasize the behaviour of the network's own attractors; the system shows recurrence resonance while mitigating the effects of the noise and slowly reaches a steady state. In contrast, in the diagonal condition, recurrence resonance is also shown, but the effects of noise appear directly and do not seem to be significantly mitigated. In the quantum condition, the connection distribution seems to satisfy both characteristics: recurrence resonance is shown while the effects of noise are also mitigated. Here, there is a struggle between the order latent inside the network and the noise (chaos) from the outside, and how the two are mediated in spacetime is an interesting problem.
First, we consider a 3 × 3 neural network in order to examine the information-theoretic relationship between the whole and the part. The condition of connection between neurons is the normal condition. Global entropy and mutual information classify the nine-bit states by looking at all nine neurons and calculating the probability. In other words, the states are 000000000, 000000001, …, 111111111, and the probability is calculated for these states. In contrast, local entropy and mutual information are calculated from only three neurons. In this case, the states are 000, 001, 010, 011, 100, 101, 110, and 111, and the number of states is only $2^3 = 8$. Also, local entropy and mutual information are calculated only for three fixed neurons, rather than calculating the probability for every three bits of all neurons.
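The following sketch shows how the local quantities can be computed by restricting the global nine-bit states to a fixed subset of three neurons, again assuming that mutual information is taken between successive (local) states; the placeholder state series stands in for the simulated one, and the chosen subset indices are illustrative.

```python
import numpy as np
from collections import Counter

def entropy_from_counts(counts, total):
    p = np.array([c / total for c in counts if c > 0])
    return -np.sum(p * np.log2(p))

def local_entropy_and_mi(states, subset):
    """Entropy and successive-state mutual information restricted to a fixed
    subset of neurons, so only 2**len(subset) local states are distinguished."""
    local = [tuple(s[i] for i in subset) for s in states]
    n = len(local) - 1
    h_now = entropy_from_counts(Counter(local[:-1]).values(), n)
    h_next = entropy_from_counts(Counter(local[1:]).values(), n)
    h_joint = entropy_from_counts(Counter(zip(local[:-1], local[1:])).values(), n)
    return h_now, h_now + h_next - h_joint

# Placeholder 9-bit state series; in practice this comes from the 3 x 3 simulation.
rng = np.random.default_rng(3)
states = [tuple(rng.integers(0, 2, size=9)) for _ in range(10_000)]
print(local_entropy_and_mi(states, subset=(0, 1, 2)))   # three assumed neurons
```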
Figure 7A shows the global entropy and mutual information plotted against noise strength for a 3 × 3 neural network. As shown in the graphs of
Figure 4,
Figure 5 and
Figure 6, the different colours represent different connection strengths. The probability is calculated over a time evolution of 50,000 steps. Because of the large number of neurons, as the connection strength increases, the interaction weakens the noise effect, the time to fall into a noisy steady state becomes longer, and the curves become more uneven. However, the important point is that, in the low noise strength region, there is a clear increase in mutual information with a local maximum. In other words, the global mutual information shows the existence of recurrence resonance. However, the local mutual information does not show the existence of recurrence resonance (
Figure 7B). It only decreases monotonically with increasing noise intensity. There is no region where the value increases, which means that no complex pattern appears in any noise region.
Although the whole system realizes recurrence resonance by mitigating noise, such a tendency is not observed when only a part of the system is examined. The behaviour of the Boltzmann machine in the normal condition means that the conflict between chaos and order is not observed locally. Mutual information takes low values both for chaotic patterns and for regular patterns such as periodic oscillations; so, locally, there is either chaos or order, and the two are not mixed into a complex pattern.
However,
Figure 8 shows that this is not a general tendency. It shows the behaviour of local entropy and local mutual information in a Boltzmann machine network consisting of 25 neurons arranged two-dimensionally in a 5 × 5 grid. The horizontal axis is the noise strength, and the colour of the curve indicates the weight strength, as in
Figure 4,
Figure 5,
Figure 6 and
Figure 7. Also, all the monotonically increasing curves are local entropy, and the other curves are local mutual information, as shown in
Figure 4,
Figure 5,
Figure 6 and
Figure 7. The probability is calculated from the frequency counted over 10,000 steps, with the state defined as
.
The 5 × 5 circle distribution shown in each graph in
Figure 8 does not directly show the weight distribution of the connections. The circles represent neurons, and in the normal condition, the connection weight between all neurons is determined by probabilities that follow a normal distribution. In the blocky, diagonal, and quantum conditions, the connection weight between dark blue neurons is determined between 0.7 and 1.0, and between white neurons, the connection weight is determined between 0.0 and 0.05. In the quantum condition, the connection weight between light blue neurons is determined between 0.2 and 0.3. The difference is clear when comparing the normal condition with the other conditions—blocky, diagonal, and quantum. In all conditions except for the normal condition, an increase in value and a maximum value are observed in the local mutual information at low noise strength. In other words, under these conditions, recurrence resonance is clearly observed, and a complex pattern that has both chaos and order appears locally. This is thought to be due to the fact that the coupling has sparse parts, which is different from the normal condition.
3.3. 1/f Noise in Boolean and Quantum Condition
In the previous section, it was shown that a mixed state of chaos and order can be observed even in a spatial part of the system, depending on the connection pattern of the neurons. This indicates a critical state in the phase transition from order to chaos. Can such a critical state also be found in a single neuron? To evaluate this, a single neuron was selected from the network, and the time series of its states was analyzed by Fourier transformation. Fifty time series with random initial values were calculated, and the power spectrum was averaged over them.
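A minimal sketch of this spectral analysis, assuming the power spectra of several runs are simply averaged bin by bin; the run length and placeholder data are illustrative.

```python
import numpy as np

def averaged_power_spectrum(runs, dt=1.0):
    """Average the FFT power spectrum of a single neuron's state time series
    over several runs started from random initial values."""
    spectra = []
    for x in runs:
        x = np.asarray(x, dtype=float)
        x = x - x.mean()                     # remove the DC component
        spectra.append(np.abs(np.fft.rfft(x)) ** 2)
    freqs = np.fft.rfftfreq(len(runs[0]), d=dt)
    return freqs[1:], np.mean(spectra, axis=0)[1:]   # drop the zero-frequency bin

# Placeholder data: 50 runs of 4096 random binary states (the real input is the
# recorded state of one neuron in the simulated network).
rng = np.random.default_rng(4)
runs = [rng.integers(0, 2, size=4096) for _ in range(50)]
freqs, power = averaged_power_spectrum(runs)
# Plotting log(power) against log(freqs) gives log-log spectra like those in the
# figures; this placeholder data is white (flat), whereas the simulated neurons
# can show a 1/f slope.
```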
Figure 9 shows the power spectrum plotted against frequency in a log–log plot. The network of this Boltzmann machine is composed of five neurons, and the 5 × 5 square of circles shows the weights of the neuronal connections. The light orange circles in the normal condition indicate that all weights follow a normal distribution. In the blocky, diagonal, and quantum conditions, the dark orange circles show weights between 0.7 and 1.0, the white circles show weights between 0.0 and 0.05, and the light orange circles show weights between 0.2 and 0.3. The power spectrum in the normal condition clearly shows white noise, and no critical behaviour is observed. In the blocky condition, the slope is slightly larger than 1.0, but there is a tendency toward a power-law distribution. Note that the blocky condition also reveals Boolean algebra with respect to the rough set lattice transformation. In the diagonal and quantum conditions, there is clear 1/f noise in the high-frequency components. This means that the phase transition critical phenomenon is observed in these connection patterns.
Recall the 5 × 5 = 25 network (
Figure 8), which also shows recurrence resonance in local mutual information. Here, we again selected one neuron, analyzed its time series by fast Fourier transformation with different initial values, and plotted the power spectrum averaged over the time series of 50 trials against frequency, and displayed the graph in a log–log plot (
Figure 10). Here, again, the tendency of the power spectrum depending on the neuron’s connection type is almost the same as in
Figure 9. In the normal condition, it shows white noise, and in the blocky condition, it shows a tendency toward a power-law distribution with a slope of −1.0. In the diagonal and quantum conditions, it shows what is called exact 1/f noise with a slope of −1.0. However, unlike the case of the 5-neuron network, it shows 1/f noise for the low-frequency components.
In the quantum and diagonal conditions, the behavior is extremely similar, and since fluctuations are shown for the low-frequency components, we can say that there is clear critical behavior. The distribution pattern of the connection strength between neurons clearly plays a role, and it is thought that a distribution in which there are areas of strong connection and areas of sparse connection is important. In the quantum condition, the connection strength of the non-diagonal sub-relation area (corresponding to the area of strong connection in the blocky condition) is set between 0.2 and 0.3, but it is possible that a difference from the diagonal condition was not observed because this value is relatively small. In the quantum condition, the transition between the neurons arranged diagonally and the area of strong connection in the blocky condition is cut off, so if the transition between these areas were allowed, the frequent transition between chaos and order would be further enhanced, and noise might be observed not only in the low-frequency components but in all frequency components. In any case, the increase in local mutual information observed at low noise intensity and the existence of a maximum value show the criticality of chaos and order not only in a subspace within the system but also in the temporal changes of a single neuron.
Although
Figure 10 shows that only the low-frequency components show the power law, such a property is influenced by the connection strength in the non-diagonal block part.
Figure 11 shows the power spectrum for a time series of the state of a neuron plotted against the frequency, where the connection strength in the block part changes in various ways. If the connection strength is set between 0.25 and 0.65 (
Figure 11a), the power law is noticeable in the high- and low-frequency components.
So, are these connection strengths determined arbitrarily? In excess Bayesian estimation and inverse Bayesian estimation [39], which apply Bayesian inference locally to the connection probability distribution, the connection probability distribution is known to become similar to that of the quantum condition. In that case, the connection probability, which corresponds to the connection strength, is determined autonomously by Bayesian and inverse Bayesian inference. It is highly likely that such a mechanism is involved in 1/f noise, and further investigation will be required.
Local recurrence resonance observability (LRRO), whereby RR can be detected from only local parts of a neural network, may require that the assemblage has blank areas and shows a cluster-like structure. Conversely, if such a material situation is allowed, LRRO can be observed regardless of the form of interaction or the form of the dynamical system, and the characteristics of the critical phenomenon, such as the 1/f fluctuations associated with it, may be found. Boltzmann machines are usually considered to be advanced dynamical systems that implement majority voting, like highly organized neuronal systems. If the critical phenomenon originated from the dynamical system, it would be misguided to compare Boltzmann machines with proteinoid microspheres. However, the LRRO found here and the critical phenomenon caused by it may be characteristics that depend on the execution environment that carries out the computation and on the distribution pattern of the element connections, rather than on the dynamical system itself. Therefore, although the connection between the two is conceptual, we decided to investigate the membrane potential time series of proteinoid microspheres with cluster-like connections and to focus our experiments on whether the 1/f fluctuations seen with LRRO can be observed.
3.4. 1/f Noise in Proteinoid Microsphere
As mentioned before, the proteinoid microsphere is generated by the polymerization of amino acids and constitutes a cluster structure of microspheres. The membrane potential maintains a pulse-like signal similar to a spike emitted from a neuron [40]. We show here that a time series of the membrane potential reveals typical 1/f noise, as shown in Figure 12.
Figure 12 shows the time series analysis and FFT for all eight channels. Each channel shows typical fluctuations in the membrane potential $V(t)$ and their power spectral density $S(f)$. The power spectra follow a power-law relationship
$$S(f) = C f^{\beta},$$
where $f$ represents frequency, $\beta$ is the scaling exponent, and $C$ is a constant. Taking the logarithm of both sides yields the following:
$$\log S(f) = \log C + \beta \log f.$$
The power spectrum analysis shows scaling exponents from −0.43 to −1.07. This confirms 1/f-like noise in all channels.
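A sketch of how the scaling exponent $\beta$ could be estimated from a measured spectrum by a straight-line fit in log–log coordinates; the band limits and fitting details used for the reported values may differ.

```python
import numpy as np

def fit_scaling_exponent(freqs, power, fmin=None, fmax=None):
    """Estimate beta in S(f) ~ C * f**beta by a linear least-squares fit of
    log10 S(f) against log10 f, optionally restricted to a frequency band
    (e.g. the low- or high-frequency regime)."""
    freqs = np.asarray(freqs, dtype=float)
    power = np.asarray(power, dtype=float)
    mask = (freqs > 0) & (power > 0)
    if fmin is not None:
        mask &= freqs >= fmin
    if fmax is not None:
        mask &= freqs <= fmax
    beta, log_c = np.polyfit(np.log10(freqs[mask]), np.log10(power[mask]), 1)
    return beta, 10.0 ** log_c

# Synthetic check: a spectrum proportional to 1/f should give beta close to -1.
f = np.linspace(0.001, 1.0, 1000)
print(fit_scaling_exponent(f, 1.0 / f))
```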
The membrane potential time series $V(t)$ can be characterized by its autocorrelation function
$$R(\tau) = \langle V(t)\, V(t + \tau) \rangle_t,$$
where $\tau$ is the time lag and $\langle \cdot \rangle_t$ denotes time averaging. In our analysis, we define two distinct frequency regimes: the low-frequency regime ($f < 0.01$ Hz, corresponding to timescales longer than 100 s) and the high-frequency regime ($f > 0.01$ Hz, corresponding to timescales shorter than 100 s). The power spectrum scaling behaviour is characterized by $S(f) \propto f^{\beta_{\mathrm{low}}}$ for low frequencies and $S(f) \propto f^{\beta_{\mathrm{high}}}$ for high frequencies, where $\beta_{\mathrm{low}}$ ranges from −1.30 to −0.51 and $\beta_{\mathrm{high}}$ ranges from −1.07 to −0.41 across different channels. This division reveals different scaling behaviours at different temporal scales, with the low-frequency regime generally capturing long-term system dynamics and the high-frequency regime reflecting faster fluctuations. The Wiener–Khinchin theorem [41] links the power spectral density to the autocorrelation function.
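For reference, with the conventions assumed above, the theorem can be written as
$$S(f) = \int_{-\infty}^{\infty} R(\tau)\, e^{-2\pi i f \tau}\, \mathrm{d}\tau,$$
so that power-law correlations in $R(\tau)$ correspond to a power-law spectrum $S(f)$.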
The amplitudes across channels vary by two orders of magnitude. They range from about 2 mV in channel 8 to 200 mV in channel 2. This shows a significant unevenness in the microsphere distribution and the relevant electrical activity.
The root mean square fluctuation $F(\tau)$ of the voltage signal over a time window $\tau$ scales as follows:
$$F(\tau) \propto \tau^{\alpha},$$
where $\alpha$ is related to the power spectral exponent $\beta$ by $\alpha = (1 - \beta)/2$. The analysis of bioelectric signals from proteinoid microspheres reveals complex dynamics with distinct behaviour in the low- and high-frequency regimes, as summarized in Table 1. Several channels show strong long-range correlations in the low-frequency regime, particularly channels 2, 3, and 4. The theoretical scaling exponents range from 0.705 to 1.15, with the low-frequency regime generally showing higher values (0.755–1.15) compared to the high-frequency regime (0.705–1.035). Notably, the directly fitted scaling exponents show systematically lower values than the theoretical predictions, ranging from 0.32 to 0.90, suggesting a more complex relationship between temporal correlations at different scales. The high-frequency regime shows more consistent scaling behaviour across channels, particularly evident in channels 1, 2, 4, 5, and 7. This dual-regime analysis reveals that the temporal organization of these bioelectric patterns combines both slow, strongly correlated fluctuations and faster, more varied dynamics, resembling the multi-scale characteristics observed in biological membrane potentials.
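As a rough illustration of the fluctuation analysis behind these exponents, the following sketch computes a simple windowed root mean square fluctuation and fits $\alpha$ in log–log coordinates; the detrending and windowing choices in the actual analysis may differ, so this is only a schematic estimator on placeholder data.

```python
import numpy as np

def rms_fluctuation(signal, window_sizes):
    """Windowed root mean square fluctuation F(tau): the signal is cut into
    non-overlapping windows of length tau, the mean of each window is removed,
    and the root mean square of the residuals is returned for each tau."""
    signal = np.asarray(signal, dtype=float)
    f_tau = []
    for tau in window_sizes:
        n_win = len(signal) // tau
        segments = signal[:n_win * tau].reshape(n_win, tau)
        residuals = segments - segments.mean(axis=1, keepdims=True)
        f_tau.append(np.sqrt(np.mean(residuals ** 2)))
    return np.array(f_tau)

# The exponent alpha is the slope of log F(tau) against log tau.
rng = np.random.default_rng(5)
voltage = np.cumsum(rng.standard_normal(100_000))    # placeholder random-walk "signal"
taus = np.array([16, 32, 64, 128, 256, 512, 1024])
f_tau = rms_fluctuation(voltage, taus)
alpha, _ = np.polyfit(np.log10(taus), np.log10(f_tau), 1)
print(alpha)   # roughly 0.5 for this placeholder random walk with this simple estimator
```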