
Entropy 2019, 21(8), 789; https://doi.org/10.3390/e21080789

Article
Asymptotic Behavior of Memristive Circuits
Theoretical Division and Center for Nonlinear Studies, Los Alamos National Laboratory, Los Alamos, NM 87545, USA
Received: 16 July 2019 / Accepted: 6 August 2019 / Published: 13 August 2019

Abstract

The interest in memristors has risen due to their possible applications both as memory units and as computational devices in combination with CMOS. This is in part due to their nonlinear dynamics and a strong dependence on the circuit topology. We provide evidence that purely memristive circuits can also be employed for computational purposes. In the present paper we show that a polynomial Lyapunov function in the memory parameters exists for the case of DC-controlled memristors. Such a Lyapunov function can be asymptotically approximated with binary variables and mapped to quadratic combinatorial optimization problems. This also shows a direct parallel between memristive circuits and the Hopfield-Little model. In the case of Erdos-Renyi random circuits, we show numerically that the distribution of the matrix elements of the projectors can be roughly approximated with a Gaussian distribution, and that it scales with the inverse square root of the number of elements. This provides an approximate but direct connection with the physics of disordered systems and, in particular, of mean field spin glasses. Using this, and the fact that the interaction is controlled by a projector operator on the loop space of the circuit, we estimate the number of stationary points of the approximate Lyapunov function and provide a scaling formula as an upper bound in terms of the circuit topology only.
Keywords:
memristive circuits; spin models; disordered systems

1. Introduction

The study of memristors has recently become an area of interest [1,2,3] for a variety of reasons. This is not only because memristors are considered by many to be the fourth circuit element, but also because of their potential applications and their interesting dynamics. In particular, it has been understood from a purely computational-theory perspective that the combination of memristive and CMOS components leads to universal computing machines, called memcomputers [4]. From a theoretical perspective, circuits made of memristors have been shown to exhibit non-trivial dynamics which can in principle serve computational purposes [5,6,7,8]. Memristors are passive components which can be thought of as time-varying resistances sensitive to either the current or the voltage, which in turn depend on a dynamical internal state variable. The main characteristic of a memristor is a pinched hysteresis loop in the current-voltage diagram when controlled with an alternating voltage [9,10,11]. The aim of this paper is to characterize the asymptotic behavior of purely memristive circuits (i.e., containing only memristors) for a general circuit topology. An open question which has thus far not been answered at a theoretical level is what role the circuit topology plays in the relaxation process of memristive circuits. It is in fact thought that the topology plays a role in the optimization capabilities of these circuits with memory (also called information overhead by Di Ventra and Traversa [4]). We provide a precise answer for the simplest class of circuits in which we know the exact role of topology, and show that the dynamical interactions across the circuit map into a coupling term between the memristors in the Lyapunov function for the dynamics. Specifically, we show that memristive circuits, if controlled with constant voltage, naturally perform a local unconstrained optimization, specifically a Quadratic Unconstrained Binary Optimization (QUBO) [12].
In order to gain some theoretical understanding of the complexity of the asymptotic states, we employ the Kac-Rice formula (for the average number of stationary points) to provide a rough upper bound on the complexity of the function. In the process of studying random circuits, we also show the connection between memristor dynamics on random circuits and the Sherrington-Kirkpatrick model.
Our results not only establish a direct and analytical connection between (local) optimization and memristors, but also potentially introduce a new class of (heuristic) optimization algorithms for combinatorial problems based on the first-order dynamical equation studied in the present paper. In fact, an interesting byproduct of the present paper is that the Lyapunov function we derive goes beyond physically implementable circuits; for example, it applies in principle to cases in which the projection operator is replaced with (semi-)positive operators. As an application, we map a classical problem of stock-return optimization (Markowitz) into ours, and test the performance of the resulting optimization procedure. We provide an instance of optimization of the Nikkei 225 dataset in the Markowitz framework, and show that it is competitive at least compared to exponential annealing, which usually performs poorly on hard combinatorial problems.

2. Memristive Circuits

Before we introduce the dynamics for a generic circuit, we first briefly discuss the type of memristors under scrutiny [9]. Specifically, we consider memristors whose internal dynamics (the parameter $w$) depends on the current only, and whose resistance depends linearly on the internal parameter $w$ through the relation $R(w) = R_{on}(1-w) + R_{off}\,w$, with $0 \le w(t) \le 1$. We consider in particular the time evolution of a single memristor with diffusive dynamics [13,14] and an applied voltage $S$:
$$\frac{d}{dt} w(t) = \alpha\, w(t) - \frac{R_{on}}{\beta}\, I = \alpha\, w(t) - \frac{R_{on}}{\beta}\,\frac{S}{R(w)},$$
which shows that a competition between drift and decay occurs. This type of memristor has been considered for machine learning applications. With our convention, $R_{on}$ and $R_{off}$ are the limiting resistances for $w = 0$ and $w = 1$ respectively ($R_{off} > R_{on} > 0$), and $\alpha$ and $\beta$ are constants which set the timescales for the relaxation and excitation of the memristor, respectively. It is worth mentioning that our model differs from the one introduced in Reference [9]: it has not only opposite polarity, but also a slightly different decay function. In a recent paper [15] it was observed that mean field theory and techniques from statistical physics can be applied to study a specific circuit topology. In that paper, it was also noted that a Lyapunov function exists, and that zero-temperature mean field theory provides a good estimate of the average asymptotic dynamics for a single mesh of memristors. In the present paper we extend some of these results to a more general class of memristive circuits, which sheds new light on this type of nonlinear system. Specifically, in Reference [5] the following differential equation was derived for a generic but purely memristive circuit:
$$\frac{d\vec W}{dt} = \alpha\,\vec W - \frac{1}{\beta}\left(I + \xi\,\Omega W\right)^{-1}\Omega\,\vec S(t),$$
where $\vec S$ is the vector of voltage sources in series with each memristor, while $I$ is the identity matrix. It is immediate to observe that the nonlinearity is controlled by the parameter $\xi = \frac{R_{off}-R_{on}}{R_{on}}$. It is important to note that $\Omega$ and $W$ are matrices: $W$ is the diagonal matrix with the memristor memory values $w_i(t)$ on the diagonal, while $\Omega = A^t (A A^t)^{-1} A$ is a projector operator on the fundamental loops of the circuit, $A$ being the cycle matrix of the directed graph representing the circuit flows. Given an orientation of each loop and edge, the matrix $A$ has dimension $L \times N$, where $N$ is the number of memristors and $L$ the number of fundamental loops. If a memristor $\beta$ has the same orientation as a loop $i$, then $A_{i\beta} = 1$, and $-1$ if opposite; $A_{i\beta} = 0$ if the memristor $\beta$ does not belong to the loop $i$. Thus the matrix $A$ has only $-1, +1, 0$ values. (The matrix $\Omega$ is not diagonal because edges of a graph can belong to more than one loop.) The richness of the dynamics of this equation has been characterized in Reference [5], while the locality properties of $\Omega$ were studied in Reference [16]. The important observation that we anticipate is that the dynamical Equation (2) has at least one Lyapunov function, which we have derived analytically. While the derivation is provided in the Appendix A, here we provide only its functional form:
$$L(\vec W) = -\frac{\alpha}{2}\sum_i W_i^2 - \frac{\alpha\xi}{3}\sum_i \Omega_{ii}\, W_i^3 - \alpha\xi\sum_{i\neq j}\Omega_{ij}\, W_i W_j^2 + \frac{1}{\beta}\sum_{ij} W_i\,\Omega_{ij}\, S_j,$$
from which a few key facts can be immediately observed. For instance, the role of $\Omega$ is that of a coupling matrix, as one would expect, and the function is not quadratic in the internal memory variables. Equation (3) satisfies $\frac{d}{dt}L(\vec W) \le 0$ whenever $M_i = -2\alpha\xi\sum_{j\neq i} W_j\,\Omega_{ji}\, W_i$ is small (thus for very weakly interacting memristors, as it depends only on the off-diagonal terms), and $\frac{d}{dt}L(\vec W) = 0 \Leftrightarrow \frac{d}{dt}\vec W = 0$. One can obtain a more precise bound on the derivative of the Lyapunov function. Let us define $||\Omega\vec S||^2 = N^2 s^2(N)$. We have proved in the Appendix A that if
$$4\,\xi^2(1+\xi)\,\bar\Omega^2 < \frac{s^2(N)}{\alpha^2\beta^2} - \frac{2\,s(N)}{\alpha\beta},$$
then $\frac{d}{dt}L(\vec W) < 0$, where $\bar\Omega = \max_{ij}|\Omega_{ij}|$. It is interesting to note that the order parameter $\frac{s(N)}{\alpha\beta}$ is a generalization of the one found in Reference [15], which controlled the asymptotic state of the effective mean field circuit; here again it plays a role. Note that $\xi^2(1+\xi)\bar\Omega^2$ is what controls the height of the parabola: intuitively, the larger the nonlinearity, the higher the voltage has to be for the system to relax according to the Lyapunov function. Thus, Equation (4) establishes a dynamical phase diagram.
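The construction of $\Omega$ and the descent property can be checked numerically. The sketch below (the five-edge circuit and parameter values are illustrative, not taken from the paper) builds $\Omega = A^t(AA^t)^{-1}A$ from a small cycle matrix, integrates Equation (2) with a forward Euler step, and monitors $L(\vec W)$: for $\xi = 0$ the descent is exact, while for small $\xi$ it holds up to the $M$-dependent correction discussed above.

```python
import numpy as np

# Small circuit: a square 0-1-2-3 with chord (0,2); edges ordered
# (0,1),(1,2),(2,3),(3,0),(0,2). Two fundamental loops give the cycle matrix A.
A = np.array([[1, 1, 0, 0, -1],
              [0, 0, 1, 1,  1]], float)
Omega = A.T @ np.linalg.inv(A @ A.T) @ A     # projector on the loop space
assert np.allclose(Omega @ Omega, Omega)

N = Omega.shape[0]
rng = np.random.default_rng(1)
S = rng.normal(size=N)                       # constant voltage sources
alpha, beta = 0.3, 1.0

def lyapunov(W, xi):
    # Eq. (3): cubic Lyapunov function of the memory variables
    off = Omega - np.diag(np.diag(Omega))
    return (-alpha / 2 * np.sum(W ** 2)
            - alpha * xi / 3 * np.diag(Omega) @ W ** 3
            - alpha * xi * W @ off @ W ** 2
            + W @ Omega @ S / beta)

def trajectory(xi, dt=1e-3, steps=4000):
    # Forward Euler integration of Eq. (2), clipped to the hypercube [0,1]^N
    W = rng.uniform(0.2, 0.8, size=N)
    vals = []
    for _ in range(steps):
        vals.append(lyapunov(W, xi))
        dW = alpha * W - np.linalg.solve(np.eye(N) + xi * Omega * W, Omega @ S) / beta
        W = np.clip(W + dt * dW, 0.0, 1.0)
    return np.array(vals)

v0 = trajectory(xi=0.0)
print(np.all(np.diff(v0) <= 1e-12))   # True: descent is exact at xi = 0
v = trajectory(xi=0.2)
print(v[-1] <= v[0] + 1e-6)           # approximate descent at small xi
```

Note the Euler step never claims global optimality: clipping at the boundary implements the constraint $0 \le w_i \le 1$ as a projected descent.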
We stress that the dynamics is constrained to the hypercube $[0,1]^N$, where $N$ is the number of memristors, and thus the Lyapunov function above applies only until a memristor reaches the boundary of this domain. However, if we now use the fact that the dynamics is controlled with constant voltage, each memristor will eventually reach the asymptotic value 1 or 0. In this asymptotic limit, we note that the function above can be interpreted as a spin-like model. In Reference [15] this phenomenon was partly explained by the emergence of unstable fixed points in the dynamics (although only in the mean field approximation). For the specific case of Equation (2), however, it is necessary to go beyond this approximation, as each memristor will (intuitively) have a different fixed point associated with it. This phenomenon, which is circuit dependent, is associated with the structure of the couplings $\Omega_{ij}$, and will be studied elsewhere. In any case, for large times the Lyapunov function can be approximated by ($W_j^n \to W_j$)
$$L(\vec W_{t\gg1}) \approx -\left[\frac{\alpha}{2}\sum_i W_i + \frac{\alpha\xi}{3}\sum_i \Omega_{ii}\, W_i + \alpha\xi\sum_{i\neq j}\Omega_{ij}\, W_i W_j - \frac{1}{\beta}\sum_{ij} W_i\,\Omega_{ij}\, S_j\right] \equiv L_a(\vec W) = -\left[\sum_i W_i\left(\frac{\alpha}{2} + \frac{\alpha\xi}{3}\Omega_{ii} - \frac{1}{\beta}\sum_j \Omega_{ij} S_j\right) + \alpha\xi\sum_{i\neq j}\Omega_{ij}\, W_i W_j\right],$$
where we have used the fact that $W_i^n = W_i$ if $W_i = 1, 0$. An effective external field $h_i = \frac{\alpha}{2} + \frac{\alpha\xi}{3}\Omega_{ii} - \frac{1}{\beta}\sum_j \Omega_{ij} S_j$ emerges in this asymptotic approximation. We test numerically both the Lyapunov function of Equations (A1) and (6) in Figure 1: in Figure 1a we plot the evolution of each memory element, while in Figure 1b we plot the evolution of both the Lyapunov function and its asymptotic approximation, which remains close to the exact one. In Figure 2 we show the difference between the obtained Lyapunov function and the asymptotic one as a function of time. This mapping is reminiscent of the continuous neural networks introduced by Little [17] and then Hopfield in a series of important papers [18,19]. The Little-Hopfield model has sparked interest in the statistical physics community since the very beginning [20]. In the past years this particular line of study has been subject to scrutiny by many experimental groups [21,22,23]. The key difference is that in each of those cases, memristors were studied in conjunction with ordinary and active (CMOS) electronic components to build Hopfield learning networks. We argue instead that memristive circuits per se form a special kind of Hopfield network defined by the Lyapunov function above, without the need for extra components.
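Since at a binary corner of the hypercube all powers collapse ($W_i^n = W_i$), the full $L$ and its asymptotic form $L_a$ coincide exactly there. A quick numerical check (same illustrative five-edge circuit as before; parameter values hypothetical):

```python
import numpy as np

A = np.array([[1, 1, 0, 0, -1], [0, 0, 1, 1, 1]], float)
Omega = A.T @ np.linalg.inv(A @ A.T) @ A
rng = np.random.default_rng(2)
N = Omega.shape[0]
S = rng.normal(size=N)
alpha, beta, xi = 0.3, 1.0, 0.7
off = Omega - np.diag(np.diag(Omega))

def L_full(W):
    # Eq. (3): the cubic Lyapunov function
    return (-alpha / 2 * np.sum(W ** 2) - alpha * xi / 3 * np.diag(Omega) @ W ** 3
            - alpha * xi * W @ off @ W ** 2 + W @ Omega @ S / beta)

def L_asym(W):
    # Eq. (6): asymptotic (spin-like) form with the effective field h_i
    h = alpha / 2 + alpha * xi / 3 * np.diag(Omega) - Omega @ S / beta
    return -(W @ h + alpha * xi * W @ off @ W)

W_bin = rng.integers(0, 2, size=N).astype(float)
print(np.isclose(L_full(W_bin), L_asym(W_bin)))  # True: exact at the corners
```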

Functional Complexity for Random Circuits

We now study random circuits, in particular the functional complexity of Equation (6). First, this will allow us to show an interesting connection to the field of disordered systems and, within this approximation, to provide estimates of the complexity of the Lyapunov function. In fact, minimizing a quadratic function of discrete variables is in general a hard problem. We are, however, in the situation in which the problem at hand is continuous both in time and in the variables (these lying between 0 and 1), yet naturally provides an answer to the more complicated problem usually associated with finding the ground state of a frustrated spin system. Although in the present paper we are not able to provide a complete answer to the question of which optimization class a system of purely memristive circuits belongs to, we attempt some answers to the questions above using techniques introduced to study random polynomials [24,25]. This will be important in light of the fact that we have shown that memristors perform a local optimization, rather than a global one, and that the circuit constraints should enter somehow.
The number of stationary points of a generic multivariate function $L(\vec w)$ can be estimated by:
$$\# = \int_{[0,1]^N} \left|\det\left(\partial^2_{w_i w_j} L(\vec w)\right)\right|\,\prod_i \delta\!\left(\partial_{w_i} L(\vec w)\right)\, d\vec w.$$
Thus, if we aim to compute the expected number of stationary points $\langle\#\rangle$, it makes sense to consider the (quenched) average $\langle\cdot\rangle_{P(\Omega)}$ of the quantity above with respect to the distribution $P(\Omega)$ of the coupling matrix $\Omega$. One important observation is that, because $\Omega$ is a projector, $\det\left(\partial_{W_i}\partial_{W_j} L(\vec W)\right)$ can be calculated exactly without knowledge of the element distribution, using only the fact that $\Omega$ is a projector (details in the Appendix B). We find that the average number of stationary points can be split in the form:
$$\langle\#\rangle = (3\alpha\xi)^N\left(1 - \frac{1}{N}\right)^L \langle Z(\vec S, Q)\rangle,$$
where L is the number of fundamental loops of the circuit, and where:
$$Z(\vec S, P(\Omega)) = \left\langle \int_{[0,1]^N} \prod_i \delta\!\left(\partial_{w_i} L(\vec w)\right)\, d\vec w \right\rangle_{P(\Omega)}$$
depends on the distribution of the elements of the matrix Ω .
We first need, then, to understand the distribution $P(\Omega)$ for random circuits. Since the analytical calculation of this distribution goes beyond the scope of this paper, we perform the analysis numerically. We generate random circuits from Erdos-Renyi graphs above the percolation threshold ($p = 0.7$). We are interested, in particular, in the scaling with respect to the number of memristors $N$. For fixed $\Omega$, the distribution $P(\Omega)$ of off-diagonal elements is shown in Figure 3a. Although this distribution seems to be unimodal, a careful look shows that it is not: we observe in Figure 3b that there is a non-zero probability of having elements not distributed around zero, but these are orders of magnitude smaller than the central distribution. While it is not surprising that the distribution is not completely random (thus, there are correlations among elements), since the elements are constrained by $\Omega^2 = \Omega$, it is surprising that we can roughly approximate this distribution with a unimodal one. We are in particular interested in how the width of the distribution scales with the number of memristors. This can be calculated exactly if we know the scaling of the diagonal elements, as we will see below. In Figure 4 we show how the diagonal elements of $\Omega$ scale with $N$. We first fit the diagonal elements, precisely $1 - \Omega_{ii}$, showing that they scale as a power law to a good approximation. If we perform a fit using the functional form $\Omega_{ii} \approx 1 - c\,N^{-\alpha}$, we obtain the best-fit values $c = 1.74557 \approx \sqrt{3}$ and $\alpha = 0.5043 \approx 1/2$. We will use these values in what follows. It is important to note that the scaling above is sufficient to obtain a scaling for the off-diagonal elements as well. We define the matrix of off-diagonal elements $G$ via $\Omega \approx \left(1 - \sqrt{\frac{3}{N}}\right) I + G$. Since $\Omega^2 = \Omega$, it is easy to see that $G$ must satisfy the equation $G^2 + \left(1 - 2\sqrt{\frac{3}{N}}\right) G + \sqrt{\frac{3}{N}}\left(\sqrt{\frac{3}{N}} - 1\right) I = 0$.
For $N \gg 1$ this implies that $G$ asymptotically satisfies $G^2 = -G$. We can at this point solve the eigenvalue equation for $G$, and obtain $\lambda = \left\{\sqrt{\frac{3}{N}},\ -1 + \sqrt{\frac{3}{N}}\right\}$. Since $\Omega$ must remain a projector for large values of $N$, we can write $G = \sqrt{\frac{3}{N}}\, Q$, with $Q$ approximately a projector. Thus, in the scaling approximation obtained for random Erdos-Renyi-type circuits, we have $\Omega_{ii} \approx 1$, while $\Omega_{ij} \approx \sqrt{\frac{3}{N}}\, Q_{ij}$. This is the scaling we will use in the following.
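The scaling above can be reproduced numerically. The sketch below generates an Erdos-Renyi circuit, builds a fundamental cycle basis from a BFS spanning tree (one loop per chord, a standard construction not spelled out in the paper), and checks the projector property together with the fact that the mean of $1-\Omega_{ii}$, which equals $(V-1)/N$ exactly by the trace identity $\operatorname{tr}\Omega = L$, decays as $N^{-1/2}$. Graph size and seed are illustrative.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(3)

def circuit_projector(V, p=0.7):
    # Erdos-Renyi circuit above percolation; cycle matrix from a BFS spanning
    # tree (one fundamental loop per chord), then Omega = A^t (A A^t)^{-1} A.
    edges = [(u, v) for u in range(V) for v in range(u + 1, V) if rng.random() < p]
    N = len(edges)
    adj = {u: [] for u in range(V)}
    for i, (u, v) in enumerate(edges):
        adj[u].append((v, i)); adj[v].append((u, i))
    x = {0: np.zeros(N)}               # signed root-to-vertex tree paths
    tree, queue = set(), deque([0])
    while queue:                       # BFS; assumes the graph is connected
        u = queue.popleft()
        for v, i in adj[u]:
            if v not in x:
                step = np.zeros(N); step[i] = 1.0 if u == edges[i][0] else -1.0
                x[v] = x[u] + step; tree.add(i); queue.append(v)
    rows = []
    for i, (u, v) in enumerate(edges):
        if i not in tree:              # chord u->v closes one fundamental loop
            row = x[u].copy(); row[i] += 1.0; row -= x[v]
            rows.append(row)
    A = np.array(rows)
    return A.T @ np.linalg.inv(A @ A.T) @ A, N, len(rows)

Omega, N, L = circuit_projector(30)
print(np.allclose(Omega @ Omega, Omega))   # projector on the loop space
d = 1.0 - np.mean(np.diag(Omega))          # mean of 1 - Omega_ii = (V-1)/N
print(round(d * np.sqrt(N), 2))            # O(1): the ~ N^(-1/2) scaling
```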
While in what follows we show only the intermediate results, all the exact calculations are provided in the Appendix B. We can at this point calculate the function $Z$ defined above for random circuits. We assume that $P(Q_{ij}) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{Q_{ij}^2}{2\sigma^2}}$, where $\sigma$ is an effective width for $N = 1$. In this case, calculating $Z$ can be done by means of Gaussian integrals, and we obtain, in the limit $\alpha\xi \gg 1$:
$$\langle\#\rangle \approx \left(1 - \frac{1}{N}\right)^L \left(\sqrt{\frac{3}{\pi}}\,\sigma\right)^N.$$
We now use Euler's relation $L = N - V + \chi$, where $V$ is the number of vertices and $\chi$ the Euler characteristic, together with the Erdos-Renyi relation $N = p(V)\frac{V(V-1)}{2}$, where $p(V)$ is the edge probability. Since $\chi > 0$ is purely topological, $L$ scales linearly in the number of memristors for large $N$. We now observe that, defining $\rho = \sqrt{\frac{3}{\pi}}\,\sigma$, in the limit $N \to \infty$ the count $\langle\#\rangle$ diverges for $\rho > 1$, while it vanishes for $\rho < 1$, since $\lim_{N\to\infty}\left(1 - \frac{1}{N}\right)^N = e^{-1}$ is finite. We thus obtain the result that for $\sigma > \sqrt{\frac{\pi}{3}}$ the Lyapunov function has a large number of stationary points, a hint of the hardness of the minimization problem.
It is easy at this point to provide an approximate mapping between the asymptotic Lyapunov function and the Hamiltonian of a mean-field spin glass of the Sherrington-Kirkpatrick type with an external field:
$$L(\vec\sigma) = -\sum_i \tilde h_i\,\sigma_i - \frac{\alpha\xi}{4}\sqrt{\frac{3}{N}}\sum_{i\neq j} Q_{ij}\,\sigma_i\sigma_j,$$
where we used the mapping $W_i = \frac{1}{2}\sigma_i + \frac{1}{2}$ with $\sigma_i = \pm 1$ and $\tilde h_i = \frac{h_i}{2} + \frac{\alpha\xi}{2}\sqrt{\frac{3}{N}}\sum_{j\neq i} Q_{ij}$, and disregarded an unimportant constant which only shifts the function. The result above is a generalization of what was found in Reference [26] in the limit $\xi \ll 1$, for example, the fact that the dynamics of memristors follows a constrained gradient descent (Rosen projections). Moreover, it provides a physical system with which to test experimentally the physics of mean field spin glasses [27].
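The spin mapping can be verified directly. The sketch below rewrites $L_a$ in terms of $\sigma_i = \pm 1$ using the couplings $\Omega_{ij}$ themselves (the $\sqrt{3/N}\,Q_{ij}$ form is the random-circuit approximation of the off-diagonal part); the effective field and the constant shift follow the derivation in the text, and the two expressions agree exactly. Circuit and parameters are illustrative.

```python
import numpy as np

A = np.array([[1, 1, 0, 0, -1], [0, 0, 1, 1, 1]], float)
Omega = A.T @ np.linalg.inv(A @ A.T) @ A
rng = np.random.default_rng(4)
N = Omega.shape[0]
S = rng.normal(size=N)
alpha, beta, xi = 0.3, 1.0, 0.7
off = Omega - np.diag(np.diag(Omega))
h = alpha / 2 + alpha * xi / 3 * np.diag(Omega) - Omega @ S / beta

def L_asym(W):
    # Eq. (6) in the W variables
    return -(W @ h + alpha * xi * W @ off @ W)

def L_spin(sigma):
    # Same function after W_i = (sigma_i + 1)/2: field + pairwise coupling
    h_t = h / 2 + alpha * xi / 2 * off.sum(axis=1)       # effective field
    const = -h.sum() / 2 - alpha * xi / 4 * off.sum()    # constant shift
    return const - h_t @ sigma - alpha * xi / 4 * sigma @ off @ sigma

sigma = rng.choice([-1.0, 1.0], size=N)
print(np.isclose(L_asym((sigma + 1) / 2), L_spin(sigma)))  # True
```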

3. Asymptotic State Recollection and Combinatorial Optimization

One important question for memristive circuits is what asymptotic state they reach. As we have argued, this can be answered by looking at the minima of the Lyapunov function for the case of constant voltages. Although this is a hard problem to solve in its full generality, we can use some approximate analytical formulae to provide an answer at least in some region of the parameters. First, we consider the case $\alpha = 0$. It is easy to see that we can integrate the equation [26] to obtain:
$$\vec W + \frac{\xi}{2}\,\Omega\,\vec W^2 = -\frac{1}{\beta}\,\Omega\vec S\,(t - t_0) + \vec c,$$
where $\vec c$ is an integration constant. Equation (12) is a unimodular quadratic vector equation, a special case of vector equations of the form $\vec x + b(\vec x, \vec x) = \vec c$. This equation does not have an exact closed-form solution, but several numerical methods have been developed [28]. Here we use a heuristic method to identify the asymptotic values of $\vec W$. Again we use the fact that, in the limit $t \to \infty$, we observe numerically that the asymptotic values of $\vec W$ are either 0 or 1. For $\xi = 0$, we have $\vec W(t=\infty) = \frac{1 - \operatorname{sign}(\Omega\vec S)}{2}$. For $\xi > 0$ and in the asymptotic limit, $\lim_{t\to\infty}\vec W^2 \approx \vec W$. Using this approximation, we obtain:
$$\left(I + \frac{\xi}{2}\,\Omega\right)\vec W = -\frac{1}{\beta}\,\Omega\vec S\,(t - t_0) + \vec c.$$
We can then obtain the asymptotic formula:
$$\vec W(t=\infty) \approx \frac{1 - \operatorname{sign}\!\left(\left(I + \frac{\xi}{2}\Omega\right)^{-1}\Omega\vec S\right)}{2} = \frac{1 - \operatorname{sign}(\Omega\vec S)}{2},$$
where in the last equality we have used the fact that $1 + \xi/2 > 0$ if $0 < R_{on} < R_{off}$. We note, however, that the accuracy of the asymptotic formula depends only on $\xi$ and not on $\alpha$. The results on how well Equation (13) predicts the asymptotic behavior of the circuit as a function of $\xi$ are shown in Figure 5. Despite the heuristic nature of the method and the approximation, the obtained values are a good approximation of the real system for small values of $\xi$. Using the formula above, we now provide the connection with the state recollection of neural networks. We note that the projection operator $\Omega$ can be written as [16] $\Omega_{ij} = \sum_{l=1}^L \tilde A_{il}\tilde A_{jl}$, where $\tilde A$ is the orthonormalized loop matrix of the circuit. From the theory of neural networks we know that the number of stored patterns equals the number of independent eigenvectors of the interaction matrix $\Omega$, which is purely topological, as argued in Reference [26]. In the case of a purely memristive circuit, the number of independent memory units is thus constrained by the topology of the circuit and depends on the number of fundamental loops [26]. As we have noted before, the asymptotic behavior of the circuit can be approximated by (where we reinstate $\alpha$)
$$\vec W(t=\infty) \approx \frac{1 - \operatorname{sign}\!\left(\left((1-\alpha)\, I + \frac{\xi}{2}\Omega\right)^{-1}\Omega\vec S\right)}{2},$$
where we see that the source vector $\vec S$ can act as the recollection mechanism. If $\vec S = \sum_l \rho_l \tilde A^l$ for certain proportionality constants $\rho_l$, then $(\Omega\vec S)_i = \sum_l \rho_l \tilde A_{il}$. Thus, for a single stored pattern $\vec S = \rho\,\tilde A^l$, the asymptotic configuration of the memristors will be given by the approximate value:
$$\vec W(t=\infty) \approx \frac{1 - \operatorname{sign}\!\left(\left((1-\alpha)\, I + \frac{\xi\rho}{2}\right)^{-1}\tilde A^l\right)}{2} = \frac{1 - \operatorname{sign}\!\left(\tilde A^l\right)}{2},$$
which shows that stored patterns are indeed in the loop basis.
The analysis above is reminiscent of the asymptotic state recollection in the Hopfield-Little model.
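The recollection formula can be tested in the simplest regime, $\alpha = 0$ and $\xi = 0$, where the dynamics reduces to $\frac{d\vec W}{dt} = -\frac{1}{\beta}\Omega\vec S$ and the prediction $\vec W(\infty) = \frac{1-\operatorname{sign}(\Omega\vec S)}{2}$ is exact. The sketch below uses the same illustrative five-edge circuit; marginal components with $(\Omega\vec S)_i \approx 0$ are excluded from the comparison.

```python
import numpy as np

A = np.array([[1, 1, 0, 0, -1], [0, 0, 1, 1, 1]], float)
Omega = A.T @ np.linalg.inv(A @ A.T) @ A
rng = np.random.default_rng(5)
N = Omega.shape[0]
S = rng.normal(size=N)
beta, dt = 1.0, 1e-2

W = rng.uniform(size=N)
for _ in range(100000):                      # dW/dt = -(1/beta) Omega S
    W = np.clip(W - dt * Omega @ S / beta, 0.0, 1.0)

predicted = (1.0 - np.sign(Omega @ S)) / 2.0
mask = np.abs(Omega @ S) > 1e-2              # skip marginal (near-zero) drives
print(np.allclose(W[mask], predicted[mask]))  # True
```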
We now ask the converse question of how close the asymptotic states are to the minima obtained from the Lyapunov functional. As we have seen in the previous sections, memristors can be used to provide solutions (possibly sub-optimal) to hard combinatorial problems. This claim should be taken, if not with skepticism, at least with some care. First we note that $L(\vec W)$ is not positive definite (although it is bounded), and thus we have an asymptotically local stable equilibrium rather than a global one. We are interested in providing some benchmarks of how well memristors find the minimum of the function $L_a(\vec W)$ compared to other optimization techniques. Given 100 samples of random circuits based on Erdos-Renyi random graphs with $p = 0.7$, in Figure 6 we show the energy attained using memristors in 60 time steps (red dots), using a Metropolis algorithm with exponential annealing over 8000 steps (blue), and using the minimum of 100 random configurations as a reference (blue dots). We see that, although the memristors do not seem to reach the absolute minimum, the energy is closer to the Metropolis result than to the random-configuration one. Moreover, it took fewer than 100 iterations for the memristive system to converge using a simple first-order Euler integration. This already shows that, as anticipated, memristors perform a local optimization in this regime. In the next section, however, we apply the algorithm to a specific dataset for which this optimization has some advantage over exponential annealing.

4. An Application: Minimization of a Quadratic Function

We have argued that the dynamics of memristors, when these are connected in a circuit, can be thought of as reaching an asymptotic state which is among the minima of a certain Lyapunov function. Although real memristors can have dynamics far from ideal, we would like to consider whether they could in principle be used for purposes beyond electronics. In fact, note that Equation (2) can be used as a quick meta-heuristic to search for the minimum of a function, or its output can be given as input to other, more sophisticated optimization techniques. In realistic circuits, the quadratic-form matrix $\Omega$ which occurs in the Lyapunov function is a projector on the cycle space of the graph. Thus, the problems for which analog memristive circuits can be used naturally are those that can be embedded in the cycle basis of a graph [29]. However, in principle we can simulate the system also for matrices $\Omega$ that are not necessarily projectors, as we have shown and as we discuss further below. We can thus perform a preliminary test of the applicability of analog memristive circuits to an optimization problem and see how the circuit performs compared (for instance) to simulated annealing.
We thus consider a simple application of practical importance, namely the problem of investment in a set of assets. For this purpose, we use the Nikkei 225 dataset, which is used for benchmarking heuristic optimization algorithms [30] and is available in Reference [31]. We can in fact use a memristor-inspired optimization scheme, taking advantage of the fact that it works beyond projector operators $\Omega$. We mention that the proof of the Lyapunov function relies only on the matrix $I + \xi\,\frac{\Omega W + W\Omega}{2}$ being positive definite, and thus applies also to any positive matrix and to some non-positive ones (see Appendix A for details) for $\xi$ small enough. Although we cannot use these in a hardware circuit, we can solve the differential equation numerically to infer a minimum heuristically. Using this idea, we ask in which of the assets of the portfolio we should invest (a yes-no answer). The dataset is composed of 225 assets, including returns and the covariance matrix; we can thus use the combinatorial Markowitz functional [32]. Using the fact that maximizing a quadratic function is equivalent to minimizing the same function with a minus sign in front, we can write an equality between the Markowitz functional for binary variables and the memristive one:
$$M(\vec W) = \sum_i \left(r_i - \frac{p}{2}\Sigma_{ii}\right) W_i - \frac{p}{2}\sum_{i\neq j} W_i\,\Sigma_{ij}\, W_j = -L_a(\vec W) = \sum_i W_i\left(\frac{\alpha}{2} + \frac{\alpha\xi}{3}\Omega_{ii} - \frac{1}{\beta}\sum_j \Omega_{ij} S_j\right) + \alpha\xi\sum_{i\neq j}\Omega_{ij}\, W_i W_j,$$
where $p$ is a trade-off parameter, $\vec r$ are the returns, and $\Sigma$ is the covariance matrix. Imposing the equality between the two functionals term by term, we observe the necessity of imposing $\Omega = -\Sigma$ (so that $\alpha\xi = p/2 > 0$) and thus
$$\frac{\alpha}{2} + \frac{\alpha\xi}{3}\Omega_{ii} - \frac{1}{\beta}\sum_j \Omega_{ij} S_j = r_i - \frac{p}{2}\Sigma_{ii}, \qquad \frac{p}{2} = \alpha\xi.$$
We can thus obtain the source vector needed to force the system towards the right minimum:
$$\vec S = -\beta\,\Sigma^{-1}\left[\frac{\alpha}{2}\,\vec 1 + \left(\frac{p}{2} - \frac{\alpha\xi}{3}\right)\vec\eta - \vec r\right],$$
where $\eta_i = \Sigma_{ii}$, i.e., the variance of each return. We have the freedom to choose $\beta$ arbitrarily, but then either $\xi$ or $\alpha$ is fixed. We observe that we need to invert the covariance matrix, which is a slow but polynomial operation in the number of variables. Moreover, we see that the harder the problem to solve (the larger the norm of the inverse of $\Sigma$), the smaller $\beta$ has to be chosen, from which we infer the slowness of the process.
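The embedding can be checked algebraically on toy data: with $\Omega = -\Sigma$, $p = 2\alpha\xi$ and the source vector above, $-L_a(\vec W)$ reproduces the Markowitz objective $M(\vec W)$ at every binary configuration. The returns and covariance below are synthetic (not the Nikkei data), and the sign conventions follow the derivation in the text.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 6
B = rng.normal(size=(n, n))
Sigma = B @ B.T + n * np.eye(n)            # a positive-definite toy covariance
r = rng.normal(size=n)                     # toy expected returns
alpha, beta, xi = 0.4, 1.0, 0.25
p = 2 * alpha * xi                         # trade-off parameter fixed by p/2 = alpha*xi
eta = np.diag(Sigma)

# Source vector of Eq. (17): S = -beta * Sigma^{-1} [alpha/2 + (p/2 - alpha*xi/3) eta - r]
S = beta * np.linalg.solve(Sigma, r - alpha / 2 - (p / 2 - alpha * xi / 3) * eta)

Omega = -Sigma
off = Omega - np.diag(np.diag(Omega))
h = alpha / 2 + alpha * xi / 3 * np.diag(Omega) - Omega @ S / beta

def M(W):
    # Markowitz objective with binary weights
    off_S = Sigma - np.diag(np.diag(Sigma))
    return (r - p / 2 * eta) @ W - p / 2 * W @ off_S @ W

def minus_L_asym(W):
    # -L_a(W), the function the memristive dynamics maximizes asymptotically
    return W @ h + alpha * xi * W @ off @ W

W = rng.integers(0, 2, size=n).astype(float)
print(np.isclose(M(W), minus_L_asym(W)))   # True
```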
We are now in the position to test the memristive minimization against an exponential annealing process, which is known to perform poorly in the combinatorial setting. The results for the Nikkei 225 dataset are shown in Figure 7, comparing the Metropolis annealing and the memristive optimization both in computational time and in the final value obtained. We performed three different tests. First, we randomized the initial states and performed 100 simulated annealing procedures [33], with initial temperature $T_0 = 100$, an exponential annealing rate $\lambda = 0.995$ ($T_k = \lambda^k T_0$) and a number of time steps $N_T = 500\,N$, where $N$ is the number of assets. On the Nikkei 225 dataset, the maximum obtained with simulated annealing was $r_{max} \approx 9.15$. With identical initial states, the memristive optimization obtained a maximum of $r_{max} \approx 14.23$. We then used the final state of the memristive optimization as input to an annealing procedure with a lower initial temperature $T_0 = 0.025$, thus effectively fast-forwarding the annealing procedure. This combined optimization obtained the absolute maximum of $r_{max} \approx 14.40$, albeit less than one percent better than the memristive optimization alone.

5. Concluding Remarks

In the present paper we have analyzed the asymptotic dynamics of memristive circuits, showing that an approximate Lyapunov function for the dynamics exists for memristors with slow decay. Because of the properties of memristors, we have argued that asymptotically the Lyapunov function of a memristive circuit can be written as a quadratic function of spin-like variables, and we have connected the coupling matrix to the projector operator on the loop space of the graph [34]. This result shows a direct connection between purely memristive circuits and Hopfield networks, which we argue minimize a similar type of Lyapunov function. The internally stored patterns are in this case the cycles of the graph. This is interesting for various reasons. Thus far it had been argued that a certain degree of external control was necessary in order to perform computation or use memristors as a memory device, either via the use of CMOS or the introduction of capacitors into the network. This paper provides sufficient evidence for the use of memristors in their completely analog regime for computational purposes. This comes at a cost, which is the embedding of the problem of interest in the cycle basis of a graph, a rather nontrivial problem when it is solvable at all.
The Lyapunov function for the dynamics of memristive circuits can then be used for various purposes. We have first focused on the complexity of these functions in the case of random circuits. We have provided numerical evidence and argued that the couplings scale as in the case of mean-field spin glasses, that is, the Sherrington-Kirkpatrick model, although only when neglecting correlations between matrix elements at lowest order. Using this approximation we have provided approximate formulae for the number of stationary points, given some topological considerations. This result explains the importance of the initial conditions for the asymptotic state of the memristors. In fact, it is known (for instance in the case of glassy systems) that a system with a large set of local minima has a large sensitivity to the initial conditions.
As a test bed for these ideas, we have used the memristor dynamics to see how well (or badly) the dynamics performs in the minimization of a certain quadratic functional. We have compared the minimization of a combinatorial problem using both simulated annealing and the memristive dynamics introduced in this paper. Although the memristor dynamics could not reach a global minimum, the speed of convergence and the closeness to the Metropolis result suggest the use of this mixed dynamical-combinatorial algorithm to provide quick answers to combinatorial problems on spin-like variables (or alternatively 0-1 variables, thus a quadratic unconstrained binary optimization). We do not claim these answers to be optimal, but we find it nonetheless interesting that a rather simple procedure such as ours performed remarkably well compared to simulated annealing.
The main result of this paper, however, remains the Lyapunov functional for the study of the dynamics of memristors, used to understand the relaxation properties of these circuits (at least when they are controlled with constant voltage). We have shown a direct connection between optimization and memristive circuits, as advocated by other authors, for instance [35,36,37]. It is inspiring to think that the minimization of the functional depends on the balance between the reinforcement and decay properties of the memristive dynamics, along the lines of other similar algorithms [38] which use collective dynamics as heuristic optimization methods.
Thus, this paper provides needed background work to understand the dynamics of circuits with memory, their sensitivity to initial conditions and their use for computational purposes in the fully analog regime.

Funding

This work was supported by the US Department of Energy through the Los Alamos National Laboratory (Grant 20170660PRD1). Los Alamos National Laboratory is operated by Triad National Security, LLC, for the National Nuclear Security Administration of U.S. Department of Energy (Contract No. 89233218CNA000001).

Acknowledgments

I would like to thank J. P. Carbajal, C. Coffrin, M. Vuffray and A. Lokhov for various comments on the results of this paper. In particular, I appreciated conversations on this same topic with M. Di Ventra, F. L. Traversa and F. Sheldon during my visit to UCSD in 2015. I would like to thank F. Sheldon in particular for pointing out some problems with the original proof of the Lyapunov function.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A. Properties of L(W)

Appendix A.1. Derivation of L(W)

The Lyapunov functional provided in the paper is given by:
$$L(W) = -\frac{\alpha}{2}\sum_i W_i^2 - \frac{\alpha\xi}{3}\sum_i \Omega_{ii} W_i^3 - \alpha\xi\sum_{i\neq j}\Omega_{ij} W_i W_j^2 + \frac{1}{\beta}\sum_{ij} W_i \Omega_{ij} S_j.$$
In order to derive the functional above, we use the differential equation for the memristive dynamics:
$$(I + \xi\Omega W)\frac{dW}{dt} = (I + \xi\Omega W)\,\alpha W - \frac{1}{\beta}\Omega S,$$
for the case of constant voltages S . At this point we can obtain the functional form of the derivative of the Lyapunov function:
$$\frac{d}{dt}L(W) = \partial_t L(W) + \sum_i \delta_{w_i}L(W)\,\frac{dW_i}{dt} = -\sum_i \frac{dW_i}{dt}\left[(I+\xi\Omega W)\,\alpha W - \frac{1}{\beta}\Omega S\right]_i + M\cdot\frac{dW}{dt} = -\sum_{ij}\frac{dW_i}{dt}\left(I+\xi\Omega W\right)_{ij}\frac{dW_j}{dt} + M\cdot\frac{dW}{dt} = -\left\|\frac{dW}{dt}\right\|^2_{(I+\xi\Omega W)} + M\cdot\frac{dW}{dt},$$
where we assumed that $\partial_t L(W) = 0$, as $S$ is constant in time. We also introduced the quantity $M_i = -2\alpha\xi\sum_{j\neq i} W_j\Omega_{ji}W_i$. The variation of the functional is carried out in detail in the subsection below. First we prove that $I+\xi\Omega W$ is a positive definite matrix. By matrix similarity, $I+\xi\Omega W$ has the same eigenvalues as $I+\xi\sqrt{W}\,\Omega\sqrt{W}$, and given a vector $b$ we have $b^t(I+\xi\sqrt{W}\Omega\sqrt{W})b = \|b\|^2 + \xi\|\sqrt{W}b\|^2_\Omega$. Since $\Omega$ is a projector and thus positive semidefinite, for any non-zero vector $b$ we have $b^t(I+\xi\sqrt{W}\Omega\sqrt{W})b > 0$. We now observe that the first term in Equation (A4) is always negative, while the second term is always positive, since the inverse of a positive operator is itself positive. However, while the first term does not scale with any parameter, the second term scales roughly as $\alpha^2\xi^2$. Also, the minimum eigenvalue of the operator $I+\xi\Omega W$ is of order one, which implies that the sup norm of its inverse is also of order one, for arbitrary values of $\xi$. We note that, up to this point, this is enough to prove that if $\Omega$ is diagonal the second term disappears, since $M = 0$. This completes the proof that $\frac{d}{dt}L(W)\leq 0$ for the case of $\Omega$ diagonal, which is trivial. Also, note that we have $\frac{d}{dt}L(W) = 0$ if and only if $\frac{dW}{dt} = 0$. This confirms that $L(W)$ is a Lyapunov function in this regime, and thus it applies also to weakly interacting memristors.
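The positive-definiteness argument can be illustrated numerically. The sketch below uses a random rank-$L$ projector as a stand-in for the circuit projector $\Omega$ (the rank, size and $\xi$ are arbitrary test values) and checks that the spectrum of $I+\xi\sqrt{W}\Omega\sqrt{W}$ lies at or above one:

```python
import numpy as np

rng = np.random.default_rng(1)
N, L, xi = 40, 15, 10.0

# random rank-L projector (stand-in for the cycle-space projector Omega)
V, _ = np.linalg.qr(rng.normal(size=(N, L)))
Omega = V @ V.T

W = rng.uniform(0.0, 1.0, size=N)            # memory values in [0, 1]
sqW = np.diag(np.sqrt(W))
# similar to I + xi * Omega * diag(W), hence same eigenvalues
sym = np.eye(N) + xi * sqW @ Omega @ sqW

eigs = np.linalg.eigvalsh(sym)
print(eigs.min())                            # bounded below by 1
```

Since $\xi\sqrt{W}\Omega\sqrt{W}$ is positive semidefinite, every eigenvalue of the symmetrized operator is at least one, which is the claim used in the proof.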
We are interested in the more general case of arbitrary Ω . For this purpose, we consider the following bounds. We have
$$-\left\|\frac{dW}{dt}\right\|^2_{I+\xi\Omega W} + \frac{dW}{dt}\cdot M \leq -\left\|\frac{dW}{dt}\right\|^2 + \left\|\frac{dW}{dt}\right\|\cdot\|M\| = -\left\|\frac{dW}{dt}\right\|\left(\left\|\frac{dW}{dt}\right\| - \|M\|\right),$$
where we used the Cauchy-Schwarz inequality and the conservative lower bound $\left\|\frac{dW}{dt}\right\|^2_{I+\xi\Omega W} \geq \left\|\frac{dW}{dt}\right\|^2$. We know already that if $\frac{d}{dt}W = 0$, then $\frac{d}{dt}L = 0$, so let us focus on $\frac{d}{dt}W \neq 0$. A conservative bound then requires controlling $\sup_W \|M\|$ and $\min_W \left\|\frac{d}{dt}W\right\|$. We have shown already that if $\Omega_{ij}$ is diagonal the proposed Lyapunov function has negative derivative, so we can afford a very loose upper bound. We now have the following bound on $\|M\|$:
$$\|M\| = 2\alpha\xi\sqrt{\sum_i\sum_{j\neq i}\sum_{k\neq i} W_i^2\,\Omega_{ij}W_j\,\Omega_{ik}W_k} \leq 2\alpha\xi\sqrt{\sum_{j\neq i,\,k\neq i}\Omega_{jk}^2} \leq 2\alpha\xi\,\bar\Omega\, N,$$
where Ω ¯ = sup | Ω i j | is the maximum absolute value of the elements of Ω i j . On the other hand, we have
$$\min_W\left\|\frac{d}{dt}W\right\| = \min_W\left\|\alpha W - \frac{1}{\beta}(I+\xi\Omega W)^{-1}\Omega S\right\| = \min_W\sqrt{\alpha^2\|W\|^2 + \frac{1}{\beta^2}\left\|(I+\xi\Omega W)^{-1}\Omega S\right\|^2 - \frac{2\alpha}{\beta}\,W\cdot(I+\xi\Omega W)^{-1}\Omega S} \geq \sqrt{\alpha^2\min_W\|W\|^2 + \frac{1}{\beta^2}\min_W\left\|(I+\xi\Omega W)^{-1}\Omega S\right\|^2 - \frac{2\alpha}{\beta}\max_W W\cdot(I+\xi\Omega W)^{-1}\Omega S} = \sqrt{\frac{1}{\beta^2}\left\|(I+\xi\Omega)^{-1}\Omega S\right\|^2 - \frac{2\alpha}{\beta}\max_W W\cdot(I+\xi\Omega W)^{-1}\Omega S}.$$
Now, since $\Omega$ is a projector, we have $(I+\xi\Omega)^{-1}\Omega = \sum_{j=0}^{\infty}(-1)^j\xi^j\Omega^{j+1} = \sum_{j=0}^{\infty}(-1)^j\xi^j\Omega = \frac{\Omega}{1+\xi}$. Moreover,
$$\max_W W\cdot(I+\xi\Omega W)^{-1}\Omega S \leq N(1+\xi)^{-1}\|\Omega S\|,$$
and thus
$$\min_W\left\|\frac{d}{dt}W\right\| \geq \sqrt{\frac{1}{1+\xi}\left[\frac{1}{\beta^2}\|\Omega S\|^2 - \frac{2\alpha}{\beta}N\|\Omega S\|\right]}.$$
We thus find that if
$$2\alpha\xi\,\bar\Omega\, N - \sqrt{\frac{1}{1+\xi}\left[\frac{1}{\beta^2}\|\Omega S\|^2 - \frac{2\alpha}{\beta}N\|\Omega S\|\right]} < 0,$$
the Lyapunov function we introduced always has a negative derivative. Let us now define $\|\Omega S\|^2 = N^2 s^2(N)$. We can rewrite the inequality as
$$4\xi^2(1+\xi)\,\bar\Omega^2 < \frac{s(N)^2}{\alpha^2\beta^2} - \frac{2\,s(N)}{\alpha\beta}.$$
Because $s(N)\sim O(1)$ in the physical case (e.g., a single voltage element does not scale with the size of the system), the bound above gives a meaningful relationship between the nonlinearity parameter $\xi$ and the mean-field dynamical parameter $\frac{s}{\alpha\beta}$, which had already been introduced in Reference [15] in a mean-field toy model of memristive dynamics. It is interesting to note that if we define $q = \frac{s}{\alpha\beta}$, the bound above forms a parabola, via an inequality of the form
$$-q^2 + 2q + c < 0, \qquad c = 4\xi^2(1+\xi)\,\bar\Omega^2,$$
which characterizes the area below the parabola. Thus, via the bound, we see that below the parabola the system evolves into a minimum of the Lyapunov function.
This shows what we had anticipated: the Lyapunov function we defined has a negative derivative, and since $\frac{d}{dt}L(W) = 0 \Leftrightarrow \frac{d}{dt}W = 0$, we have the required property for $\xi$ small enough. Also, since $L(W)$ is smooth and defined on a compact domain, it must be bounded from below, which concludes the proof. Clearly, our proof has one main limitation: it applies to continuous dynamical systems with continuous derivatives. If some memristors reach the boundary values, discontinuities appear in the dynamics. Simulations show, however, that the Lyapunov functional we obtained is an upper bound to the one obtained numerically with the constraints included.
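The projector resolvent identity $(I+\xi\Omega)^{-1}\Omega = \Omega/(1+\xi)$ used in the derivation above admits a quick numerical sanity check; the rank, size and $\xi$ below are arbitrary test values, and a random projector stands in for $\Omega$:

```python
import numpy as np

rng = np.random.default_rng(2)
N, L, xi = 30, 10, 7.0
V, _ = np.linalg.qr(rng.normal(size=(N, L)))
Omega = V @ V.T                     # projector: Omega @ Omega == Omega

# (I + xi*Omega)^{-1} Omega, computed without forming the inverse explicitly
lhs = np.linalg.solve(np.eye(N) + xi * Omega, Omega)
rhs = Omega / (1.0 + xi)
print(np.allclose(lhs, rhs))        # True
```

The identity holds for any $\xi \neq -1$, even beyond the radius of convergence of the geometric series, since $(I+\xi\Omega)\,\Omega/(1+\xi) = (\Omega + \xi\Omega^2)/(1+\xi) = \Omega$.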

Appendix A.2. Variation

Let us now calculate the variation of the functional of Equation (A4). Again, we consider the functional:
$$L(W) = -\frac{\alpha}{2}\sum_i W_i^2 - \frac{\alpha\xi}{3}\sum_i \Omega_{ii} W_i^3 - \alpha\xi\sum_{i\neq j}\Omega_{ij} W_i W_j^2 + \frac{1}{\beta}\sum_{ij} W_i \Omega_{ij} S_j.$$
We have that:
$$\delta_{W_j}L(W) = \delta_{W_j}\left(-\frac{\alpha}{2}\sum_i W_i^2 - \frac{\alpha\xi}{3}\sum_i \Omega_{ii} W_i^3 - \alpha\xi\sum_{k\neq i}\Omega_{ki} W_k W_i^2 + \frac{1}{\beta}\sum_{ij} W_i \Omega_{ij} S_j\right) = -\alpha W_j - \alpha\xi\,\Omega_{jj} W_j^2 - \alpha\xi\sum_{i\neq j}\Omega_{ji} W_i^2 + \frac{1}{\beta}\sum_i \Omega_{ji} S_i - 2\alpha\xi\sum_{i\neq j} W_i\Omega_{ij} W_j,$$
which implies
$$\delta_W L(W) = -\alpha W - \alpha\xi\,\Omega W^2 + \frac{1}{\beta}\Omega S + M = -\left[(I+\xi\Omega W)\,\alpha W - \frac{1}{\beta}\Omega S\right] + M = -(I+\xi\Omega W)\frac{d}{dt}W + M,$$
where on the last line we used the equations of motion and defined $M_i = -2\alpha\xi\sum_{j\neq i} W_j\Omega_{ji}W_i$. This mapping is reminiscent of the continuous neuronal networks introduced by Little [17] and then by Hopfield in a series of seminal papers [18,19]. The Little-Hopfield model has attracted interest from the statistical physics community since the very beginning [20]. In recent years this line of study has been pursued by several experimental groups [21,22,23]; in each case, however, memristors were studied in conjunction with ordinary and active (CMOS) electronic components to build Hopfield learning networks. We argue instead that memristive circuits per se form a special kind of Hopfield network, defined by the Lyapunov function above, without the need for extra components.
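The chain of identities above can be checked numerically by comparing the analytic variation against central finite differences of $L(W)$. The sketch below assumes the reconstructed signs of the functional and uses a random projector as a stand-in for $\Omega$; the parameters mirror those used in the paper's figures:

```python
import numpy as np

rng = np.random.default_rng(3)
N, n_loops = 25, 8
alpha, beta, xi = 0.1, 1.0, 10.0

V, _ = np.linalg.qr(rng.normal(size=(N, n_loops)))
Om = V @ V.T                                   # symmetric projector (stand-in)
off = Om - np.diag(np.diag(Om))                # off-diagonal part of Omega
S = rng.uniform(-0.05, 0.05, size=N)
W = rng.uniform(0.05, 0.95, size=N)

def lyap(W):
    # L(W) with the reconstructed signs
    return (-0.5 * alpha * W @ W
            - (alpha * xi / 3) * np.diag(Om) @ W**3
            - alpha * xi * W @ off @ W**2
            + (1.0 / beta) * W @ Om @ S)

# analytic variation: -(I + xi*Omega*diag(W)) alpha W + (1/beta) Omega S + M
M = -2.0 * alpha * xi * W * (off @ W)
grad_analytic = -alpha * W - alpha * xi * Om @ W**2 + (1.0 / beta) * Om @ S + M

# central finite differences of the functional
eps = 1e-6
grad_numeric = np.zeros(N)
for k in range(N):
    Wp, Wm = W.copy(), W.copy()
    Wp[k] += eps
    Wm[k] -= eps
    grad_numeric[k] = (lyap(Wp) - lyap(Wm)) / (2 * eps)

print(np.max(np.abs(grad_numeric - grad_analytic)))
```

The maximum discrepancy is at the level of finite-difference noise, confirming that the variation and the functional are mutually consistent.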

Appendix B. Complexity of the Lyapunov Functional via Kac-Rice Formula

This section provides the details for the average number of stationary points of the Lyapunov functional above in the asymptotic regime. Let us now consider the following average:
$$\# = \left\langle \sum_{\{w_i = 0,1\}} \left|\det\left(\partial^2_{w_i w_j} L(w)\right)\right| \prod_i \delta\left(\partial_{w_i} L(w)\right)\right\rangle_{P(\Omega)},$$
where L ( w ) is the asymptotic functional in Equation (A4). We can see that
$$\partial_{w_i} L(w) = -\frac{\alpha}{2} + \frac{\alpha\xi}{3}\Omega_{ii} + \frac{1}{\beta}\sum_j \Omega_{ij} S_j + \alpha\xi \sum_j \left(\Omega_{ij} - \Omega_{ii}\delta_{ij}\right) w_j,$$
and thus
$$\partial^2_{w_i w_j} L(w) = \alpha\xi\left(\Omega_{ij} - \Omega_{ii}\delta_{ij}\right).$$
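For small systems the counting can be compared against brute force. The sketch below enumerates the binary configurations that are stable under single flips of the (reconstructed) functional, a discrete proxy for the stationary points counted by the Kac-Rice formula above; a random projector stands in for $\Omega$, and the parameters are illustrative:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(4)
N, n_loops = 10, 4
alpha, beta, xi = 0.1, 1.0, 10.0

# random rank-n_loops projector as a stand-in for Omega
V, _ = np.linalg.qr(rng.normal(size=(N, n_loops)))
Om = V @ V.T
off = Om - np.diag(np.diag(Om))
S = rng.uniform(-0.05, 0.05, size=N)

def lyap(w):
    return (-0.5 * alpha * w @ w
            - (alpha * xi / 3) * np.diag(Om) @ w**3
            - alpha * xi * w @ off @ w**2
            + (1.0 / beta) * w @ Om @ S)

count = 0
for bits in product([0.0, 1.0], repeat=N):
    w = np.array(bits)
    e = lyap(w)
    # stable if no single bit flip strictly lowers the functional
    if all(lyap(np.where(np.arange(N) == k, 1.0 - w, w)) >= e for k in range(N)):
        count += 1
print(count)
```

The global minimum is always flip-stable, so the count is at least one; for rough random couplings the number of such configurations grows quickly with $N$, which is the phenomenon quantified asymptotically below.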
We are interested in an asymptotic upper bound on the number of local minima of the function $L(w)$. We first proceed by calculating the determinant. If $N$ is the number of memristors, then $\det\left(\alpha\xi(\Omega_{ij}-\Omega_{ii}\delta_{ij})\right) = (\alpha\xi)^N \det\left(\Omega_{ij}-\Omega_{ii}\delta_{ij}\right)$. In the scaling observed above, we have noted that $\Omega_{ii}\approx 1-\frac{3}{N}$, while $\Omega_{ij}\approx \sqrt{\frac{3}{N}}\,Q_{ij}$. Using this scaling, we have $\det\left(\Omega_{ij}-\Omega_{ii}\delta_{ij}\right) \approx \left(\frac{3}{N}\right)^{\frac{N}{2}}\det\left(Q_{ij}-\sqrt{\frac{N}{3}}\left(1-\frac{3}{N}\right)\delta_{ij}\right) \approx \left(\frac{3}{N}\right)^{\frac{N}{2}}\det\left(Q_{ij}-\sqrt{N}\,\delta_{ij}\right)$ for $N\gg 1$, with $Q$ a projector which can be written as
$$Q_{ij} = \sum_{l=1}^{L} v_i^l\, v_j^l.$$
We now use the matrix determinant lemma:
$$\det\left(v^l v^{l\,t} + A\right) = \left(1 + v^{l\,t} A^{-1} v^l\right)\det(A)$$
if A is invertible and write:
$$\det\left(Q_{ij} - \sqrt{N}\,\delta_{ij}\right) = N^{\frac{N}{2}}\prod_{l=1}^{L}\left(1 + v^{l\,t} A_l^{-1} v^l\right)$$
with $A_l = \sum_{k=l+1}^{L} v^k v^{k\,t} - \sqrt{N}\,I$. Since $\sum_{k=l+1}^{L} v^k v^{k\,t}$ is also a projector (on a subspace), we have:
$$A_l^{-1} = \frac{1}{\sqrt{N}\left(1-\sqrt{N}\right)}\sum_{k=l+1}^{L} v^k v^{k\,t} - \frac{1}{\sqrt{N}}\,I.$$
It is now easy to see that, since $v^l\cdot v^{l'} = 0$ if $l\neq l'$, we have
$$v^{l\,t} A_l^{-1} v^l = -\frac{1}{\sqrt{N}},$$
and thus we have the remarkable result that in the limit described above, we have
$$\det\left(\Omega_{ij}-\Omega_{ii}\delta_{ij}\right) \approx \left(\frac{3}{N}\right)^{\frac{N}{2}} N^{\frac{N}{2}}\left(1-\frac{1}{\sqrt{N}}\right)^{L} = 3^{\frac{N}{2}}\left(1-\frac{1}{\sqrt{N}}\right)^{L},$$
which depends not on the distribution of $\Omega$'s elements but only on the scaling we used. We thus obtain a rough scaling of the number of minima, given by:
$$\# = \left(3\alpha\xi\right)^{N}\left(1-\frac{1}{\sqrt{N}}\right)^{L}\left\langle Z(S,\Omega)\right\rangle.$$
Up to this point no assumption has been made on the distribution of $\Omega_{ij}$, other than the scaling observed numerically.
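The determinant computed above follows from the spectrum of a rank-$L$ projector: $Q - cI$ has eigenvalue $1-c$ with multiplicity $L$ and $-c$ with multiplicity $N-L$, so $\left|\det\left(Q-\sqrt{N}I\right)\right| = N^{N/2}\left(1-1/\sqrt{N}\right)^{L}$. A small numerical check of this identity (arbitrary small $N$ and $L$):

```python
import numpy as np

rng = np.random.default_rng(5)
N, L = 6, 2
V, _ = np.linalg.qr(rng.normal(size=(N, L)))
Q = V @ V.T                          # rank-L projector

c = np.sqrt(N)
# eigenvalues of Q - c*I: (1-c) with multiplicity L, (-c) with multiplicity N-L
det = np.linalg.det(Q - c * np.eye(N))
target = N ** (N / 2) * (1 - 1 / np.sqrt(N)) ** L
print(np.isclose(abs(det), target))  # True
```

This confirms that the result depends only on the rank $L$ and the size $N$, not on the particular choice of the loop vectors $v^l$.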
In order to evaluate $\left\langle Z(S,\Omega)\right\rangle$, however, assumptions have to be made. We now focus on evaluating the term:
$$Z(S,P(\Omega)) = \left\langle \sum_{\{w_i=0,1\}} \prod_i \delta\left(\partial_{w_i} L(w)\right)\right\rangle_{P(\Omega)},$$
which can be written as:
$$Z(S,P(\Omega)) = \frac{1}{(2\pi)^N}\int \prod_{ij} d\Omega_{ij}\, \sum_{\{w\}} \prod_i \int d\eta_i\, e^{i\eta_i\left[-\frac{\alpha}{2} + \frac{\alpha\xi}{3}\Omega_{ii} + \frac{1}{\beta}\sum_j \Omega_{ij}S_j + \alpha\xi\sum_j\left(\Omega_{ij}-\Omega_{ii}\delta_{ij}\right)w_j\right]}\, P(\Omega_{ij}).$$
We will now introduce again the scaling $\Omega_{ii} = 1-\frac{3}{N}$ and $\Omega_{ij} = \sqrt{\frac{3}{N}}\,Q_{ij}$. We now consider the case in which $P(Q_{ij}) = \frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-\frac{Q_{ij}^2}{2\sigma^2}}$. We can use at this point the formula:
$$\int dx\, e^{iqx - \frac{x^2}{2b^2}} = \sqrt{2\pi b^2}\, e^{-\frac{b^2 q^2}{2}},$$
and write:
$$Z(S,P(\Omega)) = \frac{1}{(2\pi)^N}\sum_{\{w\}}\prod_i\int d\eta_i\, e^{i\eta_i\left[-\alpha\xi\left(1-\frac{3}{N}\right)w_i - \frac{\alpha}{2} + \frac{\alpha\xi}{3}\left(1-\frac{3}{N}\right)\right]}\prod_j\int dQ_{ij}\,\frac{1}{\sqrt{2\pi\sigma^2}}\, e^{i\sqrt{\frac{3}{N}}\,\eta_i Q_{ij}\left(\alpha\xi w_j + \frac{1}{\beta}S_j\right) - \frac{Q_{ij}^2}{2\sigma^2}} = \frac{1}{(2\pi)^N}\sum_{\{w\}}\prod_i\int d\eta_i\, e^{i\eta_i\left[-\alpha\xi\left(1-\frac{3}{N}\right)w_i - \frac{\alpha}{2} + \frac{\alpha\xi}{3}\left(1-\frac{3}{N}\right)\right]}\, e^{-\frac{3\sigma^2}{2N}\,\eta_i^2\sum_j\left(\alpha\xi w_j + \frac{1}{\beta}S_j\right)^2}$$
and if we define $a(w) = -\alpha\xi\left(1-\frac{3}{N}\right)w - \frac{\alpha}{2} + \frac{\alpha\xi}{3}\left(1-\frac{3}{N}\right)$ and $b_i(w) = \alpha\xi w + \frac{1}{\beta}S_i$, we can write the integral as:
$$Z(S,P(\Omega)) = \left(3\pi N\sigma^2\right)^{-\frac{N}{2}}\prod_i\left[\frac{e^{-\frac{3a(1)^2}{2N\sigma^2 b_i(1)^2}}}{b_i(1)}+\frac{e^{-\frac{3a(0)^2}{2N\sigma^2 b_i(0)^2}}}{b_i(0)}\right].$$
We now rescale $\sigma^2 \to \sigma^2/N$ in order for $\sigma$ to remain physical in the limit $N\to\infty$. We thus obtain, in the approximation in which $S_i = S$, that $Z(S)$ becomes:
$$Z(S) = \left(3\pi\sigma^2\right)^{-\frac{N}{2}}\left[\frac{e^{-\frac{3a(1)^2}{2\sigma^2 b(1)^2}}}{b(1)}+\frac{e^{-\frac{3a(0)^2}{2\sigma^2 b(0)^2}}}{b(0)}\right]^{N}.$$
In the limit $\alpha\xi/S \gg 1$, we have $\frac{e^{-\frac{3a(1)^2}{2\sigma^2 b(1)^2}}}{b(1)}+\frac{e^{-\frac{3a(0)^2}{2\sigma^2 b(0)^2}}}{b(0)} \approx \frac{1}{\alpha\xi}$. Thus, we have the final formula:
$$\# \approx \left(1-\frac{1}{\sqrt{N}}\right)^{L}\left(\frac{1}{\sigma}\sqrt{\frac{3}{\pi}}\right)^{N},$$
which shows two regimes: if $\sigma < \sqrt{3/\pi} \approx 0.977205$, the number of stationary points increases exponentially with $N$, while for $\sigma > \sqrt{3/\pi}$ it is exponentially suppressed.
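The Gaussian integral formula used in the derivation above can be verified numerically; $q$ and $b$ below are arbitrary test values, and the integral is evaluated by direct summation on a wide symmetric grid:

```python
import numpy as np

q, b = 1.3, 0.8
x = np.linspace(-40, 40, 400001)
dx = x[1] - x[0]

# integrand of the Fourier-Gaussian integral
integrand = np.exp(1j * q * x - x**2 / (2 * b**2))
numeric = (integrand * dx).sum()

exact = np.sqrt(2 * np.pi * b**2) * np.exp(-(b**2) * q**2 / 2)
print(np.isclose(numeric.real, exact))
```

The imaginary part cancels by symmetry of the grid, and the real part reproduces $\sqrt{2\pi b^2}\,e^{-b^2q^2/2}$ to numerical precision.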

References

  1. Indiveri, G.; Liu, S.-C. Memory and information processing in neuromorphic systems. Proc. IEEE 2015, 103, 1379–1397. [Google Scholar] [CrossRef]
  2. Avizienis, A.V.; Sillin, H.O.; Martin-Olmos, C.; Shieh, H.H.; Aono, M.; Stieg, A.Z.; Gimzewski, J.K. Neuromorphic Atomic Switch Networks. PLoS ONE 2012, 7, e42772. [Google Scholar] [CrossRef] [PubMed]
  3. Stieg, A.Z.; Avizienis, A.V.; Sillin, H.O.; Martin-Olmos, C.; Aono, M.; Gimzewski, J.K. Emergent Criticality in Complex Turing B-Type Atomic Switch Networks. Adv. Mater. 2012, 24, 286–293. [Google Scholar] [CrossRef] [PubMed]
  4. Traversa, F.L.; di Ventra, M. Universal memcomputing machines. IEEE Trans. Neural Netw. Learn. Syst. 2015, 26, 2702–2715. [Google Scholar] [CrossRef] [PubMed]
  5. Caravelli, F.; Hamma, A.; di Ventra, M. Scale-free networks as an epiphenomenon of memory. Europhys. Lett. EPL 2015, 109, 28006. [Google Scholar] [CrossRef]
  6. Caravelli, F. Trajectories entropy in dynamical graphs with memory. Front. Robot. AI 2016, 3, 18. [Google Scholar] [CrossRef]
  7. Caravelli, F.; Carbajal, J.P. Memristors for the curious outsiders. Technologies 2018, 6, 118. [Google Scholar] [CrossRef]
  8. Zegarac, A.; Caravelli, F. Memristive networks: From graph theory to statistical physics. Europhys. Lett. EPL 2019, 125, 10001. [Google Scholar] [CrossRef]
  9. Strukov, D.B.; Snider, G.; Stewart, D.R.; Williams, R.S. The missing memristor found. Nature 2008, 453, 80–83. [Google Scholar] [CrossRef]
  10. Chua, L.O.; Kang, S.M. Memristive devices and systems. Proc. IEEE 1976, 64, 209–223. [Google Scholar] [CrossRef]
  11. Yang, J.J.; Strukov, D.B.; Stewart, D.R. Memristive devices for computing. Nat. Nanotechnol. 2013, 8, 13–24. [Google Scholar] [CrossRef] [PubMed]
  12. Glover, F.; Kochenberger, G.; Du, Y. A Tutorial on Formulating and Using QUBO Models. arXiv 2019, arXiv:1811.11538. [Google Scholar]
  13. Ohno, T.; Hasegawa, T.; Nayak, A.; Tsuruoka, T.; Gimzewski, J.K.; Aono, M. Sensory and short-term memory formations observed in a Ag2S gap-type atomic switch. Appl. Phys. Lett. 2011, 99, 203108. [Google Scholar] [CrossRef]
  14. Wang, Z.; Joshi, S.; Savel’ev, S.E.; Jiang, H.; Midya, R.; Lin, P.; Hu, M.; Ge, N.; Strachan, J.P.; Li, Z.; et al. Memristors with diffusive dynamics as synaptic emulators for neuromorphic computing. Nat. Mater. 2017, 16, 101–108. [Google Scholar] [CrossRef] [PubMed]
  15. Caravelli, F.; Barucca, P. A mean-field model of memristive circuit interaction. Eur. Phys. Lett. 2018, 122. [Google Scholar] [CrossRef]
  16. Caravelli, F. Locality of interactions for planar memristive circuits. Phys. Rev. E 2017, 96, 052206. [Google Scholar] [CrossRef]
  17. Little, W.A. The existence of persistent states in the brain. Math. Biosci. 1974, 19, 101–120. [Google Scholar] [CrossRef]
  18. Hopfield, J.J. Neural networks and physical systems with emergent collective computational abilities. Proc. Natl. Acad. Sci. USA 1982, 79, 2554–2558. [Google Scholar] [CrossRef]
  19. Hopfield, J.J.; Tank, D.W. Computing with neural circuits: A model. Science 1986, 233, 625–633. [Google Scholar] [CrossRef]
  20. Amit, D.J.; Gutfreund, H.; Sompolinsky, H. Spin-glass models of neural networks. Phys. Rev. A 1985, 32, 2. [Google Scholar] [CrossRef]
  21. Hu, S.G.; Liu, Y.; Liu, Z.; Chen, T.P.; Wang, J.J.; Yu, Q.; Deng, L.J.; Yin, Y.; Hosaka, S. Associative memory realized by a reconfigurable memristive Hopfield neural network. Nat. Commun. 2015, 6, 7522. [Google Scholar] [CrossRef]
  22. Kumar, S.; Strachan, J.P.; Williams, R.S. Chaotic dynamics in nanoscale NbO2 Mott memristors for analogue computing. Nature 2017, 548, 318–321. [Google Scholar] [CrossRef]
  23. Tarkov, M. Hopfield Network with Interneuronal Connections Based on Memristor Bridges. In Proceedings of the 13th International Symposium on Neural Networks, ISNN 2016, St. Petersburg, Russia, 6–8 July 2016; pp. 196–203. [Google Scholar]
  24. Adler, R.J.; Taylor, J.E. Random Fields and Geometry; Springer: New York, NY, USA, 2007. [Google Scholar]
  25. Fyodorov, Y.V. High-Dimensional Random Field and Random Matrix Theory. Markov Process. Relat. Fields 2015, 21, 483–518. [Google Scholar]
  26. Caravelli, F. The mise en scene of memristive networks: effective memory, dynamics and learning. Int. J. Par. Dist. Syst. 2018, 33, 350–366. [Google Scholar] [CrossRef]
  27. Parisi, G. A sequence of approximate solutions to the S-K model for spin glasses. J. Phys. A 1980, 13, L-115. [Google Scholar] [CrossRef]
  28. Poloni, F. Algorithms for Quadratic Matrix and Vector Equations; Theses (Scuola Normale Superiore); Edizioni della Normale: Pisa, Italy, 2011. [Google Scholar]
  29. Kavitha, T.; Liebchen, C.; Mehlhorn, K.; Michail, D.; Rizzi, R.; Ueckerdt, T.; Zweig, K.A. Cycle bases in graphs characterization, algorithms, complexity, and applications. Comput. Sci. Rev. 2009, 3, 199–243. [Google Scholar] [CrossRef]
  30. Chang, T.J.; Meade, N.; Beasley, J.E.; Sharaiha, Y.M. Heuristics for cardinality constrained portfolio optimisation. Comput. Oper. Res. 2000, 27, 1271–1302. [Google Scholar] [CrossRef]
  31. 225 Nikkei Asset Dataset, OR-Library. Available online: http://people.brunel.ac.uk/~mastjjb/jeb/orlib/files/port5.txt (accessed on 12 August 2019).
  32. Markowitz, H. Portfolio Selection. J. Financ. 1952, 7, 77–91. [Google Scholar]
  33. Kirkpatrick, S.; Gelatt, C.D., Jr.; Vecchi, M.P. Optimization by Simulated Annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef]
  34. Caravelli, F.; Traversa, F.L.; di Ventra, M. The complex dynamics of memristive circuits: Analytical results and universal slow relaxation. Phys. Rev. E 2017, 95, 2. [Google Scholar] [CrossRef]
  35. Traversa, F.L.; Ramella, C.; Bonani, F.; di Ventra, M. Memcomputing NP-complete problems in polynomial time using polynomial resources and collective states. Sci. Adv. 2015, 1, e1500031. [Google Scholar] [CrossRef]
  36. Pershin, Y.V.; di Ventra, M. Solving mazes with memristors: A massively-parallel approach. Phys. Rev. E 2011, 84, 046703. [Google Scholar] [CrossRef]
  37. Pershin, Y.V.; di Ventra, M. Self-organization and solution of shortest-path optimization problems with memristive networks. Phys. Rev. E 2013, 88, 013305. [Google Scholar] [CrossRef]
  38. Dorigo, M.; Gambardella, L.M. Ant colonies for the traveling salesman problem. Biosystems 1997, 43, 73–81. [Google Scholar] [CrossRef]
Figure 1. Dynamics of the memory is shown in (a), with the corresponding evolution of the Lyapunov function of Equation (A1) ((b), continuous line) for 8750 memristors, and the asymptotic function of Equation (6) ((b), dashed line). We considered α = 0.1, β = 1 and ξ = 10, for Ω_ij from a random circuit of the Erdos-Renyi type with p = 0.9. The elements of the sources S were chosen at random in [−0.05, 0.05].
Figure 2. Dynamics of the Energy (a) and corresponding evolution of the Lyapunov function (b) in Figure 1.
Figure 3. Distribution of the elements of the matrix Ω in the case of an Erdos-Renyi graph. In (a) we plot the distribution density, while in (b) its logarithm, which shows that the distribution is multimodal. However, there are roughly 5 orders of magnitude between the bulk of the probability distribution and the two smaller peaks of the density at ±0.01, which shows that the distribution can be well approximated by a unimodal one.
Figure 4. Scaling of the components Ω i i as a function of the number of memristors for randomly generated circuits.
Figure 5. Percentage of exactly predicted values of memristors according to Equation (13) as a function of ξ. The error bars are calculated from 100 samples, with approximately 800 memristors (on average) and ξ = 10, α = 0.1 and β = 1, with the vector S drawn at random in [0.5, 5].
Figure 6. Obtained minimum energy using the minimum over 100 random binary values (blue crosses), roughly 8000 steps of exponential annealing (green pluses) and 60 time steps of the memristor dynamics (red circles). We generated each sample with approximately 800–900 memristors, fixing the percolation parameter to p = 0.7, with ξ = 10, α = 0.1 and β = 1. The numerical integration was performed using a simple Euler-Newton integration with integration step dt = 0.1 and total number of steps T = 1000. We truncated the Metropolis annealing to 10N time steps, with N the number of memristors, an annealing rate λ = 0.995 and an exponential annealing law for the temperature given by Temp(k) = 100 λ^k.
Figure 7. Maximization of the return for the Nikkei 225 dataset using a Monte Carlo procedure. We compare the results of the best of 100 Simulated Annealing runs (the green line in the plot, at 9.15) against the memristive optimization proposed in this paper (black lines), with identical initial conditions. The simulated annealing was conducted with an initial temperature T = 100 and an exponential annealing rate λ = 0.999 (and effective time steps N = 500). We then used the final states of the memristive optimization procedure as input to a simulated annealing procedure with lower temperature T_0 = 0.025 and identical annealing rate, which obtained the absolute maximum of the return.

© 2019 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).