
Figure 2.22 – Average participation ratio P̄ versus network size N in balanced theta neuron networks with different average firing rates ν̄ (ν̄ = 1, 5, 10 Hz). (A) Inhibitory networks. (B) Excitatory-inhibitory networks (dotted line: guide to the eye for P̄ ∼ N; parameters: N_I = 0.2N, N_E = 0.8N, K = 100, J_0 = 1, τ_m = 10 ms, ε = 0.3).

If a neuron does not participate in the chaotic dynamics (v_i(t) ≡ 0), its chaos index is c_i = 0. If only one neuron participates (v_i(t) ≡ 1), only its chaos index does not vanish and is c_i = √N. If all neurons contribute equally to the chaotic dynamics (v_i(t) ≈ 1/√N), all chaos indices are c_i ≈ 1. Although neurons with low firing rates and high variability had slightly reduced chaos indices, all neurons had a similar chaos index of approximately one (Fig. 2.20, 2.21D,E). This indicates that all neurons participated almost equally in the chaotic dynamics.

The number of neurons participating in the chaotic dynamics at any single point in time is captured by the participation ratio

P(t) = 1 / ∑_{i=1}^{N} v_i(t)^4.    (2.53)

Two limiting cases are instructive for understanding the participation ratio: (i) Delocalized (extended) states, where the Lyapunov vector is spread out over the entire network. Each vector element is then v_i(t) ≈ 1/√N and the participation ratio is P(t) = 1/(N · N^{-2}) = N. (ii) Localized states, where the Lyapunov vector is confined to a few neurons. Then only a few vector elements do not vanish and the participation ratio is P(t) ≪ N. The participation ratio exhibited substantial fluctuations, indicating strongly varying group sizes (Fig. 2.20, 2.21B). For the considered rates 1 Hz ≤ ν̄ ≤ 10 Hz, the time-averaged participation ratio P̄ = ⟨P(t)⟩_t obeyed a sublinear scaling P̄ ∼ N^α with 0.25 ≤ α ≤ 0.5 (Fig. 2.22). This behavior indicates neither clearly localized states, for which P̄ would be independent of N, nor delocalized states, for which P̄ would depend linearly on N. One should, however, note that the participating groups are generally small relative to the network size: the fraction of most unstable neurons decreased algebraically as N^{α−1}.
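The two limiting cases translate directly into code. A minimal NumPy sketch of Eq. (2.53) (the function name is illustrative, not the thesis implementation):

```python
import numpy as np

def participation_ratio(v):
    """Participation ratio P = 1 / sum_i v_i^4 of a Lyapunov
    vector, Eq. (2.53); v is normalized to unit length first."""
    v = np.asarray(v, dtype=float)
    v = v / np.linalg.norm(v)
    return 1.0 / np.sum(v**4)

N = 1000
# (i) delocalized state: all elements equal, v_i = 1/sqrt(N)  ->  P = N
P_deloc = participation_ratio(np.ones(N))
# (ii) localized state: a single nonvanishing element         ->  P = 1
P_loc = participation_ratio(np.eye(N)[0])
```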

The results for the chaos index and the participation ratio can be summarized as follows.

Although at one point in time only a small group of neurons participated in the chaotic dynamics, the neurons composing these groups constantly changed over time such that in the long run all neurons participated almost equally in the chaotic dynamics.

2.10 Covariant Lyapunov Vectors and Hyperbolicity

While the first Lyapunov vector yielded interesting results about the active participation of the neurons in the chaotic dynamics, the N−1 other Lyapunov vectors yield additional insight into the collective dynamics of neural networks. The standard algorithm for the computation of the Lyapunov spectra evolves an orthonormal system (ONS) with the single-spike Jacobians. The norms of the evolved ONS vectors asymptotically yield the Lyapunov spectrum. The ONS itself is spanned by vectors of the subspaces E_i related to the Lyapunov exponents λ_i (see Section 2.3.2).

The eigenvectors of the Oseledec matrix that span the subspaces E_i related to the Lyapunov exponents are the covariant Lyapunov vectors (CLVs). Only the first vector of the ONS is equivalent to the first CLV. All other vectors in the ONS are not the CLVs, i.e., not the eigenvectors corresponding to the exponents. Interestingly, the CLVs are accessible from the information contained in the ONS. A procedure to obtain the CLVs from the ONS was introduced in Ref. [96]. This vastly extends the approach for the characterization of the dynamics of spiking neuron networks.
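The backward procedure of Ref. [96] can be sketched in a few lines. The following NumPy sketch (function name and random test Jacobians are illustrative assumptions, not the thesis code) stores the Q and R factors of the forward Gram-Schmidt/QR steps and then iterates upper-triangular coefficient matrices backwards; the columns of V_n = Q_n C_n converge to the CLVs after a backward transient:

```python
import numpy as np

def clv_backward(jacobians):
    """Sketch of the backward algorithm of Ref. [96]: recover
    covariant Lyapunov vectors (CLVs) from the orthonormal
    system (ONS) evolved in the forward QR steps."""
    dim = jacobians[0].shape[0]
    rng = np.random.default_rng(0)

    # forward pass: evolve the ONS, store Q_n and upper-triangular R_n
    Q = np.linalg.qr(rng.standard_normal((dim, dim)))[0]
    Qs, Rs = [], []
    for J in jacobians:
        Q, R = np.linalg.qr(J @ Q)
        sgn = np.sign(np.diag(R))      # make diag(R) > 0 (unique QR)
        Q, R = Q * sgn, sgn[:, None] * R
        Qs.append(Q)
        Rs.append(R)

    # backward pass: iterate upper-triangular coefficients C_n with
    # R_n^{-1}; the CLVs at step n are the columns of V_n = Q_n C_n
    C = np.triu(rng.standard_normal((dim, dim)))
    C /= np.linalg.norm(C, axis=0)
    Vs = [None] * len(jacobians)
    for n in reversed(range(len(jacobians))):
        Vs[n] = Qs[n] @ C
        C = np.linalg.solve(Rs[n], C)  # C_{n-1} = R_n^{-1} C_n
        C /= np.linalg.norm(C, axis=0) # renormalize columns
    return Vs
```

In practice one discards an initial stretch of forward steps and a final stretch of backward steps as transients; the sketch returns all steps for simplicity.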

Here, we discuss the hyperbolicity of the studied systems to validate the use of the Lyapunov exponents for the derivation of the attractor dimension and entropy production rate. A dynamical system is called hyperbolic if the stable and unstable manifolds are everywhere transversal to each other [96]. This will be tested by determining the angles between all CLVs. In case of transversality of the manifolds, and thus hyperbolicity of the system, it can be shown that the system is an Axiom A dynamical system and that an SRB measure exists [67]. The existence of an SRB measure is sufficient for the Pesin identity and the Kaplan-Yorke conjecture to hold [72, 73]. Therefore, proving hyperbolicity of the studied systems validates the use of the Pesin identity and the Kaplan-Yorke conjecture to derive the entropy production rate H and the information dimension (attractor dimension) D from the spectrum of Lyapunov exponents. If hyperbolicity is violated, then H and D only provide upper bounds to the entropy production rate and information dimension.

The angles between all CLVs in both inhibitory and excitatory-inhibitory networks are displayed in Fig. 2.23. The computation of the CLVs requires saving the projection matrices obtained in the Gram-Schmidt orthogonalization procedure at all time steps. This restricts the use of the algorithm to moderately large networks of up to N = 1000 so far¹. As a sanity check for the computation of the Lyapunov spectra with the ONS and the CLVs, respectively, the spectra obtained from both methods are compared in Fig. 2.23A. In inhibitory networks, where longer simulations were possible, they show excellent agreement. In excitatory-inhibitory networks, with shorter simulations due to the memory restrictions, there were slight differences. We therefore focus on the results for the inhibitory networks, for which longer, converged calculations can be presented.

At each time step s, the CLVs point in different directions {v_i,s}. The angle between two different CLVs i and j is

α_{ij,s} = arccos(v_i,s · v_j,s).

We used circular statistics² to calculate their averages ᾱ_ij and variances σ²_ij. The cosine of the average angle cos(ᾱ_ij) between all pairs of CLVs is plotted in the lower right triangle and the circular variance σ²_ij in the upper left triangle of Fig. 2.23B. One can see that CLVs with quite different indices (far away from the diagonal) are orthogonal to each other: the cosine of the average and the variance are zero. CLVs with similar indices (close to the diagonal) have nonvanishing cos(ᾱ_ij) and nonvanishing σ²_ij. A closer look at the first off-diagonal reveals that adjacent CLVs can in fact be almost tangential (Fig. 2.23C). However, this seems to be the case only for larger indices, i.e., negative Lyapunov exponents. The CLVs corresponding to the positive Lyapunov exponents seem to be transversal. The angles between Lyapunov vectors further apart in their indices become more and more transversal, as depicted in Fig. 2.23C and indicated by the

¹The projection matrix of N²/2 doubles is stored for each of the O(10N) time steps. Thus, the necessary memory for these computations quickly reaches the order of gigabytes and hence the limit of available RAM (e.g., N = 1000 needs 8 B · 5N³ ≈ 40 GB RAM). This restricts the method to networks with N ≲ 1000 so far.
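The footnote's estimate can be reproduced as a back-of-the-envelope calculation:

```python
# memory for storing all Gram-Schmidt projection matrices:
# N^2/2 doubles per matrix, O(10 N) stored time steps, 8 B per double
N = 1000
bytes_total = 8 * (N**2 // 2) * (10 * N)  # = 8 B * 5 N^3
print(bytes_total / 1e9)                  # -> 40.0 (GB)
```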

²The averages over all considered time steps s = 1…N_s, C_ij = (1/N_s) ∑_s cos α_{ij,s} and S_ij = (1/N_s) ∑_s sin α_{ij,s}, lead to the circular average ᾱ_ij = arctan(S_ij/C_ij) and the circular variance σ²_ij = 1 − √(C²_ij + S²_ij).
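Footnote 2 translates directly into code. A small sketch (using arctan2 instead of arctan so the quadrant of the average comes out right; otherwise the same formulas):

```python
import numpy as np

def circular_stats(angles):
    """Circular mean and variance of angles (in radians), as used
    to average the CLV angles alpha_ij over time steps."""
    C = np.mean(np.cos(angles))
    S = np.mean(np.sin(angles))
    mean = np.arctan2(S, C)        # circular average alpha_bar
    var = 1.0 - np.hypot(C, S)     # circular variance in [0, 1]
    return mean, var
```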

corresponding histograms in Fig. 2.23D.

Finally, we want to point out the interesting fact that cos(ᾱ_ij) and σ²_ij both vanish for all CLVs paired with the CLV corresponding to the near-zero Lyapunov exponent i = 43 (see the white hairline in Fig. 2.23B). Any component of a perturbation in the direction of the trajectory would just correspond to a time shift and thus have a vanishing Lyapunov exponent. Therefore, all CLVs with nonvanishing Lyapunov exponents must be orthogonal to the CLV with vanishing Lyapunov exponent, which points in the direction of the trajectory. This is reflected in the white hairline in Fig. 2.23B.

The index with the vanishing Lyapunov exponent is also important for the test of hyperbolicity, as it separates the stable and unstable manifolds. The CLVs with larger indices (negative Lyapunov exponents) span the stable manifold E_s^+, whereas the CLVs with smaller indices (positive Lyapunov exponents) span the unstable manifold E_s^−. The minimal angle between the stable and unstable manifolds determines the hyperbolicity of the studied systems. At each time step, this angle Φ_s is obtained from the minimal angle between all pairs of CLVs from the stable and the unstable manifold, the least transversal pair of CLVs [96]:

Φ_s = min{α_{ij,s} | v_i,s ∈ E_s^+, v_j,s ∈ E_s^−}.    (2.54)

Fig. 2.24A shows the angles Φ_s between the stable and unstable manifolds at each time step s (spikes in the networks) for different network sizes N. The histogram of these angles, shown in Fig. 2.24B, indicates some hyperbolicity violations, especially for small networks with N = 100.
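Eq. (2.54) is straightforward to evaluate once the CLVs at a time step are available. A sketch (column layout and function name are assumptions; note this is the pairwise minimum used in the thesis, not the principal angle between the subspaces):

```python
import numpy as np

def minimal_manifold_angle(V, n_unstable):
    """Minimal angle Phi_s of Eq. (2.54): smallest angle between
    any pair of CLVs taken from the unstable and stable manifolds.
    V: columns are unit-norm CLVs ordered by decreasing Lyapunov
    exponent; the first n_unstable columns span the unstable
    manifold."""
    U = V[:, :n_unstable]      # CLVs of the unstable manifold
    S = V[:, n_unstable:]      # CLVs of the stable manifold
    cosines = np.abs(U.T @ S)  # pairwise |cos(alpha_ij)|
    return float(np.arccos(np.clip(cosines.max(), 0.0, 1.0)))
```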

From the data in Fig. 2.24 it is difficult to conclude that fewer hyperbolicity violations occur the higher-dimensional the systems get. This is, however, generally expected [96], and our simulations do not disprove it. We can thus cautiously consider the large networks studied to be hyperbolic systems. Then the Pesin identity for the entropy production rate,

H_KS = ∑_{λ_i > 0} λ_i,

and the Kaplan-Yorke conjecture for the information dimension,

D_1 = d + (∑_{i=1}^{d} λ_i) / |λ_{d+1}|    with    d = max{n : ∑_{i=1}^{n} λ_i ≥ 0},

hold and validate the presented approach to quantify the dynamical entropy production rate in neural networks based on the Lyapunov spectra.
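Both formulas act directly on a sorted Lyapunov spectrum. A minimal sketch:

```python
import numpy as np

def pesin_entropy(lyap):
    """Pesin identity: entropy production rate H_KS is the sum of
    the positive Lyapunov exponents."""
    lyap = np.asarray(lyap, dtype=float)
    return lyap[lyap > 0].sum()

def kaplan_yorke_dimension(lyap):
    """Kaplan-Yorke conjecture: D_1 = d + (sum_{i<=d} lambda_i) /
    |lambda_{d+1}|, with d the largest n whose partial sum of
    exponents is still non-negative."""
    lyap = np.sort(np.asarray(lyap, dtype=float))[::-1]
    cum = np.cumsum(lyap)
    if cum[0] < 0:
        return 0.0                 # no expanding direction at all
    d = int(np.max(np.nonzero(cum >= 0)[0])) + 1
    if d == len(lyap):
        return float(d)            # attractor fills phase space
    return d + cum[d - 1] / abs(lyap[d])
```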

Figure 2.23 – Covariant Lyapunov vectors in balanced theta neuron networks. (a) Inhibitory networks, N = 500 (calculation with 25 000 spikes). (b) Excitatory-inhibitory networks, N_E = 500, N_I = 500 (calculation with 10 000 spikes, ε = 0.2). (A) Lyapunov spectrum obtained in the standard procedure with an orthonormal system (ONS) in the forward Gram-Schmidt reorthogonalization procedure [69] and obtained with the covariant Lyapunov vectors (CLVs) from the backward calculation [96]. (B) Angles between covariant Lyapunov vectors, cos(α_ij) = v_i · v_j (cosine of the circular mean ᾱ_ij in the lower right triangle, circular variance σ²_ij in the upper left triangle). (C) Cosine of the off-diagonal elements i − j = 1, 2, 5, 10 of the circular mean ᾱ_ij. (D) Histogram of the off-diagonal ᾱ_ij from (C).

Figure 2.24 – Minimal angle between stable and unstable manifold. (A) Minimal angle Φ_s of the least transversal pair of CLVs versus time step, Eq. (2.54). (B) Histogram of the angles in (A). (Parameters: ν̄ = 1 Hz, K = 50, J_0 = 1, τ_m = 10 ms.)

2.11 Summary

In this chapter we introduced a novel approach for a thorough characterization and quantification of the dynamics of spiking neuron networks. This approach allows for the numerically exact simulation of neural network dynamics and the calculation of the complete Lyapunov spectra of networks with arbitrary topologies and single-neuron phase-response curves. As a fundamental example, we analyzed networks of theta neurons in the balanced state. The theta neuron model is the canonical form of type I excitable neurons, and the balanced state is the prevailing explanation of the asynchronous irregular firing activity observed in the cortex. We therefore expect these results to be representative of a wide class of neural network models.

The balanced state emerged for a wide parameter range in both exclusively inhibitorily coupled networks and networks with excitatory and inhibitory populations. We also observed two phase transitions from the asynchronous balanced state to a synchronous state: beyond a critical connectivity, which was independent of the network size, the networks settled into a synchronous irregular state, and for very strong excitatory coupling, the networks settled into a synchronous regular state.

The presented results of the collective network dynamics show that theta neurons in the asynchronous irregular balanced state exhibit:

• Deterministic chaos, characterized by positive and finite Lyapunov exponents.

• Extensive chaos, characterized by network size-invariant Lyapunov spectra and a linear increase of the number of positive Lyapunov exponents, attractor dimension and entropy production rate with the number of neurons.

• Fat chaotic attractors of 20-60% of the phase space dimension, implying large capacity for information processing and a huge repertoire of possible network states.

• High entropy production rates of 0.5-1 bit per spike per neuron. Compared to real sensory information provided to cortical neurons of about 1 bit per spike per neuron [93, 94], this implies that sensory information is overwritten at a similar rate as it is encoded.

• Intensified chaos upon activation of excitatory-inhibitory feedback loops, yet qualitatively very similar dynamics in inhibitory and excitatory-inhibitory networks.

• Temporal network chaos similar to spatiotemporal chaos. While a small fraction of neurons participated in the chaotic dynamics at one point in time, almost all neurons participated equally in the chaotic dynamics over time.