Response of Boolean networks to perturbations

C. Fretter and B. Drossel
Institut für Festkörperphysik, TU Darmstadt, Hochschulstraße 6, 64289 Darmstadt, Germany

arXiv:0801.3784v1 [cond-mat.stat-mech] 24 Jan 2008

We evaluate the probability that a Boolean network returns to an attractor after perturbing h nodes. We find that the return probability as a function of h can display a variety of different behaviours, which yields insights into the state-space structure. In addition to performing computer simulations, we derive analytical results for several types of Boolean networks, in particular for Random Boolean Networks. We also apply our method to networks that have been evolved for robustness to small perturbations, and to a biological example.

PACS numbers: 89.75.Hc, 05.45.-a, 89.75.Fb

I. INTRODUCTION

Boolean networks (BN) are used to model the dynamics of a wide variety of complex systems, ranging from neural networks [1] and social systems [2] to gene regulation networks [3], sometimes combined with evolutionary processes [4]. BN are composed of interacting nodes with binary states, typically 0 and 1, that are coupled to each other. The state of each node evolves according to a function of the states observed in a certain neighbourhood, similar to what is done when using cellular automata [5]; but in contrast to cellular automata, BN have no regular lattice structure, and not all nodes are assigned the same update function.

Usually, BN models are studied using deterministic dynamical rules. After a transient time, these networks reach attractors, which are sequences of periodically repeated states. The number and length of such attractors is an important property investigated in BNs. Of equal importance are the sizes of the basins of attraction of these attractors, which are the sets of states leading to the attractors. Some networks may contain periodic sequences in state space that are not "attractors" in the strict sense, because there are no states outside this sequence that are attracted to it. However, in this paper we do not make this distinction, and we call all periodically repeated sequences of states "attractors".

Real networks are often influenced by noise, since molecule concentrations may be small (e.g., in biological systems [6]) or behaviour may be unpredictable (e.g., in social systems). For this reason, it is important to investigate how robust the behaviour found under deterministic update rules is when noise is added. Examples of such studies are Ising-like models placed on a network topology [7, 8], BN with a probabilistic rule for choosing the update function at each time step [9], models with stochastic update sequences [10] and small stochastic delays in the update time [11]. These stochastic models can lead to surprising new results. For instance, for stochastic update sequences it could be shown that the number of attractors (now defined as a recurrent set of states) grows like a power law [10, 11] as a function of the network size, while it grows superpolynomially for parallel update.

In this paper, we investigate the effect of a perturbation on a BN that is updated in parallel and that is on an attractor. The quantity we evaluate is the probability that the network returns to the same attractor after the perturbation, as a function of the size of the perturbation, i.e., of the number of nodes whose states are changed. This leads to a curve that is characteristic of the network and is strikingly different for different types of networks.

The outline of this paper is as follows: First, we investigate two types of simple networks, namely networks consisting of independent nodes and networks consisting of a single loop of nodes. The results then help to understand the behaviour of Random Boolean Networks under perturbations, which are studied in Section III dealing with frozen, critical and chaotic networks. In Section IV, we then investigate a few specific networks, which are not random networks, and we find that their characteristic curves are very different from those of random networks, reflecting for instance their higher robustness to perturbations. Finally, we summarize and discuss our findings in Section V.
II. SIMPLE NETWORKS

A. Independent nodes

Let us first consider a system of N independent nodes. Then the response of each node to the perturbation is independent of the response of the other nodes. The state of each of the N nodes at time t+1 is determined by its state at time t. There are 4 possibilities to assign an update function to such a node: (i) the state of the node is 0 irrespective of the state at the previous time step; (ii) the state of the node is 1 irrespective of the state at the previous time step; (iii) the state of the node at time t+1 is identical to the state at time t; (iv) the state of the node at time t+1 is the opposite of the state at time t, which means that the node alternates between 0 and 1. The first two update functions are constant functions, the third is the "copy" function, the fourth the "invert" function.

We generate a network by assigning update functions to the N nodes. We then initialize the network in a randomly chosen state and wait until it reaches an attractor. In the simple networks considered here, an attractor is reached after one time step. If at least one "invert" function is chosen, the period of the attractor is 2. When a node is perturbed, its response depends on its update function: If the update function is constant, the node returns in the next time step to its previous value. If the update function is "copy", the node remains in its new state and does not return to its previous state. If the update function is "invert", the node continues to oscillate between 1 and 0, however with a phase shift of one time step. When we perturb h nodes, the network returns to the attractor only if all perturbed nodes have a constant function, or if all nodes with an "invert" function are perturbed but none with a "copy" function. Let us focus on the case that N is large and that each of the four update functions is assigned to one quarter of the nodes. The probability that the network returns to its previous attractor after perturbing h nodes is then

    P_ret(h) = \binom{N/2}{h} \Big/ \binom{N}{h},    (1)

which can be approximated by

    P_ret(h) ≃ 1/2^h    (2)

for h/N ≪ 1. For h > N/2, the return probability is 0, since it is no longer possible that all perturbed nodes have constant functions. For N/4 ≤ h ≤ 3N/4, it is in principle possible that all nodes with "invert" functions but none with "copy" functions are perturbed; however, the probability that this occurs is so small that we neglect it here.

For general networks, the return probability has to be evaluated by averaging over different initial states; in the special case considered here, however, the return probability is the same for each initial state.

FIG. 1: P_ret(h) (in percent) of a set of independent nodes as given by Eq. (1); the dashed line is obtained using Eq. (2).

The most important result of this subsection is that for small perturbations the return probability is simply given by an exponential dependence

    P_ret(h) = P_ret(1)^h.

This result will be generally true for small h whenever perturbations at different nodes decay independently from each other. For this reason, we will see an exponential decrease in the examples below when N is large and h is small.
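The closed forms above are easy to check numerically. The following short Python sketch is our own illustration (not code from the paper, and the function names are ours); it evaluates Eq. (1) exactly and compares it with the approximation of Eq. (2).

```python
# Return probability for N independent nodes, with each of the four update
# functions assigned to one quarter of the nodes (Section II.A).
from math import comb

def p_ret_exact(N: int, h: int) -> float:
    """Eq. (1): all h perturbed nodes must hit the N/2 constant-function nodes
    (the rare 'all invert, no copy' case is neglected, as in the text)."""
    if h > N // 2:
        return 0.0
    return comb(N // 2, h) / comb(N, h)

def p_ret_small_h(h: int) -> float:
    """Eq. (2): the approximation 1/2^h, valid for h/N << 1."""
    return 0.5 ** h

if __name__ == "__main__":
    N = 200
    for h in (1, 2, 5, 10, 20):
        print(h, round(p_ret_exact(N, h), 5), round(p_ret_small_h(h), 5))
```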
B. Simple loops

Now, let us consider a simple loop of N nodes.

FIG. 2: A loop of seven nodes.

Each node is connected to its predecessor on the loop, and the update function of each node is either "copy" or "invert". A constant function at one node would freeze the whole loop, in which case P_ret = 1 for all h, and we therefore focus on the more interesting case of no constant function in the loop. A loop with n inversions can be mapped bijectively onto one with (n−2) inversions by replacing two "invert" with two "copy" functions and by inverting the values of all nodes between these two couplings. It is therefore sufficient to distinguish loops with an even or an odd number of inversions, and we call them "even" and "odd" loops respectively. When discussing even loops, we consider loops with only "copy" functions. To odd loops we assign one "invert" function and N−1 "copy" functions. An even loop with a prime number of nodes returns to its initial state after N time steps. If N is not prime, shorter periods exist. Furthermore, if all nodes have the same state, the loop is on a fixed point. An odd loop with a prime number of nodes returns to its initial state after 2N time steps. An odd loop has no fixed points; its shortest attractor has period 2, with alternating 1's and 0's.
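These periods are easy to verify by direct simulation. The sketch below is our own illustration (not from the paper): it implements the loop update rule and measures the recurrence time of a random initial state for an even and an odd loop with N = 7.

```python
# Periods of simple loops: node i copies (or, for one "invert" coupling, inverts)
# the previous value of node i-1.
import random

def step(state, odd=False):
    """One parallel update of the loop; if odd is True, node 0 inverts the value
    it receives from node N-1 (one "invert" function, N-1 "copy" functions)."""
    N = len(state)
    new = [state[(i - 1) % N] for i in range(N)]
    if odd:
        new[0] ^= 1
    return new

def period(state, odd=False):
    """Number of time steps until the initial state recurs."""
    s, t = step(state, odd), 1
    while s != state:
        s, t = step(s, odd), t + 1
    return t

if __name__ == "__main__":
    N = 7                                      # a prime number of nodes
    s = [random.randint(0, 1) for _ in range(N)]
    while len(set(s)) == 1:                    # avoid the fixed point of the even loop
        s = [random.randint(0, 1) for _ in range(N)]
    print("even loop:", period(s))             # -> N
    print("odd loop: ", period(s, odd=True))   # -> 2N for a generic initial state
```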
Figure 3 shows P_ret for an even loop with N = 7, as obtained from computer simulations.

FIG. 3: P_ret(h) (in percent) for an even loop with N = 7.

The return probability is 0 for all odd values of h, and it has the same nonzero value for all even h larger than 0. This happens for all even loops with N being a prime number, as we show in the following. If h nodes are perturbed, the loop returns to the same attractor only if the perturbation leads it to another state of this attractor. All states of the same attractor have the same sequence of 0s and 1s, but rotated around the loop by some number of steps. Only by inverting an identical number of 0s and 1s can we stay on the same attractor, and this explains why P_ret is nonzero only for even h. Now, by fixing the nodes that are to be perturbed and by fixing the number m of steps by which the perturbation of these nodes shall rotate the attractor, we uniquely fix the attractor (apart from an inversion of all nodes). This attractor is found by fixing the state of one node to 1 (or 0) and by requiring that the node m steps ahead has the same state if the first node is not a perturbed node, and the opposite state if the first node is a perturbed node. Then we determine in the same way the state of the node m steps further, and so on, until the state of all nodes is fixed. Since N is a prime number, we do not return to the first node before all other nodes have been visited. The probability to be still on the same attractor after perturbing h nodes is for even h therefore

    P_ret = 2(N−1)/2^N,    (3)

which is the number of configurations that are rotated by m = 1, ..., N−1 steps under the perturbation, divided by the total number of configurations of the loop. If N is not a prime number, there are attractors of different length, and the function P_ret(h) depends in a more complicated way on N.

FIG. 4: P_ret(h) (in percent) of an odd loop with N = 7.

For an odd loop with prime N, P_ret is 1 for h = 0 and h = N, and has otherwise the value P_ret = 2(N−1)/2^N. This result is obtained in a similar way as for even loops: The loop remains on the same attractor after perturbing h nodes if the state generated by the perturbation is among the 2N−2 different states assumed by the loop during the next 2N−1 time steps, but not after N steps (where the state of all nodes is simply inverted, which can only be achieved for h = N). If we fix the nodes to be perturbed and the number m of time steps between the unperturbed and the perturbed state, we can uniquely identify the attractor by fixing the state of one node and then stepping around the loop in steps of size m, assigning to each node the same state as to the previous node if the previous node is not among the perturbed nodes and if the "invert" function is not between the two nodes. Otherwise, the inverted state is assigned.

C. Collections of several loops

If the network consists of several independent loops, the periods of the attractors are the least common multiples of the periods of the loops. Obviously, a collection of loops can remain on the same attractor after a perturbation only if the perturbation produces, on each loop, a state that corresponds to a future state of that loop, and not a state that is not part of the state sequence on this loop. If all loop lengths are different from each other and have no common divisor, every combination of future loop states belongs to the same attractor, and the return probability is identical to the probability that the perturbation generates only such states. Otherwise, if there are two or more loops with periods that have a common divisor, the perturbation must advance all these loops by a multiple of this divisor if the system shall remain on the same attractor. Since the analytical expressions for P_ret become complicated and do not provide special insight, we omit them here. We only note that the function P_ret(h) depends strongly on the lengths and number of loops, and on the common divisors of their lengths.
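As a consistency check of these loop results, the following Monte Carlo sketch (our own, not the authors' code) estimates P_ret(h) for an even loop with N = 7: for even h > 0 the estimate should lie near 2(N−1)/2^N = 12/128 ≈ 0.094 from Eq. (3), and for odd h near 0.

```python
# Monte Carlo estimate of P_ret(h) for an "even" loop of N nodes (all "copy"
# functions). Since the loop dynamics just rotates the state, every state lies on
# an attractor, and returning to the same attractor means that the perturbed
# state is one of the rotations of the unperturbed one.
import random

def cycle_states(state):
    """All states on the attractor of `state` under the parallel loop update."""
    states, s = {tuple(state)}, list(state)
    while True:
        s = s[-1:] + s[:-1]                 # node i copies node i-1
        if tuple(s) in states:
            return states
        states.add(tuple(s))

def p_ret_mc(N=7, h=2, trials=100_000):
    hits = 0
    for _ in range(trials):
        s = [random.randint(0, 1) for _ in range(N)]
        cyc = cycle_states(s)
        flipped = list(s)
        for i in random.sample(range(N), h):
            flipped[i] ^= 1
        hits += tuple(flipped) in cyc
    return hits / trials

if __name__ == "__main__":
    for h in (1, 2, 3, 4):
        print(h, round(p_ret_mc(h=h), 4))   # ~0.094 for even h, ~0 for odd h
```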
III. RANDOM BOOLEAN NETWORKS

A random Boolean network (RBN) is constructed by choosing for each node at random k nodes from which it receives its input, and an update function that assigns to each of the 2^k states of the k input nodes an output 1 or 0. The update function of each node is chosen at random among all 2^(2^k) possible update functions [12, 13]. All nodes are updated in parallel. Depending on the value of k and the probabilities assigned to the different update functions, the dynamics of the network is either in the frozen or in the chaotic phase. At the boundary between the two are critical networks. If all update functions are assigned the same probability, RBNs with k = 1 are in the frozen phase, networks with k = 2 are critical, and networks with k > 2 are chaotic. In the following, we consider perturbations of these three types of networks.
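The following Python sketch is our own minimal illustration of this construction and of the quantity studied throughout the paper (it is not the authors' code, and all function names are ours): it builds an RBN with k inputs and a random update function per node, lets it reach an attractor, flips the states of h nodes, and checks whether the dynamics returns to the same attractor.

```python
# Construction of a random Boolean network and a Monte Carlo estimate of the
# return probability P_ret(h) for one such network.
import random

def random_rbn(N, k, rng):
    """Each node gets k randomly chosen input nodes and one of the 2^(2^k)
    possible update functions, drawn uniformly as a random truth table."""
    inputs = [rng.sample(range(N), k) for _ in range(N)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(N)]
    return inputs, tables

def update(state, inputs, tables):
    """One parallel update step."""
    new = []
    for inp, table in zip(inputs, tables):
        idx = 0
        for j in inp:
            idx = (idx << 1) | state[j]
        new.append(table[idx])
    return tuple(new)

def attractor(state, inputs, tables):
    """The set of states on the attractor reached from `state`."""
    seen = {}
    while state not in seen:
        seen[state] = len(seen)
        state = update(state, inputs, tables)
    first = seen[state]                      # start of the cycle
    return frozenset(s for s, t in seen.items() if t >= first)

def p_ret(N, k, h, rng, trials=200):
    """Return probability of one network, averaged over random initial states
    and over random choices of the h perturbed nodes."""
    inputs, tables = random_rbn(N, k, rng)
    hits = 0
    for _ in range(trials):
        s0 = tuple(rng.randint(0, 1) for _ in range(N))
        att = attractor(s0, inputs, tables)
        s = rng.choice(sorted(att))          # perturb a state on the attractor
        flipped = list(s)
        for i in rng.sample(range(N), h):
            flipped[i] ^= 1
        hits += attractor(tuple(flipped), inputs, tables) == att
    return hits / trials

if __name__ == "__main__":
    rng = random.Random(1)
    for h in range(6):
        print(h, round(p_ret(N=14, k=2, h=h, rng=rng), 3))
```

Keeping N small keeps the attractor search cheap; for the network sizes used in the figures of this paper the same idea applies, but the bookkeeping has to handle much longer transients and attractors.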
A. RBNs in the frozen phase

In the frozen phase, all nodes apart from a small number (which remains finite in the limit of infinite system size) assume a constant value after a transient time. If in the stationary state the value of one node is changed, this perturbation propagates during one time step on average to less than one other node. If all nodes that become frozen after some time are removed from the network, there remain the nonfrozen nodes. In the limit N → ∞, it can be shown that these nonfrozen nodes are connected to simple loops (the "relevant loops") with trees rooted in the loops [19]. The nodes in the trees are slaved to the dynamics on the loops, and the perturbation of a node on a tree does not induce a change of the attractor. The perturbation of a frozen node can affect a couple of other nodes to which the perturbation may propagate. If these other nodes are also frozen or part of a nonfrozen tree, they will soon return to the behaviour they showed before the perturbation. However, if the perturbation affects a node sitting on a relevant loop, the attractor will usually be changed. If we denote with p_r the probability that the perturbation of one node will affect a node on a relevant loop, we obtain P_ret(h) = (1 − p_r)^h for small h. In the limit N → ∞, the probability p_r must become proportional to 1/N, since the number of nodes on relevant loops remains finite in this limit.

From these properties of frozen networks in the limit of large N, we can conclude that P_ret(h) is close to 1 for h ≪ N, since the probability of perturbing one of the relevant nodes is very small. When h becomes of the order N p_r, relevant nodes are perturbed with a considerable probability, and the shape of the function P_ret(h) depends on the properties of the relevant loops. The size and number of relevant loops is in general different in different networks. From our discussion in the previous section for collections of independent loops, we conclude that there is no self-averaging of P_ret(h) in the limit of large N. Also, the probability that the number of relevant nodes is 0 approaches a nonzero constant in the limit N → ∞, and therefore some networks always have a constant return probability P_ret(h) = 1.

Figure 5 shows the result of a computer simulation of three networks in the frozen phase. Half of the nodes were assigned the value k = 1 and half of the nodes the value k = 2. (This combination of k values was chosen because networks with ⟨k⟩ = 1 are almost always completely frozen. Even for the value ⟨k⟩ = 1.5, only few networks were not completely frozen.) One can see the initial exponential decay and the broad range of curve shapes when h/N is not small anymore. The behaviour of the curves when h approaches N depends on the probability that the inversion of the state of all nodes leaves the network in the same basin of attraction. As we can see in this and other figures in this paper, this probability varies widely between networks and is quite often larger than P_ret(N/2).

FIG. 5: P_ret(h) (in percent) of three frozen networks with N = 200, ⟨k⟩ = 1.5.

B. RBNs in the chaotic phase

In the chaotic phase, initially similar configurations diverge exponentially. Attractors are usually long, and a non-vanishing proportion of all nodes keep changing their state even after long times. Networks with k = N are easiest to understand among the chaotic networks, since the successor of each network state is a randomly chosen other network state [15]. In state space, we have therefore a "network" with one randomly chosen "input" (i.e., successor) for each "node" (i.e., state). In such a k = N network, every perturbation leads the network to an arbitrary place in state space, and we therefore expect P_ret(h) to be a constant function, with the exception of the value 1 at h = 0. The average value of this constant can be determined from what is known for k = N networks: The value of P_ret on the plateau is given by the average probability that a randomly chosen state in state space belongs to the same basin of attraction as the randomly chosen initial network state. We define the weight w_ρ of an attractor ρ as the length of the attractor plus the number of basin states draining into that attractor, normalized by the size of the state space (2^N), so that Σ_ρ w_ρ = 1 [16]. We therefore have

    P_ret = \sum_ρ w_ρ^2.    (4)

In the limit N → ∞, the average number of attractors with a weight between w and w + dw is given by [16, 17]

    g(w) = \frac{1}{2w\sqrt{1-w}},    (5)

leading to

    P_ret = \int_0^1 g(w) w^2 \, dw = 2/3.    (6)

This analytical result applies to the ensemble average of all networks with k = N. In a given network, the value of the plateau usually deviates from 2/3, since always a few large basins dominate the value of P_ret, and the weights w of these basins differ between networks even in the thermodynamic limit.

FIG. 6: P_ret(h) (in percent) of three chaotic networks with N = 60, k = 3.

Now, let us turn to chaotic networks with fixed k (smaller than N), for which no such analytical results are known. Figure 6 shows P_ret as a function of h for three networks with k = 3 and N = 60. We see a plateau for h > 2, which means that perturbing one or two nodes does not yet necessarily carry the network to a random state, but perturbing several nodes does. The reason for this is that a certain proportion of nodes in chaotic networks with fixed k assume a constant value after some time, and perturbing only such nodes may not change the attractor. For larger h, it becomes very likely that relevant nodes are changed by a perturbation, and then the situation is similar as in k = N networks, which have no frozen core.

Chaotic networks with fixed value of k and large N have been shown to share many properties with chaotic k = N networks. From [16], it appears that chaotic networks with k < N have the same set of basin weights as those with k = N, which would mean that the average height of the plateau should be the same for all chaotic networks in the thermodynamic limit. We tested this assumption by simulating smaller networks (of size 20) on the computer for different values of k. The average of the return probability for a perturbation of size N/2 = 10 over approximately 10000 networks is P_ret(10) = 0.6764, 0.675, 0.6735, 0.6686 and 0.6675 for k = 3, 4, 5, 6 and 9 respectively. These values are impressively close to 2/3 given the small system size. For larger k, the value is closer to 2/3, because the number of nonfrozen nodes and therefore the size of the (nontrivial part of the) state space increases with increasing k. We also simulated systems with larger values of N; however, it is very hard to obtain good statistics for larger N because the size of the state space and the length of attractors increase exponentially with N. From the data we have, it appears that the value of the plateau moves closer to 2/3 with increasing N. Figure 7 shows the probability distributions of P_ret(10), from which the above-mentioned averages have been obtained. This distribution is very broad and does not appear to become narrower for larger N. There is no self-averaging of the plateau value of P_ret.

FIG. 7: Probability distribution of the value P_ret(10) for approximately 10000 networks of size N = 20, for different values of k.

In summary, we find that chaotic networks with large N have a return probability P_ret(h) that decays rapidly for small h and then reaches a plateau, the ensemble average of which lies at 2/3.
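The prediction of Eqs. (4)-(6) can be checked with a small random-map simulation. The sketch below is our own illustration (not the authors' code): for k = N the successor of each state is a uniformly random state [15], so we draw a random successor for every state, label each state by the cycle it flows into, and average Σ_ρ w_ρ² over many maps; the result should lie close to 2/3 when the state space is large.

```python
# Ensemble average of P_ret = sum of squared basin weights for random state-space
# maps, which stand in for k = N networks.
import random

def attractor_labels(succ):
    """Label every state of a functional graph by the cycle (attractor) it reaches."""
    n, label, next_id = len(succ), [-1] * len(succ), 0
    for start in range(n):
        path, on_path, s = [], set(), start
        while label[s] == -1 and s not in on_path:
            on_path.add(s)
            path.append(s)
            s = succ[s]
        lab = label[s] if label[s] != -1 else next_id
        if label[s] == -1:                  # a new cycle was closed on this path
            next_id += 1
        for t in path:
            label[t] = lab
    return label

def sum_w_squared(num_states, rng):
    succ = [rng.randrange(num_states) for _ in range(num_states)]
    labels = attractor_labels(succ)
    counts = {}
    for lab in labels:
        counts[lab] = counts.get(lab, 0) + 1
    return sum((c / num_states) ** 2 for c in counts.values())

if __name__ == "__main__":
    rng = random.Random(0)
    num_states = 2 ** 12                    # stands in for 2^N with N = 12
    samples = [sum_w_squared(num_states, rng) for _ in range(200)]
    print("ensemble average of P_ret:", sum(samples) / len(samples))   # ~ 2/3
```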
C. Critical RBNs

Critical networks are at the boundary between the frozen and the chaotic phase, and neighbouring configurations diverge only algebraically with time. Just as frozen networks, they have a large frozen core. For large N, the number of nodes that do not become frozen after some time is proportional to N^(2/3) [19]. Most of them are irrelevant for determining the attractors. When the frozen core is removed from the network, the remaining nodes are connected to relevant components, most of which are simple loops and all of which contain loops, and with trees rooted in these loops. The number of nodes in the relevant components is proportional to N^(1/3) for large N. The mean number of nodes affected by the perturbation of a single node is ∼ N^(1/3) for large N. The probability that a perturbation of size h = 1 affects a relevant node is given by the probability ∼ N^(−2/3) that a given node is affected by the perturbation, times the number of relevant nodes: 1 − P_ret(1) ≃ aN^(−1/3). We therefore obtain

    P_ret(h) ≃ (1 − aN^{-1/3})^h    (7)

for h ≪ N^(1/3). This exponential decay can be seen in Figure 8 for small h. For h > N^(1/3), the probability of perturbing a relevant node is not small anymore, and P_ret reaches a plateau. A rough estimate of the dependence on N of the value of P_ret on the plateau is obtained by the following reasoning: The number of attractors increases with N roughly as 2^(N^(1/3)), and the basins of attraction have a size of the order 2^N / 2^(N^(1/3)). Using Eq. (4), this gives the estimate P_ret ∼ 2^(−bN^(1/3)) on the plateau, with some constant b. The height of the plateau decreases rapidly with increasing N because of the vast number of different attractors of comparable basin size.

FIG. 8: P_ret(h) (in percent) of three critical networks with N = 100, k = 2.

Taking these results together, we expect that in large critical networks the function P_ret(h) decays from 1 to a value close to 0 when h increases from 0 to N^(1/3) and then stays on this plateau as h increases further. The value of the plateau decreases as an exponential function of N^(1/3). However, it is known that the scaling with N^(2/3) and N^(1/3) of the nonfrozen and relevant nodes becomes clearly visible only in huge networks (of the order 10^6), and smaller networks may show broad distributions in the numbers of these nodes. Figure 8 shows three examples of P_ret for critical networks. For such small network sizes, the critical features of P_ret(h) derived for the thermodynamic limit cannot yet be seen.
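Before turning to specific networks, we note that the phase of a given network can also be checked numerically from the divergence of initially similar configurations mentioned above: a one-node perturbation typically dies out in the frozen phase, neither dies out nor explodes at criticality, and spreads quickly in the chaotic phase. The sketch below is our own rough diagnostic, not a method from the paper; it repeats the hypothetical random_rbn() and update() helpers of the earlier sketch so that it is self-contained, and tracks the average Hamming distance between a trajectory and a copy with one flipped node.

```python
# Spreading of a single-node perturbation for k = 1 (frozen), k = 2 (critical)
# and k = 3 (chaotic) networks with all update functions equally probable.
import random

def random_rbn(N, k, rng):
    inputs = [rng.sample(range(N), k) for _ in range(N)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(N)]
    return inputs, tables

def update(state, inputs, tables):
    new = []
    for inp, table in zip(inputs, tables):
        idx = 0
        for j in inp:
            idx = (idx << 1) | state[j]
        new.append(table[idx])
    return tuple(new)

def hamming_spread(N, k, steps, rng, samples=50):
    """Average Hamming distance between the perturbed and unperturbed trajectories."""
    total = [0.0] * (steps + 1)
    for _ in range(samples):
        inputs, tables = random_rbn(N, k, rng)
        a = tuple(rng.randint(0, 1) for _ in range(N))
        b = list(a)
        b[rng.randrange(N)] ^= 1            # perturb a single node
        b = tuple(b)
        for t in range(steps + 1):
            total[t] += sum(x != y for x, y in zip(a, b))
            a = update(a, inputs, tables)
            b = update(b, inputs, tables)
    return [d / samples for d in total]

if __name__ == "__main__":
    rng = random.Random(2)
    for k in (1, 2, 3):
        print("k =", k, [round(d, 1) for d in hamming_spread(N=200, k=k, steps=10, rng=rng)])
```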
IV. SOME SPECIFIC NETWORKS

In the ensemble of all random networks of a not too large size, there are some networks whose characteristic curves are very different from conventional ones, as shown in the next two figures.

FIG. 9: P_ret(h) (in percent) of a critical network with N = 100, k = 2.

FIG. 10: P_ret(h) (in percent) of another critical network with N = 300, k = 2.

The first curve is that of a very robust network, which additionally has the property of returning more easily to the original attractor when the perturbation is larger. The second curve is that of a network that has no plateau but a very extended decrease of P_ret with h. A closer inspection of this network reveals that, although its parameter values classify it as critical, there is no frozen core but all nodes are relevant and part of a single complex relevant component. This means that the state space is far from random, and that perturbations of different sizes carry the network to different regions in state space.

Let us compare the first curve to that of networks that have been evolved for robustness to small perturbations [18], shown in Figure 11. These networks have been evolved by modifying the connections and functions and by selecting for a high probability of returning to the same attractor after perturbing one node. Interestingly, although only a large value of P_ret(1) has been imposed in the system, P_ret is very high for all perturbation sizes. Many networks obtained by the same procedure even show P_ret(h) = 1 for all h.

FIG. 11: P_ret(h) (in percent) of a network of 60 nodes evolved for robustness.

The network used for producing Figure 12 is the core part of the cell cycle regulation network of budding yeast, as represented in [20]. An important property of this network is that it has a prominent fixed point attracting 86% of all network states. The diagram shows the surprising feature that P_ret(h) has its minimum at h = 1. This means that the main attractor, which is a fixed point, is not very stable under perturbations of one node. Further analysis shows that there are three other fixed points that differ from the main fixed point by the state of only one node. As suggested in [20], the cell is likely to be waiting for a new input when the network is at its fixed point. This might explain why this fixed point is rather sensitive to small perturbations.

FIG. 12: P_ret(h) (in percent) of the cell cycle regulation network of budding yeast.

V. CONCLUSIONS

In this paper, we have evaluated the probability P_ret(h) that a Boolean network returns to an attractor after perturbing h nodes, averaged over different initial states of the network. We found that P_ret(h) can display a variety of different shapes, which yields insights into the state-space structure. If the response of the network to the perturbation of several nodes is independent for each perturbed node, P_ret(h) decays exponentially with h for small h. Larger perturbations are of course not independent, and if the perturbation leads the system to a random place in state space, P_ret(h) shows a plateau for these perturbation sizes. When the size of the perturbation approaches the total number of nodes, P_ret(h) can show various types of behaviour, including a decrease to zero or an increase to a value larger than the plateau. The reason is that the response to inverting the state of all nodes depends strongly on the network structure. We obtained analytical results for Random Boolean Networks in the limit of large N. For critical networks, P_ret(h) decreases rapidly to (almost) 0 for large N. For chaotic and frozen networks, P_ret(h) remains nonzero for large N and is not self-averaging, which means that the shape of P_ret(h) differs widely between different networks. The ensemble average of the plateau value of P_ret for chaotic networks is 2/3 for large N. The fact that this plateau value is much higher than in critical networks means that chaotic networks are more robust than critical networks when perturbations affect a nonvanishing proportion of all nodes. The reason for this is that chaotic networks have a much smaller number of attractors than critical networks, and therefore the probability that a perturbation carries the network into the basin of attraction of a different attractor is larger in critical networks than in chaotic networks.
Similarly interesting and surpris- waiting for a new input when the network is at its fixed ing is the result that a biological network that has been point. This might explain why this fixed point is rather characterized as being very robust is pretty sensitive to sensitive to small perturbations. perturbations of size h=1, while P is much larger for ret 8 largerperturbations. Itremainstobeseenifthis feature VI. ACKNOWLEDGMENTS occurs also in other biological networks. To conclude, characterizing the dynamical stability of networks by a function P (h) gives far more informa- ret tion about the state space structure of the network than using a single number, such as the average “sensitivity” We thank C. Hamer for programming the yeast net- of nodes. work and T. Mihaljev for useful discussions. [1] M. Rosen-Zvi, A. Engel and I. Kanter, [12] S. A. Kauffman,J. Theor. Biol. 22 (1969), p. 437. Phys.Rev.Lett. 87, 078101 (2001). [13] S. A. Kauffman,Nature 224 (1969), p. 177. [2] A.A.Moreira,A.Mathur,D.DiermeierandL.A.N.Ama- [14] X. Qu, M. Aldana, Leo. P. Kadanoff, Numerical and ral, PNAS 101, 12085 (2004). Theoretical Studies of Noise Effects in the Kauffman [3] M.C. Lagomarsino, P. Jona and B. Bassetti, Model. Journal of Statistical Physics, 109 (5/6), (De- Phys.Rev.Lett. 95, 158701 (2005). cember 2002), 967-986. [4] K.E. Bassler, C. Lee and Y. Lee, Phys. Rev. Lett. 93, [15] Leo Kadanoff and Susan Coppersmith and Maximino 038101 (2004). Aldana,Boolean Dynamics with Random Couplings, in [5] H.J. Herrmann, “Cellular automata” in Nonlinear Phe- Perspectives and Problems in Nonlinear Science, A cel- nomena in Complex Systems, ed. A.N. Proto (North- ebratory volume in honor of Lawrence Sirovich, eds. Holland, Amsterdam, 1988), pp. 151-199. E. Kaplan,J. E.Mardsen andK.R.Sreenivasan(2003), [6] H.H.McAdamsandA.Arkin,Proc.Nat.Acad.Sci.USA Springer, New York. 94, 814 (1997). [16] P. Krawitz and I. Shmulevich,Basin Entropy in Boolean [7] J. Indekeu,physics/0310113 (2003). Network Ensembles, 2007 cond-mat/0702114 [8] A. Aleksiejuk, J.A. Holyst, D. Stauffer, Physica A 310, [17] B. Derrida, J. Phys. 48, 971 (1988). 260 (2002). [18] A.Szejka,B.Drossel, Eur.Phys.J.B56373-380 (2007) [9] I.Shmulevich,E.R.Dougherty,andW.Zhang,Proceed- [19] V. Kaufman, T. Mihaljev, B. Drossel, Phys. Rev. E 72, ings of the IEE 90, 1778 (2002). 046124 (2005) [10] F. Greil and Barbara Drossel, Phys. Rev. Lett. 95, [20] F. Li, T. Long, Y. Lu, Q. Ouyang, C. Tang, PNAS 101 048701 (2005). 4781-4786 (2004) [11] K. Klemm and S. Bornholdt, Phys. Rev. E 72 (2005) 055101(R).
