Abstract

We propose a stochastic nonlinear system to model the gating activity coupled with the membrane potential of a typical neuron. It distinguishes two different levels: a macroscopic one, for the membrane potential, and a mesoscopic one, for the gating process through the movement of its voltage sensors. This nonlinear system can be cast as a Hodgkin-Huxley-like model that links the two levels, unlike the original deterministic Hodgkin-Huxley model, which is positioned at the macroscopic scale only. We also show that an interacting particle system can be used to approximate our model, an approximation technique analogous to the jump Markov processes used to approximate the original Hodgkin-Huxley model.

1. Introduction

In 1952, Hodgkin and Huxley proposed their famous model for the membrane potential dynamics of a typical neuron, based on observations made on the squid giant axon. Basically, the model describes the mean behavior of the potassium (K⁺) and sodium (Na⁺) ion channels, whose states (open/closed) are determined by a gating process acting inside those channels. The proposed model is a four-dimensional system of nonlinear equations, fully coupled through the membrane potential (or voltage) and the probability that a representative gate/ion channel of each species is open (see [1]). Since it describes the overall membrane potential deterministically through such a mean field approach, it can be regarded as a macroscopic process relative to all the processes involved in neuronal activity.

In order to account for the internal fluctuations, or channel noise, of a neuron (see [2]), stochastic versions of the original Hodgkin-Huxley model have been suggested. Many of them take into account the underlying stochasticity of the gating activity by describing the open/closed processes as jump voltage-coupled Markov processes and by using empirical measures instead of the probability measures of the corresponding open times (see [3–6]). Such methods are consistent in the sense that, when the number of channels or gates goes to infinity, the original deterministic Hodgkin-Huxley equations are recovered (see [5, 6]).

In this work, beyond considering the inherent stochasticity of such phenomena, we want to include a continuous dynamical model for the gating activity, because the discrete two-state (open/closed) description is only an approximation of the metastable states of the continuous movement of the proteins involved.

The voltage sensors are proteins whose conformational positions are responsible for the states of the gating activity (see [7]). This internal process is known as the voltage-gated ion channel process. A continuous state space stochastic process seems suitable for describing the position of such proteins, and this is what we propose. Due to the nature and scale at which we tackle this, we say that the dynamics representing the voltage sensors are located at a mesoscopic scale or level. Thus, our proposed system considers two different levels, the macroscopic and the mesoscopic one, which are fully coupled through a voltage-dependent link function. We call our model a Hodgkin-Huxley-like model; it preserves the main structural characteristics of the original voltage equation. Describing the gating phenomenon from a continuous point of view is thus the main motivation for this work.

Our model is a stochastic nonlinear system, where the nonlinearity is due to the intervention of the probability law of the stochastic process in its own dynamics (see [8–12]). In our case, this probability represents the probability that a representative voltage sensor of any species “pulls up” its corresponding gate, which is equivalent to the probability that the corresponding channel is open in the original or classical model.

The mathematical consistency of our proposal is evaluated by means of an approximation via a particle system, that is, a system of several interacting voltage sensors. This means that our model shares features with the original macroscopic case with respect to the convergence of empirical measures to the corresponding probability law. In our case, this convergence of the empirical measures of a stochastic particle system approximating our ideal system is known as propagation of chaos.

Specifically, our propagation of chaos result is as follows: consider the voltage equation depending on empirical measures instead of the probability that the (potassium, sodium) ion channels are in an open state, as in the original Hodgkin-Huxley equations. Suppose that those empirical measures depend on the behavior of certain continuous state space stochastic processes representing the interacting voltage sensors of typical (potassium, sodium) ion channels. Then, when the number of sensors (or channels) goes towards infinity, we will recover a Hodgkin-Huxley-like system which has two remarkable fully coupled components: the voltage equation depending on the probability laws of the corresponding voltage sensor positions (macroscopic part) and the stochastic equations for the position of those voltage sensors (mesoscopic part).

This result extends earlier ideas for the description of the ion channel dynamics: from a discrete state space of open/closed gates to a continuous state space for the position of the voltage sensors. This extension makes the propagation of chaos property necessary, and it replaces the approximation via jump voltage-coupled Markov processes.

Another appealing property of our model is recurrence, which has a biological interpretation. This work also generalizes the ideas in [13].

In Section 2, we introduce the relevant class of stochastic nonlinear processes, present our model, and establish its main features.

In Section 3, the connection of our model with the original Hodgkin-Huxley model is given, as well as its mathematical justification.

Conclusions are given in Section 4.

2. Our Model

2.1. Nonlinear Processes and Propagation of Chaos

We are going to introduce some basic facts about a certain class of nonlinear stochastic differential equations which are useful for describing situations where individuals (particles, components of a system, etc.) interact with each other through a mean field force. Let a continuous-time real-valued stochastic process, defined on some complete probability space, satisfy the nonlinear equation (1), driven by a Brownian motion and depending on the law of the process itself. This kind of system has been widely studied under different forms of the drift and under extensions of the diffusion part (see, e.g., [8–12, 14]). As an example, (1) is a special case of the nonlinear system considered in [11], where the existence and uniqueness of a solution of (1) are proved when the initial law has a finite second moment and the drift satisfies a Lipschitz condition (see Theorem 2.2 therein, which states the existence and uniqueness of a solution in the law sense, although a stronger result is proved). That analysis is based on a contraction argument for the Wasserstein metric, defined as an infimum over the set of probability measures whose marginals are the two laws being compared. The nice feature of this metric is that the resulting metric space is complete and separable (see [15]). Here we say that (1) has a unique solution pathwise and in law, which for a usual Itô diffusion is analogous to saying that it has a unique strong solution.
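A minimal sketch, in notation of our own choosing, of the standard McKean-Vlasov form assumed for (1) and of the 2-Wasserstein metric usually employed in the contraction argument:

\[
dX_t = b(X_t, u_t)\,dt + \sigma\,dW_t, \qquad u_t = \mathrm{Law}(X_t), \quad u_0 \in \mathcal{M}_2(\mathbb{R}),
\]
\[
\rho(\mu, \nu) = \Bigl( \inf_{m \in \Lambda(\mu,\nu)} \int |x - y|^2 \, m(dx, dy) \Bigr)^{1/2},
\]

where W is a Brownian motion, \(\mathcal{M}_2(\mathbb{R})\) is the set of probability measures with finite second moment, and \(\Lambda(\mu,\nu)\) is the set of probability measures on \(\mathbb{R}\times\mathbb{R}\) whose marginals are \(\mu\) and \(\nu\).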

The probability law of (1) follows a special parametrization of the McKean-Vlasov equation. This is a nonlinear equation stated in weak form against test functions with compact support and derivatives of any order, the unknown being a measure. In this case, it describes the Fokker-Planck equation for the temporal evolution of the law of (1) (also called the Fokker-Planck McKean-Vlasov equation), with coefficients identified from (1). Other interesting examples, arising in possibly singular cases of the drift, can be found in [10].
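A minimal sketch, under the same notational assumptions as above, of the weak (Fokker-Planck McKean-Vlasov) equation solved by the law \(u_t\) of (1):

\[
\frac{d}{dt} \int_{\mathbb{R}} \varphi(x)\, u_t(dx) = \int_{\mathbb{R}} \Bigl[ b(x, u_t)\, \varphi'(x) + \tfrac{\sigma^2}{2}\, \varphi''(x) \Bigr] u_t(dx), \qquad \varphi \in C_c^{\infty}(\mathbb{R}).
\]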

The law is considered to describe a mean field force. That is, two individuals obeying (1), driven by independent and identically distributed (iid, for short) Brownian motions, are “interacting” through their common law. But this is an idealized situation because, probabilistically, they are not interacting (they are iid). It is a well-known fact that, under some conditions, we can approximate the law of (1) on bounded intervals of time by the interacting particle system (5), in which the driving Brownian motions are iid and the law is replaced by the empirical measure of the particles. The main result is as follows: for a fixed particle index, if the ideal process solves (1) driven by the same Brownian motion and initial condition as that particle, then the mean-square distance between the two processes vanishes uniformly on any finite time horizon as the number of particles goes to infinity (see, e.g., Theorem 2.3 in [11]). This property is known as propagation of chaos, a term attributed to Kac ([16]).
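As an illustration only, the following is a minimal Euler-Maruyama sketch of an interacting particle system of type (5); the particular drift (a double-well gradient plus an attraction toward the empirical mean) and all names are our own assumptions, not the author's coefficients.

    import numpy as np

    def simulate_particles(N=500, T=5.0, dt=1e-3, sigma=0.5, seed=0):
        """Euler-Maruyama for N interacting particles approximating a
        McKean-Vlasov SDE: drift = -U'(x) + attraction to the empirical mean."""
        rng = np.random.default_rng(seed)
        x = rng.normal(0.0, 1.0, size=N)        # iid initial positions
        for _ in range(int(T / dt)):
            grad_U = x**3 - x                   # U(x) = x^4/4 - x^2/2 (double well)
            mean_field = x.mean() - x           # interaction through the empirical measure
            drift = -grad_U + 0.5 * mean_field
            x = x + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=N)
        return x

    if __name__ == "__main__":
        xT = simulate_particles()
        print("empirical mean:", xT.mean(), "empirical 2nd moment:", (xT**2).mean())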

In this work, and with the Hodgkin-Huxley model ([1]) as a reference, we are going to suggest that when the gating process in a typical potassium or sodium ion channel is seen as a continuous-space stochastic process describing the movement of a representative voltage sensor, the law of the opening times can be approximated via interacting particle systems as in (5). This replaces the classical view of the two-state jump voltage-coupled Markov process for the gating activity (see [3–6]). Due to this change of scale, we say that the interacting particle system is located at a lower level than the general membrane potential. If the membrane potential is considered as a macroscopic process, which is natural since the voltage equation describes the deterministic and general membrane potential through a mean field approach, the voltage sensor dynamics will be seen as a mesoscopic process (where, for us, the ion dynamics are at the microscopic scale).

2.2. The Membrane Potential and the Voltage-Gated Process Seen as a Nonlinear System

To describe the coupled evolution of the membrane potential and the voltage-gated process through the position of the voltage sensors, we distinguish two different scales according to their nature: the first follows a deterministic general dynamic (macroscopic), and the second follows a stochastic local dynamic in a continuous state space (mesoscopic). The classical Hodgkin-Huxley equations explicitly describe the mean performance of the potassium and sodium channels depending on the voltage evolution. The paradigm is that each channel contains 4 gates, each of which can be in one of two states: open or closed. A channel is in an open state (conducting) if all its gates are open. Otherwise, the channel is in a closed state (nonconducting).

Such gates and their corresponding voltage sensors are proteins, and together they form the so-called voltage-gated process. A potassium channel has its 4 gates of the same type, and a sodium channel has 3 of one type and 1 of another. Although all of this comes from the classical formulation of the Hodgkin-Huxley model, as technology has advanced (see [17]), more precise descriptions of how those proteins operate have been given (see, e.g., [7]).

For our mathematical treatment, it is enough to consider a simple system with a single gate (voltage sensor) type (the extension arises naturally, as we will see later). A simple scheme for the states of an ion channel is depicted in Figure 1.

The voltage sensors are responsible for the opening/closing times of their associated gates; therefore, the conformational state of those sensors determines the channel state. The movement of a representative voltage sensor can be seen as a diffusion in a double-well potential, where one well represents the open state and the other represents the closed state (see Figure 2).

A typical diffusion in a symmetric double-well potential, as in Figure 2, is driven by a Brownian motion with a constant noise intensity. (Just for simplicity, we chose the double-well potential centered at zero, with its minima placed symmetrically about the origin.) To use this kind of process for the position of a voltage sensor, we need to add a continuous voltage-dependent force, a function of the membrane potential or voltage evolution. That function is responsible for the variation in the depth of the basins of the potential: according to the values of the voltage, there are periods where the voltage sensor tends to open (or “to go up”) and others where it tends to close (or “to go down”). For example, when the force takes positive values, the basin on the right-hand side becomes deeper than the one on the left-hand side; just the opposite occurs when it takes negative values. Thus, the position of the voltage sensor will be described by a coupled voltage-sensor process, where the sensor position satisfies (7). Note that the new potential is the original double-well potential tilted by the voltage-dependent force; it remains a double-well potential if and only if the tilt stays below a critical value.
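A minimal numerical sketch of a voltage-tilted double-well sensor in the spirit of (7), with the voltage frozen at a constant value; the quartic potential U(y) = y⁴/4 − y²/2, the linear tilt, and all parameter values are our own illustrative assumptions.

    import numpy as np

    def simulate_sensor(V=20.0, T=50.0, dt=1e-3, sigma=0.6, y0=-1.0, seed=1):
        """Euler-Maruyama for dY = (-U'(Y) + F(V)) dt + sigma dW,
        with U(y) = y^4/4 - y^2/2 and an assumed linear tilt F(V) = 0.02 V."""
        rng = np.random.default_rng(seed)
        F = 0.02 * V                                 # assumed voltage-dependent force
        n = int(T / dt)
        y = np.empty(n + 1)
        y[0] = y0
        for k in range(n):
            drift = -(y[k]**3 - y[k]) + F            # tilted double-well gradient
            y[k + 1] = y[k] + drift * dt + sigma * np.sqrt(dt) * rng.normal()
        return y

With a positive tilt the trajectory spends most of its time in the right-hand well (the “up” state), in agreement with the depth argument above.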

To complete our model, we need to provide the dynamical behavior of the voltage. As the general voltage evolution of a neuron is usually modeled via a mean field approach to the gating activity, following the Hodgkin-Huxley paradigm we need to introduce the law of the voltage sensor position. Thus, our ideal complete system is given by (8), in which an appropriate function describes the voltage evolution. Here, (8) combines two processes at different levels: the membrane potential, or general voltage, at a macroscopic level, and the voltage sensor dynamics at a mesoscopic level. The link function plays the role of connecting those two scales. In the next section we relate (8) to the Hodgkin-Huxley model in a more precise way.
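A minimal sketch, in our own notation, of the coupled macroscopic/mesoscopic structure we read (8) as having; the precise voltage drift and link function are specified by the author in Section 3:

\[
\frac{dV_t}{dt} = G(V_t, u_t), \qquad u_t = \mathrm{Law}(X_t),
\]
\[
dX_t = \bigl( -U'(X_t) + F(V_t) \bigr)\,dt + \sigma\,dW_t,
\]

where V is the membrane potential, X the voltage sensor position, U the double-well potential, F the voltage-dependent force, and G a voltage drift depending on the sensor law through the link function.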

Now, we are going to impose some assumptions to ensure the existence and uniqueness of a solution of (8). These are as follows.
(A.1) The initial data are given, with the initial sensor law belonging to the set of bounded probability measures.
(A.2) The voltage drift is a Lipschitz function with a fixed constant.
(A.3) The voltage-dependent force is a Lipschitz function that is, moreover, bounded between two constants.
Before showing that, under (A.1)–(A.3), (8) has a unique solution pathwise and in law, we show that (8) can be written as in (1).

Collect the voltage and the sensor position into a single vector (using matrix/vector transposition) and consider the corresponding drift and diffusion maps. With this notation, (8) can be written as (11), which has the same form as (1).

The drift of (11) is only locally Lipschitz, since the double-well gradient is polynomial. Thus, the usual globally Lipschitz assumption for the existence and uniqueness of a solution of (11) is not satisfied. Nevertheless, we can use a stopping argument to obtain it. First, note that by (A.3) the voltage-dependent force stays between two constants. Then, (7) can be compared with two equations of the same type in which the force is replaced by each of those constants, driven by Brownian motions indistinguishable from that of (7); the sensor process is therefore sandwiched between the two corresponding comparison processes. These last two processes are well defined, as is shown in the following proposition.

Proposition 1. Let a real-valued stochastic process, defined on some complete probability space, be given by (14): a diffusion driven by a Brownian motion, whose drift is the (polynomial) gradient of a double-well potential tilted by a constant force and whose diffusion coefficient is a positive constant. Then, there exists a unique strong solution of that equation with finite second moment at every time.

Clearly, the two comparison processes above can be identified with special cases of (14).

Proof. The drift part is locally Lipschitz because it is polynomial. Then, by [18] (Chapter 2, Theorem 3.5), it is enough to verify a suitable growth condition for some positive constant.
A direct computation bounds the relevant expression by an affine function of the squared state; choosing the constant accordingly, we obtain the result.

Comment 1. In [18] (Chapter 2, Theorem 3.5), a stopping procedure based on a special growth condition, like the one used in the preceding proof, is employed to demonstrate the existence and uniqueness of a strong solution. Nevertheless, we will additionally use another stopping argument, which arises from condition (A.1) and from regularity, a property defined below that is part of the recurrence property.
Recurrence of (14) is an interesting feature from a biological standpoint. It means that the process representing the voltage sensor is always operating, unless the heat transfer between the cell and its environment or reservoir is lost (that is, the noise intensity in (11) vanishes).

Definition 2. Let a continuous-time real-valued stochastic process be given. For any integer, consider the first exit times from an increasing family of bounded sets, which form an increasing sequence of stopping times, and consider their limit (the explosion time).
(i) A process is said to be regular if, for every starting point, the explosion time is almost surely infinite.
(ii) Given a Borel set, a process is said to be recurrent relative to that set (set-recurrence) if it is regular and, for every starting point, the corresponding first exit time is almost surely finite.
A process is said to be recurrent if it is recurrent relative to any segment containing the origin.

Comment 2. This last definition has been inspired by [19].
Before showing the recurrence of (14), note that the Fokker-Planck equation associated with the law of such a stochastic differential equation admits a stationary probability measure of Gibbs type, determined by the tilted potential and the noise intensity. By continuity, this implies that the expected proportion of time that the sensor spends in each well (when the double-well structure holds) can be bounded using the stationary probabilities that the two comparison processes assign to those wells.
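A minimal sketch, under the notational assumptions made above (constant force c, noise intensity σ, tilted quartic potential U_c), of the standard stationary density for (14):

\[
\pi(dx) \propto \exp\!\Bigl( -\frac{2\,U_c(x)}{\sigma^2} \Bigr)\, dx, \qquad U_c(x) = \frac{x^4}{4} - \frac{x^2}{2} - c\,x.
\]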

Proposition 3. The solution of (14) is recurrent.

Proof. Note the form of the infinitesimal generator associated with (14), and consider a suitable nonnegative Lyapunov-type function. It is easily seen that the generator applied to this function becomes negative for large values of the state. To conclude, we recall Lemma 3.9 from [19].
Lemma 4. Suppose that a process almost surely exits from each bounded domain in finite time. Then a sufficient condition for recurrence relative to a given set is the existence of a nonnegative function, defined outside that set, whose image under the generator is negative and bounded away from zero there.
The existence of the process comes from Proposition 1. Thus, the function above satisfies the assumptions of the previous lemma for every segment containing the origin, and hence the proposition holds.

Mathematically speaking, one consequence of recurrence is that the solution of the corresponding system does not explode or escape towards infinity: the running supremum of the process up to any finite time is almost surely finite. Indeed, if this were not the case, the exit times of Definition 2 would accumulate at some finite time with positive probability, contradicting the regularity of the process. Another consequence is that the solution almost surely keeps visiting every segment containing the origin, which within our biological setting implies that the voltage sensor described by (7) will never get stuck in one of the wells of the potential. That is, the states “up” and “down” will always be visited, which is one of the characteristics of the two-state gating approach.

Theorem 5. Under (A.1)–(A.3) system (8) (or (11)) has a unique solution pathwise and in law.

For convenience, we are going to use both notations (8) and (11).

Proof. Consider the two comparison processes introduced before Proposition 1. By (A.1), the initial data are bounded by some constant. For each level beyond that constant, define the first times at which the comparison processes leave the corresponding bounded set, and stop the system at the minimum of those times. Then, according to Theorem 2.2 in [11] (or Theorem 1.1 in [10]), there exists a unique solution, pathwise and in law, to the stopped equation, since by (A.2) and (A.3) the coefficients are bounded Lipschitz on the stopped set. In fact, they are almost surely bounded Lipschitz on any finite time horizon, because the regularity of the comparison processes implies that, for any horizon, the stopping times eventually exceed it as the level grows. Thus, the theorem holds.

It can easily be verified that the voltage sensor represented by (7) is also recurrent, since it is continuous and lies between two recurrent processes (the comparison processes introduced above).

2.3. A Short Remark about the Possible Existence of Different Time Scales Involved in (8) and Some Subsequent Results

From a macroscopic perspective, in some situations it is possible to justify the presence of different time scales in a membrane potential/gating model, as in the Hodgkin-Huxley model (see, e.g., [20]) and in the FitzHugh-Nagumo model (see [21]). What could happen is that the mesoscopic sensor dynamics operate faster than the voltage, since they are positioned at different levels. In such a case, we may homogenize the time scales.

The way we do this is based on an analogous procedure carried out for the FitzHugh-Nagumo model in [21]. (Although that procedure is similar to the one we use here, the grading of the time scales is different, because our mesoscopic component is not considered therein.)

Consider a small positive parameter which slows down the voltage relative to the sensor. Under the corresponding time change, we obtain a rescaled system in which the noise intensity depends on the small parameter, because the time change acts on the second moment of the Brownian motion. For a small parameter we may expect the rescaled voltage drift to be small, and then, from the standpoint of the faster sensor dynamics, the voltage is nearly constant (it varies slowly, to be more precise). Thus, this could tempt us to regard (7) as an equation of the form of (14) by freezing the voltage at some point.

Following this idea, let a solution of (14) be given with a constant force small enough that the double-well structure holds, and with the voltage frozen at some value. The corresponding deterministic dynamics (i.e., without noise) has three fixed points: two stable points, one in each well, and one unstable point between them. Define the quasi-potentials as the depths of the two basins of attraction, and consider the first exit times from each basin. Then, in the small noise limit, the exit time from each basin is, with probability tending to one, of the order of the exponential of the corresponding quasi-potential divided by the noise variance. This is a result from [22] (Chapter 4, Theorem 4.2), in which the authors used large deviations to derive the classical Kramers time for the expected time between transitions from one well to another in the small noise limit. It was also applied within the neuroscience context in the aforementioned work [21] for the FitzHugh-Nagumo model, using a similar procedure but with a different time scale perspective.
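A minimal sketch of the standard exit-time asymptotics we understand to be invoked here, in our own notation (stable points y₋ and y₊, unstable point y₀, quasi-potentials \(\bar V_\pm\), noise intensity σ):

\[
\lim_{\sigma \to 0} \mathbb{P}_y\Bigl( e^{(2\bar V_\pm - \delta)/\sigma^2} < \tau_\pm < e^{(2\bar V_\pm + \delta)/\sigma^2} \Bigr) = 1, \qquad \bar V_\pm = U_c(y_0) - U_c(y_\pm),
\]

for every δ > 0 and every starting point y in the corresponding basin of attraction, where \(\tau_\pm\) denotes the first exit time from that basin.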

In this remark we wanted to point out that, when different time scales are involved, approximation methods can be used to obtain some interesting results which are difficult to obtain when such different time scales are not present or their existence cannot be justified. Indeed, more detailed results can be obtained for our model in this situation, but that would mean moving away from the main aims of this work.

3. Connection with the Hodgkin-Huxley Model

3.1. The Ideal Case

The original Hodgkin-Huxley model (1952) describes the membrane potential, or voltage, through a current-balance equation in which C is the membrane capacitance, I represents the external stimulus (seen here as a given parameter), the barred conductances represent the maximum conductances, and E_K, E_Na, and E_L are the Nernst potassium, sodium, and leak equilibrium potentials. The leakage current is an Ohmic current representing mostly the chloride (Cl⁻) one. The processes n, m, and h represent the corresponding probabilities that a representative gate is in an open state. As we mentioned before, in the case of potassium channels the 4 gates are of the same type (n), and in the case of sodium channels there are 3 gates of the same type (m) and 1 of another type (h) (not to be confused with the function defined in (7)). The probabilities that a typical potassium and sodium ion channel is open are given by n⁴ and m³h, respectively, where the iid assumption for the behavior of each corresponding gate is adopted. The explicit expressions for the voltage-dependent transition rates, as well as the typical values of the constants involved, can be found in [23].
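For reference, the standard form of the Hodgkin-Huxley equations described above (written in our own, but conventional, notation):

\[
C \frac{dV}{dt} = I - \bar g_{K}\, n^4 (V - E_K) - \bar g_{Na}\, m^3 h\, (V - E_{Na}) - \bar g_L (V - E_L),
\]
\[
\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\, x, \qquad x \in \{n, m, h\},
\]

where \(\alpha_x\) and \(\beta_x\) are the voltage-dependent opening and closing rates of a gate of type x (their explicit expressions and the constants are those given in [23]).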

The master equations for n, m, and h also suggest that each gate works according to a two-state (closed/open) transitional scheme with voltage-dependent rates, that is, a two-state jump voltage-coupled Markov regime. Because of the independence between ionic species, and according to the mathematical treatment we want to carry out, it is enough to consider the law of one gate of any type and a generic conductance function; such a reduction is given by (34). System (34), together with the stochastic regime of the gating transitional scheme described above, forms a particular example of a piecewise deterministic Markov process (PDMP), a general class of nondiffusion stochastic models introduced by [24] in 1984. Its construction can be easily described as follows (see [6, 25] for a further development of such a regime in this context). Starting from a given initial condition, let the voltage follow the deterministic flow of the voltage equation (34); let a first random time mark the instant at which the gate closes, and let a second one mark the first instant after that at which the gate opens again. Between consecutive random times the voltage evolves deterministically with the gate state held fixed, and at each such time the gate state switches; iterating this construction (closing, opening, closing again, and so on) shows that the classical Hodgkin-Huxley model, together with its corresponding gating regimes, can be clearly identified with a PDMP.
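A minimal illustrative sketch of such a PDMP with a single two-state gate; the rate functions, the caricature voltage drift, and all parameter values are our own assumptions and are not taken from the model's actual coefficients.

    import numpy as np

    def simulate_gate_pdmp(T=100.0, dt=0.01, seed=0):
        """Toy PDMP: one two-state gate (0 = closed, 1 = open) with
        voltage-dependent switching rates, coupled to a caricature voltage
        equation that flows deterministically between the jumps."""
        rng = np.random.default_rng(seed)
        alpha = lambda v: 0.1 * np.exp(0.05 * v)    # closed -> open rate (assumed)
        beta = lambda v: 0.1 * np.exp(-0.05 * v)    # open -> closed rate (assumed)
        v, gate = -65.0, 0
        V = [v]
        for _ in range(int(T / dt)):
            # deterministic flow of the voltage between jumps
            v += dt * (10.0 - 0.3 * (v + 65.0) - 1.2 * gate * (v - 50.0))
            # small-step jump decision for the gate state
            rate = alpha(v) if gate == 0 else beta(v)
            if rng.random() < 1.0 - np.exp(-rate * dt):
                gate = 1 - gate
            V.append(v)
        return np.array(V)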

With this, we want to stress that in (34) the stochasticity of the gating regime is hidden, since only its probability law intervenes, whereas in our setting the stochasticity of the gating regime is explicit. In order to relate our proposed model to some major characteristics of the classical Hodgkin-Huxley model, consider the following features of the voltage function in (34):

(a) It is a continuous function such that, for a fixed value of the gating probability, the voltage equation is linear in the voltage and, for a fixed voltage, it is polynomial in the gating probability.

We can identify such properties with those of the Hodgkin-Huxley model, and the generalization to three gate types (i.e., the n, m, and h types) is straightforward: just consider the three corresponding gating probabilities instead of a single one. We are going to identify those characteristics with our macroscopic/mesoscopic approach and show that, despite the scale reduction for the gating activity, the model keeps a structure for the voltage equation similar to that of the classical Hodgkin-Huxley model.

Let an increasing Lipschitz function (the link function) be given, and consider a voltage drift that is continuous, linear in the voltage, and polynomial in its remaining argument. Define the stochastic process obtained by applying the link function to the voltage sensor position. Following the mean field approach of the Hodgkin-Huxley process, we can define the voltage drift through the expectation of that process, so that the resulting function clearly satisfies (a). So, from now on, assume that this identification holds for all times; call this last assumption Assumption (A.2′).

With all those ingredients, we say that our system (8) is a Hodgkin-Huxley-like model. Here, the link function is a cumulative distribution function, so that its expectation under the sensor law is the probability that a typical voltage sensor is in the open or “up” state (thus, it plays the role of the gating probability in the macroscopic case (34)). As an example, we have the Laplace cumulative distribution function with a positive scale parameter. As the link function is increasing, it is convenient to think of the open or “up” state as located in the right-hand well of the potential. Note that our proposal enlarges the state space of the gating dynamics: from a discrete two-state dynamic to a continuous real state space dynamic. It is this enlargement that causes the change in the reference level: from the macroscopic 0-1 gating dynamic to the mesoscopic scale reduction regarding the continuous movement of the voltage sensor.
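For reference, the standard Laplace cumulative distribution function centered at zero (the centering and the name of the scale parameter b are our conventions):

\[
F(x) = \begin{cases} \tfrac12\, e^{x/b}, & x \le 0, \\ 1 - \tfrac12\, e^{-x/b}, & x > 0, \end{cases} \qquad b > 0.
\]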

Comment 3. In [13] we suggested an indicator-type function, which arises as a degenerate limit of the Laplace cumulative distribution function above (as its scale parameter goes to zero). This work therefore extends that one by considering a whole class of smooth functions to represent the probability that a typical voltage sensor is in the open or “up” state.

As we mentioned in Section 2.3, processes (8) and (34) describe an ideal situation, due to the determinism of the voltage and the absence of probabilistic interaction among the voltage sensors. The more realistic situation is the one in which (8) and (34) are just limiting cases, which is the topic we address next.

3.2. The Approximated Case: Propagation of Chaos instead of Limit Jump Processes

To be realistic, we cannot ignore the inherent stochasticity of the voltage evolution. With respect to (34), some authors have proposed to use an empirical stochastic measure instead of the gate-open probability ([3–6], among others). The basic idea is to consider a stochastic sequence of gate states, each entry being 1 if the corresponding gate is open and 0 if not. Hence, if we take a finite number of gates into account, we can replace the gate-open probability by the empirical proportion of open gates. Thus, system (34) is replaced by (39), where the voltage expression is defined between the jumps of the gate configuration. Specifically, the gate configuration forms a jump voltage-coupled Markov process, characterized by
(i) its state space,
(ii) its intensity,
(iii) its jump law
(see [6]).

In [5, 6] the convergence of the stochastic system to the solution of (34), uniformly on any bounded time horizon, is proved under smoothness and Lipschitz conditions on the coefficients, the boundedness of the voltage on any finite time horizon, and the uniform boundedness of the empirical proportion on any finite time horizon. Thus, the stochastic hybrid system (39) forms a PDMP whose jumps vanish as the number of gates goes to infinity, thereby supporting the paradigm that the Hodgkin-Huxley equations arise in the limit of infinitely many gates.

Further results were also obtained in [6]. For example, a Central Limit Theorem for the empirical proportion was provided, which makes it possible to justify mathematically a diffusion approximation of the gating variable driven by a Brownian motion.

In [13] we proposed an ideal-case voltage dynamic in which the mean field term is the probability that the voltage sensor lies in the “open region.” The problem is that it allows only the indicator of that region as integrand in the mean field approach, trying to emulate the classical 0-1 gating process, which can be recovered as a degenerate limit of a suitable smooth function (see the previous comment). In this work, by contrast, a variety of smooth functions can be used as integrands in the mean field approach in order to describe the probability of the open gate state, or the “up” voltage sensor state. Also, the smoothness of the link function places our model in a more familiar structure, as pictured in (11), with respect to nonlinear stochastic differential equations such as (1).

Under this nonideal setting, a propagation of chaos result was proposed in [13], but only locally in time. In this work, thanks to the smoothness of the link function, we can develop a propagation of chaos result in a very classical way (see, e.g., Theorem 1.4 in [10] and Theorem 2.3 in [11]), although an additional step is needed to overcome the fact that our drift is only locally Lipschitz rather than globally Lipschitz, as assumed in the above references (see Theorem 7 below).

For our model, systems such as (5) arise in order to approximate nonlinear equations such as (11). Thus, our paradigm is that we recover a Hodgkin-Huxley-like model when propagation of chaos holds. That is the main difference between the setting of PDMPs approximating the classical Hodgkin-Huxley equations and the proposed setting of an interacting particle system approximating a Hodgkin-Huxley-like model, as defined in the previous subsection: propagation of chaos is the tool we use to show consistency.

So, consider a finite number of gates and let the corresponding stochastic sequence of sensor positions satisfy system (43), in which the driving Brownian motions are iid. Here, each component represents the voltage sensor associated with one gate, and the sensors depend on each other through the interaction (link) function entering the common voltage equation. Propagation of chaos will then ensure that the voltage sensors become asymptotically independent of each other, implying that the gates become independent of each other as in the macroscopic case. Note that our approximating system has one voltage component and as many sensor components as gates, whereas the ideal system has only two components; this is because, in the limit of infinitely many gates, it is sufficient to consider a single representative ideal gate (voltage sensor) to describe the law. Before establishing this fact, we have to show that our approximating system is well posed.
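A minimal numerical sketch of an interacting-sensor approximation in the spirit of (43): a single voltage coupled to many sensors through the empirical mean of a Laplace-type link function. The double-well drift, the linear tilt, the link, and all constants are our own illustrative assumptions, consistent with Sections 2.2 and 3.1 but not the author's actual coefficients.

    import numpy as np

    def laplace_cdf(x, b=1.0):
        """Laplace CDF centered at zero with scale b (an assumed link function)."""
        return np.where(x <= 0.0, 0.5 * np.exp(x / b), 1.0 - 0.5 * np.exp(-x / b))

    def simulate_hh_like(N=1000, T=50.0, dt=1e-3, sigma=0.6, seed=0):
        """Euler-Maruyama sketch: one voltage V coupled to N sensors X[i]
        through the empirical fraction of "up" sensors laplace_cdf(X).mean()."""
        rng = np.random.default_rng(seed)
        V = -65.0
        X = rng.normal(-1.0, 0.3, size=N)       # sensors start near the "down" well
        gbar, E_rev, gL, EL, I_ext, C = 1.2, 50.0, 0.3, -54.4, 10.0, 1.0
        trace_V = [V]
        for _ in range(int(T / dt)):
            p_open = laplace_cdf(X).mean()      # empirical fraction of "up" sensors
            V += dt * (I_ext - gbar * p_open * (V - E_rev) - gL * (V - EL)) / C
            tilt = 0.02 * (V + 65.0)            # assumed voltage-dependent force
            X += (-(X**3 - X) + tilt) * dt + sigma * np.sqrt(dt) * rng.normal(size=N)
            trace_V.append(V)
        return np.array(trace_V), X

    if __name__ == "__main__":
        V_path, X_final = simulate_hh_like()
        print("final voltage:", V_path[-1], "open fraction:", laplace_cdf(X_final).mean())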

Theorem 6. Under (A.1)–(A.3) and (A.2′) system (43) has a unique strong solution with second moment.

Proof. In this proof we proceed similarly to Proposition 1, because our definition of recurrence was set in one dimension only.
Collect the voltage and the sensor positions into a single vector and consider the map representing the drift part of (43).
By (A.2), (A.3), and the fact that the double-well gradient is polynomial, this drift is locally Lipschitz; then, again by Theorem 3.5 in [18], it is enough to verify a suitable growth condition for some positive constant.
Expanding the relevant inner product, on one hand, by (A.3) and as a straightforward generalization of the proof of Proposition 1, we bound the sensor contributions; on the other hand, by (A.2) and (A.2′), the voltage contribution is bounded by an affine function of the squared state, with suitable constants.
Using Jensen's inequality to control the term involving the empirical mean, and putting everything together, the growth condition holds with an appropriate choice of the constant, which gives the result.

To show our propagation of chaos result, for convenience, instead of (43) we consider the equivalent formulation (54), written in the vector notation introduced above. Thus, we will show the convergence of (54) towards (11).

Theorem 7. For any fixed index, let a component of (54) be given, and assume that the corresponding ideal process solves (11) with the same initial condition, the two being driven by indistinguishable Brownian motions. Then, under (A.1)–(A.3) and (A.2′), the mean-square distance between the two processes vanishes, uniformly on any finite time horizon, when the number of gates goes to infinity.

Comment 4. Actually, (A.2′) is not used here; it was imposed only for consistency with the setting of Theorem 6.

Proof. Assume for now that the drift is globally Lipschitz with a fixed constant. Then, by Jensen's inequality and the Lipschitz property of the coefficients, the mean-square distance between a particle of (54) and its ideal counterpart can be bounded in terms of the distance between the empirical measure of the particles and the common law. Now, consider an iid sequence of copies of (11), each driven by the same Brownian motion and initial condition as the corresponding particle. Taking expectations, the cross terms vanish because the copies are iid, and the remaining terms are of order one over the number of particles, since the copies share the same distribution and have finite second moments. Collecting these estimates and applying Gronwall's inequality yields a bound of order one over the number of particles, uniformly on the time horizon, which proves the theorem under the global Lipschitz assumption. Since the double-well gradient is only locally Lipschitz, the drift is locally Lipschitz as well, and the above inequality holds only locally in time (i.e., stopped by means of a suitable stopping time), with a constant depending on the bound at which the process is stopped. To show that for any finite time horizon there exists a finite constant, almost surely, such that the above inequality still holds in our case, we proceed with a stopping method similar to that used in the proof of Theorem 5.
Consider the comparison processes introduced in the proof of Theorem 5. As the initial data are bounded by some constant, for each level beyond it define the first times at which those processes exceed that level, and stop the system accordingly. The bound above then holds up to the stopping time, with a constant depending on the level. But regularity of the comparison processes implies that their supremum over any finite horizon is finite almost surely. Therefore, for any finite time horizon there exists a finite (random) constant, almost surely, such that the desired inequality holds, and the proof is finished.

This last result gives us the desired consistency. So, we can set up a Hodgkin-Huxley-like model which preserves characteristics and properties similar to those of the macroscopic case (see Section 3.1). It should be noted that we can, in principle, recover some macroscopic features from the mesoscopic system: from (43), using the preceding theorem and Doob's inequality, we obtain convergence of the approximating voltage to the ideal one, uniformly on any finite time horizon. That is, the macroscopic link can be estimated by observing the mesoscopic situation.

4. Conclusions

We saw that it is possible to obtain equations similar to the Hodgkin-Huxley equations through a stochastic nonlinear system which links two different levels: the macroscopic one, represented by the membrane potential, and the mesoscopic one, represented by the gating process through the movement of its voltage sensors. The advantage of our model is that it explicitly includes this second level in the neuronal modeling, giving a more accurate description of the gating process in the sense that the continuous state space for the position of a voltage sensor enlarges the earlier two-state descriptions. By analogy, those two states of the gating activity correspond to the two metastable states of our proposed stochastic continuous voltage sensor model.

Also, in our continuous state space model, the recurrence property of the voltage sensor dynamics is in agreement with the biological mechanism of those proteins, because there is no absorbing state: gates pass from one metastable state to the other (open/closed, or “up”/“down” for voltage sensors) recurrently.

In this biological context, propagation of chaos plays the role of approximating a Hodgkin-Huxley-like model, as the PDMPs do for the classical Hodgkin-Huxley equations. This means that an interacting particle system is a natural way to include a process at a lower level than the macroscopic one while preserving characteristics and properties similar to those of the macroscopic setting.

Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The author was supported by FONDECYT 3140613 postdoctorate grant and kindly acknowledges the partial support by Instituto de Ecología y Biodiversidad (IEB), Departamento de Ecología at Pontificia Universidad Católica de Chile, Grant PIA-CONICYT ACT 1112 “Stochastic Analysis Research Network,” and VRI-PUC Grant “Biostochastic Program.” The author’s thanks go to Professor Etienne Tanré (INRIA, Sophia-Antipolis), Professor Rolando Rebolledo (Centro de Análisis Estocástico, Pontificia Universidad Católica de Chile), Professor Michèle Thieullen (Laboratoire de Probabilités et Modeles Aléatoires, Université Pierre et Marie Curie, Paris 6), and Professor Pablo Marquet (Departamento de Ecología, Pontificia Universidad Católica de Chile) for their valuable advice.