## Neural Computation

In the probability density function (PDF) approach to neural network modeling, a common simplifying assumption is that the arrival times of elementary postsynaptic events are governed by a Poisson process. This assumption ignores temporal correlations in the input that sometimes have important physiological consequences. We extend PDF methods to models with synaptic event times governed by any modulated renewal process. We focus on the integrate-and-fire neuron with instantaneous synaptic kinetics and a random elementary excitatory postsynaptic potential (EPSP), *A*. Between presynaptic events, the membrane voltage, *v*, decays exponentially toward rest, while *s*, the time since the last synaptic input event, evolves with unit velocity. When a synaptic event arrives, *v* jumps by *A*, and *s* is reset to zero. If *v* crosses the threshold voltage, an action potential occurs, and *v* is reset to *v*_{reset}. The probability per unit time of a synaptic event at time *t*, given the elapsed time *s* since the last event, *h*(*s*, *t*), depends on specifics of the renewal process. We study how regularity of the train of synaptic input events affects output spike rate, PDF and coefficient of variation (CV) of the interspike interval, and the autocorrelation function of the output spike train. In the limit of a deterministic, clocklike train of input events, the PDF of the interspike interval converges to a sum of delta functions, with coefficients determined by the PDF for *A*. The limiting autocorrelation function of the output spike train is a sum of delta functions whose coefficients fall under a damped oscillatory envelope. When the EPSP CV, *σ*_{A}/*μ*_{A}, is equal to 0.45, a CV for the intersynaptic event interval of *σ*_{T}/*μ*_{T} = 0.35 is functionally equivalent to a deterministic periodic train of synaptic input events (CV = 0) with respect to spike statistics. We discuss the relevance to neural network simulations.
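The model described above can be simulated directly by Monte Carlo, which is one way to check the interspike-interval statistics that the PDF method computes. Below is a minimal event-driven sketch of the integrate-and-fire neuron with instantaneous synaptic kinetics: inter-event intervals are drawn from a gamma renewal process with the stated interval CV, and EPSP amplitudes *A* are also gamma-distributed with the stated amplitude CV. All parameter values (membrane time constant, threshold, reset, mean interval, mean EPSP size) and the choice of gamma distributions are illustrative assumptions, not values from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed, illustrative parameters (not specified in the abstract).
tau_m   = 20.0   # membrane time constant (ms)
v_rest  = 0.0    # resting voltage
v_th    = 1.0    # threshold voltage
v_reset = 0.0    # reset voltage after a spike
mu_T    = 2.0    # mean intersynaptic event interval (ms)
cv_T    = 0.35   # CV of the inter-event interval
mu_A    = 0.1    # mean elementary EPSP amplitude
cv_A    = 0.45   # CV of the EPSP amplitude
n_events = 200_000

# Gamma renewal process: shape k = 1/CV^2 yields the requested interval CV.
k_T = 1.0 / cv_T**2
intervals = rng.gamma(shape=k_T, scale=mu_T / k_T, size=n_events)

# Random EPSP amplitudes A, gamma-distributed for simplicity.
k_A = 1.0 / cv_A**2
amps = rng.gamma(shape=k_A, scale=mu_A / k_A, size=n_events)

# Event-driven simulation: between events v decays exponentially toward
# rest; at each event v jumps by A; crossing threshold resets v.
v = v_rest
t = 0.0
spike_times = []
for dt, A in zip(intervals, amps):
    t += dt
    v = v_rest + (v - v_rest) * np.exp(-dt / tau_m)  # exponential decay
    v += A                                           # instantaneous EPSP jump
    if v >= v_th:
        spike_times.append(t)
        v = v_reset

isi = np.diff(spike_times)
cv_isi = isi.std() / isi.mean()
print(f"output spikes: {len(spike_times)}, ISI CV: {cv_isi:.3f}")
```

Sweeping `cv_T` toward 0 (a clocklike input train) in such a simulation would let one observe the ISI distribution collapsing toward the sum of delta functions described above, with the remaining spread controlled by the amplitude CV.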