A simple Hidden Markov Model for midbrain dopaminergic neurons


BMC Neuroscience

Open Access

Poster presentation

Felipe Gerhard*1, Julia Schiemann2, Jochen Roeper2 and Gaby Schneider3

Address: 1Frankfurt Institute for Advanced Studies (FIAS), Frankfurt (Main), 60438, Germany, 2Institute of Neurophysiology, Neuroscience Center, Goethe University, Frankfurt (Main), 60590, Germany and 3Department of Computer Science and Mathematics, Goethe University, Frankfurt (Main), 60438, Germany

Email: Felipe Gerhard* - gerhard@fias.uni-frankfurt.de

* Corresponding author

from Eighteenth Annual Computational Neuroscience Meeting: CNS*2009, Berlin, Germany, 18–23 July 2009

Published: 13 July 2009

BMC Neuroscience 2009, 10(Suppl 1):P235 doi:10.1186/1471-2202-10-S1-P235

This abstract is available from: http://www.biomedcentral.com/1471-2202/10/S1/P235

© 2009 Gerhard et al; licensee BioMed Central Ltd.

Introduction

Dopaminergic neurons in the midbrain show a variety of firing patterns, ranging from very regularly firing pacemaker cells to bursty and irregular neurons. The effects of different experimental conditions (such as pharmacological treatment or genetic manipulations) on these neuronal discharge patterns may be subtle. Applying a stochastic model provides a quantitative approach to revealing such changes.

Model

We present a simple Hidden Markov Model (HMM), first proposed in [1], to describe single-unit recordings from spontaneously active dopaminergic neurons in the midbrain. Here, we consider only two hidden states, associated with tonic and bursting firing, respectively.

Depending on the current state, the next inter-spike interval is drawn from a state-dependent (continuous) distribution. By constraining the state transition matrix, the model reduces to a common renewal process in the case of non-bursting cells. Inference is done via the EM algorithm and maximum-likelihood estimation of the parametrized distributions. Owing to the simple model structure, the time-rescaling theorem [2] allows investigating the plausibility of the chosen model complexity and of the fitted parameters.
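
As a concrete illustration of this model structure, the following is a minimal sketch of an EM (Baum-Welch) fit for a two-state ISI model. It is not the authors' implementation: the abstract only specifies a continuous, state-dependent ISI distribution, so a log-normal emission density is assumed here as a placeholder, and the initialization heuristic, stopping rule and variable names are illustrative.

```python
import numpy as np

def lognormal_pdf(x, mu, sigma):
    """Density of a log-normal distribution with log-mean mu and log-sd sigma."""
    return np.exp(-(np.log(x) - mu) ** 2 / (2 * sigma ** 2)) / (x * sigma * np.sqrt(2 * np.pi))

def fit_two_state_hmm(isis, n_iter=200, tol=1e-6):
    """Baum-Welch fit of a two-state HMM with log-normal ISI emissions (illustrative)."""
    x = np.asarray(isis, float)
    log_x = np.log(x)
    T = len(x)
    # Initialization heuristic: state 0 ("tonic") from the longer ISIs,
    # state 1 ("burst") from the shorter ones.
    med = np.median(log_x)
    mu = np.array([log_x[log_x > med].mean(), log_x[log_x <= med].mean()])
    sigma = np.array([log_x.std(), log_x.std()]) + 1e-3
    A = np.array([[0.9, 0.1], [0.1, 0.9]])   # state transition matrix
    pi = np.array([0.5, 0.5])                # initial state distribution
    prev_ll = -np.inf
    for _ in range(n_iter):
        B = np.stack([lognormal_pdf(x, mu[k], sigma[k]) for k in range(2)], axis=1)  # T x 2
        # Scaled forward pass.
        alpha = np.zeros((T, 2)); c = np.zeros(T)
        alpha[0] = pi * B[0]; c[0] = alpha[0].sum(); alpha[0] /= c[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[t]
            c[t] = alpha[t].sum(); alpha[t] /= c[t]
        # Scaled backward pass.
        beta = np.zeros((T, 2)); beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = (A @ (B[t + 1] * beta[t + 1])) / c[t + 1]
        gamma = alpha * beta                  # posterior state probabilities
        xi = np.zeros((2, 2))                 # expected transition counts
        for t in range(T - 1):
            xi += np.outer(alpha[t], B[t + 1] * beta[t + 1]) * A / c[t + 1]
        # M-step: closed-form updates on the log-intervals.
        pi = gamma[0] / gamma[0].sum()
        A = xi / xi.sum(axis=1, keepdims=True)
        w = gamma / gamma.sum(axis=0, keepdims=True)
        mu = (w * log_x[:, None]).sum(axis=0)
        sigma = np.sqrt((w * (log_x[:, None] - mu) ** 2).sum(axis=0)) + 1e-6
        ll = np.log(c).sum()
        if ll - prev_ll < tol:
            break
        prev_ll = ll
    return dict(pi=pi, A=A, mu=mu, sigma=sigma, loglik=ll)
```

With log-normal emissions the M-step reduces to weighted means and variances of the log-intervals, which is what keeps this sketch compact.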

A new burst identification algorithm

The model's applicability is not restricted to dopaminergic neurons; it is suited to studying the characteristics of intrinsically bursty neurons in general. For spike trains that are dominated by bursts, classical burst detection algorithms based on surprise measures [3,4] are prone to fail. In the case of the HMM, the most probable state sequence (derived by the Viterbi algorithm) tags bursts within the spike train, so that further properties (such as intra-burst frequency or the distribution of the number of spikes per burst) can be calculated to characterize bursting behavior.
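
The following is a hedged sketch of this decoding step and of the derived burst measures, reusing the log-normal parameterization assumed in the previous sketch; the convention that the burst state is the one with the shorter fitted mean ISI, and the particular statistics returned, are illustrative rather than taken from the abstract.

```python
import numpy as np

def viterbi_two_state(isis, pi, A, mu, sigma):
    """Most probable hidden state sequence under the fitted two-state log-normal model."""
    x = np.asarray(isis, float)
    T = len(x)
    # Log emission densities (log-normal, as assumed in the fitting sketch).
    log_b = np.stack([
        -np.log(x * sigma[k] * np.sqrt(2 * np.pi)) - (np.log(x) - mu[k]) ** 2 / (2 * sigma[k] ** 2)
        for k in range(2)], axis=1)
    log_A = np.log(A)
    delta = np.zeros((T, 2)); psi = np.zeros((T, 2), int)
    delta[0] = np.log(pi) + log_b[0]
    for t in range(1, T):
        trans = delta[t - 1][:, None] + log_A   # trans[i, j]: coming from i, going to j
        psi[t] = trans.argmax(axis=0)
        delta[t] = trans.max(axis=0) + log_b[t]
    states = np.zeros(T, int)
    states[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        states[t] = psi[t + 1, states[t + 1]]
    return states

def burst_statistics(isis, states, burst_state=1):
    """Spikes per burst and intra-burst rate from ISIs tagged with the burst state.

    Pass as burst_state the index of the state with the shorter fitted mean ISI.
    """
    isis = np.asarray(isis, float)
    bursts, run = [], []
    for isi, s in zip(isis, states):
        if s == burst_state:
            run.append(isi)
        elif run:
            bursts.append(run); run = []
    if run:
        bursts.append(run)
    spikes_per_burst = [len(r) + 1 for r in bursts]        # n intervals span n + 1 spikes
    intra_burst_rate = [len(r) / sum(r) for r in bursts]   # reciprocal of the mean intra-burst ISI
    return spikes_per_burst, intra_burst_rate
```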

Results and discussion

The HMM is able to capture basic spike train statistics of a dataset of more than 60 recorded neurons, even in the presence of non-stationarities. More specifically, second-order statistics such as the auto-correlation function (ACF) are accurately modeled. It further extends the work in [1], as the estimated parameters allow for a simple classification scheme into pacemaker, bursty and irregular firing neurons, which was previously done by visual inspection of the ACF [5]. Finally, the model parameters have an intuitive interpretation and are used to derive measures of the bursting properties of spike trains. Thus, our model offers a unifying framework for the description of the varied discharge patterns of dopaminergic neurons. Used as a burst identification algorithm, it establishes a quantitative link between the biophysical level (e.g. ion-channel dynamics) and spike train statistics.
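
One way to examine the ACF claim in practice, sketched here under the same log-normal assumption as above and not as the authors' analysis pipeline, is to simulate ISI sequences from the fitted model and compare the autocorrelation of binned spike counts between recorded and simulated trains; bin size and maximal lag below are illustrative.

```python
import numpy as np

def simulate_hmm_isis(n, pi, A, mu, sigma, seed=None):
    """Draw a synthetic ISI sequence of length n from the fitted two-state model."""
    rng = np.random.default_rng(seed)
    states = np.zeros(n, int)
    states[0] = rng.choice(2, p=pi)
    for t in range(1, n):
        states[t] = rng.choice(2, p=A[states[t - 1]])
    isis = np.exp(mu[states] + sigma[states] * rng.standard_normal(n))  # log-normal draws
    return isis, states

def spike_count_acf(isis, bin_size=0.01, max_lag=1.0):
    """Autocorrelation of binned spike counts up to max_lag (same time units as the ISIs)."""
    spike_times = np.cumsum(isis)
    n_bins = int(np.ceil(spike_times[-1] / bin_size))
    counts, _ = np.histogram(spike_times, bins=n_bins, range=(0, n_bins * bin_size))
    counts = counts - counts.mean()
    n_lags = int(max_lag / bin_size)
    denom = np.sum(counts ** 2)
    return np.array([np.sum(counts[:len(counts) - k] * counts[k:]) / denom
                     for k in range(n_lags)])
```

Overlaying spike_count_acf of the recorded ISIs and of a few simulated sequences then gives a direct visual check of how well the fitted model reproduces the second-order structure.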

References

1. Camproux A, Saunier F, Chouvet G, Thalabard J, Thomas G: A hidden Markov model approach to neuron firing patterns. Biophysical Journal 1996, 71:2404-2412.



2. Brown EN: The time-rescaling theorem and its application to neural spike train data analysis. Neural Computation 2001, 14:325-346.

3. Legendy CR, Salcman M: Bursts and recurrences of bursts in the spike trains of spontaneously active striate cortex neurons. Journal of Neurophysiology 1985, 53:926-939.

4. Gourevitch B, Eggermont JJ: A nonparametric approach for detection of bursts in spike trains. Journal of Neuroscience Methods 2007, 160:349-358.

5. Wilson CJ, Young SJ, Groves PM: Statistical properties of neuronal spike trains in the substantia nigra: Cell types and their interactions. Brain Research 1977, 136:243-260.
