The field began with the initial observation that the activity of dopamine neurons resembles a reward prediction error from formal reinforcement-learning theory [4], and it now encompasses an elaborate framework that can potentially account for the functions of many different parts of the brain. This approach is likely to remain useful as we attempt to understand how the different RL component processes are ultimately combined to produce integrated behavior.
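For readers less familiar with the formalism, the minimal Python sketch below illustrates the temporal-difference reward prediction error, delta = r + gamma*V(s') - V(s), that dopamine activity was observed to resemble. It is an illustration of the standard textbook quantity, not code from any of the work reviewed here, and the state names, learning rate, and discount factor are arbitrary choices.

```python
# Illustrative sketch of the temporal-difference (TD) reward prediction error
# from formal reinforcement-learning theory. All names and parameter values
# are arbitrary; this is not drawn from the reviewed papers.

def td_prediction_error(reward, value_current, value_next, gamma=0.9):
    """delta = r + gamma * V(s') - V(s): how much better or worse the outcome
    was than the current value estimate predicted."""
    return reward + gamma * value_next - value_current

def td_update(value_current, delta, alpha=0.1):
    """Value estimates are nudged in the direction of the prediction error."""
    return value_current + alpha * delta

# Example: an unpredicted reward yields a positive prediction error,
# which increases the value attached to the preceding state.
V = {"cue": 0.0, "outcome": 0.0}
delta = td_prediction_error(reward=1.0, value_current=V["cue"], value_next=V["outcome"])
V["cue"] = td_update(V["cue"], delta)
print(delta, V["cue"])  # -> 1.0 0.1
```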

Conflicts of interest: nothing declared.

Papers of particular interest, published within the period of review, have been highlighted as: • of special interest.

This work was supported by an NIH Conte Center grant on the neurobiology of social decision making (P50MH094258-01A1), NIH grant DA033077-01 (supported by OppNet, NIH's Basic Behavioral and Social Science Opportunity Network) and National Science Foundation grant 1207573 to JOD.
Current Opinion in Behavioral Sciences 2015, 1:101–106. This review comes from a themed issue on Cognitive neuroscience, edited by Cindy Lustig and Howard Eichenbaum. http://dx.doi.org/10.1016/j.cobeha.2014.10.007. 2352-1546/© 2014 Published by Elsevier Ltd.

The prefrontal cortex is often described as subserving decision-making and executive control.

Decision-making research focuses on PFC function in action selection according to perceptual cues and reward values [1 and 2]. Executive-control research focuses on PFC function in learning and switching between the behavioral rules or sets that guide action [1, 3, 4, 5, 6, 7, 8, 9 and 10]. These two lines of research have often been carried out independently. Here we review recent findings and outline a theoretical framework unifying these two conceptual approaches to PFC function. There is converging evidence that the computation of expected rewards driving action selection primarily involves the ventromedial PFC (vmPFC) [11, 12 and 13]. The vmPFC, especially its ventral portion (often referred to as the medial orbitofrontal cortex), converts distinct subjective reward scales into a ‘common currency’ scale that allows value comparison [14, 15, 16 and 17] and thereby drives selection. Reward values are generally associated with action outcomes rather than with actions per se. Consistent with this, the vmPFC is involved in predicting action outcomes [18, 19, 20, 21• and 22], suggesting that it encodes action-outcome associations for selecting actions according to reward values. By contrast, selecting actions according to perceptual cues involves the lateral premotor cortex [9, 23, 24 and 25]. However, when expected rewards and perceptual cues are not linked to specific actions, decisions are presumably made between more abstract action sets that subsequently guide the selection of specific actions according to stimuli.
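To make the ‘common currency’ idea concrete, the following sketch is a deliberately simplified illustration rather than a model proposed in the cited studies: hypothetical utility functions map rewards of different kinds onto a single subjective-value scale, and a softmax over those values stands in for the comparison process that drives selection.

```python
# Simplified illustration of 'common currency' value comparison.
# The utility functions and parameters below are hypothetical; they are not
# taken from the studies cited above.
import math

def value_of_money(amount_dollars):
    return amount_dollars ** 0.8        # assumed concave utility for money

def value_of_juice(volume_ml):
    return 0.05 * volume_ml             # assumed linear utility for juice

def choice_probabilities(values, temperature=1.0):
    """Softmax comparison on the common scale: the higher an option's
    common-currency value, the more likely it is to be selected."""
    exps = [math.exp(v / temperature) for v in values]
    total = sum(exps)
    return [e / total for e in exps]

# Example: compare $2 against 40 ml of juice on the same subjective scale.
values = [value_of_money(2.0), value_of_juice(40.0)]
print(values, choice_probabilities(values))
```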
