2022
Right Place, Right Time: Spatiotemporal Predictions Guide Attention in Dynamic Visual Search
Boettcher S, Shalev N, Wolfe J, Nobre A. Right Place, Right Time: Spatiotemporal Predictions Guide Attention in Dynamic Visual Search. Journal of Experimental Psychology: General 2022, 151: 348-362. PMID: 34843369, PMCID: PMC8920297, DOI: 10.1037/xge0000901.
Peer-Reviewed Original Research
Concepts: Visual search, Long-term representations, Eye-movement recordings, Long-term learning, Experiment 4, Priming effect, Experiment 1, Experiment 2, Spatiotemporal regularity, Guides attention, Participants' performance, Elements of human behavior, Real-life search, Static displays, Human behavior, Participants, Dynamic environment, Laboratory setting, Task, Distractors, Extended process, Spatiotemporal prediction, Temporal dimension, Regularization
2020
Modeling perception and behavior in individuals at clinical high risk for psychosis: Support for the predictive processing framework
Kafadar E, Mittal VA, Strauss GP, Chapman HC, Ellman LM, Bansal S, Gold JM, Alderson-Day B, Evans S, Moffatt J, Silverstein SM, Walker EF, Woods SW, Corlett PR, Powers AR. Modeling perception and behavior in individuals at clinical high risk for psychosis: Support for the predictive processing framework. Schizophrenia Research 2020, 226: 167-175. PMID: 32593735, PMCID: PMC7774587, DOI: 10.1016/j.schres.2020.04.017.
Peer-Reviewed Original Research
Concepts: Clinical high risk, CHR participants, Degraded speech stimuli, Predictive processing framework, Utility of interventions, Sample of participants, Perceptual inference, Sensory evidence, Psychotic spectrum disorders, Speech stimuli, Speech task, Computational underpinnings, Behavioral tasks, Efficacy of interventions, Spectrum disorder, Target tones, Participants' performance, Computational modeling, High risk, Poor recognition, Latent factors, Such tasks, Prior beliefs, Task, Appropriate risk stratification
2014
People Help Robots Who Help Others, Not Robots Who Help Themselves
Hayes B, Ullman D, Alexander E, Bank C, Scassellati B. People Help Robots Who Help Others, Not Robots Who Help Themselves. 2014, 255-260. DOI: 10.1109/roman.2014.6926262.
Peer-Reviewed Original Research
2011
Cue Integration in Categorical Tasks: Insights from Audio-Visual Speech Perception
Bejjanki VR, Clayards M, Knill DC, Aslin RN. Cue Integration in Categorical Tasks: Insights from Audio-Visual Speech Perception. PLOS ONE 2011, 6: e19812. PMID: 21637344, PMCID: PMC3102664, DOI: 10.1371/journal.pone.0019812.
Peer-Reviewed Original Research
Concepts: Cue integration, Cue weights, Perceptual dimensions, Sensory uncertainty, Categorical task, Audio-visual speech perception, Bayes-optimal observer, Speech perception tasks, Normative model, Phonemic categorization, Sensory reliability, Perception task, Speech perception, Visual modality, Computational principles, Human performance, Sensory variance, Participants' performance, Visual cues, Trial basis, Phonemic labeling, Category variability, Cues, Sensory variability, Task