2024
GPR-SCSANet: Unequal-Length Time Series Normalization with Split-Channel Residual Convolution and Self-Attention for Brain Age Prediction
Sun F, Liang C, Adali T, Zhang D, Jiang R, Calhoun V, Qi S. GPR-SCSANet: Unequal-Length Time Series Normalization with Split-Channel Residual Convolution and Self-Attention for Brain Age Prediction. 2024, 00: 5097-5103. DOI: 10.1109/bibm62325.2024.10822453. Peer-Reviewed Original Research
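As a rough illustration of the Gaussian process regression component named in the title (not the paper's actual pipeline, which pairs GPR with split-channel residual convolution and self-attention over fMRI time courses), a minimal scikit-learn sketch on synthetic features might look like:

```python
# Minimal sketch: Gaussian process regression for an age target.
# Synthetic features stand in for fMRI-derived representations; this is
# illustrative only and is not the GPR-SCSANet pipeline from the paper.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))                             # hypothetical subject features
age = 40 + 5 * X[:, 0] + rng.normal(scale=2.0, size=200)   # synthetic ages

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=1.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, age)

mean, std = gpr.predict(X[:5], return_std=True)  # predictions with uncertainty
print(mean, std)
```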
Learning integral operators via neural integral equations
Zappala E, Fonseca A, Caro J, Moberly A, Higley M, Cardin J, Dijk D. Learning integral operators via neural integral equations. Nature Machine Intelligence 2024, 6: 1046-1062. DOI: 10.1038/s42256-024-00886-8. Peer-Reviewed Original Research
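The second-kind integral equations this paper targets have the form y(t) = f(t) + ∫ G(t, s) y(s) ds; for flavor, a minimal classical solver by fixed-point iteration on a uniform grid (assuming a fixed, contractive kernel, whereas the paper learns the operator) might be:

```python
# Minimal sketch: solve a Fredholm integral equation of the second kind,
#   y(t) = f(t) + lam * \int_0^1 G(t, s) y(s) ds,
# by Picard (fixed-point) iteration with trapezoidal quadrature.
# Illustrative only; the paper learns G with self-attention instead.
import numpy as np

n = 200
t = np.linspace(0.0, 1.0, n)
w = np.full(n, 1.0 / (n - 1))            # trapezoid quadrature weights
w[0] = w[-1] = 0.5 / (n - 1)

f = np.sin(2 * np.pi * t)                        # hypothetical source term
G = np.exp(-np.abs(t[:, None] - t[None, :]))     # hypothetical kernel G(t, s)
lam = 0.5                                        # small enough for contraction

y = f.copy()
for _ in range(100):                             # Picard iteration
    y_new = f + lam * (G * w) @ y                # quadrature of the integral term
    if np.max(np.abs(y_new - y)) < 1e-10:
        break
    y = y_new
```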
A deep spatio-temporal attention model of dynamic functional network connectivity shows sensitivity to Alzheimer's in asymptomatic individuals
Wei Y, Abrol A, Lah J, Qiu D, Calhoun V. A deep spatio-temporal attention model of dynamic functional network connectivity shows sensitivity to Alzheimer's in asymptomatic individuals. Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) 2024, 00: 1-4. PMID: 40039841, DOI: 10.1109/embc53108.2024.10781740. Peer-Reviewed Original Research
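Dynamic functional network connectivity (dFNC) is commonly estimated as sliding-window correlations between component time courses; a minimal numpy sketch of that input representation (not the paper's model, which adds a spatio-temporal attention network on top) might be:

```python
# Minimal sketch: dFNC as sliding-window correlation matrices over
# component time courses. Synthetic data; the paper feeds matrices
# like these to a deep spatio-temporal attention model.
import numpy as np

rng = np.random.default_rng(0)
T, C = 300, 20                 # timepoints, network components (hypothetical)
ts = rng.normal(size=(T, C))   # stand-in for fMRI component time courses

win, step = 40, 5
dfnc = np.stack([
    np.corrcoef(ts[s:s + win].T)           # C x C correlation per window
    for s in range(0, T - win + 1, step)
])
print(dfnc.shape)              # (num_windows, C, C)
```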
Predicting Protein-Protein Interactions Using Self-Attention-Based Deep Neural Networks and FastText Embeddings
Oviya I, Sravya N, Raja K. Predicting Protein-Protein Interactions Using Self-Attention-Based Deep Neural Networks and FastText Embeddings. 2024, 00: 1-6. DOI: 10.1109/icccnt61001.2024.10725821. Peer-Reviewed Original Research
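A rough sketch of the k-mer-plus-FastText idea, assuming gensim (the paper's exact tokenization, k, and model settings may differ):

```python
# Minimal sketch: embed protein sequences by splitting them into k-mers
# (amino acid segments) and training FastText on the resulting "sentences".
# Toy data; illustrative only, not the paper's configuration.
import numpy as np
from gensim.models import FastText

def kmers(seq, k=3):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

proteins = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",   # toy sequences
            "MSDNGPQNQRNAPRITFGGPSDSTGSNQNGERS"]
corpus = [kmers(p) for p in proteins]

model = FastText(sentences=corpus, vector_size=32, window=5,
                 min_count=1, epochs=20)

# One fixed-length representation per protein: mean of its k-mer vectors,
# suitable as input to a downstream interaction classifier.
vecs = np.array([np.mean([model.wv[m] for m in kmers(p)], axis=0)
                 for p in proteins])
print(vecs.shape)  # (2, 32)
```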
Spach Transformer: Spatial and Channel-Wise Transformer Based on Local and Global Self-Attentions for PET Image Denoising
Jang S, Pan T, Li Y, Heidari P, Chen J, Li Q, Gong K. Spach Transformer: Spatial and Channel-Wise Transformer Based on Local and Global Self-Attentions for PET Image Denoising. IEEE Transactions on Medical Imaging 2024, 43: 2036-2049. PMID: 37995174, PMCID: PMC11111593, DOI: 10.1109/tmi.2023.3336237. Peer-Reviewed Original Research
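Several of the papers above rest on scaled dot-product self-attention, softmax(QKᵀ/√d)V. A single-head numpy sketch for intuition; Spach Transformer applies attention both over spatial tokens and, transposed, over channels, whereas this toy version is token-wise only:

```python
# Minimal sketch: single-head scaled dot-product self-attention,
#   Attention(Q, K, V) = softmax(Q K^T / sqrt(d)) V.
# Illustrative only; real models use multiple heads and learned weights.
import numpy as np

def self_attention(x, wq, wk, wv):
    q, k, v = x @ wq, x @ wk, x @ wv              # (tokens, d) projections
    scores = q @ k.T / np.sqrt(q.shape[-1])       # (tokens, tokens)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)      # softmax over keys
    return attn @ v

rng = np.random.default_rng(0)
d = 16
x = rng.normal(size=(50, d))                      # 50 tokens (e.g. image patches)
wq, wk, wv = (rng.normal(size=(d, d)) * d**-0.5 for _ in range(3))
out = self_attention(x, wq, wk, wv)               # channel-wise variant: pass x.T
print(out.shape)                                  # (50, 16)
```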
2022
Low-Dose Tau PET Imaging Based on Swin Restormer with Diagonally Scaled Self-Attention
Jang S, Lois C, Becker J, Thibault E, Li Y, Price J, Fakhri G, Li Q, Johnson K, Gong K. Low-Dose Tau PET Imaging Based on Swin Restormer with Diagonally Scaled Self-Attention. 2022, 00: 1-3. DOI: 10.1109/nss/mic44845.2022.10399169. Peer-Reviewed Original Research
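The Swin-style component restricts self-attention to local windows to keep computational cost manageable; a minimal window-partitioning sketch (the paper's diagonal scaling of the attention map is not reproduced here):

```python
# Minimal sketch: Swin-style window partitioning, which restricts
# self-attention to local windows, cutting cost from O((HW)^2) to
# O(HW * w^2). The paper's diagonally scaled attention is not shown.
import numpy as np

def window_partition(feat, w):
    """Split an (H, W, C) feature map into (num_windows, w*w, C) token groups."""
    H, W, C = feat.shape
    feat = feat.reshape(H // w, w, W // w, w, C)
    return feat.transpose(0, 2, 1, 3, 4).reshape(-1, w * w, C)

feat = np.arange(8 * 8 * 3, dtype=float).reshape(8, 8, 3)  # toy feature map
windows = window_partition(feat, w=4)
print(windows.shape)  # (4, 16, 3) -- attention then runs within each window
```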