Browsing by Author "Rodríguez Fernandez, María"
- Item: Alternative language paradigms for functional magnetic resonance imaging as presurgical tools for inducing crossed cerebro-cerebellar language activations in brain tumor patients (Springer, 2021)
  Thakkar Ishani, Rajendra; Arrano Carrasco, Leonardo Marcelo; Cortes Rivera, Barbara; Zunino Pesce, Romina Francesca; Mery Munoz, Francisco Javier; Rodríguez Fernandez, María; Smits, Marion; Mendez Orellana, Carolina
  Objectives: Crossed cerebro-cerebellar BOLD activations have recently come to light as additional diagnostic features for patients with brain tumors. The covert verb generation (VG) task is a widely used language paradigm for determining these language-related crossed activations. Here we demonstrate these crossed activations in two additional language paradigms, the semantic and phonological association tasks. We propose these tasks as clinically useful complements for determining language lateralization, as they are easy to monitor and suitable for patients with aphasia.
  Methods: Patients with brain tumors localized at different cortical sites (n = 71) performed three language paradigms: the VG task and the semantic (SA) and phonological (PA) association tasks with button-press responses. Language activations in the relevant cortical regions and the cerebellum were assigned a laterality, and agreement in laterality between each of the two new tasks and the VG task was tested using Cohen's kappa.
  Results: Both tasks agreed significantly with the VG task in cortical and cerebellar lateralization. Additionally, a McNemar test confirmed the presence of crossed activations in the cortex and the cerebellum across the entire subject population.
  Conclusion: The semantic and phonological association tasks elicited crossed cerebro-cerebellar language activations comparable to those observed with the covert VG task. This suggests that these tasks could be used alongside the traditional verb generation task, especially for patients unable to perform the latter.
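The agreement statistic named in this abstract, Cohen's kappa, corrects raw agreement between two classifications for the agreement expected by chance. A minimal sketch of that computation follows; the laterality labels (L = left, R = right, B = bilateral) and their values are invented for illustration, not taken from the study's data.

```python
from collections import Counter


def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two label sequences."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed proportion of exact agreements
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement if the two raters labeled independently
    freq_a = Counter(labels_a)
    freq_b = Counter(labels_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)


# Hypothetical per-patient laterality from the VG and SA tasks
vg = ["L", "L", "R", "B", "L", "L"]
sa = ["L", "L", "R", "L", "L", "L"]
print(round(cohens_kappa(vg, sa), 3))  # 0.6
```

Kappa of 1 indicates perfect agreement, 0 indicates chance-level agreement; the study reports significant kappa-based agreement between each association task and the VG task.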
- Item: Circadian Phase Prediction From Non-Intrusive and Ambulatory Physiological Data (2021)
  Suárez Pinto, Alexis Adrián; Núñez Retamal, Felipe Eduardo; Rodríguez Fernandez, María
  Chronotherapy aims to treat patients according to their endogenous biological rhythms and therefore requires knowing their circadian phase. Circadian phase is partially determined by genetics and, under natural conditions, is normally entrained by environmental signals (zeitgebers), predominantly light. Physiological data such as melatonin concentration and core body temperature (CBT) have been used to estimate circadian phase, but their measurement is expensive and intrusive; other physiological variables that also show circadian rhythmicity, such as heart rate variability, skin temperature, activity, and body position, have therefore been proposed in several recent studies for estimating circadian phase. This study aims to predict circadian phase from minimally intrusive ambulatory physiological data modeled with machine learning techniques. Two approaches were considered: first, time series were used to train artificial neural networks (ANNs) that predict CBT and melatonin dynamics; second, a novel approach uses scalar variables to build regression models that predict the time of the minimum CBT and the dim light melatonin onset (DLMO). The ANNs require less than 48 hours of minimally intrusive data collection to predict circadian phase with an error of less than one hour. The regression models, which use only three variables (body mass index, activity, and heart rate), are simpler and more accurate, with less than one minute of error, although they require longer data-collection periods. This is a promising approach that should be validated in further studies considering a broader population and a wider range of conditions, including circadian misalignment.
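The second approach described in this abstract (scalar variables → regression → time of minimum CBT) can be sketched as an ordinary least-squares fit. This is only an illustrative sketch under assumed data: the feature values, targets, and plain OLS model below are invented, and the paper's actual regression models and coefficients are not given in the abstract.

```python
import numpy as np

# Hypothetical per-subject scalar features:
# [body mass index, mean activity (counts), mean heart rate (bpm)]
X = np.array([
    [22.1, 310.0, 62.0],
    [27.5, 250.0, 71.0],
    [24.3, 400.0, 65.0],
    [30.2, 180.0, 75.0],
    [21.0, 350.0, 58.0],
    [26.8, 290.0, 69.0],
])
# Hypothetical targets: clock time of minimum CBT (hours after midnight)
y = np.array([4.2, 5.1, 3.8, 5.6, 4.0, 4.9])

# Ordinary least squares with an intercept column
A = np.hstack([np.ones((X.shape[0], 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)


def predict_cbt_min(bmi, activity, heart_rate):
    """Predict the time of minimum CBT (hours) from three scalar features."""
    return float(coef @ np.array([1.0, bmi, activity, heart_rate]))


print(predict_cbt_min(25.0, 300.0, 66.0))
```

The same template would apply to predicting DLMO from the same three features; the reported sub-minute accuracy would in practice require the paper's validated models and data, not this toy fit.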