DTSC - G2PI - Artículos de Revistas
Recent Submissions
Now showing 1 - 20 of 55
Publication
Multidimensional analysis of immune cells from COVID-19 patients identified cell subsets associated with the severity at hospital admission (Ashley L. St. John, Duke-National University of Singapore, SINGAPORE, 2023-06-13)
Gil Manso, Sergio; Herrero Quevedo, Diego; Carbonell, Diego; Martinez Bonet, Marta; Bernaldo De Quiros, Esther; Kennedy Batalla, Rebeca; Gallego Valle, Jorge; Lopez Esteban, Rocio; Blazquez Lopez, Elena; Miguens Blanco, Iria; Correa Rocha, Rafael; Gómez Verdejo, Vanessa; Pion, Marjorie

Background: SARS-CoV-2 emerged as a new coronavirus causing COVID-19 and has been responsible for more than 760 million cases and 6.8 million deaths worldwide as of March 2023. While some infected individuals remain asymptomatic, others present a wide and heterogeneous range of symptoms. Identifying infected individuals and classifying them according to their expected severity could therefore help target health efforts more effectively.

Methodology/Principal findings: We aimed to develop a machine learning model that predicts, at the moment of hospital admission, which patients will develop severe disease. We recruited 75 individuals and analysed innate and adaptive immune system subsets by flow cytometry; we also collected clinical and biochemical information. The objectives of the study were to identify, via machine learning, the clinical features associated with progression to severe disease, and to elucidate the specific cellular subsets involved after the onset of symptoms. Among the several machine learning models tested, the Elastic Net model was the best at predicting the severity score according to a modified WHO classification, correctly predicting the severity score of 72 of the 75 individuals. Moreover, all of the machine learning models revealed that CD38+ Treg and CD16+ CD56neg HLA-DR+ NK cells were highly correlated with severity.
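As a rough illustration only (not the authors' code, and with entirely synthetic data), fitting an elastic net to predict a severity score from immune-subset frequencies can be sketched with scikit-learn:

```python
# Hypothetical sketch: elastic net regression of a severity score on
# flow-cytometry features (all data synthetic; not the study's code).
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Stand-in for immune-subset frequencies, e.g. %CD38+ Treg, %CD16+CD56neg HLA-DR+ NK.
X = rng.normal(size=(75, 20))
# Synthetic severity score on a 0-7 scale (modified WHO-style ordinal scale).
severity = np.clip(np.round(3 + 1.5 * X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=75)), 0, 7)

model = make_pipeline(
    StandardScaler(),
    ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5, random_state=0),
)
model.fit(X, severity)
predicted = np.clip(np.round(model.predict(X)), 0, 7)  # snap back to the ordinal scale
```

The nonzero coefficients of the fitted model (`model[-1].coef_`) point to the features that drive the prediction, which is how particular cell subsets can be flagged as severity-associated.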
Conclusions/Significance: The Elastic Net model could stratify uninfected individuals and COVID-19 patients from asymptomatic to severe. The cellular subsets presented here could also help to better understand the induction and progression of symptoms in COVID-19.

Publication
Brain hemodynamic activity during viewing and re-viewing of comedy movies explained by experienced humor (Springer Nature, 2016-06-21)
Jaaskelainen, Iiro P.; Pajula, Juha; Tohka, Jussi; Lee, Hsin-Ju; Kuo, Wen-Jui; Lin, Fa-Hsuan; European Commission; Ministerio de Economía y Competitividad (España)

Humor is crucial in human social interactions. To study the underlying neural processes, three comedy clips were shown twice to 20 volunteers during functional magnetic resonance imaging (fMRI). Inter-subject similarities in humor ratings, obtained immediately after fMRI, explained the inter-subject correlation of hemodynamic activity in the right frontal pole and in a number of other brain regions. General linear model analysis also indicated activity explained by humorousness in the right frontal pole, as well as in additional cortical areas and, subcortically, in the striatum. The association of the right frontal pole with experienced humorousness is a novel finding, which might be related to humor unfolding over longer time scales in the movie clips. Specifically, the frontal pole has been shown to exhibit longer temporal receptive windows than, e.g., sensory areas, which might have enabled processing of humor in the clips based on holding information and reinterpreting it in light of new information arriving several (even tens of) seconds later.
As another novel finding, medial and lateral prefrontal areas, the frontal pole, posterior-inferior temporal areas, posterior parietal areas, the posterior cingulate, striatal structures, and the amygdala showed reduced activity upon re-viewing of the clips, suggesting involvement in processing of humor related to the novelty of the comedic events.

Publication
Rey's Auditory Verbal Learning Test scores can be predicted from whole brain MRI in Alzheimer's disease (Elsevier, 2017-01-01)
Moradi, Elaheh; Hallikainen, Ilona; Hänninen, Tuomo; Tohka, Jussi; Banco Santander; European Commission; Ministerio de Economía y Competitividad (España); Ministerio de Educación, Cultura y Deporte (España)

Rey's Auditory Verbal Learning Test (RAVLT) is a powerful neuropsychological tool for testing episodic memory, widely used for cognitive assessment in dementia and pre-dementia conditions. Several studies have shown that an impairment in RAVLT scores reflects well the underlying pathology caused by Alzheimer's disease (AD), making RAVLT an effective early marker for detecting AD in persons with memory complaints. We investigated the association between RAVLT scores (RAVLT Immediate and RAVLT Percent Forgetting) and the structural brain atrophy caused by AD. The aim was to comprehensively study to what extent RAVLT scores are predictable from structural magnetic resonance imaging (MRI) data using machine learning approaches, and to find the brain regions most important for estimating them. For this, we built a predictive model that estimates RAVLT scores from gray matter density via an elastic net penalized linear regression model. The proposed approach provided highly significant cross-validated correlations between the estimated and observed RAVLT Immediate (R = 0.50) and RAVLT Percent Forgetting (R = 0.43) scores in a dataset of 806 AD, mild cognitive impairment (MCI), or healthy subjects.
In addition, the selected machine learning method provided more accurate estimates of RAVLT scores than the relevance vector regression used earlier for estimating RAVLT from MRI data. The top predictors were medial temporal lobe structures and the amygdala for RAVLT Immediate, and the angular gyrus, hippocampus, and amygdala for RAVLT Percent Forgetting. Further, the conversion of MCI subjects to AD within 3 years could be predicted from either observed or estimated RAVLT scores with an accuracy comparable to MRI-based biomarkers.

Publication
Validation of scientific topic models using graph analysis and corpus metadata (Springer Nature, 2022-03-30)
Vázquez López, Manuel Alberto; Pereira Delgado, Jorge; Cid Sueiro, Jesús; Arenas García, Jerónimo; European Commission; Ministerio de Ciencia, Innovación y Universidades (España); Agencia Estatal de Investigación (España)

Probabilistic topic modeling algorithms like Latent Dirichlet Allocation (LDA) have become powerful tools for the analysis of large collections of documents (such as papers, projects, or funding applications) in science, technology and innovation (STI) policy design and monitoring. However, selecting an appropriate and stable topic model for a specific application (by adjusting the hyperparameters of the algorithm) is not a trivial problem. Common validation metrics like coherence or perplexity, which focus on the quality of the topics, are not a good fit in applications where the quality of the document similarity relations inferred from the topic model is especially relevant. Relying on graph analysis techniques, our work proposes a new methodology for hyperparameter selection that is specifically oriented to optimizing the similarity metrics emanating from the topic model.
To this end, we propose two graph metrics: the first measures the variability of the similarity graphs that result from different runs of the algorithm for a fixed value of the hyperparameters, while the second measures the alignment between the graph derived from the LDA model and another obtained using metadata available for the corresponding corpus. Through experiments on various corpora related to STI, it is shown that the proposed metrics provide relevant indicators for selecting the number of topics and building persistent topic models that are consistent with the metadata. Their use, which can be extended to topic models beyond LDA, could facilitate the systematic adoption of these techniques in STI policy analysis and design.

Publication
Evaluation of Colombian crops fibrous byproducts for potential applications in sustainable building acoustics (MDPI, 2021-01-01)
Gomez, Tomas Simon; Zuluaga, Santiago; Jiménez, Maritza; Navacerrada, Maria De Los Angeles; Barbero-Barrera, Maria Del Mar; Prida Caballero, Daniel de la; Restrepo-Osorio, Adriana; Fernández-Morales, Patricia

Local production of construction materials is a valuable tool for improving the sustainability of the building sector. In this sense, the use of lignocellulosic fibers from local species becomes an interesting alternative for the development of such materials. Since the properties of fiber-based materials are thought to depend on the properties of the constituent fibers, knowledge of those properties is fundamental to promote materials development. This study compares the physical, morphological, acoustic, and mechanical characteristics of coir (Cocos nucifera) and fique (Furcraea Agavaceae) fibers and panels. The chemical composition appears to be associated with the general behavior of the fibers and panels regarding tensile strength, thermal degradation behavior, and water absorption.
In most tests, fique showed superior performance; however, for thermal degradation and water absorption, both materials behaved similarly. The sound absorption measurements showed that fiber diameter affects sound absorption at high frequencies, where fique panels performed better than coir panels.

Publication
Comportamiento acústico y térmico de materiales basados en fibras naturales para la eficiencia energética en edificación (Editorial CSIC, 2021-01-01)
Navacerrada, María Ángeles; Prida Caballero, Daniel de la; Sesmero, Alberto; Pedrero, Antonio; Gómez, Tomás; Fernández-Morales, Patricia

The use of insulating materials is the first step toward reducing the energy required to maintain a comfortable indoor temperature and achieving energy efficiency. The goal is to design inexpensive, biodegradable, and recyclable materials for acoustic and thermal insulation, such as those based on natural fibers. This work studies the thermal and acoustic properties of nonwovens based on fique fibers, coir fibers, and cotton recycled from denim fabric. Possible uses of the manufactured materials are proposed based on the requirements of the Spanish Technical Building Code (Código Técnico).

Publication
Appraisal of non-destructive in situ techniques to determine moisture- and salt crystallization-induced damage in dolostones (Elsevier, 2022-08-01)
Fort, R.; Feijoo, J.; Varas-Muriel, M. J.; Navacerrada, M. A.; Barbero-Barrera, M. M.; Prida Caballero, Daniel de la; Comunidad de Madrid

Characterising both surface and subsurface pathologies (position, depth, width, etc.) that affect the porous materials used in building construction, once in service, is important to establish the most suitable intervention strategy. In this sense, the use of non-destructive techniques allows different properties to be analysed without affecting the material.
The present study assesses the accuracy of different non-destructive in situ techniques (electrical conductivity and capacitance, infrared thermography, ultrasonic pulse velocity, sound absorption, and electrical resistivity tomography) applied to the dolostone ashlars of the outer façade of a sixteenth-century bell tower affected by moisture- and salt-induced decay. The joint analysis of the results obtained with the different techniques substantially improves the interpretation and characterisation of the detected pathologies, as they complement each other. Electrical resistivity tomography, which delivers resistivity cross-sections, yields very good results in detecting subsurface pathologies, while sound absorption is particularly useful for stone surfaces. In both cases, the frequency of the electric field and that of the acoustic emission used to detect the extent of damage must be established in advance. The joint study of electrical conductivity and capacitance determines the degree of moisture/salts, both at the surface and subsurface, in the materials tested, one of the main causes of scaling and flaking in stony materials. However, the petrological characteristics of the materials and the saline phases present must be known in advance for a correct interpretation of the results.

Publication
A Nonstandard Schwarz Domain Decomposition Method for Finite-Element Mesh Truncation of Infinite Arrays (Institute of Electrical and Electronics Engineers (IEEE), 2018-11)
García Doñoro, Daniel; García Castillo, Luis Emilio; Sarkar, Tapan K.; Zhang, Yu

A nonstandard Schwarz domain decomposition method is proposed as finite-element mesh truncation for the analysis of infinite arrays. The proposed methodology provides an (asymptotically) numerically exact radiation condition regardless of the distance to the sources of the problem and without disturbing the original sparsity of the finite-element matrices.
Furthermore, it works as a multi-Floquet-mode (propagating and evanescent) absorbing boundary condition. Numerical results illustrating the main features of the proposed methodology are shown.

Publication
Self-Adaptive hp Finite Element Method with Iterative Mesh Truncation Technique Accelerated with Adaptive Cross Approximation (Elsevier, 2016-05-01)
Barrio Garrido, Rosa María; García Castillo, Luis Emilio; Gómez Revuelto, Ignacio; Salazar Palma, Magdalena

To alleviate the computational bottleneck of a powerful two-dimensional self-adaptive hp finite element method (FEM) for the analysis of open-region problems, which uses an iterative computation of an integral equation over a fictitious boundary to truncate the FEM domain, we propose the use of Adaptive Cross Approximation (ACA) to accelerate the computation of the integral equation. It is shown that, in this context, ACA exhibits robust behavior, yields good accuracy and compression levels of up to 90%, and provides good control of the approximants, which is a crucial advantage for hp adaptivity. Theoretical and empirical performance results (computational complexity) comparing the accelerated and non-accelerated versions of the method are presented. Several canonical scenarios are addressed to assess the behavior of ACA with h, p, and hp adaptive strategies, and with higher-order methods in general.

Publication
Second-Order Nedelec Curl-Conforming Prismatic Element for Computational Electromagnetics (Institute of Electrical and Electronics Engineers (IEEE), 2016-10-02)
Amor Martín, Adrián; García Castillo, Luis Emilio; García Doñoro, Daniel; Ministerio de Asuntos Económicos y Transformación Digital (España)

A systematic approach to obtaining mixed-order curl-conforming basis functions for a triangular prism is presented, with focus on the second-order case. The space of functions for the prism is given.
Basis functions are obtained as the dual basis with respect to suitably discretized Nedelec degrees of freedom functionals acting on elements of the space. Thus, the linear independence of the basis functions is assured, while membership of the basis in the a priori given space of functions is guaranteed. Different strategies for the finite element assembly of the basis are discussed. Numerical results verifying the correctness of the implemented basis functions are given, as well as results on the sensitivity of the condition number of the obtained basis with respect to the quality of the mesh elements. A comparison with other representative sets of basis functions for prisms is included.

Publication
On the use of many-core machines for the acceleration of a mesh truncation technique for FEM (Springer Science and Business Media LLC, 2019-03)
Belloch Rodríguez, José Antonio; Amor Martín, Adrián; García Doñoro, Daniel; Martínez Zaldívar, Francisco J.; Comunidad de Madrid; Ministerio de Economía y Competitividad (España); Ministerio de Asuntos Económicos y Transformación Digital (España)

The finite element method (FEM) has been used for years for radiation problems in the field of electromagnetism. Problems of this kind require mesh truncation techniques, which may demand high computational resources; in fact, electrically large radiation problems can only be tackled using massively parallel computational resources. Different types of multi-core machines are commonly employed in diverse fields of science to accelerate a number of applications, but properly managing their computational resources is a very challenging task. On the one hand, we present a hybrid message passing interface (MPI) + OpenMP acceleration of a mesh truncation technique included in a FEM code for electromagnetism, running on a high-performance computing cluster equipped with 140 compute nodes.
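For context, performance of hybrid runs like the one described above is usually reported in terms of speedup and parallel efficiency; a minimal sketch, with hypothetical timings:

```python
# Speedup and parallel efficiency from wall-clock timings (numbers are hypothetical).
def speedup(t_serial: float, t_parallel: float) -> float:
    """How many times faster the parallel run is than the serial one."""
    return t_serial / t_parallel

def efficiency(t_serial: float, t_parallel: float, n_workers: int) -> float:
    """Fraction of the ideal n_workers-fold speedup actually achieved."""
    return speedup(t_serial, t_parallel) / n_workers

# Hypothetical example: 1000 s serial vs. 8.5 s on 140 nodes.
s = speedup(1000.0, 8.5)          # about 118x
e = efficiency(1000.0, 8.5, 140)  # about 0.84, i.e. ~84% of the theoretical maximum
```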
Results show that we obtain about 85% of the theoretical maximum speedup of the machine. On the other hand, a graphics processing unit has been used to accelerate one of the parts that presents high fine-grain parallelism.

Publication
Construction of higher-order curl-conforming finite elements and its assembly (Wiley, 2019-08)
Amor Martín, Adrián; García Castillo, Luis Emilio; Ministerio de Ciencia y Tecnología (España); Ministerio de Educación, Cultura y Deporte (España)

Different choices are available when constructing vector finite element bases in real coordinates. In this communication, two different designs of higher-order curl-conforming basis functions are introduced and explained, showing the particularities of their assembly. Tetrahedra and hexahedra are used as element shapes to assess the effect of triangular and quadrilateral faces on the two considered constructions of basis functions. A comparison of their robustness, in terms of the condition number of the finite element matrices under a number of distortions, is included.

Publication
GPU Acceleration of a Non-Standard Finite Element Mesh Truncation Technique for Electromagnetics (Institute of Electrical and Electronics Engineers (IEEE), 2020-05-07)
Badía, José M.; Amor Martín, Adrián; Belloch Rodríguez, José Antonio; García Castillo, Luis Emilio; Ministerio de Ciencia e Innovación (España)

The emergence of General Purpose Graphics Processing Units (GPGPUs) provides new opportunities to accelerate applications involving a large number of regular computations. However, properly leveraging the computational resources of graphics processors is a very challenging task. In this paper, we use this kind of device to parallelize FE-IIEE (Finite Element-Iterative Integral Equation Evaluation), a non-standard finite element mesh truncation technique introduced by two of the authors. This application is computationally very demanding due to the amount, size, and complexity of the data involved in the procedure.
Moreover, an efficient implementation becomes even more difficult if the parallelization has to maintain the complex workflow of the original code. The proposed CUDA implementation applies different optimization techniques to improve performance, including leveraging the fastest memories of the GPU and increasing the granularity of the computations to reduce the impact of memory accesses. We have applied our parallel algorithm to two real radiation and scattering problems, demonstrating speedups higher than 140 on a state-of-the-art GPU.

Publication
Test-Driven Development of a Substructuring Technique for the Analysis of Electromagnetic Finite Periodic Structures (MDPI AG, 2021-12-07)
Martínez Fernández, Ignacio; Amor Martín, Adrián; García Castillo, Luis Emilio; Ministerio de Ciencia e Innovación (España)

In this paper, we follow the Test-Driven Development (TDD) paradigm in the development of an in-house code for the finite element analysis of finite periodic electromagnetic structures (e.g., antenna arrays, metamaterials, and several other relevant electromagnetic problems). We use unit and integration tests, system tests (using the Method of Manufactured Solutions, MMS), and application tests (smoke, performance, and validation tests) to increase the reliability of the code and shorten its development cycle. We apply substructuring techniques based on the definition of a unit cell to benefit from the repeatability of the problem and speed up the computations. Specifically, we propose an approach that models the problem using only one type of Schur complement, which has advantages over other substructuring techniques.

Publication
A Testbench of Arbitrary Accuracy for Electromagnetic Simulations (Wiley, 2020-10)
Amor Martín, Adrián

Several electromagnetic problems for verification purposes in computational electromagnetics are introduced.
Details of the formulation of a generalized eigenvalue problem for lossless and lossy materials are provided to obtain a fast and ready-to-use means of verification. Codes written using the MATLAB symbolic toolbox are detailed to obtain arbitrary accuracy for the proposed problems. Finally, numerical results from a finite element method code are presented together with the analytical values to show the accuracy of the proposed code.

Publication
3D Magnetotelluric Modeling Using High-Order Tetrahedral Nédélec Elements on Massively Parallel Computing Platforms (Elsevier BV, 2022-03)
Castillo Reyes, Octavio; Modesto, David; Queralt, Pilar; Marcuello, Alex; Ledo, Juanjo; Amor Martín, Adrián; de la Puente, Josep; García Castillo, Luis Emilio; European Commission; Ministerio de Ciencia e Innovación (España)

We present a routine for 3D magnetotelluric (MT) modeling based upon the high-order edge finite element method (HEFEM), tailored unstructured tetrahedral meshes, and high-performance computing (HPC). This implementation extends the capabilities of the PETGEM modeller, initially developed for active-source electromagnetic methods in the frequency domain. We assess the accuracy, robustness, and performance of the code using a set of reference models developed by the MT community in well-known workshops. The scale and geological properties of these 3D MT setups are challenging, making them ideal for a rigorous validation. Our numerical assessment shows that this new algorithm can produce the expected solutions for arbitrary 3D MT models.
Our extensive experimental results also reveal four main insights: (1) high-order discretizations in conjunction with tailored meshes can offer excellent accuracy; (2) a rigorous mesh design based on the skin-depth principle can be beneficial for the solution of the 3D MT problem in terms of numerical accuracy and run time; (3) high-order polynomial basis functions achieve better speed-up and parallel efficiency ratios than low-order basis functions on cutting-edge HPC platforms; (4) a triple-helix approach based on HEFEM, tailored meshes, and HPC can be extremely competitive for the solution of realistic and complex 3D MT models, and for geophysical electromagnetics in general.

Publication
Adaptive Semi-Structured Mesh Refinement Techniques for the Finite Element Method (MDPI, 2021-04-02)
Amor Martín, Adrián; García Castillo, Luis Emilio; Ministerio de Ciencia e Innovación (España)

Adaptive mesh techniques applied to the Finite Element Method have long been an active research line, but they are usually applied to tetrahedra. Here, we use the triangular prism as the discretization shape for a Finite Element Method code with adaptivity. The adaptive process consists of three steps: error estimation, marking, and refinement. We adapt techniques already applied to other shapes to triangular prisms, detailing the differences. We use five different marking strategies and compare the results obtained with different parameters, adapting these strategies to the conformation process needed to avoid hanging nodes in the resulting mesh. We have also applied two special rules to ensure the quality of the refined mesh.
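Far from the 3D prismatic setting of the paper, the estimate-mark-refine cycle can be illustrated with a toy 1D sketch using a bulk (Doerfler-style) marking strategy; the error indicator here is deliberately crude and everything in this sketch is illustrative:

```python
# Toy 1D estimate-mark-refine loop with bulk (Doerfler-style) marking (illustrative only).
import numpy as np

def adapt_mesh(nodes, f, theta=0.5, n_iters=6):
    """Repeatedly bisect the intervals that carry most of a crude error indicator."""
    for _ in range(n_iters):
        # 1) Estimate: per-interval indicator, here simply the variation of f.
        eta = np.abs(np.diff(f(nodes)))
        # 2) Mark: smallest set of intervals holding at least a fraction
        #    theta of the total estimated error (bulk criterion).
        order = np.argsort(eta)[::-1]
        n_marked = int(np.searchsorted(np.cumsum(eta[order]), theta * eta.sum())) + 1
        marked = order[:n_marked]
        # 3) Refine: bisect the marked intervals by inserting their midpoints.
        midpoints = 0.5 * (nodes[marked] + nodes[marked + 1])
        nodes = np.sort(np.concatenate([nodes, midpoints]))
    return nodes

# Example: refinement concentrates around the steep layer of tanh(20 * (x - 0.5)).
mesh = adapt_mesh(np.linspace(0.0, 1.0, 11), lambda x: np.tanh(20 * (x - 0.5)))
```

In a real FEM code the indicator comes from a residual- or recovery-based error estimator, and the refinement step must also handle the conformation rules (no hanging nodes) discussed above.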
We show the effect of these rules using the Method of Manufactured Solutions, and present numerical results validating the implementation.

Publication
Sparse and kernel OPLS feature extraction based on eigenvalue problem solving (Elsevier, 2015-05-01)
Muñoz Romero, Sergio; Arenas García, Jerónimo; Gómez Verdejo, Vanessa; Ministerio de Economía y Competitividad (España)

Orthonormalized partial least squares (OPLS) is a popular multivariate analysis method for supervised feature extraction. In machine learning papers, OPLS projections are usually obtained by solving a generalized eigenvalue problem, whereas in statistical papers the method is typically formulated as a reduced-rank regression problem, leading to a formulation based on a standard eigenvalue decomposition. A first contribution of this paper is to derive explicit expressions for matching the OPLS solutions obtained under both approaches, and to argue that the standard eigenvalue formulation is normally also more convenient for feature extraction in machine learning. More importantly, since optimization with respect to the projection vectors is carried out without constraints via a minimization problem, the inclusion of penalty terms that favor sparsity is straightforward. We exploit this fact to propose modified versions of OPLS: relying on the ℓ1 norm, we propose a sparse version of linear OPLS as well as a non-linear kernel OPLS with pattern selection, and we incorporate a group-lasso penalty to derive an OPLS method with true feature selection. The discriminative power of the proposed methods is analyzed on a benchmark of classification problems, and the degree of sparsity they achieve is compared with that of other state-of-the-art methods for sparse feature extraction.

Publication
Combinations of adaptive filters (IEEE, 2016-01)
Arenas García, Jerónimo; Azpicueta Ruiz, Luis Antonio; Silva, Magno T. M.; Nascimento, Vitor H.; Sayed, Ali H.; Ministerio de Economía y Competitividad (España)

Adaptive filters are at the core of many signal processing applications, ranging from acoustic noise suppression to echo cancellation [1], array beamforming [2], and channel equalization [3], to more recent sensor network applications in surveillance, target localization, and tracking. A trending approach in this direction is to resort to in-network distributed processing, in which individual nodes implement adaptation rules and diffuse their estimates to the network [4], [5].

Publication
Adaptive diffusion schemes for heterogeneous networks (IEEE, 2017-11-01)
Fernández Bes, Jesús; Arenas García, Jerónimo; Silva, Magno T. M.; Azpicueta Ruiz, Luis Antonio; Comunidad de Madrid; European Commission; Ministerio de Economía y Competitividad (España)

In this paper, we deal with distributed estimation problems in diffusion networks with heterogeneous nodes, i.e., nodes that implement different adaptive rules or differ in some other aspect, such as filter structure, length, or step size. Although such heterogeneous networks have been considered since the first works on diffusion networks, obtaining practical and robust schemes to adaptively adjust the combiners in different scenarios is still an open problem. Here, we study a diffusion strategy specifically designed for and suited to heterogeneous networks. Our approach is based on two key ingredients: 1) the adaptation and combination phases are completely decoupled, so that network nodes keep purely local estimates at all times, and 2) combiners are adapted to minimize estimates of the network mean-square error. Our scheme is compared with the standard adapt-then-combine scheme and analyzed theoretically using energy conservation arguments.
Several experiments involving networks with heterogeneous nodes show that the proposed decoupled adapt-then-combine approach with adaptive combiners outperforms other state-of-the-art techniques, making it a competitive approach in these scenarios.
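As a point of reference for the diffusion strategies discussed in the last two entries, the baseline adapt-then-combine (ATC) diffusion LMS scheme can be sketched over a small network as follows; this is an illustrative sketch of the standard scheme (not the decoupled scheme proposed in the paper), and the network topology, combiner matrix, and step sizes are made up:

```python
# Standard adapt-then-combine (ATC) diffusion LMS over a 4-node network (illustrative
# sketch; the paper's scheme decouples adaptation from combination and adapts the combiners).
import numpy as np

rng = np.random.default_rng(1)
n_nodes, dim, n_steps = 4, 5, 2000
w_true = rng.normal(size=dim)                        # common parameter all nodes estimate
A = np.array([[0.6, 0.2, 0.2, 0.0],                  # row-stochastic combination matrix
              [0.2, 0.6, 0.0, 0.2],                  # (made-up ring-like topology)
              [0.2, 0.0, 0.6, 0.2],
              [0.0, 0.2, 0.2, 0.6]])
mu = np.full(n_nodes, 0.01)                          # step sizes (could differ per node)
W = np.zeros((n_nodes, dim))                         # per-node estimates

for _ in range(n_steps):
    U = rng.normal(size=(n_nodes, dim))              # regressors at each node
    d = U @ w_true + 0.1 * rng.normal(size=n_nodes)  # noisy local measurements
    # Adapt: local LMS update at every node.
    err = d - np.sum(U * W, axis=1)
    psi = W + mu[:, None] * err[:, None] * U
    # Combine: each node averages its neighbors' intermediate estimates.
    W = A @ psi

mse = np.mean((W - w_true) ** 2)
```

Heterogeneity enters when nodes use different step sizes or rules; the decoupled scheme above then adapts the combiner weights (here fixed in `A`) to minimize an estimate of the network mean-square error.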