Browsing by Keyword "Computer Vision and Pattern Recognition"
Now showing 1 - 19 of 19
Item: Adversarial Networks for Spatial Context-Aware Spectral Image Reconstruction from RGB (IEEE, 2017-10)
Authors: Alvarez-Gila, Aitor; Van de Weijer, Joost; Garrote, Estibaliz; Tecnalia Research & Innovation; Quantum
Hyperspectral signal reconstruction aims at recovering the original spectral input that produced a certain trichromatic (RGB) response from a capturing device or observer. Given the heavily underconstrained, non-linear nature of the problem, traditional techniques leverage different statistical properties of the spectral signal to build informative priors from real-world object reflectances for constructing such an RGB-to-spectral-signal mapping. However, most of them treat each sample independently and thus do not benefit from the contextual information that the spatial dimensions can provide. We pose hyperspectral natural image reconstruction as an image-to-image mapping learning problem and apply a conditional generative adversarial framework to help capture spatial semantics. This is the first time Convolutional Neural Networks (and, in particular, Generative Adversarial Networks) have been used to solve this task. Quantitative evaluation shows a Root Mean Squared Error (RMSE) drop of 44.7% and a Relative RMSE drop of 47.0% on the ICVL natural hyperspectral image dataset.

Item: Automatic Red-Channel underwater image restoration (2015-01)
Authors: Galdran, Adrian; Pardo, David; Picon, Artzai; Alvarez-Gila, Aitor; Tecnalia Research & Innovation; COMPUTER_VISION; VISUAL
Underwater images typically exhibit color distortion and low contrast as a result of the exponential decay that light suffers as it travels through water. Moreover, colors associated with different wavelengths have different attenuation rates, with the red wavelength attenuating fastest. To restore underwater images, we propose a Red Channel method in which colors associated with short wavelengths are recovered, as expected for underwater images, leading to a recovery of the lost contrast.
The Red Channel method can be interpreted as a variant of the Dark Channel method used for images degraded by atmospheric haze. Experimental results show that our technique gracefully handles artificially illuminated areas and achieves a natural color correction and superior or equivalent visibility improvement compared to other state-of-the-art methods.

Item: Battery Storage Demonstration Projects: An Overview Across Europe (Institute of Electrical and Electronics Engineers Inc., 2021)
Authors: Astero, Poria; Maki, Kari; Evens, Corentin; Papadimitriou, Christina; Efthymiou, Venizelos; Niebe, Astrid; Holly, Stefanie; Marinelli, Mattia; Gabderakhmanova, Tatiana; Melendez, Joaquim; Herraiz, Sergio; Rodriguez-Sanchez, Raul; Morch, Andrei; De Urtasun, Laura Gimenez; Fernandez, Gregorio; Divshali, Poria Hasanpor; Tecnalia Research & Innovation
This paper summarises results and experiences from several demonstration projects across European countries in the field of battery energy storage system (BESS) integration into the power system. These research projects were selected among research institutes and universities that are part of the European Energy Research Alliance (EERA) Joint Program on Smart Grids. The paper categorizes these projects according to the demonstrated applications of BESS and then reviews specific aspects of each project. It provides a summary of the most recent results, as well as the challenges and open research questions, in projects focusing on different BESS applications in the power system.

Item: Cyber Hygiene Maturity Assessment Framework for Smart Grid Scenarios (2021-03-10)
Authors: Skarga-Bandurova, Inna; Kotsiuba, Igor; Velasco, Erkuden Rios; CIBERSEC&DLT
Cyber hygiene is a relatively new paradigm premised on the idea that organizations and stakeholders can achieve additional robustness and overall cybersecurity strength by implementing and following sound security practices.
It is a preventive approach entailing a strong organizational culture and education in information cybersecurity to enhance resilience and protect sensitive data. In an attempt to achieve high resilience of Smart Grids against negative impacts caused by different types of common and predictable, but also uncommon, unexpected, and uncertain threats, and to keep entities safe, the Secure and PrivatE smArt gRid (SPEAR) Horizon 2020 project has created an organization-wide cyber hygiene policy and developed a Cyber Hygiene Maturity assessment Framework (CHMF). This article presents the assessment framework for evaluating the Cyber Hygiene Level (CHL) of Smart Grids. Complementary to the SPEAR Cyber Hygiene Maturity Model (CHMM), we propose a self-assessment methodology based on a questionnaire for evaluating Smart Grid cyber hygiene practices. The result of the assessment can be used as a cyber-health check to define countermeasures and to reapprove the cyber hygiene rules, security standards, and specifications adopted by the Smart Grid operator organization. The proposed methodology is one example of a resilient approach to cybersecurity. It can be applied to assess the CHL of Smart Grid operating organizations with respect to a number of recommended good practices in cyber hygiene.

Item: DECKUBIK: A unique open building experimental facility aimed at testing ICT-based Ambient Assisted Living products (Association for Computing Machinery, 2014-09-10)
Authors: Arakistain, Ivan; Albizu, Josu; Castelruiz, Amaia; DIGITALIZACIÓN Y AUTOMATIZACIÓN DE LA CONSTRUCCIÓN; Tecnalia Research & Innovation
DECKUBIK is a unique R&D facility aimed at developing new concepts, products, and services for the aging market. This new facility has been installed on the ground floor of the experimental building KUBIK, located in Derio, Vizcaya (Spain). DECKUBIK simulates a dwelling, with two bedrooms, a living room, a kitchen, and a bathroom.
The spaces are equipped with the necessary infrastructure and are prepared to host new products, interfaces, sensor systems, services, and technologies that need to be tested and validated by Ambient Assisted Living (AAL) users. This paper focuses on two of the technologies we put into practice that can enhance the user experience in an AAL infrastructure: interactive furniture with capacitance-based sensing, which can provide new non-intrusive functionalities, and self-powered control systems.

Item: Development of computer games for assessment and training in post-stroke arm telerehabilitation (2012)
Authors: Rodriguez-De-Pablo, Cristina; Perry, Joel C.; Cavallaro, Francesca I.; Zabaleta, Haritz; Keller, Thierry; Tecnalia Research & Innovation
Stroke is the leading cause of long-term disability among adults in industrialized nations. The majority of these disabilities include deficiencies in arm function, which can make independent living very difficult. Research shows that better rehabilitation results are obtained when patients receive more intensive therapy. However, this intensive therapy is currently too expensive to be provided by the public health system, and at home few patients perform the repetitive exercises recommended by their therapists. Computer games can provide an affordable, enjoyable, and effective way to intensify treatment, while keeping patients as well as their therapists informed about their progress.
This paper presents the study, design, implementation, and user testing of a set of computer games for at-home assessment and training of upper-limb motor impairment after stroke.

Item: Ethical and legal implications for technological devices in clinical research in Europe: Flowchart design for ethical and legal decisions in clinical research (Association for Computing Machinery, 2021-09-22)
Authors: Garzo, Ainara; Garay-Vitoria, Nestor; Molina-Tanco, Luis; Manresa-Yee, Cristina; Gonzalez-Gonzalez, Carina; Montalvo-Gallego, Blanca; Reyes-Lecuona, Arcadio; Medical Technologies
In recent years, engineers developing new technologies with assistive or medical purposes have become aware that, to create acceptable and usable solutions, they need to involve final users, patients, and stakeholders in the design, development, and evaluation of systems, as well as in the device certification processes. Involving stakeholders in such processes has several ethical and legal implications. It has become evident that it is still difficult for engineers in Europe to know which ethical and legal processes should be carried out, as they have not been trained in these issues during their studies. This article is a review of the laws, standards, and recommendations applicable in Europe concerning human involvement in new-technology research, with the aim of helping researchers in the region identify the ethical and legal issues that could arise during those tasks. This review has been carried out in response to an identified need on the part of technological researchers. The design of a flowchart is presented as a summary of the interpretation of the documentation reviewed, with the aim of helping researchers take the ethical and legal decisions that apply to research involving humans. The flowchart presented has been validated against various research projects in which the authors have participated.
The proposed conceptual design can be used for decision making, but it is suggested that a tool based on this design be built to make decision making easier for researchers in this area.

Item: Evolutionary Multitask Optimization: A Methodological Overview, Challenges, and Future Research Directions (2022-04-12)
Authors: Osaba, Eneko; Del Ser, Javier; Martinez, Aritz D.; Hussain, Amir; Quantum; IA
In this work, we consider multitasking in the context of solving multiple optimization problems simultaneously by conducting a single search process. The principal goal when dealing with this scenario is to dynamically exploit the existing complementarities among the problems (tasks) being optimized, helping each other through the exchange of valuable knowledge. Additionally, the emerging paradigm of evolutionary multitasking tackles multitask optimization scenarios by using biologically inspired concepts drawn from swarm intelligence and evolutionary computation. The main purpose of this survey is to collect, organize, and critically examine the abundant literature published so far in evolutionary multitasking, with an emphasis on the methodological patterns followed when designing new algorithmic proposals in this area (namely, multifactorial optimization and multipopulation-based multitasking). We complement our critical analysis with an identification of challenges that remain open to date, along with promising research directions that can leverage the potential of biologically inspired algorithms for multitask optimization.
The discussions held throughout this manuscript are offered to the audience as a reference on the general trajectory followed by the community working in this field in recent times, as well as a self-contained entry point for newcomers and researchers interested in joining this exciting research avenue.

Item: Impact of a resistive superconductive fault current limiter in a multi-terminal HVDC grid (Institute of Electrical and Electronics Engineers Inc., 2018-06-08)
Authors: Saldana, G.; Etxegarai, A.; Larruskain, D. M.; Iturregi, A.; Apinaniz, S.; POWER ELECTRONICS AND SYSTEM EQUIPMENT
The relevance of multi-terminal HVDC grids is expected to increase in the coming years. However, there are still several technical, economic, and legal limitations that interfere with the construction of such multi-terminal links. One of the main technical obstacles is the handling of DC fault currents, because the interruption of DC currents in HVDC systems remains a challenge. Among other proposals, the application of Superconducting Fault Current Limiters (SFCLs) combined with a mechanical Circuit Breaker (CB) technically solves that issue, according to the analysis and simulation results presented in this paper.

Item: On the Duality Between Retinex and Image Dehazing (IEEE Computer Society, 2018-12-14)
Authors: Galdran, Adrian; Bria, Alessandro; Alvarez-Gila, Aitor; Vazquez-Corral, Javier; Bertalmío, Marcelo; Tecnalia Research & Innovation; VISUAL
Image dehazing deals with the removal of the undesired loss of visibility in outdoor images due to the presence of fog. Retinex is a color vision model mimicking the ability of the Human Visual System to robustly discount varying illuminations when observing a scene under different spectral lighting conditions. Retinex has been widely explored in the computer vision literature for image enhancement and other related tasks. While these two problems are apparently unrelated, the goal of this work is to show that they can be connected by a simple linear relationship.
Specifically, most Retinex-based algorithms have the characteristic feature of always increasing image brightness, which turns them into ideal candidates for effective image dehazing by directly applying Retinex to a hazy image whose intensities have been inverted. In this paper, we give a theoretical proof that Retinex on inverted intensities is a solution to the image dehazing problem. Comprehensive qualitative and quantitative results indicate that several classical and modern implementations of Retinex can be transformed into competitive image dehazing algorithms performing on par with more complex fog removal methods, and can overcome some of the main challenges associated with this problem.

Item: On the heritability of dandelion-encoded harmony search heuristics for tree optimization problems (IEEE, 2015-09-24)
Authors: Perfecto, Cristina; Bilbao, Miren Nekane; Del Ser, Javier; Ferro, Armando; IA
Tree-based optimization problems stand for those paradigms where solutions can be arranged within a tree-like graph whose nodes represent the optimization variables of the problem at hand, and whose interconnecting edges represent topological and/or hierarchical relationships between such variables. In this context, a research line of increasing interest during the last decade focuses on the derivation of intelligent solution encoding strategies capable of 1) capturing all topological constraints of this particular class of graphs; and 2) preserving their connectivity properties when they undergo combination/mutation operations within approximative evolutionary solvers. This manuscript takes a step beyond the state of the art by shedding light on the heritability properties of the Dandelion tree encoding approach under avant-garde stochastically controlled evolutionary operators.
In particular, we elaborate on the topological heritability of the so-called Harmony Memory Considering Rate (HMCR) exploitative operator of the Harmony Search algorithm, a population-based meta-heuristic that has so far been shown to outperform other evolutionary schemes in a wide range of optimization scenarios. Results from extensive Monte Carlo simulations are discussed in terms of the preserved structural properties of the newly produced solutions with respect to the initial Dandelion-encoded population.

Item: REVE 2021: 9th International Workshop on Reverse Variability Engineering (Association for Computing Machinery, 2021-09-06)
Authors: Assunção, Wesley K.G.; Lopez-Herrejon, Roberto E.; Ziadi, Tewfik; Martinez, Jabier; Mousavi, Mohammad; Schobbens, Pierre-Yves; Araujo, Hugo; Schaefer, Ina; ter Beek, Maurice H.; Devroey, Xavier; Rojas, Jose Miguel; Pinto, Monica; Teixeira, Leopoldo; Berger, Thorsten; Noppen, Johannes; Reinhartz-Berger, Iris; Temple, Paul; Damiani, Ferruccio; Petke, Justyna; SWT
Software Product Line (SPL) migration remains a challenging endeavour. From organizational issues to purely technical challenges, there is a wide range of barriers that complicate SPL adoption. This workshop aims to foster research on making the most of the two main inputs for SPL migration: 1) domain knowledge and 2) legacy assets. Domain knowledge, usually implicit and spread across an organization, is key to defining the SPL scope and to validating the variability model and its semantics. At the technical level, domain expertise is also needed to create or extract the reusable software components. Legacy assets can be, for instance, similar product variants (e.g., requirements, models, source code, etc.) that were implemented using ad-hoc reuse techniques such as clone-and-own.
More generally, the REverse Variability Engineering workshop attracts researchers and practitioners contributing processes, techniques, tools, or empirical studies related to the automatic, semi-automatic, or manual extraction or refinement of SPL assets.

Item: Self-healing Multi-Cloud Application Modelling (ACM Digital Library, 2017-08-29)
Authors: Rios, Erkuden; Iturbe, Eider; Palacios, Maria Carmen; CIBERSEC&DLT
Cloud computing market forecasts and technology trends confirm that Cloud is an IT-disrupting phenomenon and that the number of companies with a multi-cloud strategy is continuously growing. Cost optimization and increased competitiveness for companies that exploit multi-cloud will only be possible when they are able to leverage multiple cloud offerings, while mastering both the complexity of managing multiple cloud providers and the protection against the higher exposure to attacks that multi-cloud brings. This paper presents the MUSA Security modelling language for multi-cloud applications, which is based on the Cloud Application Modelling and Execution Language (CAMEL), to overcome the lack of expressiveness of state-of-the-art modelling languages and to ease: a) the automation of distributed deployment; b) the computation of composite Service Level Agreements (SLAs) that include security and privacy aspects; and c) risk analysis and service match-making that take into account not only the functionality and business aspects of cloud services, but also their security aspects. The paper includes a description of the MUSA Modeller, the Web tool supporting modelling with the MUSA modelling language.
The paper also introduces the MUSA SecDevOps framework, in which the MUSA Modeller is integrated and with which it will be validated.

Item: Self-supervised Blur Detection from Synthetically Blurred Scenes (2019-12)
Authors: Alvarez-Gila, Aitor; Galdran, Adrian; Garrote, Estibaliz; Van de Weijer, Joost; Tecnalia Research & Innovation; VISUAL; Quantum
Blur detection aims at segmenting the blurred areas of a given image. Recent deep learning-based methods approach this problem by learning an end-to-end mapping between the blurred input and a binary mask representing the localization of its blurred areas. Nevertheless, the effectiveness of such deep models is limited by the scarcity of datasets annotated in terms of blur segmentation, as blur annotation is labour-intensive. In this work, we bypass the need for such annotated datasets for end-to-end learning, and instead rely on object proposals and a model for blur generation to produce a dataset of synthetically blurred images. This allows us to perform self-supervised learning over the generated image and ground-truth blur mask pairs using CNNs, defining a framework that can be employed in purely self-supervised, weakly supervised, or semi-supervised configurations. Interestingly, experimental results of such setups over the largest blur segmentation datasets available show that this approach achieves state-of-the-art results in blur segmentation, even without ever observing any real blurred image.

Item: Spectrum-based feature localization: A case study using ArgoUML (Association for Computing Machinery, 2021-09-06)
Authors: Michelon, Gabriela K.; Sotto-Mayor, Bruno; Martinez, Jabier; Arrieta, Aitor; Abreu, Rui; Assunção, Wesley K. G.; Mousavi, Mohammad; Schobbens, Pierre-Yves; Araujo, Hugo; Schaefer, Ina; ter Beek, Maurice H.; Devroey, Xavier; Rojas, Jose Miguel; Pinto, Monica; Teixeira, Leopoldo; Berger, Thorsten; Noppen, Johannes; Reinhartz-Berger, Iris; Temple, Paul; Damiani, Ferruccio; Petke, Justyna; SWT
Feature localization (FL) is a basic activity in re-engineering legacy systems into software product lines. In this work, we explore the use of the Spectrum-based localization technique for this task. This technique is traditionally used for fault localization, but it has practical applications in other tasks, such as the dynamic FL approach that we propose. The ArgoUML SPL benchmark is used as a case study, and we compare our approach with a previous hybrid (static and dynamic) approach, from which we reuse the manual and testing execution traces of the features. We conclude that using the Spectrum-based approach is feasible and sound, providing promising results in the benchmark metrics.

Item: Substation-Aware: An intrusion detection system for the IEC 61850 protocol (Association for Computing Machinery, 2022-08-23)
Authors: Lopez, Jose Antonio; Angulo, Iñaki; Martinez, Saturnino; POWER SYSTEMS; DIGITAL ENERGY; Tecnalia Research & Innovation
The number of cyberattacks against the Smart Grid has increased in recent years. As it is considered critical infrastructure, power system operators must improve the cybersecurity countermeasures of their installations. Intrusion Detection Systems (IDS) appear as a promising solution to detect the hidden activity of hackers before an attack is launched. Most detection tools are generalist, designed to find predefined patterns such as message frequency, well-known malware packets, the source and destination of messages, or the content of each packet itself. These tools also allow plugging in modules for different protocols, offering a better understanding of the analysed data, such as the protocol action (read, write, reset...) or data model/schema understanding.
However, the semantics of the transmitted data cannot be inferred. The Substation-Aware (SBT-Aware) tool adds this last feature for primary and secondary substations, taking into account not only the protocols defined in the IEC 61850 standard but the substation topology as well. In this paper we present SBT-Aware, an IDS that has been developed and tested in the course of the H2020 SDN-microSENSE project.

Item: Towards an anonymous incident communication channel for electric smart grids (Association for Computing Machinery, 2018-11-29)
Authors: Triantafyllou, Anna; Sarigiannidis, Panagiotis; Sarigiannidis, Antonios; Rios, Erkuden; Iturbe, Eider; Mamalis, Basilis; Karanikolas, Nikitas N.; CIBERSEC&DLT
The Electric Smart Grid (ESG) is an intelligent critical infrastructure aiming to create an automated and distributed advanced energy delivery network, while preserving information privacy. This study proposes the implementation of an Anonymous Incident Communication Channel (AICC) among smart grids across Europe to improve situational awareness and enhance the security of the new electric intelligent infrastructures. All participating organizations will have the ability to broadcast sensitive information, stored anonymously in a repository, without exposing the organisation's reputation. This work focuses on the requirements for establishment, the possible obstacles, and the proposed data protection techniques to be applied in the AICC. Furthermore, a discussion is conducted regarding the documentation of cyber-incidents. Last but not least, the benefits and the potential risks of the AICC concept are also provided.

Item: Unravelling the effect of data augmentation transformations in polyp segmentation (2020-12)
Authors: Sánchez-Peralta, Luisa F.; Picón, Artzai; Sánchez-Margallo, Francisco M.; Pagador, J. Blas; COMPUTER_VISION
Purpose: Data augmentation is a common technique to overcome the lack of large annotated databases, a usual situation when applying deep learning to medical imaging problems. Nevertheless, there is no consensus on which transformations to apply for a particular field. This work aims at identifying the effect of different transformations on polyp segmentation using deep learning. Methods: A set of transformations and ranges was selected, covering image-based (width and height shift, rotation, shear, zooming, horizontal and vertical flip, and elastic deformation), pixel-based (changes in brightness and contrast), and application-based (specular lights and blurry frames) transformations. A model was trained under the same conditions without data augmentation transformations (baseline) and for each transformation and range, using CVC-EndoSceneStill and Kvasir-SEG independently. A statistical analysis was performed to compare the baseline performance against the results of each range of each transformation on the same test set for each dataset. Results: This basic method identifies the most adequate transformations for each dataset. For CVC-EndoSceneStill, changes in brightness and contrast significantly improve the model performance. On the contrary, Kvasir-SEG benefits to a greater extent from the image-based transformations, especially rotation and shear. Augmentation with synthetic specular lights also improves the performance. Conclusion: Despite being infrequently used, pixel-based transformations show great potential to improve polyp segmentation in CVC-EndoSceneStill. On the other hand, image-based transformations are more suitable for Kvasir-SEG. Problem-based transformations behave similarly in both datasets.
The polyp area, brightness, and contrast of the dataset have an influence on these differences.

Item: Validating item response processes in digital competence assessment through eye-tracking techniques (ACM, 2020-10-21)
Authors: Bartolomé, Juan; Garaizar, Pablo; Bastida, Leire; Garcia-Penalvo, Francisco Jose; ADV_INTER_PLAT
This paper reports on an exploratory study aiming to validate item response processes in digital competence assessment through eye-tracking techniques. When measuring complex cognitive constructs, it is crucial to design the evaluation items correctly so that they trigger the intended knowledge and skills. Furthermore, assessing the validity of a test requires considering not only the content of the evaluation tasks involved in the test, but also whether examinees respond to the tasks by engaging construct-relevant response processes. The eye-tracking observations helped fill an 'explanatory gap' by providing data on variation in item response processes that is not captured by other sources of process data, such as think-aloud protocols or computer-generated log files. We propose a set of metrics that could help test designers validate the different item formats used in the evaluation of digital competence. The gaze data provided detailed information on test item response strategies, enabling profiling of examinee engagement and of the response processes associated with successful performance. There were notable differences between the participants who correctly solved the tasks and those who failed, in terms of the time spent solving them as well as their gaze data. Moreover, this included insights into response processes which contributed to the validation of the assessment criteria of each item.
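The duality stated in the "On the Duality Between Retinex and Image Dehazing" item above (Retinex applied to intensity-inverted images solves dehazing) can be sketched in a few lines. The following is a minimal illustrative sketch, not the authors' implementation: it uses a classic single-scale Retinex (log image minus log of a Gaussian-blurred illumination estimate) as a stand-in for the Retinex variants the paper evaluates, and all function and parameter names are hypothetical.

```python
import numpy as np


def gaussian_kernel(sigma):
    # 1-D normalized Gaussian kernel with a 3-sigma radius.
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()


def blur(img, sigma):
    # Separable Gaussian blur per channel with reflect padding.
    k = gaussian_kernel(sigma)
    r = len(k) // 2
    out = np.pad(img, ((r, r), (r, r), (0, 0)), mode="reflect")
    out = np.apply_along_axis(lambda v: np.convolve(v, k, mode="valid"), 0, out)
    out = np.apply_along_axis(lambda v: np.convolve(v, k, mode="valid"), 1, out)
    return out


def single_scale_retinex(img, sigma=15.0, eps=1e-6):
    # Classic SSR: log(image) minus log of its local illumination estimate,
    # rescaled to [0, 1] for display.
    img = img.astype(np.float64) + eps
    ssr = np.log(img) - np.log(blur(img, sigma) + eps)
    return (ssr - ssr.min()) / (ssr.max() - ssr.min() + eps)


def dehaze_via_retinex(hazy, sigma=15.0):
    # The duality: invert intensities, run Retinex, invert back.
    inverted = 1.0 - hazy
    return 1.0 - single_scale_retinex(inverted, sigma)
```

Since Retinex tends to brighten its input, running it on the inverted (bright-haze-becomes-dark) image and re-inverting darkens the haze, which is what makes the linear connection between the two problems work.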