Browsing by Type "doctoral thesis"
Now showing 1 - 12 of 12
Item Advanced Machine Learning Techniques and Meta-Heuristic Optimization for the Detection of Masquerading Attacks in Social Networks (Universidad de Alcalá, 2015-12-11) Villar-Rodriguez, Esther; Del Ser, Javier; Salcedo-Sanz, Sancho
According to the report published by the online protection firm Iovation in 2012, cyber fraud ranged from 1 percent of Internet transactions in North America to 7 percent in Africa, most of it involving credit card fraud, identity theft, and account takeover or hacking attempts. This kind of crime keeps growing thanks to the advantages offered by a non-face-to-face channel in which an increasing number of unsuspecting victims divulge sensitive information. Interpol classifies these illegal activities into three types:
• Attacks against computer hardware and software.
• Financial crimes and corruption.
• Abuse, in the form of grooming or “sexploitation”.
Most research efforts have focused on the target of the crime, developing different strategies for each specific case. Thus, for the well-known phishing attacks, stored blacklists or crime signals in the text are employed, eventually yielding ad-hoc detectors that can hardly be carried over to other scenarios even when the background is widely shared. Identity theft or masquerading can be described as a criminal activity oriented towards the misuse of stolen credentials to obtain goods or services by deception. On March 4, 2005, a million pieces of personal and sensitive information, such as credit card and social security numbers, were collected by White Hat hackers at Seattle University who merely surfed the Web for less than 60 minutes with the Google search engine. They thus demonstrated the vulnerability and lack of protection exposed by a mere handful of sophisticated search terms typed into an engine whose large data warehouse still served temporarily cached data from company and government websites.
As aforementioned, platforms that connect distant people, in which interaction is not face-to-face, offer a point of forcible entry for unauthorized third parties who impersonate the legitimate user in an attempt to go unnoticed while pursuing malicious, not necessarily economic, interests. In fact, the last point in the list above, regarding abuse, has become a major and terrible risk, along with bullying; both, through threats, harassment or even induced self-incrimination, are liable to drive someone to suicide, depression or helplessness. California Penal Code Section 528.5 states: “Notwithstanding any other provision of law, any person who knowingly and without consent credibly impersonates another actual person through or on an Internet Web site or by other electronic means for purposes of harming, intimidating, threatening, or defrauding another person is guilty of a public offense punishable pursuant to subdivision [...]”. Therefore, impersonation consists of any criminal activity in which someone assumes a false identity and acts in that assumed character with intent to obtain a pecuniary benefit or cause some harm. User profiling, in turn, is the process of harvesting user information in order to construct a rich template of all the attributes that are advantageous in the field at hand, for specific purposes. User profiling is often employed as a mechanism for recommending items or useful information which the client has not yet considered. Nevertheless, derived user tendencies or preferences can also be exploited to define the user's inherent behavior and to address the problem of impersonation by detecting outliers or strange deviations likely to signal a potential attack.
This dissertation elaborates on impersonation attacks from a profiling perspective, eventually developing a two-stage environment that embraces two levels of privacy intrusion and provides the following contributions:
• The inference of behavioral patterns from connection-time traces, aiming to avoid the usurpation of more confidential information. Compared with previous approaches, this procedure abstains from impinging on user privacy by seizing message content, since it relies only on time statistics of the user's sessions rather than on their content.
• The application and subsequent discussion of two algorithms selected to resolve the previous point:
– A commonly employed supervised algorithm executed as a binary classifier, which in turn forced us to devise a method to deal with the absence of labeled instances representing an identity theft.
– A meta-heuristic algorithm that searches for the most convenient parameters to arrange the instances within a high-dimensional space into properly delimited clusters, so that an unsupervised clustering algorithm can finally be applied.
• The analysis of message content, which encroaches on more private information but eases user identification by mining discriminative features with Natural Language Processing (NLP) techniques. As a consequence, the development of a new feature extraction algorithm based on linguistic theories, motivated by the massive number of features typically gathered when working with texts.
In summary, this dissertation means to go beyond the typical ad-hoc approaches adopted by previous identity theft and authorship attribution research. Specifically, it proposes solutions to this particular and extensively studied paradigm with the aim of introducing a generic approach from a profiling view, not tightly bound to a unique application field.
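The first contribution — flagging masquerading from connection-time statistics when no labeled theft instances exist — can be illustrated with a simple one-class outlier sketch. The feature names and the z-score scoring below are illustrative stand-ins, not the algorithms developed in the thesis:

```python
import statistics

def fit_profile(sessions):
    """Learn the mean and standard deviation of each per-session time
    statistic from the legitimate user's history."""
    keys = sessions[0].keys()
    return {k: (statistics.mean([s[k] for s in sessions]),
                statistics.stdev([s[k] for s in sessions]))
            for k in keys}

def anomaly_score(profile, session):
    """Mean absolute z-score of a new session against the learned profile;
    high values suggest a deviation worth flagging as possible masquerading."""
    zs = [abs(session[k] - mu) / sd for k, (mu, sd) in profile.items()]
    return sum(zs) / len(zs)

# Toy history: session start hour and duration (minutes) for the licit user.
history = [
    {"start_hour": 21, "session_minutes": 25},
    {"start_hour": 22, "session_minutes": 30},
    {"start_hour": 21, "session_minutes": 34},
    {"start_hour": 20, "session_minutes": 36},
    {"start_hour": 21, "session_minutes": 27},
]
profile = fit_profile(history)
usual = anomaly_score(profile, {"start_hour": 21, "session_minutes": 33})
odd = anomaly_score(profile, {"start_hour": 4, "session_minutes": 180})
```

Note that only timing metadata is consulted, mirroring the first, less privacy-intrusive stage described above.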
In addition, technical contributions have been made in the course of the solution formulation, intending to optimize familiar methods for better versatility towards the problem at hand. In summary, this Thesis establishes an encouraging research basis towards unveiling subtle impersonation attacks in Social Networks by means of intelligent learning techniques.
Item Algoritmos de generación de consigna de velocidad angular y ajuste del control de velocidad en aerogeneradores de media potencia (Universidad del País Vasco, 2017-02-17) González-González, A.; Zulueta Guerrero, Ekaitz
This thesis work addresses the optimization of the rotor angular speed set-point algorithm of a medium-power (100 kW) wind turbine. The integral and proportional parameters of the PI controller are computed through gain scheduling for six approximations of the wind turbine model: Methods I, II, III, IV, V and VI. Four strategies for adjusting the rotor angular speed set-point are presented: constant, conventional, reinforcement learning (RL), and particle swarm optimization (PSO). The methods and strategies are evaluated against multiple conflicting objectives: maximizing the energy captured from the wind, minimizing the angular speed error, minimizing the rotor angular acceleration, and minimizing the pitch angular speed. On the one hand, comparing the methods, the best results are obtained with Methods IV, V and VI. On the other hand, comparing the strategies, the RL strategy does not significantly improve on the constant and conventional strategies, whereas the PSO strategy obtains the best results.
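The PSO strategy mentioned above searches a parameter space for the best-performing set-point configuration. A minimal, generic particle swarm minimizer is sketched below; the quadratic objective is a hypothetical stand-in for the turbine's aggregated multi-objective cost, not the thesis's model:

```python
import random

random.seed(0)  # reproducible toy run

def pso_minimize(f, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer: returns (best_position, best_value)."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=pbest_val.__getitem__)
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy cost with its minimum at (1.0, -2.0).
best, val = pso_minimize(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
                         bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

In the thesis's setting, evaluating `f` would mean simulating the turbine and combining the four objectives into one score.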
(c) 2017 Asier González González
Item CORROSIÓN BACTERIANA EN CONDICIONES REPRESENTATIVAS DE UN ALMACÉN GEOLÓGICO PROFUNDO (UPV-EHU, 2014-11) Madina, Virginia; Legarreta, Juan Andrés
In a Deep Geologic Repository (DGR) for high-level radioactive waste, it cannot be ruled out that the metal container that holds and confines the nuclear waste may undergo microbiologically influenced corrosion (MIC). Bacteria naturally occurring in the groundwater, rocks and filler material surrounding the container are capable of promoting, accelerating (or inhibiting) corrosion processes through the formation of biofilms or the generation of corrosive metabolic products. Among the bacteria involved in corrosion phenomena, the sulphate-reducing bacteria (SRB), commonly observed in groundwater environments typical of clay and granite geological formations, are of major importance from a corrosion perspective, since they produce sulphide, which is highly corrosive to many metals.
Item Desarrollo de algoritmos de procesamiento de imagen avanzado para interpretación de imágenes médicas (UPV-EHU, 2016-12-01) Bereciartua, Arantza; Picón, Artzai; Iriondo, Pedro M.
Medical imaging has in recent years become a powerful diagnostic aid. Thanks to the advanced scanners and image reconstruction software available, it is possible to identify different organs and tissues, as well as to obtain data that help characterize and quantify pathologies. Radiologists are responsible for the use and interpretation of these images and demand tools that allow them to locate organs and tissues with greater accuracy and speed, and to identify and quantitatively characterize the pathologies present in them, in order to make an accurate diagnosis. Moreover, liver cancer is one of the leading causes of cancer death worldwide. Invasive techniques used for its diagnosis, such as surgical biopsies, can sometimes be replaced by non-invasive medical imaging techniques such as computed axial tomography (CT) and magnetic resonance imaging (MRI), with clear benefits for the patient. In order to assist radiologists and surgeons in reliable intervention planning, new methods and accurate, efficient tools are needed to properly locate and segment the organ of interest and the pathologies inside it. Automatic segmentation (delimitation) of the liver is a complex problem. Partial results have been achieved mainly on images obtained by CT. The MRI technique provides more information for diagnostic purposes.
However, liver segmentation in MRI is a challenge due to the presence of artifacts characteristic of that acquisition technology, such as partial volumes and noise, and, in general, due to low sharpness and low contrast between organs, so that the boundary between different tissues is often unclear. There are fewer developments on MRI, although they have been steadily increasing in recent years. In this thesis, we present a novel method for automatic liver segmentation on multichannel MRI. The proposed method consists of the minimization of a 3D active surface by means of the dual approach to the variational formulation of the underlying problem. This active surface evolves over a probability map based on a newly proposed compact descriptor comprising the spatial and multisequence information of every pixel, modeled against a previously generated multivariate statistical model of the liver. The 3D active surface approach naturally integrates volumetric regularization. The advantages of the compact visual descriptor together with the proposed approach result in a fast and accurate 3D segmentation method. The method was tested on 18 healthy liver studies, and results were compared to a gold standard produced by expert radiologists. Comparisons with other state-of-the-art approaches are provided by means of nine well-established quality metrics. The results obtained are in line with state-of-the-art methodologies, and in some cases even better.
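Among the standard metrics used for such segmentation comparisons is the Dice similarity coefficient, which measures the overlap between an automatic mask and the expert reference. A minimal sketch over toy binary masks (illustrative, not the thesis's evaluation code):

```python
def dice(a, b):
    """Dice similarity coefficient between two binary masks (flat sequences),
    reported here as a percentage: 100 * 2|A∩B| / (|A| + |B|)."""
    inter = sum(x and y for x, y in zip(a, b))
    size = sum(a) + sum(b)
    return 100.0 * 2 * inter / size if size else 100.0

# Toy 8-pixel masks: automatic result vs. expert reference.
auto   = [1, 1, 1, 0, 0, 1, 1, 0]
manual = [1, 1, 0, 0, 0, 1, 1, 1]
score = dice(auto, manual)  # 100 * 2*4 / (5 + 5) = 80.0
```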
A Dice similarity coefficient of 98.59 has been achieved.
Item Fatiga de Elementos Estructurales por Acciones Dinámicas Aleatorias en Medio Ambiente Agresivo (University of Cantabria, 2014-12-12) Calderón-Uríszar-Aldaca, Iñigo; Biezma Moraleda, María Victoria
Fatigue is a phenomenon that occurs when a structure or structural element is subjected to dynamic actions producing a cyclic variation of stresses over time, causing the appearance and propagation of cracks which, in time, reach a critical size sufficient to cause failure of the element or structure at stresses below the elastic limit. Since fatigue was first studied by the engineer Wöhler for the railway case in 1870, with studies limited to cycles of constant amplitude and zero mean stress, there have been significant contributions to the state of the art concerning the accumulation of damage under increasingly complex load sequences, with cycles of different amplitudes and various mean stresses in different orders, through linear rules, non-linear rules, etc. This doctoral thesis surveys the state of the art of structural engineering standards, confirming the obligation and need to consider the fatigue phenomenon in today's most advanced structures, determining the limits of application and the simplicity of the models currently applied, and thereby justifying their importance in structural engineering. It also makes an in-depth study of the scientific and engineering literature on the dynamics of structures under random actions, cycle-counting methods, the interaction of corrosion with fatigue, and related topics. As a result, it finds that current models lack simple, applicable means of accounting for the mean stress, the disorder of the cycles, and the effect of corrosion in random dynamic load sequences.
A simple adaptation of current fatigue calculation methods to account for these variables therefore constitutes, without doubt, a significant advance of the state of the art. Consequently, to make that advance, the mean stress, the disorder of the cycles and the effect of corrosion are studied and analysed, developing theoretical models that incorporate these effects into the damage accumulation calculated with the Palmgren-Miner linear rule, the most widely used today, through a series of correction coefficients or boost factors, first applied separately to simple cases. Besides being a significant advance for science, this is a practical one. Finally, all the factors are considered and applied together in a particular case study to demonstrate their applicability and effect, showing that fatigue predictions that neglect these factors lead to much lower damage forecasts, can end in unexpected failure, and leave the model on the unsafe side if applied without these corrective boost factors.
Item A Harmonized Compositional Assurance Approach for Safety-Critical Systems (Universidad de Deusto, 2015-12-16) Ruiz, Alejandra; Espinoza, Huascar; Kelly, Tim
Safety-critical systems, those whose failure could result in losses or injuries to people or the environment, are required to go through laborious and expensive certification processes. These systems have also grown in complexity and, as in other domains, component-based system development has been applied to deal with that complexity. However, components are difficult to assess, since certification is done at system level and not at component level. The compositional certification approach proposes to obtain incremental credit by accepting that a specific component complies with a specific standard's requirements and is correctly integrated.
The objective is to support the integration of new components without the previously integrated components having to undergo re-acceptance. We propose (1) the use of assurance modelling techniques to provide mechanisms for understanding the common basis of standards shared by different domains, such as avionics, automotive and medical device design. We propose (2) an assurance decomposition methodology offering guidance and modelling mechanisms to decompose the responsibilities associated with the life cycle of safety-critical components. This methodology establishes a hierarchy of assurance and certification projects in which responsibilities and project tasks can be specified and their accomplishment assessed to determine compliance with functional safety standards. Assurance decomposition supports the reuse of components, as it guides not just standards compliance but specifically the understanding and tailoring of those standards for component assurance, and supports those components when they are integrated into the final system. We propose (3) a contract-based approach to support the integration of reused components which, at the same time, supports the identification of assumptions, a very laborious and time-consuming task. Assurance contracts are defined to ensure incremental compliance once the components are integrated. The objective of these assurance contracts is to ensure the overall compliance of the system with the selected standards and reference documents, such as guidelines or advisory circulars. The defined approach to assurance contract specification attempts to balance the need for unambiguity in the composition with maintaining the heterogeneity of the information managed. The claims classification offers an easy method to support the assessment of contract completeness, and the structured expressions provide a semi-formal language to specify the assumptions and guarantees of a component.
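The assume/guarantee pattern behind such assurance contracts can be sketched as a tiny data structure: a component is acceptable in a system context only if every assumption in its contract is discharged. This is a deliberately simplified illustration (names and claims are invented); the thesis uses structured, SACM-style expressions rather than plain strings:

```python
from dataclasses import dataclass, field

@dataclass
class AssuranceContract:
    """Illustrative assume/guarantee contract for a safety component."""
    component: str
    assumptions: set = field(default_factory=set)  # what the context must provide
    guarantees: set = field(default_factory=set)   # what the component then ensures

def compatible(system_provides, contract):
    """Integration check: every assumption must be discharged by the system."""
    return contract.assumptions <= system_provides

# Hypothetical watchdog component and two candidate system contexts.
wdg = AssuranceContract(
    component="watchdog",
    assumptions={"clock_drift<1ms", "power_class_A"},
    guarantees={"fault_detected<10ms"},
)
ok = compatible({"clock_drift<1ms", "power_class_A", "ecc_ram"}, wdg)
bad = compatible({"power_class_A"}, wdg)
```

When `compatible` holds, the component's guarantees can be credited incrementally to the system-level argument, which is the essence of the compositional approach described above.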
This work has been mainly framed within European collaborative research projects: OPENCOSS, a large-scale integrating project (IP) with 17 partners from 9 countries developing a platform for safety assurance and certification of safety-critical systems (compliance with standards, robust argumentation, evidence management, process transparency); SAFEADAPT, an FP7 project with 9 partners; and RECOMP, an ARTEMIS project. The results of this work have been presented to the Object Management Group standardization group responsible for the SACM (Structured Assurance Case Metamodel) specification, which is currently discussing their inclusion in future versions. The (4) tools presented and used in this work have been included in the results of an open tool platform developed within the OPENCOSS project that is being released in PolarSys, an Eclipse Industry Working Group created by large industry players and tool providers to collaborate on the creation and support of open-source tools for the development of embedded systems.
Item La Belleza del Código: Influencia de la Web 2.0, los medios sociales y los contenidos multimedia en el desarrollo de HTML5 (Universidad de Salamanca, 2015-07-29) Tabarés-Gutiérrez, Raúl; Echeverría, Javier; García-Figuerola, Carlos
The last version of the hypertext standard that drives the Web (HTML5) was developed from 2004 to 2014. During this time it passed through a great number of technological trajectories and social interactions among the different groups of stakeholders interested in its development. With the standard made official by the W3C, the period of uncertainty around the future of the Web comes to an end. At the same time, it opens the door to a change of great dimension concerning the very conception of the hypertext standard and its functions. In the current research we expound the technological development of the HTML standard up to its last version (known as HTML5).
Simultaneously, we try to explain the many crossroads that bind technology and society in the light of several social-construction-of-technology theories. We claim that the development of HTML5 is a response by different stakeholders to the proliferation of proprietary software during the Web 2.0 period. Furthermore, we argue that the HTML5 development process is a phenomenon that combines social innovation, social diffusion and social appropriation: this technology pursues social ends through social means, involving a social reorganization among the different players concerned. To support these statements, the thesis carries out a historical and philosophical analysis of the evolution of hypertext standards from the origins of the Web to the rise of HTML5. This analysis has been contrasted with fieldwork based on 17 semi-structured interviews with 21 experts involved in HTML5. These lead users represent, at the same time, different stakeholders in the Web's value chain.
Item MULTISCALE INFORMATION MANAGEMENT FOR HISTORIC DISTRICTS´ ENERGY RETROFITTING. A framework. A methodology. A model. (Universidad Politecnica de Cataluña-UPC, 2015-12-11) Egusquiza, Aitziber; Roca, Josep; Izkara, Jose Luis
European historic urban districts are highly appreciated by their inhabitants and visitors, and can be considered one of the most valuable collective achievements of European culture. The preservation of our urban heritage requires protecting the social context as well as preserving the authenticity and integrity of its physical materiality. That means improving the quality of life of the inhabitants as well as the sustainability of the historic districts. This dissertation analyses historic urban districts as complex energy and informational systems in order to address the challenge of improving their sustainability and liveability while protecting their cultural values.
First, a methodological framework for energy retrofitting in all its phases has been defined, based on strategic information management and a multiscale perspective. Secondly, a decision-making methodology has been developed that allows the historic city to be modelled and the best strategies to be selected. To support the whole system, a multiscale data model has been designed. Finally, the historic city of Santiago de Compostela was selected for the implementation.
Item NUEVO MÉTODO BASADO EN EL HRV PARA LA EVALUACIÓN DE HMIs Y SISTEMAS ITS PARA TRANSPORTE POR CARRETERA INTEGRANDO FACTORES PERSONALES, TEMPORALES Y AMBIENTALES (Universidad de Deusto, 2015-10) Murgoitio, Jesus; Gutierrez, Jose Luis
This doctoral thesis makes an in-depth analysis of HRV (heart rate variability) as calculated from an ECG (electrocardiogram) and of how it varies with the attention span shown by individuals when driving land vehicles. The aim is to use this knowledge in applications where attention span needs to be estimated, focusing primarily on measuring the influence that the new HMIs (human-machine interfaces) being tried out in the latest generation of vehicles may have on road safety. More specifically, the thesis concentrates on identifying the parameters that best capture variability due to personal characteristics such as age, temporal features such as circadian cycles, or the surrounding environment such as road category, in order to integrate all of these features into a complete system. It is this integration that is most lacking in the literature, indeed almost non-existent, so the present work seeks the most universal classification criteria. This is essential for applications such as vehicle driving, which require a constant state of alert.
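HRV parameters of the kind discussed are typically computed from the series of RR intervals (the time between successive heartbeats) extracted from the ECG. Below is a minimal sketch of two standard time-domain measures; these are common textbook quantities, illustrative only, and not necessarily the four variables used in the thesis:

```python
import math

def sdnn(rr_ms):
    """Standard deviation of RR (NN) intervals: a global HRV measure."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / (len(rr_ms) - 1))

def rmssd(rr_ms):
    """Root mean square of successive RR differences: a short-term HRV measure."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Toy RR-interval series in milliseconds, as would be extracted from an ECG.
rr = [812, 790, 805, 830, 818, 801, 795]
```

A driver-attention application would track such measures over sliding windows and compare them against per-driver baselines.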
To this end, an application has been developed that generates patterns from a system of four standard variables derived from the HRV, allowing different ways of interacting with on-road vehicles to be compared and assessed. Finally, a further point of note is that the analysis contained in this thesis may be extended to other fields (e.g. risky activities or environments) where it is useful, or indeed necessary, to take a relative measurement of attention span rather than follow simple patterns, as understood by the Kahneman model.
Item OBTENCIÓN Y CARACTERIZACIÓN DE ALEACIONES BASADAS EN γ-TiAl MEDIANTE MÉTODOS PULVIMETALÚRGICOS (UPV-EHU, 2014) Lagos, M.A.; Agote, Iñígo; San Juan, José María
This doctoral thesis studies the production and characterization of γ-TiAl-based alloys by several non-conventional powder metallurgy methods. γ-TiAl-based alloys are very attractive for aerospace applications owing to their low weight and good mechanical properties up to 700-800 ºC. A major drawback, however, is the difficulty of manufacturing these alloys by conventional methods. In melting routes there is usually variability in the properties, generally due to compositional segregation in the ingots. Powder metallurgy offers an important advantage in terms of composition control. This work studies three technological routes, two based on combustion synthesis and one based on electric current sintering. In general, the main advantage of these techniques is the reduction in processing time and the possibility of using elemental powders, which cost less than pre-alloyed ones. The effect of the processing parameters in the various methods has been studied, and the technological routes have been optimized to obtain the highest possible density and microstructural homogeneity.
Once the most suitable processing conditions had been defined for each route, the density, microstructure, crystallographic phases, and tensile and creep mechanical properties were comparatively characterized. The values obtained were also compared with those of conventional methods. Finally, some aspects of scaling up electric current sintering have been considered, an important issue for any industrial application.
Item Techno-economic evaluation of building energy refurbishment processes from a life cycle perspective (UPV-EHU, 2015-12-18) Oregi, Xabat; Hernández, Rufino J.; Hernandez, Patxi
Under the new energy performance requirements set by Directive 2010/31/EU, buildings become more energy-efficient and the impact of the operational stage is reduced, increasing the relevance of the environmental and economic impact of the other life cycle stages. In addition, energy refurbishment requirements are increasing, generating the need to integrate the life cycle methodology into the prioritization of different energy-efficient retrofitting strategies for buildings. However, although the life cycle methodology has already been standardized, buildings are extremely complex systems and many studies apply simplifications to reduce the evaluation time. These simplifications, and the fact that the main objective of the energy renovation of an existing building is to reduce its impact during the operational stage, raise various questions about the need for, and added value of, applying the life cycle methodology: to what extent can the system boundary be simplified without reducing the accuracy of the results? What is the relationship between the impact saved during the operational stage and the impact generated during the other stages of the life cycle?
What are the most relevant parameters and/or stages when conducting a study and making a decision? To answer these questions on the basis of the EN 15978 and prEN 16627 standards, this research work proposes a quantitative methodology for assessing the impact generated at each stage of the life cycle of an energy-efficient retrofitting of a building. The proposed methodology is validated on a building constructed in San Sebastián (Spain) in 1963. However, given the risk of treating results from a single case study as overall conclusions, the research adds an exhaustive sensitivity analysis presenting new scenarios for most of the parameters that directly influence the calculation method. After analysing all the new scenarios defined with these data, the results obtained make it possible to answer how the increased accuracy of the results relates to the quantification of all the stages of the life cycle.
Item Visibility Recovery on Images Acquired in Attenuating Media. Application to Underwater, Fog, and Mammographic Imaging (UPV-EHU, 2015-12-17) Galdran, Adrian; Picon, Artzai; Pardo, David
When acquired in attenuating media, digital images often suffer from a particularly complex degradation that reduces their visual quality, hindering their suitability for further computational applications, or simply decreasing their visual pleasantness for the user. In these cases, mathematical image processing reveals itself as an ideal tool to recover some of the information lost during the degradation process. In this dissertation we deal with three practical scenarios in which this problem is especially relevant, namely underwater image enhancement, fog removal and mammographic image processing. In the case of digital mammograms, X-ray beams traverse human tissue, and electronic detectors capture them as they reach the other side.
However, the superposition of three-dimensional structures onto a two-dimensional image produces low-contrast images in which the structures of interest suffer from diminished visibility, obstructing diagnosis tasks. In fog removal, the loss of contrast is produced by atmospheric conditions: white colour takes over the scene uniformly as distance increases, also reducing visibility. For underwater images there is an added difficulty, since colour is not lost uniformly; instead, red colours decay the fastest, and green and blue colours typically dominate the acquired images. To address all these challenges, in this dissertation we develop new methodologies that rely on (a) physical models of the observed degradation and (b) the calculus of variations. Equipped with this powerful machinery, we design novel theoretical and computational tools, including image-dependent functional energies that capture the particularities of each degradation model. These energies are composed of different integral terms that are simultaneously minimized by means of efficient numerical schemes, producing a clean, visually pleasant and useful output image with better contrast and increased visibility. In every application considered, we provide comprehensive qualitative (visual) and quantitative experimental results to validate our methods, confirming that the developed techniques outperform other existing approaches in the literature.
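Physical degradation models for fog are often variants of the Koschmieder atmospheric scattering equation, I(x) = J(x)·t(x) + A·(1 − t(x)), where J is the scene radiance, t the transmission and A the airlight. The sketch below simply inverts this model on a toy 1-D signal when t and A are known; it is an illustration of the underlying physics, not the dissertation's variational method, which estimates these quantities from the image itself:

```python
def restore(I, t, A, t_min=0.1):
    """Invert I = J*t + A*(1 - t) per pixel; transmission is clamped at
    t_min to avoid amplifying noise in heavily attenuated regions."""
    return [(i - A) / max(tx, t_min) + A for i, tx in zip(I, t)]

# Toy 1-D 'image': true radiance J degraded by fog with airlight A = 1.0
# and per-pixel transmission t (lower t = more distant, foggier pixel).
J = [0.2, 0.8, 0.5]
t = [0.9, 0.5, 0.3]
A = 1.0
I = [j * tx + A * (1 - tx) for j, tx in zip(J, t)]  # forward (fogged) model
J_rec = restore(I, t, A)                            # recovers J up to rounding
```

The underwater case follows the same structure but with a per-colour-channel attenuation, which is why red decays fastest in the acquired images.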