Browsing by Author "Diaz-De-Arcaya, Josu"
Now showing 1 - 6 of 6
Item Akats: A System for Resilient Deployments on Edge Computing Environments Using Federated Machine Learning Techniques (Institute of Electrical and Electronics Engineers Inc., 2023)
Diaz-De-Arcaya, Josu; Torre-Bastida, Ana I.; Bonilla, Lander; López-De-Armentia, Juan; Miñón, Raúl; Zarate, Gorka; Almeida, Aitor; Solic, Petar; Nizetic, Sandro; Rodrigues, Joel J. P. C.; Lopez-de-Ipina Gonzalez-de-Artaza, Diego; Perkovic, Toni; Catarinucci, Luca; Patrono, Luigi; HPA
Edge computing is a game changer for IoT, as it allows IoT devices to process and analyze data independently instead of merely sending it to the cloud. However, managing this considerable number of devices and deploying workloads on them in a coordinated and intelligent manner remains a challenge. In this paper, we introduce the resilience dimension into these deployments and provide two main contributions: the use of federated machine learning techniques to develop a collaborative tool between the different devices aimed at detecting the possibility of a device failure, and, subsequently, the use of the inferred information to optimize deployment plans, ensuring resilience in the devices. These two advances are implemented in an intelligent system, Akats, whose architecture is described in detail in this article. Finally, an application scenario based on Industry 4.0 machine predictive maintenance is presented to exemplify the benefits of the proposed intelligent system.

Item IEM: A Unified Lifecycle Orchestrator for Multilingual IaC Deployments (Association for Computing Machinery, Inc, 2023-04-15)
Diaz-De-Arcaya, Josu; Osaba, Eneko; Benguria, Gorka; Etxaniz, Iñaki; Lobo, Jesus L.; Alonso, Juncal; Torre-Bastida, Ana I.; Almeida, Aitor; HPA; Quantum; IA
Over the last few years, DevOps methodologies have promoted a more streamlined operationalization of software components in production environments.
Infrastructure as Code (IaC) technologies play a key role in the lifecycle management of applications, as they promote the delivery of the infrastructural elements alongside the application components. In this way, IaC technologies aspire to minimize environment-related problems by providing a repeatable and traceable process. However, there is a large variety of IaC frameworks, each focusing on a different phase of the operationalization lifecycle, hence the need to master numerous technologies. In this research, we present the IaC Execution Manager (IEM), a tool devoted to providing a unified framework for the operationalization of software components that encompasses the various stages and technologies involved in the application lifecycle. We analyze an industrial use case to improve the current approach and conclude that the IEM is a suitable tool for solving the problem, as it promotes automation while reducing the learning curve associated with the required IaC technologies.

Item K2E: Building MLOps Environments for Governing Data and Models Catalogues while Tracking Versions (Institute of Electrical and Electronics Engineers Inc., 2022)
Zarate, Gorka; Minon, Raul; Diaz-De-Arcaya, Josu; Torre-Bastida, Ana I.; HPA
Nowadays, there is a variety of problems associated with the process of extracting value and information from data, such as data heterogeneity, data distribution, model versioning, and the vast variety of techniques and approaches. As a result, the data management process becomes hard to implement in real-world scenarios. In this context, catalogue tools for data and Artificial Intelligence models alleviate the burden of dealing with versioning tasks. Thus, the automation of data and model management processes is facilitated, complying with DataOps and MLOps good practices.
This work in progress enumerates key challenges to address when creating these types of catalogues: on the one hand, the management of the diversity of data and models' internal nature and their different versions, and on the other hand, the provision of adequate meta-information and governance tools such as access control and auditing. In this paper, the Knowledge to Environment (K2E) platform is presented, whose architecture aims to define the necessary components for the creation of environments that allow working with data and model catalogues. By environment creation, we mean providing a workspace populated with the datasets and models of an organization, while tracking their distinct versions by using specialised catalogues. In addition, this workspace will incorporate added-value tools for governance and auditing. Finally, an approach for implementing K2E is detailed.

Item MLPacker: A Unified Software Tool for Packaging and Deploying Atomic and Distributed Analytic Pipelines (Institute of Electrical and Electronics Engineers Inc., 2022)
Minon, Raul; Diaz-De-Arcaya, Josu; Torre-Bastida, Ana I.; Zarate, Gorka; Moreno-Fernandez-De-Leceta, Aitor; Solic, Petar; Nizetic, Sandro; Rodrigues, Joel J. P. C.; Gonzalez-de-Artaza, Diego Lopez-de-Ipina; Perkovic, Toni; Catarinucci, Luca; Patrono, Luigi; HPA
In recent years, the MLOps (Machine Learning Operations) paradigm has been attracting attention from the community, extrapolating the DevOps (Development and Operations) paradigm to the artificial intelligence (AI) development life-cycle. In this area, some challenges must be addressed to successfully deliver solutions, since there are specific nuances when dealing with AI operationalization, such as model packaging or monitoring. Fortunately, interesting and helpful approaches have emerged, both from the research community and industry. However, further research is still necessary to fill key gaps.
This paper presents a tool, MLPacker, for addressing some of them. Concretely, this tool provides mechanisms to package and deploy analytic pipelines both as REST APIs and in streaming mode. In addition, the analytic pipelines can be deployed atomically (i.e., the whole pipeline on the same machine) or in a distributed fashion (i.e., deploying each stage of the pipeline on distinct machines). In this way, users can take advantage of the cloud continuum paradigm, considering the edge-fog-cloud computing layers. Finally, the tool is decoupled from the training stage, sparing data scientists from integrating blocks of code into their experiments for operationalization. Besides the packaging mode (REST API or streaming), the tool can be configured to perform the deployments on local or remote machines, with or without containers. To this end, this paper describes the gaps this tool addresses, the components and flows supported, as well as a scenario with three different case studies to better explain the research conducted.

Item Multiobjective Optimization Analysis for Finding Infrastructure-as-Code Deployment Configurations (Association for Computing Machinery, 2023-08-04)
Osaba, Eneko; Diaz-De-Arcaya, Josu; Alonso, Juncal; Lobo, Jesus L.; Benguria, Gorka; Etxaniz, Iñaki; Quantum; HPA; IA
Multiobjective optimization is a hot topic in the artificial intelligence and operations research communities. The design and development of multiobjective methods is a frequent task for researchers and practitioners. As a result of this vibrant activity, a myriad of techniques have been proposed in the literature to date, demonstrating significant effectiveness in dealing with situations coming from a wide range of real-world areas. This paper focuses on a multiobjective problem related to optimizing Infrastructure-as-Code deployment configurations. The system implemented for solving this problem has been coined the IaC Optimizer Platform (IOP).
Although a prototypical version of the IOP has been introduced in the literature before, a deeper analysis focused on the resolution of the problem is needed in order to determine the most appropriate multiobjective method to embed in the IOP. The main motivation behind the analysis conducted in this work is to enhance the IOP's performance as much as possible. This is a crucial aspect of the system, given that it will be deployed in a real environment, as it is being developed as part of an H2020 European project. Going deeper, we resort in this paper to nine different evolutionary computation-based multiobjective algorithms. To assess the quality of the considered solvers, 12 different problem instances have been generated based on real-world settings. The results obtained by each method after 10 independent runs have been compared using Friedman's non-parametric tests. The findings of these tests led to the creation of a multi-algorithm system capable of applying different techniques according to the user's needs.

Item PIACERE Project: Description and Prototype for Optimizing Infrastructure as Code Deployment Configurations (Association for Computing Machinery, Inc, 2022-07-09)
Osaba, Eneko; Diaz-De-Arcaya, Josu; Orue-Echevarria, Leire; Alonso, Juncal; Lobo, Jesus L.; Benguria, Gorka; Etxaniz, Iñaki; Quantum; HPA; Tecnalia Research & Innovation; IA
PIACERE is a European project supported by the Union's Horizon 2020 research and innovation programme, whose objective is to enhance the productivity of DevOps teams in the operation of Infrastructure as Code (IaC) by offering an integrated DevSecOps framework. Thus, DevOps practitioners can develop IaC as if they were programming a common software application. To achieve this challenging task, one of the core technologies considered within PIACERE is the design and development of optimization metaheuristics, in a module coined the IaC Optimizer Platform (IOP).
The main objective of the IOP is to provide DevSecOps teams with the most appropriate deployment configurations that best fit a set of defined constraints. The goal of this technical paper is to describe the preliminary approach followed in PIACERE for carrying out this optimization, and how the IOP fits into the whole PIACERE ecosystem. Additionally, the results obtained in a preliminary experiment are detailed in this study.
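The Akats entry above describes building a collaborative failure-detection model with federated machine learning, where devices train locally and only model updates are shared. A minimal sketch of the aggregation step in that spirit is federated averaging (FedAvg); the function name, weights, and sample counts below are hypothetical and do not come from the paper's implementation.

```python
# Illustrative sketch: federated averaging of locally trained model weights,
# the kind of aggregation a collaborative edge failure-detector could use.
# All names and numbers here are invented for illustration.

def federated_average(local_weights, sample_counts):
    """Combine per-device weight vectors into a global model, weighting
    each device's contribution by how many samples it trained on."""
    total = sum(sample_counts)
    n_params = len(local_weights[0])
    global_weights = [0.0] * n_params
    for weights, count in zip(local_weights, sample_counts):
        for i, w in enumerate(weights):
            global_weights[i] += w * (count / total)
    return global_weights

# Three edge devices report locally trained weights and their sample counts;
# only these summaries leave the device, never the raw sensor data.
device_weights = [[0.2, 0.8], [0.4, 0.6], [0.3, 0.7]]
device_samples = [100, 300, 100]
print(federated_average(device_weights, device_samples))
```

The privacy benefit motivating the federated setup is visible in the call: the aggregator only ever sees weight vectors and counts, not the devices' data.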
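The two IOP-related entries revolve around multiobjective optimization of deployment configurations, where no single configuration is best on every objective. The core concept can be sketched as Pareto-dominance filtering; the objectives (cost, latency) and candidate values below are invented, and this is not the IOP's actual algorithm, which uses evolutionary solvers.

```python
# Illustrative sketch: keeping only the non-dominated deployment
# configurations under two minimized objectives, e.g. (cost, latency).
# The candidate values are hypothetical.

def dominates(a, b):
    """True if a is at least as good as b on every objective and strictly
    better on at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(configs):
    """Return the configurations not dominated by any other candidate."""
    return [c for c in configs
            if not any(dominates(other, c) for other in configs if other is not c)]

# Five hypothetical (cost, latency) pairs for candidate deployments.
candidates = [(3, 9), (5, 5), (8, 2), (6, 6), (9, 9)]
print(pareto_front(candidates))  # → [(3, 9), (5, 5), (8, 2)]
```

A multiobjective solver such as those compared in the analysis paper returns a front like this rather than a single winner, leaving the final trade-off to the DevSecOps team.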
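The MLPacker entry distinguishes atomic deployments (the whole pipeline on one machine) from distributed ones (one stage per machine across the edge-fog-cloud continuum). A toy sketch of that mapping decision is below; the stage names, machine names, and function are hypothetical, not MLPacker's real interface.

```python
# Illustrative sketch: mapping pipeline stages to machines in either
# "atomic" or "distributed" mode, mirroring the two deployment styles
# described for MLPacker. Schema and names are invented.

def plan_deployment(stages, mode, machines):
    """Return a stage -> machine assignment for the chosen mode."""
    if mode == "atomic":
        # Whole pipeline co-located on a single machine.
        return {stage: machines[0] for stage in stages}
    if mode == "distributed":
        # One stage per machine, e.g. across edge, fog, and cloud layers.
        if len(machines) < len(stages):
            raise ValueError("distributed mode needs one machine per stage")
        return dict(zip(stages, machines))
    raise ValueError(f"unknown mode: {mode}")

pipeline = ["preprocess", "extract-features", "predict"]
print(plan_deployment(pipeline, "atomic", ["edge-node-1"]))
print(plan_deployment(pipeline, "distributed",
                      ["edge-node-1", "fog-node-1", "cloud-vm-1"]))
```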