Browsing by Subject "data set"
Item: Automatic detection of building typology using deep learning methods on street level images
(PERGAMON-ELSEVIER SCIENCE LTD, 2020-03-20) Duque, J.; Gonzalez, D.; Rueda Plata, Diego; Acevedo, A.; Ramos, R.; Betancourt, A.; García, S.
Universidad EAFIT. Departamento de Economía y Finanzas; Research in Spatial Economics (RISE); Departamento de Ingeniería de Producción; Materiales de Ingeniería; Mecánica Aplicada

An exposure model is a key component for assessing potential human and economic losses from natural disasters. It consists of a spatially disaggregated description of the infrastructure and population of a region under study. Depending on the size of the settlement area, developing such models can be a costly and time-consuming task. In this paper we use a manually annotated dataset of approximately 10,000 photos acquired at street level in the urban area of Medellín to explore the potential of a convolutional neural network (CNN) for automatically detecting building materials and types of lateral-load resisting systems, the attributes that define a building's structural typology (a key issue in exposure models for seismic risk assessment). The developed model achieved a precision of 93% and a recall of 95% when identifying nonductile buildings, the buildings most likely to be damaged in an earthquake. Identifying fine-grained material typology is more difficult because many visual clues are physically hidden, but our model matches expert-level performance, achieving a recall of 85% and accuracy scores ranging from 60% to 82% on the three most common building typologies, which account for 91% of the total building population in Medellín. Overall, this study shows that a CNN can make a substantial contribution to developing cost-effective exposure models.
© 2020 Elsevier Ltd
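The listing gives no code, but the classification setup described above, street-level photos labeled by structural typology and fed to a CNN, can be sketched with a standard transfer-learning recipe. The backbone choice, label count, directory layout, and hyperparameters below are illustrative assumptions, not the configuration used in the paper.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical label count, e.g. the three most common building typologies.
NUM_CLASSES = 3

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Assumes street-level photos organized as images/train/<label>/<photo>.jpg
train_set = datasets.ImageFolder("images/train", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Transfer learning: reuse an ImageNet backbone, retrain the classifier head.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for epoch in range(10):
    model.train()
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Per-class precision and recall on a held-out split, the figures the abstract reports, would be computed after training; that evaluation step is omitted here for brevity.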
Item: Laguerre-Gauss filters in reverse time migration image reconstruction
(Sociedade Brasileira de Geofisica, 2017-01-01) Castrillón, J.G.P.; Montoya, O.L.Q.; Sierra-Sosa, D.
Universidad EAFIT. Escuela de Ciencias; Modelado Matemático

Reverse time migration (RTM) solves the acoustic or elastic wave equation by extrapolating the source and receiver wavefields in time. A migrated image is obtained by applying a criterion known as the imaging condition. The cross-correlation between source and receiver wavefields is the most commonly used imaging condition. However, this imaging condition produces spatial low-frequency noise, called artifacts, due to the unwanted correlation of the diving, head, and backscattered waves. Several techniques have been proposed to reduce the occurrence of artifacts; derivative operators such as the Laplacian are the most frequently used. In this work, we propose a technique based on a spiral phase filter ranging from 0 to 2π and a toroidal amplitude bandpass filter, known as the Laguerre-Gauss transform. Through numerical experiments we present the application of this filter to three synthetic data sets. In addition, we present a comparative spectral study of images obtained by the zero-lag cross-correlation imaging condition, Laplacian filtering, and Laguerre-Gauss filtering, showing their frequency features. We also present evidence, not only with simulated noisy velocity fields but also by comparison with the model velocity field gradients, that this method improves the RTM images by reducing the artifacts and notably enhances the reflective events.
© 2017 Sociedade Brasileira de Geofísica.
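The abstract describes the filter only in words. Below is a minimal numpy sketch of a Laguerre-Gauss-type filter applied to a 2D image in the wavenumber domain: a spiral phase combined with a toroidal (ring-shaped) amplitude. The specific amplitude profile, the bandwidth parameter, and taking the modulus of the result are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def laguerre_gauss_filter(image, omega=0.2):
    """Apply a Laguerre-Gauss-type filter to a 2D image.

    Combines a spiral phase with a toroidal Gaussian amplitude in the
    wavenumber domain. `omega` controls the radius/width of the
    amplitude ring (illustrative value).
    """
    ny, nx = image.shape
    ky = np.fft.fftfreq(ny)[:, None]
    kx = np.fft.fftfreq(nx)[None, :]
    k = np.sqrt(kx**2 + ky**2)
    # Spiral phase; arctan2 spans (-pi, pi], equivalent to 0..2*pi modulo 2*pi.
    theta = np.arctan2(ky, kx)

    # Toroidal amplitude: zero at k = 0, peaks on a ring, decays at high k.
    amplitude = (k / omega) * np.exp(-(k / omega) ** 2)
    lg = amplitude * np.exp(1j * theta)   # Laguerre-Gauss transfer function

    filtered = np.fft.ifft2(np.fft.fft2(image) * lg)
    return np.abs(filtered)               # edge-enhanced image

# Usage on a stand-in for a migrated section:
rtm_image = np.random.randn(256, 256)
enhanced = laguerre_gauss_filter(rtm_image)
```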
Item: The Network-Max-P-Regions model
(TAYLOR & FRANCIS LTD, 2017-05-04) She, B.; Duque, J.C.; Ye, X.
Universidad EAFIT. Departamento de Economía y Finanzas; Research in Spatial Economics (RISE)

This paper introduces a new p-regions model called the Network-Max-P-Regions (NMPR) model. The NMPR is a regionalization model that aims to aggregate n areas into the maximum number of regions (max-p) that satisfy a threshold constraint, minimizing heterogeneity while taking into account the influence of a street network. The exact formulation of the NMPR is presented, and a heuristic solution is proposed to efficiently compute near-optimal partitions on several simulated datasets and a case study in Wuhan, China.
© 2016 Informa UK Limited, trading as Taylor & Francis Group.

Item: The p-Regions Problem
(WILEY-BLACKWELL, 2011-01-01) Duque, Juan C.; Church, Richard L.; Middleton, Richard S.
Universidad EAFIT. Departamento de Economía y Finanzas; Research in Spatial Economics (RISE)

The p-regions problem involves the aggregation or clustering of n small areas into p spatially contiguous regions while optimizing some criterion. The main objective of this article is to explore possible avenues for formulating this problem as a mixed integer-programming (MIP) problem. The critical issue in formulating this problem is to ensure that each region is a spatially contiguous cluster of small areas. We introduce three MIP models for solving the p-regions problem. Each model minimizes the sum of dissimilarities between all pairs of areas within each region while guaranteeing contiguity. Three strategies designed to ensure contiguity are presented: (1) an adaptation of the Miller, Tucker, and Zemlin tour-breaking constraints developed for the traveling salesman problem; (2) the use of ordered-area assignment variables based on an extension of an approach by Cova and Church for the geographical site design problem; and (3) the use of flow constraints based on an extension of work by Shirabe. We test the efficacy of each formulation and specify a strategy to reduce overall problem size.
© 2011 The Ohio State University.
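Neither abstract reproduces a formulation, but the objective the two regionalization models above share, minimizing pairwise dissimilarity among areas assigned to the same region, can be written schematically as follows. The linking variables and constraints are a standard linearization assumed for illustration; the contiguity, threshold, and street-network constraints that are the papers' actual contributions are omitted.

```latex
\begin{align*}
\min_{x,\,t}\quad & \sum_{i<j} d_{ij}\, t_{ij} \\
\text{s.t.}\quad  & \sum_{k=1}^{p} x_{ik} = 1 \quad \forall i
                    && \text{(each area is assigned to exactly one region)} \\
                  & t_{ij} \ge x_{ik} + x_{jk} - 1 \quad \forall i<j,\ \forall k
                    && \text{(}t_{ij}=1\text{ when } i \text{ and } j \text{ share a region)} \\
                  & x_{ik},\, t_{ij} \in \{0,1\}
\end{align*}
```

Here d_ij is the dissimilarity between areas i and j, x_ik = 1 if area i is assigned to region k, and t_ij = 1 if areas i and j end up in the same region. In the max-p variants, the number of regions is itself a decision variable, pushed upward subject to a per-region threshold constraint.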
Item: Wind turbine selection method based on the statistical analysis of nominal specifications for estimating the cost of energy
(Elsevier Ltd, 2018-10-15) Arias-Rosales, A.; Osorio-Gómez, G.
Universidad EAFIT. Departamento de Ingeniería de Diseño; Ingeniería de Diseño (GRID)

Wind turbine selection is a critical engineering problem in the overall cost-effectiveness of a wind project. With the spread and democratization of wind energy technologies, non-expert stakeholders face the challenge of selecting among very different wind turbines. As a comprehensive indicator, the cost of energy can serve as a guide, but reportedly misleading publicity and commonly unavailable information make its calculation less accessible and less reliable. Accordingly, this work proposes a method to compare wind turbines, on the basis of the cost of energy, from only nominal specifications and a standard characterization of the local wind conditions. For this purpose, two key variables were identified as not usually available at a preliminary stage: the total efficiency and a feasible hub height. Through a systematic statistical analysis of the trends in a constructed dataset of 176 turbines, it was possible to establish regression models for estimating both variables. These models were tested on a validation set, and their estimations were found to correctly characterize the central trend of the data without significant deviations. The uncertainty related to the use of both models was addressed by analyzing the 95% prediction intervals and the stochastic rank dominance. The established statistical models were then used as the core of the proposed selection method. When the available information is limited or not trustworthy, the steps of the method can be followed to estimate the cost of energy of a given horizontal-axis wind turbine in a given location.
© 2018 Elsevier Ltd
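The paper's regression models are not reproduced in the abstract; the sketch below only illustrates how a cost-of-energy comparison from nominal specifications can be wired together once an estimate of total efficiency is available. The Weibull parameters, cost terms, idealized power curve, and the fixed efficiency value stand in for the quantities the authors estimate statistically and are assumptions for illustration.

```python
import numpy as np
from scipy import integrate, stats

def estimate_cost_of_energy(rated_power_kw, rotor_diameter_m, capital_cost_usd,
                            weibull_k=2.0, weibull_c=7.0,
                            total_efficiency=0.40, fcr=0.08, opex_frac=0.02):
    """Rough cost-of-energy estimate for a horizontal-axis wind turbine.

    total_efficiency and the cost factors (fcr = fixed charge rate,
    opex_frac = yearly O&M as a fraction of capital cost) are placeholders
    for the kind of quantities the paper estimates via regression on
    nominal specifications; the defaults are illustrative only.
    """
    rho = 1.225                                   # air density, kg/m^3
    area = np.pi * (rotor_diameter_m / 2) ** 2    # swept area, m^2

    def power_kw(v):
        # Idealized power curve: cubic up to rated power, no cut-in/cut-out.
        p = 0.5 * rho * area * total_efficiency * v**3 / 1000.0
        return np.minimum(p, rated_power_kw)

    wind = stats.weibull_min(c=weibull_k, scale=weibull_c)   # wind-speed pdf

    # Expected power over the site's wind-speed distribution.
    mean_power_kw, _ = integrate.quad(lambda v: power_kw(v) * wind.pdf(v), 0, 30)
    aep_kwh = mean_power_kw * 8760                # annual energy production

    annual_cost = capital_cost_usd * (fcr + opex_frac)
    return annual_cost / aep_kwh                  # USD per kWh

# Illustrative comparison of two hypothetical turbines at the same site:
print(estimate_cost_of_energy(2000, 90, 2.4e6))
print(estimate_cost_of_energy(1500, 82, 1.9e6))
```

In practice the efficiency argument would come from the regression on nominal specifications, and the estimated feasible hub height would adjust the Weibull scale through a wind-shear correction; both refinements are omitted in this sketch.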