Desarrollo y Diseño de Procesos
Permanent URI for this community
The Process Development and Design group seeks to develop and improve processes, using theoretical and experimental tools, to meet the industrial and environmental needs of the country and the region.
Research lines: Process and Product Development; Simulation and Modeling; Environmental Processes.
Minciencias code: COL0037569.
2019 category: A1.
School: Engineering.
Academic department: Ingeniería de Procesos (Process Engineering).
Coordinator: Santiago Builes Toro.
Email: sbuiles@eafit.edu.co
Browsing Desarrollo y Diseño de Procesos by date of publication
Showing 1 - 20 of 81
Item: Design of computer experiments applied to modeling compliant mechanisms (DELFT UNIV TECHNOLOGY, FAC INDUST DESIGN ENG, 2010-01-01). Arango, D.R.; Acosta, D.A.; Durango, S.; Ruiz, O.E.; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
This article discusses a procedure for force-displacement modeling of compliant mechanisms using a design of computer experiments methodology. This approach produces a force-displacement metamodel suited for real-time control of compliant mechanisms. The term metamodel denotes a simplified and efficient mathematical model of an unknown phenomenon or of computer codes. The metamodeling of compliant mechanisms is performed through virtual experiments based on factorial and space-filling designs of experiments. The procedure is used to model the quasi-static behavior of the HexFlex compliant mechanism. The HexFlex is a parallel compliant mechanism for nanomanipulation that allows six degrees of freedom of its moving stage. The metamodel of the HexFlex is built from virtual experiments using the Finite Element Method (FEM). The obtained metamodel for the HexFlex is linear over the movement range of the mechanism. Simulations of the metamodel showed good accuracy with respect to the virtual experiments. © Organizing Committee of TMCE 2010 Symposium.

Item: Statistical tuning of adaptive-weight depth map algorithm (SPRINGER, 2011-01-01). Hoyos, Alejandro; Congote, John; Barandiaran, Inigo; Acosta, Diego; Ruiz, Oscar; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
In depth map generation, the algorithm parameter settings that yield an accurate disparity estimation are usually chosen empirically or based on unplanned experiments.
A systematic statistical approach, including classical and exploratory data analyses on over 14,000 images to measure the relative influence of the parameters, allows their tuning based on the number of bad pixels. Our approach is systematic in the sense that the heuristics used for parameter tuning are supported by formal statistical methods. The implemented methodology improves the performance of dense depth map algorithms. As a result of the statistics-based tuning, the algorithm improves from 16.78% to 14.48% bad pixels, rising 7 spots in the Middlebury Stereo Evaluation Ranking Table. Performance is measured as the distance of the algorithm results from the Middlebury Ground Truth. Future work aims to achieve the tuning using significantly smaller data sets with fractional factorial and surface-response designs of experiments. © 2011 Springer-Verlag.

Item: Statistical tuning of adaptive-weight depth map algorithm (2011-01-01). Acosta, Diego Andrés.

Item: III Congreso Internacional sobre Diseño de Procesos y Productos (Fondo Editorial Universidad EAFIT, 2011-10-14). Acosta, Diego A.; Gil Pavas, Edison; Molina, Kevin Giovanni; Escobar, Jaime Alberto.

Item: Tuning of Adaptive Weight Depth Map Generation Algorithms: Exploratory Data Analysis and Design of Computer Experiments (DOCE) (SPRINGER, 2012-07-01). Acosta, Diego Andrés; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.

Item: Fitting of Analytic Surfaces to Noisy Point Clouds (2013-01-01). Ruiz, Oscar Eduardo; Arroyave-Tobón, S.; Acosta, Diego A.; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
Fitting C1-continuous or superior surfaces to a set of points sampled on a 2-manifold is central to reverse engineering, computer-aided geometric modeling, entertainment, modeling of art heritage, etc.
This article addresses the fitting of analytic (ellipso…

Item: Parametric curve reconstruction from point clouds using minimization techniques (2013-01-01). Ruiz, O.E.; Cortés, C.; Aristizábal, M.; Acosta, D.A.; Vanegas, C.A.; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
Curve reconstruction from noisy point samples is central to surface reconstruction and therefore to reverse engineering, medical imaging, etc. Although piecewise linear (PL) curve reconstruction plays an important role, smooth (C1-, C2-, …) curves are needed for many applications. In the reconstruction of parametric curves from noisy point samples, unsolved issues remain, such as (1) high computational expense, (2) presence of artifacts and outlier curls, (3) erratic behavior for self-intersecting curves, and (4) erratic excursions at sharp corners. Some of these issues are related to non-Nyquist (i.e., sparse) samples. In response to these shortcomings, this article reports the minimization-based fitting of parametric curves to noisy point clouds. Our approach features: (a) Principal Component Analysis (PCA) pre-processing to obtain a topologically correct approximation of the sampled curve; (b) numerical, instead of algebraic, calculation of roots in point-to-curve distances; (c) penalties for curve excursions based on both point-cloud-to-curve and curve-to-point-cloud distances; and (d) objective functions that are economical to minimize. The implemented algorithms successfully deal with self-intersecting and/or non-Nyquist samples. Ongoing research includes self-tuning of the algorithms and decimation of the point cloud and the control polygon.

Item: Diseño conceptual de una sonda Langmuir para caracterización de plasmas fríos mediante diseño estadístico de experimentos [Conceptual design of a Langmuir probe for cold plasma characterization through statistical design of experiments] (IMPRENTA UNIV ANTIOQUIA, 2013-01-01). Camargo, V.; Acosta, Diego A.; Jaramillo, Juan Manuel; Universidad EAFIT.
Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
The characterization and control of plasma-assisted processes has become increasingly urgent in order to adapt this kind of technology to industrial contexts. This work presents the design and construction of a cold plasma characterization system based on electrostatic means (a Langmuir probe), drawing on concepts of plasma physics and on engineering tools, design of experiments, and conceptual design. The result of this work is a functional prototype probe and a set of measurements on the reactor.

Item: Collaborative Networked Virtual Surgical Simulators (CNVSS): Factors Affecting Collaborative Performance (MIT PRESS, 2013-01-01). Diaz, Christian; Trefftz, Helmuth; Quintero, Lucia; Acosta, Diego A.; Srivastava, Sakti; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
Stand-alone and networked surgical simulators based on virtual reality have been proposed as a means to train surgeons in specific surgical skills, with or without expert guidance and supervision. However, a surgical operation usually involves a group of medical practitioners who cooperate as team members. To this end, CNVSS have been proposed for the collaborative training of surgical procedures, in which users with different surgical roles can take part in the training session. To be successful, these simulators should guarantee synchronicity, which requires (1) a consistent view of the surgical scene and (2) a quick response time. These two variables are affected by factors such as users' machine capabilities and network conditions. As far as we know, the impact of these factors on the performance of CNVSS has not been evaluated. In this paper, we describe the development of a CNVSS and a statistical factorial design of experiments (DOE) to determine the most important factors affecting collaboration in CNVSS.
From the results obtained, it was concluded that delay, jitter, packet-loss percentage, and processor speed have a major impact on collaboration in CNVSS.

Item: Conceptual design of a Langmuir probe for cold plasma characterization employing statistical design of experiments (IMPRENTA UNIV ANTIOQUIA, 2013-06-01). Camargo Suarez, Victor Hugo; Acosta Maya, Diego Andres; Jaramillo O, Juan Manuel; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
The characterization and control of plasma-assisted processes has become increasingly urgent in order to adapt this kind of technology to industrial contexts. This work presents the design and construction of a cold plasma characterization system based on electrostatic means (a Langmuir probe), drawing on concepts of plasma physics and on engineering tools, design of experiments, and conceptual design. The result of this work is a functional prototype probe and a set of measurements on the reactor.

Item: Design of computer experiments applied to modeling of compliant mechanisms for real-time control (SPRINGER, 2013-07-01). Acosta, Diego A.; Restrepo, David; Durango, Sebastian; Ruiz, Oscar E.; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
This article discusses the use of design of computer experiments (DOCE), i.e., experiments run with a computer model to find how a set of inputs affects a set of outputs, to obtain a force-displacement meta-model (a mathematical equation that summarizes and aids in analyzing the input-output data of a DOCE) of compliant mechanisms (CMs). The procedure discussed produces a force-displacement meta-model, or closed analytic vector function, that aims to control CMs in real time. In our work, the factorial and space-filling DOCE meta-model of CMs is supported by finite element analysis (FEA). The protocol discussed is used to model the HexFlex mechanism functioning under quasi-static conditions.
The HexFlex is a parallel CM for nano-manipulation that allows six degrees of freedom (x, y, z, θx, θy, θz) of its moving platform. In the multi-linear model fit of the HexFlex, the products or interactions proved to be negligible, yielding a model that is linear in the inputs over the operating range. The accuracy of the meta-model was assessed by conducting a set of computer experiments with a random uniform distribution of the input forces. Three error criteria were recorded, comparing the meta-model prediction with the results of the FEA experiments: (1) the maximum of the absolute value of the error, (2) the relative error, and (3) the root mean square error. The maximum errors of our model are lower than high-precision manufacturing tolerances and are also lower than those reported by other researchers who have fit meta-models to the HexFlex mechanism. © 2012 Springer-Verlag London Limited.

Item: Tuning of adaptive weight depth map generation algorithms: Exploratory data analysis and design of computer experiments (DOCE) (SPRINGER, 2013-09-01). Acosta, Diego; Barandiaran, Inigo; Congote, John; Ruiz, Oscar; Hoyos, Alejandro; Grana, Manuel; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
In depth map generation algorithms, the parameter settings that yield an accurate disparity map estimation are usually chosen empirically or based on unplanned experiments. Algorithm performance is measured by the distance of the algorithm results from the Ground Truth, following Middlebury's standards. This work shows a systematic statistical approach, including exploratory data analyses on over 14,000 images and designs of experiments using 31 depth maps, to measure the relative influence of the parameters and to fine-tune them based on the number of bad pixels. The implemented methodology improves the performance of adaptive-weight-based dense depth map algorithms.
As a result, the algorithm improves from 16.78% to 14.48% bad pixels using a classical exploratory data analysis of over 14,000 existing images, while designs of computer experiments with 31 runs yielded an even better performance, lowering bad pixels from 16.78% to 13%. © 2012 Springer Science+Business Media, LLC.

Item: Adaptive Architecture to Support Context-Aware Collaborative Networked Virtual Surgical Simulators (CNVSS) (2014-01-01). Diaz Leon, Christian Andres; Gomez, H.T.; Lucia Quintero M, O.; Acosta Maya, D.A.; Sakti Srivastava.
Stand-alone and networked surgical virtual-reality-based simulators have been proposed as a means to train surgical skills with or without a supervisor near the student or trainee. However, surgical skills teaching in medical schools and hospitals...

Item: Simulación dinámica y control de procesos (Editorial EAFIT, 2014-01-01). Lamb, Cristina Patricia; Alvarez, J.O.; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.

Item: Adaptive architecture to support context-aware Collaborative Networked Virtual Surgical Simulators (CNVSS) (SPRINGER, 2014-01-01). Diaz, C.; Trefftz, H.; Quintero, L.; Acosta, D.; Srivastava, S.; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
Stand-alone and networked surgical virtual-reality-based simulators have been proposed as a means to train surgical skills with or without a supervisor near the student or trainee. However, surgical skills teaching in medical schools and hospitals is changing, requiring the development of new tools that focus on: (i) the importance of the mentor's role, (ii) teamwork skills, and (iii) remote training support. For these reasons, a surgical simulator should allow not only training involving a student and a remotely located instructor, but also collaborative training sessions involving a group of several students adopting different medical roles during the training session.
Collaborative Networked Virtual Surgical Simulators (CNVSS) allow collaborative training of surgical procedures in which remotely located users with different surgical roles can take part in a training session. Several works have addressed the issues related to the development of CNVSS using various strategies; to the best of our knowledge, none has focused on handling heterogeneity in collaborative surgical virtual environments. Handling heterogeneity in this type of collaborative session is important because remotely located users do not all have homogeneous Internet connections, the same interaction devices and displays, or the same computational resources, among other factors. Additionally, if heterogeneity is not handled properly, it has an adverse impact on the performance of each user during the collaborative session. In this paper, we describe the development of an adaptive architecture that implements a context-aware model for collaborative virtual surgical simulation in order to handle the heterogeneity involved in the collaboration session. © 2014 Springer International Publishing.

Item: Simulación dinámica y control de procesos - Guía práctica (Editorial EAFIT, 2014-01-01). Ortega, Juan David; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.

Item: Understanding the performance of new amine-functionalized mesoporous silica materials for CO2 adsorption (American Chemical Society, 2014-09-01). Builes, Santiago; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.

Item: Hybrid aminopolymer-silica materials for efficient CO2 adsorption (ROYAL SOC CHEMISTRY, 2015-01-01). López-Aranguren, P.; Builes, S.; Fraile, J.; López-Periago, A.; Vega, L.F.; Domingo, C.; Universidad EAFIT.
Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
The present work focuses on the development of a new eco-efficient chemical method for the polymerization of aziridine to hyperbranched polyethyleneimine (PEI) inside mesoporous silica, using compressed CO2 as solvent, reaction medium, and catalyst. PEI was grafted in situ onto MCM-41 and silica gel substrates, with pore diameters of 3.8 and 9.0 nm, respectively. The optimal polymerization conditions were found by varying the reaction pressure (1.0-10 MPa), temperature (25-45°C), and time (20-400 min). Thermal stability analysis indicated that the aminopolymer chains were covalently attached to the amorphous silica surface. The described compressed-CO2 route for the synthesis of hybrid products with high amine content (6-8 mmol N g-1) is a very fast method, with processing times on the order of a few minutes even at very low working pressures (1.0 MPa), a step forward in the design of efficient hybrid aminopolymer nanocomposites for CO2 capture. The adsorptive behavior of the prepared hybrid materials was established experimentally by recording the N2 (-196°C) and CO2 (25, 50, and 75°C) adsorption isotherms. The results were compared to molecular simulation studies performed with Grand Canonical Monte Carlo for either N2 or CO2 adsorbed on amino-modified MCM-41, helping to elucidate the predominant PEI configuration present in the functionalized materials. © The Royal Society of Chemistry 2015.

Item: Temperature regulation of a pilot-scale batch reaction system via explicit model predictive control (Institute of Electrical and Electronics Engineers Inc., 2015-01-01). Sanchez-Cossio, J.; Ortega-Alvarez, J.D.; Ocampo-Martinez, C.; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
In this paper, the temperature of a pilot-scale batch reaction system is modeled for the design of a controller based on the explicit model predictive control (EMPC) strategy.
Some mathematical models are developed from experimental data to describe the system behavior. The simplest reliable model obtained is a (1,1,1)-order ARX polynomial model, for which the EMPC controller has been designed. The resulting controller has reduced mathematical complexity and, given the successful simulation results, will be used directly on the real control system in the next stage of the experimental framework.

Item: Collaborative Networked Virtual Surgical Simulators (CNVSS) Implementing Hybrid Client-Server Architecture: Factors Affecting Collaborative Performance (MIT PRESS, 2015-01-01). Diaz, Christian; Trefftz, Helmuth; Quintero, Lucia; Acosta, Diego A.; Srivastava, Sakti; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
Currently, surgical skills teaching in medical schools and hospitals is changing, requiring the development of new tools that focus on (i) the importance of the mentor's role, (ii) teamwork skills training, and (iii) remote training support. Collaborative Networked Virtual Surgical Simulators (CNVSS) allow collaborative training of surgical procedures in which remotely located users with different surgical roles can take part in the training session. To provide successful training with good collaborative performance, CNVSS should guarantee synchronicity in time of the surgical scene viewed by each user and a quick response time, both of which are affected by factors such as users' machine capabilities and network conditions. To the best of our knowledge, the impact of these factors on the performance of CNVSS implementing a hybrid client-server architecture has not been evaluated.
In this paper, we describe the development of a CNVSS implementing a hybrid client-server architecture and two statistical designs of experiments (DOE), (i) a fractional factorial DOE and (ii) a central composite DOE, used to determine the most influential factors and how they affect collaboration in a CNVSS. From the results obtained, it was concluded that packet loss, bandwidth, and delay have the largest effect on the consistency of the shared virtual environment, whereas bandwidth, server machine capabilities, delay, and the interaction between bandwidth and packet loss have the largest effect on the time difference and the number of errors in the collaborative task.
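Several of the items above rely on two-level designs of experiments, in which the main effect of each factor is estimated as the difference between the mean response at its high and low coded levels. The sketch below illustrates that calculation on a full 2^3 factorial; the factor names (delay, packet_loss, bandwidth) echo the CNVSS study but the response function is entirely made up for illustration, not data from any of the papers listed.

```python
# Minimal two-level factorial main-effect estimation (illustrative only).
from itertools import product

# Coded levels: -1 = low, +1 = high, for three hypothetical factors.
factors = ["delay", "packet_loss", "bandwidth"]
runs = list(product([-1, 1], repeat=len(factors)))  # full 2^3 design: 8 runs


def response(delay, packet_loss, bandwidth):
    """Hypothetical response, e.g. an error count for a collaborative task."""
    return 10 + 3 * delay + 2 * packet_loss - 1 * bandwidth


ys = [response(*run) for run in runs]

# Main effect of each factor: mean response at +1 minus mean response at -1.
effects = {}
for i, name in enumerate(factors):
    hi = [y for run, y in zip(runs, ys) if run[i] == 1]
    lo = [y for run, y in zip(runs, ys) if run[i] == -1]
    effects[name] = sum(hi) / len(hi) - sum(lo) / len(lo)

print(effects)  # the largest |effect| flags the most influential factor
```

A fractional factorial design, as used in the CNVSS study, would execute only a subset of these runs, trading resolution of interaction effects for fewer experiments; a central composite design adds center and axial points to capture curvature.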