Browsing by Subject "Experiments"
Item: Design of computer experiments applied to modeling compliant mechanisms (DELFT UNIV TECHNOLOGY, FAC INDUST DESIGN ENG, 2010-01-01)
Arango, D.R.; Acosta, D.A.; Durango, S.; Ruiz, O.E.; Universidad EAFIT, Departamento de Ingeniería Mecánica, Laboratorio CAD/CAM/CAE; Departamento de Ingeniería de Procesos, Desarrollo y Diseño de Procesos
This article discusses a procedure for force-displacement modeling of compliant mechanisms using a design of computer experiments methodology. The approach produces a force-displacement metamodel suited for real-time control of compliant mechanisms. The term metamodel denotes a simplified, efficient mathematical model of an unknown phenomenon or of a computer code. The metamodeling of compliant mechanisms is performed from virtual experiments based on factorial and space-filling designs of experiments. The procedure is used to model the quasi-static behavior of the HexFlex compliant mechanism. The HexFlex is a parallel compliant mechanism for nanomanipulation that allows six degrees of freedom of its moving stage. The metamodel of the HexFlex is built from virtual experiments using the Finite Element Method (FEM). The obtained metamodel is linear over the movement range of the mechanism. Simulations of the metamodel show good accuracy with respect to the virtual experiments. © Organizing Committee of TMCE 2010 Symposium.
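A minimal sketch, assuming a Latin hypercube space-filling design and a hypothetical run_fea() stand-in for the FEM virtual experiments, of how a linear force-displacement meta-model like the one described above can be fitted and then evaluated cheaply enough for real-time use. The force ranges, run count, and compliance values are illustrative only, not values from the paper.

import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_runs, bounds):
    """Space-filling design: one stratified sample per interval and factor."""
    design = np.empty((n_runs, len(bounds)))
    for j, (lo, hi) in enumerate(bounds):
        slices = (np.arange(n_runs) + rng.random(n_runs)) / n_runs
        design[:, j] = lo + (hi - lo) * rng.permutation(slices)
    return design

# Hypothetical bounds (N) for the six actuation forces of the moving stage.
force_bounds = [(-1.0, 1.0)] * 6

# Placeholder compliance matrix standing in for the FEM model of the mechanism.
TRUE_COMPLIANCE = rng.standard_normal((6, 6)) * 1e-3

def run_fea(forces):
    """Stand-in for one FEM virtual experiment: forces in, 6-DOF pose out."""
    return TRUE_COMPLIANCE @ forces

X = latin_hypercube(64, force_bounds)            # design matrix of input forces
Y = np.array([run_fea(f) for f in X])            # simulated stage responses

# First-order meta-model y ~ b0 + B f, fitted by least squares; interaction
# terms are omitted here, matching the linear behavior reported in the abstract.
A = np.hstack([np.ones((len(X), 1)), X])
coeffs, *_ = np.linalg.lstsq(A, Y, rcond=None)

f_new = np.array([0.2, -0.1, 0.05, 0.0, 0.3, -0.2])
y_pred = np.hstack([1.0, f_new]) @ coeffs        # real-time evaluation: one dot product
print(y_pred)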
Item: Design of computer experiments applied to modeling of compliant mechanisms for real-time control (SPRINGER, 2013-07-01)
Acosta, Diego A.; Restrepo, David; Durango, Sebastian; Ruiz, Oscar E.; Universidad EAFIT, Departamento de Ingeniería de Procesos, Desarrollo y Diseño de Procesos; Departamento de Ingeniería Mecánica, Laboratorio CAD/CAM/CAE
This article discusses the use of design of computer experiments (DOCE), i.e., experiments run with a computer model to find how a set of inputs affects a set of outputs, to obtain a force-displacement meta-model (a mathematical equation that summarizes and aids in analyzing the input-output data of a DOCE) of compliant mechanisms (CMs). The procedure produces a force-displacement meta-model, or closed analytic vector function, intended for real-time control of CMs. In our work, the factorial and space-filling DOCE meta-model of CMs is supported by finite element analysis (FEA). The protocol is used to model the HexFlex mechanism under quasi-static conditions. The HexFlex is a parallel CM for nano-manipulation that allows six degrees of freedom (x, y, z, θx, θy, θz) of its moving platform. In the multi-linear model fit of the HexFlex, the products or interactions proved negligible, yielding a model that is linear in the inputs over the operating range. The accuracy of the meta-model was assessed with a set of computer experiments using a random uniform distribution of the input forces. Three error criteria were recorded, comparing the meta-model predictions against the FEA results: (1) the maximum of the absolute value of the error, (2) the relative error, and (3) the root mean square error. The maximum errors of our model are lower than high-precision manufacturing tolerances and also lower than those reported by other researchers who have fitted meta-models to the HexFlex mechanism. © 2012 Springer-Verlag London Limited.
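A minimal sketch, on assumed placeholder data, of the three error criteria named in the abstract (maximum absolute error, relative error, root mean square error) computed between meta-model predictions and FEA reference results; the arrays below are hypothetical stand-ins, not the paper's data.

import numpy as np

def error_criteria(y_fea, y_meta):
    """Compare meta-model predictions against FEA reference values."""
    err = y_meta - y_fea
    max_abs = np.max(np.abs(err))
    # relative error, guarded against division by very small reference values
    denom = np.maximum(np.abs(y_fea), 1e-12)
    rel = np.max(np.abs(err) / denom)
    rmse = np.sqrt(np.mean(err ** 2))
    return max_abs, rel, rmse

rng = np.random.default_rng(1)
y_fea = rng.uniform(-1e-4, 1e-4, size=(200, 6))          # placeholder FEA responses
y_meta = y_fea + rng.normal(0, 1e-7, size=y_fea.shape)   # placeholder predictions

max_abs, rel, rmse = error_criteria(y_fea, y_meta)
print(f"max |error| = {max_abs:.3e}, relative = {rel:.3e}, RMSE = {rmse:.3e}")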
Item: Pay less or pay what is fair? Analyzing pricing strategies (Universidad EAFIT, 2017-10-31)
Azuela Flores, José Ignacio; Ochoa Hernández, Magda Lizet; Jiménez Almaguer, Karla Paola; Universidad Autónoma de Tamaulipas, OPTO: Opinión Pública, Marketing y Comportamiento del Consumidor

Item: Remote access to an interferometric fringes stabilization active system via RENATA (SPIE-INT SOC OPTICAL ENGINEERING, 2013-01-01)
Espitia-Gomez, Javier; Angel-Toro, Luciano; Universidad EAFIT, Departamento de Ciencias Básicas, Óptica Aplicada
The Advanced Technology National Network (RENATA, for its acronym in Spanish) is a Colombian collaborative work tool, linked to other networks worldwide, in which researchers, teachers and students participate by sharing laboratory resources located in different universities, institutes and research centers throughout the country. At Universidad EAFIT (Medellín, Colombia), an active interferometric fringe stabilization system has been designed that can be accessed remotely via the RENATA network. A Mach-Zehnder interferometer was implemented, with independent piezoelectric actuators in each arm, with which the optical path length of the light traveling through each arm can be modified. Using these actuators, one can simultaneously perturb the system and compensate the phase differences caused by that perturbation. This allows experimenting with different disturbances and analyzing the system response to each of them. This can be done from any location worldwide, and especially from regions in which the optical and optoelectronic components required to implement the interferometer or the stabilization system are not available. The device can also be used as a platform to conduct diverse experiments involving optics and control, making it a pedagogical tool. In the future, remote access to the available applications is foreseen, as well as modification of the implemented LabVIEW™ code, so that researchers and teachers can adapt and improve its functionality or develop new applications based on collaborative work. © 2013 SPIE.
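A minimal numerical sketch of the kind of stabilization loop the abstract describes: one piezo actuator perturbs the optical path difference while a second one, driven by an integral controller, holds the fringe intensity at the quadrature set point. The gains, sampling period and disturbance are assumptions for illustration; the actual system runs in LabVIEW against hardware, not in a simulation like this one.

import numpy as np

n_steps = 2000
dt = 1e-3                      # s, hypothetical sampling period
setpoint = 0.5                 # normalized intensity at the quadrature point
k_i = 200.0                    # integral gain (hypothetical)

def intensity(phase):
    """Normalized two-beam interference intensity for a given phase difference."""
    return 0.5 * (1.0 + np.cos(phase))

compensation = 0.0             # phase added by the compensating piezo (rad)
history = []
for k in range(n_steps):
    t = k * dt
    perturbation = 0.8 * np.sin(2 * np.pi * 3.0 * t)    # drift-like disturbance (rad)
    phase = np.pi / 2 + perturbation + compensation     # operate around quadrature
    measured = intensity(phase)
    error = setpoint - measured
    compensation -= k_i * error * dt                    # integral action on the piezo
    history.append(measured)

residual = np.std(history[n_steps // 2:])               # fringe jitter once locked
print(f"residual intensity fluctuation: {residual:.4f}")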
Item: Statistical tuning of adaptive-weight depth map algorithm (SPRINGER, 2011-01-01)
Hoyos, Alejandro; Congote, John; Barandiaran, Inigo; Acosta, Diego; Ruiz, Oscar; Universidad EAFIT, Departamento de Ingeniería de Procesos, Desarrollo y Diseño de Procesos
In depth map generation, the algorithm parameter settings that yield an accurate disparity estimation are usually chosen empirically or based on unplanned experiments. A systematic statistical approach, including classical and exploratory data analyses on over 14,000 images to measure the relative influence of the parameters, allows tuning them based on the number of bad pixels. Our approach is systematic in the sense that the heuristics used for parameter tuning are supported by formal statistical methods. The implemented methodology improves the performance of dense depth map algorithms. As a result of the statistically based tuning, the algorithm improves from 16.78% to 14.48% bad pixels, rising 7 spots in the Middlebury Stereo Evaluation Ranking Table. Performance is measured by the distance of the algorithm results from the Middlebury ground truth. Future work aims to achieve the tuning with significantly smaller data sets using fractional factorial and response-surface designs of experiments. © 2011 Springer-Verlag.

Item: Tuning of adaptive weight depth map generation algorithms: Exploratory data analysis and design of computer experiments (DOCE) (SPRINGER, 2013-09-01)
Acosta, Diego; Barandiaran, Inigo; Congote, John; Ruiz, Oscar; Hoyos, Alejandro; Grana, Manuel; Universidad EAFIT, Departamento de Ingeniería de Procesos, Desarrollo y Diseño de Procesos; Departamento de Ingeniería Mecánica, Laboratorio CAD/CAM/CAE
In depth map generation algorithms, the parameter settings that yield an accurate disparity map estimation are usually chosen empirically or based on unplanned experiments. Algorithm performance is measured by the distance of the results from the ground truth under Middlebury's standards. This work presents a systematic statistical approach, including exploratory data analyses on over 14,000 images and designs of experiments using 31 depth maps, to measure the relative influence of the parameters and to fine-tune them based on the number of bad pixels. The implemented methodology improves the performance of adaptive-weight-based dense depth map algorithms. As a result, the algorithm improves from 16.78% to 14.48% bad pixels using a classical exploratory data analysis of over 14,000 existing images, while designs of computer experiments with 31 runs yield an even better performance, lowering bad pixels from 16.78% to 13%. © 2012 Springer Science+Business Media, LLC.
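A minimal sketch of the evaluation and tuning loop shared by the two depth-map entries above: a 31-run space-filling design over assumed adaptive-weight parameters, each run scored by the percentage of bad pixels against a ground-truth disparity map. compute_disparity(), the parameter names and their ranges are hypothetical placeholders, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(2)

def bad_pixel_rate(disparity, ground_truth, threshold=1.0):
    """Percentage of pixels whose disparity error exceeds the threshold."""
    valid = np.isfinite(ground_truth)
    err = np.abs(disparity - ground_truth)[valid]
    return 100.0 * np.mean(err > threshold)

def latin_hypercube(n_runs, bounds):
    """Space-filling design over the parameter ranges."""
    design = np.empty((n_runs, len(bounds)))
    for j, (lo, hi) in enumerate(bounds):
        slices = (np.arange(n_runs) + rng.random(n_runs)) / n_runs
        design[:, j] = lo + (hi - lo) * rng.permutation(slices)
    return design

def compute_disparity(left, right, gamma_color, gamma_dist, window):
    """Hypothetical stand-in for the adaptive-weight disparity algorithm."""
    return np.zeros_like(left, dtype=float)

# Hypothetical tuning ranges for three adaptive-weight parameters.
bounds = [(1.0, 40.0),    # gamma_color
          (1.0, 40.0),    # gamma_dist
          (5.0, 35.0)]    # support window size
design = latin_hypercube(31, bounds)           # 31 runs, as in the abstract

left = right = np.zeros((100, 100))            # placeholder image pair
ground_truth = np.zeros((100, 100))            # placeholder ground-truth disparity

scores = []
for gamma_color, gamma_dist, window in design:
    disp = compute_disparity(left, right, gamma_color, gamma_dist, int(round(window)))
    scores.append(bad_pixel_rate(disp, ground_truth))

best = int(np.argmin(scores))
print("best run:", design[best], "bad pixels: %.2f%%" % scores[best])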