Browsing by Subject "Computer experiment"
Showing 1 - 4 of 4
Item: Design of computer experiments applied to modeling compliant mechanisms (DELFT UNIV TECHNOLOGY, FAC INDUST DESIGN ENG, 2010-01-01) Arango, D.R.; Acosta, D.A.; Durango, S.; Ruiz, O.E.; Universidad EAFIT. Departamento de Ingeniería Mecánica; Laboratorio CAD/CAM/CAE; Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos
This article discusses a procedure for force-displacement modeling of compliant mechanisms using a design of computer experiments methodology. This approach produces a force-displacement metamodel suited for real-time control of compliant mechanisms. The term metamodel denotes a simplified and efficient mathematical model of an unknown phenomenon or of computer codes. The metamodeling of compliant mechanisms is performed from virtual experiments based on factorial and space-filling designs of experiments. The procedure is used to model the quasi-static behavior of the HexFlex compliant mechanism, a parallel compliant mechanism for nanomanipulation that allows six degrees of freedom of its moving stage. The metamodel of the HexFlex is built from virtual experiments performed with the Finite Element Method (FEM). The obtained metamodel is linear over the movement range of the mechanism. Simulations of the metamodel show good accuracy with respect to the virtual experiments. © Organizing Committee of TMCE 2010 Symposium.
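The workflow this abstract describes (run a space-filling design of computer experiments, then fit a linear metamodel to the responses) can be sketched as follows. This is a minimal illustration only: the `fem_stand_in` function is a hypothetical linear response standing in for the actual FEM simulation of the HexFlex, and the factor count and run count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_dims, rng):
    """Simple Latin hypercube: one sample per stratum along each axis."""
    # Stratify [0, 1) into n_samples bins per dimension, jitter within
    # each bin, then shuffle each column independently.
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    for j in range(n_dims):
        rng.shuffle(u[:, j])  # basic slicing gives a view; shuffle is in place
    return u

def fem_stand_in(x):
    """Hypothetical stand-in for the FEM 'virtual experiment':
    a linear force response over three displacement factors."""
    return x @ np.array([2.0, -1.0, 0.5]) + 3.0

X = latin_hypercube(20, 3, rng)   # 20 runs over 3 factors in [0, 1)
y = fem_stand_in(X)               # simulated force responses

# Fit the linear metamodel y ~ X w + b by least squares.
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coef, 3))  # recovers [2, -1, 0.5, 3]
```

Because the stand-in response is exactly linear, least squares recovers its coefficients; on a real FEM response the fitted linear metamodel would be an approximation, which is what makes it cheap enough for real-time control.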
Item: Tuning of adaptive weight depth map generation algorithms: Exploratory data analysis and design of computer experiments (DOCE) (SPRINGER, 2013-09-01) Acosta, Diego; Barandiaran, Inigo; Congote, John; Ruiz, Oscar; Hoyos, Alejandro; Grana, Manuel; Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos
In depth map generation algorithms, the parameter settings that yield an accurate disparity map estimation are usually chosen empirically or based on unplanned experiments. Algorithm performance is measured by the distance of the algorithm's results from the ground truth, following the Middlebury standards. This work presents a systematic statistical approach, comprising exploratory data analyses on over 14000 images and designs of experiments using 31 depth maps, to measure the relative influence of the parameters and to fine-tune them based on the number of bad pixels. The implemented methodology improves the performance of adaptive-weight-based dense depth map algorithms.
As a result, the algorithm improves from 16.78 % to 14.48 % bad pixels using a classical exploratory data analysis of over 14000 existing images, while designs of computer experiments with 31 runs yielded an even better performance, lowering bad pixels from 16.78 % to 13 %. © 2012 Springer Science+Business Media, LLC.
(Also indexed under: Universidad EAFIT. Departamento de Ingeniería Mecánica; Laboratorio CAD/CAM/CAE)
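The tuning strategy in this second abstract (a planned design of experiments over the algorithm's parameters, scored by the bad-pixel percentage) can be sketched as a small factorial design. Everything here is illustrative: the parameter names (`window`, `gamma`) and the `bad_pixel_rate` objective are hypothetical stand-ins for running the adaptive-weight algorithm and scoring its disparity maps against ground truth.

```python
import itertools

def bad_pixel_rate(window, gamma):
    """Hypothetical stand-in for one experimental run: execute the
    depth map algorithm with these parameters and return % bad pixels."""
    return 13.0 + 0.05 * (window - 21) ** 2 + 2.0 * (gamma - 0.1) ** 2

# Two-factor full factorial design over candidate parameter levels.
windows = [9, 15, 21, 27]
gammas = [0.05, 0.10, 0.20]
runs = [(w, g, bad_pixel_rate(w, g)) for w, g in itertools.product(windows, gammas)]

# Pick the run with the lowest bad-pixel percentage.
best = min(runs, key=lambda r: r[2])
print(f"best window={best[0]}, gamma={best[1]}, bad pixels={best[2]:.2f}%")
```

With real runs in place of the stand-in objective, the same table of (parameters, score) rows also supports the paper's other goal of measuring each parameter's relative influence, since every level combination is observed.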