Browsing by Subject "Computational complexity"
Showing 1 - 8 of 8
Item: 2D shape similarity as a complement for Voronoi-Delone methods in shape reconstruction (PERGAMON-ELSEVIER SCIENCE LTD, 2005-02-01). Ruiz, O.E.; Cadavid, C.A.; Granados, M.; Peña, S.; Vásquez, E. Universidad EAFIT. Departamento de Ingeniería Mecánica; Laboratorio CAD/CAM/CAE.
In surface reconstruction from planar cross sections it is necessary to build surfaces between 2D contours in consecutive cross sections. This problem has traditionally been attacked by (i) direct reconstruction based on local geometric proximity between the contours, and (ii) classification of topological events between the cross sections. These approaches have been applied separately, with limited success. In case (i), the resulting surfaces may have overstretched or unnatural branches, which arise from local contour proximity that does not reflect global similarity between the contours. In case (ii), the topological events are identified but are not translated into the actual construction of a surface. This article presents an integration of approaches (i) and (ii). Similarity between the composite 2D regions bounded by the contours in consecutive cross sections is used to: (a) decide whether a surface should actually relate two composite 2D regions, (b) identify the type and location of topological transitions between cross sections, and (c) drive the surface construction for the regions found to be related in step (a). The implemented method avoids overstretched or unnatural branches, rendering a surface which is both geometrically intuitive and topologically faithful to the cross sections of the original object. The presented method is a good alternative in cases in which correct reproduction of the topology of the surface (e.g. simulation of flow in conduits) is more important than its geometry (e.g. assessment of tumor mass in radiation planning). © 2004 Elsevier Ltd. All rights reserved.
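
The step-(a) decision in this abstract lends itself to a small illustration. The sketch below is a minimal, assumption-laden example rather than the article's actual method: it scores two composite planar regions with a Jaccard-style area-overlap measure (using the shapely library) and relates them only if the score clears an arbitrary 0.5 threshold. The similarity measure, the threshold, and the function names are illustrative choices; the article's own definitions are not given in this abstract.

```python
# Illustrative sketch only: decide whether two composite 2D regions in
# consecutive cross sections are globally similar enough to be joined by a
# surface. The Jaccard-overlap score and the 0.5 threshold are assumptions
# for illustration; they are NOT the measure defined in the cited article.
from shapely.geometry import Polygon

def region_similarity(region_a: Polygon, region_b: Polygon) -> float:
    """Area-of-overlap (Jaccard) similarity between two planar regions."""
    inter = region_a.intersection(region_b).area
    union = region_a.union(region_b).area
    return inter / union if union > 0.0 else 0.0

def should_connect(region_a: Polygon, region_b: Polygon, threshold: float = 0.5) -> bool:
    """Step (a) of the scheme: relate two regions only if globally similar."""
    return region_similarity(region_a, region_b) >= threshold

if __name__ == "__main__":
    slice_k = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])                     # contour in cross section k
    slice_k1 = Polygon([(0.5, 0.2), (4.2, 0.2), (4.2, 3.1), (0.5, 3.1)])    # contour in cross section k+1
    print(should_connect(slice_k, slice_k1))                                # True: overlap dominates
```

A global overlap score of this kind reacts to the overall shape of the regions rather than to the nearest points of their contours, which is the behaviour the abstract attributes to step (a).
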
Item: A debugging scheme for functional logic programs (Elsevier BV, 2002-01-01). Alpuente, M.; Correa, F.; Falaschi, M. Universidad EAFIT. Departamento de Ciencias; Lógica y Computación.
We present a generic scheme for the declarative debugging of functional logic programs which is valid for eager as well as lazy programs. In particular, we show that the framework extends some previous work naturally and applies to the most modern lazy strategies, such as needed narrowing. First, we associate with our programs a semantics based on a (continuous) immediate consequence operator, TR, which models computed answers. We show that, given the intended specification of a program R, it is possible to check the correctness of R by a single step of TR. We then consider a more effective methodology based on abstract interpretation: by approximating the intended specification of the success set, we derive a finitely terminating diagnosis method which can be used statically and is parametric w.r.t. the chosen approximation. In order to correct the bugs, we sketch a preliminary deductive approach which uses example-guided unfolding. We specialize the incorrect rules w.r.t. sets of positive and negative examples gathered (bottom-up) during the diagnosis process, so that all refutations of negative examples, and no refutations of positive examples, are excluded. Our debugging framework does not require the user to either provide error symptoms in advance or answer difficult questions concerning program correctness. We extend an implementation of our system to the case of needed narrowing and illustrate it through examples which demonstrate the practicality of our approach. © 2002 Published by Elsevier Science B.V.
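
The single-step correctness check over an immediate consequence operator can be illustrated in a much reduced setting. The sketch below is an assumption-laden simplification: ground Horn rules over plain string facts stand in for the paper's functional logic formalism with needed narrowing and computed-answer semantics. A rule is flagged when one application of a T_P-style operator to the intended model derives a fact that the intended model does not contain.

```python
# Simplified sketch of declarative diagnosis by one step of an
# immediate-consequence operator. The rule encoding and the example program
# are illustrative assumptions, not the paper's functional logic setting.

def t_p(rules, interpretation):
    """One application of a T_P-style operator over a set of ground facts."""
    return {head for head, body in rules
            if all(b in interpretation for b in body)}

def incorrect_rules(rules, intended_model):
    """Rules whose single T_P step on the intended model derives facts
    outside that model."""
    wrong = t_p(rules, intended_model) - intended_model
    return [(head, body) for head, body in rules
            if head in wrong and all(b in intended_model for b in body)]

if __name__ == "__main__":
    # Program for "even numbers", with one deliberately wrong clause.
    program = [
        ("even(0)", []),
        ("even(2)", ["even(0)"]),
        ("even(3)", ["even(2)"]),   # bug: should derive even(4)
    ]
    intended = {"even(0)", "even(2)", "even(4)"}
    print(incorrect_rules(program, intended))   # -> [('even(3)', ['even(2)'])]
```

Even in this toy reading, the diagnosis needs only the intended model, matching the abstract's claim that the user neither reports symptoms in advance nor answers difficult correctness questions.
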
Item: Gravitational topological quantum computation (SPRINGER, 2007-01-01). Velez, Mario; Ospina, Juan. Universidad EAFIT. Departamento de Ciencias; Lógica y Computación.
A new model in topological quantum computing, named Gravitational Topological Quantum Computing (GTQC), is introduced as an alternative to Anyonic Topological Quantum Computing and DNA Computing. In the new model the quantum computer is quantum space-time itself, and the corresponding quantum algorithms refer to the computation of topological invariants for knots, links and tangles. Some applications of GTQC in quantum complexity theory and computability theory are discussed; in particular, it is conjectured that computing the Khovanov polynomial for knots and links is harder than #P-hard, and that the homeomorphism problem, although noncomputable, might after all be computed via a hyper-computer based on GTQC. © Springer-Verlag Berlin Heidelberg 2007.

Item: Hacia un perfil docente para el desarrollo del pensamiento computacional basado en educación Stem para la Media Técnica en desarrollo de software (Universidad EAFIT, 2014). Vásquez Giraldo, Alberto León; Zea Restrepo, Claudia María.
Education in Colombia, governed by Law 115 of 1994, establishes formal education at three levels (primary, basic and secondary education). The last of these gave rise to the incorporation of technical secondary education (media técnica), which was to be aligned with the needs of the country's highest-impact economic sectors, organized through flexible propaedeutic cycles and competency-based training so that students' achievements would be recognized at the next level of professionalization, followed either by articulation with a higher education institution or by achievements adequate to begin working life. In Medellín the labor market leans toward the service sector, directed at the ICT cluster and aligned with the national government's proposal to strengthen technical and technological education through strategic alliances among the public, private and education sectors, in order to prepare qualified professionals: there is currently a high demand for human resources able to carry out these tasks, which in turn creates a need for teachers able to train them. To address this problem, the strategic alliances have identified the need for prior studies of the profiles required by the ICT cluster and for a state of the art on new teaching-learning trends, such as the development of computational thinking and STEM education (Science, Technology, Engineering and Mathematics), and on the curricula required for teacher-training programs to function properly. This research project set out to produce a state of the art on the teaching of new trends in computational thinking and STEM education that would allow the design of a suitable teacher profile for technical secondary education in the area of software development. The state of the art begins with the basic concepts that must be defined for the proper development of this research; the concepts identified are competency-based training, the computing disciplines, STEM education and computational thinking. A bibliographic review then moves from the international to the national and local level, covering policies in favor of computational thinking, STEM education and computer science curricula, and closes with teacher-training programs. The review of this information provided the basis for building two matrices: one of the common computational thinking components shared by countries that have computer science curricula, and one of the common elements among countries that have teacher-training programs in the same area.

Item: The need for exploring alternatives in systemic intervention: Two "intentional" arguments (2009-01-01). Vélez-Castiblanco, J. Universidad EAFIT. Departamento de Administración; Administración y Organizaciones.
A recurrent guideline in many of the systems approaches to intervention is the need to explore different alternatives. This guideline is present regardless of the type of tool, the paradigm or the arguments behind it. The purpose of this paper is not to contradict this, but to provide new arguments for this need that can be applied to the whole range of tools. The arguments presented here use ideas from language pragmatics and a combination of philosophy of action and complexity theory. Central to these arguments is a concern with the intentions of the agents. In light of these, it is claimed that the advantages of exploring alternatives are hindered if the alternatives are not used in an intentional way.

Item: Weighted area/angle distortion minimization for Mesh Parameterization (EMERALD GROUP PUBLISHING LIMITED, 2017-01-01). Mejia D.; Acosta D.A.; Ruiz-Salguero O. Universidad EAFIT. Departamento de Ingeniería de Procesos; Desarrollo y Diseño de Procesos.
Purpose: Mesh Parameterization is central to reverse engineering, tool path planning, etc. This work synthesizes parameterizations with unconstrained borders and overall minimum angle-plus-area distortion. This study aims to present an assessment of the sensitivity of the minimized distortion with respect to the weighted area and angle distortions. Design/methodology/approach: A Mesh Parameterization which does not constrain borders is implemented by performing: isometry maps for each triangle to the plane Z = 0; an affine transform within the plane Z = 0 to glue the triangles back together; and a Levenberg-Marquardt minimization of a nonlinear penalty function F that modifies the parameters of the first two transformations to discourage triangle flips, angle distortion and area distortion. F is a convex weighted combination of area distortion (weight: a, with 0 ≤ a ≤ 1) and angle distortion (weight: 1 - a). Findings: The parameterization algorithm presented in this study has linear complexity [O(n), n = number of mesh vertices]. The sensitivity analysis permits a fine-tuning of the weight parameter which achieves overall bijective parameterizations in the studied cases. No theoretical guarantee of bijectivity is given in this manuscript. This algorithm has equal or superior performance compared with the ABF, LSCM and ARAP algorithms for the Ball, Cow and Gargoyle data sets. Additional correct results of this algorithm alone are presented for the Foot, Fandisk and Sliced-Glove data sets.
Originality/value: The devised free-boundary nonlinear Mesh Parameterization method does not require a valid initial parameterization and produces locally bijective parameterizations in all of our tests. A formal sensitivity analysis shows that the resulting parameterization is more stable, i.e. the UV mapping changes less when the algorithm tries to preserve angles than when it tries to preserve areas. The algorithm presented in this study belongs to the class that parameterizes meshes with holes. This study presents the results of a complexity analysis comparing the present algorithm with 12 competing ones. © Emerald Publishing Limited.

Item: Weighted area/angle distortion minimization for Mesh Parameterization (EMERALD GROUP PUBLISHING LIMITED, 2017-01-01). Mejia D.; Acosta D.A.; Ruiz-Salguero O. Universidad EAFIT. Departamento de Ingeniería Mecánica; Laboratorio CAD/CAM/CAE.
Purpose: Mesh Parameterization is central to reverse engineering, tool path planning, etc. This work synthesizes parameterizations with unconstrained borders and overall minimum angle-plus-area distortion. This study aims to present an assessment of the sensitivity of the minimized distortion with respect to the weighted area and angle distortions. Design/methodology/approach: A Mesh Parameterization which does not constrain borders is implemented by performing: isometry maps for each triangle to the plane Z = 0; an affine transform within the plane Z = 0 to glue the triangles back together; and a Levenberg-Marquardt minimization of a nonlinear penalty function F that modifies the parameters of the first two transformations to discourage triangle flips, angle distortion and area distortion. F is a convex weighted combination of area distortion (weight: a, with 0 ≤ a ≤ 1) and angle distortion (weight: 1 - a). Findings: The parameterization algorithm presented in this study has linear complexity [O(n), n = number of mesh vertices]. The sensitivity analysis permits a fine-tuning of the weight parameter which achieves overall bijective parameterizations in the studied cases. No theoretical guarantee of bijectivity is given in this manuscript. This algorithm has equal or superior performance compared with the ABF, LSCM and ARAP algorithms for the Ball, Cow and Gargoyle data sets. Additional correct results of this algorithm alone are presented for the Foot, Fandisk and Sliced-Glove data sets. Originality/value: The devised free-boundary nonlinear Mesh Parameterization method does not require a valid initial parameterization and produces locally bijective parameterizations in all of our tests. A formal sensitivity analysis shows that the resulting parameterization is more stable, i.e. the UV mapping changes less when the algorithm tries to preserve angles than when it tries to preserve areas. The algorithm presented in this study belongs to the class that parameterizes meshes with holes. This study presents the results of a complexity analysis comparing the present algorithm with 12 competing ones. © Emerald Publishing Limited.

Item: Weighted area/angle distortion minimization for Mesh Parameterization (EMERALD GROUP PUBLISHING LIMITED, 2017-01-01). Mejia D.; Acosta D.A.; Ruiz-Salguero O. Universidad EAFIT. Departamento de Ingeniería de Procesos; Procesos Ambientales (GIPAB).
Purpose: Mesh Parameterization is central to reverse engineering, tool path planning, etc. This work synthesizes parameterizations with unconstrained borders and overall minimum angle-plus-area distortion.
This study aims to present an assessment of the sensitivity of the minimized distortion with respect to the weighted area and angle distortions. Design/methodology/approach: A Mesh Parameterization which does not constrain borders is implemented by performing: isometry maps for each triangle to the plane Z = 0; an affine transform within the plane Z = 0 to glue the triangles back together; and a Levenberg-Marquardt minimization of a nonlinear penalty function F that modifies the parameters of the first two transformations to discourage triangle flips, angle distortion and area distortion. F is a convex weighted combination of area distortion (weight: a, with 0 ≤ a ≤ 1) and angle distortion (weight: 1 - a). Findings: The parameterization algorithm presented in this study has linear complexity [O(n), n = number of mesh vertices]. The sensitivity analysis permits a fine-tuning of the weight parameter which achieves overall bijective parameterizations in the studied cases. No theoretical guarantee of bijectivity is given in this manuscript. This algorithm has equal or superior performance compared with the ABF, LSCM and ARAP algorithms for the Ball, Cow and Gargoyle data sets. Additional correct results of this algorithm alone are presented for the Foot, Fandisk and Sliced-Glove data sets. Originality/value: The devised free-boundary nonlinear Mesh Parameterization method does not require a valid initial parameterization and produces locally bijective parameterizations in all of our tests. A formal sensitivity analysis shows that the resulting parameterization is more stable, i.e. the UV mapping changes less when the algorithm tries to preserve angles than when it tries to preserve areas. The algorithm presented in this study belongs to the class that parameterizes meshes with holes. This study presents the results of a complexity analysis comparing the present algorithm with 12 competing ones. © Emerald Publishing Limited.
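
The convex weighted penalty described in the three Weighted area/angle distortion records above admits a small numerical illustration. The sketch below stacks, per triangle, an area-distortion residual weighted by a and angle-distortion residuals weighted by 1 - a, and minimizes them with a Levenberg-Marquardt solver over a toy two-triangle mesh. The residual formulas, the tiny mesh, the omission of the flip-discouraging term, and the use of scipy's "lm" driver are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a weighted area/angle distortion penalty, assuming simple
# ratio/difference residuals and scipy's Levenberg-Marquardt driver. This is
# NOT the cited algorithm; it only illustrates the convex a / (1 - a) weighting.
import numpy as np
from scipy.optimize import least_squares

def area2d(p):
    """Area of a planar triangle (shoelace formula), p is a (3, 2) array."""
    return 0.5 * abs((p[1, 0] - p[0, 0]) * (p[2, 1] - p[0, 1])
                     - (p[2, 0] - p[0, 0]) * (p[1, 1] - p[0, 1]))

def area3d(p):
    """Area of a triangle embedded in 3D, p is a (3, 3) array."""
    return 0.5 * np.linalg.norm(np.cross(p[1] - p[0], p[2] - p[0]))

def angles(p):
    """Interior angles (radians) of a triangle given as a (3, d) array."""
    out = []
    for i in range(3):
        u, v = p[(i + 1) % 3] - p[i], p[(i + 2) % 3] - p[i]
        c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        out.append(np.arccos(np.clip(c, -1.0, 1.0)))
    return np.array(out)

def residuals(uv_flat, tris, ref_areas, ref_angles, a):
    """Per-triangle residuals: sqrt(a) * area term, sqrt(1 - a) * angle terms."""
    uv = uv_flat.reshape(-1, 2)
    res = []
    for t, tri in enumerate(tris):
        p = uv[tri]
        res.append(np.sqrt(a) * (area2d(p) / ref_areas[t] - 1.0))        # area distortion
        res.extend(np.sqrt(1.0 - a) * (angles(p) - ref_angles[t]))       # angle distortion
    return np.asarray(res)

if __name__ == "__main__":
    # A tiny open mesh: two 3D triangles forming a slightly bent quad.
    verts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0.3], [0, 1, 0.3]], dtype=float)
    tris = [[0, 1, 2], [0, 2, 3]]
    ref_areas = [area3d(verts[t]) for t in tris]
    ref_angles = [angles(verts[t]) for t in tris]

    a = 0.5                                    # convex weight between area and angle terms
    uv0 = verts[:, :2].ravel()                 # initial guess: drop the z coordinate
    sol = least_squares(residuals, uv0, args=(tris, ref_areas, ref_angles, a),
                        method="lm")
    print(sol.x.reshape(-1, 2))                # optimized UV coordinates
```

Sliding a toward 1 makes the area residuals dominate, while a near 0 favors angle preservation, which is the weight-sensitivity trade-off the abstracts analyze.
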