Symmetry Group-based Learning for Regularity Discovery from Real World Patterns

Google Tech Talks
December 15, 2008

ABSTRACT

We explore a formal and computational characterization of real-world regularity, using a hierarchy of discrete symmetry groups as the theoretical basis, embedded in a well-defined Bayesian framework. Our existing work, "A Computational Model for Periodic Pattern Perception Based on Frieze and Wallpaper Groups" (TPAMI 2004), "Near-Regular Texture Analysis and Manipulation" (SIGGRAPH 2004), and "A Lattice-Based MRF Model for Dynamic Near-Regular Texture Tracking" (TPAMI 2007), already demonstrates the power of such a formalization on a diverse set of real problems, including texture analysis, synthesis, tracking, perception, and manipulation in terms of regularity. Symmetry and symmetry-group detection from real-world data turns out to be a very challenging problem, one that has puzzled computer vision researchers for the past 40 years. Our formalization leads the way to a more robust and comprehensive algorithmic treatment of the whole regularity spectrum, from regular (perfect symmetry) and near-regular (deviations from symmetry) to various types of irregularities. Recent results of the proposed methodology will be illustrated in this talk through several real-world applications, such as deformed lattice detection, rotation and glide-reflection detection, gait recognition, grid-cell clustering, symmetry of dance, automatic geo-tagging, and image de-fencing.
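
The regularity spectrum sketched above can be made concrete with a toy measurement. The following Python sketch is an illustration under assumptions, not the speaker's published method: the names regularity_score and glide_score, the lattice vectors t1/t2, and the checkerboard test pattern are all hypothetical. It scores an image's deviation from perfect translational symmetry under two hypothesized lattice generators, and from glide-reflection symmetry about the horizontal midline; a score of zero corresponds to a regular (perfectly symmetric) pattern, a small score to a near-regular one, and a large score to irregularity.

    import numpy as np

    def regularity_score(image, t1, t2):
        """Mean absolute difference between the image and its copies shifted
        by two hypothesized lattice generators (row, column offsets).
        0.0 = perfectly regular (periodic); larger = near-regular/irregular."""
        img = image.astype(float)
        deviations = [np.abs(img - np.roll(img, shift=t, axis=(0, 1))).mean()
                      for t in (t1, t2)]
        return float(np.mean(deviations))

    def glide_score(image, period):
        """Deviation from glide-reflection symmetry: reflect across the
        horizontal midline, then translate by half a period along the rows."""
        img = image.astype(float)
        glided = np.roll(img[::-1, :], shift=period // 2, axis=1)
        return float(np.abs(img - glided).mean())

    # A perfect checkerboard is regular under its own lattice vectors and
    # also carries a glide-reflection symmetry ...
    tile = (np.indices((64, 64)).sum(axis=0) % 2).astype(float)
    print(regularity_score(tile, t1=(2, 0), t2=(0, 2)))   # 0.0: regular
    print(glide_score(tile, period=2))                    # 0.0: glide-symmetric
    # ... while additive noise pushes it into the near-regular regime.
    noisy = tile + 0.05 * np.random.default_rng(0).standard_normal(tile.shape)
    print(regularity_score(noisy, t1=(2, 0), t2=(0, 2)))  # small but > 0

The talk's framework goes much further, embedding such deviations in a Bayesian model over the full hierarchy of frieze and wallpaper groups, but the zero-versus-nonzero distinction above is exactly the regular/near-regular boundary the abstract describes.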

Speaker: Yanxi Liu
Yanxi Liu received her B.S. degree in physics/electrical engineering and her Ph.D. degree in computer science, for group theory applications in robotics, from UMass Amherst. Her postdoctoral training was at LIFIA/IMAG in Grenoble, France, and she spent one year at DIMACS (the NSF center for Discrete Mathematics and Theoretical Computer Science) with an NSF research-education fellowship award. Before joining the Departments of Computer Science and Engineering and Electrical Engineering at Penn State in Fall 2006 as a tenured faculty member, Dr. Liu was on the faculty of the Robotics Institute at Carnegie Mellon University and affiliated with CMU's Machine Learning Department. She is also an adjunct associate professor in the Radiology Department of the University of Pittsburgh, and co-director of the Laboratory for Perception, Action, and Cognition (LPAC) at Penn State (http://vision.cse.psu.edu/).

Dr. Liu's research interests span a wide range of applications in computer vision and pattern recognition, computer graphics, medical image analysis, and robotics, with two main research themes: computational (a)symmetry and discriminative subspace learning. With her colleagues, Dr. Liu won first place in the clinical science category and the best paper overall at the Annual Conference of Plastic and Reconstructive Surgeons for their work on "Measurement of Asymmetry in Persons with Facial Paralysis." She chaired the First International Workshop on Computer Vision for Biomedical Image Applications (CVBIA), held in conjunction with ICCV 2005 in Beijing, and co-edited the book "CVBIA: Current Techniques and Future Trends" (Springer-Verlag LNCS 3765). Dr. Liu serves as an area chair, reviewer, committee member, and panelist for major journals, conferences, and NIH/NSF panels in computer vision, computer graphics, pattern recognition, biomedical image analysis, and machine learning, and has served as a chartered NIH study section member. She is a senior member of the IEEE and the IEEE Computer Society.
