Resources (4)


C#数字图像处理算法典型实例 (Typical Examples of C# Digital Image Processing Algorithms, source code from the companion CD)

The programs in the program folder are grouped by chapter, in the same order as the chapters of the book.

2012-10-11

Feature Extraction and Image Processing

We will no doubt be asked many times: why on earth write a new book on computer vision? Fair question: there are already many good books on computer vision in the bookshops, as you will find referenced later, so why add to them? Part of the answer is that any textbook is a snapshot of material that exists before it. Computer vision, the art of processing images stored within a computer, has seen a considerable amount of research by highly qualified people and the volume of research would appear even to have increased in recent years. This means that a lot of new techniques have been developed, and many of the more recent approaches have yet to migrate to textbooks. But it is not just the new research: part of the speedy advance in computer vision technique has left some areas covered only in scanty detail. By the nature of research, one cannot publish material on technique that is seen more to fill historical gaps, rather than to advance knowledge. This is again where a new text can contribute.

Finally, the technology itself continues to advance. This means that there is new hardware, and there are new programming languages and new programming environments. In particular for computer vision, the advance of technology means that computing power and memory are now relatively cheap. It is certainly considerably cheaper than when computer vision was starting as a research field. One of the authors here notes that the laptop that his portion of the book was written on has more memory, is faster, and has bigger disk space and better graphics than the computer that served the entire university of his student days. And he is not that old!

One of the more advantageous recent changes brought about by progress has been the development of mathematical programming systems. These allow us to concentrate on mathematical technique itself, rather than on implementation detail. There are several sophisticated flavours, of which Mathcad and Matlab, the chosen vehicles here, are among the most popular. We have been using these techniques in research and teaching, and we would argue that they have been of considerable benefit there. In research, they help us to develop technique more quickly and to evaluate its final implementation. For teaching, the power of a modern laptop and a mathematical system combines to show students, in lectures and in study, not only how techniques are implemented, but also how and why they work with an explicit relation to conventional teaching material.

2011-02-24

Information Theory and Statistical Learning

This book presents theoretical and practical results of information theoretic methods used in the context of statistical learning. Its major goal is to advocate and promote the importance and usefulness of information theoretic concepts for understanding and developing the sophisticated machine learning methods necessary not only to cope with the challenges of modern data analysis but also to gain further insights into their theoretical foundations. Here, Statistical Learning is loosely defined as a synonym for, e.g., Applied Statistics, Artificial Intelligence or Machine Learning. Over the last decades, many approaches and algorithms have been suggested in the fields mentioned above, for which information theoretic concepts constitute core ingredients. For this reason we present a selected collection of some of the finest concepts and applications thereof from the perspective of information theory as the underlying guiding principle. We consider such a perspective very insightful and expect an even greater appreciation for it over the coming years.

2011-02-24

Pattern Recognition and Machine Learning

Pattern recognition has its origins in engineering, whereas machine learning grew out of computer science. However, these activities can be viewed as two facets of the same field, and together they have undergone substantial development over the past ten years. In particular, Bayesian methods have grown from a specialist niche to become mainstream, while graphical models have emerged as a general framework for describing and applying probabilistic models. Also, the practical applicability of Bayesian methods has been greatly enhanced through the development of a range of approximate inference algorithms such as variational Bayes and expectation propagation. Similarly, new models based on kernels have had significant impact on both algorithms and applications.

2010-11-30

