[Information Theory, Inference and Learning Algorithms]
Information Theory, Inference and Learning Algorithms. Hardcover, 550 pp., 5 color plates, 30 figures. 2003.
by MacKay, David J.C.
About the Author
Description
Table of Contents
1. Introduction to information theory; 2. Probability, entropy and inference; 3. More about inference;
Part I. Data Compression: 4. The source coding theorem; 5. Symbol codes; 6. Stream codes; 7. Codes for integers;
Part II. Noisy-Channel Coding: 8. Dependent random variables; 9. Communication over a noisy channel; 10. The noisy-channel coding theorem; 11. Error-correcting codes and real channels;
Part III. Further Topics in Information Theory: 12. Hash codes; 13. Binary codes; 14. Very good linear codes exist; 15. Further exercises on information theory; 16. Message passing; 17. Constrained noiseless channels; 18. Crosswords and codebreaking; 19. Why have sex? Information acquisition and evolution;
Part IV. Probabilities and Inference: 20. An example inference task: clustering; 21. Exact inference by complete enumeration; 22. Maximum likelihood and clustering; 23. Useful probability distributions; 24. Exact marginalization; 25. Exact marginalization in trellises; 26. Exact marginalization in graphs; 27. Laplace's method; 28. Model comparison and Occam's razor; 29. Monte Carlo methods; 30. Efficient Monte Carlo methods; 31. Ising models; 32. Exact Monte Carlo sampling; 33. Variational methods; 34. Independent component analysis; 35. Random inference topics; 36. Decision theory; 37. Bayesian inference and sampling theory;
Part V. Neural Networks: 38. Introduction to neural networks; 39. The single neuron as a classifier; 40. Capacity of a single neuron; 41. Learning as inference; 42. Hopfield networks; 43. Boltzmann machines; 44. Supervised learning in multilayer networks; 45. Gaussian processes; 46. Deconvolution;
Part VI. Sparse Graph Codes: 47. Low-density parity-check codes; 48. Convolutional codes and turbo codes; 49. Repeat-accumulate codes; 50. Digital fountain codes;
Part VII. Appendices: A. Notation; B. Some physics; C. Some mathematics; Bibliography; Index.