This book is targeted specifically at problems in communications and compression. It provides the fundamental principles and results of information theory and rate distortion theory for these applications and presents methods that have proved useful in analyzing and designing real systems. The chapters treat entropy, mutual information, lossless source coding, channel capacity, and rate distortion theory; it is the selection, ordering, and presentation of the topics within these broad categories that distinguish this concise book. While the coverage of some standard topics is shortened or eliminated, other standard but important topics receive a full treatment, namely the chain rules for entropy and mutual information, relative entropy, the data processing inequality, and the Markov chain condition. The lossless source coding techniques presented include the Lempel-Ziv-Welch coding method. The material on rate distortion theory, which explores the fundamental limits of lossy source coding, covers the often-neglected Shannon lower bound and the Shannon backward channel condition, rate distortion theory for sources with memory, and the extremely practical topic of rate distortion functions for composite sources.
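As a brief illustration of the first of these topics (a standard result, stated here only as an example of the kind of bound involved), for a continuous-amplitude memoryless source $X$ with differential entropy $h(X)$ and a squared-error distortion measure, the Shannon lower bound on the rate distortion function takes the familiar form
\[
R(D) \;\geq\; h(X) - \tfrac{1}{2}\log_{2}\!\left(2\pi e D\right),
\]
which holds with equality for a memoryless Gaussian source.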