Constraints on the entropy function are sometimes referred to as the laws of information theory. For a long time, the submodular inequalities, or equivalently the nonnegativity of the Shannon information measures, were the only known constraints. Inequalities implied by the submodular inequalities are collectively referred to as Shannon-type inequalities. When the number of random variables is fixed, a Shannon-type inequality can in principle be verified by solving a linear program, a procedure implemented in the software ITIP. A non-Shannon-type inequality is a constraint on the entropy function that is not implied by the submodular inequalities. In the late 1990s, the discovery of a few such inequalities revealed that Shannon-type inequalities alone do not constitute a complete set of constraints on the entropy function. Subsequently, connections between the entropy function and a number of fields in the science of information, mathematics, and physics have been established. These fields include probability theory, network coding, combinatorics, group theory, Kolmogorov complexity, matrix theory, and quantum mechanics. This talk presents a picture of the many facets of the entropy function.
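For concreteness, in standard notation (with $X_A$ denoting the collection of random variables indexed by a set $A$; the notation here is assumed rather than taken from the abstract), the submodular inequality and its equivalent form as the nonnegativity of conditional mutual information read
\[
H(X_A) + H(X_B) \;\ge\; H(X_{A \cup B}) + H(X_{A \cap B}),
\qquad\text{equivalently}\qquad
I(X;Y \mid Z) \;\ge\; 0 .
\]
The first non-Shannon-type inequality, found by Zhang and Yeung in 1998 for four random variables $X_1, X_2, X_3, X_4$, is
\[
2\,I(X_3;X_4) \;\le\; I(X_1;X_2) + I(X_1;X_3,X_4) + 3\,I(X_3;X_4 \mid X_1) + I(X_3;X_4 \mid X_2).
\]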
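The linear-programming verification mentioned above can be sketched as follows. A candidate inequality $f^\top h \ge 0$ over entropy vectors $h$ is Shannon-type exactly when it is implied by the elemental Shannon inequalities; this holds if and only if minimizing $f^\top h$ over the cone cut out by those inequalities gives $0$ rather than an unbounded problem. The sketch below follows this standard formulation, not ITIP's actual code, and assumes SciPy is available; the function names are illustrative.

```python
# Minimal ITIP-style sketch: decide whether f @ h >= 0 is a
# Shannon-type inequality by linear programming.
import itertools
import numpy as np
from scipy.optimize import linprog

def elemental_inequalities(n):
    """Rows G such that G @ h >= 0 encodes the elemental Shannon
    inequalities for n random variables. h is indexed by nonempty
    subsets of {0,...,n-1} encoded as bitmasks 1..2**n - 1 (the
    entry for mask m sits at index m - 1); H(empty set) = 0."""
    full = (1 << n) - 1
    rows = []
    def row(coeffs):                 # coeffs: {mask: coefficient}
        r = np.zeros(full)
        for m, c in coeffs.items():
            if m:                    # drop H(empty) terms (zero)
                r[m - 1] += c
        return r
    # H(X_i | all other variables) >= 0
    for i in range(n):
        rows.append(row({full: 1, full ^ (1 << i): -1}))
    # I(X_i; X_j | X_K) >= 0 for i < j and K a subset of the rest
    for i, j in itertools.combinations(range(n), 2):
        rest = [t for t in range(n) if t not in (i, j)]
        for size in range(len(rest) + 1):
            for K in itertools.combinations(rest, size):
                k = sum(1 << t for t in K)
                rows.append(row({k | (1 << i): 1, k | (1 << j): 1,
                                 k | (1 << i) | (1 << j): -1, k: -1}))
    return np.array(rows)

def is_shannon_type(f, n):
    """Minimize f @ h over the cone {h : G @ h >= 0}. The cone
    contains 0, so the optimum is 0 (the inequality is implied,
    i.e. Shannon-type) or the LP is unbounded below (it is not)."""
    G = elemental_inequalities(n)
    res = linprog(c=f, A_ub=-G, b_ub=np.zeros(len(G)),
                  bounds=[(None, None)] * len(f))
    return res.status == 0 and abs(res.fun) < 1e-9

# Example: verify I(X;Y) >= 0 for n = 2, written as
# H({X}) + H({Y}) - H({X,Y}) >= 0 (masks 0b01, 0b10, 0b11).
f = np.zeros(3)
f[0b01 - 1], f[0b10 - 1], f[0b11 - 1] = 1, 1, -1
print(is_shannon_type(f, 2))         # True
```

Running the same check on the Zhang-Yeung inequality above (with $n = 4$) would report that it is not implied by the elemental inequalities, which is precisely what makes it non-Shannon-type.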