The concept of information functions of type β (β > 0) is introduced and discussed. By means of these information functions the entropies of type β are defined. These entropies have a number of interesting algebraic and analytic properties similar to those of Shannon's entropy. The capacity of type β (β > 1) of a discrete constant channel is defined by means of the entropy of type β. Examples are given for the computation of the capacity of type β, from which Shannon's capacity can be derived in the limiting case β = 1.
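A minimal numerical sketch of the limiting behaviour described above, assuming the commonly used normalization H_β(p) = (Σ p_iᵝ − 1)/(2^{1−β} − 1) for the entropy of type β (the abstract itself does not fix a normalization); under this convention H_β tends to the Shannon entropy in bits as β → 1:

```python
import math

def entropy_beta(p, beta):
    """Entropy of type beta under the assumed normalization
    H_beta(p) = (sum_i p_i**beta - 1) / (2**(1 - beta) - 1), beta != 1.
    At beta == 1 we return the Shannon entropy (bits), its limit."""
    if beta == 1.0:
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return (sum(pi ** beta for pi in p) - 1.0) / (2.0 ** (1.0 - beta) - 1.0)

# The type-beta entropy approaches the Shannon entropy as beta -> 1.
p = [0.5, 0.3, 0.2]  # an illustrative distribution, not from the source
shannon = entropy_beta(p, 1.0)
for beta in (2.0, 1.1, 1.01, 1.001):
    print(beta, entropy_beta(p, beta), abs(entropy_beta(p, beta) - shannon))
```

Note that for a fair coin this normalization gives H_β([0.5, 0.5]) = 1 bit for every β, which matches the Shannon value exactly; the convergence as β → 1 is visible on non-uniform distributions such as the one above.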