Tutorial on Estimating Information from Image Colours

This page provides an introduction to estimating Shannon information from RGB coloured images and how these estimates may be used in practice.


1. Basics

The colors in an image of a scene provide information about its reflecting surfaces under the prevailing illumination. But how should this information be quantified? How does it vary from scene to scene? And how does it depend on the spectral sensitivities of the camera or eye used to view the scene? The aim of this tutorial is to show how these questions can be addressed with the aid of some basic ideas from information theory and the computational routines downloadable from this site.

You will need this package, which contains MatLab MEX files for Windows MatLab 32 bit, Windows MatLab 64 bit, and Mac OSX operating systems.

This material is based on the following publication: Marín-Franch, I. and Foster, D. H. (2013). Estimating Information from Image Colors: An Application to Digital Cameras and Natural Scenes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(1), 78-91.

Suppose that a camera or the eye produces a triplet of values r, g, b at each point of the image, as in Fig. 1. 

Figure 1. Fields scene. The pixel at the top right has colour values r, g, b.

If the location of the point is unpredictable, then, taken together, the triplet (r, g, b) can be treated as a trivariate continuous random variable, A say, with probability density function (pdf), f say. Histogram representations of the r, g, b signal levels (for the eye, long-, medium-, and short-wavelength cone signal levels) are shown in Fig. 2.

Figure 2. Histograms of estimated r, g, b signal levels from Fig. 1 for the human eye.
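
These histograms can be reproduced with a few lines of MatLab. The sketch below assumes an N-by-3 array rgb of r, g, b sensor responses, obtained as in Section 4 below; the array name and the choice of 50 bins are illustrative only.

labels = {'r levels','g levels','b levels'};
for c = 1:3
  subplot(1,3,c);
  hist(rgb(:,c),50); % 50 bins; adjust as needed
  title(labels{c});
end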

The uncertainty or randomness of the variable A is captured by Shannon's differential entropy h(A), defined [1] as

h(A) = − ∫ f(a) log f(a) da.
(1)

The entropy is measured in bits if the logarithm in Eqn (1) is to the base 2.

The mutual information I(A1;A2) between two images of a scene under different conditions, e.g. under different illuminations, represents how much information one image contains about the other by virtue of its colours. It can be defined by the following combination of entropies [1]:

I(A1;A2) = h(A1) + h(A2) − h(A1,A2),
(2)

where h(A1,A2) is the joint differential entropy of A1 and A2. Unlike differential entropy, mutual information does not depend on the units used to quantify the r, g, b signal levels. As explained in the next section, mutual information is intimately related to the number of identifiable points in a scene.

These estimates take no account of noise in the camera or eye; its differential entropy may also be included in the calculation [2]. Different cameras with different sensors will produce different histograms from those illustrated in Fig. 2. Examples are given in [3].

Notice that these calculations are based solely on spectral information. They make no use of information about spatial position. 


2. Kozachenko-Leonenko estimator and offset versions

Mutual information depends directly or indirectly on probability density functions. Unfortunately, using histograms such as those in Fig. 2 to estimate pdfs is difficult and can lead to marked biases [4]. Instead, it is possible to use nearest-neighbour statistics to estimate entropies directly from the data, and then apply Eqn (2) to estimate mutual information. The Kozachenko-Leonenko kth-nearest-neighbour estimator [5] is used here. Its convergence properties were improved with an offset device, details of which are given in [6].
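
For intuition, a minimal sketch of the basic (k = 1) Kozachenko-Leonenko estimator is given below. It is not the routine supplied in the package (the MEX files implement kth-nearest-neighbour and offset versions); the function name kl_entropy_sketch, the plain loop, and the Euclidean metric are illustrative only.

function h = kl_entropy_sketch(X)
% Basic k = 1 Kozachenko-Leonenko estimate of differential entropy, in bits.
% X is an N-by-d matrix of samples, e.g. a subsample of the r, g, b triplets.
% Note: duplicate samples give zero nearest-neighbour distances and break the
% estimate; the offset device of [6] was designed to deal with such effects.
[N,d] = size(X);
rho = zeros(N,1);
for i = 1:N % distance from each sample to its nearest neighbour
  D = sqrt(sum(bsxfun(@minus,X,X(i,:)).^2,2));
  D(i) = inf; % exclude the point itself
  rho(i) = min(D);
end
Vd = pi^(d/2)/gamma(d/2 + 1); % volume of the d-dimensional unit ball
gammaE = 0.5772156649; % Euler-Mascheroni constant
h = (d*mean(log(rho)) + log(Vd) + log(N-1) + gammaE)/log(2); % nats to bits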

Shannon's channel coding theorem [2] gives a nice interpretation of the mutual information between two images of a scene obtained under different illumination conditions. If the mutual information is I, then the number of points or elements in the scene that retain their identity by virtue of their colour is given by

N = 2^I.

As an example, the images in Fig. 3 are of the scene shown in Fig. 1 under the setting sun and north skylight. 

Figure 3. Images of the scene shown in Fig. 1 under the setting sun (left) and north skylight (right), correlated colour temperatures of 4000 K and 25000 K, respectively.

For the human eye, the offset Kozachenko-Leonenko estimate of the mutual information between these two images is 17.0 bits, and the number of identifiable points is therefore about 1.3 × 10^5.
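
As a quick numerical check of the relation N = 2^I:

I = 17.0; % mutual information in bits (offset estimator, human eye)
N = 2^I   % approximately 1.3 x 10^5 identifiable points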

The following sections describe how to make these kinds of estimates with the routines available in a package downloadable from this site here.


3. Contents of the package

The package contains MatLab MEX files for Windows MatLab 32 bit, Windows MatLab 64 bit, and Mac OSX operating systems, and it can be downloaded here. With the MatLab MEX files, both differential entropy and mutual information can be calculated. The C++ source code used to generate the MEX files is also included in the package.

The C++ code for the MatLab MEX files was implemented by Martin Sanz (martin.sanz@uv.es).

If you use this software in published research, please give the reference of the source work in full, namely Marín-Franch, I. and Foster, D. H. (2013). Estimating Information from Image Colors: An Application to Digital Cameras and Natural Scenes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(1), 78-91.


4. Using the package to estimate mutual information between two images

To estimate the mutual information between the distributions of colours in two images, proceed as follows. Load the subsampled scene ref4_scene5.mat and find the dimensions of the loaded array reflectances.

load ref4_scene5.mat;
[nrow,ncol,nwav] = size(reflectances);

The number of rows nrow should be 255, the number of columns ncol 335, and the number of wavelengths nwav 33.
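
If you want to check this programmatically, the following assertion simply restates the expected dimensions:

assert(isequal(size(reflectances),[255 335 33])); % subsampled scene from ref4_scene5.mat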

Next obtain the radiances of the reflected image of the scene under a daylight illuminant of correlated colour temperature 25000 K and then under one of 4000 K.

load illum_25000.mat;
load illum_4000.mat;
radiances_25000 = zeros(nrow,ncol,nwav); % initialize array
radiances_4000 = zeros(nrow,ncol,nwav); % initialize array
for i = 1:nwav
  radiances_25000(:,:,i) = reflectances(:,:,i)*illum_25000(i);
  radiances_4000(:,:,i) = reflectances(:,:,i)*illum_4000(i);
end
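
The loop can also be written in vectorized form. The version below gives the same result, assuming (as in the loop) that illum_25000 and illum_4000 each contain nwav spectral values:

radiances_25000 = bsxfun(@times,reflectances,reshape(illum_25000,1,1,nwav));
radiances_4000 = bsxfun(@times,reflectances,reshape(illum_4000,1,1,nwav));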

To obtain the RGB sensor responses of a camera, choose one of the following sensor sets: agilent, foveonx3, kodak, nikond1, or sony.

rgbsens = rgbcurves('agilent');
rgbsens(:,1) = []; % delete first column with wavelengths
radiances_25000 = reshape(radiances_25000,nrow*ncol,nwav);
radiances_4000 = reshape(radiances_4000,nrow*ncol,nwav);
rgb_25000 = radiances_25000*rgbsens;
rgb_4000 = radiances_4000*rgbsens;
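
For a rough visual check of the camera responses, the columns of rgb_25000 can be reshaped back into an image. The rendering below uses simple scaling by the maximum value and is not colorimetrically accurate; imshow assumes the Image Processing Toolbox is installed.

img_25000 = reshape(rgb_25000,nrow,ncol,3); % back to spatial layout
imshow(img_25000/max(img_25000(:))); % crude display, scaled to [0,1]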

To estimate the mutual information, apply the Kozachenko-Leonenko estimator 'kl' and, for comparison, the offset version 'klo' to the arrays rgb_25000 and rgb_4000.

mi = mikl(rgb_25000,rgb_4000,'kl' );
mio = mikl(rgb_25000,rgb_4000,'klo');

The Kozachenko-Leonenko estimator gives a value of 13.9 bits, whereas the offset estimator gives a value of 17.2 bits.

The calculation can be repeated with any of the other sensor sets to illustrate the effect of their different spectral sensitivities. For example, replace 'agilent' by 'eye' to obtain the following:

rgbsens = rgbcurves('eye');
rgbsens(:,1) = []; % delete first column with wavelengths
radiances_25000 = reshape(radiances_25000,nrow*ncol,nwav);
radiances_4000 = reshape(radiances_4000,nrow*ncol,nwav);
rgb_25000 = radiances_25000*rgbsens;
rgb_4000 = radiances_4000*rgbsens;
mi = mikl(rgb_25000,rgb_4000,'kl' );
mio = mikl(rgb_25000,rgb_4000,'klo');

The Kozachenko-Leonenko estimator gives a value of 13.0 bits and the offset estimator gives a value of 16.9 bits.

The value of 16.9 bits for the offset estimator is only slightly smaller than the 17.0 bits reported in Section 2, which was derived from an image with higher spatial resolution (downloadable as a zip file here).


5. Using the package to estimate differential entropies

The estimate of the mutual information in Section 4 is based on Eqn (2). The individual estimates of the differential entropies can be obtained explicitly as follows:

h1 = entkl(rgb_25000,'klo');
h2 = entkl(rgb_4000,'klo');
h12 = entkl([rgb_25000,rgb_4000],'klo');

The estimates are 4.2 bits, 3.1 bits, and -9.6 bits, respectively. When combined according to Eqn (2), they give the estimate of 16.9 bits for the mutual information, as in Section 4.
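
The combination prescribed by Eqn (2) can be checked directly:

mio_check = h1 + h2 - h12 % 4.2 + 3.1 - (-9.6) = 16.9 bits, as in Section 4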


6. Citing

If you use this software in published research, please give the reference of the source work in full, namely Marín-Franch, I. and Foster, D. H. (2013). Estimating Information from Image Colors: An Application to Digital Cameras and Natural Scenes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(1), 78-91.


References

  1. Cover, T. M. and Thomas, J. A. (2006). Elements of Information Theory (2nd ed.). Hoboken, New Jersey: John Wiley & Sons, Inc.
  2. Marín-Franch, I. and Foster, D. H. (2010). Number of perceptually distinct surface colors in natural scenes. Journal of Vision, 10(9):9.
  3. Foster, D. H. and Marín-Franch, I. (2013). Effectiveness of Digital Camera Sensors in Distinguishing Colored Surfaces in Outdoor Scenes. Imaging Systems and Applications, Arlington, Virginia, http://dx.doi.org/10.1364/ISA.2013.ITh3D.2
  4. Steuer, R., Kurths, J., Daub, C. O., Weise, J., and Selbig, J. (2002). The mutual information: Detecting and evaluating dependencies between variables. Bioinformatics, 18, S231-S240.
  5. Kozachenko, L. F. and Leonenko, N. N. (1987). Sample estimate of the entropy of a random vector. Problems of Information Transmission (Tr. Problemy Peredachi Informatsii. Vol. 23, No.2, pp. 9-16, 1987), 23(2), 95-101.
  6. Marín-Franch, I. and Foster, D. H. (2013). Estimating Information from Image Colors: An Application to Digital Cameras and Natural Scenes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(1), 78-91.


(c) D. H. Foster and I. Marín-Franch, 2016


d.h.foster@manchester.ac.uk | +44 (0)161 306 3888 | www.eee.manchester.ac.uk/d.h.foster 
School of Electrical and Electronic Engineering, University of Manchester, Manchester M13 9PL, UK