
In the human visual system, the process of image formation begins with light rays coming from the outside world and impinging on the photoreceptors in the retina. Analogously, a digital photograph is created by light impinging on another photosensitive device, the CCD array. Every greyscale digital image is a 2-D array (matrix) of numbers. A black pixel typically has the value 0 (weakest intensity) whereas a white pixel has the value 255 (strongest intensity). A human photoreceptor hyperpolarizes according to the amount of light impinging on it, just as the CCD assigns the appropriate value to each digital pixel.
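As a rough sketch of this representation (the pixel values below are invented purely for illustration), a greyscale image can be stored as a 2-D NumPy array:

```python
import numpy as np

# A tiny greyscale "image": 0 is black (weakest intensity), 255 is white
# (strongest). The values are arbitrary, chosen only to show the representation.
image = np.array([[  0,  64, 128],
                  [ 32, 160, 255],
                  [ 96, 200,  16]], dtype=np.uint8)

print(image.shape)    # (3, 3): a 2-D array (matrix) of numbers
print(image[0, 2])    # 128: the intensity of the pixel in row 0, column 2
```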


Some basics on digital image processing

Many basic image processing techniques, such as computing image derivatives or smoothing noise, are based on linear filtering. Linear filtering consists of convolving the image with a constant matrix, called a mask, kernel, or simply a window.

Convolution in mathematics is an operator, just like addition or multiplication.

For two continuous functions $f$ and $g$, the convolution is written $f * g$ and is defined as the integral of the product of the two functions after one is reversed and shifted:

$$(f * g)(t) = \int_{-\infty}^{\infty} f(\tau)\, g(t - \tau)\, d\tau.$$

For discrete functions, one can use a discrete version of the convolution. It is given by

$$(f * g)(n) = \sum_{m = -\infty}^{\infty} f(m)\, g(n - m).$$
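As a minimal sketch of the discrete formula (the arrays f and g below are arbitrary example signals, and the explicit loop is only meant to mirror the sum above):

```python
import numpy as np

def conv1d(f, g):
    """Discrete convolution: (f * g)(n) = sum_m f(m) g(n - m),
    with values outside the arrays regarded as zero."""
    out = np.zeros(len(f) + len(g) - 1)
    for n in range(len(out)):
        for m in range(len(f)):
            if 0 <= n - m < len(g):
                out[n] += f[m] * g[n - m]
    return out

f = np.array([1.0, 2.0, 3.0])
g = np.array([0.0, 1.0, 0.5])
print(conv1d(f, g))        # [0.  1.  2.5 4.  1.5]
print(np.convolve(f, g))   # NumPy's built-in convolution gives the same result
```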

Filtering an image by convolution means applying the filter to every pixel of the original image in order to compute the pixel values of the filtered image. Consider a digital image $I$ and a filter $F$. The filtered image $I_F$ is given by:

$$I_F(i, j) = \sum_{h} \sum_{k} F(h, k)\, I(i - h,\, j - k).$$

Notice that since the image (and the filter kernel) are finite in extent, all the other values in the sum are regarded as zero. A less formal version of the above formula would be: for the $N \times M$ image $I$ and the $m \times m$ kernel $F$, where $m$ is an odd number smaller than both $M$ and $N$, the filtered version of $I$ at each pixel $(i, j)$ is given by:

$$I_F(i, j) = \sum_{h = -m/2}^{m/2} \; \sum_{k = -m/2}^{m/2} F(h, k)\, I(i - h,\, j - k).$$

Here $m/2$ indicates integer division (e.g. $3/2 = 1$).
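The formula above translates fairly directly into code. The following is a rough, unoptimised sketch; the function name filter_image and the choice to treat out-of-range pixels as zero are illustrative assumptions rather than part of the formula:

```python
import numpy as np

def filter_image(I, F):
    """Filter the N x M image I with the m x m kernel F (m odd), following
    the formula above. Pixels outside the image are treated as zero."""
    N, M = I.shape
    m = F.shape[0]
    half = m // 2                       # integer division, e.g. 3 // 2 == 1
    I_F = np.zeros((N, M))
    for i in range(N):
        for j in range(M):
            for h in range(-half, half + 1):
                for k in range(-half, half + 1):
                    ii, jj = i - h, j - k        # note the minus signs: convolution
                    if 0 <= ii < N and 0 <= jj < M:
                        I_F[i, j] += F[h + half, k + half] * I[ii, jj]
    return I_F
```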

For example, consider a $5 \times 5$ random image $I$ and the $3 \times 3$ kernel of an averaging filter (each entry equal to $1/9$). Each pixel of the filtered image is the average value of the $3 \times 3$ neighborhood of the corresponding pixel in $I$.

[Figure: the first step of filtering]

We say that the response of a pixel to a specific filter is the value given to that pixel after the filtering. In the above example, the response of pixel (2,2) to the averaging filter is 5.
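As a usage sketch of the same averaging operation (the random values here are freshly generated, so they are not the ones from the figure and will not reproduce the response of 5); SciPy's convolve2d is used with zero padding at the borders:

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)
I = rng.integers(0, 10, size=(5, 5)).astype(float)   # a random 5x5 image
F = np.ones((3, 3)) / 9.0                            # 3x3 averaging kernel

I_F = convolve2d(I, F, mode='same')   # 'same' keeps the 5x5 size, zero-padding the borders

# Away from the borders, the response of a pixel to the averaging filter is
# simply the mean of the 3x3 neighborhood of the corresponding pixel in I:
print(I_F[2, 2])
print(I[1:4, 1:4].mean())   # the two printed numbers agree
```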

Using gradient filters to estimate contrast

The majority of the ganglion cells in the retina are responsible for carrying information about the change in light intensity around points in the scene. Such points are important in perception because they outline the edges, and therefore the contours, of objects. In image processing we call such points edge points: they are the pixels at or around which the image values undergo a sharp variation. The value (or light intensity) of a pixel depends on its position in the image plane, which means that intensity is a function of the coordinates of the pixel. In order to find how the intensity changes at a specific pixel, we need to compute the spatial derivative of the intensity function at that pixel.

[Figure: a zoomed image patch showing the gradient vector J at a pixel]

The white arrow (J) represents the spatial derivative (or gradient) of the light intensity at that pixel. The length of the arrow is the magnitude (or strength) and the angle θ is the orientation of the gradient.

The gradient at each pixel of the image has two components, one representing the change along the x-axis ($J_x$) and one representing the change along the y-axis ($J_y$). These two components are the image's partial derivatives at that pixel. Assuming that the partial derivatives exist at that point, a common way to find them is to compute the finite difference estimates at that point. The central difference approximation for $J_x$ would be:

$$J_x(i, j) \approx \frac{I(i,\, j + 1) - I(i,\, j - 1)}{2\, \Delta x}.$$

The step, $\Delta x$, represents the size of the pixel and can be set to $1$; the remaining constant factor $1/2$ only scales the result and is usually dropped. This makes the formula simpler:

$$J_x(i, j) \approx I(i,\, j + 1) - I(i,\, j - 1),$$

which can be computed for every pixel in the image by convolving the image rows with the mask

$$\begin{bmatrix} -1 & 0 & 1 \end{bmatrix}.$$

Likewise, $J_y$ can be computed by convolving the image columns with the same mask.
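Below is a rough sketch of these gradient estimates. It uses NumPy slicing instead of an explicit convolution for clarity, assumes the simplified formula above (constant factor dropped), and leaves the border pixels at zero:

```python
import numpy as np

def gradient_maps(I):
    """Central-difference estimates of Jx and Jy (equivalent, up to the
    convolution's flipping convention, to filtering the rows and columns
    with the mask [-1 0 1]). Border pixels are left at zero."""
    I = I.astype(float)
    Jx = np.zeros_like(I)
    Jy = np.zeros_like(I)
    Jx[:, 1:-1] = I[:, 2:] - I[:, :-2]   # change along the x-axis (columns)
    Jy[1:-1, :] = I[2:, :] - I[:-2, :]   # change along the y-axis (rows)
    magnitude = np.sqrt(Jx**2 + Jy**2)   # strength of the gradient
    orientation = np.arctan2(Jy, Jx)     # angle theta of the gradient
    return Jx, Jy, magnitude, orientation

# A vertical step edge: dark on the left, bright on the right.
I = np.zeros((5, 5))
I[:, 3:] = 255.0
Jx, Jy, mag, theta = gradient_maps(I)
print(Jx[2, :])   # [0, 0, 255, 255, 0]: large response around the step
```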

One can also use the second derivatives to find the edge points, by estimating the zero crossings of the Laplace operator. The mask for the Laplacian,

$$\nabla^2 I = \frac{\partial^2 I}{\partial x^2} + \frac{\partial^2 I}{\partial y^2},$$

would be:

$$\begin{bmatrix} 0 & 1 & 0 \\ 1 & -4 & 1 \\ 0 & 1 & 0 \end{bmatrix}.$$
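A rough sketch of Laplacian filtering and a simple zero-crossing test is given below. The test image, the zero padding, and the row-only scan for sign changes are illustrative simplifications (a full edge detector would scan the columns as well):

```python
import numpy as np
from scipy.ndimage import convolve

# The 3x3 Laplacian mask from above (symmetric, so flipping it for the
# convolution makes no difference).
laplacian = np.array([[0.,  1., 0.],
                      [1., -4., 1.],
                      [0.,  1., 0.]])

# A small test image with a vertical step edge (values chosen for illustration).
I = np.zeros((5, 7))
I[:, 4:] = 255.0

L = convolve(I, laplacian, mode='constant', cval=0.0)   # zero padding outside I

# Edge points sit at the zero crossings: places where the Laplacian response
# changes sign between neighbouring pixels. Here we only test along the rows.
zero_cross = (L[:, :-1] * L[:, 1:]) < 0
print(L[2, :])                     # positive just left of the step, negative just
                                   # right of it (last entry is a padding artefact)
print(np.where(zero_cross[2])[0])  # [3]: the crossing lies between columns 3 and 4
```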

Ganglion cells in the retina: receptive fields and lateral inhibition

  • the retina
  • the centre-surround organisation (why ganglion cells? only these fire action potentials; some amacrine cells do as well)
  • a simple model (picture); similarities with the Laplace filter
  • purpose: signal compression, hence less noise (p. 519)