If we consider diffraction effects we need to model light as waves. Let's start by working with just one frequency of light from the camera flash. Let's also suppose that the flash is far enough away from the monitor that the spherical wavefronts can be modelled as plane waves when they meet the monitor. Here's a picture:

J(k,t) = ∫ dy I(y,t-my/c)

J(k) = ∫ dy I(y)exp(-iωmy/c) = ∫ dy I(y)exp(-i2πmy/λ)





where I used ω = 2πc/λ, with λ the wavelength of the light. In other words, J is the Fourier transform of I. More generally, if we modelled the z-axis we'd find that J is given by the 2D Fourier transform of I as a function of position on the 2D surface of the monitor. The actual power striking the sensor is given by the square of the absolute value of J.
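To make that concrete, here's a tiny numerical sketch (in Python/NumPy for convenience, though the rest of this post uses Octave; the strip width is made up): take a uniformly emitting strip as I, transform it, and look at the squared magnitude.

```python
import numpy as np

# A 1D aperture: a uniformly emitting strip ("top hat") on a dark surface.
n = 1024
y = np.linspace(-1, 1, n)
I = (np.abs(y) < 0.1).astype(float)

# The far-field amplitude J is (up to constants) the Fourier transform of I;
# the power striking the sensor is |J|^2.
J = np.fft.fftshift(np.fft.fft(I))
power = np.abs(J) ** 2

# Brightest straight ahead (the zero-frequency bin), with sinc^2-style
# lobes falling off either side -- the classic single-slit pattern.
print(power.argmax() == n // 2)  # → True
```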





Ordinary reflections





So now I can pop the first of my pending explanations off the stack. Suppose that the surface is completely uniform. Then I(y) = constant. The Fourier transform of a constant function is the Dirac delta function. In other words: despite the fact that we're modelling every point on the surface as its own emitter, the resulting reflection from a perfectly smooth surface is a wave going straight back where it came from. (Exercise: modify the above treatment to show that waves striking a mirror at an angle obey the usual reflection law for mirrors.)
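In symbols: with I(y) = I₀ everywhere, the integral for J collapses to a delta function,

```latex
J(k) \;=\; I_0 \int dy\, e^{-2\pi i m y/\lambda} \;\propto\; I_0\, \delta\!\left(\frac{m}{\lambda}\right),
```

which vanishes unless m = 0, i.e. unless k points straight back along the incoming direction.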





Periodic structure





Now suppose that the surface of our monitor has a periodic structure. The Fourier transform of a periodic function is concentrated at spikes corresponding to the fundamental frequency of the structure and its overtones. So we expect to see a grid-like structure in the result. I believe that's what we're seeing in the spray of quantised vectors coming from the center of the image and the fact that the arms of the 'X' look like discrete blobs of colour arranged in a line rather than the white spikes in the astronomical example I linked to above. That tells us we're probably seeing artifacts caused by the microstructure of the LCD pixels.
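Here's a quick numerical illustration of that concentration into spikes (again a NumPy sketch; the period and duty cycle are made up, not a model of real LCD geometry):

```python
import numpy as np

# A crude periodic "pixel grid": 16 repeats of a 64-sample cell,
# each cell bright for its first quarter.
cell = np.zeros(64)
cell[:16] = 1.0
I = np.tile(cell, 16)  # length 1024, period 64

power = np.abs(np.fft.fft(I)) ** 2

# Essentially all the power sits at multiples of the fundamental
# frequency (index 1024/64 = 16): discrete spikes, not a continuum.
spikes = power[np.arange(0, 1024, 16)]
print(spikes.sum() / power.sum())  # ≈ 1.0
```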





An edge





Consider the function I(y,z) = -1 for y<0 and 1 for y>0. In other words, the sgn function along the y-axis and constant along the z-axis. Up to multiplication by a constant, its Fourier transform is given by the Dirac delta along the z-axis and 1/ω along the y-axis. In other words, it's a spike, starting at the origin, extending along the y-axis, but fading away as we approach infinity. This is the source of diffraction spikes. Any time we see such a spike it's likely there was a perpendicular edge somewhere in the function I. For telescopes it often comes from the struts supporting the secondary mirror. For cameras it comes from the polygonal shape of the iris. In this case, it looks like we must have pixels whose shape has edges perpendicular to the 'X'. I have to admit, when I came to this conclusion it sounded very implausible. But then someone posted this image from here
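The 1D statement behind this, taken in the distributional sense, is

```latex
\int_{-\infty}^{\infty} \operatorname{sgn}(y)\, e^{-i\omega y}\, dy \;=\; \frac{2}{i\omega},
```

so the power along the spike falls off like 1/ω², while the delta along the other axis confines it to a line.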

There are those edges at the predicted angles.





Simulating the artifact





Now we have enough information to simulate the artifact. You can think of the remainder of this article as written in "literate Octave".





Save the above close-up of the pixels in a file and read it in.





I = imread('lcd-pattern.jpg');





I crop out one quarter as it simply repeats. This also removes the text.





I1 = I(251:500,251:500,1);





Now clean up some stray (image) pixels and pick out just one (LCD) pixel.





I1(1:250,100:250) = 0;





The photo is blurry, probably has compression artifacts and seems quite noisy to me. So I threshold it at intensity 200 to get a hard shape. I then compute the discrete Fourier transform and scale the intensity to something nice.





I2 = 0.0001*abs(fft2(I1>200).^2);





The Fourier transform puts the origin in the corner so I shift it to the centre.





I2 = circshift(I2,[125,125]);
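For what it's worth, that recentring is exactly what fftshift does (Octave has fftshift too); here's a NumPy sketch of the equivalence on a 250×250 array:

```python
import numpy as np

# circshift(I2, [125, 125]) in Octave moves the zero-frequency bin from
# the corner to the centre; fftshift is the same half-size roll.
I2 = np.arange(250 * 250, dtype=float).reshape(250, 250)
shifted = np.roll(I2, (125, 125), axis=(0, 1))

print(np.array_equal(shifted, np.fft.fftshift(I2)))  # → True
```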





Now we need to do the same thing for red and green. We reuse the red-channel image from above, but it needs scaling in proportion to the wavelength. The numbers 1.2 and 1.4 are intended to be the ratios of the green and red wavelengths to that of blue light. They're just ballpark figures: I chose them to make the image sizes convenient when rescaled. I also crop out a piece from the centre of each rescaled image so they line up nicely. I use imresize2 from here
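The reason a simple rescale works: in the integral for J, the position y only ever appears divided by λ, so the whole diffraction pattern stretches linearly with wavelength,

```latex
J_\lambda(m) = \int dy\, I(y)\, e^{-2\pi i m y/\lambda}
\quad\Longrightarrow\quad
J_{\lambda'}(m) = J_{\lambda}\!\left(\frac{\lambda}{\lambda'}\, m\right).
```

With ballpark wavelengths of roughly 450 nm (blue), 530 nm (green) and 630 nm (red), the ratios are about 530/450 ≈ 1.2 and 630/450 ≈ 1.4, which is where those scale factors come from.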





Ib = I2;

Ig = imresize2(I2,1.2,'nearest')(26:275, 26:275);

Ir = imresize2(I2,1.4,'nearest')(51:300, 51:300);





Now I assemble the final RGB image.





J = zeros(250, 250, 3);

J(:, :, 1) = Ir;

J(:, :, 2) = Ig;

J(:, :, 3) = Ib;

imwrite(J,'cross.jpg')





Here's the result:









Analysis





Not bad, but it's not perfect. That's unsurprising. Have a look at the image straight after thresholding: it's a very poor representation of the pixel shape. It would be better to redraw the shape correctly at a higher resolution (or even to work analytically, which isn't infeasible for simple shapes like these pixels). Stray (image) pixels can make a big difference in Fourier space, and that explains the 'dirtiness' of my image. Nonetheless, it has the big X and it has the rows of dots near the middle. Qualitatively it's doing the right thing.
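The point about stray pixels is easy to check (another NumPy sketch, with an arbitrary pixel position): a single bright pixel is an impulse, and an impulse has a perfectly flat Fourier magnitude, so it smears energy over the entire spectrum.

```python
import numpy as np

# One stray bright pixel is an impulse; the magnitude of its 2D Fourier
# transform is constant, so a single bad pixel contaminates every
# frequency bin of the diffraction image.
stray = np.zeros((250, 250))
stray[40, 170] = 1.0  # arbitrary stray-pixel position

mag = np.abs(np.fft.fft2(stray))
print(mag.min(), mag.max())  # both essentially 1.0: perfectly flat
```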





Note that the big X in the middle seems to have ghost reflections at the left and right. These are, I think, aliasing: purely artifacts of the digitisation.





It'd look slightly better with some multi-spectral rendering: I've treated each pixel as reflecting a single wavelength, but in reality each one reflects a band of wavelengths.





There are also, almost certainly, lots more effects going on in that picture. The close-up photograph is only revealing one part of the structure. I'm sure there is 3D structure to those pixels, not just a flat surface. I suspect that's where the horizontal white line is coming from. Because it's white, it suggests an artifact due to something interacting equally with all visible wavelengths. Maybe a vertical edge between pixels.



But overall I think I've shown that Fourier optics is a step in the right direction for simulating this kind of effect.





Note





Everything I know about Fourier optics I learnt from my ex-colleague Oliver James.





All the errors are mine of course.

One of the fun things about working in the visual effects industry is the large number of disciplines involved, especially when working for a smaller company where everyone has to do everything. At one point many years ago we did some work on simulating camera artifacts including diffraction spikes. Although the wikipedia page is in the context of astronomy, you can get diffraction artifacts with any kind of camera, and you'll get spikes whenever the iris has a polygonal shape. I hope to sketch why later.

I noticed an intriguing question about a diffraction pattern on Quora. (For those not following the link: this photo is a picture of a reflection of the photographer and camera in an LCD screen.) My immediate thought was "how could I have faked this in a render?".

My main interest was in the 'X' shape. And the first question is this: how do I know this is a diffraction effect? It might be that there are specular reflections off periodic structures in the LCD display orthogonal to the arms of the 'X'. This would explain everything with only geometric optics. There are two big clues I know of that diffraction is at work. First, we have separation of white light into separate colours, which is typical of diffraction effects. Second, we see a discrete structure: rays directed along a discrete set of directions in addition to the arms of the 'X'. This is typical of diffraction from periodic microstructures, and again I hope to sketch why later.

So how can we explain the appearance of this photograph?

The thick black vertical line is the monitor screen. The other vertical lines are the incoming plane waves. And the red arrow shows the direction of motion. Huygens' principle tells us that when the waves reflect off the monitor surface we can treat the entire monitor surface as a collection of points, each of which is emitting spherical waves equally in all directions. If you're unfamiliar with Huygens' principle it may seem wrong to you: when you shine a light at a mirror you get reflection along one particular direction, not in all directions. That's true, but we can view this fact as a consequence of interference between all of the little points on the surface emitting in all directions. Again, that's another thing that will emerge from the calculation below.

The question I now want to answer is this: how much light gets reflected in direction k? I'll assume we're observing things from far enough away that we can neglect the details of the geometry. We'll just compute the contribution in direction k (a unit vector whose y-component I'll call m) from all of the points on our surface. We'll assume that, because of variations on the surface, the intensity of the emission varies along the surface. We'll model the emission as a function I, so I(y,t) is the emission from position y at time t. We want to collect the light going in direction k through some light collector at time t. Call that J(k,t). Here's a picture:

Remember: there are waves going in all directions, but I've just drawn the waves going in direction k.

Now comes the interesting bit: the light arriving at the collector is the integral of all the light emitted from the points on the monitor surface. But the time of flight from each point on the monitor surface is different. So when we perform our integral we need to build in a suitable delay. It's straightforward geometry to show that the time delay between waves arriving from y₁ and y₂ is m(y₁-y₂)/c, where c is the speed of light and m, as defined above, is the y-component of k. So, up to a constant of proportionality, and some absolute choice of time, we want the integral

J(k,t) = ∫ dy I(y,t-my/c)

given at the top of this post. We assumed that the monitor surface was being struck by plane waves. Assuming they were orthogonal to the surface and coherent, this means that I(y,t) is proportional to exp(iωt). The time dependence in J is just that sinusoid, so we drop it from further calculations. So we can write, ignoring another constant of proportionality,

J(k) = ∫ dy I(y)exp(-iωmy/c) = ∫ dy I(y)exp(-i2πmy/λ)

as above.