A couple of people have asked me what I mean by infrared photographs so I thought I would describe it simply so that I could point people at my answer. Here goes:
Digital camera sensors are sensitive to a wide spectrum of electromagnetic radiation which extends beyond the visible range. So that the camera measures exposure and focus accurately based on what the photographer sees, a filter is always placed in front of the sensor to allow only visible light to pass. Infrared photography, more accurately termed “near infrared”, is performed with a camera whose sensor filter has been replaced by one which blocks the usual visible spectrum and allows near infrared to pass. This near infrared usually has a wavelength of 750nm to 900nm. Beyond 900nm is the realm of military night vision, and beyond the capability of the sensors used.
Why shoot infrared? The simple answer is: because we can, and because it’s different. I like it because it is often surprising. Since we don’t see in infrared, we can only guess what the image will look like.
Let me show you.
1. The image at the back of the camera looks like this.
2. In Lightroom, we apply a custom profile for infrared to shift the white balance and even out the red.
3. The image is essentially monochrome, and a simple black and white conversion may be what you are looking for.
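For those who prefer to script this step, the simple black and white conversion can be sketched with Pillow rather than Lightroom. The solid-colour sample image here is just a stand-in for a real infrared capture; in practice you would open your own file.

```python
from PIL import Image

# Stand-in for an infrared capture: IR files straight off the sensor
# tend to be strongly red-tinted, so we use a reddish sample.
img = Image.new("RGB", (64, 64), (180, 90, 90))

# Collapse the three channels into a single luminance channel,
# giving a straightforward black and white result.
mono = img.convert("L")
```

Mode "L" applies the standard ITU-R 601-2 luminance weighting, which is usually close enough for an essentially monochrome infrared file.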
If you want to experiment further and get the false-colour infrared images you see on the internet, you can open the original in Photoshop.
1. You start with this image open in Photoshop.
2. Using the Channel Mixer, we swap the red and blue channels and get a very different image.
3. Then, using Levels on each of R, G and B, you can play until you get what you want.
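The red/blue channel swap can also be done outside Photoshop. Here is a rough sketch using Pillow in place of the Channel Mixer; the single-colour sample image stands in for the real file.

```python
from PIL import Image

# Stand-in for the red-heavy infrared original.
img = Image.new("RGB", (64, 64), (200, 120, 60))

# Separate the three channels, then merge them back with
# red and blue exchanged - the classic false-colour swap.
r, g, b = img.split()
swapped = Image.merge("RGB", (b, g, r))

print(swapped.getpixel((0, 0)))  # (60, 120, 200)
```

From here, per-channel tonal adjustments (the Levels step above) can be applied with `ImageOps` or point operations on each channel until you get the look you want.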
I have an infrared gallery which shows a number of different outcomes using these techniques. I find them fascinating.