I have a question about something I just discovered. I’ve been playing around with Luminar and noticed that when it loads my raw files the image looks terrible: underexposed, washed out, with an ugly histogram. Poking around, I found out this is normal. Apparently Luminar discards whatever adjustments your camera made and just shows the raw sensor data. Is that correct? I saw people on their forums saying the same images looked much better in Photoshop. In my own experiments, if I open the file in Pixelmator it looks like I expect it to.
So, my question is: could someone explain this to me? Does all software strip out whatever the camera does? What is the camera actually doing? And if I want my starting point to be the image the camera produces, should I be shooting in JPEG and editing that? I thought the point of RAW was that all that extra info is there and you can adjust it. I am surprised how much the camera is actually doing to the image.
Well, every manufacturer’s version of RAW is somewhat unique, but the basic premise of RAW is to store exactly what the image sensor captured, without further processing. The idea is that you will have a more powerful CPU and program than the one running inside the camera, so you can make the same kinds of decisions the camera makes, with better results. Luminar brags about its AI capabilities in processing, so I am somewhat surprised it doesn’t present its best guess as a first approximation of the image.
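As for why unprocessed raw looks so dark and flat: sensor data is linear, and one of the biggest things the camera (or raw converter) does is apply a tone/gamma curve for display. Here’s a toy Python sketch with made-up values (not any real camera’s pipeline — real processing also includes demosaicing, white balance, sharpening, etc.):

```python
# Toy illustration: linear sensor data looks dark until a tone curve is applied.
# Values are normalized to 0.0-1.0; real raw files store 12- or 14-bit integers.
linear = [0.05, 0.18, 0.50, 0.90]  # hypothetical linear sensor readings

# Apply a simple sRGB-style gamma curve (~1/2.2). A real pipeline would also
# demosaic, white-balance, and sharpen before this step.
display = [v ** (1 / 2.2) for v in linear]

# A mid-grey scene value of 18% maps to roughly 46% brightness on screen,
# which is why ungraded linear data looks underexposed and flat.
print([round(v, 2) for v in display])  # [0.26, 0.46, 0.73, 0.95]
```

That 18% → 46% jump is most of the “washed out, underexposed” look you’re seeing before the curve is applied.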
TBH I captured RAW images out of my first digital SLR, but I never bothered to process them, and I was mostly happy with the JPGs. (It would write both to the CF card.) Now that I have better processing abilities, I really should go back and see if they could be improved.
Raw is supposed to be like that: by definition it has no processing at all. So it is not Luminar or any other app that is stripping out the camera processing.
You can tweak raw as much as you like and never lose any quality; that is the power of raw. On the other hand, it takes time, especially if you are not an expert, so most people stick with the default JPG images cameras produce.
If you do not have a huge amount of time, that is probably the best path, but there are a few edge cases where raw might come in handy. When you have images that are too bright or too dark, whatever your camera does with them might not be what you would like. In those cases, shooting in raw preserves the information so that you can tweak it to something you like.
JPG is a lossy translation: it has thrown away data from the RAW file to make a certain presentation. With RAW and the right processing, it may be possible to extract details that were lost during the translation to JPG. If you like the pictures you’re getting from your camera as JPG and don’t feel there is anything you’d like to enhance, then the answer to your question is probably “nothing of consequence.” On the other hand, if you feel the choices your camera makes when producing JPGs could be different or better, then you can play with the RAW file, if you have it, and see whether there is indeed more there to explore.
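To make the “thrown away data” point concrete, here is a toy Python sketch (made-up numbers, not a real JPEG encoder) showing how an 8-bit JPEG can clip highlight detail that a 14-bit raw file still holds:

```python
# A 14-bit raw sample has 16384 levels; an 8-bit JPEG channel has 256.
RAW_LEVELS = 2 ** 14  # 16384

# Two bright highlights that differ in the raw data (hypothetical values)...
bright_a = 15000 / RAW_LEVELS  # ~0.92 of full scale
bright_b = 16000 / RAW_LEVELS  # ~0.98 of full scale

# ...after an in-camera exposure boost, both exceed full scale, and the
# 8-bit JPEG clips them to the same pure-white value. The detail is gone.
boost = 1.1
jpg_a = min(int(bright_a * boost * 255), 255)
jpg_b = min(int(bright_b * boost * 255), 255)
print(jpg_a, jpg_b)  # 255 255 -- the two highlights are now identical

# The raw file still holds the original 15000 vs 16000, so a raw editor
# can pull the exposure back down and recover the difference.
```

This is the “blown highlights” case the earlier replies mentioned: once the JPG has clipped to white, no editor can get the detail back, but the raw file may still have it.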
Shout out to @big_D for the plug. But yeah, I shared some information about this on HOP — thanks for checking it out. The main thing to consider nowadays when deciding between raw and JPG is convenience. JPGs have come a long way in quality on some systems.
Hmmm, can’t say I’ve had the issue with the CR3 files from my Canon EOS M50. They tend to look the same as they do on the camera. Raw is definitely the way to go though. There’s more data there for you to adjust.