Blue Fox Posted April 13, 2016

This may sound like a really stupid question, but I keep struggling with it. I thought white balance was easy. The various tutorials just say you should adjust the Kelvin value and, if necessary, the quadrant (color square), and show a few images with wrong white balances plus some other obvious information (about sunsets, ...). All nice in theory, but I can't get my WB right! I have two problems: the first is really a question, the second is about the best method to set the WB accurately in camera so that the images look great when shown on a screen.

Let me illustrate my first problem with a small example. I use a Sony a6000 and, very conveniently, it has an EVF. At noon I go outdoors, look through the EVF and set my WB to 5900K plus some quadrant settings so that the colors in the EVF and outside match. At night, indoors, I pick up my camera again, still set to 5900K, but the picture in the EVF no longer matches what I'm seeing. To get the colors matching again I have to set the temperature to something around 3500K. This looks weird to me! Of course, the white balance of my eyes changes (it even sometimes differs, by around 50-200K I think, between my left and right eye, especially after looking through the EVF for a while), and the scene lighting (sun, LED bulbs, ...) also changes. But the camera's sensor is RGB and the EVF is RGB, so shouldn't the EVF just 'reproduce' the scene, irrespective of the change in lighting and the WB of my eyes? Could this 'issue' be due to the fact that the whole range of light frequencies is squeezed into RGB, and that the frequencies (color wavelengths) of the sensor's RGB, the viewfinder's RGB and my eyes' RGB don't match? In that case, shouldn't I simply not trust the EVF for setting the WB?

So how should I set my white balance? At home, looking at my stills and video, I find that the auto white balance or the presets do a rather good job, but in general they are around 200K off (which I think is too much), and that my own 'color matching' (through the EVF) gives results that are sometimes spot-on (with fantastic skin colors!), sometimes close, but also often up to 400K off (giving a serious, often warm, color cast). For stills I can shoot RAW, but for video I really have to get the WB right (?). With the 8-bit 50Mbps codec and Sony's fantastic color science, I don't like correcting the WB in post; it gives unnatural results. As I understand it, when developing a RAW file for viewing on a screen, we should compensate for the differences between the white point of the screen (which should be a fixed value if the screen has been calibrated?), the WB of the viewer's eye when seeing the scene (probably most influenced by the scene lighting) and the WB of the viewer's eye when viewing the image, which (I suppose) depends on the lights in the viewing room (in dim lighting the display probably being the reference source for our eyes).

The simple rule of working with a white object also doesn't seem to work for me. Looking at them with my own eyes, they look too cold or way too warm in a lot of circumstances (especially with artificial lighting, sunsets, ...). So using these white objects doesn't make much sense (in those circumstances)? I haven't yet used a real grey card; does it give very different results than a piece of quality paper?

Some practical (and technical) explanation would be very, very helpful, because I feel completely lost.
Mattias Burling Posted April 13, 2016

"During noon I go outdoors, I look through the EVF and set my WB to 5900K and some quadrant settings so that the colors in the EVF and outside match. During the night, indoors, I take my camera again, still set to 5900K, but the picture in the EVF doesn't match what I'm seeing anymore. To get the colors matching again I have to set the temperature to something around 3500K."

Do you have lights on indoors when you shoot at night? I'm going to guess "yes". And I'm also going to guess your lights at home are not 5900K.
Blue Fox Posted April 13, 2016

1 minute ago, Mattias Burling said: "Do you have lights on indoors when you shoot at night? I'm going to guess 'yes'. And I'm also going to guess your lights at home are not 5900K."

Yes, I have lights on (I don't have an a7S with an f/1.4 lens) and I suppose they're around 2700 to 3500K. Of course, the color temperature of these light sources is very different, so when taking an image I should use different white balance settings in order to get an image without a color cast. But that shouldn't change the image in the EVF? If white light hitting the sensor with a temperature of <x> is converted (with certain settings, for example 5900K) to RGB and shown in the EVF, and in that case the EVF also shows light with a temperature of <x>, shouldn't that same EVF with the same settings output a temperature of <y> when light with a temperature of <y> hits the sensor?
Mattias Burling Posted April 13, 2016

4 minutes ago, Blue Fox said: "But that shouldn't change the image in the EVF?"

But you said it didn't match. You said: "During noon I go outdoors, I look through the EVF and set my WB to 5900K and some quadrant settings so that the colors in the EVF and outside match. During the night, indoors, I take my camera again, still set to 5900K, but the picture in the EVF doesn't match what I'm seeing anymore. To get the colors matching again I have to set the temperature to something around 3500K."

When you set your WB to the "correct" value of 3500K to match your lights, it looks the same in the EVF and in real life. When you kept the "wrong" WB of 5900K matching daylight, it looked weird in the EVF. It sounds just as it should, imo.
Blue Fox Posted April 13, 2016

28 minutes ago, Mattias Burling said: "(...) When you set your WB to the 'correct' value of 3500K to match your lights, it looks the same in the EVF and in real life. When you kept the 'wrong' WB of 5900K matching daylight, it looked weird in the EVF. It sounds just as it should, imo."

No, I don't think so. I agree that if I use the same WB for both scenes (noon and indoors), at least one of the images will have a color cast when seen on a screen afterwards. But when using the EVF, my eyes already correct for the difference in lighting (in both scenes white objects looked rather white to me), so shouldn't they also correct for the color cast in the EVF? It looks like either the camera does some strange processing OR my eyes don't correct the image in the EVF (or on the camera LCD). Both seem very strange to me, as I see no reason why my eyes would correct the WB of one object differently from another object, both in my view and close to each other. Anyway, how do you set your white balance (and do you correct it in post)?
Ivanhurba Posted April 13, 2016

14 minutes ago, Blue Fox said: "If white light hitting the sensor with a temperature of <x> is converted (with certain settings, for example 5900K) to RGB and shown in the EVF, and in that case the EVF also shows light with a temperature of <x>, shouldn't that same EVF with the same settings output a temperature of <y> when light with a temperature of <y> hits the sensor?"

You set the WB so that X = white. If the temperature of your light changes to Y, your EVF is still set so that X = white, so Y will have a cast: warmer if Y is a lower color temperature, bluish if it is higher. The EVF won't compensate for that; your eye will, because you have AWB on all the time, as you said. Also, remember never to trust EVFs or LCDs completely. Sometimes they have a cast and can't be color calibrated, so you'll have to learn their bias and account for it while you're shooting, or you'll only discover the differences once the footage reaches the edit phase.
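A minimal numpy sketch of Ivanhurba's point, using made-up channel balances for the two light sources (the numbers are purely illustrative, not measured): a fixed WB setting is effectively a fixed set of per-channel gains, so when the light changes, the same gains no longer produce neutral grey, and the EVF faithfully shows that cast.

```python
import numpy as np

# Rough, illustrative channel balances for two light sources, normalised so
# that green = 1.0. Placeholder numbers, not measured values.
daylight_5900k = np.array([1.00, 1.00, 0.98])   # near neutral
tungsten_3500k = np.array([1.00, 0.78, 0.55])   # much weaker in blue

# A "daylight" WB setting amounts to per-channel gains chosen so that the
# daylight source renders as neutral grey.
wb_gains = 1.0 / daylight_5900k

white_card_daylight = daylight_5900k * wb_gains   # ~[1.00, 1.00, 1.00] -> white
white_card_tungsten = tungsten_3500k * wb_gains   # ~[1.00, 0.78, 0.56] -> orange cast

print("white card, daylight, daylight WB:", white_card_daylight.round(2))
print("white card, tungsten, daylight WB:", white_card_tungsten.round(2))
```

The EVF only ever displays the result of those fixed gains; your eyes keep adapting the room around it away, which is why the mismatch is so visible.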
Mattias Burling Posted April 13, 2016

5 minutes ago, Blue Fox said: "No, I don't think so. I agree that if I use the same WB for both scenes (noon and indoors), at least one of the images will have a color cast when seen on a screen afterwards."

Your EVF is that "screen afterwards". That's what an EVF does: it's just a screen showing the video you are about to record. It's not an eye, and it's not meant to match what your eye sees. It's made to show what the camera sees.
DayRaven Posted April 13, 2016

Imagine you have a card that is a theoretically perfect white. Any visible wavelength which strikes the card is reflected; none are absorbed.

The sunshine has only a few wavelengths missing or reduced, so let's say photons with wavelengths R-O-Y-G-B-I-V (Red, Orange, Yellow, Green, Blue, Indigo & Violet) leave the sun. They then strike our atmosphere. As we know by looking, the atmosphere scatters blue more than red, but at noon the sunlight passes through relatively little atmosphere before hitting your eyes, so at noon the full complement of ROYGBIV hits your card and bounces into your camera.

Now it's nighttime. Very little light from the sun is around, so you switch on your lights at home. These lights are incandescent, or maybe an energy-saver bulb made to replicate the yellow light incandescents produce. The light is giving out red, orange, yellow and green in equal proportion to the sun, but it isn't producing as much blue, indigo and violet. Thus we can write that it gives out ROYGbiv. You bring your theoretical card indoors, turn on the lights, and it reflects ROYGbiv: less of the "cooler colours". You look at the card. Our brains are very clever; yours automatically balances the picture. You know the card is supposed to be white, so your brain makes it white. Out of the corner of your eye, you see someone walk past your window. They look blue; the whole world outside looks blue, an anomaly of this automatic white balance.

We point the camera at our card. Our camera is in manual mode and is not so clever. It sees ROYGbiv bouncing off the card. If you had put a yellow card in front of it in the noon sun, it would also have seen ROYGbiv bouncing from that. As our camera is in manual mode, it does not understand the context of the picture. It has no cultural reference or memory of this card being white; it displays it as yellow because it sees less biv and more ROYG.

Fortunately, our camera has an adjustment, so we can set a temperature. Skip the part where the scale we use is derived from black-body radiation; just know that we can set our camera to be more sensitive to blue or to red, and we tell it how sensitive to be to either with a scale measured in kelvin. We know our indoor light is producing a temperature of 2800K, so when we set our camera to that, our white card looks white again. However, when we take it back out into the noon sun the next day, our camera setting is telling it to be especially sensitive to blues (because our bulb was producing few blues), so our image looks blue. We dial it back up to 5600K, and at that point it balances the sensitivity to blues and reds equally. The image looks correct again.

-----

I know you know most of that, but it's important to understand that in reality there are wavelengths of light missing; our brains compensate for that, but the sensor of the camera shows what is really there. When this gets passed to the processor, a white balance is applied; this is part and parcel of turning electronic voltages into colours. When you look at the screen on the camera, your white-balanced eye sees the output of the monitor, and your brain is both clever enough not to confuse what you see on screen with the real object and stupid enough to try to white balance to it in the section of your vision the screen occupies. Badly. Also, the screen being yet another light source with its own colour temperature just adds to the brain's problems.
As you have noticed, this is not reliable, and if you spend a long time looking through a viewfinder, the output of that eye adjusts to compensate. Similarly, if you ask people who have watched a movie with a strong colour cast throughout, they often didn't notice it because their eyes adjusted after a few minutes. That doesn't mean you will be able to balance the EVF and the scene to each other, though. Theoretically yes, if you could hold your attention rigidly on both objects simultaneously without anything else moving in your field of view, but in reality that is an impossible thing to work towards. Basically, don't trust what you see: your brain is working against you by trying to work for you!

As for the best method, well, the most accurate is to have a good light meter and a grey card, and to use them religiously, every shot. The most convenient is to stick it in auto WB. You can grey-card your camera and use its inbuilt metering system, and adjust the WB only when the lighting changes drastically. I expect there are more methods; these are the ones that work for me, depending on how important quality is, how quick I need to be, etc. And yes, the whole point of a grey card is to give your camera a surface with no colouration of its own, so it bounces every wavelength back accurately to be metered - it's trying to be our theoretically perfect card. They can be surprisingly expensive, but as we've learnt, we can't trust our eyes when they tell us that the piece of paper we are using is white - it may be slightly yellow, it may be slightly blue, even if we see it as the purest white, and that's going to screw us over by a couple of hundred K.

Don't try to go for reality either. For starters, actual reality looks terrible and none of us can see it; secondly, we all develop our internal WB differently, with variation depending on our genes and our upbringing, and reality for us changes constantly over time as our brains keep adjusting. Go for what looks good to you on the day.
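To make DayRaven's "more sensitive to blue or to red" idea concrete, here is a small Python sketch built only on Planck's black-body law. The three wavelengths are rough stand-ins for the peaks of a sensor's R, G and B channels (real cameras have broad spectral response curves, so treat this strictly as an illustration): a warm source genuinely contains more red and less blue energy, and dialling in a Kelvin value amounts to applying the inverse of that balance as per-channel gain.

```python
import numpy as np

# Physical constants
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Spectral radiance of a black body at one wavelength (Planck's law)."""
    return (2 * h * c**2 / wavelength_m**5) / np.expm1(h * c / (wavelength_m * k * temp_k))

# Very rough stand-ins for the peaks of the sensor's R, G and B channels (metres).
wavelengths = {"R": 600e-9, "G": 530e-9, "B": 460e-9}

def relative_rgb(temp_k):
    """Relative R, G, B energy of a black-body source, normalised to green."""
    rgb = np.array([planck(w, temp_k) for w in wavelengths.values()])
    return rgb / rgb[1]

print("5600 K source:", relative_rgb(5600).round(2))  # roughly balanced channels
print("2800 K source:", relative_rgb(2800).round(2))  # heavy in red, weak in blue

# Setting the camera's WB to a given temperature is, in effect, applying the
# inverse of that source's balance as per-channel gain:
wb_gains_2800 = 1.0 / relative_rgb(2800)
print("gains implied by a 2800 K setting:", wb_gains_2800.round(2))  # boosts blue, tames red
```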
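And a sketch of the grey-card idea as it would be applied to linear image data (the function names and the synthetic test scene are invented for illustration; in camera the same principle drives the custom/preset WB function): average the pixels over the card, then scale the channels so the card comes out neutral.

```python
import numpy as np

def grey_card_gains(image, card_region):
    """Derive per-channel WB gains from a grey-card patch in a linear RGB image.

    image:       H x W x 3 float array of linear sensor values (no gamma).
    card_region: (y0, y1, x0, x1) bounds covering only the grey card.
    """
    y0, y1, x0, x1 = card_region
    patch_mean = image[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
    # A neutral card should read R = G = B; scale red and blue to match green.
    return patch_mean[1] / patch_mean

def apply_wb(image, gains):
    """Apply per-channel gains and clip back into the 0..1 range."""
    return np.clip(image * gains, 0.0, 1.0)

# Synthetic example: a flat mid-grey scene lit by a warm source.
scene = np.full((100, 100, 3), 0.5) * np.array([1.0, 0.8, 0.6])
gains = grey_card_gains(scene, (10, 30, 10, 30))
balanced = apply_wb(scene, gains)

print("gains:", gains.round(2))                     # ~[0.8, 1.0, 1.33]
print("balanced pixel:", balanced[0, 0].round(2))   # ~[0.4, 0.4, 0.4] -> neutral
```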
AKH Posted April 13, 2016

Not about WB as such, but it does demonstrate how your brain can work against you: https://en.wikipedia.org/wiki/Checker_shadow_illusion
Blue Fox Posted April 13, 2016

2 hours ago, DayRaven said: (...)

Thank you very much for your post, DayRaven. I really appreciate your time and helpful input. I like your ROYGBIV example. So to conclude, I would say:

- I shouldn't (always) believe my eyes, because the brain takes multiple light sources (screen, scene, ...) in one view into account and corrects for them all, differently for each object.
- Thus the 'color matching' method I used is complete nonsense and I shouldn't use it anymore (or at least I should think twice before using it).
- It's not a big problem when my WB isn't 100% perfect; I should just go for a good look in post.
DayRaven Posted April 13, 2016

On 13/04/2016 at 8:20 PM, Blue Fox said: "I shouldn't (always) believe my eyes (...) the 'color matching' method I used is complete nonsense (...) it's not a big problem when my WB isn't 100% perfect, I should just go for a good look in post."

It's not quite "differently for each object" - your brain can correct zones differently, but it's not great at it, and things like hormone levels can affect it greatly; a simple fight-or-flight state can turn your vision black and white in some zones!

Definitely think twice. Depending on your sensitivity to WB issues it's not really reliable, and it sounds like you are sensitive to them.

Try to get it perfect for you in camera, especially with 8 bits, but don't expect everyone else to see "perfect for you" as perfect for them! Even when you're trying to shoot a realistic scene colour-wise, you're still making artistic decisions, especially with the limitations of the technology, and some people just have slightly different expectations of correct WB.

It's definitely worth investing in a colour card with flesh-tone panels to play with. Set up a few scenes, lit in different ways with the card in them, so you can see and get a feel for how your camera handles your white balance adjustments - experience and practice is the real key.
Blue Fox Posted April 14, 2016

I'll experiment with color temperature! As I wasn't sure yet by how much our brains correct the WB, I made a render with Blender and Gimp. The two small squares have, of course, exactly the same color in the rendered image. While not 100% scientific, it does indeed fool the brain (look at it full screen and with one eye to make it look 3D).
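For anyone who wants to reproduce that kind of test without Blender, a few lines of Python with Pillow are enough (the sizes and colours below are arbitrary): two patches with exactly the same RGB value, placed on a warm and a cool background, are usually read by the brain as slightly different colours.

```python
from PIL import Image, ImageDraw

# Two identical mid-grey patches on a warm and a cool background.
# Viewed full screen, the patches are the same RGB value but tend to
# look different because the surround biases the brain's white balance.
W, H = 800, 400
img = Image.new("RGB", (W, H))
draw = ImageDraw.Draw(img)

draw.rectangle([0, 0, W // 2, H], fill=(200, 150, 100))    # warm half
draw.rectangle([W // 2, 0, W, H], fill=(100, 150, 200))    # cool half

patch = (150, 150, 150)                                    # identical grey value
draw.rectangle([150, 150, 250, 250], fill=patch)           # patch on the warm side
draw.rectangle([550, 150, 650, 250], fill=patch)           # patch on the cool side

img.save("contrast_test.png")
```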