
kye

Members
  • Posts: 7,817
  • Joined
  • Last visited

Everything posted by kye

  1. This is what I used to think, but consider film stocks and their built-in tints (IIRC some stocks had a magenta/green tint instead of yellow/cyan). Here's the issue: every film ever shot on those stocks would have this look baked in. And yet films shot on film didn't look like they had only two colours, or look boring, or like they were hiding a lot. So if film stocks had these tints and didn't look like the POS that most orange/teal grades are, then I figure I must be missing something. Thus this thread. Maybe we're not missing anything and in the film days they just designed sets and lighting to compensate, but maybe not. I'm nowhere near done with this
  2. Let's start with film emulation. Resolve comes with a bunch of Film Emulation LUTs; let's play with the Kodak 2383 D60 LUT. Firstly, here's a greyscale to see what it does to luminance. The first image is unprocessed, and so shows nothing on the vectorscope as there is no colour. The second clearly shows a push towards orange in the highlights and a push towards teal in the shadows.

If we change to a smoother gradient and zoom into the vectorscope further (by saturating the image hugely), we get this. From this we can see that it saturates the upper and lower mids more than the highlights and shadows, and that it isn't a straight application of two hues - the hue varies. If we crop out the highlights and shadows to confirm which parts of the image map to which parts of the vectorscope, this is what we get, which confirms that the 'hook' parts of the vectorscope were the highlights and shadows. So there are changes in both saturation and hue applied to greyscale tones, and this is the OT look in this film stock.

I suggested to the guys at LiftGammaGain that the film emulations in Resolve must be pretty good and was met with skepticism. However, if we assume that any lack of rigour on the part of whoever created these LUTs would tend towards making them overly simplistic rather than overly complex (a relatively safe assumption, but an assumption all the same), then that suggests this type of non-linear application of tint is likely present in the film stocks themselves.

So, what does this LUT do to colour? This is a very handy LUT stress-test image courtesy of TrueColor.us. The Before image shows that the test image has hues in line with the primary reference markers on the vectorscope, and that all the lines are straight, indicating equal saturation across the image.
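To make the "warm highlights, cool shadows" behaviour concrete, here's a minimal NumPy sketch of a luminance-dependent split-tone. The tint colours and strength are illustrative guesses, not values measured from the 2383 LUT:

```python
# Sketch of a luminance-dependent split-tone (orange highlights, teal
# shadows). Tint colours and `strength` are made-up illustrative values.
import numpy as np

def split_tone(rgb, warm=(1.0, 0.6, 0.2), cool=(0.2, 0.6, 1.0), strength=0.08):
    """Push highlights toward `warm` and shadows toward `cool`.

    rgb: float array of shape (..., 3), values in [0, 1].
    """
    rgb = np.asarray(rgb, dtype=float)
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])     # Rec.709 luma
    t = 2.0 * luma - 1.0                                # -1 shadows .. +1 highlights
    warm = np.asarray(warm) - np.mean(warm)             # zero-mean tints so
    cool = np.asarray(cool) - np.mean(cool)             # brightness is preserved
    tint = np.where(t[..., None] >= 0, warm, cool)
    out = rgb + strength * np.abs(t[..., None]) * tint
    return np.clip(out, 0.0, 1.0)

grey = np.linspace(0, 1, 5)[:, None].repeat(3, axis=1)  # greyscale ramp
graded = split_tone(grey)
```

Running this on a greyscale ramp reproduces the vectorscope behaviour above: the bright end comes out warmer than it is blue, the dark end cooler, and the midpoint is untouched.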
The After image shows a number of interesting things:
  • The most saturated areas of the image have reduced saturation but the mid-levels of saturation are increased, giving a non-linear saturation response that tends to increase saturation in the image without clipping anything - very nice, but not relevant to the OT look
  • In terms of relative saturation, Yellow is the most saturated, Cyan is next, Red and Magenta are in the middle of the range, and Blue and Green are the least saturated
  • In terms of Hue, Cyan got pushed towards blue a bit, Yellow got pushed a bit orange, Magenta got pushed a little red, and Red, Green and Blue seemed unaffected

@Juan Melara made an excellent video re-creating this LUT (he did the D65 version, which has a warmer white-point, but the overall look should be the same). The video is interesting as he uses a range of techniques that apply OT elements to the image:
  • A set of curves that apply a cooler tint to the shadows and a warmer tint to the highlights; by having different curves for the Red and Green channels he gets some Hue variation across that range too
  • A Hue vs Sat curve that saturates Yellow and Cyan above the other hues
  • A Hue vs Hue curve that pushes Yellow slightly towards Orange, Green towards Cyan, and Blue towards Cyan (up in the graph pushes colours left on the rainbow in the background of the chart)
  • Two YUV nodes, each with a separate Key, that are too complicated to explain easily here but affect both the hue and saturation of the colours in the image

Juan also has the Resolve PowerGrade available for download for free on his website, so check it out: https://juanmelara.com.au - it's in his store, but it's free.

So film tends to have elements of OT. Here are some additional film emulation LUTs in Resolve for comparison - note that all of them have Yellow and Cyan as the most saturated primaries.
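The non-linear saturation response in the first point (boost the mids, roll off the top so nothing clips) can be sketched as a simple "Sat vs Sat" curve. This toy version is an arbitrary illustration, not a fit to the LUT:

```python
# Toy "Sat vs Sat" curve: gain at low/mid saturation, soft roll-off near
# maximum so saturated colours don't clip. `boost` is an arbitrary value.
import numpy as np

def sat_curve(s, boost=1.4):
    """Map saturation in [0, 1] to [0, 1] with slope `boost` at 0
    and slope (2 - boost) at 1, keeping both endpoints fixed."""
    s = np.asarray(s, dtype=float)
    return np.clip(boost * s - (boost - 1.0) * s**2, 0.0, 1.0)

s_in = np.linspace(0.0, 1.0, 11)
s_out = sat_curve(s_in)
```

Mid-level saturation (0.5 in) comes out at 0.6, while fully saturated input still maps to exactly 1.0, which is the "increase saturation without clipping" behaviour seen on the vectorscope.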
  3. The Orange and Teal look is famous, and I've seen it in many different contexts. This thread is about trying to make sense of it: specifically, why we might do it, and what the best ways of doing it are. From what I can tell:
  • It simulates golden hour (where direct sunlight is warm and the shadows are cool as they're lit by the blue sky)
  • It creates more contrast between skin-tones and darker background elements, making the people in the shot stand out more
  • It is also part of the look of (some) film stocks, so it's mixed up in the retro aesthetic, and of course the "cinematic footage" trope

There might be other reasons to do it too - if you can think of some, please let me know. I've played with the look in the past and I just find that when I apply it to my footage it looks awful. Film, on the other hand, often looks wonderful with bright saturated colours, which I like a great deal. This probably means I'm not doing it right, and that's probably because I don't understand it well enough - thus this thread. As usual, more to come...
  4. kye

    Lenses

    No experience with that lens, but I agree with @TheRenaissanceMan about it covering some very useful focal lengths. You should be able to search places like Flickr to see many example images - being an L lens there might be some reviews bouncing around too. Even if it is "optically middling", maybe that's a good thing? Lots of people are fans of vintage glass as it takes the digital edge off modern cameras, especially those with internal sharpening and H.264 compression. Being an L lens it should still be really nice though - my 70-210 F4 FD lens (not L) was lower down the range and is quite old now, and it's still lovely to use, so Canon has been putting out high quality glass for a very long time.
  5. I haven't seen that one before... How hilarious!!
  6. Yes, although it's a tech board more than an art board - I'm assuming that's how Andrew wants it. My attempts to get people to shoot more were only mildly successful, but from my own perspective I'm seeing enough shortcomings in what I shoot that I don't need other people to point out where I need to improve, lol. I'm intending to come back to a lot of the threads I've started, but for the moment I'm just chilling for the holiday and enjoying relaxing. I'm about to start a new thread on the Orange/Teal look: how it relates to film-making, how it relates to film, and how to do it in post, as I've been thinking a lot about that recently and have fallen down the rabbit-hole pretty far (e.g. YUV channel mixer vs Hue vs Sat vs Hue vs Hue vs RGB curves... that far!).
  7. Dude, that was hilarious!! Nice work I think socks and sandals in the ocean was my favourite bit, but the pause for sub was also right up there. The effect of stabilising on your head worked really well actually, I'm sure it's 'been done' already but definitely looks cool.
  8. The way I see it, they haven't died out at all. They're being used for over-capture - 360 cameras are almost good enough for that now, and I've started to see people using them in "real" productions. For example, check out Tiny House Nation on Netflix, which used a GoPro Fusion to get multiple camera angles and action/reaction shots in and around the tiny houses they're filming. 3D will also "come back" because it's a technology looking for a use and we haven't worked out what it's really for yet.

Even if 3D never becomes a publishing format and stays in the over-capture space, AI will really benefit from having multiple cameras with varying amounts of parallax. For example, if you have a single 'normal' camera plus a time-of-flight/normal camera pairing to the left and another pairing to the right, then you get excellent depth information, you have a second (or third) perspective for when objects close to the camera block one view, and you can see slightly behind the subject. That information can be used to change perspective slightly - for 3D camera adjustment in offset stabilisation, for 3D effects (I've seen Facebook images that use the portrait mode of the phone to create an image with multiple planes that animate when you scroll or tilt/rotate the phone), or even just to see what's slightly behind the subject so you can blur it for better bokeh simulation. The possibilities are endless.

If you're going to stay current then you're going to have to separate the capture format from the publishing format. Computational photography separates these by definition, by processing the input before it becomes the output.
  9. I agree with much of what @Oliver Daniel suggested, including IBIS and EIS taking over from gimbals. I think the technical solution might look very different though, but let me set some context from the other thread about how far we've come in the last decade. In the last ten years:
  • the 5Dii accidentally started the DSLR revolution
  • the BMCC gave RAW 2.5K to the humble masses
  • we got 3D consumer cameras
  • we got 360 consumer cameras
  • we got face-recognition AF, then specific-face-recognition AF, then eye AF
  • we got computational photography that used multiple exposures from one camera
  • we got computational photography that used multiple dedicated cameras (and other devices like time-of-flight sensors), and did so completely invisibly
  • we got digital re-lighting
  • we got the above four points in the most popular camera on the planet
  • we got 6K50 RAW

So, from before the DSLR revolution to 6K50 RAW and AI in the most popular and accessible camera on the planet, in a single decade. I think we should take some cues from The Expanse and look at the floating camera that appears in Season 4. I think consumer cameras will move towards being smart and simple to use for the non-expert, as phones continue to eat the entire photography market from the bottom up. They've basically killed the point-and-shoot and will continue to eat the low-end DSLR/MILC range and up. So I think we'll get more and more 360 3D setups by default, in an over-capture situation where devices have cameras on the front/back/sides/top/bottom/corners, which will mean they can do EIS perfectly in post. That will eliminate all rotational instability, but not physical movement in any direction (think of how gimbals bob up and down when people walk with them). That will be addressed via AI, as the device will be taking the image apart, processing it in 3D, then putting it back together again.

It won't take much parallax adjustment in post to stabilise a few inches of travel when objects aren't that close to the device. We won't get floating cameras though - that would require some pretty significant advances in physics! This AI will enable computational DoF and other things, but I don't think these will matter much, as I think shallow DoF is a trend that will decline gradually. If that surprises you, I'd suggest watching the excellent Filmmaker IQ video about the history of DoF, and take note that there was a period in cinema history when everyone wanted deep DoF: once the tech made it available, the cinematic leaders of the time used it to tell stories where the plot advanced in the foreground, mid-ground and background simultaneously, and everyone wanted it. The normal folks of today view shallow DoF (and nice colours and lighting design and composition, for that matter) as indicating something is fake, and it becomes associated with TV, movies, and large companies trying to PR you into loving them while they deforest the Amazon and poison your drinking water. The look of authenticity is the look of a smartphone, because that's where unscripted and unedited stories come from. The last decade started with barely a smartphone; now the smartphone is ubiquitous, and so developed that they basically all look the same. Camera phones were barely capable a decade ago; now we've had the first feature films shot on them by famous directors (publicity stunt or not). People over-estimate what can happen in a year, but under-estimate what can happen in a decade.
  10. kye

    Sirui anamorphic

    I'd suggest that it wouldn't work so well trying to make such a large change in hue. The problem is the same as greenscreening, except that this example has extremely soft edges, so getting the key right would be tricky and the edges would probably have strange halos. Take a few screenshots of one of the videos above and try it. I'd suggest grabbing a key, blurring it vertically in one node, blurring it horizontally very heavily in the next node, and then changing the colour based on that. Not sure if PP or FCPX will let you do that workflow, but it's pretty easy in Resolve Studio.
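The blur-the-key-then-shift-the-colour idea described above can be sketched in NumPy, node names aside: build a hard key, soften it with a light vertical and heavy horizontal blur, then use it to weight a colour push. All blur sizes and push values here are illustrative, not anything from an NLE:

```python
# Rough sketch of the soft-key workflow: key -> vertical blur ->
# heavy horizontal blur -> keyed colour push. Values are illustrative.
import numpy as np

def box_blur_1d(img, radius, axis):
    """Cheap box blur along one axis (the 'blur in one node' step)."""
    n = 2 * radius + 1
    kernel = np.ones(n) / n
    return np.apply_along_axis(
        lambda m: np.convolve(m, kernel, mode="same"), axis, img)

def soft_hue_push(rgb, key, push=(0.1, 0.0, -0.1)):
    """Shift colour where `key` is strong; the key is blurred first so
    the colour change feathers out instead of leaving a hard halo."""
    key = box_blur_1d(key, radius=2, axis=0)        # blur vertically, lightly
    key = box_blur_1d(key, radius=8, axis=1)        # blur horizontally, heavily
    out = rgb + key[..., None] * np.asarray(push)   # push R up, B down (warm)
    return np.clip(out, 0.0, 1.0)

img = np.full((32, 32, 3), 0.5)                     # flat grey frame
hard_key = np.zeros((32, 32))
hard_key[:, 16:] = 1.0                              # hard-edged key, right half
graded = soft_hue_push(img, hard_key)
```

After the blurs, the colour shift ramps up gradually across the key edge rather than switching on at a single pixel column, which is exactly why the hard key alone would look haloed.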
  11. well, ok, just a few...
  • You can adapt almost anything: it's flexible
  • Portable - this is my dual camera setup for travel - pic taken on my towel on my lap in the tour bus doing 80km/h on the beach...
  • And the files are crazy gradable - for example, if we take this image here and try to break it, we basically can't do it... which means you're free to make whatever images you want
  12. For me it's the GH5. Not perfect, but a real workhorse, and it was really ahead of its time. IBIS means I can shoot hand-held without being limited to a hand-held look, 10-bit internal gives hugely flexible footage without needing an external recorder, 4K60 and VFR up to 1080p180 are great, the MFT mount makes adapting almost anything super easy, the flippy screen and EVF are really useful... the list goes on. That's my personal choice, but if I had to pick the most significant cameras in general it would be the BMPCC and BMCC, as I think they moved the goal posts forward by an enormous margin, and many of the features we all enjoy today were inspired by that shift. Camera manufacturers have historically been very traditional and reluctant to give anything except the smallest possible improvement on newer models, and this shook up the industry a bit and showed there is room for new players to be profitable and innovative.
  13. I was thinking that too. Even a stereo pair placed above the camera and mixed in would give some sense of space. The best solution would be to decode a 3D signal to binaural output based on your viewing angle, but do we know if the platforms support this kind of encoded sound, or if they just pump out a stereo audio track? I figure we're in the infancy of this stuff and it will get there, but I'm not really sure how far along the tech is right now with respect to audio. I do know that in terms of adjustability it's normally pretty rubbish - I have MobileVRStation for iOS and the number of options in it is ridiculous, while the number of options in most online streaming platforms is zero.
  14. Great intro video from Seven Dovey about 180 VR film-making. He talks about the gear, framing and compositions, sound, and touches on editing too. He's shot a few films in this format and they're coming soon apparently so that will be interesting.
  15. kye

    Extreme telephoto work

    Finished processing the two timelapses I took. Resolve doesn't read RW2 files from the GH5, so I had to have a couple of goes at processing them, but I discovered that the Adobe DNG Converter creates 16-bit DNG files, which keeps the bit-depth and DR of the RAW files - useful when you're clipping things this hard. Processing involved stabilisation, setting the black and white points, and a key of just the clipped areas, which I pushed towards orange so they weren't digital white. I'm thinking I'll do one where I don't clip the sun and see how it looks, although I suspect everything else will just be black - I'd be happy to be proven wrong. In both of them the edges of the sun appear blocky and horribly compressed, but that's actually due to atmospheric effects; the RAW files are, well, RAW, and the pixel size is much smaller than the rather square-looking edges. I suspect it would make more sense visually with a higher frame rate (the GH5 can't do a time-lapse interval smaller than 1s, which is what these are). I'd try video, but I've zoomed in a bit on these, so the extra resolution seems to be quite useful. Suggestions welcome.
  16. Thanks all. Looks like it's either a "real" video capture solution from BM or equivalent, a "real" monitor that I could also use for shooting, or a mains-powered TV or monitor of some kind. I wouldn't use a monitor that much in my normal shooting, but I would use it for shooting my kid's sports games, although how many more seasons he sticks with that is uncertain.
  17. Thanks all. I was really hoping for a link to a $30 eBay USB dongle. I was kind of thinking of a laptop as a half-way measure between having an on-camera monitor and having a huge TV/monitor on set like is common on big productions. I wouldn't carry a laptop around for my run-n-gun shooting, but if you're shooting in a single location then I don't see why a larger screen wouldn't be useful. I've had heaps of things that I shot on the GH5 that looked fine on the monitor (or the viewfinder, which is higher resolution) but were completely fubar when I saw them on my 32" display, and I'm not sure that a 7" display is really large enough to be a perfect proxy for a 32" 4K display, let alone a 65" 4K TV or a projector setup. And before anyone tells me that big productions use 7" monitors just fine, let me say that I watch Netflix / Prime on my 32" 4K display and I see out-of-focus shots all the time... There should be an HDMI-to-USB chip, and there should be a factory in China pumping out the standard implementation circuit from the spec sheet at a very small markup over the parts cost. Surely?
  18. kye

    Sirui anamorphic

    I think of it much less like marketing and more like market research. There’s a principle in market research that you can ask someone if they would purchase something and get an answer, but it’s only if you ask the person to actually buy that you get their real answer. Lots of people will tell you your idea is good, or that they would buy, to make you feel better, or they don’t like saying no to people, or they’re optimistic, or excited by the idea, etc but it’s not reality. Why design a product, fit out a factory with tooling, manufacture a bunch of products, market the shit out of it, and only then discover that the people in the focus group who said they were excited to buy it were just being nice... ? These crowdfunding campaigns are more valuable to manufacturers in market research than they are in anything else.
  19. kye

    Extreme telephoto work

    @leslie there are lots of ways. You can do the whole image - I'd suggest playing with the Gamma or Gain wheels and seeing if you can get the sun and the smoke the right colours at the same time. The other approach is to pull a key of just the sun; then you could try the same things, or try the Hue vs Hue curve and shift it towards red/orange and away from pink/magenta.
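For the Hue vs Hue suggestion, a toy version using Python's standard colorsys module might look like this. The centre, width and shift values are arbitrary illustrations of rotating pink/magenta hues towards red/orange, not Resolve's parameters:

```python
# Toy "Hue vs Hue" adjustment: hues near magenta/pink get rotated
# towards red/orange. center/width/shift are illustrative guesses.
import colorsys

def hue_vs_hue(rgb, center=0.9, width=0.1, shift=0.08):
    """Rotate hues within `width` of `center` by up to `shift`.

    Hue uses colorsys's 0-1 scale; for a pink near h=0.89, adding to h
    moves it towards red (h=1.0) and away from magenta (h=0.833).
    """
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    d = min(abs(h - center), 1.0 - abs(h - center))  # circular hue distance
    if d < width:
        falloff = 1.0 - d / width                    # full shift at the centre
        h = (h + shift * falloff) % 1.0
    return colorsys.hsv_to_rgb(h, s, v)

pink = (1.0, 0.4, 0.8)           # a magenta-ish sun colour
corrected = hue_vs_hue(pink)     # blue channel drops, colour reads orange-red
```

Hues outside the center/width window fall through untouched, which is the point of Hue vs Hue over a global hue rotation: only the offending pinks move.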
  20. @heart0less - that is a fascinating video. Thanks! What I found interesting about it:
  • They credit the guys at LiftGammaGain (I suspect most similar videos come from people who either don't know about LGG or plainly rip it off without saying)
  • They mention a LUT for all of 2 seconds before talking about everything except colour
  • They made a complete tutorial that didn't include possibly the most significant things that (IMHO) make something look like film
  • They also did something I think was great - they showed the LGG thread. A quick search and now I can go read it!

The thread is only about halation and gate weave (which I had to look up - it's the jumping around that film does, like why the credits move around when being projected). It's also interesting that it was started by Jason Bowdach, whose name was familiar - he'd written one of the reference articles on film emulation that I refer back to on occasion.

So, all that said, here are the other links I've found that are useful if you want to emulate film. To start, this article is very good and talks about non-linearities and saturation behaviour: https://www.provideocoalition.com/film-look-two/ - it also links to a very interesting video showing how the Alexa handles things. Noam Kroll has a decent write-up: https://noamkroll.com/how-to-make-your-digital-footage-look-like-film-in-post-production/ The article from Jason Bowdach covers a lot of ground: https://blog.frame.io/2019/10/21/emulating-film-look/ And lastly, the article the video above references: https://liftgammagain.com/forum/index.php?threads/halation-and-gate-weave.13056/

Of course, after all this, the film look really comes from lighting and composition in the first place, and remember that if you want the film look then it's better to first get someone to look at your film (ha ha), so it should be entertaining in the first instance. Film is probably an infinitely deep rabbit hole...
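Halation, the subject of that LGG thread, is commonly emulated by isolating the highlights, blurring them, tinting the glow warm/red, and adding it back over the image. Here's a toy NumPy sketch of that general recipe - the threshold, radius and tint are guesses, not matched to any film stock:

```python
# Toy halation: threshold highlights -> blur -> warm tint -> add back.
# threshold/radius/tint/amount are illustrative values, not measured ones.
import numpy as np

def halation(rgb, threshold=0.8, radius=3, tint=(1.0, 0.35, 0.1), amount=0.3):
    rgb = np.asarray(rgb, dtype=float)
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
    # Soft mask of just the near-clipped highlights.
    mask = np.clip((luma - threshold) / (1.0 - threshold), 0.0, 1.0)
    # Crude separable box blur spreads the highlight mask outwards.
    n = 2 * radius + 1
    k = np.ones(n) / n
    blur = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 0, mask)
    blur = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 1, blur)
    glow = blur[..., None] * np.asarray(tint) * amount   # warm/red halo
    return np.clip(rgb + glow, 0.0, 1.0)

frame = np.zeros((16, 16, 3))
frame[8, 8] = 1.0                      # one blown-out highlight pixel
out = halation(frame)
```

The blown-out pixel ends up surrounded by a faint reddish halo, while pixels far from any highlight are untouched - the characteristic red edge glow around bright lights on print film.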
  21. kye

    Extreme telephoto work

    Yeah, the focus would be hard - during football season I shoot highlights at my kid's games, so I spend a bit of time manually pulling focus at around 200-400mm equivalent, but you're in a whole other league at 1500mm. One challenge I have with my rig is that it's on a TC and a dumb adapter, and the lens doesn't have a tripod foot, so it's being supported by the camera (I also hold the weight of the lens with the hand that's focusing) - getting some additional lens support would be ideal in future.

That is a really cool video and awesome zooms. Thanks for posting the article - some good tech talk in there, and it makes sense they would have needed a robot: those focal lengths are really difficult to manage, especially with those fast transitions. Luckily the sun doesn't require me to move the camera!

lol, let's hope! One of our neighbours has a two-storey house with a balcony and they're often out there, so I'm not the only one. Of course, I can't really see into other people's backyards from that angle either, so there's that.

Well, not an optical viewfinder anyway! I have no idea at what point the sun is too much, but I'm shooting without filters, although at F4 the lens is hardly super fast.

Thanks - interesting setup. I went with the oats solution as it keeps the camera low to the ground, so air only goes over the top of it rather than around it with lots of turbulence, and the oats also provide multiple points of contact rather than a single point subject to flex and twist. My tripod would have to be at eye-level (and I'm over 6 foot) to see the horizon, so it would be in its least stable configuration.
  22. Interesting film from Blackmagic... Shot on the P6K, post in Resolve 16, and the credits say "Produced in 14 days", so presumably a short-timeframe project. What I think is interesting is that (although it doesn't say so) I suspect the whole production was done in Resolve, which likely means all the VFX are from the Fusion page. That would be pretty cool, as I think the Fusion page has huge untapped potential and we haven't seen it in use yet because the people who really know Fusion are off getting paid, not being on YT. It would be interesting to see some BTS, especially of what happened in post for this.
  23. The other day I noticed the sunset and decided to have a go at filming it at super-telephoto, so that the sun fills a large section of the frame and you see all the atmospheric distortions and stuff. The first attempt was a write-off because I stuffed up the focus. The second attempt was better as I was able to focus properly, and then I closed the aperture a few stops to make sure I got it. I stuffed up the settings as I still had it on my normal auto-exposure and auto-WB, so that's why I'm only posting this still, but I think I'm getting closer. This was shot in 4K, but I think next time I will do a normal time-lapse and get the benefits of RAW and a better ISO, as there is quite a bit of grain in the file.

The setup seems to work though: GH5, Canon FD 70-210 F4 at 210mm F11, Canon FD 2x TC. Set up on a packet of oats for stability in the breeze we had, on top of a pillar on the fence. I suspect that at 840mm equivalent, stability is one of the key factors here, and the oats seemed to do the trick. Once I have it all set up I can probably put a bag of pasta or something on top to further damp it and deflect some of the wind. Yesterday was cloudy so no sunset, but hopefully tomorrow I'll get another go with fixed settings and in time-lapse mode, and I'll be able to post a video instead of just a still.

Anyone else shooting longer than 600mm or 800mm equivalents?
  24. Can I use a laptop as a large portable HDMI monitor, presumably via some kind of USB HDMI-in adapter/dongle and some software that allows viewing (and maybe some cool features like LUTs, false colour, focus peaking etc.)? It seems ridiculous that I'd have to buy an HDMI monitor when that means re-buying the screen, buttons, battery etc. that are already in a laptop. Surely there's an adapter that's cheaper than a monitor with focus peaking and various features? I'd be looking to use it with my GH5 for critical focus when I'm either too far from the camera to see the flippy screen, or when I can't check focus well enough using the (rather mediocre) focus peaking that it delivers.