kye

Everything posted by kye

  1. I think @mercer mentioned using a belt clip for a tape measure at one point...?
  2. kye

    Slowing down 24p

    Recently I've seen a few instances where a TV show or movie slowed down 24p footage, effectively creating a <12fps shot. The interesting thing is that often these productions are modern, shot on Alexas, and could just as easily have shot 50p or 120p+, but didn't. The moments that were slowed down were obvious shots for slow motion, so it's not like the editor went in an unexpected direction that the director or script couldn't have anticipated. The thing that strikes me is that it's a different effect to shooting 50p and slowing it down. A different aesthetic. TBH it seems more timeless and more emotional than the typical 50p, high-emotion, cue the big music, pivotal story moment that is commonly done. It feels more like classic cinema. Have you noticed this? How does it feel to you? Would you do it deliberately?
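To put numbers on that sub-12fps feel, the arithmetic is just the source frame rate multiplied by the playback speed. A throwaway Python sketch of the maths (the function name and values are illustrative only, not taken from any particular production):

    # Effective unique frame rate when retiming footage in a fixed-rate timeline.
    def effective_fps(source_fps: float, speed: float) -> float:
        """Unique source frames shown per second at a given retime speed
        (speed is a fraction, e.g. 0.5 = 50% slow motion)."""
        return source_fps * speed

    # 24p slowed to 50% leaves 12 unique frames per second; each frame is held
    # (or blended/interpolated) across two output frames in a 24p timeline.
    print(effective_fps(24, 0.5))    # 12.0
    print(effective_fps(24, 0.4))    # 9.6 - the sub-12fps look described above
    # 50p conformed to a 24p timeline plays smoothly at 48% speed instead.
    print(effective_fps(50, 24/50))  # 24.0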
  3. TV manufacturers are in the business of selling TVs. If you want to talk about the mis-alignment between what film-makers want, what consumers can see, and what is getting used to sell TVs, then you're wasting your time talking about anything other than resolution, and specifically 4K. Most people can't tell the difference between 1080p and 4K at normal / sensible viewing distances, and most video content even shot in 4K looks awful when viewed close enough to see the horrific compression that gets applied to it by streaming services. HDR is something that I think is worthwhile, and an argument can be made for why that should be something that is worth re-buying your TV for, but 4K in the consumer market is just a gimmick to sell things. I find that the world makes zero sense when you don't look at it from the right angle - the design of a TV isn't to give the best cinematic experience, it's to put dollars into the pockets of electronic company CEOs! Lots of discussions here on the forums are around how depressing the world has become, for a variety of reasons, and I think that a major contributor is the idea that the world should be a certain way, and that that way is different to how it is. This is simply a recipe for frustration and depression. I think Fuzzynormal said it best....
  4. If you care about it that much then I'd say get more discerning friends! My experience is that people see / hear / smell / taste things differently and uniquely, not worse or better. They are likely not sensitive to the things that you are attuned to, and vice versa. When I ask my wife about colour grading she has all kinds of opinions, however she doesn't know the words to use so it's not an easy subject to communicate about, but my overall impression is that she's sensitive to different things than I am, rather than not caring. This was my experience when I was into high-end audio equipment too - it's very common for the wives of audiophiles to be able to tell that you made a minor upgrade without being told, just by noticing it from the other room. The other thing to consider is that much of film-making is the deliberate crafting of things that other people find subconscious or completely unconscious. The difference between good and bad editing, for example, results in the viewer being able to follow the plot vs being confused by certain elements, and this is often accomplished (especially by the great directors / editors who show and don't tell) by the careful arranging of lines, shots, and edits which the viewer will correctly interpret but likely won't be aware of. I suggest that rather than rant on the internet because the world doesn't align to your particular preferences, you take this as an opportunity to learn what they do care about and focus on that. After all, you're making content for other people, and not yourself, right?
  5. I definitely agree with the sentiment "the least moves wins", although I would refine that considerably. I used to find myself basically fighting the footage, unable to get it to look even half-way decent. This was because my skill level was basically zero at that point. As I gradually learned, I started being able to make a few good adjustments before the 'tweaking' started doing more harm than good. Now, I think of it as a skill-level thing - the more skill you have, the more adjustments you can make before you're making it worse and not better. My workflow is now to apply a look (Kodak 2393 LUT with some blurring/texture as shown above is my current favourite but I'm adjusting and optimising over time) and then to WB, adjust levels, and do any localised corrections required, but all underneath the look. The secondaries are normally Hue-v-Hue / Hue-v-Lum / Hue-v-Sat but I will also do local adjustments if required. These are often to match shots rather than significantly push things around. I'm often doing things like darkening / desaturating distracting objects in the background or doing a large and soft power-window to brighten the subject. I regard this as being quite minimal, considering that the 2393 is by far the largest adjustment in the grade and it was created by people far above my skill level, so I'm still responsible for only the minority of the grade. With my control surface I'm able to rip through an edit spending only a short amount of time per shot, making a few relatively repeatable adjustments.
  6. Every camera has great rolloff if the DR is there, just apply a curve. If I can get a compliment on the highlight rolloff from a budget P&S camera from 2009 then there are no excuses! It's also possible to get a nice rolloff on areas that are clipped, although you don't get any detail back from them obviously. The more I learn about colour grading, the more that I realise that grading is in the same category as production design or lighting, you can't expect your footage to look great if you don't do any set dressing, hair or makeup, or just shoot with whatever light happens to be there when you happen to show up, so why would it look great if you didn't do any colour grading either?
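A curve like that is easy to sketch. Here's a minimal numpy illustration of a highlight soft-clip - the knee value and function are arbitrary and purely illustrative, not anything pulled from a specific camera or LUT:

    import numpy as np

    def rolloff(x: np.ndarray, knee: float = 0.7) -> np.ndarray:
        """Leave values below the knee alone and ease everything above it
        towards 1.0 instead of letting it clip abruptly."""
        out = x.copy()
        hi = x > knee
        out[hi] = knee + (1.0 - knee) * (1.0 - np.exp(-(x[hi] - knee) / (1.0 - knee)))
        return out

    img = np.random.rand(4, 4) * 1.4      # fake normalised data with over-range highlights
    print(img.max(), rolloff(img).max())  # the rolled-off version stays below 1.0

In a real grade you'd do the equivalent with a custom curve or the highlight controls, but the shape of the maths is the same: identity below the knee, smooth compression above it.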
  7. kye

    PC Builds

    Been online since 1994 - the strength of my filters should be a testament to that!
  8. Could be, they're very easy to drink! Speaking of Korean TV series, they're surprisingly well produced and image quality is often remarkably high. My wife says the storytelling is also very good, so if you're inclined to a bit of soap then they might be worth a watch. Netflix is full of them if you can navigate their algorithm in the right direction 🙂
  9. Everyone wants to talk about 4K and 6K and 8K and 12K, and to talk about 8-bit vs 10-bit vs 12-bit vs 14-bit, but when it actually comes to the quality of the image, people don't want to know. I was interested in it but found so little online that I ended up having to do my own study on the SNR of various codecs. It's here: You're talking about the quality / performance of the same flavours of codec but with differing encoding methods (software/hardware). I'd bet you won't find anything and you'll have to do your own testing to find answers. I found it almost impossible to get information on the T2 chip that even specified if hardware encoding was supported, and if so, what codecs / bit-depths / subsampling it supported. Good luck, but my advice is to give up asking.
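For what it's worth, this kind of test isn't hard to run yourself. A rough sketch of the general shape of it using ffmpeg's built-in psnr filter - the filenames, codec and bitrate are placeholders, not the settings from the study mentioned above:

    import subprocess

    # Encode the reference with the codec under test (x264 at 8Mbps here),
    # then compare the result against the original, frame by frame.
    subprocess.run(['ffmpeg', '-y', '-i', 'reference.mov',
                    '-c:v', 'libx264', '-b:v', '8M', 'encoded.mp4'], check=True)
    subprocess.run(['ffmpeg', '-i', 'encoded.mp4', '-i', 'reference.mov',
                    '-lavfi', 'psnr', '-f', 'null', '-'], check=True)

Swap the encoder (or the hardware vs software encoding path, if your ffmpeg build exposes both) and compare the reported PSNR figures. It won't tell you anything about the T2 specifically, but it does answer the 'same codec, different encoder' question empirically.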
  10. We bought one of each and tried them all at a dinner party, with my wife (who is heavily into watching Korean TV series) explaining the customs around drinking it, with the eldest person serving first, how to hold the bottle as you pour, who has the glass highest when toasting, etc etc. The only challenge was the 'original' flavour, which reminded me of grappa / methylated spirits! Urrgh.
  11. kye

    PC Builds

    Yeah, that's what I thought you were getting at. Wow... you mean that hiring a photographer (or videographer) isn't just hiring their camera, and that the operator is more than just a technician????? HOLD THE FRONT PAGE!! But seriously... yeah. All this equipment stuff is BS if you're putting it ahead of the creative side. No-one does their best work when fighting with the tech, even if the tech is nice tech. "The acting was uninspired, the lighting was awful, and the story was un-engaging, but overall it was a great film because it was shot on a great camera with high resolution and great colour science" - said no-one ever.
  12. This conversation gets better and better.... The simple choice of wanting 8K output not only completely dominates the hardware that the OP has to use and necessitates a proxy workflow which the OP seems to be against, but now it dominates the choice of NLE! Never mind if I hate the computer, workflow, or software.... I have 8K! I think the OP should really be asking themselves how much 8K is worth in comparison to the other things it looks like they have to trade off. I used to be very 4K-centric and wanted to future-proof my videos as much as possible (which is a genuine thing considering I am basically the family historian, so videos will get more interesting over time rather than less) but as I gradually discovered what I liked in an image and what made the most difference to getting those results, I have gone down to 1080p. My interest in 4K was naive and it was only through learning more about the craft that I realised how little it actually matters. Everyone is different in their priorities, but when priority #1 means having to compromise on #2, #3, #4, and #5... it's a good time to question how that stacks up.
  13. I was advocating for 1080p Prores / DNxHD proxies, so that only requires a 1080p ALL-I capable computer, which these days is almost all of them. If he wants to shoot 8K and master in 8K then good luck to him, it's the editing experience that is the question. Plus who knows what kind of hype around 8K is present in the marketplace these days - until clients work out that 8K is pretty well useless and doesn't improve IQ even if it can be broadcast, there might still be money in offering 8K as a service, and considering the state of film-making in 2020 I understand doing anything to gain a competitive edge.
  14. Soju! I had my first soju experience only the other day, and it was quite delicious 🙂 I also like that the grapefruit one was featured - it was one of the ones that everyone liked the most!
  15. kye

    PC Builds

    The other challenge that @herein2020 was avoiding was that of incompatibility. My dad used to work for a large educational institution and ordered a custom built PC to replace their main file server, so naturally ordered the latest motherboard, CPU, RAM, RAID controller and drives. Long story short, two months after getting it he still hadn't managed to get an OS to install correctly, and neither had the other people on the hundred-page thread talking about the incompatibility, in which multiple people verified that the manufacturers of various components were all blaming each other for the problem and no-one was working on a solution, so my dad did what everyone else in the thread did and gave up. He was lucky enough to be able to lean on their wholesaler to take the equipment back with a full refund, but others weren't so lucky, and the thread continued for another year as various people swapped out parts for other models/brands to see what the best fully-functional system was. Or you just buy something that someone else has already checked for you. There's a reason that many serious software packages are only supported on certain OS versions and certain hardware configurations. It's because their clients value reliability and full-featured service and support rather than getting a 12% improvement on performance.
  16. XMAS DOGGIES!! My only question is - what alcohol did you put in those little barrels? Brandy perhaps, for the holiday season? Whisky for a more straight-up drink? ...Tequila perhaps? 🙂 🙂 🙂
  17. While the C70 has many advantages and attractive features, I think the above is probably the most important factor of any camera because it impacts the level of creativity and inspiration of the video, not just what colour or clarity the pixels are in the output files. I consistently find that creativity evaporates when I feel like I'm fighting the equipment rather than it having my back and supporting the work. In the creative pursuits this is a night-and-day difference, but of course it's also different for all people, so it's about the synergy between the camera and the user rather than one camera suiting everyone. Great stuff!
  18. I thought quite a few folks here liked the images from the Z6 but maybe the timing wasn't right for people to actually get one. Certainly, the colour from Fuji on their latest cameras is very nice, and the eterna colour profile is very nice indeed.
  19. Fair enough. Unfortunately, your budget isn't sized appropriately for the resolutions you're talking about. I think you have three paths forward:
  • Give up on the laptop and add a zero to your budget, making it $20000 instead of $2000, then go find where people are talking about things like multi-GPU watercooling setups, where to put the server racks, and how to run the cables to the control room
  • Do nothing and wait for Apple to release the 16" MBP with their new chipset in it (this could be a few years wait though, and no guarantees about 8K)
  • Work with proxies
Proxies are the free option, at least in dollar terms, and you probably don't need to spend any money to get decent enough performance. I'd suggest rendering 1080p proxies in either Prores HQ or DNxHD HQ. This format should be low enough resolution for a modest machine to work with acceptable performance, but high enough resolution and colour depth that you can do almost all your editing and image processing on the proxy files, and they will be a decent enough approximation of how the footage looks. Things like NR and texture effects would need to be adjusted while looking at the source footage directly, but apart from that you should be able to work with the proxy files and then just swap to the source files and render the project in whatever resolution you want to deliver and master in.
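If the proxy route sounds like a hassle, it's mostly a one-off render step. A rough sketch of doing it outside the NLE with ffmpeg - the paths are placeholders, and Resolve (like most NLEs) can also generate proxies internally, so this is just the command-line equivalent:

    import pathlib
    import subprocess

    SRC = pathlib.Path('source_clips')   # placeholder folder of camera originals
    OUT = pathlib.Path('proxies_1080')
    OUT.mkdir(exist_ok=True)

    for clip in SRC.glob('*.mov'):
        subprocess.run([
            'ffmpeg', '-y', '-i', str(clip),
            '-vf', 'scale=1920:-2',                   # 1080p-wide, keep aspect ratio
            '-c:v', 'prores_ks', '-profile:v', '3',   # profile 3 = ProRes HQ
            '-c:a', 'copy',
            str(OUT / clip.name),
        ], check=True)

Relink to the originals for the final render and the proxies never touch the deliverable.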
  20. There are two ways to buy a computer for video editing. The first is to look at what performance you need and buy something that can deliver that for you, regardless of price. The second is to set a budget and get the most you can for that, accepting whatever level of performance that gives you and working around the limitations. $2000 isn't even in the same universe as the first option, so your only hope is to buy the best performance you can, and then work out the best proxy workflow for your NLE and situation. To get good editing and colour grading performance, your system needs to be capable of maybe 2-4 times (or more) the performance required to play the media you're editing. Even a simple cut requires your computer to load the next clip, skip to the in point of the next clip, and if it's IPB then it needs to trace back in the file to the previous keyframe, then render each frame from there forwards until it knows what the first frame on your timeline looks like, and it needs to do all that while playing the previous clip. This doesn't include putting a grade on the clips once they're decoded, or even having to process multiple frames for things like temporal NR, etc. Playing a file is one thing, editing is something else entirely. By the way, Hollywood films are regularly shot in 2.8K or 3.2K and processed and delivered in 2K, so trying to convince someone that you need an 8K workflow is basically saying you need 16 times the image quality of a multi-million dollar Hollywood film, so good luck with that. Most systems work just fine with 2K by the way....
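The back-of-the-envelope pixel maths behind that '16 times' figure, if anyone wants to check it (a throwaway snippet that ignores bit depth and codec overhead):

    # Pixels per frame, in megapixels.
    def megapixels(w: int, h: int) -> float:
        return w * h / 1e6

    uhd_8k  = megapixels(7680, 4320)   # ~33.2 MP
    dci_2k  = megapixels(2048, 1080)   # ~2.2 MP
    full_hd = megapixels(1920, 1080)   # ~2.1 MP
    print(uhd_8k / dci_2k)             # ~15x the pixels to decode, grade and render
    print(uhd_8k / full_hd)            # 16x against 1080p

Every one of those pixels has to be decoded, graded and rendered in real time for smooth playback, which is why the editing-headroom multiplier above matters so much.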
  21. For some time I've been thinking about the texture of film. I've also been thinking about the texture of RAW images, both 4K and also 1080p. And I've been thinking of the texture of low-bitrate cheap digital camera images, and how much I don't like it. Last night I watched Knives Out, which was very entertaining, but of note was that it was shot by Steve Yedlin ASC, and that it was shot in 2.8K RAW and mastered in 2K. For those that aren't aware, Steve Yedlin is basically a genius, and his website takes on all the good topics like sensor size, colour science, resolution, and others, and does so with A/B testing, logic and actual math. If someone disagrees with Steve, they have their work cut out convincing me that they know something Steve doesn't!
This inspired me to do some tests on processing images with the goal being to create a nice timeless texture. Film has a nice analog but very imperfect feel with grain (both the random noise grain but also the grain size of the film itself, which controls resolution). Highly-compressed images from cheap cameras have a cheap and nasty texture, often called digititis, which is to be avoided where possible. RAW images don't feel analog, but they don't feel digital in a digititis way either. They're somewhere in-between, but in a super clean direction rather than having distortions, with film having film grain which isn't always viewed as a negative distortion, and highly-compressed digital having compression artefacts which are always viewed as a negative distortion.
Here's the first test, which is based on taking a few random still images from the net and adding various blur and grain to see what we can do to change the texture of them. The images are 4-7K and offer varying levels of sharpness. The processing was a simple Gaussian Blur in Resolve, at 0.1 / 0.15 / 0.2 settings, and adding film grain to kind of match. On the export file the 0.1 blur does basically nothing, the 0.15 blur is a little heavy handed, and the 0.2 looks like 8mm film, so very stylised!
The video starts with each image zoomed in significantly, both so that you can see the original resolution in the file, but also so that you can get a sense of how having extra resolution (by including more of the source file in the frame) changes the aesthetic. Interestingly, most of the images look quite analog when zoomed in a lot, which may be as much to do with the lens resolution and artefacts being exposed as with the resolution of the file itself. My impression of the zooming test is that the images start looking very retro (at 5X all their flaws are exposed) but transition to a very clean and digital aesthetic. The 0.15 blur seems to take that impression away, and with the film grain added it almost looks like an optical pull-out on film was shot of a printed photograph. In a sense they start looking very analog and at some point the blur I'm applying becomes the limiting factor and so the image doesn't progress beyond a certain level of 'digitalness'. In the sections where I faded between the processed and unprocessed image I found it interesting that the digitalness doesn't kick in until quite late in the fade, which shows the impact of blurring the image and putting it on top of the unprocessed image, which is an alternate approach to blurring the source image directly. I think both are interesting strategies that can be used.
Now obviously I still need to do tests on footage I have shot, considering that I have footage across a range of cameras, including XC10 4K, GH5 4K, GH5 1080p, GoPro 1080p, iPhone 4K, and others. That'll be a future test, but I've played in this space before, trying to blur away sharpening/compression artefacts. There are limits to what you can do to 'clean up' a compressed file, but depending on how much you are willing to degrade the IQ, much is possible. For example, here are the graded and ungraded versions of the film I shot for the EOSHD cheap camera challenge 18 months ago. These were shot on the mighty Fujifilm J20 in glorious 640x480, or as I prefer to call it 0.6K.... IIRC someone even commented on the nice highlight rolloff that the video had. All credit goes to the Fuji colour science 😂😂😂 Obviously I pulled out all the stops on that one, but it shows what is possible, and adding blur and grain was a huge part of what improved the image from what is certain to be several orders of magnitude worse than what anyone is working with these days, unless you're making a film using 90s security camera footage or something.
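For anyone who wants to play with the same idea outside of Resolve, here's a crude Python/OpenCV sketch of the blur-then-grain recipe - the sigma and grain values are guesses chosen to show the effect, not a match for the Resolve settings mentioned above:

    import cv2
    import numpy as np

    def blur_and_grain(img, sigma=1.5, grain=0.03):
        """Soften the digital edge with a Gaussian blur, then add monochrome grain."""
        out = cv2.GaussianBlur(img.astype(np.float32) / 255.0, (0, 0), sigma)
        noise = np.random.normal(0.0, grain, out.shape[:2]).astype(np.float32)
        out += noise[..., None]      # same grain on every channel, like silver grain
        return (np.clip(out, 0.0, 1.0) * 255).astype(np.uint8)

    frame = cv2.imread('still.jpg')  # placeholder path
    cv2.imwrite('still_textured.jpg', blur_and_grain(frame))

The blur sets the resolution ceiling (the 'grain size of the film itself' idea from earlier) and the noise sits on top of it, which is why the combination reads as analog rather than just soft.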
  22. I also don't get it, although I suspect this is more because my projects are too disorganised at the start (multiple cameras without timecode) and my process is too disorganised during the edit. One thing that might be useful for you, and which I use all the time, is the Source Tape viewer. It puts all the clips in the selected bin into the viewer in the order that they appear in the Media viewer (ie, you can sort however you like) and you can just scrub through the whole thing selecting in and out points and building a timeline. The alternative to that in the Edit page is having to select a clip in the media viewer, choose in and out points, add it to the timeline, then manually select the next clip. Having to manually select the next clip is a PITA, and I don't think you can do it via key combinations, so it's a mouse - keyboard - mouse - keyboard situation, rather than just hammering away on the keyboard making selects. The impression that I got from their marketing materials and the discussion around launch was that it was for quick turnaround of simpler videos like 30s news spots, vlogs, or other videos that require a very fast turn-around. They even mentioned that the UI has been designed to work well on laptop screens, which further suggests that editing in the field because of fast turn-around times is a thing. Watching how YouTubers clamber over themselves to 'report' from conferences like CES, Apple events, etc, trying to be first comes to mind. If you were a YouTuber or 'reporter' who shoots in 709, does a single-camera interview / piece-to-camera and then puts music and B-Roll over the top, adds end cards (pre-rendered) and then hits export, it would be a huge step up in workflow. I suspect that it's just not made for you. I don't use multi cams, Fusion, or even Fairlight, but the Cut page is still too simple for me.
  23. Welcome to the DR club. Come on in, the water's fine! I think you have three main decisions in front of you - the first is if you want to focus on the Cut or Edit page first, the second is if you want to invest lots of time up front or learn as you go, the third is if you're going to use a hardware controller of some kind. All have pros and cons. I'm not sure how familiar you are with DR, so apologies if you already know this... DR has two main edit pages, the Cut page and Edit page. The Cut page is the all-in-one screen designed for fast turnaround and you can do everything inside it including import - edit - mix - colour - export, but it's got limited functionality. The Edit page is fully-featured but is a bit cumbersome in terms of needing lots of key presses to get to functions etc. The Edit page also only focuses on editing, and is designed to be used in conjunction with the Fairlight and Colour and Delivery pages. I think BM have decided to leave the Edit page mostly alone and are really focusing on the Cut page. For example their new small editing keyboard is targeted at the Cut page but apparently doesn't work that well with the Edit page, with buttons not working there, at least at the current time. I started using DR long before the Cut page, so haven't gotten used to it yet, but if your projects are simpler then it might be worthwhile to get good at the Cut page to rough-out an edit and just jump into the Edit page for a few final things. If you're shooting larger, more complicated projects then it might be good to focus on the Edit page first. Eventually you'll need to learn both as they have advantages. The other major decision is if you take time to watch decently long tutorials, taking notes along the way, or if you want to jump in and just do a few quick tutorials to get up and running. The first approach is probably better in the long run but more painful at the start. I suspect you're going to have three kinds of issues learning it:
  • Finding the basic controls which every NLE has
  • Finding the more advanced things that DR will have but which won't be named the same, so are hard to search for
  • Accomplishing things when DR goes about it a different way than what you're used to (eg, differences in workflow), which will be impossible to search for
Learning deeply / thoroughly at the start will give you all three, whereas learning as you go will leave the latter two subject to chance, and potentially leave you less productive for months or years. Plus it's pretty painful to go through the deep / thorough materials once you've already got the basics, as much will be repeated. If you're getting a hardware controller, at least for editing, then that can steer your other choices. Like I said before, the new Speed Editor editing keyboard is designed to work with the Cut page, so that will steer you in that direction. The other reason I mention it is that it will give you clues about what things are called and the rationale of wider workflow issues, especially if you watch a few tutorials on how to use it as they will cover the specifics as well as at least nod in the direction of the concepts. If you're going to get a hardware controller then now is probably the time, you can always sell it again if it doesn't fit your workflow or you change devices to a different one. The Speed Editor is pretty cheap (in the context of edit controllers) so that might be something worth considering.
Some general advice:
  • Make good use of the Keyboard Customisation - it has presets for other NLEs but is highly customisable, so use that to your advantage
  • RTFM. Even just skim it. It is truly excellent, and is written by a professional writer who has won awards. I open it frequently and keyword searching is very useful, assuming you know what stuff is called. Skim reading it may answer a lot of more abstract / workflow type questions too. It's long though - the current version is 2937 pages, and no I'm not kidding!
  • Google searches often work - I didn't learn deeply and thoroughly when I started (as I didn't really know how to edit, so much of it would have been out of context) so I do random searches from time to time, and I often find other people are asking the same questions as me, so this can help find things, or at least help you with what stuff is called. Workflow searching unfortunately doesn't yield much help, at least in my experience.
  • Ask questions here - EOSHD has a growing number of DR users, and even if we don't know, an experienced editor asking a question is useful to me as it gives away how pros do it, which helps me, so I often research questions for myself as much as for others.
It seems so. 12.5 used to crash about 2-4 times per hour for me, but 16 basically hasn't crashed in weeks/months. I think it's probably related to your software / hardware environment.
  24. Shooting music videos in a warzone... 2020 was a full-on year!!