Axel

Members · 1,900 posts
Everything posted by Axel

  1. Of course ETTR is right for raw, nobody disagreed. This is about whether it's also right for compressed codecs, which contradicts common beliefs.
  2. Would you elaborate on that? Why Rec.709 and not sRGB?
  3. First of all, we can't judge whether what we saw in the clip is what you expected. I agree with animan that you should rely on the scopes rather than on your naked eye. Let me explain what I did to set my mind at rest regarding a cheap monitor. I used to calibrate my iMac's display and later my two cheap Samsung SyncMasters with a Spyder 3 Pro. Looked okay to me, and I never saw a reason to worry. With Resolve and the more demanding task of getting the most out of raw and flat 10-bit ProRes images, I found the method didn't work anymore. So I followed the advice of a close friend who had just bought a €1500 Eizo. He sent me a link to a review of a really shitty little display, the LG 22MP65HQ, dirt cheap at around €130. Factory calibrated to 95% sRGB, which is very good. It's 8-bit of course, and it only has one HDMI input. So I bought it, along with the Blackmagic Mini Monitor. It looked completely off compared to my ancient SyncMasters, so I distrusted the review. But my friend also had a new X-Rite i1Pro, and he suggested making a fresh calibration. Instead of the X-Rite software 'i1Profiler' (a free download; you need it as the driver anyway, which also means you can share the device with friends), he used the free dispcalGUI. It looks complicated, but is actually pretty straightforward. First it analyzes the display. It said '96% sRGB', and it showed onscreen which RGB levels had to be changed. I didn't know the LG had such controls buried deep inside its horrible menu, but it does (the SyncMasters didn't; they instead have 'Magic Color' modes and the like). I calibrated to sRGB (after a discussion about whether I should rather use 'Video', i.e. Rec.709, but I don't expect my stuff to be broadcast, and everywhere else you have sRGB computer displays). Then, instead of saving the profile for my graphics card's output, I exported a 3D LUT (which dispcalGUI can do). I put the .cube file into my Resolve LUT folder and assigned it as the monitor LUT in the project settings.
There was only a VERY subtle change when we disabled the LUT, which means the hardware calibration that dispcalGUI assisted us with was already pretty close. It also means I see a very neutral preview in FCP X, where I can't assign a LUT to the A/V output. Whatever you do, try dispcalGUI; it's really useful and free!
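Out of curiosity I once sketched what such a .cube file actually does. This is my own minimal Python illustration, not dispcalGUI's or Resolve's code; it skips DOMAIN_* handling and uses nearest-neighbor lookup where real tools interpolate trilinearly:

```python
# Minimal sketch: read a .cube 3D LUT and look up an RGB triple.
# Simplified on purpose; real graders interpolate between lattice points.

def parse_cube(lines):
    """Parse .cube text lines; returns (size, flat list of RGB triples)."""
    size, table = None, []
    for line in lines:
        line = line.strip()
        if not line or line.startswith('#') or line.startswith('TITLE'):
            continue
        if line.startswith('LUT_3D_SIZE'):
            size = int(line.split()[1])
        elif line[0].isdigit() or line[0] in '+-.':
            table.append(tuple(float(v) for v in line.split()))
    return size, table

def apply_lut(rgb, size, table):
    """Nearest-neighbor lookup. Per the .cube convention,
    the red index varies fastest in the data block."""
    r, g, b = (min(int(round(c * (size - 1))), size - 1) for c in rgb)
    return table[r + g * size + b * size * size]
```

With an identity LUT, any input color comes back unchanged, which is a handy sanity check before trusting an exported cube.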
  4. The guys from ColorGradingCentral have a very good reputation among FCP X users. Let's hope this isn't just hot air ... PS: Wrong forum.
  5. All compressed video has some curve baked in. With ETTR, you 'fill the well' of your sensor and 'avoid the noise floor', but depending on said curve (picture style), you may also have thinned out your values. The common S-curve will also try to avoid the noisy depths and favor skin tones and midtones. By not exposing to the center, you risk getting very doughy skin tones when changing that curve in post. At least with 8-bit, but even with 10-bit, it's possible that the one shot you need has too little definition in the areas that count (usually the skin), and you are forced to de-grade the good shots to match the worst. Especially when you shoot in very bright places. For 8-bit, it's a good idea to light flat. ETTR is a no-brainer with raw; then even zebras suffice.
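A rough numeric sketch of the 'thinned out values' point (my own toy numbers, not measurements): count how many distinct 8-bit code values are left to describe a given slice of the scene. Stretching the darker slice in post cannot create new code values, so the skin ends up 'doughy':

```python
# Toy illustration: how many distinct 8-bit code values represent a
# scene-luminance slice [lo, hi] (both in 0..1 signal terms)?

def quantize(x):
    """Clamp and round a 0..1 signal value to an 8-bit code value."""
    return max(0, min(255, round(x * 255)))

def distinct_codes(lo, hi, steps=1000):
    """Distinct 8-bit values produced across the range [lo, hi]."""
    return len({quantize(lo + (hi - lo) * i / steps) for i in range(steps + 1)})

dark = distinct_codes(0.05, 0.25)   # skin exposed low in the signal
mid  = distinct_codes(0.35, 0.65)   # skin exposed near middle grey
```

The slice parked in the shadows comes out with far fewer distinct codes than the mid-grey one, which is exactly the definition you miss when you have to lift it in post.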
  6. I saw your video. Very impressed. But, as I said, it doesn't prove that AAE is a good choice for CC.
  7. This was very good. Good music, too. Thumbs up! Doesn't play in Germany, due to some unresolved copyright issue. Had to install Firefox and an add-on to see it. Was worth the trouble.
  8. First, we don't disagree in general. But God is in the detail. As we both know, you can't animate dinosaurs in AAE, but if you are used to exhausting its possibilities ('Andrew Kramer'), then it's mightier than Motion, to put it mildly. I don't like the GUI; I quote myself from another thread: I rotoscoped stuff in Shake, in Motion and in AAE. You mentioned Mocha Pro to get tracking data into Motion. But Mocha Pro was $1500 the last time I checked. And as far as TrackX with Mocha (~$100) is concerned, it's good for the aforementioned matchmoved lower thirds, but you can't track a four-corner matchmove with it, for instance. So it is cheaper and much easier, but it's also limited. We agree that most people don't need it, and though I am able to design templates on my own, I'm happy to use ones created by others, preferably free. I can't tell what pietz needs the LUTs for. My clips are often in BM ProRes log, which is rather flat. Not nice to look at while editing. Of course I could simply use the built-in LUT. But unfortunately the IR-cut filter I have to use causes a slightly greenish cast, so I use a modified LUT with LUT Utility. If I exported the adjustment layer with the LUT to Resolve, it would show up as TITLE, because it was originally an empty title template. I'd rather apply the LUT again in Resolve, no big deal (more often, I don't use any LUT in Resolve other than the one for the monitor). Sorry for the complicated explanation. As I said, God is in the detail, and everybody has his own way, neither wrong nor right. BTW: Color correction in FCP X alone is possible, quite comfortable and often underestimated. For the majority, it will suffice. Compared to grading in Premiere (w/o Speedgrade), let alone AAE, it's close to perfect.
  9. Hope you don't mind if I put in my next two cents. Nobody needs to prove, let alone justify, anything. Anyone who is a lifelong user of AAE and feels on par with Andrew Kramer hardly feels any urge to jump ship. Were the inadequacy of tracks (and of composition layers without editable nodes) as self-evident as our comments suggest, they wouldn't have survived the last 3 1/2 years. And the wide range of tools, integrated or third party, collected as the years passed by, is awe-inspiring. tomekk wrote: FCP X concentrates more on what most people actually need. Sure, you can do a lot with Motion (probably more than you'd expect), but it isn't AAE. Anyway, I'd guess only a few use Motion to really create animations. They use effects, graphic generators, titles and the like from Motion, but within FCP X. They let others create the 'Motion templates', which can be published for FCP X, many for free. These effects you can easily adjust to your needs. Doesn't sound particularly creative? Okay, but is that tinkering with the exchangeable graphic elements mentioned above anything to be proud of anyway? Today we see match-moved lower thirds every minute, everywhere we look. Even if you found a new, baffling effect to impress clients and laboriously keyframed it together in AAE, it'd be a matter of weeks until someone cloned it, also for FCP X, and every GoPro surfer (not even primarily video-oriented) would use it effortlessly in his YouTube video. You can comp in Motion. Very good keyer and an absolutely usable tracker. Layers, a timeline, nothing too exotic. No Roto Brush, no built-in Mocha. So if 3D compositing is what you're into: stay with AAE or learn Fusion. The majority never touches these areas. They are proud to work with the 'industry standard', if that's what AAE really is, but they never use the stuff. They buy expensive plugins. Or cheap plugins that look like it. Many of them would do better with FCP X, if they made up their minds about what they really need.
Don't know; LUT Utility causes no problem with my setup. FCP X has a few of the most common LUTs built in (clip properties in the inspector: Blackmagic, CanonLog, ArriLog, SonyLog). I recommend you connect an adjustment layer over your whole sequence and apply the appropriate LUT to that. That way you can trash it as a whole before you export to Resolve.
  10. Whether CUDA or OpenCL is faster is comparing apples with oranges unless both programs use them in the same way. Also, if I read that 16 x 4K PiP, or this or that configuration, renders a certain codec 30% faster: I don't care. Once, when FCP classic had to log & transcode modern codecs prior to editing while Premiere edited the native codecs and already had the Mercury Engine on its side, there were workflow speed contests (including performance and render times) that could only have one winner: Premiere. Someone has a monstrously pimped workstation, graphics power in great profusion, running Premiere; his competitor just an iMac with a Thunderbolt RAID and FCP X. They both are confronted with 500 clips with externally recorded sound, and they are to edit the same one-minute-long sequence, which they get shown beforehand, from memory. This is again comparing apples with oranges, of course. The situation reminds me of the Grimms' tale The Hare and the Hedgehog. Both take part in a racing duel, but as soon as the hedgehog falls behind, he ducks. Then, at the finish line, the hedgehog's wife, who looks just like him, raises her head and shouts: Well, here I am! Unfair, isn't it? Though I'm not enchanted by your charm, I appreciate your wit and FCP X knowledge. Maybe you're a nice fellow in real life and your avatar is your Mr. Hyde.
  11. lin2log: You may be right that the position tool was there from the start. I vaguely remember someone writing, yippee, we've got control back in FCP!, or something of that kind. And yes, it's not completely useless, it's only that I seldom use it. Regarding the swap behaviour of the magnetic timeline: I just very rarely change the position of two adjacent clips. For I am an 'old hand' with some conservative habits regarding editing. That's why I always spend quite a while getting to know my footage before I throw anything into the timeline. People argue about the timeline, but in my view the event browser with skimmer, tagging (no more office-like folders to be double-clicked) and Smart Collections would alone justify the 300 bucks (and more). The primary storyline I use as a rough draft. I usually only trim the length of some clips to get the right timing, and I am glad of the ripple behaviour. It's like a skeleton. Afterwards I connect everything I really want in the film to it. These clips I can move and swap without ever destroying the structure I laid out before. Everybody uses an NLE in a different way. Imo there is no right or wrong. At first, I was irritated by your comment. Because, frankly, I didn't know that a second viewer existed (f.k.a. the 'viewer', whereas the timeline had its own viewer, f.k.a. the 'canvas'). I learned it only just now. And it is useless. Thankfully it's hidden well. For the record: on a German keyboard '~' needs option+n (new event); the corresponding command then is cmd+^.
  12. Wild Ranger: I would have liked to see your video, but got a browser incompatibility 'warning'. I was urged to download Firefox, Chrome or Opera, even just for the download! I refuse to be blackmailed like that. Safari (latest version) supports all modern protocols (Adobe Flash installed manually). Don't fall for that. You probably made a perfectly graded video. Nonetheless, grading in AAE is as comfortable as swimming in glue. Learning Resolve Lite took me 9 hours of watching the Ripple training and a few days of exercises. Explaining it to a friend took just a little over an hour. A fraction of the time you waste when you grade in old AAE.
  13. joema: good explanation. I'd guess the average FCP X user doesn't care much about render benchmarks. My own routine is to deactivate background rendering until I can foresee that my images are good and I just need to fine-tune the sound. That way my project exports in no time afterwards. The performance can be sufficient even on slow systems, because by default FCP X scales the preview quality dynamically (viewer preferences). With 'high quality', you will need more power. A bigger graphics card makes the app more responsive; more RAM makes the skimmer faster (it seems to show more frames?) and allows for bigger projects. Start only after you've watched tutorials for the basics. Better yet: ask someone who knows the particular app to show it to you live. In the case of an NLE, it's actually about how you think. Did you ever think about how you think, how you comprehend things? You can teach yourself to watch yourself build an idea (without necessarily becoming schizophrenic). If you tend to slowly gather things that fit and let the thought become clearer gradually, the puzzle of the track-based NLEs will be right for you. If you are struck by a spontaneous thought, already somewhat complex, and want to explore where it leads, then choose the mindmap-like FCP X.
  14. It's useful to think of FCP X as a subclip editor. Assume we had a dialog clip: 'Poor girl, do you want to marry me?' - 'Yes, 'cause I know you have terminal cancer!'; then you can make a selection 'do you want to marry me?' - 'Yes ...', make it a favourite with F and rename it 'proposal', and then make a second selection 'Yes, 'cause I know you have terminal cancer!' and rename it 'cancer'. The result is three clips (the original, which, if you imported from a camera archive or from the original card, can already be a subclip!), with the favourites technically being subclips. Synchronized clips, multicam clips, compound clips: all different flavors of subclips. FCP X has become very reliable in avoiding the logic conflicts that made legacy FCP slow down and finally crash. You make, for example, an archive (save as ...) of your sequence in old FCP. Then you keep changing things, and two weeks later you remember that old sequence and how well a certain thing worked for you. So you open that sequence (= the grandparents of your current state of editing), copy a selection (technically it becomes a compounded subclip) and paste it into your new timeline. You commit incest, or travel back in time and kill your grandparents. In FCP X there seem to be almost no limits for such operations, perhaps because of the way the libraries work and because clips (and subclips) need to be copied before you can use them in another library ...
  15. Perfect, like Oliver described. You still have the possibility to trim (Larry Jordan's reason to create a secondary storyline), and it's easier. There is no right or wrong way to edit. There is no need to create a multicam clip in the event browser; you can just as well sync clips directly in the timeline (saw it in a tutorial somewhere, but didn't test it myself, and it's been a while since my last music video). With FCP X, you find new tricks every day. I bet I don't use 50% of what the app is capable of. To avoid misunderstandings: there are no gaps in FCP X's primary storyline, there can only be slugs (black clips, placeholders). So you connect the music to a black clip at the beginning.
  16. This was a very good article. I wish I had read something like that before I started with FCP X the hard way, by trial and error. It looked like a bizarre nightmare to me at first, and I put it aside. Before the trial month ended, I bought a training DVD, just in case I had missed the point. I chose the wrong one (of two): the one whose author had also provided me with a book on FCP 2.0 a decade earlier. All its tutorials lured you into looking for workarounds, to restore the good old track routine as much as possible. Today I'd say the guy not only couldn't see the forest for the trees, he completely ignored that he was standing on a comfortable path. A year passed, and I saw a guy editing something on a set, with a MBP. The timeline looked insane, dozens of colorful tracks stacked. I remarked on it, and he laughed and said, no, it's one track, and it's easy to read. Have a look. So I started over with 10.0.6. And never rued it. No. Both are matters of personal preference. One has to accept that. The sharpness of some comments does deserve criticism, because people may feel hurt if you belittle their personal preferences. As I can confirm: I am sick of hearing, oh, you're editing with iMovie Pro, in a patronizing voice.
  17. Been there, done that. I fully understand what you describe as a horrible experience with FCP X, puzzling together a timeline that way. The direct answer is a secondary storyline (like a second track, or actually a track at all, since the term 'track' implies at least two of them). But apart from that, the example is useful for two reasons: 1. In Premiere (or whatever), you 'cut to the music'. Let's assume the music isn't just there to set the mood: you're editing a music video for a band where various parts need to be lip-synched. The music is in a track that needs to be and stay intact. You probably lock it, so you don't accidentally overwrite or move it. For the clips you tinker with above it, angles from a multicam clip (the band's playback in the old warehouse and in the woods) and two or three illustrating storylines, their absolute position on the SMPTE ruler is of no importance. 2. For the sake of the music you try to establish a hierarchy. You (FCP X term:) connect all clips to the music; their positioning is then relative. They are not meant to accidentally move out of sync. To avoid this, you need tracks in Premiere, because sooner rather than later the video clips would collide, be overwritten or moved. But you only need the tracks because you are working with a track-based NLE. BTW: I know I sound evangelizing; I apologize. If you like the way your NLE thinks, fine. Two sides of the same mountain. You could just as well say: all the clips I once considered B-roll are now spread over, say, five tracks. Some of those tracks are almost empty; on others I allowed 'secondary storylines' with their own inserts on a higher track, which also - oh, beware! - contains some titles and other stuff. Again, I know what you mean, but that's exactly why I now prefer FCP X, where I can at any time change the whole structure completely, because there IS a clear structure of clip contents relating to each other in the first place. True.
The 'P' (position) tool was introduced in 10.0.3, I think, reacting to complaints from 'experienced editors'. It's a tool to counteract the magnetism, and it's almost completely useless. It makes people keep working around the app. Don't swap clips in your primary or secondary storyline. That's just one way, but also the most intelligent way, to 'edit to music': if you want to shove clips around the way you are used to doing it in an empty track, rather make them connected clips. That's the whole secret. People are frustrated by the swapping behavior, which is actually only a side effect of the magnetic timeline, not its main advantage. You can at any time make primary clips into connected clips (opt+cmd+up) or vice versa (opt+cmd+down). You can at any time create secondary storylines out of selections of connected clips. You can do everything you need. If you stop thinking in tracks!
  18. Here is why: tell me one reason for using independent tracks to arrange clips in the timeline. I asked this a couple of times in other forums. No one had an answer. Finally comes the killer phrase 'it has always been done that way'. There could just as well have been a splice tool in all historic NLEs to connect two adjacent clips. Well, at least a splice reminds the old folks of what needed to be done with physical film. Tracks, then, could originate in classic A/B editing, where, in order to make crossfades possible, the clips had to overlap on two parallel reels. Who the heck came up with this concept for non-linear editing? An NLE's GUI doesn't and needn't represent what's actually happening under the hood. It's there for our convenience. Tracks are nothing but inconvenient. They have no equivalent in the real world. If you start arguing with music scores: are the 'tracks' independent of each other? Do you shove notes haphazardly back and forth? Imagine the timeline of NLEs had always had one track, as with film. That you could simply add isolated clips to the intact sequence, vertically as many as you like, either to temporarily change the flow of the narration ('B-roll'), make a composition of two or more images, or simply try an alternative. Obvious idea, no? Imagine then, some two decades later, someone developed a new kind of NLE: the track-based timeline! Would editors say, why, this is indeed an improvement? Or would they scratch their heads and put that software trial in the trash immediately?
  19. Compressor is meant to add export presets and destinations to FCP X, very similar to AME. Proceed as follows (translated from German; terms may differ): go to >FCP X >Presets >Destination >add destination >Compressor preset. From there you can add any preset, custom or default, you previously defined in Compressor. EDIT: I had never batch exported individual clips from FCP X, so I gave wrong advice at first. You are right: though you can choose multiple clips in the event browser and export them, they will be one clip in the end. So either you need to do the batch export in Compressor or try Larry Jordan's workaround ... But what were you planning to do with those transcoded MJPEG files? Reimport them? Why? You'd be better off with a ProRes 4444 or HQ preset for the project (= sequence); then Resolve will do the high-quality rendering for you as a final step. Disable the default background rendering in FCP X; if you want to render specific sections of the timeline - clips with Neat applied won't play in realtime, for example - you can at any time select them in the timeline and hit ctrl+r. Take your time to get accustomed to the magnetic timeline. It's a pain at first if you come from a track-based NLE, but once you stop trying useless workarounds, you will never look back. Don't fight the magnetic timeline.
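For the batch transcode itself, Compressor is the route described above. As a hedged alternative sketch (my own, with illustrative folder names, assuming ffmpeg is installed), the same kind of ProRes batch job can be scripted:

```python
# Sketch: build ffmpeg commands to batch-transcode a folder of clips to
# ProRes HQ (prores_ks encoder, profile 3) with uncompressed PCM audio.
# 'clips' and 'out' are placeholder folder names.
from pathlib import Path

def prores_cmd(src: Path, dst_dir: Path, profile=3):
    """Return the ffmpeg argument list for one clip."""
    dst = dst_dir / (src.stem + '.mov')
    return ['ffmpeg', '-i', str(src),
            '-c:v', 'prores_ks', '-profile:v', str(profile),
            '-c:a', 'pcm_s16le', str(dst)]

# One command per source clip; run each with subprocess.run(cmd).
cmds = [prores_cmd(p, Path('out')) for p in sorted(Path('clips').glob('*.mov'))]
```

This is not the Compressor workflow from the post, just an illustration of what a batch export amounts to under the hood.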
  20. Allegedly fixes a dropped frames bug with ProResLT and Proxy, but may leave you with empty raw folders at the end of a long shooting day, read here: http://forum.blackmagicdesign.com/viewtopic.php?f=2&t=31814
  21. Do you have an ordinary timeline representation of Premiere sequences in AAE CC meanwhile? Or is it *still* the ugly Aztec pyramid with the clips stacked in layers? Andrew wrote something about 5D2RGB, didn't he? The purpose of this app is to choose the correct range if your NLE doesn't.
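The range question 5D2RGB settles can be shown in a few lines. This is my own illustration of the standard 8-bit legal-to-full scaling, not 5D2RGB's code: 'video' (legal) range puts black at code 16 and white at 235, and an NLE that misreads it leaves the image flat (or crushed, in the other direction).

```python
# Sketch: expand 8-bit legal/video-range luma (16..235) to full range
# (0..255), clamping anything outside the legal band.

def video_to_full(y):
    """Map a legal-range 8-bit luma value to full range."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

video_to_full(16)   # legal black -> 0
video_to_full(235)  # legal white -> 255
```

If the NLE skips this expansion (or applies it twice), blacks sit at grey or shadows get clipped, which is exactly the mismatch the app corrects.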
  22. ... and it never was different. Only fashions change. The Zeitgeist brings forth different appearances; professions become extinct. The need to find images for our fears and hopes is hereditary, like the need to produce narrations. Our lives are narrations, stories, scripts. We tell them to each other, and only by sharing them and mutually acknowledging our biographies do we uphold the fiction we call our mind.
  23. You really fulfill the definition of 'teaser'. Thumbs down: Too short!
  24. This lamenting about a business ruined by dilettantes isn't new. Things change, and it's harder for established photographers to make their profit. As it has always been. Amateurs don't make good photos even though they can take 1300 for a wedding. It's a fundamental difference whether you cover every angle ten times in order to later choose the, er, 'best', or whether you get into the mood, compose the image in your mind, wait for the magic fraction of a second and push the button (the only excuse for SLR cameras today, and against the slightest delay of LCDs). I don't consider myself a good photographer. Or videographer. But I do understand this difference. You don't aim your camera at your subject, pull the trigger, shoot a heap of pictures machine-gun style and thereby capture the essence of it. It's much like a fisherman who drains the whole lake and afterwards digs out the fish.
  25. I saw that just now. The fact that every idiot can make a technically perfect photo doesn't mean anything. Our present is contaminated with those. A good photo reveals something that didn't exist in the consciousness of the viewer; it is as much a document (as opposed to a stylish pose) as it is an individual expression (as opposed to endless repetition). It has to do with finding out something about the world and about yourself. This is not going to die. And speaking of perfect glossy photos made with the aid of automatic cameras or Photoshop, how about a time journey back to 1908? We may be better 'equipped' 106 years later. Did we improve photography? Are we exhausted? By too much high quality? Or by too much stupidity?