Axel
Everything posted by Axel
-
Did you try the Library Manager? Reading this thread, I realize I made a mistake with my current system. I bought the 256 GB flash version because that's enough for the system (and crazy fast, too) and because I thought, well, people are doing this with the cylinder, it must work. But now I learn that the iMac "smokes" the nMP. I should have ordered a bigger drive and used the Pegasus to store finished projects. That's the disadvantage of a compact system: I can't change the drive now; I have to live with it for at least two years. It's no tragedy, though. I haven't edited multicam for quite a while, and I don't feel any limitations so far. As for AAE, it told me something like "poor graphics card acceleration with your system" on first startup. I used to have Nvidia before. Doesn't bother me much; AAE never was a real-time app for me.
-
BTW: I made a mistake, it's serial node, not sequential node. You seem to have FCP X (other thread). It is much easier to comprehend and much more intuitive in many respects. Even correctly exposed and white-balanced clips sometimes don't have very appealing skin tones. Canon and Nikon are often (but not always) better at that than Sony. With optimal A7rii footage (PP6, sharpness and saturation dialed down, my own preference over S-log), the skin looked less attractive than possible. I had just loaded the Tiffen DFX plugins as a trial and browsed through them. One filter, called Nude/FX, warms up skin tones. A porn filter, if you like. I created a better one in Motion (copying the colorize filter and reversing the skin keyer to 'colorize' instead of 'protect'). You have to move the "AXS" folder from Downloads to >private >movies >motion templates >effects; then you find it in FCP X's effect browser (it's called "Hautfarben", German for skin color), with preview as well. In FCP X, try to use an adjustment layer, connected above the whole length of the timeline (move titles and graphics above it so that they aren't influenced). You could ...
a) ... use FCP X's own qualifier ("color mask" in >effects >color >color), select the skin tones *over* ONE clip and correct them. You could also save this correction as "skin" or so. Don't be too "intolerant": no problem if some wood, "autumn leaves" (name of a Sony creative style, an inside joke; pleasing colors, but not very grading-friendly and suboptimal DR) or cereals are changed as well.
b) ... find a LUT that changes the skin tones (or export your own using Resolve, but beware: expert level!), using LUT Utility or Color Finale.
c) ... just apply my crude colorize filter to the adjustment layer.
... many ways to skin a cat. The concept of the adjustment layer allows easier control over your operations.
Needless to say, you can stack them, rename them, blade them and toggle their visibility with "v" for before & after, as well as change blend modes and transparency. You could key your weirdly colored sky (proof that you made some major mistakes) with FCP X's own excellent keyer and apply a correction on a separate adjustment layer. They are not as good as Resolve's nodes, nor is any of those tools nearly as precise as those within Resolve (okay, except one: the keyer), but I found out for myself that I will never become a master of colors and that the unbelievable simplicity of FCP X leads to pleasing results very quickly.
-
Nobody answers, because you didn't RTFM. You have to eventually, but I'll help you out once: to perform the same tool with another purpose, you need a new NODE (manual). Efficient order of corrections:
1st node - primary corrections: optimize contrast, get rid of color casts. The second node (learn about the differences between serial and parallel nodes) sees the first node as another instance of the image; you perform everything on top of the first.
2nd, 3rd node and so forth - secondary corrections: you isolate colors with the qualifier or areas with a mask.
Finally you create a look on top of the primary and secondary corrections with further nodes. You can save the grade: it becomes a still to be compared to ungraded shots with the splitscreen tool, but it can also be dragged & dropped to ungraded clips. If you have done all corrections in a meaningful order and understand your own "node tree", you can easily adjust minor differences in exposure and so forth in the rest of the shots. No faster and easier method.
-
I still have a 2009 MP (now the 27" iMac). My advice is to buy a TB external drive and try the (very simple) proxy workflow. Just in case you encounter any drawbacks with native 4k.
-
Right. But unless you plan to PiP multiple 4k ProRes 4444 XQ clips (or 4k raw, but FCP X can't handle that anyway), you won't notice the difference. You would, however, realize that the 1 TB drive in this example was way too small and you had to go external from then on. Another question: I used to have a three-monitor setup with my ancient MP. With 5k, everything fits on one screen. I know that some day I will want an external monitor, but no 4k monitor on the market in a justifiable price range seems to be on par with the excellent colors of the iMac display. What to do? Would an external 4k display slow down GPU performance significantly?
-
I wouldn't. With 4k, you need more storage. My Pegasus is no bottleneck, though it only reaches 500 MB/s read speed. I've seen a test with 8, 16, 24 and 32 GB RAM on the maxed-out iMac. It didn't make much of a difference with FCP X; slightly faster with 16 GB, the sweet spot. FCP X is not as RAM-hungry as Adobe. CPU and GPU are crucial. You can free up a lot of CPU performance with optimized media. Since those files eat more space, you need it externally. Big and fast enough TB drives are the way to go. With redundant arrays you need no extra backups.
-
I am currently reading Moby Dick in English, which isn't my native language; that may have colored my words. I was referring to the last really big thing in 2011, the inauguration of FCP X, which at that time was "utterly unusable" (Mike Matzdorf). Meanwhile, if you ask us, it is not only usable, it now deserves the initially used attribute "awesome". All updates since then made incremental improvements, many for speed and stability, but none of them made waves (aye!). If they really adopt that subscription system, it means more expenses for me. If they make radical changes to the UI (some rumors) or abandon QuickTime (other rumors), both things can work from the start or cause many BUGS. Not a reason to be suspicious? Just look at APP and Resolve, who finally announced "native QT". But people are forced to downgrade because various codecs aren't recognized anymore. But "I am not afraid", because in the long run I've spent less on hardware and software than my Windows friends. And, more importantly for me, I've spent much less *time* configuring, maintaining and repairing my system. But that's my own business, and I won't try to convince others. Now better?
-
Yeah, macOS Sierra waits in the wings. Possibly with new hardware. Allegedly the majority of the apps in the App Store will move to subscription deals, and FCP X might as well. If you bought it, you can keep it, but you won't be able to update it. And according to some people fcp.co has talked to (among them the retired Larry Jordan), the next update will be a "big thing". If you are a long-time Apple user, you know all the dodges, and you are suspicious of big things. Speaking for myself, I am not afraid. But I'd never lure others onto this path.
-
I bought 256 GB flash and a used Promise Pegasus2 R4 (for 600€ @ Ebay). Effectively 6 TB Raid5, read speed around 500 MB/s according to BM Disk Speed Test, enough for 5 4k streams PiP in ProRes 422 and don't-know-how-many in HD, fluidly (playback at "better quality" and "stop playback when frame skipping occurs" - or whatever the correct translations from German are). If I didn't use FCP X, I wouldn't have bought a Mac. Makes no sense for Premiere or Resolve.
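For what it's worth, a back-of-the-envelope sketch of why ~500 MB/s covers five UHD ProRes 422 streams. The ~147 Mbit/s base rate for 1080p30 is the commonly quoted ballpark for ProRes 422; the linear scaling by pixel count and frame rate is my own approximation, not an official figure:

```python
# Rough ProRes 422 bandwidth estimate: the target bitrate scales
# roughly with pixel count and frame rate.
# Assumed base figure: ~147 Mbit/s at 1920x1080, 30 fps.
BASE_MBIT, BASE_PX, BASE_FPS = 147.0, 1920 * 1080, 30.0

def prores422_mbit(width, height, fps):
    """Approximate ProRes 422 bitrate in Mbit/s for a given format."""
    return BASE_MBIT * (width * height) / BASE_PX * (fps / BASE_FPS)

one_uhd_25p = prores422_mbit(3840, 2160, 25)   # one UHD 25p stream
total_mb_s = 5 * one_uhd_25p / 8               # five streams, in MByte/s

# ~490 Mbit/s per stream, ~306 MB/s for five: below 500 MB/s
print(round(one_uhd_25p), round(total_mb_s))
```

By this estimate the raid still has headroom at five streams, which fits the playback behavior described here.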
-
Test shots combined with location scouting, via iPhone or Nikon D3300, stills and short clips, arranged in an offline edit. It also works as a digital moodboard: I experiment with colors and start to collect audio. Difficult shots I arrange as AAE compositions, rough drafts. Sometimes this gives me ideas for how to get the shots done as trick shots. The sad part is that by doing so I realize how much effort and expense lies ahead, and I keep playing with the placeholders ...
-
Better: the hyperfocal distance. With most wide lenses and small apertures, the focus setting wouldn't be infinity but, amazingly, between 3 and 6 feet. According to an online DoF calculator, with a 12mm at f/7 on a GH2 it would be 4.4 feet.
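The formula behind those DoF calculators is simple enough to check by hand: H ≈ f²/(N·c) + f, with f the focal length, N the f-number and c the circle of confusion. A minimal sketch; the 0.015 mm CoC for Micro Four Thirds is an assumption, and calculators use slightly different values, which is why this lands near 4.5 ft instead of exactly 4.4 ft:

```python
def hyperfocal_mm(focal_mm, f_number, coc_mm=0.015):
    """Hyperfocal distance H = f^2 / (N * c) + f, all lengths in mm.

    coc_mm: circle of confusion; ~0.015 mm is a common Micro Four
    Thirds assumption (conventions vary between calculators).
    """
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

h_feet = hyperfocal_mm(12, 7) / 304.8   # 12 mm at f/7 on a GH2, in feet
print(round(h_feet, 1))                 # ~4.5 ft with this CoC
```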
-
You are right, of course. It's just that an original, organic story needs to be structured. It needs to be edited, rewritten, polished, trimmed. But the wild and uncensored idea must come first. The content. Even meaning and message come later. Or everything becomes a boring mess.
-
Yes, and we must distinguish between the formal conventions a screenplay has to have to meet the expectations of the professional buyers and things like "the arc of suspense", motivations, narrative economy and so forth. The first part is now facilitated by Word and some specialized screenwriting apps (Syd Field's book must be so old, he probably only knew white paper and a typewriter). As for the other stuff, it seems to be a basic truth if you analyze films or novels. Field kind of reverse-engineers the films, counting minutes and pages and how they correspond to the stages of the developing conflicts. Now what his students can do is make check marks at the end of page ten, twenty and so forth to see if they established the conflict and reached the plot point at the right pace. If a conventional form is what you are after, you can of course do that, but imo you can NEVER invent something with a rigid structure at the back of your mind. This is simple psychology. If you are obsessed with formal criteria, you will never create something original. This collective subconscious is what bad films, formalistic genre films, "movies", convey. You have (according to Jung) little worth as a human being if you move with the crowd, avoid painful individuation (a Jungian term) and always choose the blue pill. There are other valuable ideas, like the masks we wear (directors should exploit that with their actors) and our shadow. Stanley Kubrick cited the shadow in his Full Metal Jacket. A film, btw, whose structure would have given Syd Field a nice challenge to fit to his rules ...
-
Imo, you can read this as primer and closer. Not because it is so insightful or inspiring, but on the contrary, because it analyzes something that should be quite obvious for someone who wants to learn how to tell a story. In recent years, it has become hip to refer to The Hero's Journey, tracing the roots of all narration to mythology. Quite an interesting read, but if Syd Field's "findings" actually repeat what Aristotle had found some 2300 years before about the nature of drama (complete with 3 to 5 acts and what have you), these books (i.e. The Hero With A Thousand Faces) don't help to understand story, let alone let you find your own narrative structure. Reality check: good stories don't fit those "recipes". Moby Dick, just for instance, would never have been approved by any editor infected by those dogmas. There is an innate need for human beings to tell stories. In the office between colleagues, to your children, around the campfire. They may be personal, educational, reassuring, unsettling and so forth. But they have to be interesting - the content as well as the form. Someone offers you a formula? Fine, but someone wise once said the form should follow the content. Find something you find worth telling to the world, and the best way to tell it will be revealed to you by your muse.
-
Maybe it's a well-known fact. Most people, however, mix up sharpness and resolution. Both, it seems* (*see below), are related to detail. You can do like CSI detectives in bad TV shows: take a muddy surveillance camera image (like the FS5's) and emboss the outlines until faces become recognizable or plates in the distance become readable. Tim Shoebridge's NX-1 images show both characteristics: greater resolution (so-called 'real detail') AND an artificial sharpness. The people look as if just one more step up in in-camera sharpening would finally reveal that they were animated mummies made of sand grains. We never experience such a thing in real life. The objects before our eyes have infinite resolution; they also have infinite color depth. We find an image beautiful if it is soft (even blurred, see sDoF) with a surplus of resolution. This is not about detail. Our eyes are very bad cameras. The lenses are slow, the sensor only gets close to "HD" in its center (fovea), with a permanent vignette around the FoV, blurred, dark and distorted. We focus on a given point of interest, we explore it, and we find no limits. Detail had better not be everywhere in the image. And it shouldn't look chiseled. That's why both cameras don't produce nice images (and there is another reason: colors; looks like the FS5 wins there if it outputs ProRes or raw). The better way is not to blur a sharpened image, but to switch off in-camera sharpening, if that can be done.
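To make the sharpness-vs-resolution point concrete: unsharp masking (which is essentially what in-camera sharpening does) adds no new detail, it only exaggerates the edges already there until they overshoot into halos. A toy 1-D illustration of my own, not any camera's actual pipeline:

```python
# Toy 1-D "image": a soft edge ramp, values 0-255
img = [0.0, 0.0, 64.0, 128.0, 192.0, 255.0, 255.0]

def box_blur(px):
    # 3-tap average with edge replication
    out = []
    for i in range(len(px)):
        left = px[max(i - 1, 0)]
        right = px[min(i + 1, len(px) - 1)]
        out.append((left + px[i] + right) / 3)
    return out

def unsharp(px, amount=1.5):
    # Unsharp mask: boost the difference between image and its blur
    blur = box_blur(px)
    return [p + amount * (p - b) for p, b in zip(px, blur)]

sharp = unsharp(img)
# The edge now over- and undershoots the original 0-255 range:
# those out-of-range excursions are the "chiseled" halos, not detail.
print(min(sharp), max(sharp))   # -32.0 286.5
```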
-
My Adobe-on-Windows friend came to see my new 5k iMac, maxed out and with the Pegasus raid via TB. He couldn't understand why I spent a fortune on a consumer compact unit, since he had volunteered to build an upgradeable Windows system for me for a fraction of the price. He brought a USB3 drive with almost 5 hours of 4k (A7rii) from his recent journey to India and was curious how FCP X would deal with that much footage. A few weeks ago, I had seen Simon Ubsdell's tutorial on Creative Cow about just ONE smart collection to rule them all: ... and, creating tags in advance that I already knew would be appropriate from my friend's stories, I organized the media, of which I had never seen a second before, in no time. Just think of all the redundant clicks needed in Premiere, even for the simple task of playing back a clip, not to mention sorting clips in any practical way! I deliberately spent a little more time in the media browser to allow FCP X to finish transcoding proxy copies in the background. The real-time performance in the demo project's timeline was seemingly without limit. When he learned that it was all just proxy (he hadn't noticed), he made the expected remarks of an editor who is sworn to native workflows. I switched the view to original media. Surprise for him: the stuff continued to play back in real time, with the skipped-frames warning set to on and no issues (there would probably have been occasional lags, stuttering or beachballs with that much UHD XAVC footage during a normal editing process; I have no intention of proving the opposite). Afterwards he had to admit that the system was "useable".
-
Something worth thinking about. There really is no 'industry' anymore, at least not if this refers to enterprises which set up blueprints for production pipelines and man them with workers. This industry, audio-visual media, became hell for people who have to work for a living and perhaps plan for a family. On German video sites, the FCP X haters label the "pros" that use this NLE "trunk producers". I almost became one of them; it's called "Ich AG" (literally 'Me PLC': in Germany, a person in self-employment as part of a government-funded scheme to help jobless people start up their own business). As your own camera operator, sound engineer, driver, editor, colorist and what have you - what do you need Adobe for? You can be creative (Adobe's motto) at designing your own funky lower thirds within After Effects? Well, you hardly get paid fairly for that extra effort. There are thousands of free Motion templates that can easily and quickly be adjusted to your needs. A handful of specialized graphic designers, who probably do nothing else, let you use better ones for a reasonable fee. You'd be crazy to sit awake after a long shoot and editing session and try to top those on your own. So maybe FCP X is for hobbyists and trunk producers, but who cares? I've opened a thread of my own on the topic. My suspicion is that Adobe would like to become less industry standard and more trunk production suite. But they fear losing their nimbus.
-
Do you like to be filmed? I accept that people feel uneasy when someone holds a camera (or smartphone) to their faces, because I know how impertinent and ignorant that can be. I feign respect for their privacy and kindly ask for permission. I smile and reassure them nonverbally that they can feel safe. Works in 90% of all cases. The remaining 10%? Well, it's their right to forbid it. Simple as that. People like to be respected, but they also, in my experience, like to be asked and then directed. It's more about psychology than about personal rights. You've got to have (and show) empathy and then exploit that. Polite words aren't enough, it's your body language and your whole manner. That's why I still shoot weddings. The lens is their 'magic mirror'.
-
Let's hope that Apple is faithful this time. They are not famous for (if that's the right term) perfective maintenance. Some day they may announce: look, here is iCut, it's even easier than FCP - which from now on is considered obsolete and no longer supported ...
-
Be patient ... I am not cutting with Premiere (though I know it up to CS5.5), but if I still did, I wouldn't use Prelude either. Compared to the direct access and fluidity the Kyno demo suggests, it's ancient software. Compared to the usability of the MAM features within FCP X, it was sheer masochism to prepare footage that way. You still have to double-click clips to load them, you still have only thumbnails for the awkwardly designed 'hover scrubber', you still have those almost useless colored markers instead of intelligent filters. Yes, and there are **folders** - quote from the video in the link: The classic method of using containers to organize your material (...) is kind of the old school way of doing it, but {using metadata} is the way of the future, it's ridiculously powerful. That said, when do I really need mighty tools for media management? Only if I am confronted with a lot of footage. Anything below, say, a hundred clips can easily be found in Premiere with different techniques to audit and sort them. But at a certain point, you'll need something more efficient.
-
Yes. The design of Prelude's GUI is also not very lightweight (as Kyno describes itself: "The main point is that it is much more light-weight than typical MAMs because it does not require an import/ingest step before you can do something useful with your material. That means there is not really a concept of "inside" or "outside" of Kyno, which also means there is no global search of all content Kyno has ever seen.") This could also be the reason why many Premiere users don't know Prelude: they are not familiar with the concept of logging clips. Actually knowing your footage in advance, having it consolidated before import (in short: having it logged and organized) is a big time-saver. For that you need a good player, batch renaming and a search filter. EDIT: I am not sure if it supports all kinds of codecs, but the afaik free Sony Catalyst Browse could be something worth trying. Used it once when the FS7's XAVC-L was not yet working in FCP X and found it to be fun. Excellent player, unrivaled metadata view.
-
It's brand new, a beta version for OSX, with a Windows version on the way. No, no need for it. Saw it before. It's basically what you are asking for, judging from the (well hidden) description in their reference guide: "A cross-platform media cataloging and video logging tool." However, their site is completely incomprehensible; a 30-day trial is mentioned, but no prices. To get the app running, you need to *import* media first, and that's ridiculous to start with. I want to have instant access to everything on my system; I want to browse it, evaluate it, subclip and tag it so that it is somehow prepared for the import into the NLE. BTW: did you check Prelude?
-
"Kyno" Right now for OSX only, but a Windows version is coming. It offers few things that FCP X can't do by its own devices, but it's definitely not a bad approach.
-
A myth. None of the Arri digital cameras has a global shutter. The Alexa Studio (presumably about 100k) has a physically rotating shutter blade that makes it look like a global shutter; check the Arri site.
-
Congratulations, mercer. Looking forward to seeing a good Micro short from you. The German site slashCAM tested the Micro and said it might very well have a bigger DR than 13 stops, because 13 stops were initially announced for global shutter mode. They came to the conclusion that due to a better read-noise ratio the DR could even be better than that of the Ursa Mini 4.6k (which they had tested as well and which allegedly has 15 stops). The main disadvantages of the camera were the sub-HD resolution and the awkward position of the buttons. I couldn't decide between the Micro and the UM and will probably soldier on with the A7rii and FS7.