Everything posted by KnightsFan

  1. You could get an XT3 and the original Z Cam E2 with a speed booster for that price. Or a Panasonic S1 and a Pocket 4K. We will see how many features Z Cam packs in to make it worth that cost. They have been steadily adding to the E2, so maybe they have more up their sleeve.
  2. I'll let you know whether I'm excited after they announce the price! But even if it is out of my budget, I hope they make an absolute killer camera. I like their approach to hardware and software design, and how well they engage with their community.
  3. In my tests, H.265 and H.264 retain detail very well compared to ProRes. I don't see any issues in this S1 footage.
  4. I'm not sure. I tend not to be scientific about lenses. If I consistently like the images I capture, I like the lens. And I've not consistently liked Rokinons the way I like my vintage Nikons, or the Zeiss ZF.2's and the handful of Sigma Art lenses I've used. I've often considered grabbing Rokinons when I see them used for a good price, but their extraordinarily convenient features just aren't worth the image quality tradeoff to me yet.
  5. Windows. I am finding that various metadata fields can be very useful for exactly that reason. I usually just include the project name in the file name, but there are all sorts of possibilities if you can automate metadata into the comment field, even if it's just an identifier to associate a video file with an audio file. Since I'm scared of accidentally messing up and losing valuable information, I am storing the original file name in a metadata field so that I can test my organization system on a real project, but relatively risk-free. There's a sketch of the idea after this post.

     Yeah, I think in that case just keeping projects separated is the main thing, and maybe keeping a notepad in the project folder with whatever useful info you can think of. I usually keep a list of shooting dates and a rough description of what we did. Like, "2019-03-23: Scene 47 (ambient sound in ZOOM_003)" and stuff like that.

     Just a comedy series with some friends. I've done some motion graphics for it using Blender, Fusion, and After Effects, but nothing major.
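     A minimal sketch of that idea, assuming ffmpeg is on PATH; the file names are hypothetical. The clip is stream-copied (no re-encode) while the original name is written into the comment field:

     ```python
     # Minimal sketch: stash the original file name in the "comment"
     # metadata field while renaming. Stream copy, so no quality loss.
     # Assumes ffmpeg is on PATH; file names here are hypothetical.
     import subprocess
     from pathlib import Path

     def rename_with_original_name(src: Path, dst: Path) -> None:
         subprocess.run([
             "ffmpeg", "-y", "-i", str(src),
             "-c", "copy",                 # copy streams untouched
             "-map_metadata", "0",         # keep the camera's metadata
             "-metadata", f"comment=original_name={src.name}",
             str(dst),
         ], check=True)

     rename_with_original_name(Path("P1000123.MP4"), Path("S47_A_T03.MP4"))
     ```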
  6. They don't have feelings... so... Just kidding. Ugly maybe is too strong. But I have shot quite a bit on a full set of Rokinons, and quite a bit on a set of Nikon K lenses, and I prefer the look of the Nikons, and of most other lenses I've used. I find the Rokinons consistently make unpleasing flares and have some annoying warm color casts. After seeing in Duclos' teardown that Rokinon applied a yellow tinge to one of their elements, I was like "oh yeah, that explains SO MUCH about the color." I prefer cooler colors, I guess. I do actually like the 14mm and 10mm Rokinons, simply because they were very affordable and fast for wide angle primes. When I use them, it's because I need an ultra wide and there is no other option, so in that sense they are very useful.
  7. I don't know whether they are relevant to the market as a whole, but to me they are, since they remain unique for their features and price. There are no other brands offering anywhere near that number of focal lengths, all with similar size, clickless manual aperture, decently long focus throw, and standard gear positions. And they are all very fast, and in EF mount. The only problem is that they make fairly ugly images, in my subjective opinion.
  8. No, not really. I probably have fewer projects than you, though. Right now, my projects are generally grouped by "era" on different hard drives: school projects, work during college, 2 large personal projects since then, and paid work since college.

     Pretty much, yeah. I keep personal and paid separate, and have an individual project folder and then individual elements, which vary based on the project. Like if it's a small VFX job, I don't need a pre-production folder, etc.

     I don't have many plugins, but I do keep the install files in an installers folder. I do have a large library of assets: 3D models, sounds, and textures, which I keep in an "Assets" folder. They are usually very small, so I make copies if they are used in a project.

     I have projects all stored on the same drives if they are "in use." When I archive stuff, I group things on a drive, which I fill as full as possible and then never look at again, haha.

     I just use the date. Versioning up just means another date. I've never needed two concurrent versions from the same day before.

     Most of my VFX shots are small enough that all the project files go in one folder, titled by the shot. Like I may have a Blender file and a Fusion composition and a few intermediate files all in the same folder. Then there's often a folder for textures, and a folder for renders. If I rendered to an image sequence, that is for sure in its own folder.

     Client work all goes in the same folder, so I'm fine referencing textures or assets within a client's folder without making a copy. My volume is pretty low, though, so I imagine you might need a different structure if you have a decent amount of work.

     It's pretty rare for me to use stock anything.

     For VFX I really just name things descriptively. Often the project files are just "project.ae" because they live in a descriptively named folder.

     Yeah, I keep track of hours and the amount paid. I keep it locally only.

     Yeah, the names correlate. I'm not a high-level professional: most of my work is personal. Paid work is usually for people I have a good relationship with and work with fairly informally, so I don't have formal "invoices." But I do keep track of what I have been paid for each project, when, and how.

     Never done that before. I use Google Drive to send files usually, and always have a local copy. My internet kinda sucks, so I could never render straight to the internet; it would be too unreliable.

     I change the names on big projects. That's the function of my organization tools: automagically rename all the files from a card, copy them into the correct directory, and generate proxies. (There's a rough skeleton of that after this post.) It makes it easier when editing long term. But it does depend on the project. I just got back from a music video shoot where I won't be renaming files: there are only ~100 clips, and no audio. It will be easy to keep straight. Another project I'm in post on has been shooting for almost two years and has thousands of files, so organization is much more important. I don't know how I'd manage if the files weren't all renamed to differentiate between different scenes and angles, some of which could have been shot months apart, or even years later in the case of some of the Foley files.

     I delete tests after a while, usually about five minutes after looking at them.
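     A hypothetical skeleton of the card-ingest step. My real tool derives scene/angle/take names from timecode; this only shows the copy-and-rename part, and every path here is made up:

     ```python
     # Copy clips off a card, renaming them with a project prefix, the
     # ingest date, and a running index. Paths are hypothetical.
     import shutil
     from datetime import date
     from pathlib import Path

     CARD = Path("/Volumes/CARD01/DCIM")       # hypothetical card mount
     DEST = Path("Projects/MyShow/Ep1/Video")  # hypothetical project folder
     DEST.mkdir(parents=True, exist_ok=True)

     for i, clip in enumerate(sorted(CARD.rglob("*.MP4")), start=1):
         new_name = f"MyShow_{date.today():%y%m%d}_{i:03d}{clip.suffix}"
         shutil.copy2(clip, DEST / new_name)   # copy2 keeps timestamps
         print(clip.name, "->", new_name)
     ```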
  9. Organization is my specialty! I am actually currently building a couple of tools to help organize projects, so maybe we can bounce ideas around and I may incorporate them. There are several levels of organization I use.

     At the top tier, each project is in its own folder. Generally, the folders are all in the same place, on different drives as they fill up. I keep personal projects and work projects separate at the top level. I generally organize paid work by client, and then within that, a folder for each project, sorted by date. I keep a spreadsheet with details about each job, which I can refer to if I forget where anything is. When organizing by something, I'll have sortable numbers at the beginning and then a descriptive title, for example "004 Smoke simulation", or a yymmdd date format.

     Depending on the size of the project, I might jump straight into a media folder within that project. The last couple things I've shot have been episode based, so I have a folder per episode, and then within that, a folder for videos, one for photos, one for audio, one for VFX. On my latest project, each episode is longer (50 min), so I have it broken down by scene after episode. So Ep1\Scene006\VFX\[some vfx shot]\[composite files, textures]... etc.

     I created a mirrored folder structure for proxies. That way, I can literally do a find/replace on an XML to replace proxies, or use the Reconform from Bins option in Resolve to switch between proxies and online media, assuming my Resolve bins mirror my folder structure (which they do, since I simply import the project folder all at once). There's a sketch of the idea after this post.

     I generally keep storyboards, scripts, and other pre-production notes in a separate folder.

     For actual files, I have filenames with:
     - scene, angle, and take information
     - the date it was shot (that way, we don't have to remember which angle we ended on if a scene is spread over multiple days)
     - project name (not strictly necessary, but I do it anyway!)

     Scene, angle, and take are assigned automatically based on timecode, using the tools I mentioned earlier. Audio and video files that go together are named the same thing, so they are easy to find. When finished, I ensure that all renders have the date in their filename and go in a Renders folder.

     In general, I try to keep no more than 10 folders in any given folder, and no more than 100 files in a given folder, except for image sequences, which go in their own folder. It's much easier to find things when you keep the number of items per folder small and manageable. I keep a lot of .txt files describing what is in each folder as well.

     I'm kind of rambling and being confusing here; maybe I'll write up a more cohesive post about it after my current project is done.
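     Roughly like this (a sketch, not my actual tool; the paths and proxy settings are assumptions, and it needs ffmpeg on PATH):

     ```python
     # Mirror the online media tree under Proxy/ and render a small
     # ProRes Proxy for each clip, keeping relative paths identical so
     # a find/replace of "Media" with "Proxy" in an XML relinks cleanly.
     import subprocess
     from pathlib import Path

     MEDIA = Path("Projects/MyShow/Media")   # hypothetical paths
     PROXY = Path("Projects/MyShow/Proxy")

     for clip in MEDIA.rglob("*.mov"):
         out = PROXY / clip.relative_to(MEDIA)   # same relative path
         out.parent.mkdir(parents=True, exist_ok=True)
         subprocess.run([
             "ffmpeg", "-y", "-i", str(clip),
             "-c:v", "prores_ks", "-profile:v", "0",  # ProRes Proxy
             "-vf", "scale=-2:720",                   # 720p proxies
             "-c:a", "copy",
             str(out),
         ], check=True)
     ```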
  10. I would love to know if those work. I have read about a lot of compatibility problems with the other active EF to FX adapters.
  11. Have you compared color in Premiere or Resolve? VLC is notorious for having no color management. If the color difference is apparent in Resolve, then it might be that problem with the way XT3 files are tagged. I haven't looked into that issue yet, but there are a few threads here talking about it. My guess is either VLC is showing the wrong colors for the file, or it's this tagging issue. It is also possible that Shutter Encoder is actually changing the color in your files in a way that doesn't have anything to do with color space tagging. It doesn't seem likely, but it's possible.

      Hmm, I looked in ffprobe, but I don't see that particular info. I suspect that the XT3 doesn't record ISO and aperture information for video files. There's a quick way to check what a file is tagged with below.
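      For reference, this is roughly how to dump everything ffprobe can see in a clip, e.g. to check how a file tags its color space. It assumes ffprobe is on PATH; the file name is hypothetical.

      ```python
      # Dump a clip's container/stream metadata as JSON and pull out the
      # color tags. Assumes ffprobe is on PATH; file name is hypothetical.
      import json
      import subprocess

      out = subprocess.run(
          ["ffprobe", "-v", "quiet", "-print_format", "json",
           "-show_format", "-show_streams", "DSCF0001.MOV"],
          capture_output=True, text=True, check=True,
      ).stdout
      video = next(s for s in json.loads(out)["streams"]
                   if s["codec_type"] == "video")

      # These keys only appear when the file is actually tagged.
      for key in ("color_space", "color_transfer", "color_primaries"):
          print(key, "=", video.get(key, "<untagged>"))
      ```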
  12. It looks like your 18-25 might be a Sigma SA mount lens, perhaps mislabeled as Canon EF by the seller. (That's based on looking up pictures of lens mounts; I don't actually have the lens in SA mount.) I found this page, which has a detailed pic of an SA mount: http://conurus.com/forum/sigma-sa-mount-thread-t266-15.html?sid=90c0074805e17063cfbc549a5315e1b0
  13. Agreed. The F4 is the only piece of my kit that I use daily, so it was worth every penny. But I have seen ridiculously cheap used H6s, and if you only rarely use it, or really want something handheld, it might be a good option.
  14. I often record dialog for video games and sometimes voiceovers for film projects. I have a corner of my room lined on a couple sides with thick wool sleeping bags (really dense and heavy). I record with an AKG CK93 running into a Zoom F4 used as a USB audio interface, usually with Reaper as the software. I monitor with MDR-7506's. It sounds great, and it's extremely budget-efficient, as all of the components are things that I use on set. Well within 3 figures if you look for used equipment.

      - You could switch out the mic for a cheaper cardioid (or omni, if you have to), but stay away from shotguns indoors.
      - The Zoom F4 can be swapped for a cheaper H6.
      - You can use Audacity to record for free, though I highly recommend Reaper if you do any audio post work at all. It's phenomenal!
      - You may want a pop filter. I haven't gotten one yet.

      I would definitely get ambient sound to fill the silence if there is no other audio playing. You can just get ambient sound from the room where you are doing the VO. It will probably just be faint hiss, but it will remove that "jarring" factor of silence. If you go with laboratory sounds, you could record them in stereo. That way the mono VO will stand out better against the ambient sounds.
  15. Yeah, I've been watching those developments closely. Price is still a mystery, but hopefully we'll hear within the next few weeks. I'm very excited, because the two main problems I have with the E2 are the smaller sensor and the low MP count. I'd like to have decently high-resolution still images, and close to the native FOV with my vintage FF lenses. An S35 version with an M43 mount seems to be something they are looking into, though they seem more interested in EF at the moment. So maybe we'll finally have that spiritual successor to the LS300 as far as lenses are concerned.
  16. Why don't you just go from FD to MFT? In general, those FD lens -> EF camera adapters with glass will reduce quality, so they aren't recommended unless absolutely necessary.
  17. I do this all the time. With Nikon F to Canon EF adapters, you can use a flathead screwdriver to tighten the leaf springs before mounting the lens, so that the adapter has zero play. It is just as physically solid as if the Nikon lens were a native Canon lens.
  18. There is not much visual difference between 24 and 25 if you stick to one or the other, but you should not mix and match them. Use 25 for PAL television, or 24 if you want to use the film convention. 30 looks slightly different on scenes with a lot of motion.

      I am almost 100% certain YouTube keeps the original frame rate. I think it can even handle variable frame rate? I could be mistaken on that one, though. I would be very surprised if Vimeo converted, but I don't use Vimeo so I am not positive.

      Yes, exactly. You can speed it up. If you speed it up 2x, it will look a little jerky simply because it is essentially a 15 fps file now. If you speed it up to some weird interval, like 1.26x, then there will be frame blending, and that will probably look pretty bad depending on the content of your shot (a static shot with nothing moving won't have any issues with frame blending, whereas a handheld shot walking down a road will look really bad).

      Technically, yes, you can do that. If you want your final product to be a 48 fps file, and you are sure such a file is compatible with your release platform(s), then it should work. I think it is a phenomenal idea to try out as an experiment, but definitely test it thoroughly before doing this on a serious/paid project; there is a very good chance it will not look the way you expect if you are still figuring stuff out. Also, for any project, only do this if you want the large majority of your 30 fps clips sped up. If you want MOST to be normal speed and a COUPLE to be sped up, then use a 30 fps timeline and speed up those couple clips.

      If I were you, I'd go out one weekend and shoot a bunch of stuff at different frame rates, then try out different ways of editing. Just have fun, and try all the things that we tell you not to do and see what you think of our advice!
  19. Totally agree, that's one reason it's easier to conform rather than manually slow down. You conform TO the desired frame rate, so you never have to wonder whether you need to slow by 50 or 40 percent.
  20. Not a silly question at all. Basically it just means telling your editing software to play the frames in the file at a different rate. Example: if you shoot for 2 seconds at 60 fps, you have 120 frames total. If you conform to 30 fps, it is like telling the software that you actually shot at 30 fps for 4 seconds to get those 120 frames. Now as far as the software is concerned, it's a 30 fps file (which happens to look slowed down); the arithmetic is spelled out below. So for slow motion, I find it easiest to select all the high frame rate files in your bin before putting them on a timeline, and conform them to the timeline frame rate. Then when you drag them onto the timeline, they will be slowed down automatically.
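      The same example as a few lines of arithmetic (numbers taken from the post above, nothing camera-specific):

      ```python
      # Conform = same frames, new clock.
      shot_fps = 60
      timeline_fps = 30
      shot_seconds = 2

      frames = shot_fps * shot_seconds            # 120 frames captured
      conformed_seconds = frames / timeline_fps   # 4.0 s of playback
      speed = timeline_fps / shot_fps             # 0.5 -> plays at 50% speed

      print(frames, conformed_seconds, speed)     # 120 4.0 0.5
      ```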
  21. YouTube will play anything. If you shot 23.976, then stick with that. I don't know for certain, but I bet Vimeo will also play anything. The only time you really have to be careful when deciding which format to use is with television broadcast or specific festivals, since modern computers can play anything. For slow motion, you can shoot anything higher than your timeline frame rate and conform it. If your NLE has the option, you should conform footage instead of manually slowing it down. That way you will avoid any frame artifacts in case your math wasn't correct. But to directly answer the question, slowing 59.94 to 23.976 is a great way to get slow motion.
  22. You should shoot with your distribution frame rate in mind. If you are shooting for PAL televisions, then you should shoot in 25 fps for normal motion. If you want 2x slow motion, shoot in 50 and conform to 25. If you want 2.3976x slow motion, shoot in 59.94 and conform to 25, etc. (I know you aren't talking about slow motion, I just mention it to be clearer.)

      Essentially, at the beginning of an edit you will pick a timeline frame rate to edit in, based on artistic choice or the distribution requirements. Any file that is NOT at the timeline frame rate will need to be interpolated to some extent to play back at normal speed. Mixing any two frame rates that are not exact multiples of each other will result in artifacts, though there are ways to mitigate those problems with advanced interpolation algorithms.

      So you shouldn't mix 23.976 and 59.94. If you have a 23.976 timeline, the 59.94 footage will need to be modified to show 2.5 video frames per timeline frame. You can't show .5 frames, so you have to do some sort of frame blending or interpolation, which introduces artifacts. Depending on the image content, the artifacts might not be a problem at all. The same would apply for putting 23.976 footage on a 29.97 timeline, or any other combination of formats; the check below makes the ratio explicit. The only way to avoid all artifacts completely is to shoot at exactly the frame rate you will use on the timeline and in the final deliverable, or conform the footage for slow/fast motion.
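      A quick way to check whether a given combination divides cleanly (the function name is mine, not from any NLE):

      ```python
      # If frames-per-timeline-frame isn't a whole number, the NLE has to
      # blend or interpolate to play the clip at normal speed.
      def frames_per_timeline_frame(clip_fps: float, timeline_fps: float) -> float:
          return clip_fps / timeline_fps

      print(frames_per_timeline_frame(59.94, 23.976))  # 2.5 -> blending needed
      print(frames_per_timeline_frame(50.0, 25.0))     # 2.0 -> clean multiple
      print(frames_per_timeline_frame(23.976, 29.97))  # 0.8 -> blending needed
      ```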
  23. It is true that MFT specifies a thicker sensor stack than most other formats, which means that adapting, say, an EF lens straight to MFT will not result in optimal performance, especially with wide angle lenses wide open. It is similar to how the prism in the Bolex 16mm reflex cameras made non-RX lenses look soft. It's also the reason Metabones made a special Speed Booster for Blackmagic cameras: BM used a stack that was not the standard MFT thickness. Not sure about microlenses, though, whether they are part of that stack thickness or what.
  24. That's a good question. I assume BRaw is always lossy, because none of the official Blackmagic info I've seen says it's mathematically lossless, but at Q0 the data rate is as high as some lossless formats. Of course, higher ratios like 12:1 must be lossy. I think calling BRaw "RAW" is misleading, but I fully support the format. The image quality is great at 12:1, and the use of metadata sidecar files could really improve round tripping even outside of Blackmagic software. Back when I used a 5D3 and the choice was either 1080p in 8 bit H.264 or uncompressed 14 bit RAW, the latter really stood out. Nowadays, with 4K 10 bit in HEVC or ProRes, I see no benefit to the extra data that lossless raw brings. BRaw looks like a really good compromise.