KnightsFan
Everything posted by KnightsFan
-
Sounds like a nice weekend project then. Here's the program so far: http://gobuildstuff.com/CropFactorApp/. It takes a really long time to load right now; I'll try to improve that as well.
-
I made a little 3D simulation once that showed a comparison between different focal lengths on different size sensors. Maybe I should look into adding anamorphic and/or more specific camera sensor sizes? Would anyone be interested?
-
This is definitely as far from real TV as we could make it. On the technical side, it's 24p with a 1.89:1 ratio. Story-wise it kinda has a bit of everything.
-
It's all done, just need to upload. I think it's 10 episodes for season 2. Episode lengths range from about 5 min to 20 min.
-
I've done many projects that were edited in Premiere and finished in Resolve. What I do is use an XML to move the edit back and forth. If you've already started editing, export an XML from Resolve and hand it over with all the footage to your friend, who will import that XML into Premiere. He'll probably need to do some relinking, but usually once you find one missing file it'll automatically find the rest if you keep the folder structure intact. I find that keeping the same folder structure is the most important thing for a smooth collaboration. If you haven't started the edit, then your friend can just start it in Premiere. Once he finishes the edit, he exports an XML, which you import into Resolve. You'll likely have to fiddle with export and import settings on both ends, so I recommend trying the workflow before committing to it.

Keep in mind that most effects will not transfer over. Simple fades usually work, but things like Warp Stabilizer or color correction won't. I have read that Blackmagic has made a free plugin so that BRAW can be read in Premiere. If that's true, you won't need to do any transcoding, which would make things easier.

I would color grade after the edit is done: let him edit the SOOC footage, and then color it once it gets back to you. If you want to start coloring straight away, one trick is to use remote grades in Resolve. If you haven't used them before, basically a remote grade attaches the color grade to the source file instead of to a timeline clip. You can start coloring clips right away, and those grades will link up with the files automatically once the XML is ready and imported.

Since you mention a band, keep in mind that most music is 44.1 kHz, while Resolve operates strictly at 48 kHz and tends to produce unpleasant artifacts when resampling. When I edit music videos in Resolve, I do the sound separately and then mux audio and video using ffmpeg so that I can keep the native sample rate for the music track.
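As a rough sketch of that mux step (all filenames here are placeholders, and the first two commands just generate synthetic stand-ins so the example runs on its own):

```shell
# Generate stand-ins for the real files: a 1 s test clip for the
# picture edit and a 1 s 44.1 kHz tone for the music mix.
ffmpeg -y -v error -f lavfi -i testsrc=duration=1:rate=24 \
  -pix_fmt yuv420p -c:v libx264 edit.mov
ffmpeg -y -v error -f lavfi \
  -i "sine=frequency=440:duration=1:sample_rate=44100" mix.wav

# The actual mux: -c copy stream-copies both tracks, so the audio
# keeps its native 44.1 kHz sample rate -- no resampling, no re-encode.
ffmpeg -y -v error -i edit.mov -i mix.wav \
  -map 0:v:0 -map 1:a:0 -c copy -shortest final.mov
```

Because nothing is re-encoded, the mux itself is effectively instant even on long timelines.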
-
Season 1 is 4 episodes and will be out on Amazon on Dec 20. Season 2 is considerably longer (and much better quality, we actually knew what we were doing by then!) and will be out soon after New Year's.
-
Problem With Samsung T5 Drive After Using It With iPad
KnightsFan replied to BTM_Pix's topic in Cameras
Does it show up in Disk Utility at all? If so, does it give any information? I ran into issues like this a LONG time ago when I unplugged a drive from a Windows computer without ejecting it, and then Mac wouldn't show it automatically. I think in that case I could still see it in Disk Utility and it gave some information about the drive. Also, have you tried a different cable? I have a Micro HDMI cable that works everywhere, except it suddenly stopped working with my NX1 about a year ago. Do you have other USB devices plugged into your Mac that would be forcing it to only allocate USB 2.0 resources to the T5? I would expect it to still work, but it's another easy thing to try.
-
Problem With Samsung T5 Drive After Using It With iPad
KnightsFan replied to BTM_Pix's topic in Cameras
What exactly happens when you plug the T5 into a Mac or PC? Does it show up in the device manager (or whatever the Mac equivalent of that is)? Any noises or alerts? What is the drive formatted in--exFAT, HFS+, etc.?
-
I've been saying for some time (including earlier in this topic) that the roadblock to mainstream VR is bulky equipment. Facebook just announced controller-free hand tracking for the Quest, coming this week. They're talking about the resolution of the recorded 360 video. Wrapping an image around you for a 360x180 panoramic stream really benefits from 8k, since you're only seeing a small portion of it at any time. What platform do you target in VR? My day job involves developing for the Quest, so we have pretty strict hardware limitations. I imagine developing for a traditional headset hooked up to a gaming PC gives you more room for better assets and rendering.
-
I haven't seen any measurements of the DR of the 5D3 raw, but that seems about right based on my comparisons to the BM 2.5k. I tried that ISO mode a bit, but I didn't like it and never used it on a project; I can't remember exactly why.
-
Magic Lantern shoots native 14 bit DNG image sequences and puts each sequence into a .MLV container. If you mount the container as a drive, you literally copy/paste DNG frames out of the container--you don't actually have to convert anything. For 12 and 10 bit, Magic Lantern simply truncates the 14-bit sample words (dropping the least significant bits), which is why there is a DR penalty. (Or at least that was the case last time I used it, which was a few years ago.) I can dig up some of my old footage if you can't find any samples online.
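A toy numeric sketch of why truncation costs dynamic range (this is just arithmetic to illustrate the idea, not Magic Lantern's actual code):

```shell
# Truncating 14-bit samples to 10 bits drops the 4 least significant
# bits, so distinct deep-shadow values collapse together -- that lost
# shadow precision is where the DR penalty comes from.
a=5; b=12                # two different 14-bit shadow values (0..16383)
a10=$(( a >> 4 ))        # both become 0 after 10-bit truncation
b10=$(( b >> 4 ))
echo "$a -> $a10, $b -> $b10"
```

Highlight values near the top of the 14-bit range survive the shift essentially intact, which is why the penalty shows up in the shadows rather than the highlights.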
-
Here is the third and final promo video. The first season should come out on Dec. 20.
-
There are a lot of free SFX packs, including swishes and impacts, on https://opengameart.org/. I also have a hobby of recording sounds and putting them up on my site, and I'd be happy to make some specific sounds for you if you'd like.
-
The second video is out!
-
The first short promo video is out. It's just under 1 minute long. Enjoy!
-
I'm not sure what you are disagreeing with; my post was about how much physical space a TV set takes up compared to a headset. A large TV is always going to take up a huge amount of wall space, whereas a VR headset can be tiny but give the same impression of a large screen set a reasonable distance away. With a headset, I can experience a virtual 85" TV set 10' away from me, while sitting in the cramped back seat of a car. Our headset technology just isn't there yet.
-
I don't know about supplant, but I think VR has huge potential once the physical limitations of current headsets are overcome. Have you ever done anything in VR, and if so, how was that experience for you?

The big problem with VR now is that headsets are cumbersome. When I first tried it, you had like 10 wires going from headset to PC, headset to handsets, and you had to place sensors around the room. Now with the Quest, everything is wireless, but it's still big and heavy to wear on your head for a long period of time, the refresh rate and the resolution aren't quite convincing, and the low specs of the device mean that, at least for game content, graphics aren't great. We aren't going to have 4k fully ray traced environments on the Quest anytime soon. In the near future, these problems will be solved. Headsets will be wireless, lightweight, and cheap.

One problem with flat screens is that they are huge. A 60" 4k TV is huge, and while technology is making them thinner and lighter, there is no way to get around the fact that a 60" TV takes up 60" of wall space. Even a projector, which takes up no physical space, requires space to show the image. On the other hand, if we can get a headset the size and weight of sunglasses, you can watch a 2D movie in a "virtual theater" anywhere, any time. Imagine wearing ordinary glasses that convincingly make it look like there is a 60" TV at the proper viewing distance, showing traditional film content. You can fold it up and put it in your pocket. You can watch theater quality images on a plane. You can watch 3D movies with no extra effort or equipment. You can watch them lying down.

I think that's when we'll see the paradigm shift away from screens towards headsets. Once people already have headsets, I think they will show more interest in actual VR content.
-
I'm grateful for the abundance of cheap equipment and software. But I'm also grateful to the thousands of people who spend their time freely sharing knowledge on sites like this, from the retired pros with decades of industry experience, to experimental newcomers with more ideas than experience, and everyone in between.
-
First, 8k is better than 4k for every application if you don't sacrifice frame rate and quality. It's just a question of whether it's worth the extra cost, and at this point, the vast majority would agree that 8k's expenses outweigh its benefits compared to 4k, both in cameras and on screens. Exactly. And not just refresh rate, but latency as well. You can feel the lag between moving your head quickly and seeing the change with current Oculus headsets. If I recall, it's something along the lines of 1k resolution per eye on the Quest, and while you can definitely see the pixelation, it is not that distracting when playing games. Yeah, I see higher resolution as more important on the creation side than the viewing side. I'd rather physically move my head to look at different parts of a video timeline than click and scroll back and forth with a mouse.
-
#millcore is a comedy web series I worked on with some friends several years ago, which we are at long last putting on the actual web. There's a lot of content coming, so I'll keep this updated as the episodes roll out. There is mild language. Hope you enjoy, and feel free to share your thoughts!

This was definitely a fun and educational process. It was done with a $0 budget and mostly equipment we owned. There wasn't really a crew--it was basically written and acted by the people on screen, and I was the crew. If I recall, there were only 2 out of 12 days where there was another person on set who wasn't acting. I shot the entirety of it on my NX1 using a couple prime lenses. We were fortunate enough to borrow lav mics and LED lights.
-
Definitely out of my budget, but I am glad to see another foray into global shutter. Sounds like it will be global/rolling switchable based on the language Land used. I don't think any other cameras have that? Blackmagic tried with the Ursa 4.6k and cancelled it (though their later G2 update has such low rolling shutter it's global in all but name). I wonder if it is a global shutter sensor, or whether they use some other technology like their old motion mount.
-
Yes, that makes perfect sense. With most codecs you can losslessly trim a portion out. It can be done with ffmpeg; I've done it for archiving GoPro footage from long events. I don't know whether any mainstream editing software can do this--there is an option to "only re-encode when necessary" in Resolve, but I haven't tried it. With All-I codecs such as ProRes, this should be possible with cuts on any frame, but with long-GOP (inter-frame) codecs, you would need to begin your trim on an I-frame. Unlike an uncompressed format like your TIFFs, cutting a portion out of a compressed codec won't stay lossless across edits. For example, you can't color correct it and maintain complete fidelity with this method. On the other hand, most codecs will not show any visible degradation from a single re-encode. Avoid it when possible, but it's usually not the end of the world if you only do it a few times.
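A sketch of such a lossless trim with ffmpeg (filenames are hypothetical, and the first command only builds a synthetic clip so the example runs standalone):

```shell
# Build a short synthetic source clip (4 s, keyframe every 12 frames):
ffmpeg -y -v error -f lavfi -i testsrc=duration=4:rate=24 \
  -pix_fmt yuv420p -c:v libx264 -g 12 full.mp4

# Trim 2 s starting at t=1 without re-encoding: -c copy passes the
# compressed frames through untouched. On long-GOP sources the copy
# has to start at a keyframe, so the cut may snap to the nearest one.
ffmpeg -y -v error -ss 1 -i full.mp4 -t 2 -c copy trimmed.mp4
```

Because no decoding or encoding happens, the trim is both lossless and nearly instantaneous even on multi-hour recordings.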
-
There are lossless formats, including TIFF image sequences. The downside is that file sizes will be astronomical. What type of intermediates are you using, and can you use a proxy workflow instead? Using proxies instead of uncompressed intermediates will have the same fidelity and be a lot less taxing on hard drives.
-
Discovery I Made About Distribution A While Ago
KnightsFan replied to Zach Goodwin2's topic in Cameras
Right, what I said was that if the screen "doesn't support" a given aspect ratio as a native file, then you add black bars manually and the problem is solved. The idea that a 2.39:1 movie can be shown on a 1.85:1 screen but not vice versa is incorrect.
-
Discovery I Made About Distribution A While Ago
KnightsFan replied to Zach Goodwin2's topic in Cameras
Any screen can show any aspect ratio. There will be either pillarboxing or letterboxing if the movie doesn't match the screen. If for some reason a screen does not support your file's aspect ratio (e.g. if you picked something non-standard), then you can manually add the black bars yourself. There is no technical reason why a movie should be incompatible with a theater because of aspect ratio.
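For instance, the black bars can be baked in with ffmpeg's pad filter (sizes and filenames here are just illustrative):

```shell
# Synthetic ~2.39:1 source picture (640x268):
ffmpeg -y -v error -f lavfi -i testsrc=duration=1:rate=24:size=640x268 \
  -pix_fmt yuv420p -c:v libx264 wide.mp4

# Letterbox it into a 16:9 frame (640x360): pad fills the extra area
# with black and the (ow-iw)/2:(oh-ih)/2 offsets center the picture.
ffmpeg -y -v error -i wide.mp4 \
  -vf "pad=640:360:(ow-iw)/2:(oh-ih)/2" -c:v libx264 boxed.mp4
```

The same filter handles pillarboxing; only the target width/height change.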