Everything posted by joema
-
No, but I will try that next. I had the LCD panel pulled out, so heat from the panel itself should not have contributed much, but it's possible the drive electronics for the LCD produce more heat when the screen is bright. I also have an Atomos recorder, so I will see if that makes any difference, especially when triggering recording from it vs. the camera.
-
I have done more testing today, and the A7RII overheats in 4k mode more readily than I first thought. It does not take extreme temperatures or direct sunlight. Today it overheated multiple times at 75F ambient, with the LCD pulled out, in the shade, with a light breeze blowing on the camera. From a "cold" start you can record for at least 29 minutes (as the manual says); how far beyond that you can go is unpredictable and depends on various factors.

Unfortunately the camera provides no thermal trend indicators, so you cannot monitor how close it is getting to the limit. You only know when it is so hot that shutdown is imminent. If you turn it off for a few minutes, you have no way of knowing how much additional 4k operating time that buys you, because the camera doesn't tell you.

This is not generally a problem for hand-held "run and gun" work, most documentary shooting, or most interview shooting. Takes are usually short, and the typical ratio of shooting time to non-shooting time gives the camera time to cool down. For interviews you probably would not need 4k anyway.

But it is definitely an issue on longer-form material. 4k is attractive on medium and wider shots because it allows a lot of cropping and recomposing in post for 1080p productions. That encourages using 4k on longer static shots, which runs into the thermal limit. Even if the overheating itself can't be fixed in firmware, there is no question a firmware update could provide thermal trend indicators showing how close to the limit you are getting and how quickly. The camera obviously knows this, because it warns at one temperature threshold and shuts down at another.
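To illustrate what such an indicator could do with data the camera already has, here is a minimal sketch. The thresholds, sampling interval, and temperatures are all invented for illustration; this is not Sony's firmware logic:

```python
# Hypothetical sketch of a thermal trend indicator: given periodic
# sensor-temperature samples, estimate the rate of rise and project
# minutes remaining until the shutdown threshold.
# All names and threshold values below are invented for illustration.

WARN_THRESHOLD_C = 65.0      # assumed warning threshold
SHUTDOWN_THRESHOLD_C = 70.0  # assumed shutdown threshold

def minutes_to_shutdown(samples, interval_min=1.0):
    """samples: recent sensor temps in C, oldest first, one per interval_min minutes."""
    if len(samples) < 2:
        return None
    # Simple linear trend: average rise per minute over the window.
    rate = (samples[-1] - samples[0]) / ((len(samples) - 1) * interval_min)
    if rate <= 0:
        return float("inf")  # steady or cooling: no projected shutdown
    return (SHUTDOWN_THRESHOLD_C - samples[-1]) / rate

print(minutes_to_shutdown([58.0, 60.5, 62.8, 64.9]))  # ~2.2 min at this rate
```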
-
This is clearly documented in the manual, on page 95 under "Notes on continuous movie recording". There is a table of ambient temperature vs. recording mode showing the possible recording time as limited by internal sensor temperature.

I just shot a 1-hour video (two contiguous 29:59 segments) outside at 4k using Super 35 mode on my A7RII, and it had no problems. Ambient temperature was about 65F. I did have the LCD panel pulled away from the body, which I needed to frame the shot. That took about 90% of the single internal battery; for longer shooting I'd use the battery grip or an external power source. Last weekend I shot 4k documentary material all day and never had any problems.

In general, most real-world professional material is shot in fairly short clips. For example, every feature film from the dawn of cinema until 2002 typically used 1000 ft maximum film magazines, which limits a continuous take to about 11 minutes at 24 frames/sec (see the arithmetic below). After that they'd have to stop and change magazines. With digital video we have become accustomed to shooting longer takes, but it is usually not needed except for recording an entire stage play, etc. For long-form recording of a lecture, etc., you normally wouldn't use 4k anyway, as it burdens post production with a huge amount of material for little practical benefit.
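For anyone who wants to check the 11-minute figure (35mm film runs 16 frames per foot):

```python
# Runtime of a 1000 ft 35mm magazine at 24 frames/sec.
FRAMES_PER_FOOT = 16                 # standard 4-perf 35mm
frames = 1000 * FRAMES_PER_FOOT      # 16,000 frames per magazine
minutes = frames / 24 / 60
print(f"{minutes:.1f} min")          # 11.1 min
```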
-
What's The Best Camera For Shooting A Low Budget Movie?
joema replied to fuzzynormal's topic in Cameras
Excellent advice. Also, some well-known movies have used basic cameras. Much of The Blair Witch Project ($248 million gross) was shot on a $500 consumer-grade RCA Hi-8 camcorder. Several movies were shot on the Panasonic AG-DVX100 standard-def camcorder: http://en.wikipedia.org/wiki/Panasonic_AG-DVX100
-
Now you can transcode to 4K ProRes over 3x faster with FCPX
joema replied to Andrew Reid's topic in Cameras
My statement was "Traditionally GPU acceleration has limited benefit for transcoding", which is correct. This is about transcoding specifically, not GPU acceleration in general or for rendered effects. There is no doubt the GPU can greatly accelerate certain effects. For example, applying color correction to every pixel of a frame is parallelizable, so the many parallel elements in a GPU can help. However, there is little evidence that a GPU accelerates transcoding greatly. In FCP X, monitoring the GPU with iStat Menus shows little activity during export.

In a recent interview with editor Scott Simmons, Andrew Page (nVidia Product Manager for Professional Video Technologies) explained: "there are a lot of tasks that can't be broken down to be parallel: encoding is one, decoding some of the camera compressed formats is another one...what happens in frame 2 depends on frame 1, we do something to frame 1 then feed the results into frame 2....that's pretty much an encoding problem...everything is interrelated, so we can't break it up into lots of different simultaneous things." (That Studio Show podcast, 5/20/14: https://itunes.apple.com/us/podcast/that-studio-show/id293692362?mt=2)

Maybe eventually some clever developer will figure out how to more effectively harness the GPU for encoding, but as of today it hasn't happened to a great degree.
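As a toy illustration of the difference (not how any real codec works): the first function below is per-pixel and trivially parallel, while the second has the frame-to-frame dependency chain Page describes, which forces sequential execution:

```python
# Toy illustration only -- not a real encoder.

def color_correct(frame, gain):
    # Every pixel is independent of every other pixel, so this work
    # could be spread across thousands of GPU threads at once.
    return [min(255, int(p * gain)) for p in frame]

def delta_encode(frames):
    # Each frame is stored as its difference from the previous frame,
    # so frame N cannot be produced until frame N-1 is done --
    # an inherently sequential chain.
    encoded, prev = [], [0] * len(frames[0])
    for frame in frames:
        encoded.append([p - q for p, q in zip(frame, prev)])
        prev = frame
    return encoded

frames = [[10, 20, 30], [12, 21, 33], [15, 25, 40]]
print(color_correct(frames[0], 1.5))  # [15, 30, 45]
print(delta_encode(frames))           # [[10, 20, 30], [2, 1, 3], [3, 4, 7]]
```
-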
Now you can transcode to 4K ProRes over 3x faster with FCPX
joema replied to Andrew Reid's topic in Cameras
OK, so Quick Sync and I/O are not factors here. That leaves CPU and GPU. Traditionally GPU acceleration has limited benefit for transcoding. In fact, Extremetech.com had a detailed article titled "The Wretched State of GPU Transcoding": http://www.extremetech.com/computing/128681-the-wretched-state-of-gpu-transcoding

Maybe someone with algorithmic knowledge of this (like jcs) could comment on why GPUs don't help more for this task. For any given task, it must (a) map well to the GPU programming model, and (b) the programmer must take advantage of this. Not all tasks are amenable to GPU acceleration; if any part contains inherently sequential algorithms, the GPU (which is parallel) may not help.

Assuming mostly CPU-bound multithreaded transcoding, that leaves two issues:

(1) Why were the quad-core nMP and retina iMac not considerably faster than your MBP? Your MBP is pretty fast -- an i7-4850HQ quad-core CPU with a max turbo speed of 3.5 GHz. The Geekbench 2 multi-core numbers shown in MacTracker (https://itunes.apple.com/us/app/mactracker/id311421597?mt=8) don't indicate a big difference between your MBP, the 2014 nMP, and the retina iMac (especially if the iMac had the 3.5 GHz i5). i7-based Macs have hyperthreading, which I've tested on FCP X export: it makes an approximately 30% difference between turning it on and off via the CPUSetter utility: http://www.macupdate.com/app/mac/48580/cpusetter So the iMac (if an i5) could have been disadvantaged by that.

(2) Why was FCP X much faster than EditReady and all other methods? Careful coding can produce great improvement, as seen in Handbrake's x264 implementation, which is very fast even though it's software-only. If a developer is willing to profile the app, produce an execution histogram, and re-write any hot spots (maybe even in assembler), significant improvements are possible. Maybe the FCP X developers did that.
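If you want to take the machine-to-machine comparison further, here is a rough benchmarking sketch. It assumes ffmpeg is installed; the clip name and ProRes encoder settings are placeholders for whatever you actually test with:

```python
# Rough transcode benchmark for comparing machines. Assumes ffmpeg is
# installed; the input clip and encoder settings are placeholders.
import subprocess, time

cmd = [
    "ffmpeg", "-y", "-i", "test_clip_4k.mov",
    "-c:v", "prores_ks", "-profile:v", "2",   # ProRes 422
    "-c:a", "copy", "out.mov",
]

start = time.monotonic()
subprocess.run(cmd, check=True)
print(f"Transcode took {time.monotonic() - start:.1f} s")
```
-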
Now you can transcode to 4K ProRes over 3x faster with FCPX
joema replied to Andrew Reid's topic in Cameras
Transcoding is mostly CPU-bound, and only certain aspects benefit from the GPU. Intel's Quick Sync is integrated with their on-chip GPU, but it's not a GPU algorithm -- it's essentially an on-chip ASIC for H.264 transcoding. The Xeon CPUs in the Mac Pro don't have Quick Sync, so it's common for H.264 transcoding to not benefit, or even run slower, on those machines despite the greater GPU resources.

EditReady uses Quick Sync, so I'm surprised it's not faster on the 5k iMac, since that CPU is faster than the 2.3 GHz i7 in the MBP. However, Quick Sync only works for single-pass H.264 (at least on the encoding side). Is it possible you are using multi-pass encoding? That would flatten the performance differences. In FCP X, single pass is selected by choosing "H.264 faster encode". In Compressor the video properties for the preset can be adjusted between single pass and multi pass; only single pass uses Quick Sync. However, even with Compressor adjusted to the same video properties as FCP X, it's still not as fast at exporting. I don't know how, or if, a similar adjustment can be made in EditReady.

Another possibility is that the transcode is bottlenecked on I/O, so further CPU improvements are flattened. Normally that doesn't happen, since H.264 is quite compressed, which limits the I/O -- but with 4k it could be happening. If you are testing each of these systems from files on a little bus-powered USB portable hard drive, that could be capping the transcode rate due to I/O. Some of those HDDs are quite slow, even over USB 3. The fastest ones (HGST Touro S) do about 130 MB/sec, but most are a lot slower.
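To rule multi-pass in or out on your own files, here's a sketch using ffmpeg's standard two-pass flags. File names are placeholders, and this uses the libx264 software encoder rather than Quick Sync, but the single-pass vs. two-pass timing difference is the point:

```python
# Compare single-pass vs two-pass H.264 encode times, to see whether
# multi-pass encoding is flattening the differences between machines.
# Assumes ffmpeg with libx264; file names are placeholders.
import subprocess

src = "test_clip_4k.mov"

# Single pass:
subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "libx264",
                "-b:v", "20M", "single_pass.mp4"], check=True)

# Two pass: the first pass only writes rate-control stats
# (use NUL instead of /dev/null on Windows).
subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "libx264",
                "-b:v", "20M", "-pass", "1", "-an", "-f", "mp4",
                "/dev/null"], check=True)
subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "libx264",
                "-b:v", "20M", "-pass", "2", "two_pass.mp4"], check=True)
```
-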
The 1DC produces 300 gigabytes per hour of 4k video using H.264 -- and that is compressed. Uncompressed 10-bit 4k video is roughly 3x that, about 1 terabyte per hour. A typical shooting ratio might be 50:1, so for a 1-hour production you'd need 50 hours of raw material. That equates to 15 terabytes of 4k H.264, or 50 terabytes of uncompressed 10-bit 4k (see the arithmetic below). Assuming you had the resources to manage and edit that, the final 1-hour product would be about 300 gigabytes of H.264. In theory H.265 might reduce that to 150 gigabytes.

To distribute that you'd have to give each customer three double-layer Blu-ray discs, or a portable hard drive. For online distribution, 150 gigabytes is most of the current data cap that broadband providers allow, so customers could download it but do little else for the rest of the month.

It makes sense for large studios to shoot 4k to "future proof" their material, even if they'll be delivering mostly 1080p for several years. This is similar to TV networks shooting color film long before color TVs were common. However, I don't see the cost/benefit tradeoff for small, independent, or amateur productions shooting 4k -- even if the camera supports it. The problem isn't the camera -- it's the asset management and distribution of 4k content.
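Spelled out, using the round numbers above:

```python
# The storage arithmetic above, using the post's round numbers.
H264_GB_PER_HR = 300        # 1DC 4k H.264
RAW_TB_PER_HR = 1.0         # uncompressed 10-bit, ~3x the H.264 rate
SHOOTING_RATIO = 50         # hours shot per finished hour

raw_hours = 1 * SHOOTING_RATIO
print(raw_hours * H264_GB_PER_HR / 1000, "TB of H.264")   # 15.0
print(raw_hours * RAW_TB_PER_HR, "TB uncompressed")       # 50.0
```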
-
For some time there have been widespread problems with YouTube buffering performance. This affects anyone wanting to distribute or view video content via YouTube. I did some testing, and here are my results.

Symptom: a consistent 24x7 lack of complete buffering on certain videos, independent of network speed or browser. I've tested Chrome 27.0.1453.110, FF 21.0, and IE 10.0.9200.16576 on 64-bit Windows 7 Home Premium SP1, with Flash 11.7.700.202. I have Comcast Extreme 105, which on speedtest.net consistently produces 115 megabit/sec down and 21 megabit/sec up. My ping is 19 ms with 1 ms jitter.

The behavior is incomplete or halting buffering of some videos in the YouTube client. It happens with some videos at some resolutions, but not others. In general, 480p and below FLV videos buffer slowly and incompletely. Some 720p and above MP4 videos buffer quickly and completely; others do not. The difference in network data rate is about 150-200 kbit/sec in the slow case vs. 70 megabit/sec in the fast case. It is obviously a YouTube client code issue, not a pure network or server-side issue: if you use the Firefox extension DownloadHelper (http://www.downloadhelper.net/), it bypasses the buffering problem on 720p and 1080p MP4 videos, despite using the same PC, OS version, network, browser, Flash, and time of day. Interestingly, it does not help on 480p and below FLV videos -- they download or buffer slowly.

You can monitor network speed with Windows Perfmon: Control Panel > Administrative Tools > Performance Monitor. Select "network interface", bytes received/sec and bytes sent/sec. Perfmon clearly shows that the YouTube client is throttling the download under some conditions. For each performance "counter" at the bottom, you may need to right-click and pick Properties > Data > Scale and select a scale factor so the graph fits vertically on the screen. If you right-click on the graph background and select Properties > Appearance > Color > Graph Background and choose light grey, you can then highlight a specific graph line by pressing CTRL-H.

However, different videos exhibit different behavior. This video currently (6-7-13, 10:30 AM CDT) shows the throttling behavior in all browsers, at all resolutions, 24x7: https://www.youtube.com/watch?feature=player_embedded&v=dQ51rE_ZUgo But THIS video shows throttling only at 480p and below -- at 720p and above there's no throttling, and it buffers completely and quickly IF played on YouTube; if played embedded on this forum, buffering remains slow: https://www.youtube.com/watch?v=Alm6D_6sd8k&list=PLC10321FC136BF7C1

480p and below are typically FLV files; 720p and above are usually MP4 files. Whether using the YouTube interface or DownloadHelper, I cannot buffer or download FLV videos quickly. 720p MP4 and above can be downloaded quickly with DownloadHelper, and sometimes the YouTube client will buffer them quickly, but usually not. While there may be issues with time of day, ISP traffic shaping, and local net capacity, this behavior seems independent of those. Chrome HTML5 also makes no difference.

I speculate YouTube has implemented client-side code that limits buffering in order to reduce overall load on their servers. In Perfmon you can see bursts of activity if you scrub forward, but buffering quickly stops. Unfortunately it's not working consistently: the difference between the slow case and the fast case is gigantic, and when the throttling code mis-predicts the needed buffer prefetch, it interferes with viewing videos, especially at 720p and above.
I posted the above symptoms on a Google help forum, but I doubt they'll respond.
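If anyone wants to reproduce the measurement without setting up Perfmon, here is a minimal scripted alternative. It assumes the third-party psutil package is installed, and the 30-second window is arbitrary:

```python
# Sample total NIC receive throughput once a second while a video
# buffers -- a scripted alternative to the Perfmon setup above.
# Requires: pip install psutil
import time
import psutil

prev = psutil.net_io_counters().bytes_recv
for _ in range(30):                 # watch for 30 seconds
    time.sleep(1)
    now = psutil.net_io_counters().bytes_recv
    print(f"{(now - prev) * 8 / 1e6:8.2f} Mbit/s down")
    prev = now
```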