Everything posted by Syme
-
Sounds like they have something to do with configuring video scaling filters. I highly doubt that there is actually a 4k mode, but modifying the configuration of the scaling filters could still be interesting. Where in the filesystem did you find these? Are you using an unpacked firmware upgrade .exe or examining the system through a remote terminal?
-
Hopefully it's somewhat obvious, but DON'T RUN THE BITRATE MODS ON THE NEW FIRMWARE UNTIL ONE OF THE DEVELOPERS CHECKS IT OUT. It would probably do nothing worse than require a restart, but there is a very small chance it could do something nasty, since the behavior is pretty much undefined at this point. Edit: I'm pretty sure the popular mods will refuse to install on the new version for now, but don't bypass that check.
-
There's nothing stopping someone from sending bayer RAW data over HDMI. You can send whatever data you want over HDMI, actually. It would appear as some crazy scrambled pattern to most HDMI devices, but the data would be there. Edit: I highly doubt they did that for this camera. Just saying it is possible.
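To illustrate the idea, here's a minimal sketch (Python with numpy; the frame size and packing scheme are purely hypothetical, not anything the camera actually does) of how 12-bit bayer samples could be packed into the 8-bit RGB pixels of an HDMI frame. A normal display would show it as noise, but a receiver that knows the packing could reverse it losslessly:

import numpy as np

# Hypothetical crop: 1920x1080 of 12-bit bayer samples (random data here).
bayer = np.random.randint(0, 4096, size=(1080, 1920), dtype=np.uint16)

def pack_bayer_to_rgb(bayer12):
    # Pack pairs of 12-bit samples into 3 bytes, i.e. one 8-bit RGB pixel.
    flat = bayer12.astype(np.uint32).ravel()
    a, b = flat[0::2], flat[1::2]              # two 12-bit samples per RGB pixel
    r = (a >> 4) & 0xFF                        # high 8 bits of sample A
    g = ((a & 0xF) << 4) | ((b >> 8) & 0xF)    # low 4 bits of A + high 4 bits of B
    b_ch = b & 0xFF                            # low 8 bits of sample B
    rgb = np.stack([r, g, b_ch], axis=-1).astype(np.uint8)
    # Result: a 1080x960 "image" carrying the full 12-bit data.
    return rgb.reshape(bayer12.shape[0], bayer12.shape[1] // 2, 3)

packed = pack_bayer_to_rgb(bayer)
print(packed.shape)  # (1080, 960, 3)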
-
Latest version includes telnet, which is great for exploring how the camera works. I really hope Sony doesn't crack down on this system in future firmware versions, but unfortunately they probably will.
-
Did it export really fast? Splitting without re-encoding should be about as fast as making a copy of the result. I don't know about MPEG Streamclip in particular, but that's a good way to tell in general.
-
I don't know. It's probably somewhat subjective what would be best for overall image quality. I'm not sure what a real log profile would do to macroblocking. The best settings for real usage will have to be determined by people actually using the camera.
-
There is no real hard limit to the bitrate of HEVC. Yes there is a maximum rate specified for each "level" of the standard, but that has nothing to do with the actual codec. What it really means is that all compatible HEVC decoders need to be able to decode at least 160mbit/s to meet the level 5.1 standard. That way any video that meets those specifications will play on any compliant decoder. It's a completely arbitrary number. No sane HEVC encoder cares about that part of the standard. It will happily spit out bits as fast as its algorithm and clock rate can keep up. The actual limit for a given piece of hardware or software depends on many things, but it almost certainly isn't exactly 160mbit/s. I hope that makes sense.
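To make that concrete, here's a tiny sketch (Python; the table values are the level/tier maximums from the HEVC spec as I remember them, so double-check before relying on them) showing that a level's "max bitrate" is just a table lookup that decoder vendors have to honor, not something the encoder computes or enforces:

max_bitrate_kbps = {
    # (level, tier): maximum bitrate in kbit/s a compliant decoder must handle
    ("5.0", "main"): 25_000,
    ("5.0", "high"): 100_000,
    ("5.1", "main"): 40_000,
    ("5.1", "high"): 160_000,
    ("5.2", "main"): 60_000,
    ("5.2", "high"): 240_000,
}

def fits_level(bitrate_kbps, level, tier):
    # True if a stream at this bitrate can be labeled with the given level/tier.
    return bitrate_kbps <= max_bitrate_kbps[(level, tier)]

print(fits_level(150_000, "5.1", "high"))  # True
print(fits_level(180_000, "5.1", "high"))  # False: would need a higher level label

The encoder itself just emits bits as fast as it can; whether the result can be labeled level 5.1 is bookkeeping on the decoder/compatibility side.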
-
Here are my predictions for rolling shutter with the latest firmware:
NX1 UHD: 32ms
NX1 DCI: 30ms
NX500 UHD & DCI: 19ms
NX500 2.5k: 16ms
NX1 & NX500 FHD 24-30p: 16ms or 11ms
NX1 & NX500 FHD 50-60p: 16ms or 11ms
NX1 FHD 100-120p: 8ms
I'd love to see how close I am to the real numbers.
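For what it's worth, the method behind estimates like these is just: rolling shutter is roughly the number of sensor rows read per frame times the time to read one row. A minimal sketch (Python; the row counts and per-line readout times below are my guesses for illustration, not measured values):

def rolling_shutter_ms(rows_read, line_time_us):
    # Rolling shutter ~= rows read out per frame * time to read one row.
    return rows_read * line_time_us / 1000.0

# All numbers here are illustrative guesses, not measurements.
print(rolling_shutter_ms(4320, 7.5))   # ~32 ms: full-height readout, downsampled to UHD
print(rolling_shutter_ms(2160, 9.0))   # ~19 ms: 1:1 cropped UHD readout
print(rolling_shutter_ms(1080, 10.0))  # ~11 ms: line-skipped/binned FHD readout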
-
Yes, it could be impossible to turn off. The entire processing chain from sensor to H.265 bitstream could be done without any input from the software (except for selecting resolution and bitrate). I would bet money that it is possible to turn off noise reduction, since Samsung would have to be more than a little bit crazy to do it like that. However, without proof, it remains possible, however improbable it might be.
-
Not impossible. Unlikely, but not impossible. It could be built into the HEVC encoder, for instance.
-
The "ASV Group" number is referring to one of a number of possible binnings of the chip, not the number of different voltages being specified. Two chips can have different ASVs and perform exactly the same, just with slightly different voltages. Take a look at http://www.anandtech.com/show/9146/the-samsung-galaxy-s6-and-s6-edge-review/2
-
Have you quantified any differences in the SoC itself? Apart from different version numbers the only differences I see are in memory (separate chip), audio codec (separate chip), and other board-level differences. Unless it's due to a lack of memory, as far as I can tell we still have no idea if there is a hardware reason (apart from power consumption) for the 4k crop on the NX500.
-
Their definition of "megapixels" seems a little bit fishy to me. I assume that means something like the number of samples used to measure the light field, but I'd rather know the size of the actual image it's going to create after turning the light-field into a 2D or 3D image. Seems like there must be a lot of "pixels" per actual pixel.
-
While in theory having a higher data rate means compressing less, and compressing less requires less work, that's definitely not the case in practice. I think it might be the case with software implementations of some codecs, but it is usually not true with dedicated hardware. I think it probably has to do with the fact that the codec in a camera has to guarantee a certain resolution and framerate regardless of how "difficult" the input is to compress. In other words, the codec isn't allowed to slow down if it has to compress more, so the opposite is also true: the codec isn't allowed to speed up if it gets to compress less. The designers had to pick an algorithm to implement HEVC that was stable (w.r.t. changing input) at the cost of flexibility (and also an algorithm that is easy to implement in hardware). This may or may not be the main reason why the NX1 is behaving like it is, but it's important to understand. p.s. I hope that makes sense. I'm not a hardware codec designer, so correct/clarify if I'm wrong.
-
Nice! That's some of the first hard evidence we have about what's going on under the hood. Sounds like my pessimistic guess about the SRP was probably right. It makes the most sense for it to be used for what Samsung doesn't already have a more efficient IP block for (like SAS). I kind of wish that I, too, had a Samsung camera to play with. I spent most of spring break investigating my NEX 5t, but Sony goes to great lengths to keep people out of their products (at least compared to Samsung), which makes them no fun to work with.
-
I don't think he has claimed to have proved that the CPU is different. It would be foolish for anyone to claim to know if the SoCs are the same or different without proof.

The possible differences between the SoC in the NX1 and the one in the NX500 will not necessarily appear in the source code that Samsung has released. The same source code can work with different hardware depending on how it is configured. What's more, there are probably (well, almost certainly) binary-only firmware blobs that are uploaded to run certain parts of the SoC that aren't directly controlled by the kernel. The kernel doesn't even have to include all the drivers needed to run the hardware. For all I know they might be using a user-space driver or proprietary kernel modules for some components, which would not appear in the GPL-required source. Or they might not. But you would have to carefully analyze the entire firmware image to find out.

There are many possibilities at this point. The silicon could be identical and fully enabled in both chips. It could be identical and configured differently by low-level binary firmware. It could be identical but with parts permanently (laser) disabled. It could be different. As far as I know it isn't possible to tell those four cases (or more I haven't thought of) apart for sure just from kernel code. Or maybe it is possible to prove. All we know right now is that the CPU cores themselves are functionally equivalent, which doesn't really matter anyway.

If anyone knows that I'm wrong, please correct me. I'm somewhat familiar with embedded Linux, but I'm not an expert by any stretch of the imagination.
-
I kind of hope you're trolling, but I feel compelled to respond anyway. tl;dr: Nobody has been making any implied promises, and your suggestion had already been tried and failed before you posted it.

First of all, nobody is claiming to be working on super-duper secret projects. Everyone has been very clear that they are doing this for fun in their spare time. From what I have read, nobody is releasing much since there isn't much interesting to release. For what it's worth, there have already been results, in particular from Otto and Vasile. Both of them made no promises yet delivered exactly what they said they expected to do. Vasile claimed to have found how to increase the bitrate, and provided a video as proof. Yes, he could theoretically have faked it, but I see no reason to think that at this point. Otto stated his intentions very clearly, and his results have already proven useful to people who want to take videos of events longer than 30 minutes (as well as anyone who wants to investigate their camera further).

You got radio silence on your suggestions because people were already working on that and ran into problems that you did not address at all. In case it's too much trouble for you to actually read what people have already written, the issue was that if the camera app is killed, it freezes the camera. It's hard to replace an app if you cannot even remove it. If you think you know how to prevent that, I would love to hear it. If you want "traction" in this discussion, try actually being helpful.

Regarding the DRIMeV SoC itself, I mostly agree with your educated guess. It is almost certainly the same die, and the ARM cores are 100% identical. Unfortunately that's not saying much at all, since most modern ARMv7 cores are identical as far as the software is concerned. It's the details of the other blocks that would determine whether or not the important components of the camera application run. I can't tell just from skimming the kernel source code.

It takes under 5 minutes to find and download the firmware update from Samsung's site. If you really don't have 5 free minutes in your life, I don't understand why you are wasting time criticizing people who have actually accomplished things and posted them here. If you are half as smart as you make yourself sound, it should take you less than an hour to unpack all the components of the firmware. I had never unpacked a firmware before, but it only took me two evenings to get all the major pieces out. The offsets, sizes, compression, and filesystems are almost all documented right here in this forum, so there is no excuse not to take a look if you really want to test an idea.

I totally agree that the ratio of people waiting with high expectations to the people working on it is rather high. The Magic Lantern forums are just like that on a larger scale, but that doesn't change what they have or have not done. You need to keep in mind that this forum is a community of filmmakers, not hackers (or even "photographers"). The point of this thread in the first place was to show interest in modifying a camera, at which I'd say it has admirably succeeded!

Finally, nobody cares that you're a busy person. So you like your HoloLens and want to play with it. Cool story. We're all busy people too, so suck it up and either contribute or go back to lurking.
Sorry for the tone, but I can't stand people who drop into discussions, claim they could do it all better if only they didn't have so many more important things to do, and yet contribute nothing useful. /rant
-
That's what I mean (though it's in 12-bit mode for video IIRC).
-
For internal recording it's highly unlikely. It would require extensive modification to the digital design of the video encoder. I won't say it's strictly impossible, since the NX1 supposedly uses a re-configurable processor for some tasks. Unfortunately no one knows if HEVC encoding is one of those tasks, so no one can say for sure if it is possible or not. My opinion is that Samsung probably did the power-hungry video encoding with a dedicated hardware block for the sake of efficiency, but I don't actually know. Even if it is possible, the reverse engineering required to do it would be unlike anything that has been done before in a public project. I sure hope it is possible, but it's a moon-shot at best. On the other hand, external recording might be a more plausible option. If it turns out to be possible to get a hold of a buffer of 10 or 12 bit video data while recording, it may be possible to send that out over the HDMI port.
-
I've never seen a file manager report a file size in bits. It's always bytes. Computer hardware typically cannot even store or operate on a unit smaller than a byte independently. Megabits are almost exclusively used for data rates, not storage sizes. Just to be sure I downloaded the files and checked them myself (on linux). Sure enough they are all at least 1 million bytes. Therefore the correct calculation for the 2048x1152x15fps stream is 1.5*10^6 bytes/frame * 8 bits/byte * 15 frames/s = 180*10^6 bit/s, which is approximately 180Mbit/s. Which is exactly what one would expect from typical JPEG compression ratios.
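Same arithmetic as a quick sketch (Python; the 1.5 MB/frame figure is the approximate per-frame JPEG size mentioned above):

bytes_per_frame = 1.5e6   # approx. size of one 2048x1152 JPEG frame, in bytes
fps = 15

bits_per_second = bytes_per_frame * 8 * fps
print(bits_per_second / 1e6, "Mbit/s")  # 180.0 Mbit/s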
-
He's not a developer; he's a "Senior Marketing Manager." If Samsung is like many other semiconductor and consumer electronics companies, I very well might.
-
Unfortunately that's not how it works. The framerate is inversely proportional to the number of lines, not the number of pixels. Getting much more than 120fps in 1080p would require an entirely new image sensor. The reason is that all modern cameras (apart from some exotic high-end stuff) use "column parallel" sensors. The best affordable high-speed camera at the moment is probably the crowd-funded fps1000, if it's actually shipping yet.

Please quit saying that without any evidence. The processor does not limit the readout speed of each individual frame. The RAM in the NX1 is fast enough to store the data to be processed in way under 1/30 of a second (you can see the clock speeds for the RAM in the kernel source). It can do that in its sleep. Literally. According to the source code, the RAM is running at 400MHz in sleep mode. There is no way it is de-bayering, scaling, and encoding the video line by line without putting it in a framebuffer first.

Yes, I know a Samsung representative said the sensor was really fast in an interview, but the official Samsung website says otherwise. Furthermore, every third-party test indicates otherwise. The proportionality between rolling shutter and lines of resolution in all real-world tests agrees with my estimates. The information in the firmware release notes is consistent with those limitations. Every other camera ever made works like that.

The only way the processor could determine the rolling shutter is if they forced the LVDS receivers on the main SoC to run at a fraction of the rate they could be run at. That would be spectacularly stupid, since there are much easier ways to cripple a camera. Even if they knew about the details of the camera's operation, I doubt Samsung executives would choose to limit the camera in such a dumb, arbitrary way.

Unless you actually have evidence, stop spreading misinformation. It's a waste of time to keep explaining this over and over.
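To put rough numbers on the "RAM is fast enough" point, here's a back-of-envelope sketch (Python). The bus width and transfer rate are assumptions for illustration, not values pulled from the kernel source:

# Rough size of one full-sensor readout: ~28 MP at 12 bits/pixel.
pixels = 6480 * 4320
frame_bytes = pixels * 12 / 8             # ~42 MB per raw frame

# Hypothetical LPDDR config: 32-bit bus, DDR, at the 400 MHz "sleep mode" clock.
bytes_per_second = 400e6 * 2 * 4          # ~3.2 GB/s of raw bandwidth

print(frame_bytes / 1e6, "MB per frame")            # ~42 MB
print(frame_bytes / bytes_per_second * 1000, "ms")  # ~13 ms to move one frame

Even with those conservative assumptions, one full raw frame fits through memory in well under 1/30 of a second, so per-frame memory bandwidth isn't the bottleneck.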
-
Reducing rolling shutter significantly without a crop is almost certainly impossible. That's a limitation of the sensor, not the firmware.
-
I'm pretty sure the encoder is integrated into the main SoC. I can't imagine Samsung using a separate physical chip when the main one is already a custom SoC they made for their cameras.
-
Has anyone actually found where and how the SRP is used in the NX1 firmware? It would be a shame to spend a bunch of time learning how it is programmed only to find out that it's just something like an audio processor as in the Exynos SoCs the DRIMe-5 is based on. Seems to me like there's a pretty good chance Samsung just used the fast dedicated 4k HEVC encoder they were already developing for their cell phones. Their video encoders are famous for their efficiency, so if they were done with the design of their HEVC codec in time the smart choice would be to use less flexible hardware for the sake of battery life and heat. I really hope it is being used for video encoding, since that would open the door to much more significant modifications (with a massive reverse-engineering and digital design effort). However I think the prudent first step is to figure out exactly what all it is doing in the camera. That's what I will be attempting to do once I have some time.