Continuing my HMC tests with MHA: I captured some test footage yesterday and a couple of days ago with the new mounts. Looking forward to being there at SIGGRAPH and answering any questions.

One of the things that was great about the Dxyz HMCs is that, because of the lenses, you can come in really close to the face, which makes performing with them a little easier. However, as with the iPhone, it appears that closer is not necessarily better: there seems to be a limit on how close you can get to the face without losing depth data. In my first test a few days ago, I had the cameras really close to the face, around 4 inches, and as you can see, the nose almost completely disappears, even when I tried different settings while setting up the capture source. The quality of the depth map is also rather poor.

With the tests I did yesterday at a normal distance, the results were much, much better. I also tested masking the footage (using DaVinci Resolve's Magic Mask) and playing around with its contrast. Interestingly enough, washing out the footage seems to produce a smoother depth map. Masking, in my limited tests, mostly just makes the depth map cleaner; where I do see it mattering is in production, where background lights and objects moving behind the actor produce slight artifacts and noise in the solved animation.

Yesterday, Facegood dropped a new version of their software, Avatary 1.15, with a lot of changes. Amongst these, the two that excite me the most of those I've tested so far are the AI04 and AI05 auto trackers. Up to now, I've never been a fan of autotrackers, from FG, Dynamixyz, or anyone else, as they never seemed really accurate, so I always did this tracking manually.

AI04 was already their most advanced auto tracker, geared towards their own HMCs, but it can now also track Technoprops HMCs quite accurately and quickly. On my Dxyz HMC footage it still missed a bit, but it was good nonetheless.

The one that floored me, though, is AI05, which is designed for any non-FG HMC. It is slow, but very accurate: on my older i7 laptop, almost 40 minutes for a 40-second clip; on my i9 workhorse, 23 minutes. The only things it missed, which are minor, are the contours of the face (most likely because of my beard and because the footage was masked) and one instance where the tongue interferes with the lip contours. But everything else was spot on for the most part, and honestly, I probably could not do any better.

I've decided to share, with anyone who is interested, the spreadsheet I created and use to automate the process of converting stereo HMC footage for use in MHA: Kks_stereo_hmc_conversion_w_audio_mha.xlsx

It assumes you already have ffmpeg installed and set in your environment path, and that you have Python installed as per the Epic documentation. I'm sure there are easier ways to do this, but my Excel skills far outweigh my coding/scripting skills. It has been tested with Dxyz, TP, and other stereo footage, and I've used it over several projects, so it should be relatively bug-free.

There's a text panel with general instructions for use, but basically, you enter some information in the yellow-tinted cells (the path to the root footage folder and the paths/names of the footage you want to convert) and it does the rest. In the second sheet, you will find a cell with the text needed to create a batch file for converting the footage. Hope this might be helpful to some of you out there.
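The conversion the spreadsheet automates boils down to generating ffmpeg command lines. As a rough illustration only (not the author's actual spreadsheet output), here is a minimal Python sketch that builds ffmpeg commands to crop a side-by-side stereo clip into separate left-eye and right-eye videos while copying the audio track; the file names, the side-by-side layout, and the output naming scheme are all assumptions.

```python
# Hypothetical sketch of what a generated batch file might contain:
# ffmpeg commands that split side-by-side stereo HMC footage into
# separate left/right clips, keeping the original audio.

def stereo_split_commands(src, out_stem):
    """Return two ffmpeg commands cropping each half of a side-by-side clip."""
    cmds = []
    # crop=w:h:x:y -- left eye starts at x=0, right eye at x=iw/2
    for eye, x_off in (("left", "0"), ("right", "iw/2")):
        cmds.append(
            f'ffmpeg -i "{src}" -vf "crop=iw/2:ih:{x_off}:0" '
            f'-c:a copy "{out_stem}_{eye}.mp4"'
        )
    return cmds

for cmd in stereo_split_commands("take01_stereo.mp4", "take01"):
    print(cmd)
```

Writing the printed lines into a .bat (or .sh) file mirrors the batch-file step described for the second sheet.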