SpecOp9

Will ARMA 3 support video files / image sequences?


I had a rather unconventional idea recently when it comes to modding, specifically cutscene creation. This may seem like a bit of a stretch, but in theory I think it might actually work. It also takes a bit of explaining and some understanding of motion graphics / compositing, so I'll try my best.

1. Make a 3D green-screen object in Oxygen. The screen has tracking markers on it, much like what's used in feature film.

2. Place the object behind your subject (a man walking, for example) and have him walk forward during a camera animation.

3. Record the scene in-game using Fraps.

4. Import the video into After Effects CS6 and use the 3D camera tracker to track the motion of the camera.

5. Import the camera motion into Cinema 4D and create an animation of, say, 50,000 A-10s flying overhead.

6. Render that video out.

7. Back in the original footage, create a mask of the walking man in After Effects so that the A-10s vanish when they pass behind him.

Now this is the part where I think there might be trouble, but it would be the final step: importing that image sequence or video file BACK into ARMA 3, and having it play WHILE the actual in-game cutscene is playing.

The final result would APPEAR to be thousands of A-10s flying behind the walking man, but in actuality it's just an image sequence.

While it may be a boring example, I think the possibilities of this technique shouldn't be ignored... Animations and effects that normally would not be possible in ARMA 3 would then BE possible, without a performance hit.
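
For steps 2 and 3, the in-game half is just standard camera scripting that you then record with Fraps. A rough sketch only; subjectMan, the offsets and the timings are placeholders, not anything final:

// cameraPass.sqf - a minimal, repeatable camera move filming the walking subject.
// subjectMan is a placeholder editor-placed unit; run this from a scheduled
// script (e.g. via execVM) so that waitUntil works.
showCinemaBorder false;

private ["_cam"];
_cam = "camera" camCreate [0,0,0];
_cam cameraEffect ["internal", "back"];

// Start a few metres behind and above the subject
_cam camSetPos (subjectMan modelToWorld [0, -6, 2]);
_cam camSetTarget subjectMan;
_cam camCommit 0;
waitUntil {camCommitted _cam};

// Ease towards a point ahead of him over ten seconds - this is the pass to capture
_cam camSetPos (subjectMan modelToWorld [2, 8, 1.5]);
_cam camSetTarget subjectMan;
_cam camCommit 10;
waitUntil {camCommitted _cam};

// Clean up
_cam cameraEffect ["terminate", "back"];
camDestroy _cam;

Because the whole move is keyframed in script, the same pass can be replayed for each recording.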


I believe it's already possible in A2:OA. They added video playback features in a patch that came alongside the PMC DLC. You can see it at around the 30-second mark, in the bottom right corner of the screen.


Indeed, videos are supported in ArmA2:OA + PMC.


While your actual example seems rather more complex than needed for the effect, it seems the video playback option is in :)


So does that mean the video can have an alpha channel and still work?


Why would you need a video with an alpha layer in the first place...?


With 3D camera tracking you could do some very interesting things. Say you wanted a realistic plane crash with thousands of individual particles, physics and explosions. You could render the animation in 3ds Max / Cinema 4D. Instead of using filmed video as your main plate to composite it together, you use a recording of the cutscene. Once it all looks good, you import the video file into the game, so that when the cutscene plays, you have a plane crash simulation. But you would need that alpha channel, or else it would just play the animation over a big black background.


It's probably saner to pre-render it all anyway, due to potential desync, timing and screen-size alignment issues between the alpha-blended and in-game stuff. File size is also a potential issue, but I don't deal in video editing, so I'm not sure about that one.


Since A2:PMC, it's possible to run videos in game via the OGV format. There were some early teething problems with playback on certain hardware, which were addressed in a subsequent patch or two.

They're a nice addition/option to have, but - particularly in full-screen - their quality can suffer when trying to balance resolution and file size; dark scenes suffer the most. I find that they're best deployed as smaller resources, such as the small "video feeds", etc., although fairly long full-screen footage can work nicely if your players are willing to download the extra data. For example, the 'title screens' for each showcase were OGV videos running in fullscreen. I think they're only ~5-10 MB each, and are 10s long.
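
Script-side, kicking off playback is roughly a one-liner via the functions library; a minimal sketch, assuming the BIS_fnc_playVideo helper (not named above) and a placeholder mission-local path:

// A hedged sketch only - assumes the BIS_fnc_playVideo helper; the path is a placeholder.
["video\titleScreen.ogv"] spawn BIS_fnc_playVideo;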

I believe that the OGV playback doesn't handle alpha (at least in our current implementation), so you will always end up with the black background, even if you export with an alpha channel and capture the footage as you describe above.

You can, actually, play image sequences already, though. It's a bit of a hack, but if you were to export your composition as a PNG sequence with alpha, a simple script can play it back, calling the numbered images at whatever framerate you've exported them in. We actually used this 'hack' in the E3 presentation, in our prototype of the context-hint system found in the 'Deterrence' mission (for things such as fire support), which is actually a sequence of 90 images called by script.

The problem here again is the data size. In my example it was only a very small image for 3 seconds, but we're still talking about loading and scrapping a lot of data, and I'd be concerned about performance when trying to do anything more complex, longer or fullscreen. It's not engine supported, nor is support for it planned at the moment. As long as Ondra (or anyone else that may explode over a 'hacked' 'designer's' 'implementation'™) doesn't see it, you'll be fine. :D

If you carry out any experiments in full-screen, for example overlaying a PNG sequence of particle effects on a real-time rendered scene, that would be fascinating to see in action!

Hope that helps!

Best,

RiE



From what I understand, the script/image sequence idea could potentially already work in ARMA 2 as well. I'm feeling inclined to try it.


More or less. At its core, it's a for do loop with 90 steps that rapidly calls different silent hints:

for "_i" from 0 to 89 do {
private ["_pic", "_string"];

// Filenames need to be handled differently if single or double digit
if (_i < 10) then {_pic = format ["pic_0000%1.paa", _i]} else {_pic = format ["pic_000%1.paa", _i]};

// Don't display the instructions for the first 30 frames
if (_i < 30) then {
	_string = format ["<t size = '2'>EGLM</t><br/><img image = '%1' size = '10' shadow = '0'/><br/><t size = '1.25'> </t>", _pic];
} else {
	_string = format ["<t size = '2'>EGLM</t><br/><img image = '%1' size = '10' shadow = '0'/><br/><t size = '1.25'>PRESS [H] FOR MORE INFO</t>", _pic];
};

// Use a silent hint to display it
hintSilent parseText _string;

// Display at 30 FPS
sleep 0.03;
};
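
To try it, the loop just needs to run in a scheduled environment so that sleep works; for example, saved as a script file and started like this (the filename is only a placeholder):

// Placeholder filename - save the loop above as playSequence.sqf in the mission
// folder and launch it from a trigger or an init line.
nul = [] execVM "playSequence.sqf";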


Now if we were able to pull in a data feed from an external source .... ahhh to dream, to dream!

With 3D camera tracking you could do some very interesting things. Say you wanted a realistic plane crash with thousands of individual particles, physics and explosions. You could render the animation in 3ds Max / Cinema 4D. Instead of using filmed video as your main plate to composite it together, you use a recording of the cutscene. Once it all looks good, you import the video file into the game, so that when the cutscene plays, you have a plane crash simulation. But you would need that alpha channel, or else it would just play the animation over a big black background.

I am well aware of what alpha is used for; I've had my fair share of compositing, both static and animated files.

But after reading your OP, what I find weird is that you would like to composite, in the game, a scene between a pre-rendered sequence and a cutscene (that might end up differently each time). Of course you could (most likely with great difficulty) capture the camera movement (matchmoving it via Boujou, Nuke, AE, ADSK Matchmoving or whatever have you) and then redo the whole scene. But then again, it would be much easier to pre-render the entire thing (especially since, even with alpha, you wouldn't be able to do the rotoscoping/masking in the A3 editor)...

I think being able to play a sequence of image files, as well as animation files, should be enough for 99% of the community's needs, even for the lads doing machinima (who would, in my humble opinion, be happier with some more advanced camera controls)...


Pre-rendering the entire movie has some problems to me: the file size, and full-screen video can start showing artifacts. If, however, it was a small 5-second CG clip composited in along with the actual in-game cutscene, you'd not only be able to render the video at a higher resolution, but the file size would also be much smaller. I'm a bigger fan of in-game cutscenes; pre-rendered takes away from the magic for me :(

The in-game scenes would need to be very simple and specific. The AI does move in different directions nearly every mission preview, but one thing is definite and the same every time: the camera movement. This is a benefit, because now you can conduct multiple passes of different objects, solo the individual elements out of each pass, and then composite it all together, and it's the same every time. I think this is what you meant.
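
As a rough illustration of the multi-pass idea, the same scripted camera move can be replayed with a different set of objects hidden each time; a sketch only, where the object names and cameraPass.sqf (the camera script sketched near the top of the thread) are all placeholders:

// A rough sketch of recording the same cutscene twice as separate passes.
// crashPlane, debris1, debris2 and cameraPass.sqf are placeholders.
private ["_passObjects", "_handle"];
_passObjects = [crashPlane, debris1, debris2];

// Pass 1: background plate - hide the hero objects, run the camera move, record it
{_x hideObject true} forEach _passObjects;
_handle = [] execVM "cameraPass.sqf";
waitUntil {scriptDone _handle};

// Pass 2: hero objects - unhide them and replay the identical camera move
{_x hideObject false} forEach _passObjects;
_handle = [] execVM "cameraPass.sqf";
waitUntil {scriptDone _handle};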


Well, I was able to come up with a short example of exactly what I mean.

What I did was create a simple camera animation in ARMA 2 and record it.

I then did a 3D motion track,

imported the camera data into Cinema 4D,

created the small animation,

and overlaid the animation with the alpha.

I hope people are starting to see the potential of this. It worked fairly well, though that's not to say I didn't notice any potential problems. The frame rate is extremely important: I set the animation to 23.976 FPS, but ARMA's frame rate can fluctuate depending on the scene, and things can become rather wonky.

As for file size, the animation image sequences are 31.0 MB at 1680 x 1050 (pretty good, considering my render straight out of AE is 1 GB).

It's a quick and silly example video, really... but with some more advanced compositing in Cinema 4D, like ambient occlusion, shadows, reflections, HDR lighting, motion blur and more, you could create some pretty surreal things, and achieve animations in cutscenes that normally would never be possible inside the engine.

Also notice how the balls interact with the shot by bouncing off the sidewalk area of the dam.


Hmm, going by the info on the Biki, if there were a millisecond version of time alongside the per-frame scripting, you could probably sync it somewhat and dodge the framerate issues. Maybe it's something we could ask for in ArmA 3. :P
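
Something close to that can be approximated already: derive the frame index from elapsed mission time, so that a slow render skips images instead of drifting. A hedged sketch, assuming the standard time command is precise enough, with the rate, frame count and filename pattern as placeholders (the pattern borrowed from the script above):

// Pick each frame from elapsed time rather than accumulating sleeps.
// _fps, _frames and the filename pattern are placeholders.
private ["_fps", "_frames", "_start", "_i", "_pic"];
_fps    = 30;
_frames = 90;
_start  = time;

while {(time - _start) * _fps < _frames} do {
	_i = floor ((time - _start) * _fps);

	// Zero-pad the frame number into the placeholder filename pattern
	if (_i < 10) then {_pic = format ["pic_0000%1.paa", _i]} else {_pic = format ["pic_000%1.paa", _i]};
	hintSilent parseText format ["<img image = '%1' size = '10' shadow = '0'/>", _pic];

	sleep 0.01;
};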


As long as both animations start and finish at the same time, things won't look that bad


Good example there, Spec.

One thing: due to FPS fluctuation and a guessed camera FOV, there is a bit of skidding there.

One way to make sure the FPS is constant is to record everything with setAccTime and vsync on (you should be able to push 60 FPS that way).

You then need to speed that up in your compositing software (AE, I guess) and convert it to a normal 24/30 FPS (or you will have double the render time). You can do the matchmoving afterwards ;)
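
In script terms, that slow-motion capture is just a wrapper around the camera pass; a rough sketch, where the factor and the script name are placeholders:

// Record while this runs, then speed the footage back up by the same factor
// in the compositing package. 0.25 and cameraPass.sqf are placeholders.
private ["_handle"];
setAccTime 0.25;
_handle = [] execVM "cameraPass.sqf";
waitUntil {scriptDone _handle};
setAccTime 1;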

and achieve animations in cutscenes that normally would never be possible inside the engine.

Don't get me wrong, I love progress, but part of the "charm" of the ArmA series is WYSIWYG. There is no pre-rendered or completely composited stuff in-game; it's all made in-game, warts and all.

Good example there, Spec.

One thing: due to FPS fluctuation and a guessed camera FOV, there is a bit of skidding there.

One way to make sure the FPS is constant is to record everything with setAccTime and vsync on (you should be able to push 60 FPS that way).

You then need to speed that up in your compositing software (AE, I guess) and convert it to a normal 24/30 FPS (or you will have double the render time). You can do the matchmoving afterwards ;)

I think you might be misunderstanding the point of the exercise. :D

He only recorded a video of it for the proof of concept. But the goal is to have the game play your standard cutscene and overlay it with prerendered content (in this case the balls), so when you, for example, play his mission, you'd be able to see the balls bouncing around on the dam.

