Military personnel involved in weapon training are subjected to repeated low-level blast exposures. The prevailing method of estimating blast loads relies on wearable blast gauges. However, blast loads to the head or other organs cannot be accurately estimated from wearable sensor data alone without knowledge of the service member's body posture. To address this, an image/video-augmented, complementary experimental-computational platform for conducting safer weapon training was developed. This study describes the protocol for the automated generation of weapon training scenes from video data for blast exposure simulations. The blast scene extracted from the video data at the instant of weapon firing comprises the service member body avatars, weapons, the ground, and other structures. The computational protocol reconstructs service members' positions and postures from these data. Service member body silhouettes extracted from the image or video data are used to generate an anatomical skeleton and key anthropometric data. These data, in turn, are used to generate 3D body surface avatars that are segmented into individual body parts and geometrically transformed to match the extracted service member postures. The final virtual weapon training scene is used for 3D computational simulations of weapon blast wave loading on service members. The weapon training scene generator has been used to construct 3D anatomical avatars of individual service member bodies, in various orientations and postures, from images or videos. Results are presented for training scenes generated from image data of a shoulder-mounted assault weapon system and a mortar weapon system. The Blast Overpressure (BOP) tool uses the virtual weapon training scene for 3D simulations of blast wave loading on the service member avatar bodies. This paper presents 3D computational simulations of blast wave propagation from weapon firing and the corresponding blast loads on service members in training.
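The step in which segmented avatar body parts are "geometrically transformed to match the extracted service member postures" amounts to applying rigid joint transforms along a kinematic chain driven by the skeleton recovered from the image. The sketch below illustrates that idea with a toy two-segment arm in NumPy; it is a minimal illustration under stated assumptions, not the authors' implementation, and all function names, segment geometry, and joint angles are invented for this example.

```python
# Minimal sketch (illustrative, not the paper's code): posing segmented body
# parts by rigid rotations about their proximal joints, as forward kinematics.
import numpy as np

def rot_z(angle_rad):
    """3x3 rotation matrix about the z-axis."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def pose_segment(vertices, joint_origin, rotation):
    """Rigidly rotate a body-part point set about its proximal joint."""
    return (rotation @ (vertices - joint_origin).T).T + joint_origin

# Toy "upper arm" and "forearm" segments in a neutral posture, each given by
# a few surface points; lengths (metres) and positions are assumptions.
shoulder = np.array([0.0, 1.4, 0.0])
elbow_neutral = shoulder + np.array([0.30, 0.0, 0.0])
upper_arm = np.stack([shoulder, shoulder + [0.15, 0.0, 0.0], elbow_neutral])
forearm = np.stack([elbow_neutral,
                    elbow_neutral + [0.13, 0.0, 0.0],
                    elbow_neutral + [0.26, 0.0, 0.0]])

# Joint angles of the kind an extracted skeleton would supply, e.g. an arm
# bracing a shoulder-mounted weapon (values are assumptions).
shoulder_flexion = np.deg2rad(40.0)
elbow_flexion = np.deg2rad(65.0)

# Pose the chain: rotate the upper arm about the shoulder, then rotate the
# forearm about the already-moved elbow.
R_shoulder = rot_z(shoulder_flexion)
upper_arm_posed = pose_segment(upper_arm, shoulder, R_shoulder)
elbow_posed = pose_segment(elbow_neutral[None, :], shoulder, R_shoulder)[0]
forearm_moved = pose_segment(forearm, shoulder, R_shoulder)
forearm_posed = pose_segment(forearm_moved, elbow_posed, rot_z(elbow_flexion))

print("posed elbow position:", np.round(elbow_posed, 3))
print("posed wrist position:", np.round(forearm_posed[-1], 3))
```

In the full protocol the same rigid transforms would act on the 3D surface meshes of each segmented body part rather than on a handful of points, and the posed avatars, weapon, ground, and surrounding structures would then be assembled into the virtual scene passed to the blast simulation.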