A new generation of gated x-ray detectors at the National Ignition Facility has brought faster, enhanced imaging capabilities. Their performance is currently limited by the signal level at which they can be operated before space-charge effects in their electron tube begin to compromise their temporal and spatial response. We present a technique to characterize this phenomenon and apply it to a prototype of such a system, the Single Line Of Sight camera. The results of this characterization are used to benchmark particle-in-cell simulations of the electrons drifting inside the detector, which are found to reproduce the experimental data well. These simulations are then employed to predict the optimum photon flux onto the camera, with the goal of increasing the quality of the images obtained during an experimental campaign while preventing the appearance of deleterious effects. They also offer insights into improvements that could be made to the new pulse-dilation systems being built at Lawrence Livermore National Laboratory.