# A Note on Pruning
# Pruning Methods
We discuss the NLS equation in the usual notation, where the evolution (longitudinal) variable is $x$ and the transverse variable is $t$.
Assume we run a simulation with $M$ periods and $N$ Fourier modes per period (so $N\times M$ modes in total). The modes $M, 2M, \ldots, N\times M$ grow in lockstep, as expected. However, after some point the remaining modes grow rapidly from zero and ruin the Talbot carpet. To address this, the pruning method was developed: after each evolution step, we enforce
$$
\tilde{\psi}(\omega_j) = f(\tilde{\psi}(\omega_j)), \qquad j \neq M, 2M, \ldots, N\times M,
$$
where $\tilde{\psi}(\omega_j)$ is the Fourier transform of $\psi(t)$ evaluated at the $j^\text{th}$ mode, and $f$ is a damping function that depends on the pruning algorithm. The options are:
$$
f(\tilde{\psi}(\omega_j))= \left\{
\begin{array}{lll}
0 & & \text{Fixed Pruning}\\
\tilde{\psi}(\omega_j) \, \exp\left( -\alpha \left|\tilde{\psi}(\omega_j)\right| \right) & \alpha \geq 1 & \text{Exponential Pruning}\\
\tilde{\psi}(\omega_j) \, \exp\left( -\alpha \left|\tilde{\psi}(\omega_j)\right|^2 \right) & \alpha \geq 1 & \text{Gaussian Pruning}
\end{array}
\right.
$$
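As an illustration, here is a minimal NumPy sketch of one pruning step. The function name, signature, and FFT index convention are assumptions for this note, not the code used in our simulations:

```python
import numpy as np

def prune(psi, M, method="exponential", alpha=1.0):
    """Apply one pruning step to the field psi(t).

    Damps (or zeroes) every Fourier mode whose index is not a
    multiple of M, the number of periods.  Illustrative sketch only.
    """
    psi_hat = np.fft.fft(psi)
    # Boolean mask: True on the modes to prune (index not a multiple of M).
    mask = (np.arange(psi_hat.size) % M) != 0
    if method == "fixed":
        psi_hat[mask] = 0.0
    elif method == "exponential":
        psi_hat[mask] *= np.exp(-alpha * np.abs(psi_hat[mask]))
    elif method == "gaussian":
        psi_hat[mask] *= np.exp(-alpha * np.abs(psi_hat[mask]) ** 2)
    else:
        raise ValueError(f"unknown pruning method: {method}")
    return np.fft.ifft(psi_hat)
```

With this index convention, negative-frequency modes are handled automatically when $M$ divides the grid size, since index $N_t - jM$ is also a multiple of $M$.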
In our work so far, we have only used fixed and Gaussian pruning (with $\alpha = 1$). Fixed pruning is the $\alpha \rightarrow \infty$ limit of either of the other two. In the following sections, I will showcase a few calculations suggesting that exponential pruning is superior to Gaussian pruning even with $\alpha = 1$, and that fixed pruning is needlessly extreme.
In earlier works, my implementation pruned after every application of the second-order integrator $T_2$, i.e. three times per evolution step in the fourth-order symplectic algorithm, nine times per step in the sixth-order, etc. However, this is very inefficient, and after sufficient testing I have concluded that pruning just once after each evolution step is sufficient. The results are not bit-for-bit identical, but pruning is not an exact procedure anyway; consistency in application is what matters.
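Schematically, the once-per-step placement looks like the following sketch. The step function here is a free linear-propagation stand-in, not the actual fourth-order symplectic integrator, and all names are illustrative:

```python
import numpy as np

def linear_step(psi, dx):
    # Stand-in for one full evolution step in x: free propagation
    # i psi_x + psi_tt / 2 = 0, solved exactly in Fourier space.
    n = psi.size
    k = np.fft.fftfreq(n, d=1.0 / n)  # integer mode numbers
    return np.fft.ifft(np.fft.fft(psi) * np.exp(-0.5j * k**2 * dx))

def fixed_prune(psi, M):
    # Zero every Fourier mode whose index is not a multiple of M.
    psi_hat = np.fft.fft(psi)
    psi_hat[np.arange(psi.size) % M != 0] = 0.0
    return np.fft.ifft(psi_hat)

def evolve(psi, n_steps, dx, M, step_fn=linear_step, prune_fn=fixed_prune):
    # Prune once after each full step, not after every T_2 substep.
    for _ in range(n_steps):
        psi = step_fn(psi, dx)
        psi = prune_fn(psi, M)
    return psi
```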
However, one should keep in mind that fixed pruning is the truest pruning method, in the sense that a computation with e.g. 3 periods and fixed pruning is the closest to 3 independent copies of an identical 1-period simulation. Thus, we will use closeness to fixed pruning as our benchmark here. These results should eventually be quantified by integrating the error, so that the comparison need not be done visually.
# Comparison
To compare these schemes, we run a series of simulations with $a = 0.48$, `n_periods = 3`, $N_t = 512$, and `dx = 1e-4`, using a fourth-order symplectic algorithm.
## Gaussian Pruning, $\alpha = 1$


## Exponential Pruning, $\alpha = 1$


## Exponential Pruning, $\alpha = 5$


## Exponential Pruning, $\alpha = 10$


## Exponential Pruning, $\alpha \rightarrow \infty$


# Conclusion
Exponential pruning naturally outperforms Gaussian pruning with $\alpha = 1$: the pruned modes are small, so $|\tilde{\psi}(\omega_j)| \leq 1$ and hence $|\tilde{\psi}(\omega_j)|^2 \leq |\tilde{\psi}(\omega_j)|$, meaning the exponential factor damps at least as strongly as the Gaussian one. Thus, we recommend exponential pruning with a small value of $\alpha \geq 1$. This gives the result closest to fixed pruning (i.e. $\alpha \rightarrow \infty$), which essentially emulates a 1-period simulation copied 3 times.
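A quick numerical sanity check of this inequality, which holds for mode amplitudes $|\tilde{\psi}(\omega_j)| \leq 1$ (the regime of the small pruned modes):

```python
import numpy as np

# Damping factors applied by each scheme at alpha = 1.
amp = np.linspace(0.0, 1.0, 101)   # mode amplitudes |psi_j| in [0, 1]
exponential = np.exp(-amp)         # exponential pruning factor
gaussian = np.exp(-amp**2)         # Gaussian pruning factor

# Exponential damps at least as strongly everywhere on [0, 1],
# with equality only at |psi_j| = 0 and |psi_j| = 1.
assert np.all(exponential <= gaussian)
```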
Exponential pruning may also make more sense physically, since one can naturally think of a light wave decaying exponentially, while Gaussian decay is far less common.