With all due respect (and I mean that), nonsense.
MadhiH, I don't know what you mean when you say "the p-value is too high!" The p-value of a test of normality, applied to a sample drawn under the null hypothesis (a normal distribution), has a U(0,1) distribution. That's essentially the definition of a p-value. With repeated sampling you will see p-values ranging from less than 0.05 (a Type I error 5% of the time when alpha is 0.05, right?) up to larger than 0.95 (also 5% of the time). Run this code
to demonstrate that, and look at the empirical CDF of the p-values:
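A minimal sketch of that experiment (the sample size, replication count, and plotting details here are my own choices):

```matlab
rng default
nrep = 10000;                      % number of simulated samples
n    = 100;                        % observations per sample
p    = zeros(nrep,1);
for i = 1:nrep
    x = randn(n,1);                % draw from the null: standard normal
    [~,p(i)] = kstest(x);          % K-S test against N(0,1); no estimated parameters
end
ecdf(p)                            % empirical CDF of the p-values
hold on
plot([0 1],[0 1],'r--')           % the U(0,1) CDF for comparison
hold off
xlabel('p-value'), ylabel('Empirical CDF')
```

The empirical CDF should hug the 45-degree line, and the fraction of p-values below 0.05 should be very close to 0.05.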
That's a U(0,1) empirical dist'n in anyone's book. I used kstest because, since I know the null hypothesis, I don't have to estimate its parameters. Also because lillietest truncates the p-value to 0.5 when it's larger than that: that's as high as its tabulated values go. (Those are not, by the way, Lilliefors' tabulated values; computing power has increased rather a lot since the 1960s, so we used larger Monte Carlo simulations than he did.) There's not much point in caring about the exact p-value once you know it's larger than 0.5; you are going to fail to reject the null at any reasonable significance level. But if you want p-values all the way up to 1, you can tell lillietest to compute a Monte Carlo p-value; it's slower.
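For instance, something along these lines (the 'MCTol' name-value pair is from the lillietest documentation; the tolerance value is my choice):

```matlab
rng default
x = randn(1000,1);
[~,pTab] = lillietest(x);               % tabulated p-value, capped at 0.5
[~,pMC]  = lillietest(x,'MCTol',1e-3);  % Monte Carlo p-value, can go up to 1
```

The Monte Carlo version simulates the null distribution of the test statistic to the requested tolerance, which is why it's slower.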
dpb, you've shown a probability plot for one sample. Drawing conclusions from that is a little like computing the mean of one sample and saying the generator is wrong because the normal dist'n should have mean zero. I know you know that: "...for this particular instantiation." The entire sample is (pseudo)random; the extreme tails are especially noisy because they are, by definition, a small sample. You can expect the tails in any given prob plot to skew high or low. Run this code
to get something like this plot:
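One way to see it is to overlay probability plots for several independent normal samples (the sample size and the number of overlays are my choices):

```matlab
rng default
n = 100;
hold on
for k = 1:6
    normplot(randn(n,1))   % each call adds one sample's prob plot to the axes
end
hold off
title('Probability plots of six independent N(0,1) samples')
```

The bodies of the plots lie on top of each other; the tails fan out, sample by sample, even though every sample really is normal.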
Sometimes the extreme left tail is too light with respect to the theoretical dist'n, sometimes too heavy. Same for the right. Nothing wrong there. The K-S test with alpha=0.05 rejects normality 5% of the time, right on target. So would Lilliefors' test, so would A-D (which IIRC is more sensitive to departures in the tails).
The generator choices in MATLAB are well documented, including references to the literature (at the bottom of that page). There is a vast literature on the uniform generators and normal transformations that MATLAB offers, and these algorithms have been heavily tested there (but do note the caveat about not using the old ones: "The generators mcg16807, shr3cong, and swb2712 provide for backwards compatibility with earlier versions of MATLAB"). The default generator, the "vanilla" Mersenne Twister, has no known flaws other than low linear complexity (and it is not suitable for cryptographic applications), but there are other algorithms if that one doesn't suit your fancy. You can also switch from the Ziggurat algorithm used by default for the normal transformation; inversion is slower but often recommended for better behavior in the extreme tails.
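For example (the generator and option names below are from the rng and RandStream documentation; the particular choices are just illustrations):

```matlab
% Switch the global generator to a counter-based alternative to twister:
rng(0,'philox')
z1 = randn(5,1);

% Keep Mersenne Twister but use inversion instead of Ziggurat for randn:
s = RandStream('mt19937ar','Seed',0,'NormalTransform','Inversion');
RandStream.setGlobalStream(s);
z2 = randn(5,1);
```

Inversion maps the uniform draws through the normal quantile function directly, which is what gives it its better behavior far out in the tails.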