You see the Planck length as evidence of the discrete nature of spacetime, but that is not where the Planck length comes from.
The Planck length is the distance below which the equations of quantum mechanics return nonsensical results (probabilities that no longer add up to 1, and so on). The length scale itself is named for Max Planck; the argument that it marks a minimum meaningful length goes back to Heisenberg.
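For concreteness, the Planck length is built from the fundamental constants ħ, G, and c. A quick sanity check of its value (my own illustration, with approximate CODATA values hard-coded rather than pulled from a library):

```python
import math

# Fundamental constants (SI units, approximate CODATA values)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newtonian gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

# Planck length: l_P = sqrt(hbar * G / c^3)
planck_length = math.sqrt(hbar * G / c**3)
print(f"{planck_length:.3e} m")  # ~1.616e-35 m
```

This is the scale at which quantum-mechanical and gravitational effects would be comparably strong, which is why it keeps showing up in these discussions.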
Nonsensical results mean just that: nothing can be inferred from them. The Planck length is a limit imposed on the equations of quantum mechanics, and it gives the appearance of spacetime chunkiness precisely because the equations still work at and above that scale.
Thus, the Planck length came into being out of the mathematical necessity of treating spacetime as quantized when using the equations of quantum mechanics, which is not exactly surprising. The Planck length therefore indicates a limit of some sort imposed by nature, but it does not necessarily entail discreteness of spacetime at distances smaller than the Planck length.
(Another example of equations ceasing to work is Newtonian mechanics when relativistic effects begin to matter: beyond the limits of its applicability, Newtonian mechanics tells us nothing about relativity, but it is a good approximation when relativistic effects are minimal.)
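To make that Newtonian-limit point concrete, here is a minimal sketch (my own illustration, not from the discussion above) comparing Newtonian and relativistic kinetic energy at two speeds:

```python
import math

def kinetic_energies(beta, mass_kg=1.0):
    """Return (Newtonian KE, relativistic KE) in joules for speed v = beta * c."""
    c = 2.99792458e8  # speed of light, m/s
    v = beta * c
    ke_newton = 0.5 * mass_kg * v**2
    gamma = 1.0 / math.sqrt(1.0 - beta**2)  # Lorentz factor
    ke_rel = (gamma - 1.0) * mass_kg * c**2
    return ke_newton, ke_rel

# At 0.1% of c the two agree to better than one part in a million...
slow_n, slow_r = kinetic_energies(0.001)
# ...while at 90% of c Newtonian mechanics is off by more than a factor of 3.
fast_n, fast_r = kinetic_energies(0.9)
print(slow_r / slow_n, fast_r / fast_n)
```

The Newtonian formula is not "wrong" inside its domain; it simply says nothing about what happens outside it, which is the same status the argument above assigns to quantum mechanics below the Planck length.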
In summary, the Planck length does not constitute evidence for or against chunkiness of spacetime below its value.
Yeah, the lack of a valid response when you try to go below the Planck length has a special meaning to me as a software engineer. 'Undefined results' are what happens in software when you try to access something that doesn't exist. This only lends credence to simulation theory: when you get that small, you're going below the intended use case by trying to access data the simulation doesn't have. It makes perfect sense that you get garbage back in that instance.
Epistemologically, I don't find your argument to be stronger than mine on this matter. You're just operating from different initial assumptions about the universe.
And this leads to the second issue which involves inference to the most likely explanation:
Attributing the apparent smoothness of spacetime to insufficient resolution, despite our using every means available, is not a parsimonious explanation: it postulates conditions that make a thing appear to be something it is not.
The parsimonious explanation is that what we are seeing is indeed what is, and this is the explanation against which the case must be made: continuous spacetime is the parsimonious explanation, and chunky spacetime would need to be demonstrated in order to reject it.
There is no evidence yet in support of discrete spacetime but it is a possibility.
You are simply extrapolating from a completely different regime (larger scales, where the resolution is sufficient for a continuous-spacetime approximation), and forcing the observations we've made about the universe into these unsupported assumptions.
There is no evidence yet in support of continuous spacetime. But it is certainly a possibility.
I think you are misinterpreting what I'm saying.
I stated earlier in the conversation that the Planck length is less evidence of discreteness than a necessary constraint on the current mathematics of quantum mechanics, which requires quantized spacetime in order to work.
A reconciliation of the apparent contradiction between the continuous spacetime of the macroscopic world, as described by general relativity, and the quantized spacetime of quantum mechanics requires a new theory of gravity.
I'm not talking about extrapolating relativity down into small-scale physics, but about the quantum theory of gravity that could possibly resolve the paradox.
That quantum theory of gravity does not yet exist, and the math it requires is not yet known.
So the missing link between relativity and quantum mechanics is a quantum theory of gravity, and there is no convincing evidence yet on whether such a theory requires a discrete minimum length. If it does, spacetime is more likely to be discrete; if it doesn't, the continuous spacetime of relativity extends into the subatomic realm (though not as an extrapolation of relativity itself).
Continuous spacetime requires fewer moving parts than discrete spacetime and therefore remains the favored hypothesis, though it is not a strong case at all.
All of this could be resolved once we have a quantum theory of gravity, but it remains unsettled for now.
From the article:
Einstein's theory cannot answer it. We assume that there's a quantum theory of gravity out there, but we don't know whether that theory will also require a distance-scale cutoff or not. Heisenberg's original argument came about from trying (and failing) to renormalize Enrico Fermi's original theory of beta decay; the development of electroweak theory and the Standard Model removed the need for a discrete minimum length. Perhaps, with a quantum theory of gravity, we won't need a minimum length scale to renormalize any and all of our theories.