Bayesian statistics are sort of a weird bunny, IMO. They are useful for knowing the "priors" if you are otherwise stabbing in the dark and need to know which 'ballpark' to look in. But by definition they add an element of bias to your analysis. I had a stats prof in grad school who thought Bayes was basically The Devil....
Speaking of the devil - here are some thoughts:
Bayesian reasoning is a process, not a series of discrete operations that can be evaluated while disregarding the history that produced the state constituting the "prior". That prior goes into the estimation of the "posterior", which of course becomes the new "prior", and so on.
As such, the "prior" will differ between evaluators, since they have different histories; they may also differ in how they assess the strength of evidence.
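To make that process concrete, here is a minimal sketch of my own (a toy illustration using the standard Beta-Bernoulli model, not anything from the text): two evaluators with different "histories", i.e. different priors on a coin's bias, update on the same evidence, and each posterior simply becomes the next prior.

```python
# Toy illustration: sequential Bayesian updating where the posterior
# becomes the new prior, and where two evaluators with different
# priors reach different conclusions from identical evidence.

def update(prior_heads, prior_tails, observations):
    """Conjugate Beta-Bernoulli update: the counts just accumulate,
    so today's posterior is literally tomorrow's prior."""
    a, b = prior_heads, prior_tails
    for flip in observations:
        if flip == "H":
            a += 1
        else:
            b += 1
    return a, b

evidence = ["H", "H", "T", "H", "H", "H", "T", "H"]  # 6 heads, 2 tails

# Evaluator 1 starts nearly uninformed; evaluator 2 "knows" coins are fair.
a1, b1 = update(1, 1, evidence)      # prior Beta(1, 1)
a2, b2 = update(50, 50, evidence)    # prior Beta(50, 50)

print(a1 / (a1 + b1))  # posterior mean close to the observed frequency
print(a2 / (a2 + b2))  # posterior mean still dominated by the strong prior
```

Same data, different histories, different posteriors: the "bias" my old prof complained about is right there in the arithmetic.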
In addition, tacit knowledge goes into the evaluation, and tacit knowledge is by definition not easily analyzed.
So it looks like Bayesian reasoning lacks rigor and precision.
But then we have to ask what that lack of rigor and precision is being compared to, and here things become a bit weird.
If we look at a single neuron, we see that it consists of a cell body and a kind of biological wire capable of carrying a frequency-modulated output to other cells, including other neurons. This "wire" is called an axon, and the individual impulses it carries are called action potentials.
The axon ends in a multitude of synapses that release neurotransmitters from small vesicles, influencing the membrane potential of the target cell. If that target cell is another neuron, this modulates the probability of its next action potential, which then travels again via the axon, and so on.
Whether the next action potential is delayed or accelerated is a function of the excitatory and inhibitory inputs a neuron receives and of its own state at the moment those inputs are integrated. That state is the "prior", since it is determined by previous inputs that are themselves derived from the states of other groups of neurons.
Even in this simplified view, it is clear that the "prior" is the result of the state of a network and not just a local phenomenon.
The rapid depolarization and propagation that characterize an action potential are an all-or-nothing phenomenon and can be seen as "digital", although the rate modulation around the base firing rate is analog again, and the continuous modulation of the membrane potential is analog and infinitely variable within its constraints.
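The mix of analog integration and digital spiking can be sketched with the standard leaky integrate-and-fire model (a textbook simplification, not the author's; all parameter values here are arbitrary illustrations): the membrane potential evolves continuously, a threshold crossing triggers an all-or-nothing spike, and the input level modulates the resulting firing rate.

```python
# Toy leaky integrate-and-fire neuron: analog membrane potential,
# digital all-or-nothing spikes, input-dependent (rate-coded) output.

def simulate(input_current, steps=1000, dt=0.1, tau=10.0,
             threshold=1.0, v_reset=0.0):
    v = v_reset
    spikes = 0
    for _ in range(steps):
        # Leaky integration: the current state (the "prior") decays toward
        # rest while new input is continuously added.
        v += dt * (-v / tau + input_current)
        if v >= threshold:   # all-or-nothing event: spike and reset
            spikes += 1
            v = v_reset
    return spikes

# Below threshold the neuron stays silent; above it, stronger input
# produces a higher spike count, i.e. rate-modulated output.
for current in (0.05, 0.15, 0.3):
    print(current, simulate(current))
```

With these made-up parameters, the steady-state potential is roughly `input_current * tau`, so 0.05 never reaches the threshold while 0.15 and 0.3 fire at increasing rates.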
Now consider that the connections of a single neuron number in the tens of thousands and the neurons themselves in the billions, and it is obvious that the complexity is insane.
So the human brain appears to be made up of structures that are best described as electrochemical hybrid analog/digital information processors that integrate "prior" states with new inputs to generate a rate-modulated output.
That is how one would go about it if one wanted to construct a Bayesian calculator.
The key here is to understand that this Bayesian calculator is an analog computer. Analog computers are called analog(ue) because they operate as analogues of the real world. The classic issues with analog computers are limited precision and the difficulty of programming such a thing.
The animal nervous system has answered the programming problem with ongoing remodeling throughout life: rewiring, either functional or physical, is a basic feature known as neuroplasticity.
Precision is another matter, and we'll see that the ability to deal effortlessly with infinities (membrane potentials and firing rates are continuous variables, expressed in real numbers) comes with a different sort of precision and with great energetic efficiency.
Although continuous variables are expressed in real numbers, the brain does not deal in numbers but in analogues of real-world processes, particularly electrical and chemical analogues. That makes handling continuous variables essentially free in energetic terms, compared to simulating real-world processes in a simulated real-number space, and this is where computers come in.
Digital computers deal in numbers and cannot properly handle real numbers: they are restricted to the subset of the reals representable as floating-point numbers, and even those only up to a given precision and size.
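Those constraints are easy to demonstrate; here is a quick illustration (using Python's standard 64-bit floats):

```python
# Floating-point numbers are a finite subset of the reals.
import sys

# 0.1 and 0.2 have no exact binary representation, so the sum is off.
print(0.1 + 0.2 == 0.3)   # False
print(0.1 + 0.2)          # 0.30000000000000004

# At large magnitudes, the gap between adjacent floats exceeds 1,
# so adding 1 changes nothing.
print(1e16 + 1.0 == 1e16)  # True

# And there is a hard ceiling on representable magnitude (~1.8e308).
print(sys.float_info.max)
```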
Ultimately, digital computers are restricted to simulations of reality, with the realism of the simulation determined by brute-force computational effort.
These are hard constraints, and increases in number size and precision come at a high energetic cost. Brains made up of neurons operating as reality analogues do not have those constraints and can operate with incredible efficiency.
This is what lies behind the warnings about the energy-hungry "AI" world, warnings that have to be taken seriously.
Of course, the big question people have been asking is if computers will eventually become conscious.
The idea is that with increasing complexity, new emergent properties appear, and consciousness could be one of them.
The trouble here is that throwing more and more computing power at "AI" iterations does not eliminate the constraints on the actual operation of digital computers. The simulations may well become better, even indistinguishable from aspects of reality, and give the impression of increasing complexity, but we are still looking at a simulation of increased complexity.
Also, an algorithmic approach proceeding in discrete steps exerts no evolutionary pressure toward emergent consciousness. In a fuzzy, complex, self-organizing Bayesian calculator, by contrast, consciousness may be selected for as a way to deal with conflicting results that are not suppressed, and whose salience might actually be constitutive of consciousness.
(It appears that most of the work the brain does consists of suppressing neural activity, thus operating in the abductive logical mode: "eliminating everything but the most likely explanation.")
In conclusion, Bayesian analog calculators of the complexity of our nervous system are historical processes that operate continuously (not stepwise; even "prior" and "posterior" are somewhat smeared out in time, giving consciousness the space to operate in) and can handle continuous variables effortlessly.
Digital computers are not capable of that and this is a fundamental difference - no matter how good the simulations are.
Here is something about analog chips that illustrates how hyped up all that "AI" stuff really is with respect to machine consciousness:
The Unbelievable Zombie Comeback of Analog Computing (Wired): "Computers have been digital for half a century. Why would anyone want to resurrect the clunkers of yesteryear?"
https://www.wired.com/story/unbelievable-zombie-comeback-analog-computing/

And something about the computational constraints of digital computers:

Why Computers are Bad at Algebra (PBS Infinite Series)
https://www.youtube.com/watch?v=pQs_wx8eoQ8