

I reckon we are so incredibly complex, integrating so much information, that from the inside it's hard to tell whether you're freely deciding or just selecting, by rule, your preferred path given what you know.
You can call that complexity free will: we're all so different, having had different parents, different childhood experiences, different education, and different opportunities, that each of us has our own solution that rises to the top in any given situation.
But brain-activity recordings have also shown that for minor actions (like raising your hand), the neural buildup toward the action precedes the conscious "decision" to take it.
If you're a complex machine whose actions could be perfectly predicted (given full knowledge of everything you've ever experienced), it's still reasonable to punish you for breaking rules: the risk of punishment goes into your programming as part of the (deterministic) calculation of what action to take.