>>55
>Fooling them in what sense?
In the sense of causing algorithms to draw incorrect or incomplete conclusions.
>but even with that there's a chance that they could predict you from the data others feed it or simply by generalization.
Exactly. If you don't act like everyone else (which you're already doing by not submitting to the surveillance state), you're hard to predict.
>To actually cripple one you would need to generate significantly more fake data than everyone else combined
You don't need to make fake data. Segregation means you still use social media in some limited sense, but if you never let the accounts bleed together (which is the hard part), their models can't predict your actions as a full individual, only as a partial one.
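A minimal sketch of the idea, with made-up interest counts: a naive profile linker that matches accounts by behavioral similarity has nothing to go on if two compartmentalized accounts share zero overlap, but a single leak makes them linkable.

```python
# Toy sketch (hypothetical data, not any real platform's linker):
# two accounts belonging to one person, kept strictly segregated.
from collections import Counter

account_a = Counter({"privacy": 9, "linux": 7, "crypto": 4})
account_b = Counter({"cooking": 8, "gardening": 5})

def cosine(u, v):
    """Naive linker: cosine similarity over shared interest counts."""
    dot = sum(u[k] * v[k] for k in set(u) & set(v))
    if dot == 0:
        return 0.0  # no shared features -> nothing to link on
    norm = lambda c: sum(x * x for x in c.values()) ** 0.5
    return dot / (norm(u) * norm(v))

print(cosine(account_a, account_b))   # 0.0 -- fully segregated

# One slip-up bleeds a topic across accounts...
account_b["privacy"] = 3
print(cosine(account_a, account_b))   # now > 0: the profiles correlate
```

The point isn't that real linkers are this crude (they also use timing, device fingerprints, writing style, etc.), just that every shared signal is surface area for correlation.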
Don't attribute significant powers of discernment to modern AI. The technology is still in its infancy, and will be for some time.
>>57
>so we could also predict police actions?
Predictive policing doesn't really work, though. It's just a way to justify the crap police already do by claiming the algorithm made them do it.