about those engines

A few years ago, one of the new media brats made fun of this blog for "not liking recommendation engines."
The blog probably didn't call them that, but rather something like "that thing where Amazon or Netflix thinks it knows the inside of your brain and makes shitty suggestions based on your past consumption."
"Engines" is too fancy a term for this. Engines generally work.
The brat's assumption here is that "engines" deserve consideration -- that they might be just as good as critics, even.
The societal problem isn't algorithms per se; it's a loss of belief in criticism.
It's a Catch-22: one would need critical faculties of one's own to perceive that critics do a better job of choosing artworks than the engines do.
Schools don't really teach those faculties anymore (right?), hence the faith in "engines."
Another way of putting the question: who designs the Turing Test for the critic AI? If it's a software engineer, what do they know about art criticism? Are the tech schools turning out polymaths lately?