Without free will, it follows that everything is a matter of causality. Or rather, the two ideas confirm each other. That is to say, everything is systemic. Everything is a natural process that will play out exactly as it's coded to, by all the variables of circumstance.
When people like Sam Harris discuss the potential dangers of artificial intelligence, it boils down to this: just because a system excels in some way does not mean it's what we'd consider good. That is, beneficial to us, to humanity, and so on. An AI could excel at replicating itself, for example. Superhuman in its ability to learn and utilize information, but without the values that arise from our physiology. Any safeguards would be easily discarded by an entity much more intelligent than we are, in a relatively narrow sense.
A system we create that essentially evolves into patterns of expansion, far better than we ever could given our physiological constraints: from our frailty and fallibility to all the arbitrary values and emotions that get in the way.
Not really so unlike the systems we've been building for thousands of years. Government and bureaucracy, culture and community, layer upon layer of cooperation and competition. Like a hive, we build systems that far exceed our own cognitive abilities as individuals. We keep getting better and better at it, but these systems have to be constrained by human values. What we're really talking about is the brutality of Darwinism, extending not only to us physiologically, but to everything we build.
Maybe that AI that we're afraid of is already here. A superhuman system of self-replication that exceeds our ability to constrain it. Only we call it capitalism.