Locust Scenarios
A NEW ERA OF ASYNCHRONOUS CONFLICT
Even very modest machine intelligences can create surprising outcomes if granted full autonomy.
Think of a swarm of locusts - not intelligent as we are, yet driven by simple 'utility functions' to consume, propagate, and - most importantly - adapt. The analogy of an AI-controlled 'swarm of locusts' could be implemented quite literally - researchers are already experimenting with cyborg moths.
A number of factors, in combination, increase the risk of an AI event, even with sub-sapient machine intelligence:
Common drones and autos in our physical world (of many form factors, capabilities, and locations)
Distributed processors capable of running instances of neuromorphic code (self-learning, self-editing, self-healing).
Conversely, a single-point-of-failure cloud-based intelligence paired with a major exploit
Vulnerabilities created by Intelligence Community backdoors
Common software and hardware architectures (Robot OSs, Open Hardware)
Economic incentives creating a fraud and extortion sub-culture that encourages the discovery and exploitation of vulnerabilities
Although very unlikely to 'turn everyone into paperclips', periodic Locust Scenarios could create terror and unrest that put societies under strain.
The hacking of internet infrastructure and the creation of intentional backdoors by the Intelligence Community leave the entire world at greater risk - of the infrastructure being monitored and controlled by an AI entity, or of some or all of it being 'bricked' and rendered useless.
In fact, there appears to be evidence of intention to create a system that can function in such a manner.
As the Internet of Things proliferates, we can expect practically every drone to be infiltrated by Intelligence Community services, just as practically every phone can be presumed infiltrated today.
Centralized repositories of semantic, self-improving knowledge connected to thousands or millions of endpoints create a single point of failure. Today, a rogue worm might wipe your hard drive. Tomorrow, it might run you off the road.
Today, botnets and ransomware are used to stage a digital stick-up - i.e. "send money, or we'll ruin your business/personal life". These extortion tactics are pretty effective. Tomorrow, real bots might be hijacked and used to mug individuals, or worse, to conduct acts of micro-terrorism (the ultimate protection racket).
The economic conditions post-2007 led to a sharp increase in digital fraud and extortion. With up to 45% of jobs at risk of automation over the next decade, we can expect an increase in crimes committed with a machine intelligence accomplice and with drones - which, unlike in-person robbery, typically do not expose the offender to significant risk, and which can self-destruct if captured.
Drone missions may even be planned beforehand in virtual worlds, simulating a broad range of scenarios prior to final execution.
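As a rough illustration of what such scenario simulation involves in principle, here is a minimal Monte Carlo sketch in Python - the parameters (wind, battery, detection chance), thresholds, and success model are all hypothetical, invented purely for illustration and not drawn from any real system.

```python
import random

# Illustrative only: a toy Monte Carlo "pre-mission" rehearsal.
# All parameters and the success model below are hypothetical.

def simulate_once():
    wind = random.gauss(5.0, 2.0)        # assumed wind speed, m/s
    battery = random.uniform(0.7, 1.0)   # assumed remaining charge fraction
    detected = random.random() < 0.15    # assumed chance of being noticed en route
    # In this toy model, a run "succeeds" only when conditions are benign.
    return wind < 8.0 and battery > 0.8 and not detected

def estimate_success(trials=10_000):
    wins = sum(simulate_once() for _ in range(trials))
    return wins / trials

if __name__ == "__main__":
    p = estimate_success()
    print(f"Estimated success rate over 10,000 simulated scenarios: {p:.1%}")
```

The point is simply that rehearsing thousands of randomized what-ifs before acting is cheap and routine on commodity hardware.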
In a hungry society, a $100 drone equipped with a syringe full of oven-cleaner can become a tool of wicked enterprise.