“AI has an ‘explainability’ problem. Your algorithm did XYZ, and everyone wants to know why, but because of the way that machine learning works, even its programmers often can’t know why an algorithm reached the outcome that it did. It’s a black box. Now, when you enter the realm of autonomous weapons, and ask, ‘Why did you kill that person,’ the complete lack of an answer simply will not do — morally, legally, or practically.”
It will soon be as easy to produce convincing fake video as it is to lie. We need to be prepared.
Here are three ways to take control of the algorithms in your life.
Extending the article’s logic to a slightly more sinister side: just over a year ago, the following question would have struck the uninitiated as outlandish, the stuff of movie conspiracy fiction. How hard is it for any interest group (political, religious, ideological, or some mix of these) that seeks power or influence by any means, including legislation, surveillance, or social division and unrest it can exploit, to persuade, collude with, or enlist a foreign power or entity with access to or influence over foreign terrorist organizations (FTOs) or transnational criminal organizations (TCOs) to effectuate catastrophic events that facilitate or lead to those outcomes? Far easier than the 2016 election interference, which was carried out mostly by independent, opposing self-interests. And it could be done for free, out of personal desire or vengeance, or as a favor given or returned.
Hastily drafted laws passed under pressure tend to create new problems while doing little to counter threats from terrorists and violent extremists.
The Pentagon’s new autonomous system could embed bias, jeopardize civil liberties, and reshape the role of algorithms in the workplace.
Barr’s letter contains good news for the president, but it also raises ominous questions that only Mueller’s report can answer.