
Review of Frank Pasquale’s “A Rule of Persons, Not Machines: The Limits of Legal Automation” – Article by Adam Alonzi


Adam Alonzi


From the outset of his new paper, “A Rule of Persons, Not Machines: The Limits of Legal Automation,” Frank Pasquale, author of The Black Box Society: The Secret Algorithms That Control Money and Information, contends that software, given its brittleness, is not equipped to deal with the complexities of taking a case through court and reaching a verdict. As he understands it, an AI cannot deviate far from the rules laid down by its creator. This assumption, which is not quite right even at present, only slightly tinges an otherwise erudite, sincere, and balanced treatment of the topic. He does not show much faith in the use of past cases to create datasets for the next generation of paralegals, automated legal services, and, in the more distant future, lawyers and jurists.

Lawrence Zelenak has noted that when taxes were filed entirely on paper, provisions were kept limited to avoid imposing unreasonably irksome nuances on the average person. Tax-return software has eliminated this “complexity constraint.” Without it, he goes on to say, the laws, and the software that interprets them, become akin to a “black box” for those who must abide by them. William Gale has said taxes could be easily computed for “non-itemizers.” In other words, the government could use information it already has to present a “bill” to this class of taxpayers, saving time and money for all parties involved. Simplification, however, does not align with everyone’s interests. TurboTax, whose business is built entirely on helping ordinary people navigate the labyrinth that is the American federal income tax, saw such measures as a threat to its business model and put together a grassroots campaign to fight them. More than just another example of a business protecting its interests, this is an ominous foreshadowing of the escalation that will play out in many areas if and when legal AI becomes sufficiently advanced.

Pasquale writes: “Technologists cannot assume that computational solutions to one problem will not affect the scope and nature of that problem. Instead, as technology enters fields, problems change, as various parties seek to either entrench or disrupt aspects of the present situation for their own advantage.”

What he is referring to here, in everything but name, is an arms race. The vastly superior computational powers of robot lawyers may make the already perverse incentive to write ever more Byzantine rules even more attractive to bureaucracies and lawyers. The concern is that the clauses and dependencies hidden within contracts will quickly multiply, making them far too detailed even for professionals to make sense of in a reasonable amount of time. Because this sort of software may become a necessary accoutrement in most or all legal matters, the demand for it, or for professionals with access to it, will expand greatly at the expense of those who are unwilling or unable to adopt it. This, though Pasquale only hints at it, may lead to greater imbalances in socioeconomic power. On the other hand, he does not consider the possibility of bottom-up open-source (or state-led) efforts to create synthetic public defenders. While this may seem idealistic, it is fairly clear that the open-source model can compete with, and in some areas outperform, proprietary competitors.

It is not unlikely that, within subdomains of law, an array of arms races can and will arise between synthetic intelligences. If a robot lawyer knows its client is guilty, should it squeal? This will change the way jurisprudence works in many countries, but it would seem unwise to program any robot to knowingly lie about whether a crime, particularly a serious one, has been committed, including by omission. If it is fighting a punishment it deems overly harsh for a given offense, say trespassing to get a closer look at a rabid raccoon, or unintentional jaywalking, should it maintain its client’s innocence as a means to an end? A moral consequentialist, seeing that no harm was done (or, in some instances, could possibly have been done), may persist in pleading innocent. A synthetic lawyer may be more pragmatic than deontological, but it is not entirely correct, and certainly shortsighted, to (mis)characterize AI as only capable of blindly following a set of instructions, like a Fortran program made to compute the nth member of the Fibonacci sequence.

Human courts are rife with biases: judges hand down markedly more lenient decisions after taking a lunch break (favorable parole rulings climb to roughly 65% immediately after a break, which is nothing to sneeze at), attractive defendants are viewed more favorably by unwashed juries and trained jurists alike, and prejudices of all kinds against various “out” groups can tip the scales toward a guilty verdict or a harsher sentence. Why, then, would someone have an aversion to introducing AI into a system that is clearly ruled, in part, by the quirks of human psychology?

DoNotPay is an app that helps drivers fight parking tickets; it allows drivers with legitimate medical emergencies to gain exemptions. So, as Pasquale says, not only will traffic management be automated, but so will appeals. However, as he cautions, a flesh-and-blood lawyer takes responsibility for bad advice. DoNotPay not only fails to take responsibility, but “holds its client responsible for when its proprietor is harmed by the interaction.” There is little reason to think machines would do a worse job of adhering to privacy guidelines than human beings would, unless, as in the earlier example of a machine ratting on its client, some overriding principle compelled them to divulge information to protect others from harm, say when a client’s diagnosis makes them a danger in their personal or professional life. Is the client responsible for the mistakes of the robot it has hired? Should the blame not fall upon the firm that provided the service?

Making a blockchain that could handle the demands of processing purchases and sales, one that takes into account all the variables relevant to making expert judgments on a matter, is no small task. As the infamous disagreement over the meaning of the word “chicken” in Frigaliment Importing Co. v. B.N.S. International Sales Corp. illustrates, pinning down the definition of even an ordinary term can be puzzling. The need to maintain a decent reputation in order to keep selling is a strong incentive against knowingly cheating customers, and although cheating tends to be the exception for this reason, it is still necessary to protect against it. As one official at the Commodity Futures Trading Commission put it, “where a smart contract’s conditions depend upon real-world data (e.g., the price of a commodity future at a given time), agreed-upon outside systems, called oracles, can be developed to monitor and verify prices, performance, or other real-world events.”
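
To make the oracle idea concrete, here is a minimal sketch, in ordinary Python rather than any real smart-contract language, of a toy clause that settles against an agreed-upon outside price feed. The PriceOracle and FuturesClause names and the hard-coded price are invented for illustration; an actual deployment would run this logic on-chain and pull data from a live oracle service.

```python
# Illustrative sketch only: a toy contract clause whose settlement depends on
# an outside "oracle," as the CFTC official quoted above describes. The names
# and numbers here are hypothetical, not any real platform's API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class PriceOracle:
    """Stand-in for an agreed-upon outside system that reports real-world data."""
    fetch_price: Callable[[str], float]

@dataclass
class FuturesClause:
    commodity: str
    strike_price: float
    quantity: int

    def settle(self, oracle: PriceOracle) -> float:
        """Compute the payout once the oracle reports the observed price."""
        observed = oracle.fetch_price(self.commodity)
        # The buyer profits by the amount the observed price exceeds the strike.
        return (observed - self.strike_price) * self.quantity

# Example usage with a hard-coded oracle in place of a live data feed.
oracle = PriceOracle(fetch_price=lambda symbol: {"chicken": 0.42}[symbol])
clause = FuturesClause(commodity="chicken", strike_price=0.40, quantity=10_000)
print(f"Settlement owed to buyer: ${clause.settle(oracle):,.2f}")
```

The point of the sketch is that the contract’s logic is trivial; the hard part, as the Frigaliment example suggests, is agreeing on what the oracle should measure in the first place.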

Pasquale cites the SEC’s decision to force providers of asset-backed securities to file “downloadable source code in Python.” AmeriCredit responded by saying it “should not be forced to predict and therefore program every possible slight iteration of all waterfall payments” because its business is “automobile loans, not software development.” AmeriCredit does not seem to be familiar with machine learning. There is a case for making all financial transactions and agreements explicit on an immutable platform like a blockchain. There is also a case for making all such code open source, ready to be scrutinized by those with the talents to do so or, in the near future, by those with access to software that can quickly turn it into plain English, Spanish, Mandarin, Bantu, Etruscan, etc.
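
For readers wondering what “programming waterfall payments” actually involves, the sketch below shows a drastically simplified sequential-pay waterfall in ordinary Python. The tranche names and dollar figures are invented for illustration; real securitization waterfalls add triggers, fees, and exceptions, which is precisely the combinatorial burden AmeriCredit was protesting.

```python
# Illustrative sketch only: a drastically simplified payment "waterfall" of the
# kind the SEC rule contemplates filing as Python source. Tranche names, amounts,
# and the strict sequential-pay rule are hypothetical, not from any actual deal.
from collections import OrderedDict

def run_waterfall(collections: float, tranches: "OrderedDict[str, float]") -> dict:
    """Distribute a period's collections to tranches in order of seniority.

    `tranches` maps tranche name -> amount currently owed, in priority order.
    Returns the amount paid to each tranche; whatever is left is the residual.
    """
    payments = {}
    remaining = collections
    for name, owed in tranches.items():
        paid = min(owed, remaining)   # senior claims are paid before junior ones
        payments[name] = paid
        remaining -= paid
    payments["residual"] = remaining  # anything left flows to the equity holder
    return payments

# Example: $1.2M collected this period against three tranches in seniority order.
tranches = OrderedDict([
    ("senior_A", 900_000.0),
    ("mezzanine_B", 250_000.0),
    ("junior_C", 150_000.0),
])
print(run_waterfall(1_200_000.0, tranches))
```

Even this toy version makes the dispute legible: publishing the code forces the issuer to state, unambiguously and testably, who gets paid first and by how much.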

In the fallout from the 2008 crisis, some homeowners noticed that the entities named on their foreclosure paperwork did not match the paperwork they received when their mortgages were sold to a trust. According to Dayen (2010), many banks did not fill out the paperwork at all. This seems to be a rather forceful argument in favor of incorporating synthetic agents into law practices. Like many futurists, Pasquale foresees an increase in “complementary automation.” Humans cooperating with chess engines can still trounce the best engines playing alone, a commonly cited example of how two (very different) heads are better than one. Yet going to a lawyer is not like visiting a tailor. People, including fairly delusional ones, know whether their clothes fit. They do not know whether they have received expert counsel, although the outcome of the case might give them a hint.

Pasquale concludes his paper by asserting that “the rule of law entails a system of social relationships and legitimate governance, not simply the transfer and evaluation of information about behavior.” This is closely related to the doubts expressed at the beginning of the piece about the usefulness of datasets in training legal AI. He then states that those in the legal profession must handle “intractable conflicts of values that repeatedly require thoughtful discretion and negotiation.” This appears to be the legal equivalent of epistemological mysterianism, and it stands on still shakier ground than its analogue, because it is clear that laws are, or should be, rooted in some set of criteria agreed upon by the members of a given jurisdiction. Shouldn’t the rulings of lawmakers and the values that inform them be at least partially quantifiable? There are efforts, like EthicsNet, that are trying to prepare datasets and criteria to feed machines in the future (because they will certainly have to be fed by someone!). There is no doubt that the human touch in law will not be supplanted soon, but the question is whether our intuition should be exalted as a guarantee of fairness or recognized as a hindrance to moving beyond a legal system bogged down by the baggage of human foibles.

Adam Alonzi is a writer, biotechnologist, documentary maker, futurist, inventor, programmer, and author of the novels A Plank in Reason and Praying for Death: A Zombie Apocalypse. He is an analyst for the Millennium Project, the Head Media Director for BioViva Sciences, and Editor-in-Chief of Radical Science News. Listen to his podcasts here. Read his blog here.

What Everyone is Missing About Trump Literally Going Nuclear on Twitter – Article by Carey Wedler


Carey Wedler
******************************

Trump critics have long cautioned that the president-elect’s impending administration will be a disaster, often referencing the potential for nuclear war. Trump can’t be trusted with his hand on the nuclear button, many warn.

On Friday, President-elect Donald Trump confirmed these fears when he tweeted in favor of expanding America’s nuclear arsenal.

His statement drew widespread criticism, with Twitter users sounding alarms about the end of the world as we know it. Few people outside Trump’s loyal fan base would deny the severe risks Trump poses by vowing to expand America’s nuclear arsenal. But as the media launches a barrage of condescending condemnations of Trump’s nuclear fantasies, many outlets are ignoring vital context.

Trump appears to represent chaos and danger and would undoubtedly hamper U.S. and global interests by bloating the country’s nuclear weapons systems. His recklessness contrasts starkly with Obama’s seemingly reasoned approach.

Earlier this year, Obama asserted that “Of all the threats to global security and peace, the most dangerous is the proliferation and potential use of nuclear weapons.”

But when Obama made those comments, he had already directly contradicted his own rhetoric against the destructive weaponry. During his presidency, he ensured the country’s nuclear triad would be modernized, a massive project of sweeping upgrades to the nation’s warheads, bombers, missiles, and submarines. This endeavor is slated to cost over one trillion dollars over the next three decades, an indicator that the U.S. government, under Obama’s guidance, seeks to establish a long-term commitment to nuclear arms.

This is hardly evidence of a president committed to reducing the influence and dangers of nuclear weapons, even as he preaches to other nations about the need to dispose of them. Though he deserves a modicum of credit for committing $5 billion to efforts to better secure nuclear weapons, this accounts for less than one percent of the United States’ total military budget. He ultimately scaled back his original goals on this endeavor.

Further, according to figures from the Pentagon, “the current administration has reduced the nuclear stockpile less than any other post-Cold War presidency,” the New York Times reported in May. President Obama “reduced the size of the nation’s nuclear stockpile at a far slower rate than did any of his three immediate predecessors, including George [H. W.] Bush and George W. Bush.”

According to Hans M. Kristensen of the anti-armament Nuclear Information Project, though Obama’s progress has been disappointing on some fronts, he deserves credit for dismantling some nuclear warheads over the years. Kristensen also cited Russia as a justification for Obama’s less-than-impressive record on disarmament.

“His vision of significant reductions and putting an end to Cold War thinking has been undercut by opposition ranging from Congress to the Kremlin,” Mr. Kristensen wrote in a blog post. “An entrenched and almost ideologically-opposed Congress has fought his arms reduction vision every step of the way.”

Though Kristensen lays some of the blame on tensions with Russia, the president’s own policies have exacerbated this very rift with the one country that holds more nuclear weapons than the United States, which comes in second place. Amid his resistance to finding common ground with Russian President Vladimir Putin in the fight against the Islamic State and other radical factions in Syria, Obama has escalated hostilities between the two countries, offering only a few ill-fated efforts to cooperate.

In one such example, Secretary of State John Kerry insisted Russia must be held accountable for bombing hospitals in Syria. This is a noble goal, and the Russian military has inexcusably killed civilians, but the secretary failed to acknowledge that the United States military bombed a Doctors Without Borders hospital in Afghanistan in 2015 and that the U.S.-backed, Saudi-led assault on Yemen has destroyed multiple hospitals in the last year and killed thousands of civilians. He also failed to note that the United States has killed one million Iraqis.

Obama has also publicly adopted the Democratic narrative that Russia hacked the U.S. election and vowed retaliation despite the fact that no evidence has been presented. These accusations led Putin to intimate that the U.S. needs to present proof or, essentially, shut up (interestingly, a Clinton presidency, which establishment figures claim was thwarted by Russia, would have carried a greater risk of nuclear confrontation due to her interventionist approach to the Syrian conflict).

Before these developments, the consistent deterioration of U.S.-Russia ties inspired scientists who operate a “Doomsday Clock” to keep the time at three minutes to midnight. Midnight represents doomsday. They wrote:

“That decision is not good news, but an expression of dismay that world leaders continue to fail to focus their efforts and the world’s attention on reducing the extreme danger posed by nuclear weapons and climate change.”

(Indeed, scientists have warned that nuclear weapons are a major threat to the environment, a reality apparently overlooked by President Obama.)

The ongoing hypocrisy on the part of the Obama administration is exactly why it’s difficult to take outrage about Donald Trump’s nuclear designs seriously.

There is no doubt Trump is advocating dangerous policies on nuclear weapons, but as with many other issues currently terrifying Americans fearful of The Donald, Obama set the stage for Trump to implement his aggressive goals. Obama’s expansion of presidential powers, such as setting the precedent that presidents may kill American citizens without trial, will make it that much easier for President Trump to impose ill-advised, risky policies.

The same is to be expected once Trump takes control of a nuclear arsenal Obama dutifully expanded.


Carey Wedler joined Anti-Media as an independent journalist in September of 2014. Her topics of interest include the police and warfare states, the Drug War, the relevance of history to current problems and solutions, and positive developments that drive humanity forward. She currently resides in Los Angeles, California, where she was born and raised.

This article (What Everyone is Missing About Trump Literally Going Nuclear on Twitter) is free and open-source. You have permission to republish this article under a Creative Commons license with attribution to Carey Wedler and theAntiMedia.org. Anti-Media Radio airs weeknights at 11 pm Eastern/8 pm Pacific.