Tag: privacy

Gennady Stolyarov II Interviews Ray Kurzweil at RAAD Fest 2018


Gennady Stolyarov II
Ray Kurzweil


The Stolyarov-Kurzweil Interview has been released at last! Watch it on YouTube here.

U.S. Transhumanist Party Chairman Gennady Stolyarov II posed a wide array of questions for inventor, futurist, and Singularitarian Dr. Ray Kurzweil on September 21, 2018, at RAAD Fest 2018 in San Diego, California. Topics discussed include advances in robotics and the potential for household robots, artificial intelligence and overcoming the pitfalls of AI bias, the importance of philosophy, culture, and politics in ensuring that humankind realizes the best possible future, how emerging technologies can protect privacy and verify the truthfulness of information being analyzed by algorithms, as well as insights that can assist in the attainment of longevity and the preservation of good health – including a brief foray into how Ray Kurzweil overcame his Type 2 Diabetes.

Learn more about RAAD Fest here. RAAD Fest 2019 will occur in Las Vegas during October 3-6, 2019.

Become a member of the U.S. Transhumanist Party for free, no matter where you reside. Fill out our Membership Application Form.

Watch the presentation by Gennady Stolyarov II at RAAD Fest 2018, entitled, “The U.S. Transhumanist Party: Four Years of Advocating for the Future”.

Fifth Enlightenment Salon – Discussions on Longevity, Gene Therapy, Overcoming Disabilities, Animal Lifespans, Education, and Privacy


Gennady Stolyarov II
Bill Andrews
James Kohagen
Bobby Ridge
John Murrieta


On October 13, 2018, in the spirit of the Age of Enlightenment and its furtherance today, Gennady Stolyarov II, Bill Andrews, James Kohagen, Bobby Ridge, and John Murrieta met for the fifth interdisciplinary discussion – hosted by Mr. Stolyarov – on science, culture, education, advocacy, and policy. Subjects discussed included the following:

– The recent RAAD Fest 2018 in San Diego
– Developments in the field of gene therapy
– Advances in epidural stimulation for treating and overcoming spinal-cord injuries
– Long-lived organisms and their similarities and dissimilarities to humans
– How animal experiments can become more humane
– How contemporary science still has far to go to accumulate even fairly basic information about certain organisms
– How the study of lifespans can be included in educational curricula starting at early childhood
– Whether privacy will remain in a more technologically interconnected future.

Join the U.S. Transhumanist Party for free, no matter where you reside, by filling out an application form that takes less than a minute.

Find out about Death is Wrong – the illustrated children’s book on indefinite life extension.

Review of Frank Pasquale’s “A Rule of Persons, Not Machines: The Limits of Legal Automation” – Article by Adam Alonzi


Adam Alonzi


From the beginning of his new paper, “A Rule of Persons, Not Machines: The Limits of Legal Automation”, Frank Pasquale, author of The Black Box Society: The Secret Algorithms That Control Money and Information, contends that software, given its brittleness, is ill-suited to the complexities of taking a case through court and establishing a verdict. As he understands it, an AI cannot deviate far from the rules laid down by its creator. This assumption, which is not quite right even at the present time, only slightly tinges an otherwise erudite, sincere, and balanced treatment of the topic. He does not show much faith in the use of past cases to create datasets for the next generation of paralegals, automated legal services, and, in the more distant future, lawyers and jurists.

Lawrence Zelenak has noted that when taxes were filed entirely on paper, provisions were limited to avoid unreasonably imposing irksome nuances on the average person. Tax-return software has eliminated this “complexity constraint.” Without it, he goes on to state, the laws, and the software that interprets them, become akin to a “black box” for those who must abide by them. William Gale has said taxes could be easily computed for “non-itemizers.” In other words, the government could use information it already has to present a “bill” to this class of taxpayers, saving time and money for all parties involved. However, simplification does not always align with everyone’s interests. TurboTax, whose business is built entirely on helping ordinary people navigate the labyrinth that is the American federal income tax, saw a threat to its business model and put together a grassroots campaign to fight such measures. More than just another example of a business protecting its interests, it is an ominous foreshadowing of an escalation scenario that will transpire in many areas if and when legal AI becomes sufficiently advanced.

Pasquale writes: “Technologists cannot assume that computational solutions to one problem will not affect the scope and nature of that problem. Instead, as technology enters fields, problems change, as various parties seek to either entrench or disrupt aspects of the present situation for their own advantage.”

What he is referring to here, in everything but name, is an arms race. The vastly superior computational powers of robot lawyers may make the already perverse incentive to write ever more Byzantine rules still more attractive to bureaucracies and lawyers. The concern is that the clauses and dependencies hidden within contracts will quickly explode, making them far too detailed even for professionals to make sense of in a reasonable amount of time. If this sort of software becomes a necessary accoutrement in most or all legal matters, the demand for it, or for professionals with access to it, will expand greatly at the expense of those who are unwilling or unable to adopt it. This, though Pasquale only hints at it, may lead to greater imbalances in socioeconomic power. On the other hand, he does not consider the possibility of bottom-up open-source (or state-led) efforts to create synthetic public defenders. While this may seem idealistic, it is fairly clear that the open-source model can compete with and, in some areas, outperform proprietary competitors.

It is not unlikely that, within subdomains of law, an array of arms races can and will arise between synthetic intelligences. If a lawyer knows its client is guilty, should it squeal? This will change the way jurisprudence works in many countries, but it would seem unwise to program any robot to knowingly lie about whether a crime, particularly a serious one, has been committed – including by omission. If it is fighting against a punishment it deems overly harsh for a given crime – trespassing to get a closer look at a rabid raccoon, say, or unintentional jaywalking – should it maintain its client’s innocence as a means to an end? A moral consequentialist, seeing that no harm was done (or, in some instances, could possibly have been done), may persist in pleading innocent. A synthetic lawyer may be more pragmatic than deontological, but it is not entirely correct, and certainly shortsighted, to (mis)characterize AI as only capable of blindly following a set of instructions, like a Fortran program made to compute the nth member of the Fibonacci series.

Human courts are rife with biases: judges give more lenient sentences after taking a lunch break (65% more likely to grant parole – nothing to sneeze at), attractive defendants are viewed favorably by unwashed juries and trained jurists alike, and prejudices of all kinds against various “out” groups can tip the scales toward a guilty verdict or a harsher sentence. Why, then, would someone have an aversion to the introduction of AI into a system that is clearly ruled, in part, by the quirks of human psychology?

DoNotPay is an app that helps drivers fight parking tickets. It allows drivers with legitimate medical emergencies to gain exemptions. So, as Pasquale says, not only will traffic management be automated, but so will appeals. However, as he cautions, a flesh-and-blood lawyer takes responsibility for bad advice. DoNotPay not only fails to take responsibility, but “holds its client responsible for when its proprietor is harmed by the interaction.” There is little reason to think machines would do a worse job of adhering to privacy guidelines than human beings unless, as in the example of a machine ratting on its client, there is some overriding principle that would compel them to divulge the information to protect several people from harm if their diagnosis in some way makes them a danger in their personal or professional life. Is the client responsible for the mistakes of the robot it has hired? Should the blame not fall upon the firm that has provided the service?

Making a blockchain that could handle the demands of processing purchases and sales, one that takes into account all the relevant variables to make expert judgements on a matter, is no small task. As the infamous disagreement over the meaning of the word “chicken” in Frigaliment Importing Co. v. B.N.S. International Sales Corp. illustrates, defining even everyday terms can be puzzling. The need to maintain a decent reputation in order to maintain sales is a strong incentive against knowingly cheating customers, but although cheating tends to be the exception for this reason, it is still necessary to protect against it. As one official at the Commodity Futures Trading Commission put it, “where a smart contract’s conditions depend upon real-world data (e.g., the price of a commodity future at a given time), agreed-upon outside systems, called oracles, can be developed to monitor and verify prices, performance, or other real-world events.”
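The oracle pattern the CFTC official describes can be sketched in a few lines of Python. This is purely illustrative: the class and method names here are invented for the example and do not correspond to any real contract platform's API.

```python
# Illustrative sketch of an oracle-gated "smart contract": settlement
# only executes once an agreed-upon outside system (the oracle)
# reports that the real-world condition has been met.

class PriceOracle:
    """Stands in for an agreed-upon outside system that monitors prices."""
    def __init__(self):
        self._price = None

    def report(self, price: float) -> None:
        self._price = price

    def latest(self) -> float:
        if self._price is None:
            raise RuntimeError("no price reported yet")
        return self._price


class FuturesContract:
    """Pays the buyer if the settlement price exceeds the strike price."""
    def __init__(self, strike: float, payout: float, oracle: PriceOracle):
        self.strike = strike
        self.payout = payout
        self.oracle = oracle
        self.settled = False

    def settle(self) -> float:
        if self.settled:
            raise RuntimeError("contract already settled")
        self.settled = True
        # The contract's condition depends on real-world data, so it
        # consults the oracle rather than trusting either party's word.
        return self.payout if self.oracle.latest() > self.strike else 0.0


oracle = PriceOracle()
contract = FuturesContract(strike=100.0, payout=10.0, oracle=oracle)
oracle.report(105.0)      # the oracle verifies the real-world price
print(contract.settle())  # -> 10.0: condition met, the buyer is paid
```

The point of the design is that neither party to the contract supplies the settlement data; both have agreed in advance to defer to the oracle, which is exactly where disputes like the one over “chicken” would otherwise arise.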

Pasquale cites the SEC’s decision to force providers of asset-backed securities to file “downloadable source code in Python.” AmeriCredit responded by saying it “should not be forced to predict and therefore program every possible slight iteration of all waterfall payments” because its business is “automobile loans, not software development.” AmeriCredit does not seem to be familiar with machine learning. There is a case for making all financial transactions and agreements explicit on an immutable platform like a blockchain. There is also a case for making all such code open source, ready to be scrutinized by those with the talents to do so or, in the near future, by those with access to software that can quickly turn it into plain English, Spanish, Mandarin, Bantu, Etruscan, etc.

During the fallout of the 2008 crisis, some homeowners noticed that the entities on their foreclosure paperwork did not match the paperwork they received when their mortgages were sold to a trust. According to Dayen (2010), many banks did not fill out the paperwork at all. This seems to be a rather forceful argument in favor of the incorporation of synthetic agents into law practices. Like many futurists, Pasquale foresees an increase in “complementary automation.” Human-machine teams in chess can still trounce the best engines out there – a commonly cited example of how two (very different) heads are better than one. Yet going to a lawyer is not like visiting a tailor. People, including fairly delusional ones, know if their clothes fit; they do not know whether they have received expert counsel – although the outcome of the case might give them a hint.

Pasquale concludes his paper by asserting that “the rule of law entails a system of social relationships and legitimate governance, not simply the transfer and evaluation of information about behavior.” This is closely related to the doubts expressed at the beginning of the piece about the usefulness of datasets in training legal AI. He then states that those in the legal profession must handle “intractable conflicts of values that repeatedly require thoughtful discretion and negotiation.” This appears to be the legal equivalent of epistemological mysterianism. It stands on still shakier ground than its analogue, because it is clear that laws are, or should be, rooted in some set of criteria agreed upon by the members of a given jurisdiction. Shouldn’t the rulings of lawmakers and the values that inform them be at least partially quantifiable? There are efforts, like EthicsNet, which are trying to prepare datasets and criteria to feed machines in the future (because they will certainly have to be fed by someone!). There is no doubt that the human touch in law will not be supplanted soon, but the question is whether our intuition should be exalted as a guarantee of fairness or seen as a hindrance to moving beyond a legal system bogged down by the baggage of human foibles.

Adam Alonzi is a writer, biotechnologist, documentary maker, futurist, inventor, programmer, and author of the novels A Plank in Reason and Praying for Death: A Zombie Apocalypse. He is an analyst for the Millennium Project, the Head Media Director for BioViva Sciences, and Editor-in-Chief of Radical Science News. Listen to his podcasts here. Read his blog here.

Transhumanism: Contemporary Issues – Presentation by Gennady Stolyarov II at VSIM:17 Conference in Ravda, Bulgaria


G. Stolyarov II


Gennady Stolyarov II, Chairman of the U.S. Transhumanist Party, outlines common differences in perspectives in three key areas of contemporary transhumanist discourse: artificial intelligence, religion, and privacy. Mr. Stolyarov follows his presentation of each issue with the U.S. Transhumanist Party’s official stances, which endeavor to resolve commonplace debates and find new common ground in these areas. Watch the video of Mr. Stolyarov’s presentation here.

This presentation was delivered by Mr. Stolyarov on September 14, 2017, virtually to the Vanguard Scientific Instruments in Management 2017 (VSIM:17) Conference in Ravda, Bulgaria. Mr. Stolyarov was introduced by Professor Angel Marchev, Sr. – the organizer of the conference and the U.S. Transhumanist Party’s Ambassador to Bulgaria.

After his presentation, Mr. Stolyarov answered questions from the audience on the subjects of the political orientation of transhumanism, what the institutional norms of a transhuman society would look like, and how best to advance transhumanist ideas.

Download and view the slides of Mr. Stolyarov’s presentation (with hyperlinks) here.

Listen to the Transhumanist March (March #12, Op. 78), composed by Mr. Stolyarov in 2014, here.

Visit the website of the U.S. Transhumanist Party here.

Become a member of the U.S. Transhumanist Party for free, no matter where you reside. Fill out our Membership Application Form here.

Become a Foreign Ambassador for the U.S. Transhumanist Party. Apply here.

U.S. Transhumanist Party Support for H.R. 1868, the Restoring American Privacy Act of 2017 – Post by G. Stolyarov II


G. Stolyarov II
******************************

The United States Transhumanist Party and Nevada Transhumanist Party support H.R. 1868, the Restoring American Privacy Act of 2017, proposed by Rep. Jacky Rosen of Henderson, Nevada.

This bill, if enacted into law, would undo the power recently granted by S.J. Res. 34 for regional-monopoly Internet Service Providers (ISPs) to sell individuals’ private data – including browsing histories – without those individuals’ consent. For more details, read Caleb Chen’s article on Privacy News Online, “Congresswoman Rosen introduces Restoring American Privacy Act of 2017 to reverse S.J. Res. 34”.

Section I of the U.S. Transhumanist Party Platform states, “The United States Transhumanist Party strongly supports individual privacy and liberty over how to apply technology to one’s personal life. The United States Transhumanist Party holds that each individual should remain completely sovereign in the choice to disclose or not disclose personal activities, preferences, and beliefs within the public sphere. As such, the United States Transhumanist Party opposes all forms of mass surveillance and any intrusion by governmental or private institutions upon non-coercive activities that an individual has chosen to retain within his, her, or its private sphere. However, the United States Transhumanist Party also recognizes that no individuals should be protected from peaceful criticism of any matters that those individuals have chosen to disclose within the sphere of public knowledge and discourse.”

Neither governmental nor private institutions – especially private institutions with coercive monopoly powers granted to them by laws barring or limiting competition – should be permitted to deprive individuals of the choice over whether or not to disclose their personal information.

Individuals’ ownership over their own data and sovereignty over whether or not to disclose any browsing history or other history of online visitation to external entities are essential components of privacy, and we applaud Representative Rosen for her efforts to restore these concepts within United States federal law.

Become a member of the U.S. Transhumanist Party for free by filling out the membership application form here.

The IRS Believes All Bitcoin Users are Tax Cheats – Article by Jim Harper


Jim Harper
******************************

The Internal Revenue Service has filed a “John Doe” summons seeking to require U.S. Bitcoin exchange Coinbase to turn over records about every transaction of every user from 2013 to 2015. That demand is shocking in sweep, and it includes: “complete user profile, history of changes to user profile from account inception, complete user preferences, complete user security settings and history (including confirmed devices and account activity), complete user payment methods, and any other information related to the funding sources for the account/wallet/vault, regardless of date.” And every single transaction:

All records of account/wallet/vault activity including transaction logs or other records identifying the date, amount, and type of transaction (purchase/sale/exchange), the post transaction balance, the names or other identifiers of counterparties to the transaction; requests or instructions to send or receive bitcoin; and, where counterparties transact through their own Coinbase accounts/wallets/vaults, all available information identifying the users of such accounts and their contact information.

The demand is not limited to owners of large amounts of Bitcoin or to those who have transacted in large amounts. Everything about everyone.

Equally shocking is the weak foundation for making this demand. In a declaration submitted to the court, an IRS agent recounts having learned of tax evasion on the part of one Bitcoin user and two companies. On this basis, he and the IRS claim “a reasonable basis for believing” that all U.S. Coinbase users “may fail or may have failed to comply” with the internal revenue laws.

If that evidence is enough to create a reasonable basis to believe that all Bitcoin users evade taxes, the IRS is entitled to access the records of everyone who uses paper money.

Anecdotes and online braggadocio about tax avoidance are not a reasonable basis to believe that all Coinbase users are tax cheats whose financial lives should be opened to IRS investigators and the hackers looking over their shoulders. There must be some specific information about particular users; otherwise the IRS is seeking a general warrant, which the Fourth Amendment denies it the power to obtain.

Speaking of the Fourth Amendment, that rock-bottom “reasonable basis” standard is probably insufficient. Americans should, and probably do, have Fourth Amendment rights in information they entrust to financial services providers that are required by contract to keep it confidential. Observers of Fourth Amendment law know full well that the “third-party doctrine,” which cancels Fourth Amendment interests in shared information, is in retreat.

The IRS’s effort to strip away the privacy of all Coinbase users is broader than the government’s effort in recent cases dealing with cell-site location information. In the CSLI cases, the government sought data about particular suspects, using a standard below the probable-cause standard required by the Fourth Amendment (“specific and articulable facts showing that there are reasonable grounds to believe”).

In United States v. Benbow, we argued to the D.C. Circuit that people retain a property right in information they share with service providers under contractual privacy obligations. This information is a “paper or effect” for purposes of the Fourth Amendment. Accordingly, a probable cause standard should apply to accessing that data.

Again, the government in the CSLI cases sought information about the cell phone use of particular suspects, and that is controversial enough given the low standard of the Stored Communications Act. Here, the IRS is seeking data about every user of Bitcoin, using a standard that’s even lower.

Coinbase’s privacy policy only permits it to share user information with law enforcement when it is “compelled to do so.” That implies putting up a reasonable fight for the interests of its users. Given the low standard and the vastly overbroad demand, Coinbase seems obligated to put up that fight.

Jim Harper is a senior fellow at the Cato Institute, working to adapt law and policy to the information age in areas such as privacy, cybersecurity, telecommunications, intellectual property, counterterrorism, government transparency, and digital currency. A former counsel to committees in both the U.S. House and the U.S. Senate, he went on to represent companies such as PayPal, ICO-Teledesic, DigitalGlobe, and Verisign, and in 2014 he served as Global Policy Counsel for the Bitcoin Foundation.

Harper holds a JD from the University of California–Hastings College of Law.

This work by Cato Institute is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. Read the original article.

A Transhumanist Opinion on Privacy – Article by Ryan Starr


Ryan Starr

******************************

Privacy is a favorite topic of mine. Maintaining individual privacy is a crucial element of a free society. Yet there are many who want to invade it for personal or political gain. As our digital fingerprint becomes part of our notion of self, how do we maintain our personal privacy on an inherently impersonal network of data? Where do we draw the line on what is private, and how do we enforce it? These questions are difficult to answer from a short-term perspective. However, if we look further into the probable future, we can create a plan that helps protect the privacy of citizens today and for generations to come. Once we take into account the almost certain physical merger of human biology and technology, the answer becomes clear: our electronic data should be treated as part of our bodily autonomy.

The explosive success of social media has shown that we already view ourselves as partly digital entities. Where we go, what we eat, and whom we are with is proudly displayed in cyberspace for eternity. But beyond that, we store unique data about ourselves “securely” on the internet. Bank accounts, tax returns, even medical information are filed away on a server somewhere and specifically identified with us. It is no longer solely what we choose to let people see. We are physical and digital beings, and it is time we view these two sides as one before we take the next step into enhanced humanity.

Subdermal storage of electronic data is here, and its storage capabilities will expand rapidly. Soon we will be able to store a lot more than just access codes for our doors. It is hard to speculate exactly what people will choose to keep stored this way, and there may even come a time when what we see and hear is automatically stored this way. But before we go too far into what will be stored, we must understand how this information is accessed at present. These implants are currently based on NFC technology. Near-Field Communication is a method of storing and transmitting data wirelessly within a very short distance. Yes, “wireless” is the key word. It means that if I can connect my NFC tag to my smartphone just by waving my hand close to it (usually within an inch or so), then technically someone else can, too. While current antenna limitations and the discreetness of where a person’s tag is implanted make for a highly secure method of storage, advances in technology will eventually make it easier to access the individual. This is why it is urgent that we develop a streamlined policy for privacy.
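To make concrete how little stands between an NFC tag and a reader, here is a minimal Python sketch of decoding an NDEF “Text” record payload, the standard format for text stored on NFC tags. The payload in the example is fabricated for illustration; anyone whose phone can reach the tag can run the same few lines.

```python
# Minimal sketch of decoding an NDEF Text record payload (NFC Forum
# RTD-Text format). The example payload below is fabricated.

def decode_ndef_text(payload: bytes) -> str:
    """Decode the payload of an NDEF Text record.

    The first byte is a status byte: bit 7 selects UTF-16 vs. UTF-8,
    and the low 6 bits give the length of the language code that
    precedes the text itself.
    """
    status = payload[0]
    encoding = "utf-16" if status & 0x80 else "utf-8"
    lang_len = status & 0x3F
    return payload[1 + lang_len:].decode(encoding)

# A tag holding the text "door-code-4921" with an "en" language code:
payload = b"\x02en" + b"door-code-4921"
print(decode_ndef_text(payload))  # -> door-code-4921
```

The decoding itself involves no secret: the security of today's implants rests on the short read range and the hidden location of the tag, not on the data format, which is exactly the fragility the paragraph above describes.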

The current Transhumanist position is that personally collected intellectual property, whether stored digitally or organically, is the property of the individual. As such, it should be protected from unauthorized search and download. The current platform also states that each individual has the freedom to enhance their own body as they like, so long as it doesn’t negatively impact others. However, it does not specify what qualifies as a negative impact or how to prevent it. Morphological freedom is a double-edged sword. A person can enhance their ability to access information on themselves, but they can also use that ability to access others. It is entirely feasible that enhancements will be created that allow one person to hack another. And collecting personal data isn’t the only risk. What if the hacking victim has an artificial heart or an implanted insulin pump? The hacker could potentially access the code the medical device is operating with and change or delete it, ultimately leading to death. Another scenario might be hacking into someone’s enhanced sensory abilities. Much like in the novel Ender’s Game, a person could access another to see what they see. This ability could be abused in countless ways, ranging from government surveillance to sexual voyeurism. While this is still firmly within the realm of science fiction, a transhuman society will need to create laws to protect against these person-to-person invasions of privacy.

Now let’s consider mass data collection. Proximity beacons could easily and cheaply be scattered across stores and cities to function as passive collection points much like overhead cameras are today. Retail stands to gain significantly from this technology, especially if they are allowed access to intimate knowledge about customers. Government intelligence gathering also stands to benefit from this capability. Levels of adrenaline, dopamine, and oxytocin stored for personal health analysis could be taken and paired with location data to put together an invasive picture of how people are feeling in a certain situation. Far more can be learned and exploited when discreetly collected biodata is merged with publicly observable activity.

In my mind, these are concerns that should be addressed sooner rather than later. If we take the appropriate steps to preserve personal privacy in all domains, we can make a positive impact that will last into the 22nd century.
***
Ryan Starr is the leader of the Transhumanist Party of Colorado. This article was originally published on his blog, and has been republished here with his permission.

First They Came For the iPhones… – Article by Ron Paul


Ron Paul
******************************
The FBI tells us that its demand for a back door into the iPhone is all about fighting terrorism, and that it is essential to break in just this one time to find out more about the San Bernardino attack last December. But the truth is they had long sought a way to break Apple’s iPhone encryption and, like 9/11 and the PATRIOT Act, a mass murder provided just the pretext needed. After all, they say, if we are going to be protected from terrorism we have to give up a little of our privacy and liberty. Never mind that government spying on us has not prevented one terrorist attack.

Apple has so far stood up to the federal government’s demand that it force its employees to write a computer program to break into its own product. No doubt Apple CEO Tim Cook understands the damage it would do to his company for the world to know that the US government has a key to supposedly secure iPhones. But the principles at stake are even higher. We have a fundamental right to privacy. We have a fundamental right to go about our daily life without the threat of government surveillance of our activities. We are not East Germany.

Let’s not forget that this new, more secure iPhone was developed partly in response to Ed Snowden’s revelations that the federal government was illegally spying on us. The federal government was caught breaking the law but instead of ending its illegal spying is demanding that private companies make it easier for it to continue.

Last week we also learned that Congress is planning to join the fight against Apple – and us. Members are rushing to set up yet another federal commission to study how our privacy can be violated for false promises of security. Of course they won’t put it that way, but we can be sure that will be the result. Some in Congress are seeking to pass legislation regulating how companies can or cannot encrypt their products. This will suppress the development of new technology and will have a chilling effect on our right to be protected from an intrusive federal government. Any legislation Congress writes limiting encryption will likely be unconstitutional, but unfortunately Congress seldom heeds the Constitution anyway.

When FBI Director James Comey demanded a back door into the San Bernardino shooter’s iPhone, he promised that it was only for this one, extraordinary situation. “The San Bernardino litigation isn’t about trying to set a precedent or send any kind of message,” he said in a statement last week. Testifying before Congress just days later, however, he quickly changed course, telling the Members of the House Intelligence Committee that the court order and Apple’s appeals “will be instructive for other courts.” Does anyone really believe this will not be considered a precedent-setting case? Does anyone really believe the federal government will not use this technology again and again, with lower and lower thresholds?

According to press reports, Manhattan district attorney Cyrus Vance, Jr., has 175 iPhones with passcodes that the City of New York wants to access. We can be sure that is only the beginning.

We should support Apple’s refusal to bow to the FBI’s dangerous demands, and we should join forces to defend our precious liberties without compromise. If the people lead, the leaders will follow.

Ron Paul, MD, is a former three-time Republican candidate for U.S. President and a former Congressman from Texas.

This article is reprinted with permission from the Ron Paul Institute for Peace and Prosperity.

Thanks to “Wiretapping” Laws, Your Cell Phone Is a Felony Machine – Article by Gary McGath


Gary McGath
******************************

The prosecutions are clearly meant to chill free speech

In 2006, police in Nashua, New Hampshire, filed charges against Michael Gannon for using a security system in his home. When he brought a security recording to the police to back up a complaint about how he was treated, they arrested him and charged him with “felony wiretapping” — recording what happened in his own house. They were later forced to drop the charges under intense publicity.

The relevant New Hampshire law is titled “Wiretapping and Eavesdropping,” but it isn’t restricted to electronic communications.

It’s a felony if someone “willfully intercepts, endeavors to intercept, or procures any other person to intercept or endeavor to intercept, any telecommunication or oral communication.”

Intercepting means “the aural or other acquisition of, or the recording of, the contents of any telecommunication or oral communication through the use of any electronic, mechanical, or other device.” Oral communication means “any verbal communication uttered by a person who has a reasonable expectation that the communication is not subject to interception, under circumstances justifying such expectation,” but the law doesn’t define “reasonable expectation.”

Recording what someone else says can be a felony unless it falls under the reasonable-expectation exception. Burglars don’t expect to be recorded. I live in the same city as Gannon; if thieves broke into my home and I recorded their activity, would I dare bring the evidence to the police?

The New Hampshire law is a “two-party consent” law; you can’t even record your own conversation with someone else without letting him or her know. Nine to twelve states, depending on interpretation, have two-party consent requirements.

In recent years activists have successfully pushed back against using those laws to prevent or punish recording police activity. Courts have held that when they’re on duty, cops don’t have a reasonable expectation of privacy. Governments can still use the law against people who record other public speech, though.

In 2015, in Portsmouth, New Hampshire, Christopher David was charged with felony wiretapping for recording a conversation on a public street. He recorded a private citizen telling him he could be prosecuted for running an Uber vehicle, which the city has banned. It’s easy to suspect the city is going after him for competing with the city’s taxis, but officially, his “crime” is recording words directed at him in public.

Illinois had a similarly draconian law often used to punish recording the police, which the state’s Supreme Court struck down. The court held:

The recording provision of the eavesdropping statute … burdens substantially more speech than is necessary to serve a legitimate state interest in protecting conversational privacy. Thus, it does not survive intermediate scrutiny. We hold that the recording provision is unconstitutional on its face because a substantial number of its applications violate the First Amendment.

Any legal prohibition ought to satisfy the question, “What harm to someone does it deter?” Recording a person who comes up to you in public and tells you something doesn’t injure him in any way. If he’s giving away information he doesn’t want known, that’s on his own head.

Eugene Volokh notes that without a clear definition of privacy, prohibitions ostensibly designed to protect it can seriously infringe on free speech. “Once restrictions on people’s speech are accepted in the name of ‘privacy,’ people will likely use them to argue for other restrictions on ‘privacy’ grounds, even when the matter involves a very different sort of ‘privacy.’” This is a serious matter, because “the right to information privacy — my right to control your communication of personally identifiable information about me — is a right to have the government stop you from speaking about me.”

Modern technology allows anyone to make video recordings in public, and if anyone’s voice is picked up without consent, the recording could be a crime punishable by years in jail. David Rittgers, an attorney and legal policy analyst at the Cato Institute, argues, “I think in this modern age where everyone has a ‘felony machine’ in their pocket — a cell phone — the [all-party] consent law is outdated.”

When the government surreptitiously captures records of our private communications, it tells us we shouldn’t worry if we have nothing to hide. When we record people speaking openly in public, quite a different standard applies.

Most of the debate about abusive wiretapping and eavesdropping laws has focused on their use to protect police officers caught misbehaving. The problem doesn’t stop there, though. When “reasonable expectation of privacy” isn’t clearly delimited, any recording of what people say in public can become an excuse to throw people in jail.

Gary McGath is a freelance software engineer living in Nashua, New Hampshire.

This article was published by The Foundation for Economic Education and may be freely distributed, subject to a Creative Commons Attribution 4.0 International License, which requires that credit be given to the author.

The Internet Memory Hole – Article by Wendy McElroy


The New Renaissance Hat
Wendy McElroy
November 24, 2014
******************************

Imagine you are considering a candidate as a caregiver for your child. Or maybe you are vetting an applicant for a sensitive position in your company. Perhaps you’re researching a public figure for class or endorsing him in some manner. Whatever the situation, you open your browser and assess the linked information that pops up from a search. Nothing criminal or otherwise objectionable is present, so you proceed with confidence. But what if the information required for you to make a reasoned assessment had been removed by the individual himself?

Under “the right to be forgotten,” a new “human right” established in the European Union in 2012, people can legally require a search engine to delete links to their names, even if information at the linked source is true and involves a public matter such as an arrest. The Google form for requesting removal asks the legally relevant question of why the link is “irrelevant, outdated, or otherwise objectionable.” Then it is up to the search engine to determine whether to delete the link.

The law’s purpose is to prevent people from being stigmatized for life. The effect, however, is to limit freedom of the press, freedom of speech, and access to information. Each person becomes a potential censor who can rewrite history for personal advantage.

It couldn’t happen here

The process of creating such a law in the United States is already underway. American law is increasingly driven by public opinion and polls. The IT security company Software Advice recently conducted a survey that found that “sixty-one percent of Americans believe some version of the right to be forgotten is necessary,” and “thirty-nine percent want a European-style blanket right to be forgotten, without restrictions.” And politicians love to give voters what they want.

In January 2015, California will enforce the Privacy Rights for California Minors in the Digital World law. This is the first state version of a “right to be forgotten” law. It requires “the operator of an Internet Web site, online service, online application, or mobile application to permit a minor, who is a registered user … to remove, or to request and obtain removal of, content or information posted … by the minor.” (There are some exceptions.)

Meanwhile, the consumer-rights group Consumer Watchdog has floated the idea that Google should voluntarily provide Americans with the right to be forgotten. On September 30, 2014, Forbes stated, “The fight for the right to be forgotten is certainly coming to the U.S., and sooner than you may think.” For one thing, there is a continuing hue and cry about embarrassing photos of minors and celebrities being circulated.

Who and what deserves to be forgotten?

What form would the laws likely take? In the Stanford Law Review (February 13, 2012), legal commentator Jeffrey Rosen presented three categories of information that would be vulnerable if the EU rules became a model. First, material posted could be “unlinked” at the poster’s request. Second, material copied by another site could “almost certainly” be unlinked at the original poster’s request unless its retention was deemed “necessary” to “the right of freedom of expression.” Rosen explained, “Essentially, this puts the burden on” the publisher to prove that the link “is a legitimate journalistic (or literary or artistic) exercise.” Third, the commentary of one individual about another, whether truthful or not, could be vulnerable. Rosen observed that the EU includes “takedown requests for truthful information posted by others.… I can demand takedown and the burden, once again, is on the third party to prove that it falls within the exception for journalistic, artistic, or literary exception.”

Search engines have an incentive to honor requests rather than to absorb the legal cost of fighting them. Rosen said, “The right to be forgotten could make Facebook and Google, for example, liable for up to two percent of their global income if they fail to remove photos that people post about themselves and later regret, even if the photos have been widely distributed already.” An October 12, 2014, article in the UK Daily Mail indicated the impact of compliance on the free flow of public information. The headline: “Google deletes 18,000 UK links under ‘right to be forgotten’ laws in just a month: 60% of Europe-wide requests come from fraudsters, criminals and sex offenders.”

American backlash

America protects the freedoms of speech and the press more vigorously than Europe does. Even California’s limited version of a “right to be forgotten” bill has elicited sharp criticism from civil libertarians and tech-freedom advocates. The IT site TechCrunch expressed the main practical objection: “The web is chaotic, viral, and interconnected. Either the law is completely toothless, or it sets in motion a very scary anti-information snowball.” TechCrunch also expressed the main political objection: The bill “appears to create a head-on collision between privacy law and the First Amendment.”

Conflict between untrue information and free speech need not occur. Peter Fleischer, Google’s global privacy counsel, explained, “Traditional law has mechanisms, like defamation and libel law, to allow a person to seek redress against someone who publishes untrue information about him.… The legal standards are long-standing and fairly clear.” Defamation and libel are controversial issues within the libertarian community, but the point here is that defense against untrue information already exists.

What of true information? Truth is a defense against being charged with defamation or libel. America tends to value freedom of expression above privacy rights. It is no coincidence that the First Amendment is first among the rights protected by the Constitution. And any “right” to delete the truth from the public sphere runs counter to the American tradition of an open public square where information is debated and weighed.

Moreover, even true information can have powerful privacy protection. For example, the Fourth Amendment prohibits the use of data that is collected via unwarranted search and seizure. The Fourteenth Amendment is deemed by the Supreme Court to offer a general protection to family information. And then there are the “protections” of patents, trade secrets, copyrighted literature, and a wide range of products that originate in the mind. Intellectual property is controversial, too. But again, the point here is that defenses already exist.

Reputation capital

Reputation capital consists of the good or bad opinions that a community holds of an individual over time. It is not always accurate, but it is what people think. The opinion is often based on past behaviors, which are sometimes viewed as an indicator of future behavior. In business endeavors, reputation capital is so valuable that aspiring employees will work for free as interns in order to accrue experience and recommendations. Businesses will take a loss to replace an item or to otherwise credit a customer in order to establish a name for fairness. Reputation is thus a path to being hired and to attracting more business. It is a nonfinancial reward for establishing the reliability and good character upon which financial remuneration often rests.

Conversely, if an employee’s bad acts are publicized, then a red flag goes up for future employers who might consider his application. If a company defrauds customers, community gossip could drive it out of business. In the case of negative reputation capital, the person or business who considers dealing with the “reputation deficient” individual is the one who benefits by realizing a risk is involved. Services, such as eBay, often build this benefit into their structure by having buyers or sellers rate individuals. By one estimate, a 1 percent negative rating can reduce the price of an eBay good by 4 percent. This system establishes a strong incentive to build positive reputation capital.

Reputation capital is particularly important because it is one of the key answers to the question, “Without government interference, how do you ensure the quality of goods and services?” In a highly competitive marketplace, reputation becomes a path to success or to failure.

Right-to-be-forgotten laws offer a second chance to an individual who has made a mistake. This is a humane option that many people may choose to extend, especially if the individual will work for less money or offer some other advantage in order to win back his reputation capital. But the association should be a choice. The humane nature of a second chance should not overwhelm the need of others for public information to assess the risks involved in dealing with someone. Indeed, this risk assessment provides the very basis of the burgeoning sharing economy.

History and culture are memory

In “The Right to Be Forgotten: An Insult to Latin American History,” Eduardo Bertoni offers a potent argument. He writes that the law’s “name itself” is “an affront to Latin America; rather than promoting this type of erasure, we have spent the past few decades in search of the truth regarding what occurred during the dark years of the military dictatorships.” History is little more than preserved memory. Arguably, culture itself lives or dies depending on what is remembered and shared.

And yet, because the right to be forgotten has the politically seductive ring of fairness, it is becoming a popular view. Fleischer called privacy “the new black in censorship fashion.” And it may be increasingly on display in America.

Wendy McElroy (wendy@wendymcelroy.com) is an author, editor of ifeminists.com, and Research Fellow at The Independent Institute (independent.org).

This article was originally published by The Foundation for Economic Education.