Monthly Archives: May 2013

Mitochondrially Targeted Antioxidant SS-31 Reverses Some Measures of Aging in Muscle – Article by Reason

Categories: Science, Technology, Transhumanism

The New Renaissance Hat
Reason
May 26, 2013
******************************

Originally published on the Fight Aging! website.

Antioxidants of the sort you can buy at the store and consume are pretty much useless: the evidence shows us that they do nothing for health, and may even work to block some beneficial mechanisms. Targeting antioxidant compounds to the mitochondria in our cells is a whole different story, however. Mitochondria are swarming bacteria-like entities that produce the chemical energy stores used to power cellular processes. This involves chemical reactions that necessarily generate reactive oxygen species (ROS) as a byproduct, and these tend to react with and damage protein machinery in the cell. The machinery that gets damaged the most is that inside the mitochondria, of course, right at ground zero for ROS production. There are some natural antioxidants present in mitochondria, but adding more appears to make a substantial difference to the proportion of ROS that are soaked up versus let loose to cause harm.

If mitochondria were only trivially relevant to health and longevity, this wouldn’t be a terribly interesting topic, and I wouldn’t be talking about it. The evidence strongly favors mitochondrial damage as an important contribution to degenerative aging, however. Most damage in cells is repaired pretty quickly, and mitochondria are regularly destroyed and replaced by a process of division – again, like bacteria. Some rare forms of mitochondrial damage persist, however, eluding quality-control mechanisms and spreading through the mitochondrial population in a cell. This causes cells to fall into a malfunctioning state in which they export massive quantities of ROS out into surrounding tissue and the body at large. As you age, ever more of your cells suffer this fate.

In recent years a number of research groups have been working on ways to deliver antioxidants to the mitochondria, some of which are more relevant to future therapies than others. For example, gene therapies to boost levels of natural mitochondrial antioxidants like catalase are unlikely to arrive in the clinic any time soon, but they serve to demonstrate significance by extending healthy life in mice. A Russian research group has been working with plastoquinone compounds that can be ingested and then localize to the mitochondria, and animal studies of its SkQ series of drug candidates have demonstrated numerous benefits.

US-based researchers have been working on a different set of mitochondrially targeted antioxidant compounds, with a focus on burn treatment. However, they recently published a paper claiming reversal of some age-related changes in muscle tissue in mice using their drug candidate SS-31. Note that this is injected, unlike SkQ compounds:

Mitochondrial targeted peptide rapidly improves mitochondrial energetics and skeletal muscle performance in aged mice

Quote:

Mitochondrial dysfunction plays a key pathogenic role in aging skeletal muscle resulting in significant healthcare costs in the developed world. However, there is no pharmacologic treatment to rapidly reverse mitochondrial deficits in the elderly. Here we demonstrate that a single treatment with the mitochondrial targeted peptide SS-31 restores in vivo mitochondrial energetics to young levels in aged mice after only one hour.

Young (5 month old) and old (27 month old) mice were injected intraperitoneally with either saline or 3 mg/kg of SS-31. Skeletal muscle mitochondrial energetics were measured in vivo one hour after injection using a unique combination of optical and 31P magnetic resonance spectroscopy. Age-related declines in resting and maximal mitochondrial ATP production, coupling of oxidative phosphorylation (P/O), and cell energy state (PCr/ATP) were rapidly reversed after SS-31 treatment, while SS-31 had no observable effect on young muscle.

These effects of SS-31 on mitochondrial energetics in aged muscle were also associated with a more reduced glutathione redox status and lower mitochondrial [ROS] emission. Skeletal muscle of aged mice was more fatigue resistant in situ one hour after SS-31 treatment and eight days of SS-31 treatment led to increased whole animal endurance capacity. These data demonstrate that SS-31 represents a new strategy for reversing age-related deficits in skeletal muscle with potential for translation into human use.

So what is SS-31? If you look at the publication history for these authors, you’ll find a burn-treatment-focused open-access paper that goes into a little more detail and a 2008 review paper that covers the pharmacology of the SS compounds:

Quote:

The SS peptides, so called because they were designed by Hazel H. Szeto and Peter W. Schiller, are small cell-permeable peptides of less than ten amino acid residues that specifically target the inner mitochondrial membrane and possess mitoprotective properties. There have been a series of SS peptides synthesized and characterized, but for our study, we decided to use the SS-31 peptide (H-D-Arg-Dimethyl Tyr-Lys-Phe-NH2) for its well-documented efficacy.

Studies with isolated mitochondrial preparations and cell cultures show that these SS peptides can scavenge ROS, reduce mitochondrial ROS production, and inhibit mitochondrial permeability transition. They are very potent in preventing apoptosis and necrosis induced by oxidative stress or inhibition of the mitochondrial electron transport chain. These peptides have demonstrated excellent efficacy in animal models of ischemia-reperfusion, neurodegeneration, and renal fibrosis, and they are remarkably free of toxicity.

Given the existence of a range of different types of mitochondrial antioxidant and research groups working on them, it seems that we should expect to see therapies emerge into the clinic over the next decade. As ever, the regulatory regime will ensure that they are only approved for use in treatment of specific named diseases and injuries such as burns, however. It’s still impossible to obtain approval for a therapy to treat aging in otherwise healthy individuals in the US, as the FDA doesn’t recognize degenerative aging as a disease. The greatest use of these compounds will therefore occur via medical tourism and in a growing black market for easily synthesized compounds of this sort.

In fact, any dedicated and sufficiently knowledgeable individual could already set up a home chemistry lab, download the relevant papers, and synthesize SkQ or SS compounds. That we don’t see this happening is, I think, more of a measure of the present immaturity of the global medical tourism market than anything else. It lacks an ecosystem of marketplaces and review organizations that would allow chemists to safely participate in and profit from regulatory arbitrage of the sort that is ubiquitous in recreational chemistry.

Reason is the founder of The Longevity Meme (now Fight Aging!). He saw the need for The Longevity Meme in late 2000, after spending a number of years searching for the most useful contribution he could make to the future of healthy life extension. When not advancing the Longevity Meme or Fight Aging!, Reason works as a technologist in a variety of industries.  

This work is reproduced here in accord with a Creative Commons Attribution license.  It was originally published on FightAging.org.

Squishy Machines: Bio-Cybernetic Neuron Hybrids – Article by Franco Cortese

Categories: Science, Transhumanism

The New Renaissance Hat
Franco Cortese
May 25, 2013
******************************
This essay is the eighth chapter in Franco Cortese’s forthcoming e-book, I Shall Not Go Quietly Into That Good Night!: My Quest to Cure Death, published by the Center for Transhumanity. The first seven chapters were previously published on The Rational Argumentator under the following titles:
– Chapter 1: The Moral Imperative and Technical Feasibility of Defeating Death
– Chapter 2: Immortality: Material or Ethereal? Nanotech Does Both!
– Chapter 3: Concepts for Functional Replication of Biological Neurons
– Chapter 4: Gradual Neuron Replacement for the Preservation of Subjective-Continuity
– Chapter 5: Wireless Synapses, Artificial Plasticity, and Neuromodulation
– Chapter 6: Mind as Interference with Itself: A New Approach to Immediate Subjective-Continuity
– Chapter 7: Neuronal “Scanning” and NRU Integration
***

By 2009 I felt that the major classes of physicalist-functionalist replication approaches were largely developed, with further work producing only minor variations in approach and procedure. These developments consisted of contingency plans for the case that some aspect of neuronal operation couldn’t be replicated with alternate, non-biological physical systems and processes; they centered on the goal of maintaining those biological (or otherwise organic) systems and processes artificially and of integrating them with the processes that could be reproduced artificially.

2009 also saw further developments in the computational approach, where I conceptualized a new sub-division in the larger class of the informational-functionalist (i.e., computational, which encompasses both simulation and emulation) replication approach, which is detailed in the next chapter.

Developments in the Physicalist Approach

During this time I explored mainly varieties of the cybernetic-physical functionalist approach. This involved the use of replicatory units that preserve certain biological aspects of the neuron while replacing certain others with functionalist replacements, and other NRUs that preserved alternate biological aspects of the neuron while replacing different aspects with functional replacements. The reasoning behind this approach was twofold. The first was that there was a chance, no matter how small, that we might fail to sufficiently replicate some relevant aspect(s) of the neuron either computationally or physically by failing to understand the underlying principles of that particular sub-process/aspect. The second was to have an approach that would work in the event that there was some material aspect that couldn’t be sufficiently replicated via non-biological physically embodied systems (i.e., the normative physical-functionalist approach).

However, these varieties were conceived of in case we couldn’t replicate certain components successfully (i.e., without functional divergence). The chances of preserving subjective-continuity in such circumstances are increased by the number of varieties we have for this class of model (i.e., different arrangements of mechanical replacement components and biological components), because we don’t know which we would fail to functionally replicate.

This class of physical-functionalist model can be usefully considered as electromechanical-biological hybrids, wherein the receptors (i.e., transporter proteins) on the post-synaptic membrane are integrated with the artificial membrane and coexist with artificial ion-channels, or wherein the biological membrane is retained while the receptors and ion-channels are replaced with functional equivalents instead. The biological components would be extracted from the existing biological neurons and reintegrated with the artificial membrane. Otherwise they would have to be synthesized via electromechanical systems, such as, but not limited to, the use of chemical stores of amino acids released in specific sequences to facilitate in vivo protein folding and synthesis, which would then be transported to and integrated with the artificial membrane. This is preferable to providing stores of pre-synthesized proteins, owing to the greater complexity of storing synthesized proteins without decay or functional degradation over storage time, and of restoring them from their “stored”, inactive state to a functionally active state when ready for use.

During this time I also explored the possibility of using the neuron’s existing protein-synthesis systems to facilitate the construction and gradual integration of the artificial sections with the existing lipid-bilayer membrane. Work in synthetic biology allows us to use viral gene vectors to replace a given cell’s constituent genome, thereby allowing us to make it manufacture various non-organic substances in place of the substances created via its normative protein synthesis. We could use such techniques to replace the existing protein-synthesis instructions with ones that manufacture and integrate the molecular materials constituting the artificial membrane sections and artificial ion-channels and ion-pumps. Indeed, it may even be a functional necessity to gradually replace a given neuron’s protein-synthesis machinery with machinery dedicated to the replacement, integration, and maintenance of the non-biological sections’ material, because otherwise those parts of the neuron would still be trying to rebuild each section of lipid-bilayer membrane we iteratively remove and replace. This could be problematic, and so for successful gradual replacement of single neurons, a means of gradually switching off and/or replacing portions of the cell’s protein-synthesis systems may be required.

Franco Cortese is an editor for Transhumanity.net, as well as one of its most frequent contributors.  He has also published articles and essays on Immortal Life and The Rational Argumentator. He contributed 4 essays and 7 debate responses to the digital anthology Human Destiny is to Eliminate Death: Essays, Rants and Arguments About Immortality.

Franco is an Advisor for Lifeboat Foundation (on its Futurists Board and its Life Extension Board) and contributes regularly to its blog.

The Extraordinary Business of Life – Article by Sanford Ikeda

Categories: Business, Economics, History

The New Renaissance Hat
Sanford Ikeda
May 25, 2013
******************************

I heard it again from this year’s commencement speaker: the common mistake of thinking economics is just about business and making money. I know I’m not the only economics teacher who every year has to disabuse his students (and many of his own colleagues from other disciplines) of that same error.

Economics is not business administration or accounting. Economics is a science that studies how people interact when the means at their disposal are scarce in relation to their ends. That includes business, of course, but a whole lot more as well.

Where Does That Notion Come From?

Well, for starters, perhaps from one of the greatest economists in history, Alfred Marshall. He opens his highly influential textbook, first published in 1890, with this statement:

“Political Economy or Economics is a study of mankind in the ordinary business of life; it examines that part of individual and social action which is most closely connected with the attainment and with the use of the material requisites of wellbeing.” (Emphasis added)

This definition more or less prevailed until 1932, when another British economist, Lionel Robbins, defined economic science as being concerned with an aspect of all human action insofar as it involves making choices, not with a part of individual action. Economics, in other words, is the science of choice. Its starting point is not the “material requisites of wellbeing” but a person’s subjective valuation of her circumstances. Ludwig von Mises got it, which is why he called his magnum opus, simply, Human Action.

Similarly, Libertarianism Isn’t Pro-Business

An equally common mistake is to think that supporters of the free market are “pro-business” and favor so-called crony capitalism. But a consistent free-market supporter is neither pro-business nor anti-business, pro-labor nor anti-labor. A free market to us is what happens when you safeguard private property, free association, and consistent governance and then just leave people alone.

Part of the misunderstanding here might stem from the term “free market” itself. Since people tend to associate markets with buying and selling, jobs, and making (and losing) money, it’s perhaps understandable that they would think that advocates of the free market must be concerned mainly about business-related stuff: profits and losses, efficiency, and creating and marketing new products.

Indeed, I’ve met quite a few who claim to favor “free-market capitalism” merely because they believe in making as much money as possible in their lifetimes. It’s not surprising that many of these folks do tend to be pro-business and supporters of crony capitalism. I want to ask them not to be on my side.

Connotations aside, the free market encompasses far more than the stuff of business or a money-making scheme. Yes, it does include the essentials of private property, free association, and stable governance. But a dynamic market process that generates widespread material prosperity and promotes the pursuit of happiness would not be possible if it were based solely on the relentless pursuit of one’s narrow self-interest. Markets would not have gotten as far as they have today (with per-capita GDP up more than fiftyfold since 1700) if people didn’t also follow norms of honesty and fair play, trust and reciprocity. Such norms are without question partly the result of self-interest; few would trade with us if we weren’t honest and fair. But, as Adam Smith taught us, these norms also arise in large measure from a sense of sympathy, of fellow-feeling and fairness, that comes from our ability to see others as we see ourselves, and vice versa. This is why in most contexts I usually prefer the term “free society” to “free market.”

Bourgeois Virtue

But I think one good reason the association between business on the one hand and economics and classical liberalism on the other has been so persistent is that business and the free society arose together. That is, the liberal idea—that certain fundamental individual rights exist prior to and apart from the State—sparked one of the most momentous social changes in history: the commercial revolution and the emergence of the modern urban middle class. 

The triumph of liberty, of personal freedom, unleashed the creative potential of people, who found expression in art, religion, literature, but most of all—or at least most visibly—in the Marshallian “ordinary business of life.” The changes that have taken place in the past 500 years—scientific revolutions, religious reformations, political upheavals, artistic rebirths—were driven by the same human propensities as the commercial revolution and fueled by the wealth it produced. Indeed, the social and political changes of the past century—for women, workers, and minorities—would not have been possible without the entrepreneurial pressures of competition and innovation that forced radical changes in conventional thinking and socially conservative attitudes.

Tradition’s Worst Enemy

In short, business is the most dynamic social institution known to mankind. The critical and competitive attitudes that enable business to flourish erode custom and break old ties even as they foster new ones. The products of business tend to offend people whose sensibilities were refined by generations of tradition. The free market is tradition’s worst enemy.

Business has become part of the default mode of modern society. We take it for granted. We don’t realize what a radical, subversive force it is, to the point where it sounds strange to say so. But try to imagine a world without businesses and commerce. A world like the Dark Ages of, say, ninth-century Western Europe: static, grindingly poor, strictly hierarchical, socially intolerant, and, apart from the occasional battle or beheading, boring like you wouldn’t believe.

So, while it’s still a mistake to think economics and classical liberalism are somehow about studying and promoting business, maybe at a deeper level it’s not such a bad one to make after all. Business is subversive.

Sanford Ikeda is an associate professor of economics at Purchase College, SUNY, and the author of The Dynamics of the Mixed Economy: Toward a Theory of Interventionism.
***
This article was originally published by The Foundation for Economic Education.

Oklahoma: The Economic Storm – Article by David J. Hebert

Categories: Economics, Politics

The New Renaissance Hat
David J. Hebert
May 25, 2013
******************************

A tornado ravaged Oklahoma last week, destroying hundreds of homes, killing dozens, and injuring hundreds more. Unfortunately, it looks like the citizens of Oklahoma are about to be ravaged by another storm brought on by the Oklahoma Attorney General, Scott Pruitt.

According to ABC News, Mr. Pruitt and his staff began “aggressively combing the region for fraud just hours after the tornado … and immediately [found] businesses violating the law.”

What laws were the businesses accused of violating? Anti-price-gouging laws. Using powers granted by the Emergency Price Stabilization Act, Mr. Pruitt is hoping to help the people of Oklahoma by preventing businesses from profiting off of the suffering of the townspeople, many of whom just lost their homes. He goes so far as to say, “[the townspeople] never anticipate or expect that someone would take advantage of them right now, but this situation is what criminals prey upon.”

While Mr. Pruitt no doubt intends to help the local citizens, his misunderstanding of the workings of the price mechanism will lead only to folly and the prolonged suffering of the very people that he is trying to help. What he is effectively arguing for is a price ceiling on basic commodities, such as water (which is reportedly being sold for $40 per case today as opposed to only $3-$4 just a few weeks ago).

This has very predictable results: a shortage.

When prices are held below their market value, the effect is that there will be a large number of people who are willing to purchase water at that price but very few sellers willing to sell the water at that price. This means that people will compete on non-price margins to acquire water, that is, they will queue, sometimes for hours on end. The time that they spend waiting in line, however, is a deadweight cost as it is value that is forgone but is not captured by anyone. So now, instead of contributing towards the reconstruction of the town, the people are stuck waiting in line for water.

The beauty of the price mechanism is what it accomplishes in situations like this, assuming, of course, officials allow it to function properly. In this situation, demand in Oklahoma rises and producers, seeing an opportunity to profit, reroute trucks/planes to Oklahoma, thus increasing the quantity of water supplied in the area that needs it most.
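The mechanism described above can be made concrete with a toy model. Every number and curve below is hypothetical, chosen only to illustrate the logic, and is not drawn from the Oklahoma case: demand for water spikes after the disaster, supply rises with price, and a cap near the pre-disaster price leaves buyers without sellers.

```python
# Toy linear supply/demand model of bottled water after a disaster.
# All curves and numbers are illustrative assumptions, not data.

def quantity_demanded(price):
    # Post-disaster demand: high at low prices, falling as price rises.
    return max(0, 1000 - 20 * price)

def quantity_supplied(price):
    # Higher prices attract rerouted shipments into the area.
    return max(0, 30 * price)

def market_outcome(price_cap=None):
    # Find the (integer) market-clearing price by scanning candidates.
    clearing = min(range(1, 101),
                   key=lambda p: abs(quantity_demanded(p) - quantity_supplied(p)))
    price = clearing if price_cap is None else min(clearing, price_cap)
    shortage = max(0, quantity_demanded(price) - quantity_supplied(price))
    return price, shortage

print(market_outcome())             # (20, 0): the price rises and the market clears
print(market_outcome(price_cap=4))  # (4, 800): capped near $4, demand outstrips supply
```

With no cap, the price settles where quantity demanded equals quantity supplied; with the cap, the same demand meets far less supply, and the 800-case gap is rationed by queueing instead of price.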

Absent the rise in price, we would have to rely on the benevolence of these companies to help the people in need (and assume that they knew what the people of Oklahoma wanted to begin with).

This isn’t in and of itself terrible. Obviously companies DO send extra water to places that experience disasters, and the Red Cross DOES send volunteers and such. But notice that nothing in the preceding analysis precludes this benevolence. Why rely merely on benevolence when we can also rely on self-interest? If the goal is to help people get clean drinking water, it stands to reason that we ought to incentivize producers as many ways as possible, be they other-regarding, self-regarding or both.

David Hebert is a Ph.D. Fellow at the Mercatus Center at George Mason University.

This article was originally published by The Foundation for Economic Education.

Neuronal “Scanning” and NRU Integration – Article by Franco Cortese

Categories: Science, Transhumanism

The New Renaissance Hat
Franco Cortese
May 23, 2013
******************************
This essay is the seventh chapter in Franco Cortese’s forthcoming e-book, I Shall Not Go Quietly Into That Good Night!: My Quest to Cure Death, published by the Center for Transhumanity. The first six chapters were previously published on The Rational Argumentator under the following titles:
– Chapter 1: The Moral Imperative and Technical Feasibility of Defeating Death
– Chapter 2: Immortality: Material or Ethereal? Nanotech Does Both!
– Chapter 3: Concepts for Functional Replication of Biological Neurons
– Chapter 4: Gradual Neuron Replacement for the Preservation of Subjective-Continuity
– Chapter 5: Wireless Synapses, Artificial Plasticity, and Neuromodulation
– Chapter 6: Mind as Interference with Itself: A New Approach to Immediate Subjective-Continuity
***

I was planning on using the NEMS already conceptually developed by Robert Freitas for nanosurgery applications (to be supplemented by the use of MEMS if the technological infrastructure was unavailable at the time) to take in vivo recordings of the salient neural metrics and properties needing to be replicated. One novel approach was to design the units with elongated, worm-like bodies, disposing the computational and electromechanical apparatus within the elongated body of the unit. This sacrifices width for length so as to allow the units to fit inside the extra-cellular space between neurons and glial cells as a postulated solution to a lack of sufficient miniaturization. Moreover, if a unit is too large to be used in this way, extending its length by the same proportion would allow it to then operate in the extracellular space, provided that its means of data-measurement itself weren’t so large as to fail to fit inside the extracellular space (the span of ECF between two adjacent neurons for much of the brain is around 200 Angstroms).

I was planning on using the chemical and electrical sensing methodologies already in development for nanosurgery as the technological and methodological infrastructure for the neuronal data-measurement methodology. However, I also explored my own conceptual approaches to data-measurement. This consisted of detecting variation of morphological features in particular, as the schemes for electrical and chemical sensing already extant seemed either sufficiently developed or to be receiving sufficient developmental support and/or funding. One was the use of laser-scanning or more generally radiography (i.e., sonar) to measure and record morphological data. Another was a device that uses a 2D array of depressible members (e.g., solid members attached to a spring or ratchet assembly, which is operatively connected to a means of detecting how much each individual member is depressed—such as but not limited to piezoelectric crystals that produce electricity in response and proportion to applied mechanical strain). The device would be run along the neuronal membrane and the topology of the membrane would be subsequently recorded by the pattern of depression recordings, which are then integrated to provide a topographic map of the neuron (e.g., relative location of integral membrane components to determine morphology—and magnitude of depression to determine emergent topology). This approach could also potentially be used to identify the integral membrane proteins, rather than using electrical or chemical sensing techniques, if the topologies of the respective proteins are sufficiently different as to be detectable by the unit (determined by its degree of precision, which typically is a function of its degree of miniaturization).
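The depressible-member scheme can be sketched minimally: given a 2D array of depression readings (one per member, in arbitrary units, all values hypothetical rather than taken from any real device), the relative topography of the membrane patch is each reading measured against the shallowest one.

```python
# Hypothetical sketch of integrating per-member depression readings into a
# relative topographic map of a membrane patch. Assumes each member's
# reading (arbitrary units) is proportional to how far it was depressed.

def topographic_map(readings):
    """readings: 2D list of depression depths, one entry per member.
    Returns relative surface heights above the patch's lowest point."""
    baseline = min(min(row) for row in readings)
    return [[depth - baseline for depth in row] for row in readings]

# One pass over a small patch with a raised feature near the centre:
patch = [
    [1, 1, 1],
    [1, 4, 1],
    [1, 1, 2],
]
print(topographic_map(patch))  # [[0, 0, 0], [0, 3, 0], [0, 0, 1]]
```

Identifying integral membrane proteins by shape would then amount to matching sub-grids of this map against known protein topologies, subject to the unit’s precision.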

The constructional and data-measurement units would also rely on the technological and methodological infrastructure for organization and locomotion that would be used in normative nanosurgery. I conceptually explored such techniques as the use of a propeller, the use of pressure-based methods (i.e., a stream of water acting as jet exhaust would in a rocket), the use of artificial cilia, and the use of tracks that the unit attaches to so as to be moved electromechanically, which decreases computational intensiveness – a measure of required computation per unit time – rather than having a unit compute its relative location so as to perform obstacle-avoidance and not, say, damage in-place biological neurons. Obstacle-avoidance and related concerns are instead negated through the use of tracks that limit the unit’s degrees of freedom—thus preventing it from having to incorporate computational techniques of obstacle-avoidance (and their entailed sensing apparatus). This also decreases the necessary precision (and thus, presumably, the required degree of miniaturization) of the means of locomotion, which would need to be much greater if the unit were to perform real-time obstacle avoidance. Such tracks would be constructed in iterative fashion. The constructional system would analyze the space in front of it to determine if the space was occupied by a neuron terminal or soma, and extrude the tracks iteratively (e.g., add a segment in spaces where it detects the absence of biological material). It would then move along the newly extruded track, progressively extending it through the spaces between neurons as it moves forward.
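The probe-extrude-advance loop described above can be outlined in code. Everything here (the 1-D grid of cells, the occupancy test, halting rather than routing around obstacles) is an illustrative assumption standing in for the real three-dimensional problem:

```python
# Illustrative sketch of the iterative track-extrusion loop: probe the space
# ahead, extrude a segment only where no biological material is detected,
# then advance along the newly laid segment. A 1-D grid of cells stands in
# for the extracellular space; cells mapped to True hold a terminal or soma.

def extrude_track(occupied, start, end):
    """Lay track cell-by-cell from start to end, halting at the first
    occupied cell (a real unit would instead route around it)."""
    track = []
    position = start
    while position <= end:
        if occupied.get(position, False):
            break  # biological material detected ahead: stop extruding
        track.append(position)  # extrude one segment...
        position += 1           # ...then move forward onto it
    return track

space = {3: True}  # a soma occupies cell 3
print(extrude_track(space, 0, 5))  # [0, 1, 2]
```

The point of the sketch is the division of labour: the track constrains the unit’s degrees of freedom, so the unit itself never needs obstacle-avoidance computation or its sensing apparatus.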

Non-Distortional in vivo Brain “Scanning”

A novel avenue of enquiry that occurred during this period involves counteracting or taking into account the distortions caused by the data-measurement units on the elements or properties they are measuring, and subsequently applying corrections to the recorded data. A unit changes the local environment that it is supposed to be measuring and recording, which becomes problematic. My solution was to test which operations performed by the units have the potential to distort relevant attributes of the neuron or its environment, and to build units that compensate for those distortions either physically or computationally.

If we reduce how a recording unit’s operation distorts neuronal behavior into a list of mathematical rules, we can take the recordings and apply mathematical techniques to eliminate or “cancel out” those distortions post-measurement, thus arriving at what would have been the correct data. This approach would work only if the distortions are affecting the recorded data (i.e., changing it in predictable ways), and not if they are affecting the unit’s ability to actually access, measure, or resolve such data.
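As a minimal illustration of this “cancel out” step, suppose (purely hypothetically) that the unit’s distortion of a membrane-potential reading reduces to a known linear rule; the true value is then recovered by applying the inverse rule to the recording. The gain and offset values below are invented for the example.

```python
# If the unit's distortion is reducible to a known, invertible rule (here a
# hypothetical linear one: recorded = GAIN * true + OFFSET), the true value
# can be recovered post-measurement by inverting that rule.

GAIN = 0.9     # assumed attenuation introduced by the unit's presence
OFFSET = -2.0  # assumed constant bias (mV) from local ionic disturbance

def distort(true_value):
    # What the unit actually records in this toy model.
    return GAIN * true_value + OFFSET

def cancel_distortion(recorded_value):
    # Post-measurement correction: invert the distortion rule.
    return (recorded_value - OFFSET) / GAIN

true_potential = -70.0  # typical resting membrane potential, mV
recorded = distort(true_potential)            # -65.0 in this toy model
print(round(cancel_distortion(recorded), 6))  # -70.0
```

This is exactly the case the paragraph above flags as tractable: the distortion changes the data predictably without blocking the unit’s ability to access or resolve it.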

The second approach applies the method underlying the first approach to the physical environment of the neuron. A unit senses and records the constituents of the area of space immediately adjacent to its edges and mathematically models that “layer”; i.e., if it is meant to detect ionic solutions (in the case of ECF or ICF), then it would measure their concentration and subsequently model ionic diffusion for that layer. It then moves forward, encountering another adjacent “layer” and integrating it with its extant model. By being able to sense iteratively what is immediately adjacent to it, it can model the space it occupies as it travels through that space. It then uses electric or chemical stores to manipulate the electrical and chemical properties of the environment immediately adjacent to its surface, so as to produce the emergent effects of that model (i.e., the properties of the edges of that model and how such properties causally affect/impact adjacent sections of the environment), thus producing the emergent effects that would have been present if the NRU-construction/integration system or data-measuring system hadn’t occupied that space.

The third postulated solution was the use of a grid comprised of a series of hollow recesses placed in front of the sensing/measuring apparatus. The grid is impressed upon the surface of the membrane. Each compartment isolates a given section of the neuronal membrane from the rest. The constituents of each compartment are measured and recorded, most probably via uptake of its constituents and transport to a suitable measuring apparatus. A simple indexing system can keep track of which constituents came from which grid (and thus which region of the membrane they came from). The unit has a chemical store operatively connected to the means of locomotion used to transport the isolated membrane-constituents to the measuring/sensing apparatus. After a given compartment’s constituents are measured and recorded, the system then marks its constituents (determined by measurement and already stored as recordings by this point of the process), takes an equivalent molecule or compound from a chemical inventory, and replaces the substance it removed for measurement with the equivalent substance from its chemical inventory. Once this is accomplished for a given section of membrane, the grid then moves forward, farther into the membrane, leaving the replacement molecules/compounds from the biochemical inventory in the same respective spots as their original counterparts. It does this iteratively, making its way through a neuron and out the other side. This approach is the most speculative, and thus the least likely to be used. It would likely require the use of NEMS, rather than MEMS, as a necessary technological infrastructure, if the approach were to avoid becoming economically prohibitive, because in order for the compartment-constituents to be replaceable after measurement via chemical store, they need to be simple molecules and compounds rather than sections of emergent protein or tissue, which are comparatively harder to artificially synthesize and store in working order.

***

In the next chapter I describe the work done throughout late 2009 on biological/non-biological NRU hybrids, and in early 2010 on one of two new approaches to retaining subjective-continuity through a gradual replacement procedure, both of which are unrelated to concerns of graduality or sufficient functional equivalence between the biological original and the artificial replication-unit.

Franco Cortese is an editor for Transhumanity.net, as well as one of its most frequent contributors.  He has also published articles and essays on Immortal Life and The Rational Argumentator. He contributed 4 essays and 7 debate responses to the digital anthology Human Destiny is to Eliminate Death: Essays, Rants and Arguments About Immortality.

Franco is an Advisor for Lifeboat Foundation (on its Futurists Board and its Life Extension Board) and contributes regularly to its blog.


How Can I Live Forever?: What Does and Does Not Preserve the Self – Video by G. Stolyarov II


Categories: Philosophy, Transhumanism

When we seek indefinite life, what is it that we are fundamentally seeking to preserve? Mr. Stolyarov discusses what is necessary for the preservation of “I-ness” – an individual’s direct vantage point: the thoughts and sensations of a person as that person experiences them directly.

Once you are finished with this video, you can take a quiz and earn the “I-ness” Awareness Open Badge.

Reference

– “How Can I Live Forever?: What Does and Does Not Preserve the Self” – Essay by G. Stolyarov II


Mind as Interference with Itself: A New Approach to Immediate Subjective-Continuity – Article by Franco Cortese


Categories: Philosophy, Science, Transhumanism

The New Renaissance Hat
Franco Cortese
May 21, 2013
******************************
This essay is the sixth chapter in Franco Cortese’s forthcoming e-book, I Shall Not Go Quietly Into That Good Night!: My Quest to Cure Death, published by the Center for Transhumanity. The first five chapters were previously published on The Rational Argumentator as “The Moral Imperative and Technical Feasibility of Defeating Death”, “Immortality: Material or Ethereal? Nanotech Does Both!”, “Concepts for Functional Replication of Biological Neurons”, “Gradual Neuron Replacement for the Preservation of Subjective-Continuity”, and “Wireless Synapses, Artificial Plasticity, and Neuromodulation”.
***
Electromagnetic Theory of Mind
***

One line of thought I explored during this period of my conceptual work on life extension concerned whether it is not the material constituents of the brain, but rather the emergent electric or electromagnetic fields generated by the concerted operation of those constituents, that instantiate mind. This work sprang from reading literature on Karl Pribram’s holonomic brain theory, in which he developed a “holographic” theory of brain function. A hologram can be cut in half, and, if illuminated, each piece will still retain the whole image, albeit at a loss of resolution. This is due to informational redundancy in the recording procedure (it records phase as well as amplitude, as opposed to just amplitude in normal photography). Pribram’s theory sought to explain the results of experiments in which patients who had up to half the brain removed nonetheless retained levels of memory and intelligence comparable to what they possessed prior to the procedure, and the similar results of experiments in which the brain was sectioned and the relative organization of those sections rearranged without the drastic loss in memory or functionality one would anticipate. These experiments appear to show a holonomic principle at work in the brain. I immediately saw the relation to gradual uploading, particularly the brain’s ability to take over the function of parts recently damaged or destroyed beyond repair. I also saw the emergent electric fields produced by the brain as much better candidates for exhibiting the material properties needed for such holonomic attributes. For one, electromagnetic fields (considered as waves rather than particles) are continuous, rather than modular and discrete as in the case of atoms.

The electric-field theory of mind also seemed to provide a hypothetical explanatory model for the existence of subjective-continuity through gradual replacement. (Recall that the existence and successful implementation of subjective-continuity is validated by our subjective sense of continuity through the normative metabolic replacement of the molecular constituents of our biological neurons, a.k.a. molecular turnover.) If the emergent electric or electromagnetic fields of the brain are indeed holonomic (i.e., possess the attribute of holographic redundancy), then they provide a potential explanatory model for why the loss of a constituent module (a neuron, neuron cluster, neural network, etc.) fails to cause subjective discontinuity: subjective-continuity is retained because the loss of a constituent part doesn’t negate the emergent information (the big picture), but only eliminates a fraction of its original resolution. This looked like empirical support for the claim that it is the electric fields, rather than the material constituents of the brain, that facilitate subjective-continuity.

Another, more speculative aspect of this theory (i.e., one not supported by empirical research or literature) involved the hypothesis that increased interaction among electric fields in the brain (i.e., interference via wave superposition, the result of which is determined by both phase and amplitude) might itself provide a physical basis for the holographic/holonomic property of “informational redundancy”, if it were found that electric fields do not already possess the holographic-redundancy attributes mentioned above.

A local electromagnetic field is produced by the electrochemical activity of each neuron. This field then undergoes interference with other local fields; at each point up the scale, more fields interfere and combine. The level of disorder makes the claim that salient computation is occurring here dubious, due to the lack of precision and the high level of variability, which provide an ample basis for dysfunction (including increased noise, the lack of a stable, i.e., static or material, means of information storage, and poor signal transduction, or at least a high decay rate for signal propagation). However, because the fields interfere at every scale, a local electric field encodes not only the operational states and functional behavior of the neuron that produced it, but also, by interfering and combining with the fields produced by other neurons in both amplitude and phase (as in holography), information about the operational states of those other neurons. Thus, if one neuron dies, some of its properties could already be encoded in other EM waves. This appeared to provide a possible physical basis for the brain’s hypothesized holonomic properties.
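The superposition claim above, that a combined field carries recoverable information about each contributing source in both amplitude and phase, can be illustrated with a toy phasor example. This is purely illustrative arithmetic; no neural modeling is implied.

```python
# Toy illustration of wave superposition: two sinusoidal "fields" combine in
# both amplitude and phase, so the summed field depends on (and partially
# encodes) each source. Purely illustrative.

import cmath
import math

# Represent each local field as a phasor: amplitude * e^(i * phase).
field_a = 2.0 * cmath.exp(1j * 0.0)          # amplitude 2, phase 0
field_b = 1.0 * cmath.exp(1j * math.pi / 2)  # amplitude 1, phase 90 degrees

combined = field_a + field_b                  # wave superposition

# The combined field's amplitude and phase depend on both sources:
amplitude = abs(combined)        # sqrt(2^2 + 1^2) = sqrt(5)
phase = cmath.phase(combined)    # atan2(1, 2)

# Given one source and the sum, the other source is recoverable:
recovered_b = combined - field_a
```

The last line is the (much idealized) analogue of the redundancy claim: the information contributed by one source is not destroyed by superposition, merely mixed.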

If electric fields are the physically continuous process that allows for continuity of consciousness (i.e., theories of emergence), then this suggests that computational substrates instantiating consciousness need to exhibit similar properties. This is not a form of vitalism, because I am not postulating that some extra-physical (i.e., metaphysical) process instantiates consciousness, but rather that a material aspect does, and that such an aspect may have to be incorporated in any attempt at gradual substrate replacement meant to retain subjective-continuity through the procedure. It is not a matter of simulating the emergent electric fields using normative computational hardware: the point is not that the electric fields provide needed functionality, or implement some salient aspect of computation that would otherwise be left out, but rather that the emergent EM fields form a physical basis for continuity and emergence unrelated to functionality yet imperative to experiential continuity or subjectivity. I distinguish this from the type of subjective-continuity discussed thus far (the feeling of being the same person through the process of gradual substrate replacement) via the term “immediate subjective-continuity”, as opposed to “temporal subjective-continuity”. Immediate subjective-continuity is the capacity to feel, period. Temporal subjective-continuity is the state of feeling like the same person you were. Thus, while temporal subjective-continuity inherently necessitates immediate subjective-continuity, immediate subjective-continuity does not require temporal subjective-continuity as a fundamental prerequisite.

Thus I explored variations of NRU operational modality that incorporate this (i.e., prosthetics on the cellular scale), particularly for the informational-functionalist (i.e., computational) NRUs, as the physical-functionalist NRUs were presumed to instantiate these same emergent fields via their normative operation. The approach consisted of either (a) translating the informational output of the models into the generation of physical fields (either at the end of the process, or throughout, by providing the internal area or volume of the unit with a grid composed of electrically conductive nodes, such that the voltage patterns can be physically instantiated in temporal synchrony with the computational model), or (b) constructing the computational substrate instantiating the computational model so as to generate emergent electric fields in a manner as consistent with biological operation as possible. (For example, in the brain a given neuron is never in an electrically neutral state, never completely off, but rather always in a range of values between on and off [see Chapter 2], which means that there is never a break, i.e., a spatiotemporal region of discontinuity, in its emergent electric fields; these operational properties would have to be replicated by any computational substrate used to replicate biological neurons via the informationalist-functionalist approach, if the premise that they facilitate immediate subjective-continuity is correct.)

Franco Cortese is an editor for Transhumanity.net, as well as one of its most frequent contributors.  He has also published articles and essays on Immortal Life and The Rational Argumentator. He contributed 4 essays and 7 debate responses to the digital anthology Human Destiny is to Eliminate Death: Essays, Rants and Arguments About Immortality.

Franco is an Advisor for Lifeboat Foundation (on its Futurists Board and its Life Extension Board) and contributes regularly to its blog.


Wireless Synapses, Artificial Plasticity, and Neuromodulation – Article by Franco Cortese


Categories: Technology, Transhumanism

The New Renaissance Hat
Franco Cortese
May 21, 2013
******************************
This essay is the fifth chapter in Franco Cortese’s forthcoming e-book, I Shall Not Go Quietly Into That Good Night!: My Quest to Cure Death, published by the Center for Transhumanity. The first four chapters were previously published on The Rational Argumentator as “The Moral Imperative and Technical Feasibility of Defeating Death”, “Immortality: Material or Ethereal? Nanotech Does Both!”, “Concepts for Functional Replication of Biological Neurons”, and “Gradual Neuron Replacement for the Preservation of Subjective-Continuity”.
***

Morphological Changes for Neural Plasticity

The finished physical-functionalist units would need the ability to change their emergent morphology, not only for active modification of single-neuron functionality but even for basic functional replication of normative neuron behavior, by virtue of needing to take into account neural plasticity and the way that morphological changes facilitate learning and memory. My original approach involved the use of retractable, telescopic dendrites and axons (with corresponding internal retractable and telescopic dendritic and axonal spines, respectively), activated electromechanically by the unit-CPU. For morphological changes, by providing the edges of each membrane section with an electromechanical hinged connection (i.e., a means of changing the angle of inclination between immediately adjacent sections), the emergent morphology can be controllably varied. This eventually developed into an internal compartment designed to detach a given membrane section, move it down into the internal compartment of the neuronal soma or terminal, transport it along a track that stores alternative membrane sections stacked face-to-face (to compensate for limited space), and subsequently replace it with a membrane section containing an alternate functional component (e.g., an ion pump or an ion channel, whether voltage-gated or ligand-gated) embedded therein. Note that this approach was also conceived of as an alternative to retractable axons/dendrites and axonal/dendritic spines: by attaching additional membrane sections at a very steep angle of inclination (or a lesser inclination with a greater quantity of segments), an emergent section of artificial membrane can be created that extends out from the biological membrane in the same way as axons and dendrites.

However, this approach was eventually supplemented by one that necessitates less technological infrastructure (i.e., that was simpler and thus more economical and realizable). If the size of the integral-membrane components is small enough (preferably smaller than their biological analogues), then differential activation of components or membrane sections would achieve the same effect as changing the organization or type of integral-membrane components, effectively eliminating the need to actually interchange membrane sections at all.

Active Neuronal Modulation and Modification

The technological and methodological infrastructure used to facilitate neural plasticity can also be used for active modification and modulation of neural behavior (and of the emergent functionality determined by local neuronal behavior) towards the aim of mental augmentation and modification. Potential uses already discussed include mental amplification (increasing or extending existing functional modalities, i.e., intelligence, emotion, morality) and mental augmentation (the creation of categorically new functional and experiential modalities). While the distinction between modification and modulation isn’t definitive, a useful way of differentiating them is to consider modification as morphological changes creating new functional modalities, and modulation as actively varying the operation of existing structures/processes not through morphological change but through changes to the operation of integral-membrane components or to the properties of the local environment (e.g., increasing local ionic concentrations).

Modulation: A Less Discontinuous Alternative to Morphological Modification

The use of modulation to achieve the effective results of morphological changes seemed like a hypothetically less discontinuous alternative to morphological change itself (and thus one with a hypothetically greater probability of achieving subjective-continuity). I am more dubious about the validity of this approach now, because the emergent functionality (normatively determined by morphological features) is still changed in an effectively equivalent manner.

The Eventual Replacement of Neural Ionic Solutions with Direct Electric Fields

Upon full gradual replacement of the CNS with physical-functionalist equivalents, the preferred embodiment consisted of replacing the ionic solutions with electric fields that preserve the electric potential instantiated by the difference in ionic concentrations on the respective sides of the membrane. Such electric fields can be generated directly, without recourse to electrochemicals for manifesting them. In such a case the integral-membrane components would be replaced by a means of generating and maintaining a static and/or dynamic electric field on either side of the membrane, or even merely of generating an electrical potential (i.e., voltage—a broader category encompassing electric fields) with solid-state electronics.

This procedure would allow a fraction of the speedups (that is, an increased rate of subjective perception of time, which extends to speed of thought) achievable by emulatory (i.e., strictly computational) replication methods, because operation would no longer be limited by the rate of passive ionic diffusion, but instead by the propagation velocity of electric or electromagnetic fields.
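The potential difference that such directly generated fields would have to reproduce can be estimated from the ionic concentration ratio via the Nernst equation. A minimal sketch, using textbook-typical concentrations purely as an example:

```python
# The membrane potential that directly generated electric fields would have
# to reproduce can be estimated from the ionic concentration ratio via the
# Nernst equation: E = (R*T / (z*F)) * ln([ion]_out / [ion]_in).
# Concentrations below are textbook-typical illustrative values.

import math

R = 8.314       # gas constant, J/(mol*K)
T = 310.0       # body temperature, K
F = 96485.0     # Faraday constant, C/mol

def nernst_potential(c_out, c_in, z=1):
    """Equilibrium potential (volts) for an ion of valence z."""
    return (R * T) / (z * F) * math.log(c_out / c_in)

# Potassium: ~5 mM outside, ~140 mM inside -> roughly -89 mV
e_k = nernst_potential(5.0, 140.0)

# Sodium: ~145 mM outside, ~12 mM inside -> roughly +67 mV
e_na = nernst_potential(145.0, 12.0)
```

The point of the sketch is that the target voltage is fully determined by the concentration ratio, temperature, and valence, so a solid-state field generator has a well-defined setpoint to maintain once the ionic solutions are removed.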

Wireless Synapses

If we replace the physical synaptic connections the NRU uses to communicate (with both existing biological neurons and other NRUs) with a wireless means of synaptic transmission, we can preserve the same functionality (insofar as it is determined by synaptic connectivity) while allowing any NRU to communicate with any other NRU or biological neuron in the brain at potentially equal speed. First we need a way of converting the output of an NRU or biological neuron into information that can be transmitted wirelessly. For the computational NRU classes this requires no new technological infrastructure, because they already deal with second-order (i.e., not structurally or directly embodied) information: the informational-functionalist NRUs deal solely in this type of information, and the cyber-physical-systems sub-class of the physicalist-functionalist NRUs deals with it in the intermediary stage between sensors and actuators; consequently, converting what would have been a sequence of electromechanical actuations into information isn’t a problem. Only the passive-physicalist-functionalist NRU class requires additional technological infrastructure to accomplish this, because it doesn’t already use computational operational modalities for its normative operation, whereas the other NRU classes do.

We position receivers within range of every neuron (or, alternatively, every NRU) in the brain, connected to actuators, the precise composition of which depends on the operational modality of the receiving biological neuron or NRU. The receiver translates incoming information into physical actuations (e.g., the release of chemical stores), thereby instantiating that informational output in physical terms. For biological neurons, the receiver’s actuators would consist of a means of electrically stimulating the neuron and releasable chemical stores of neurotransmitters (or of ions, as an alternate means of electrical stimulation via the manipulation of local ionic concentrations). For informational-functionalist NRUs, the information is already in a form they can accept; such a unit can simply integrate the information into its extant model. For cyber-physicalist NRUs, the unit’s CPU merely needs to be able to translate the information into the sequence in which it must electromechanically actuate its artificial ion channels. For the passive-physicalist NRUs (i.e., those having no computational hardware devoted to operating individual components at all, operating instead according to physical feedback between components alone), our only option appears to be translating received information into manipulations of the local environment that vicariously affect the operation of the NRU (e.g., increasing electric potential through manipulation of local ionic concentrations, or increasing the rate of diffusion via applied electric fields that attract ions, thus achieving the same effect as a steeper electrochemical gradient or potential difference).
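The receiver's translation step differs by target class, as described above. A hypothetical dispatch sketch makes the branching explicit; all class names, action labels, and field names here are illustrative placeholders, not part of any proposed system.

```python
# Hypothetical dispatch for the wireless receiver: an incoming spike record
# is translated into a class-appropriate actuation. All class names and
# action labels are illustrative placeholders.

def actuate(target_class, spike_info):
    """Translate a wirelessly received spike into a local actuation."""
    if target_class == "biological":
        # electrically stimulate, or release neurotransmitter stores
        return ("release_neurotransmitter", spike_info["magnitude"])
    if target_class == "informational":
        # already in an acceptable form: feed straight into the model
        return ("integrate_into_model", spike_info)
    if target_class == "cyber_physical":
        # the unit's CPU schedules the corresponding ion-channel actuations
        return ("actuate_ion_channels", spike_info["magnitude"])
    if target_class == "passive_physical":
        # no CPU: manipulate the local environment instead
        return ("raise_local_ion_concentration", spike_info["magnitude"])
    raise ValueError(f"unknown NRU class: {target_class}")

action, payload = actuate("passive_physical", {"magnitude": 0.7})
```

The sketch mirrors the asymmetry in the text: three of the four branches translate information into direct actuation, while the passive-physicalist branch can only act on the NRU's environment.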

The technological and methodological infrastructure for this is very similar to that used for the “integrational NRUs”, which allows a given NRU-class to communicate with either existing biological neurons or NRUs of an alternate class.

Integrating New Neural Nets Without Functional Distortion of Existing Regions

The use of artificial neural networks (which here designates NRU networks that do not replicate any existing biological neurons, rather than the normative artificial neural networks mentioned in the first and second parts of this essay), rather than normative neural prosthetics and BCI, was the preferred method of cognitive augmentation (the creation of categorically new functional/experiential modalities) and cognitive amplification (the extension of existing functional/experiential modalities). Because they function according to the same operational modality as existing neurons (whether biological or artificial replacements), they can become a continuous part of our “selves”, whereas normative neural prosthetics and BCI are comparatively less likely to be capable of becoming an integral part of our experiential continuum (or subjective sense of self), due to their significant operational dissimilarity to biological neural networks.

A given artificial neural network can be integrated with existing biological networks in a few ways. One is interior integration, wherein the new neural network is “inter-threaded” with existing networks: a given artificial neuron is placed among one or multiple existing networks, and the networks are integrated and connected on a very local level. In “anterior” integration, the new network would instead be integrated in a way comparable to the connection between separate cortical columns, with the majority of integration happening at the periphery of each respective network or cluster.

If the interior-integration approach is used, the functionality of the region may be distorted or negated, because neurons that once took a certain amount of time to communicate now take comparatively longer, the distance between them having been increased to make room for the newly integrated artificial neurons. To negate these problems, a means of increasing the speed of communication must be employed. That speed is determined by both (a) the rate of diffusion across the synaptic junction and (b) the propagation velocity along the neuronal membrane. (The exception is myelinated axons, wherein a given action potential “jumps” from node of Ranvier to node of Ranvier; in these cases propagation velocity is determined by the thickness and length of the myelinated sections.)
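The dependence of propagation speed on membrane properties can be illustrated with a minimal RC-cable estimate: in a passive cable, the charging time of each segment scales with the product of axial resistance and membrane capacitance, so conduction velocity rises as membrane capacitance falls, which is what myelination accomplishes. All numbers below are arbitrary illustrative values, not physiological measurements.

```python
# Minimal RC-cable illustration of the speed constraint discussed above.
# Per-segment charging time ~ R_axial * C_membrane, so velocity rises as
# membrane capacitance falls (the effect myelination produces).
# All numbers are arbitrary illustrative values.

def conduction_velocity(segment_len_m, r_axial_ohm, c_membrane_farad):
    """Velocity estimate: segment length / RC charging time per segment."""
    tau = r_axial_ohm * c_membrane_farad   # time constant per segment
    return segment_len_m / tau

seg = 1e-4  # 100-micrometer segments

v_unmyelinated = conduction_velocity(seg, r_axial_ohm=1e6,
                                     c_membrane_farad=1e-10)
v_myelinated   = conduction_velocity(seg, r_axial_ohm=1e6,
                                     c_membrane_farad=2e-12)

# Cutting capacitance ~50x raises the velocity estimate by the same factor.
speedup = v_myelinated / v_unmyelinated
```

This is the quantity the integration schemes above must compensate for: if inter-neuron distances grow by some factor, the effective per-segment delay must shrink by at least that factor.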

My original solution was the use of an artificial membrane morphologically modeled on a myelinated axon, possessing very low membrane capacitance (and thus high propagation velocity), combined with decreasing the capacitance of the existing axon or dendrite of the biological neuron. The cumulative capacitance of both is lowered in proportion to how far apart they are moved. In this way, the propagation velocities of the existing neuron and the connector-terminal are increased, allowing the existing biological neurons to communicate as fast as they did prior to the addition of the artificial neural network. This solution was eventually supplemented by the wireless means of synaptic transmission described above, which allows any neuron to communicate with any other neuron at equal speed.

Gradually Assigning Operational Control of a Physical NRU to a Virtual NRU

This approach allows us to apply the single-neuron gradual replacement facilitated by the physical-functionalist NRU to the informational-functionalist (physically embodied) NRU. A given section of artificial membrane and its integral membrane components are modeled. When this model is functioning in parallel (i.e., synchronization of operative states) with its corresponding membrane section, the normative operational routines of that artificial membrane section (usually controlled by the unit’s CPU and its programming) are subsequently taken over by the computational model—i.e., the physical operation of the artificial membrane section is implemented according to and in correspondence with the operative states of the model. This is done iteratively, with the informationalist-functionalist NRU progressively controlling more and more sections of the membrane until the physical operation of the whole physical-functionalist NRU is controlled by the informational operative states of the informationalist-functionalist NRU. While this concept sprang originally from the approach of using multiple gradual-replacement phases (with a class of model assigned to each phase, wherein each is more dissimilar to the original than the preceding phase, thereby increasing the cumulative degree of graduality), I now see it as a way of facilitating sub-neuron gradual replacement in computational NRUs. Also note that this approach can be used to go from existing biological membrane-sections to a computational NRU, without a physical-functionalist intermediary stage. This, however, is comparatively more complex because the physical-functionalist NRU already has a means of modulating its operative states, whereas the biological neuron does not. 
In such a case, the section of lipid-bilayer membrane would presumably have to be operationally isolated from adjacent sections of membrane, using a system of chemical inventories (of either highly concentrated ionic solution or neurotransmitters, depending on the area of membrane) to produce electrochemical output, and chemical sensors to accept the electrochemical input from adjacent sections (i.e., a means of detecting depolarization and hyperpolarization). To facilitate an action potential, for example, the chemical sensors would detect depolarization; the computational NRU would then model the influx of ions through the section of membrane it is replacing and translate the effective results to the opposite edge, via either the release of neurotransmitters or the manipulation of local ionic concentrations, so as to generate the required depolarization at the adjacent section of biological membrane.
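The iterative handover of control from the unit's CPU to the computational model, described above, can be pictured as flipping per-section control flags once model/membrane synchrony is verified over a window of timesteps. The following is entirely schematic: the state functions, section count, and synchrony check are illustrative assumptions.

```python
# Schematic of the iterative control handover: each membrane section is
# switched from CPU control to model control only after the model has run
# in verified synchrony with it. Entirely illustrative.

import math

SECTIONS = 8
controller = ["cpu"] * SECTIONS    # who drives each membrane section

def physical_state(section, t):
    """Stand-in for the membrane section's measured operative state."""
    return math.sin(t + section)

def model_state(section, t):
    """The computational model's prediction for the same section."""
    return math.sin(t + section)   # a perfect model, in this sketch

def in_sync(section, steps=10, tol=1e-6):
    """Verify model/membrane synchrony over a window of timesteps."""
    return all(abs(physical_state(section, t) - model_state(section, t)) < tol
               for t in range(steps))

for section in range(SECTIONS):        # handover proceeds one section at a time
    if in_sync(section):
        controller[section] = "model"  # model now drives this section

fully_virtual = all(c == "model" for c in controller)
```

The graduality lives in the loop: at every intermediate step, some sections are driven by the CPU's normative routines and others by the model, and the physical unit keeps operating throughout.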

Integrational NRU

This consisted of a unit facilitating connection between emulatory (i.e., informational-functionalist) units and existing biological neurons. The output of the emulatory units is converted into chemical and electrical output at the locations where the emulatory NRU makes synaptic connection with other biological neurons, facilitated through electric stimulation or the release of chemical inventories for the increase of ionic concentrations and the release of neurotransmitters, respectively. The input of existing biological neurons making synaptic connections with the emulatory NRU is read, likewise, by chemical and electrical sensors and is converted into informational input that corresponds to the operational modality of the informationalist-functionalist NRU classes.

Solutions to Scale

If we needed NEMS, or something below the scale of present MEMS, for the technological infrastructure of either (a) the electromechanical systems replicating a given section of neuronal membrane, or (b) the systems used to construct and/or integrate the sections, or to remove or otherwise operationally isolate the existing section of lipid-bilayer membrane being replaced from adjacent sections, one postulated solution was to take the difference in length between the artificial membrane section and the existing lipid-bilayer section (a difference determined by how small we can construct functionally operative artificial ion channels) and incorporate it as added curvature in the artificial membrane section, such that its edges converge upon, or superpose with, the edges of the space left by the removal of the lipid-bilayer membrane section. We would also need to increase the propagation velocity (typically determined by the rate of ionic influx, which in turn is typically determined by the concentration gradient, i.e., the difference in ionic concentrations on the respective sides of the membrane) such that the action potential reaches the opposite end of the replacement section at the same time it normally would via the lipid-bilayer membrane. This could be accomplished directly by the application of electric fields with a charge opposite that of the ions (which would attract them, thus increasing the rate of diffusion), by increasing the number of open channels or the diameter of existing channels, or simply by increasing the concentration gradient through local manipulation of extracellular and/or intracellular ionic concentration, e.g., through concentrated electrolyte stores of the relevant ion that can be released to increase the local ionic concentration.
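The curvature idea above has a simple geometric core: an artificial section of length L_art must bow outward so that its endpoints still span the shorter gap of length L_gap. Modeling the bowed section as a circular arc, arc length = R·θ and chord = 2R·sin(θ/2), so L_gap/L_art = sin(θ/2)/(θ/2), which the sketch below solves by bisection. The numbers are illustrative.

```python
# Geometry sketch for the added-curvature idea: find the arc angle theta
# such that an arc of length l_art has chord (gap) length l_gap, using
#   arc   = R * theta
#   chord = 2 * R * sin(theta / 2)
# => l_gap / l_art = sin(theta/2) / (theta/2), solved by bisection.

import math

def arc_angle(l_gap, l_art, tol=1e-12):
    """Subtended angle theta for an arc of length l_art with chord l_gap."""
    assert 0 < l_gap < l_art, "artificial section must exceed the gap"
    f = lambda th: math.sin(th / 2) / (th / 2) - l_gap / l_art
    lo, hi = 1e-9, 2 * math.pi - 1e-9   # f(lo) > 0, f(hi) < 0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# e.g., an artificial section 5% longer than the gap it must span:
theta = arc_angle(l_gap=1.0, l_art=1.05)
radius = 1.05 / theta                       # R = L_art / theta
chord = 2 * radius * math.sin(theta / 2)    # recovers L_gap
```

Even a 5% length excess yields a substantial bow (roughly a one-radian arc), which suggests why the approach degrades quickly as the achievable component scale, and hence the length mismatch, grows.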

If the degree of miniaturization is so low as to make this approach untenable (e.g., if increasing curvature still doesn’t allow successful integration), then a hypothesized alternative approach was to increase the overall space between adjacent neurons, integrate the NRU, replace the normative connection with chemical inventories (of either ionic compound or neurotransmitter) released at the site of the existing connection, and have the NRU (or NRU sub-section, i.e., artificial membrane section) wirelessly control the release of those chemical inventories according to its operative states.

The next chapter describes (a) possible physical bases for subjective-continuity through a gradual-uploading procedure and (b) possible design requirements for in vivo brain-scanning and for systems to construct and integrate the prosthetic neurons with the existing biological brain.

Franco Cortese is an editor for Transhumanity.net, as well as one of its most frequent contributors.  He has also published articles and essays on Immortal Life and The Rational Argumentator. He contributed 4 essays and 7 debate responses to the digital anthology Human Destiny is to Eliminate Death: Essays, Rants and Arguments About Immortality.

Franco is an Advisor for Lifeboat Foundation (on its Futurists Board and its Life Extension Board) and contributes regularly to its blog.



Thoughts on Zoltan Istvan’s “The Transhumanist Wager” – A Review – Video by G. Stolyarov II


Categories: Fiction, Transhumanism

Zoltan Istvan’s new novel The Transhumanist Wager has been compared to Ayn Rand’s Atlas Shrugged. But to what extent are the books alike, and in what respects? In this review, Mr. Stolyarov compares and contrasts the two novels and explores the question of how best to achieve radical life extension and general technological progress for the improvement of the human condition.

References

– The Transhumanist Wager Official Page
– “Thoughts on Zoltan Istvan’s ‘The Transhumanist Wager’: A Review” – Article by G. Stolyarov II
Giulio Prisco’s Review of The Transhumanist Wager
– “Larry Page wants to ‘set aside a part of the world’ for unregulated experimentation” – Nathan Ingraham – The Verge – May 15, 2013
Zoltan Istvan’s Reddit AMA


The IRS’s Job Is To Violate Our Liberties – Article by Ron Paul


Categories: History, Politics

The New Renaissance Hat
Ron Paul
May 21, 2013
******************************

“What do you expect when you target the President?” This is what an Internal Revenue Service (IRS) agent allegedly said to the head of a conservative organization that was being audited after calling for the impeachment of then-President Clinton. Recent revelations that IRS agents gave “special scrutiny” to organizations opposed to the current administration’s policies suggest that many in the IRS still believe harassing the President’s opponents is part of their job.

As troubling as these recent reports are, it would be a grave mistake to think that IRS harassment of opponents of the incumbent President is a modern, or a partisan, phenomenon. As scholar Burton Folsom pointed out in his book New Deal or Raw Deal, IRS agents in the 1930s were essentially “hit squads” against opponents of the New Deal. It is well-known that the administrations of John F. Kennedy and Lyndon Johnson used the IRS to silence their critics. One of the articles of impeachment drawn up against Richard Nixon dealt with his use of the IRS to harass his political enemies. Allegations of IRS abuses were common during the Clinton administration, and just this week some of the current administration’s defenders recalled that antiwar and progressive groups alleged harassment by the IRS during the Bush presidency.

The bipartisan tradition of using the IRS as a tool to harass political opponents suggests that the problem is deeper than just a few “rogue” IRS agents—or even corruption within one, two, three, or many administrations. Instead, the problem lies in the extraordinary power the tax system grants the IRS.

The IRS routinely obtains information about how we earn a living, what investments we make, what we spend on ourselves and our families, and even what charitable and religious organizations we support. Starting next year, the IRS will be collecting personally identifiable health insurance information in order to ensure we are complying with Obamacare’s mandates.

The current tax laws even give the IRS power to marginalize any educational, political, or even religious organizations whose goals, beliefs, and values are not favored by the current regime by denying those organizations tax-exempt status. This is the root of the latest scandal involving the IRS.

Considering the type of power the IRS exercises over the American people, and the propensity of those who hold power to violate liberty, it is surprising we do not hear about more cases of politically motivated IRS harassment. As the fourth US Supreme Court Chief Justice, John Marshall, said, “The power to tax is the power to destroy” — and whom better to destroy than one’s political enemies?

The US flourished for over 120 years without an income tax, and our liberty and prosperity will only benefit from getting rid of the current tax system. The federal government will get along just fine without its immoral claim on the fruits of our labor, particularly if the elimination of federal income taxes is accompanied by serious reduction in all areas of spending, starting with the military spending beloved by so many who claim to be opponents of high taxes and big government.

While it is important for Congress to investigate the most recent scandal and ensure all involved are held accountable, we cannot pretend that the problem is a few bad actors. The very purpose of the IRS is to transfer wealth from one group to another while violating our liberties in the process. Thus, the only way Congress can protect our freedoms is to repeal the income tax and shutter the doors of the IRS once and for all.

Ron Paul, MD, is a former three-time Republican candidate for U.S. President and Congressman from Texas.

This article is reprinted with permission.
