
A Rational Cosmology – Treatise by G. Stolyarov II – Third Edition

Contemporary science does not make as much progress as it could, due to the fallacy of empiricism-positivism – the idea that no knowledge is certain beyond refutation and that every claim is contingent on highly narrow, particular, and expensive experiments. A Rational Cosmology, however, provides a thorough refutation of prevalent empiricist-positivist fallacies, both in content and in method. It shatters some of the erroneous philosophical interpretations of theories such as Relativity and Quantum Mechanics. Moreover, it refutes the ex nihilo origin of the universe – including its manifestation in popular views of the Big Bang and Big Crunch – as well as the particle/wave view of light and a host of other fallacious ideas, using the proper, axiomatic-deductive methodology of identifying those theories’ conceptual flaws and internal contradictions.

As constructive alternatives to these fallacies, A Rational Cosmology presents objective, absolute, rationally grounded views of terms such as universe, matter, volume, space, time, motion, sound, light, forces, fields, and even the higher-order concepts of life, consciousness, and volition. The result is a system verified by ubiquitous observation and common sense, the underpinnings of objective science which demonstrate a knowable, fathomable reality and set the stage for unfettered progress, confidence in reason, and full-scale logical investigation of just about everything existence has to offer.

The Third Edition of A Rational Cosmology has been edited and enhanced, with augmentations and revisions to several of the previous essays. It features a beautiful new cover design by Wendy D. Stolyarov, two additional numbered essays, and more recent writings within the Related Essays section.

For the first time, A Rational Cosmology is available for free download as unified, single-file editions. There are four formats to choose from.

Download the PDF version.

Download the MOBI version.

Download the EPUB version.

Download the AZW3 version.

The Rational Argumentator welcomes your reviews of A Rational Cosmology. You can submit them to TRA by sending them to gennadystolyarovii@yahoo.com. You are also encouraged to spread the word by reprinting the information on this page, or your own comments concerning the book, in other media outlets.

Productivity Enhancement – Video Series by G. Stolyarov II

The New Renaissance Hat
G. Stolyarov II
June 2, 2013
******************************

In this series on productivity enhancement, taken from his e-book The Best Self-Help is Free, Mr. Stolyarov discusses the fundamental nature of productivity and approaches by which any person can become more productive.

This series is based on Chapters 7-14 of The Best Self-Help is Free.

Part 1 – What is Productivity?

The most reliable way to achieve incremental progress in your life is by addressing and continually improving your own productivity. Productivity constitutes the difference between a world in which life is nasty, brutish, and short and one in which it is pleasant, civilized, and ever-increasing in length.

Part 2 – Reason and the Decisional Component of Productivity

In order to properly decide what ought to be produced, man can ultimately consult only one guide: his rational faculty.

Part 3 – Perfectionism — The Number One Enemy of Productivity

Perfectionism engenders a pervasive sense of futility in its practitioner and mentally inhibits him from pursuing further productive work.

Part 4 – Quantification and Productivity Targets

Quantification enables an individual to set productivity targets for himself and to escape underachievement on one hand and perfectionism on the other.

Part 5 – Habit and the Elimination of the Quality-Quantity Tradeoff

A common fallacy presumes that there is a necessary tradeoff between the quantity of work produced and the quality of that work. By this notion, one can either produce a lot of mediocre units of output or a scant few exceptional ones. While this might be true in some cases, it overlooks several important factors.

Part 6 – The Importance of Frameworks for Productivity

Time-saving, productivity-enhancing frameworks can be applied on a personal level to enable one to overcome the human mind’s limited ability to hold and process multiple pieces of information simultaneously.

Part 7 – The Benefits of Repetition to Productivity

One of the most reliable ways to reduce the amount of mental effort per unit of productive output is to create many extremely similar units of output in succession. Mr. Stolyarov discusses the advantages of structuring one’s work so as to perform many similar tasks in close succession.

Part 8 – Making Accomplishments Work for You

Producing alone is not enough. If you just let your output lie around accumulating dust or taking up computer memory, it will not boost your overall well-being. Your accomplishments can help procure health, reputation, knowledge, safety, and happiness for you — if you think about how to put them to use.

Neuronal “Scanning” and NRU Integration – Article by Franco Cortese

The New Renaissance Hat
Franco Cortese
May 23, 2013
******************************
This essay is the seventh chapter in Franco Cortese’s forthcoming e-book, I Shall Not Go Quietly Into That Good Night!: My Quest to Cure Death, published by the Center for Transhumanity. The first six chapters were previously published on The Rational Argumentator under the following titles:
***

I was planning on using the NEMS already conceptually developed by Robert Freitas for nanosurgery applications (to be supplemented by the use of MEMS if the technological infrastructure was unavailable at the time) to take in vivo recordings of the salient neural metrics and properties needing to be replicated. One novel approach was to design the units with elongated, worm-like bodies, disposing the computational and electromechanical apparatus along the length of the unit. This sacrifices width for length so as to allow the units to fit inside the extracellular space between neurons and glial cells – a postulated solution to a lack of sufficient miniaturization. Moreover, if a unit is too large to be used in this way, extending its length by the same proportion would allow it to operate in the extracellular space, provided that its means of data-measurement were not themselves too large to fit inside that space (the span of ECF between two adjacent neurons is around 200 Angstroms for much of the brain).
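To give a rough sense of the width-for-length tradeoff described above, the following back-of-the-envelope sketch (my own illustration with assumed figures, not a specification from the essay) computes how long a unit of fixed volume must become once its width is capped to fit the roughly 200-Angstrom extracellular gap.

```python
# Hypothetical arithmetic only: a square-cross-section unit of fixed volume
# has its width capped to fit the ~200-Angstrom (20 nm) extracellular gap;
# how long must it become to preserve that volume?

GAP_NM = 20.0  # ~200 Angstroms of extracellular fluid between adjacent neurons

def required_length_nm(volume_nm3: float, width_nm: float) -> float:
    """Length needed to preserve the unit's volume at the capped width."""
    return volume_nm3 / (width_nm ** 2)

# Assume (purely for illustration) a unit originally designed as a 100 nm cube.
volume = 100.0 ** 3            # 1,000,000 nm^3
width = GAP_NM * 0.9           # leave 10% clearance inside the gap
print(f"{required_length_nm(volume, width):,.0f} nm")
# -> about 3,086 nm (~3 micrometers): a 100 nm cube becomes a "worm"
#    roughly 170 times longer than it is wide.
```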

I was planning on using the chemical and electrical sensing methodologies already in development for nanosurgery as the technological and methodological infrastructure for neuronal data-measurement. However, I also explored my own conceptual approaches to data-measurement. These focused on detecting variation in morphological features in particular, as the extant schemes for electrical and chemical sensing seemed either sufficiently developed or to be receiving sufficient developmental support and/or funding. One was the use of laser-scanning or, more generally, reflection-based ranging (of which sonar is an acoustic analogue) to measure and record morphological data. Another was a device that uses a 2D array of depressible members (e.g., solid members attached to a spring or ratchet assembly, operatively connected to a means of detecting how much each individual member is depressed, such as piezoelectric crystals that produce electricity in proportion to applied mechanical strain). The device would be run along the neuronal membrane, and the topology of the membrane would be recorded by the pattern of depression readings, which are then integrated to provide a topographic map of the neuron (e.g., relative location of integral membrane components to determine morphology, and magnitude of depression to determine emergent topology). This approach could also potentially be used to identify the integral membrane proteins, rather than using electrical or chemical sensing techniques, provided the topologies of the respective proteins are sufficiently different to be detectable by the unit (which is determined by its degree of precision, itself typically a function of its degree of miniaturization).
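A minimal sketch of how the depression readings might be integrated into a topographic map follows. The array shape, the piezoelectric calibration constant, and the function name are all my own assumptions for illustration; the essay specifies only the general mechanism.

```python
# A minimal sketch, assuming each piezoelectric member outputs a voltage
# proportional to how far it is depressed; the 2D voltage array is then
# converted into a relative height map of the membrane surface.

import numpy as np

PIEZO_GAIN_V_PER_NM = 0.05  # assumed calibration: volts per nanometer of depression

def topography_from_voltages(voltages: np.ndarray) -> np.ndarray:
    """Convert member voltages into relative surface heights (nm)."""
    depressions_nm = voltages / PIEZO_GAIN_V_PER_NM
    # Subtract the minimum so the flattest region reads as height zero.
    return depressions_nm - depressions_nm.min()

# Example: a 4x4 patch of readings with a raised feature near the center,
# such as an integral membrane protein standing proud of the bilayer.
readings = np.array([
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.4, 0.4, 0.0],
    [0.0, 0.4, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
])
print(topography_from_voltages(readings))
```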

The constructional and data-measurement units would also rely on the technological and methodological infrastructure for organization and locomotion that would be used in normative nanosurgery. I conceptually explored such techniques as the use of a propeller, the use of pressure-based methods (i.e., a stream of water acting as jet exhaust would in a rocket), the use of artificial cilia, and the use of tracks that the unit attaches to so as to be moved electromechanically, which decreases computational intensiveness – a measure of required computation per unit time – rather than having a unit compute its relative location so as to perform obstacle-avoidance and not, say, damage in-place biological neurons. Obstacle-avoidance and related concerns are instead negated through the use of tracks that limit the unit’s degrees of freedom—thus preventing it from having to incorporate computational techniques of obstacle-avoidance (and their entailed sensing apparatus). This also decreases the necessary precision (and thus, presumably, the required degree of miniaturization) of the means of locomotion, which would need to be much greater if the unit were to perform real-time obstacle avoidance. Such tracks would be constructed in iterative fashion. The constructional system would analyze the space in front of it to determine if the space was occupied by a neuron terminal or soma, and extrude the tracks iteratively (e.g., add a segment in spaces where it detects the absence of biological material). It would then move along the newly extruded track, progressively extending it through the spaces between neurons as it moves forward.
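The track-extrusion logic lends itself to a simple loop: probe the space ahead, extrude a segment only where no biological material is detected, and advance onto the new segment. The sketch below is a toy rendering of that loop under my own simplifying assumptions (a discrete grid and a single direction of travel); none of its names come from the essay.

```python
# A toy sketch of iterative track extrusion: probe ahead, extrude into empty
# space, advance; halt when a neuron terminal or soma blocks the path.

from typing import Callable, List, Tuple

Position = Tuple[int, int, int]

def extrude_track(start: Position,
                  step: Position,
                  is_occupied: Callable[[Position], bool],
                  max_segments: int) -> List[Position]:
    """Build a track one segment at a time, stopping at occupied space."""
    track: List[Position] = []
    pos = start
    for _ in range(max_segments):
        ahead = (pos[0] + step[0], pos[1] + step[1], pos[2] + step[2])
        if is_occupied(ahead):   # biological material detected ahead
            break                # no obstacle-avoidance computation needed
        track.append(ahead)      # extrude a new segment into the empty space
        pos = ahead              # move along the newly extruded segment
    return track

# Example: pretend a neuron occupies everything at x >= 5 along this path.
blocked = lambda p: p[0] >= 5
print(extrude_track((0, 0, 0), (1, 0, 0), blocked, max_segments=10))
# -> [(1, 0, 0), (2, 0, 0), (3, 0, 0), (4, 0, 0)]
```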

Non-Distortional in vivo Brain “Scanning”

A novel avenue of inquiry that occurred during this period involves counteracting or taking into account the distortions caused by the data-measurement units on the elements or properties they are measuring, and subsequently applying such corrections to the recorded data. A unit changes the local environment it is supposed to be measuring and recording, which becomes problematic. My solution was to test which operations performed by the units have the potential to distort relevant attributes of the neuron or its environment, and to build units that compensate for those distortions either physically or computationally.

If we reduce how a recording unit’s operation distorts neuronal behavior into a list of mathematical rules, we can take the recordings and apply mathematical techniques to eliminate or “cancel out” those distortions post-measurement, thus arriving at what would have been the correct data. This approach would work only if the distortions are affecting the recorded data (i.e., changing it in predictable ways), and not if they are affecting the unit’s ability to actually access, measure, or resolve such data.
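As a worked miniature of this idea, consider a distortion that reduces to a linear rule: a known gain and offset imposed by the unit's presence. The numbers and function names below are assumptions of mine, chosen only to show the post-measurement inversion.

```python
# A minimal worked example, assuming the distortion reduces to a known
# linear rule (gain and offset); the true value is recovered by inverting
# that rule after measurement.

def distort(true_value: float, gain: float, offset: float) -> float:
    """Model of how the unit's presence perturbs the quantity it records."""
    return gain * true_value + offset

def undistort(recorded: float, gain: float, offset: float) -> float:
    """Invert the known distortion rule to recover the pre-distortion value."""
    return (recorded - offset) / gain

# Suppose (illustratively) the unit damps a membrane-potential reading by 3%
# and adds a 0.5 mV artifact.
true_mv = -70.0
recorded_mv = distort(true_mv, gain=0.97, offset=0.5)   # -> -67.4 mV as recorded
print(undistort(recorded_mv, gain=0.97, offset=0.5))    # -> -70.0, the true value
```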

The second approach applies the method underlying the first approach to the physical environment of the neuron. A unit senses and records the constituents of the area of space immediately adjacent to its edges and mathematically models that “layer”; i.e., if it is meant to detect ionic solutions (in the case of ECF or ICF), then it would measure their concentration and subsequently model ionic diffusion for that layer. It then moves forward, encountering another adjacent “layer” and integrating it with its extant model. By iteratively sensing what is immediately adjacent to it, the unit can model the space it occupies as it travels through that space. It then uses electrical or chemical stores to manipulate the electrical and chemical properties of the environment immediately adjacent to its surface, so as to produce the emergent effects of that model (i.e., the properties of the edges of that model and how those properties causally affect adjacent sections of the environment), thus producing the emergent effects that would have been present if the NRU-construction/integration system or data-measuring system hadn’t occupied that space.
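The loop structure here (sense a layer, fold it into the running model, emit the model's boundary state) can be sketched compactly. The sketch below is my own schematic simplification, assuming a one-dimensional stack of layers and a crude stand-in for diffusion; the essay does not specify a data structure or an update rule.

```python
# A schematic sketch, assuming a one-dimensional stack of sensed layers and a
# crude diffusion stand-in: the unit integrates each newly sensed layer into
# its running model, then drives its surface toward the model's boundary
# state so adjacent tissue sees the effects the displaced volume would have
# produced.

from dataclasses import dataclass, field
from typing import List

@dataclass
class EnvironmentModel:
    layers: List[float] = field(default_factory=list)  # e.g., ionic concentration per layer

    def integrate_layer(self, concentration: float) -> None:
        """Fold a newly sensed layer into the running model."""
        self.layers.append(concentration)

    def boundary_emission(self) -> float:
        """Crude stand-in for modeled diffusion at the model's edge:
        the mean of the most recently sensed layers."""
        recent = self.layers[-3:]
        return sum(recent) / len(recent)

model = EnvironmentModel()
for sensed in [140.0, 142.0, 139.0, 141.0]:  # assumed per-layer readings (mM)
    model.integrate_layer(sensed)
    # The unit manipulates its surface chemistry/charge toward this value:
    print(f"emit boundary state: {model.boundary_emission():.1f} mM")
```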

The third postulated solution was the use of a grid composed of a series of hollow recesses placed in front of the sensing/measuring apparatus. The grid is impressed upon the surface of the membrane. Each compartment isolates a given section of the neuronal membrane from the rest. The constituents of each compartment are measured and recorded, most probably via uptake of its constituents and transport to a suitable measuring apparatus. A simple indexing system can keep track of which constituents came from which grid compartment (and thus which region of the membrane they came from). The unit has a chemical store operatively connected to the means of locomotion used to transport the isolated membrane-constituents to the measuring/sensing apparatus. After a given compartment’s constituents are measured and recorded, the system then marks its constituents (determined by measurement and already stored as recordings by this point of the process), takes an equivalent molecule or compound from a chemical inventory, and replaces the substance it removed for measurement with the equivalent substance from that inventory. Once this is accomplished for a given section of membrane, the grid then moves forward, farther into the membrane, leaving the replacement molecules/compounds from the biochemical inventory in the same respective spots as their original counterparts. It does this iteratively, making its way through a neuron and out the other side.

This approach is the most speculative, and thus the least likely to be used. It would likely require the use of NEMS, rather than MEMS, as a necessary technological infrastructure if it were to avoid becoming economically prohibitive, because in order for the compartment-constituents to be replaceable after measurement via chemical store, they need to be simple molecules and compounds rather than sections of emergent protein or tissue, which are comparatively harder to artificially synthesize and store in working order.
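The bookkeeping implied by this grid (measure each compartment's constituents, index them by cell, and stage equivalent molecules from the chemical inventory for the same spots) can be sketched as follows. Everything in the sketch, from the names to the treatment of constituents as simple strings, is my own illustrative assumption rather than the essay's specification.

```python
# An illustrative sketch of the grid's indexing and replacement plan:
# record each compartment's constituents, then draw equivalent molecules
# from a finite chemical inventory to leave in the same respective spots.

from collections import Counter
from typing import Dict, List, Tuple

Cell = Tuple[int, int]  # (row, column) of a compartment in the grid

def process_grid(compartments: Dict[Cell, List[str]],
                 inventory: Counter
                 ) -> Tuple[Dict[Cell, List[str]], Dict[Cell, List[str]]]:
    """Return (measurement recordings, replacement plan) for one grid pass."""
    recordings: Dict[Cell, List[str]] = {}
    replacements: Dict[Cell, List[str]] = {}
    for cell, constituents in compartments.items():
        recordings[cell] = list(constituents)   # the measurement record
        plan: List[str] = []
        for molecule in constituents:
            if inventory[molecule] > 0:         # simple molecules only; emergent
                inventory[molecule] -= 1        # proteins would not be replaceable
                plan.append(molecule)
        replacements[cell] = plan               # same species, same grid cell
    return recordings, replacements

compartments = {(0, 0): ["Na+", "Cl-"], (0, 1): ["K+", "Na+"]}
inventory = Counter({"Na+": 10, "K+": 10, "Cl-": 10})
recordings, plan = process_grid(compartments, inventory)
print(plan)  # -> {(0, 0): ['Na+', 'Cl-'], (0, 1): ['K+', 'Na+']}
```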

***

In the next chapter I describe the work done throughout late 2009 on biological/non-biological NRU hybrids, and in early 2010 on one of two new approaches to retaining subjective-continuity through a gradual replacement procedure, both of which are unrelated to concerns of graduality or sufficient functional equivalence between the biological original and the artificial replication-unit.

Franco Cortese is an editor for Transhumanity.net, as well as one of its most frequent contributors. He has also published articles and essays on Immortal Life and The Rational Argumentator. He contributed 4 essays and 7 debate responses to the digital anthology Human Destiny is to Eliminate Death: Essays, Rants and Arguments About Immortality.

Franco is an Advisor for Lifeboat Foundation (on its Futurists Board and its Life Extension Board) and contributes regularly to its blog.