Andrés Grases Interviews U.S. Transhumanist Party Chairman Gennady Stolyarov II on Transhumanism and the Transition to the Next Technological Era


Gennady Stolyarov II
Andrés Grases

Andrés Grases, the publisher of the Transhuman Plus website, interviews U.S. Transhumanist Party Chairman Gennady Stolyarov II at RAAD Fest 2018 in San Diego, CA, on September 23, 2018. The conversation covers both the contemporary state of transhumanist politics and its future directions – along with the challenges of reforming the educational system, the need to create open access to academic works, the manner in which the transition toward the next era of technologies will occur, and the meaning of transhumanism and its applications in the proximate future, including promising advances that we can expect to see during the next several years.

Watch the video here.

Become a member of the U.S. Transhumanist Party for free, no matter where you reside. Apply online here in less than a minute.

When Academia Turns into Fight Club – Article by Steven Horwitz


The New Renaissance Hat
Steven Horwitz
July 14, 2017

What do academics do for excitement over the summer, you ask? This summer many of us have been engaged in a furious debate over the new book Democracy in Chains by Duke historian Nancy MacLean.

Libertarian and conservative scholars from a variety of disciplines have raised a number of criticisms about MacLean’s sources and her accuracy about historical facts that call into question the “evidence” she has to show that economist James Buchanan and public choice theory, if not libertarianism more generally, are all tools of racist oligarchs like the Koch Brothers.

Rather than rehash all of the particular criticisms, I want to focus on the controversy that has developed over the criticisms themselves. It’s important to understand that the libertarian critics of MacLean have carefully compared passages in her book with her cited sources and shown how she has misread and quoted selectively from them, often leading her to attribute to people the exact opposite of the argument they actually held. These criticisms have been posted publicly on blogs and websites. These are not just vague accusations. They are detailed examples of poor scholarship.

But the fascinating part has been her response. And her lack thereof.

Everyone Is under Attack

MacLean has offered no substantive response to the detailed criticisms. She had one exchange with Russ Roberts over her treatment of Tyler Cowen, but even there she did not respond to the substance of Russ’s concerns. Other than that, nothing.

What she did do, however, was put up a long Facebook post that reads like a combination conspiracy-theory tract and call to action for progressive activists. The short version is that she claimed she was under “attack” from a conspiracy of “Koch operatives,” paid hacks out to destroy her book and her reputation and silence her. She claimed that the Kochs had bought Google results to put the critics at the top of searches, then retracted the claim when she found out that couldn’t be done. She encouraged her supporters to game the Amazon reviews by posting positive reviews and down-voting the “fake” Koch reviews.

She has continued this narrative of being “under attack” in various interviews, and most recently in a story in Inside Higher Ed, where fellow progressives echo this language.

This notion of being “attacked” is particularly fascinating to me. Let’s be clear what she means: people who know a lot about Buchanan, public choice theory, and libertarianism have taken issue with her scholarship and have patiently and carefully documented the places where she has made errors of fact or interpretation, or mangled and misused source materials and quotes. That is all that they have done.

None of this was coordinated nor was it part of a conspiracy from the Koch brothers. It was scholars doing what scholars do when they are confronted with bad scholarly work, especially when it touches on issues we know well.

None of these critics, and I am among them, have called for physical violence against her. None have contacted her employer. None have called her publisher or Amazon to have the book taken down. Contrary to her claim, the only silence in this whole episode is her own refusal to respond to legitimate scholarly criticism. We don’t want to silence her – we eagerly await her response.

So where is this language of “attack” coming from? Here is where I think the political right bears some responsibility for the current situation. And to the degree libertarians have cast their lot with “the right,” we are seen as guilty by association. Call it blowback if you will.

In the last year or two, progressive intellectuals and academics have been threatened with violence and had their employers contacted, not to mention threats made from politicians, on the basis of public statements they’ve made. Yes, some of those statements were deplorable, but that is no excuse for threatening people’s physical safety or their jobs. These are real attacks, not intellectual criticisms.

We should also not forget the anti-intellectual “Professor Watch List” put up by TurningPoint USA, which gave left-leaning faculty more reason to imagine coordinated and conspiratorial attacks.

And yes, none of this was done by conservative or libertarian intellectuals; it was done by activists associated with “the right,” and that is all that progressives need to find the intellectuals guilty by association.

It probably also matters, though less so, that many conservative and libertarian students have referred to themselves as “under attack” in college classrooms. In my 30 years of teaching experience, what they call “under attack” is far more often than not simply having their views strongly challenged and being expected to defend them. In other words, exactly what MacLean is experiencing.

This is not being “attacked.” It is what college classrooms and scholarly conversation are all about.

Unfortunately, the real attacks on left-wing faculty (and yes, there have been ones on right-wing ones too) have provided MacLean’s defenders with a convenient word to use to blur the difference between legitimate, but forceful, scholarly criticism, and threats of violence or silencing.

Always Take the High Road

Conservative critics of higher education should take this to heart. When you whip people into a frenzy over the crazy things that a small number of faculty say on Twitter, or because of legitimate concerns about the treatment of a small number of conservative speakers, the whipped up folks are going to do things you wish they wouldn’t. And that’s going to lead to blowback.

As a libertarian academic who frequently speaks at public events on other campuses, I do have low-level concerns about my safety. And if I were a progressive academic, I’d have similar fears given the way some of them have been treated, especially by politicians. Calling the intellectual criticisms of her book a coordinated conspiracy heads MacLean into Alex Jones territory, but given the current climate, it shouldn’t surprise us that she and her supporters feel “under attack.”

But notice the result: a book that smears libertarian and conservative ideas on the basis of shoddy scholarship gets attention because the author claims she’s under attack when she is called out in careful detail by other scholars. The real attacks on left-leaning faculty enable her to claim victimhood by association while using guilt by association to blame the conservative and libertarian intellectuals who are criticizing her work.

Once we head down the road, whether caused by the left, right, or libertarians, of turning intellectual disagreements into threats of violence, or threats to employment, or anything of that sort, the social losses are huge. Indeed, once both threats to people’s safety and employment and sharp intellectual disagreement become “attacks,” we will lose our ability to recognize the moral and intellectual difference between the two, and our disgust at the threats will weaken. And to the degree that the left largely dominates the intellectual world, conservatives and libertarians will be the biggest losers when academia turns into Fight Club.

So what to do? First, call off the dogs. Conservatives and libertarians need to consistently take the high road, as many of the intellectuals have tried to do in response to MacLean’s book. The hard part is getting right-wing media, both traditional and social media, to do the same. Those of us who care about intellectual standards have to publicly call out our own when they whip up anti-intellectual and anti-higher education frenzies.

Second, implore our left-wing friends of integrity to do the same. The most important thing that can happen to end this arms race is for scholars of integrity on the left to call out people like MacLean, both for their shoddy scholarship and their hyperbolic use of the language of conspiracy and attack. A strongly critical review of her book by a historian or economist of the center or left would go a long way to addressing the specific concerns it raises and could set a necessary example for others.

In the meantime, those of us critical of MacLean will continue to document her errors and press publicly for a response. And we’ll do so with the most proper of scholarly etiquette. I implore those sympathetic to our cause to be on their best behavior on social media as well. She and her supporters need no more ammunition.

Steven Horwitz is the Schnatter Distinguished Professor of Free Enterprise in the Department of Economics at Ball State University, where he also is a Fellow at the John H. Schnatter Institute for Entrepreneurship and Free Enterprise. He is the author of Hayek’s Modern Family: Classical Liberalism and the Evolution of Social Institutions and is a Distinguished Fellow at the Foundation for Economic Education (FEE) and a member of the FEE Faculty Network.

This article was published by The Foundation for Economic Education and may be freely distributed, subject to a Creative Commons Attribution 4.0 International License, which requires that credit be given to the author. Read the original article.

A Libertarian Defense of Tenure – Article by Aeon Skoble and Steven Horwitz


The New Renaissance Hat
Aeon Skoble and Steven Horwitz

Tenure protects the right to be unpopular

Libertarians are understandably frustrated with the state of higher education today. Libertarian ideas often do not get covered, or are covered unfairly. Faculty are overwhelmingly left-of-center, and government subsidies have driven up costs, leading to higher student debt.

These are legitimate concerns, of course. However, the solution to these problems is not to abolish the institution of tenure. Tenure is not anti-liberty, and it provides important protections for those who are libertarians (and conservatives) in academia. In addition, it has some efficiency properties that explain why it has survived and might well do so even in a world where the state had no role in higher ed.

There are many potential objections to tenure. For some, the idea that a tenured professor cannot be fired strikes them as a rejection of the free market. Others believe that tenure is a way of protecting leftist faculty, even if their ideas are wrong-headed, and students don’t wish to hear them. In that way, tenure is a kind of monopoly protection for bad ideas. Finally, people across the political spectrum believe that tenure creates so-called “deadwood” faculty who, once they are tenured, no longer have to care about their teaching or research.

First, let’s dispatch a common misconception: it is not true that tenured professors cannot be fired. Tenured professors can be fired for a variety of reasons. What tenure does is limit what counts as a valid reason for dismissal in order to protect academic freedom. A tenured professor can be fired if caught plagiarizing, found guilty of sexual or other forms of harassment, or convicted of a violent crime. But if she could be fired for writing an article the dean disapproves of, she could not perform her job. And that is where tenure comes in.

Understanding why tenure is a desirable institution requires us to remember the purpose of a university. Universities are, ideally, institutional arrangements that enable scholars to engage in the activities of seeking the truth and then sharing the fruits of our scholarship with students, other scholars, and perhaps the general public.

Essential to that project is that scholars are free to seek the truth as we see it, without interference by others who have different goals. Of course, scholars must play by some very simple rules of the game: do not lie or cheat; do not distort your data or the arguments of your sources; be transparent about conflicts of interest; do not engage in personal attacks or the use of force, among others.

If this sounds familiar, that’s because the search for truth is a discovery process analogous to the market. Just as entrepreneurs in a market require the freedom to discover value where their best judgment takes them, subject to rules against force and fraud, so do scholars in a university require the freedom to discover truth where our best judgment takes us.

Tenure protects scholars like us from interference with our attempts to discover truth. Scholars cannot engage in truth-seeking if we’re facing retaliation from people who don’t like where our research leads. A university cannot be a university without robust protection of the open exchange of ideas and the freedom of each scholar to research in his or her field without intimidation.

By ruling out the possibility of firing a professor simply for the content of her beliefs, tenure ensures that the university will be what Michael Polanyi called “a republic of science,” in which truth-seeking is the highest standard.

Skeptics might argue that even if tenure were abolished, faculty still wouldn’t leave their current jobs because they would find it difficult to get hired elsewhere. But that’s not the point. The point is that we cannot do our jobs without a credible guarantee of academic freedom, and tenure is one way to secure that.

Tenure protects academic freedom in three distinct ways. First, when we engage in research and publishing, we can’t be worried that some administrator, trustee, politician, or even a student activist will find our work offensive and retaliate against us. This will have a chilling effect on our ability to seek the truth, which is our job as college professors. There are numerous examples of libertarian and conservative faculty facing just these sorts of threats, and tenure is the primary reason those threats are empty.

Second, when we construct and teach our curricula, we can’t worry that any of the usual suspects will take offense, or try to substitute their judgment for ours. Finally, when participating in institutional decision making about academic matters, we can’t be afraid to call shenanigans on various administrator-driven fads (of which there are many) that would undermine our ability to engage in research and teaching.

Although we are open to alternative institutional processes if they could be shown to adequately protect academic freedom, abolishing tenure in their absence is a dicey proposition. Absent tenure, it is libertarians and conservatives who would be the first to be persecuted, censored, or silenced.

Politically correct ideas don’t need the protection of tenure because they are popular; tenure protects ideas that are not. Abolishing it would give still more power to the activists and administrators who seek to create an ideologically uniform academy.

Given those concerns, how big is the downside to tenure? If the complaint is that some faculty’s research productivity declines after tenure, then an easy fix is to tie merit raises to continued productivity. Nothing about the institution of tenure precludes post-tenure reviews and merit pay. And even if some faculty slack off as publishers, so what? As long as they’re good teachers, mentors, and colleagues, it’s not necessary that all college faculty be active publishers their whole careers.

Tenure offers a beneficial set of incentives for many universities. Faculty want the protections we have outlined above, and universities want to encourage faculty to develop university-specific human capital to better serve their educational vision and the type of students they attract. Faculty won’t necessarily make those specific investments if the time could instead be spent enhancing their publication record so as to make them more attractive on the job market.

Tenure is a commitment by the institution to maintain a faculty member’s employment in return for abiding by some basic rules and demonstrating during the tenure process that they have acquired that institution-specific human capital. The faculty member gets enhanced, but not total, job security, and the institution gets someone committed to its particular needs. In this way, tenure is like marriage: we bind ourselves to an arrangement with high exit costs in order to incentivize ourselves to commit to the relationship. Just as marriage is compatible with a free society, so is tenure.

There are many problems with contemporary higher education, but tenure isn’t one of them. Ending tenure would exacerbate many of those issues while also creating new ones. And in an institutional setting where the opponents of liberty hold most of the cards, getting rid of one of the most important institutions that protects dissent and the ability to seek the truth will only silence the friends of liberty.

Aeon J. Skoble is Professor of Philosophy at Bridgewater State University.

Steven Horwitz is the Charles A. Dana Professor of Economics at St. Lawrence University and the author of Hayek’s Modern Family: Classical Liberalism and the Evolution of Social Institutions. He is a member of the FEE Faculty Network.

This article was published by The Foundation for Economic Education and may be freely distributed, subject to a Creative Commons Attribution 4.0 International License, which requires that credit be given to the author.

Don’t Assume I’m Smarter Than My Contractor – Article by Kevin Currie-Knight


The New Renaissance Hat
Kevin Currie-Knight
September 11, 2015

“So, I figured I’d ask you,” said my contractor. “You’re a lot smarter than me and—”

That’s when I stopped him.

Tom knows I am a college professor, and he wanted to ask my advice on his daughter’s education. He’s an ex-Marine who never went to college. It makes sense to ask an educator for advice about education, but why does that make me smarter?

I thought about all the times I’ve asked Tom’s advice about the house we are renovating, and about all the times he answered with a tone that implied, “Well, obviously you should…”

“Tom,” I said, “I wouldn’t say I’m smarter than you. It depends on the topic.”

He smiled politely and moved on to his question.

But even if he dismissed my objection as perfunctory, I can’t let it go. Why does our culture trivialize nonacademic intelligence and knowledge?

I think the existing structure of schooling plays a big part.

Fantasy Football

Let me tell another story, this one from my days as a high school special educator. I was teaching a study-skills class to students with learning disabilities. Partly, this course provided students extra time on assignments for other classes. One day, I sent two students to the library to work on a written project assigned for another course. About 10 minutes later, I received a call from the school librarian.

“You should come up here and get these kids, because they are off task and disturbing others!”

When I got to the library, I didn’t want to confront my students immediately. I wanted to see how, exactly, they were being disruptive.

What were they doing? Adjusting their fantasy football rosters.

As anyone who’s really played fantasy football knows, adjusting your weekly roster involves contemplating a lot of statistics: What are this player’s chances against this team? How does this team do against this type of running back?

That’s what my students were doing in the library: arguing over statistics. Not bad for kids considered learning disabled in subjects like math.

Like a good teacher, I interrupted their passionate dispute and instructed them to come back to the room, where they could get going on the more important work of writing an academic paper.

Whether we mean to or not, we constantly reinforce the message that only the stuff kids are taught in school counts as serious learning. Extracurriculars are fine, but what really counts is in their textbooks and homework.

We send them to school precisely because we believe that’s where they’ll be taught the most important subjects. We grade them on those things, and in many ways we measure their worth (at least while they’re in school) by how well they do on tests and school assignments.

Deschooling America

I’m certainly not the first person to notice this. Education theorist John Holt wrote about it in his frankly titled essay “School Is Bad for Children”:

Oh, we make a lot of nice noises in school about respect for the child and individual differences, and the like. But our acts, as opposed to our talk, says to the child, “Your experience, your concerns, your curiosities, your needs, what you know, what you want, what you wonder about, what you hope for, what you fear, what you like and dislike, what you are good at or not so good at — all this is of not the slightest importance, it counts for nothing.”

Ivan Illich made a similar point in Deschooling Society. Illich suggests that schooling makes us dependent on institutions for learning by convincing us that what we learn in school is important and what we learn outside is not.

Likewise, in Shop Class as Soulcraft, philosopher and auto mechanic Matthew Crawford bemoans the dichotomy we set up in our schools and society between knowing and doing. Schools are increasingly cancelling programs like shop class to make way for more knowing and less doing. Crawford points out that this drastically underestimates the crucial role of thinking in manual labor.

If you are still in doubt, think about this: earlier, I talked about learning disabilities. According to the Diagnostic and Statistical Manual of Mental Disorders (DSM), learning disabilities can only exist in academic subjects like reading and math. If you are bad at playing music or drawing, you are not learning disabled — just bad at music or art.

There may be good reason we leave teaching biology to the schools and teaching how to care for a car to the home (or to “extracurricular” apprenticeships). There may be good reason we teach algebra in the schools but not the statistical analysis needed to adjust a fantasy football roster. But the standard segregation of subjects sends the message that what is learned in school must be more important. We send you to a special building to learn it, we grade you on your ability to learn it, and we socially judge much of your worth by your success at it.

Almost by reflex, we ask kids, “What did you learn in school today?,” not, “What did you learn today?” The existence of school has conditioned us to regard what happens there as important, while we relegate what happens outside of school to the dust heap of “extracurriculars.”

So, no, Tom, I am not smarter than you; we’re both pretty smart. It’s just that our school-influenced culture wrongly tells us that what I do is more cerebral and therefore requires more intelligence than what you do. And that’s a bad assumption.

Kevin Currie-Knight teaches in East Carolina University’s Department of Special Education, Foundations, and Research.

This article was originally published by The Foundation for Economic Education and may be freely distributed, subject to a Creative Commons Attribution 4.0 International License, which requires that credit be given to the author.

Maybe the Hardest Nut for a New Scientist to Crack: Finding a Job – Article by Bryan Gaensler


The New Renaissance Hat
Bryan Gaensler
March 29, 2015

The typical biography of a scientist might look something like this.

At a young age, a boy or girl discovers a love for science. Their dream is to become perhaps a geologist, a chemist, or a marine biologist.

At school they work hard at math and science, and they supplement this with everything else they can get their hands on: books, documentaries, public talks and visits to museums. They take all the right courses at college and then embark on a PhD in their chosen field.

After many years of hard effort (including chunks of time racked with doubt and frustration), they complete a solid body of work that contains some genuinely new discoveries. They’ve had the chance to meet some of the big names they read about as a kid, and now actually know some of them on a first-name basis.

The day a young graduate receives his or her science diploma is the most thrilling and satisfying day of their life. They are finally, officially, a scientist.

But there’s one thing that all those years of study and research have not prepared them for: the job market.

There must be a job out there somewhere. (Image: Michael Salerno, CC BY-NC-SA)

Pounding the pavement as a scientist

No matter what your profession, job hunting is not fun. But for scientists and other researchers, it’s a weird world of intense competition, painfully long time scales, and uncertain outcomes.

The strangest thing about a scientific career is that the application deadlines are often ridiculously early. Hoping to find a university position starting in September? If you wait until February or March to begin your job search, you’ve likely left it way too late. The application deadlines for some of the juiciest positions were way back in November and December.

Because of this advanced schedule, only the things that someone accomplishes a year or more before actually needing a new job will matter for their career prospects. Any amazing discoveries made after the application deadline are largely irrelevant.

The problem is that this is not always how science works.

For many important research topics, all the headline results emerge only at the very end. Students whose research is part of a massive longitudinal study or who are members of a big project team suddenly find themselves at a huge disadvantage, because they often can’t provide instant evidence of the quality of their work a whole year before needing a job.

The other daunting thing is the intensity of the competition. For most specialized scientific topics, there are far more PhD degrees than job postings: across all of science, doctoral degrees outnumber faculty positions by a ratio of 12 to one. An advertisement for a fellowship or junior faculty position will routinely draw hundreds of applications, and only 1%-2% of graduates will eventually land a coveted professorship.

How to proceed, when the odds are so stacked against you? Inevitably, the only way to counter the competition is to apply for lots of positions. A budding scientist is expected to apply for a dozen or more jobs, spread all over the world.

This situation immediately creates some challenges and problems.

By increasing the quantity of applications, the quality suffers. In an ideal world, an applicant will provide a carefully wrought narrative, weaving a story as to how their skills and background perfectly dovetail with the interest of the department they hope will hire them. But there’s no time for that. Instead one typically sends out a generic CV and research plan, and then essentially just hopes for the best.

The process is also incredibly inefficient. Professors all over write endless careful letters of recommendation, most of which have little bearing on the outcome. Selection panels spend hundreds of hours reading huge piles of applications, but can only afford a scant 10-15 minutes considering the merits of each candidate.

What’s more, not everyone can freely pursue jobs anywhere the market will take them. Young children, aging parents and other personal circumstances result in a large pool of outstanding scientists with strong geographic constraints, and hence limited options.

Overall, the harsh reality is that many applicants will simply not get any offers. A lifelong dream of being a scientist, combined with an advanced postgraduate degree, is tragically not a guarantee of a scientific career.

Good scientists should be able to find jobs

The frustration, disappointment and disillusionment grow every year. Things need to change.

First, employers need to make much more of an effort to tell applicants what sort of scientist they are looking for. Instead of reducing the job searching process to the scientific equivalent of speed dating, advertisements need to set out a clear and detailed set of selection criteria, with lots of context and background on the role and working environment. By properly telling the community what they’re looking for, labs and research institutes can focus their time on candidates with useful qualifications, and applicants can focus their energy on only those jobs for which they have a realistic chance.

Second, we need to create flexible career paths. Part-time positions, “two body” hires for couples with both members in academia, and accommodation of career interruptions need to become de rigueur, rather than whispered legends we’ve all only ever heard about second- or third-hand.

And finally, a specialist science degree needs to move beyond the expectation that it offers training only in one particular type of science.

A good scientist graduates with passion, vision, and brilliance, and also with persistence, organization, rigor, eloquence, and clarity. A scientist can incisively separate truth from falsehood, and can solve complicated problems with precious little starting information. These are highly desired attributes. The scientific community needs not just to accept but to celebrate the fact that the skills and values we cherish are paths to a wide range of stimulating and satisfying careers – both in and out of academia.

Bryan Gaensler is an award-winning astronomer and passionate science communicator, who is internationally recognised for his groundbreaking work on dying stars, interstellar magnets and cosmic explosions. A former Young Australian of the Year, NASA Hubble Fellow, Harvard professor and Australian Laureate Fellow, Gaensler is currently the Director of the Dunlap Institute for Astronomy and Astrophysics at the University of Toronto. He gave the 2001 Australia Day Address to the nation, was awarded the 2011 Pawsey Medal for outstanding research by a physicist aged under 40, and in 2013 was elected as a Fellow of the Australian Academy of Science. His best-selling popular science book Extreme Cosmos was published worldwide in 2012, and has subsequently been translated into four other languages.

This article is republished pursuant to a Creative Commons Attribution No-Derivatives license. It was originally published by The Conversation.

The Humility of Futurism – Article by Adam Alonzi


The New Renaissance Hat
Adam Alonzi
April 20, 2014

Civilization operates as if its troubles and their solutions will be as relevant tomorrow as they are today. Likely they were obsolete yesterday. How preposterous do the worries and aspirations of yesteryear seem now? What has not been refined since its conception? Our means of subsistence, entertainment, expression, and enlightenment continue to change, although, at least unconsciously, they are accepted as stable. Change, once gradual, now quickens exponentially. Countless professions have been created and destroyed by advances; old orders have been destroyed, and new ones have arisen; our world outlooks have been revolutionized by new discoveries over and over, although a sizable portion of the world is unwilling or unable to understand a man like Aubrey de Grey, and an equally sizable portion of the population is still struggling with Copernicus. A Futurist accepts himself and his ideas as incomplete; therefore, he actively works to improve upon them. Futurism is the first ideology that explicitly accepts the necessity and desirability of change.

It is a mistake to think we have reached the final stage of our journey. Plateaus are mirages conjured by the shortsighted; human evolution is a mountain without a peak. If a man has eyes, let him see all we have done and all we have yet to do. Let him gain the humility religion and liberalism have failed to inculcate into him and so many others. Each generation repeats this mistake. There is no evidence to suggest we are complete or are doomed now only to regress. Naysayers seem motivated to dismiss the triumphs of others out of fear they themselves will appear even less significant. Historically, the distant future has received little attention compared to such pressing questions as the number of angels on the head of a pin or the labor theory of value. This may be thanks to a fondness for the apocalyptic, a fascination which certainly has not faded with time, but it is also attributable to the egotistical need to stand out. All epochs are transitions. The advances of this decade have failed to restore popular faith in progress, yet the very word is misleading. Faith does not rest upon an empirical foundation. There are scores of popular beliefs founded upon little or no evidence. Yet the proof of progress is all around us. Death wishes and earth-annihilating misanthropy aside, we can trace the modern disdain for the march forward to the fashionable nonsense of academia.

Speculations and prophecies, even conservative estimates based on careful analysis, are treated with derision by the public. To say one has faith in technology is misleading. To compare the singularity to the rapture is like comparing planetary motion to Santa Claus. One is rooted in scripture, the other in observation. The doomsayers, secular and religious alike, enjoy forecasting our demise. The essential corruption critics charge Western civilization with is common to all; it is called human nature. It is meant to be transcended, not through critiques of immaterial “cultural entities,” but by improving our bodies and our minds through bioengineering. No belief is needed here. We do not rely upon an outworn holy book or the absurd dialectic of the Marxists. We change and adapt because we must. This is a point of pride, not one of shame. We do not worship the past; we have shrugged it off. Compared to the ridiculous claims circulating in the cesspool collectively referred to as “the humanities,” this is a sane position, yet it is treated with nothing but scorn by those who, wishing so ardently to distance themselves from Western civilization, bite the hands that feed, clothe, and shelter them. While they navigate by GPS, post their inane tangents on social media sites, and try with all their might to discredit the culture to which they owe their lives and livelihoods, others push forward. Self-proclaimed critics of Western civilization should consider trading their general practitioner for an Angolan witch doctor. It is hard to respect those who do not practice what they preach.

Postmodernism and cultural relativism, though they have pretensions of completeness and delusions of permanence, are but passing fads. Something devoid of usefulness or, for that matter, a coherent hypothesis, cannot last long when technology is generating so much benefit to so many people. A meme will continue to propagate itself long after it has served its purpose, to the detriment of competitors and to society at large. It is the duty of Futurists and Transhumanists to demolish the acceptability of rubbish in academia and in the media. The Luddites are more dangerous than the Creationists. Hubris is barely acceptable in the hard sciences, but in an absolutely unempirical discipline like philosophy, it is deplorable. Our first priority should not be political or religious; it should be scientific. To whom do we owe our prosperity, and to whom do we owe our future? To whom do we owe our lives and the lives of our children? How many of us would not be here today were it not for the men and women of modern medicine? This is not the end. Forget the weary and the overwhelmed; they are weak. Forget the ones who have no desire to climb higher; they are unfit. Cast aside the ones who pray fervently for the undoing of their own species; they are the most vile of all. This is not the end. This is our beginning.
Adam Alonzi is the author of Praying for Death and A Plank in Reason. He is also a futurist, inventor, DIY enthusiast, biotechnologist, programmer, molecular gastronomist, consummate dilettante and columnist at The Indian Economist. Read his blog Cool Flickers.
Help the next generation embrace a progress-filled vision of the future by supporting the illustrated children’s book Death is Wrong (free in Kindle format until April 22, 2014), and the campaign to distribute 1000 paperback copies to children, free of cost to them. The Indiegogo fundraising period ends on April 23, so please consider making a contribution today.

Iterative Learning versus the Student-Debt Trap – Video by G. Stolyarov II

Iterative Learning versus the Student-Debt Trap – Video by G. Stolyarov II

Mr. Stolyarov explains why the structure of formal schooling does not teach the ways in which real achievements are attained. The worst obstacle to true, iterative learning is student debt that locks people into a particular path for most of their lives.

– “Iterative Learning versus the Student-Debt Trap” – Essay by G. Stolyarov II – The Rational Argumentator. This essay was originally published as a guest post on the “Education Bubble and Scam Report” website.
– “Reasons Not to Pursue a PhD” – Video by G. Stolyarov II
– “Advice for Most Recent High-School and College Graduates” – Video by G. Stolyarov II
– “Commonly Misunderstood Concepts: Education” – Video by G. Stolyarov II

Iterative Learning versus the Student-Debt Trap – Article by G. Stolyarov II

Iterative Learning versus the Student-Debt Trap – Article by G. Stolyarov II

The New Renaissance Hat
G. Stolyarov II
December 18, 2012
This article was originally published as a guest post on the “Education Bubble and Scam Report” website.
Contemporary formal schooling inculcates a counterproductive and often stressful fallacy into millions of young people – particularly the best and brightest. The fallacy, which undermines the lives of many, is that, when it comes to learning, productivity, and achievement, you have to get it absolutely right the first time. Consider how grades are assigned in school. You complete an assignment or sit for a test – and if your work product is deficient in the teacher’s eyes, or you answer some questions incorrectly, your grade suffers. It does not matter if you learn from your mistakes afterward; the grade cannot be undone. The best you can do is hope that, on future assignments and tests, you do well enough that your average grade will remain sufficiently high. If it does not – if it takes you longer than usual to learn the material – then a poor grade will be a permanent blot on your academic record, if you care about such records. If you are below the age of majority and prohibited from owning substantial property or working for a living, grades may be a major measure of achievement in your eyes. Too many hits to your grades might discourage you or lead you to think that your future prospects are not as bright as you would wish.
But this is not how the real world works. This is not how learning works. This is not how great achievements are attained. It took me years to figure this out. I was one of those students who insisted on always attaining the highest grades in everything. I graduated first in my class in high school (while taking honors and Advanced Placement courses whenever they were offered) and second in college – with three majors. In high school especially, I sometimes found the grading criteria to be rather arbitrary and subjective, but I spent considerable time preparing my work and myself to meet them. While I did engage in prolific learning during my high-school years, the majority of that learning occurred outside the scope of my classes and was the result of self-study using books and the Internet. Unfortunately, my autonomous learning endeavors needed to be crammed into the precious little free time I had, because most of my time was occupied by attempting to conform my schoolwork to the demanding and often unforgiving expectations that needed to be met in order to earn the highest grades. I succeeded at that – but only through living by a regimen that would have been unsustainable in the long term: little sleep, little leisure, constant tension, and apprehension about the possibility of a single academic misstep. Yet now I realize that, whether I had succeeded or failed at the game of perfect grades, my post-academic achievements would have probably been unaffected.
How does real learning occur? It is not an all-or-nothing game. It is not about trying some task once and advancing if you succeed, or being shamed and despondent if you do not. Real learning is an iterative process. By a multitude of repetitions and attempts – each aiming to master the subject or make progress on a goal – one gradually learns what works and what does not, what is true and what is false. In many areas of life, the first principles are not immediately apparent or even known by anybody. The solution to a problem in those areas, instead of emerging by a straightforward (if sometimes time-consuming) deductive process from those first principles, can only be arrived at by induction, trial and error, and periodic adjustment to changing circumstances. Failure is an expected part of learning how to approach these areas, and no learning would occur in them if every failure were punished with either material deprivation or social condemnation.
Of course, not all failures are of the same sort. A failure to solve a math problem, while heavily penalized in school, is not at all detrimental in the real world. If you need to solve the problem, you just try, try again – as long as you recognize the difference between success and failure and have the free time and material comfort to make the attempts. On the other hand, a failure to yield to oncoming traffic when making a left turn could be irreversible and devastating. The key in approaching failure is to distinguish between safe failure and dangerous failure. A safe failure is one that allows numerous other iterations to get to the correct answer, behavior, or goal. A dangerous failure is one that closes doors, removes opportunities, and – worst of all – damages life. Learning occurs best when you can fail hundreds, even thousands, of times in rapid succession – at no harm or minimal harm to yourself and others. In such situations, failure is to be welcomed as a step along the way to success. On the other hand, if a failure can take away years of your life – either by shortening your life or wasting colossal amounts of time – then the very approach that might result in the failure should be avoided, unless there is no other way to achieve comparable goals. As a general principle, it is not the possibility of success or failure one should evaluate when choosing one’s pursuits, but rather the consequences of failure if it occurs.
Many contemporary societal institutions, unfortunately, are structured in a manner hostile to iterative learning. They rather encourage “all-in” investment into one or a few lines of endeavor – with uncertain success and devastating material and emotional consequences of failure. These institutions do not give second chances, except at considerable cost, and sometimes do not even give first chances because of protectionist barriers to entry. Higher education especially is pervaded by this problem.
At a cost of tens of thousands of dollars per year, college is an enormous bet. Many think that, by choosing the right major and the right courses of study within it, they could greatly increase their future earning potential. For some, this works out – though they are a diminishing fraction of college students. If a major turns out not to be remunerative, there may be some satisfaction from having learned the material, and this may be fine – as long as it is understood that this is a costly satisfaction indeed. Some will switch majors during their time in college, but this is often in itself an extremely expensive decision, as it prolongs the time over which one must pay tuition. For those who can afford either non-remunerative or serial college majors out of pocket, there is the opportunity cost of their time – but that is not the worst that can happen.
The worst fate certainly befalls those who finance their college education through student debt. This was a fate I happily avoided. I graduated college without having undertaken a penny of debt – ever – largely as a result of merit scholarships (and my choice of an institution that gave merit scholarships – a rarity these days). Millions of my contemporaries, however, are not so fortunate. For years thereafter, they will bear a recurring financial burden that will restrict their opportunities and push them along certain often stressful and unsustainable paths in life.
Student debt is the great disruptor of iterative learning. Such debt is assumed on the basis of the tremendously failure-prone expectation of a certain future monetary return capable of paying off the debt. Especially in post-2008 Western economies, this expectation is unfounded – no matter who one is or how knowledgeable, accomplished, or productive one might be. Well-paying jobs are hard to come by; well-paying jobs in one’s own field of study are even scarcer. The field narrows further when one considers that employment should not only be remunerative, but also accompanied by decent working conditions and compatible with a comfortable standard of living that reflects one’s values and goals.
Money is ultimately a means to life, not an end for its own sake. To pursue work that requires constant privation in other areas of life is not optimal, to say the least – but debt leaves one with no choice. There is no escape from student debt. Bankruptcy cannot annul it. One must keep paying it, to avoid being overwhelmed by the accumulated interest. Paying it off takes years for most, decades for some. By the time it is paid off (if it is), a lot of youth, energy, and vitality are lost. It follows some to the grave. If one pays it off as fast as possible, then one might still enjoy a sliver of that precious time window between formal education and senescence – but the intense rush and effort needed to achieve this goal limits one’s options for experimenting with how to solve problems, engage in creative achievement, and explore diverse avenues for material gain.
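The arithmetic behind interest that can overwhelm a borrower, and payoffs that take years or decades, can be sketched with a simple amortization loop. The balance, rate, and payment figures below are hypothetical illustrations, not numbers from the essay:

```python
# A minimal sketch of loan amortization, assuming monthly compounding
# and a fixed monthly payment. All figures are hypothetical examples.

def months_to_payoff(balance, annual_rate, payment):
    """Count the months until the balance reaches zero.

    Returns None if the payment never outpaces the accruing interest,
    i.e., the debt can never be retired at that payment level.
    """
    monthly_rate = annual_rate / 12
    if payment <= balance * monthly_rate:
        return None  # interest accrues at least as fast as it is paid down
    months = 0
    while balance > 0:
        # Accrue one month of interest, then apply the payment.
        balance = balance * (1 + monthly_rate) - payment
        months += 1
    return months

# A hypothetical $60,000 loan at 6% APR:
print(months_to_payoff(60_000, 0.06, 500))  # 184 months, over 15 years
print(months_to_payoff(60_000, 0.06, 700))  # 113 months, about 9.5 years
print(months_to_payoff(60_000, 0.06, 300))  # None: $300 only covers interest
```

Note how a payment that merely matches the monthly interest leaves the balance untouched forever, which is the "overwhelmed by the accumulated interest" scenario in numerical form.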
If you are in heavy debt, you take what income you can get, and you do not complain; you put all of your energy into one career path, one field, one narrow facet of existence – in the hope that the immediate returns are enough to get by and the long-term returns will be greater. If you wish to practice law or medicine, or obtain a PhD, your reliance on this mode of living and its hoped-for ultimate consequences is even greater. You may defer the payoff of the debt for a bit, but the ultimate burden will be even greater. Many lawyers do not start to have positive financial net worth until their thirties; many doctors do not reach this condition until their forties – and this is the reality for those who graduated before the financial crisis and its widespread unemployment fallout. The prospects of today’s young people are even dimmer, and perhaps the very expectation of long-term financial reward arising from educational debt (or any years-long expensive formal education) is no longer realistic. This mode of life is not only stressful and uncertain; it comes at the expense of family relationships, material comfort, leisure time, and experimentation with diverse income streams. Moreover, any serious illness, accident, or other life crisis can derail the expectation of a steady income and therefore render the debt a true destroyer of life. Failure is costly indeed on this conventional track of post-undergraduate formal schooling.
It may be difficult for many to understand that the conventionally perceived pathway to success is in fact one that exposes a person to the most dangerous sorts of failure. The best way forward is one of sustainable iterative work – a way that offers incremental benefits in the present without relying on huge payoffs in the future, all the while allowing enough time and comfort to experiment with life-improving possibilities at one’s discretion. Diversification is the natural companion of iteration. The more you try, the more you experiment, the more you learn and the more you can apply in a variety of contexts.
Having avoided the student-debt trap, I can personally attest to how liberating the experience of post-academic learning can be. Instead of pursuing graduate or professional school, I decided to take actuarial and other insurance-related examinations, where the cost of each exam is modest compared to a semester of college – and one can always try again if one fails. In the 3.5 years after graduating from college, I was able to obtain seven professional insurance designations, at a net profit to myself. I have ample time to try for more designations still. My employment offers me the opportunity to engage in creative work in a variety of capacities, and I focus on maximizing my rate of productivity on the job so as to achieve the benefits of iterative learning and avoid the stress of an accumulated workload. I could choose where I wanted to live, and had the resources to purchase a house with a sizable down payment. Other than a mortgage, which I am paying ahead of schedule, I have no debt of any sort. Even the mortgage makes me somewhat uncomfortable – hence my desire to pay it off as rapidly as possible – but every payment gets me closer to fully owning a large, tangible asset that I use every day. In the meantime, I already have a decent amount of time for leisure, exercise, independent study, intellectual activism, and family interactions.
My life, no doubt, has its own challenges and stresses; anyone’s situation could be better, and I can certainly conceive of improvements for my own – but I have the discretionary time needed to plan for and pursue such improvements. Moreover, the way of iterative learning is not fully realizable in all aspects of today’s world. Comparatively, I have fewer vulnerabilities than debt-ridden post-undergraduate students of my age, but I am not immune to the ubiquitous stressors of contemporary life. We continue to be surrounded by dangers and tasks where it is truly necessary not to fail the first time. As technology advances and we come to live in a safer, healthier world, the sources for life-threatening failure will diminish, and the realm of beneficial trial-and-error failure will broaden. The key in the meantime is to keep the failure points in one’s own life to a minimum. Yes, automobile accidents, crime, and serious illnesses always have a non-zero probability of damaging one’s life – but even that probability can be diminished through vigilance, care, and technology. To avoid introducing vulnerability into one’s life, one should always live within one’s present means – not expectations of future income – and leave oneself with a margin of time and flexibility for the achievement of any goal, financial or not. Productivity, efficiency, and skill are all welcome assets, if they are used to prevent, rather than invite, stress, anxiety, and physical discomfort.
Learning absolutely anything of interest and value is desirable, as long as the cost in time and money – including the opportunity cost – is known and can be absorbed using present resources. This principle applies to any kind of formal schooling – or to the purchase of cars, major articles of furniture, and electronic equipment. If you enjoy it, can afford it out of pocket, and can think of no better way to use your time and money – then by all means pursue it with a clear conscience. If you cannot afford it, or you need the money for something more important, then wait until you have the means, and find other ways to use and enjoy your time in the interim. With the Internet, it is possible to learn many skills and concepts at no monetary cost at all. It is also possible to pursue relatively low-cost professional designation programs in fields where sitting in a classroom is not a requirement for entry.
Remember that success is attained through many iterations of a variety of endeavors. Try to make each iteration as inexpensive as possible in terms of time and money. Except in times of acute crisis where there are no other options, avoid all forms of debt – with the possible exception of a mortgage, since it is preferable to the alternative of renting and giving all of the rent away to another party. Do not put all of your time and energy into a single field, a single path, a single expectation. You are a multifaceted human being, and your job in life is to develop a functional approach to the totality of existence – not just one sub-specialty therein. Remember, above all, never to lose your individuality, favored way of living, and constructive relationships with others in the pursuit of any educational or career path. You should be the master of your work and learning – not the other way around.