Was the process of running this campaign what you expected? Were there any surprises for you?

My campaign has pretty much always been about spreading the message of transhumanism. It was never about winning, and I said that from the very beginning. What I didn't expect is that I would very quickly enter the top 10 candidates and stay there for almost the entire two years. Just three years ago I signed with one of the major websites that matches you with candidates. To think one of the largest websites would include me in the top eight was good news.

It's happened because Hillary and Trump, and the candidate before as well, were so unpopular. It's made a lot of the third-party activities much more visible than you ever would have [expected]. And I think we had a couple of big pushes. When Bernie Sanders quit, a huge amount of people said, "Wow, we're not going to the establishment," and then they started looking elsewhere. The dismay over the main candidates has certainly made this election season the perfect moment for a small political party like mine to move in and really generate coverage.

Was this a good election season to introduce transhumanism in general?

What's happening is that in the last two years, everyone has been hit in the face with how fast technology has been changing. And I think the best example right now is Uber. Everyone was so excited that Uber was creating millions of new jobs for people — then they announced driverless cars. And everyone realized, oh, wait a sec, this technology comes and goes, and then it takes jobs. I think what happened is a lot of people began to realize, wow, genetic editing, exoskeleton technology for the disabled, blind people seeing through robotic eyes, telepathy via brainwave headsets -- everything is starting to shift very quickly.

You've admitted you're mainly doing this to spread the word; it's not going to be a winning campaign. What do you see as the role of the campaign after the election?

We'd like to build out the party. I'd like to run again; there's no question I will. Whether I would do so for the transhumanist party is another matter. You know, without the funding, without the backbone, it's incredibly difficult. I think it's a great idea, but it really does take 20 years to establish a party. ... [The Libertarians] honestly have zero chance at winning, and they have a few people in little local offices. And that's 40 years later. So we're two years later. You can imagine, there are hundreds of political parties in America, and very few actually make a dent unless a billionaire comes in.

Did you try to raise donations?

We did, but we were basically unsuccessful at raising anything substantial. And the reason is that very wealthy people, some of whom support transhumanism, said, "We'd like to see you in 2020; let's see how you do this time." And they also said, "I would rather spend my dollars and have them go somewhere." And I understand that. I have friends at Google — I live in the Bay Area — I have friends at Apple and Singularity University. ... But nobody is really interested in the big splurge yet. And that's what it takes. It's going to take, at minimum, a few million dollars to change the party and change the nature of ballot access.

That's funny, though. We've seen Peter Thiel invest millions in Trump. I would think he's someone who supports the transhumanist ideals, too.

You know, we asked, and he didn't get back to us. I don't know what the agenda is. I'm surprised he did that, too. I know he's into life extension, and that's the main premise of my campaign, to get the government to put money into that field. However, I just think it's really a matter of establishing legitimacy. You start off small, and it's like a snowball. We're excited that two years later it's gone so far, but it could take 10 years to become competitive with even the Green Party. And it would take big funders to get on board.

So is the plan, if you were to run again, to join a more-established party?

I interviewed to be Gary Johnson's vice president. ... And I had a great interview with him. He had me over at his house overnight in New Mexico, and I had, like, a 20-hour interview. I would probably run under the Libertarian Party next time. The problem, though, is that I actually lean quite far left. When you actually look at who I am, I'm as much on the Democratic side as I am on the Libertarian. I am what they call a left-leaning Libertarian. So most Libertarians would say I'm not Libertarian enough. And a lot of Democrats would say, "You're not like us; you're not Bernie Sanders enough."

Has your message changed at all since the beginning?

The biggest one I tweaked is that I decided to incorporate a universal basic income into the platform after a lot of discussion. And what's happened, after just two years -- the amount of robots and automation taking jobs has increased dramatically. I took a campaign bus across the country, and one of the things that came out of those four months was spending a lot of time with truck drivers. Lo and behold, a driverless truck had just traveled across Europe. There's no question in my mind that within five years, assuming Congress allows it, there will be driverless vehicles on the road. And I don't think drivers will survive. So we decided to support a universal basic income, and we're really the first political entity to do that with any visibility.

Have you laid out how you can make that possible? I know you want to lower taxes in general, but you need more taxes to do this.

I think, up front, you'd certainly need more. We have three policies at this point, and, generally, the 1 percent would have to pick up the tab. ... Take the idea of truck drivers. Most of them are older males, most are gun-carrying, hard men; you can't take away their jobs and say there's nothing else for them to do. And you can't just retrain them — in another five years, the robots are going to take those jobs, too. So these are 3 [million] to 4 million men out there who are essentially going to cause civil strife if they don't get something back. And they don't want to be on welfare; they're kind of a proud people.

The rich people don't want that revolt to happen. The best thing for the rich people to have is [a] society [that] functions smoothly -- science and technology grows, the economy grows. That can happen by keeping civil strife down. So there will be higher taxes on the rich. The second policy -- definitely the manufacturers, at some point, must be partially responsible. If Google creates AI, at some point Google has to say, "Well, we've replaced 30 million jobs with our machines -- what have we done for society?" I think at some point that has to be addressed.

But the biggest thing that I would do, and this is quite controversial: The federal government owns a huge amount of resources. Half of the 11 Western states are owned by the federal government. Because I believe, as a transhumanist, we'll be upgrading at some point into machines or into cyborgs, using fewer natural resources, I have suggested we either loan or sell off large chunks of that federal land. I don't know if in a hundred years America is going to exist. What if we go through the Singularity ...

Whatever happens, we have trillions of dollars of untapped wealth. We could buy 15 or 20 years of the universal basic income off of those kinds of things alone. Now, I know the environmentalists will just hate that, but at the same time, there's so much untapped wealth there. And it would certainly be a good way to feed people and to give them housing and education, rather than let that land be there for a species that I believe is going to be fundamentally changed in a hundred years.

It also seems like the main thing you're promoting is life extension, but that leads to overpopulation. Maybe we will need that space if you want your main goal to come true.

Yes, this is the No. 1 issue I get with life extension. ... The thing is, as countries become wealthier, population growth definitely slows. Populations stabilize, too. There's a good chance, as the world develops, we'll probably stabilize around 15 billion. The thing is, you have to look at what future technologies are going to bring. There's a very good chance that genetic editing will let us grow food five times quicker, maybe 10 times quicker, and regrow rainforests to take care of the environment.

My environmental policy is firm. I believe humans have destroyed the planet. I did a lot of work for National Geographic covering these things. But I don't believe that the best way [forward] is to lessen our carbon footprint. I think the best way is to spend much more money on aggressive green technologies. ...

A third of the arable land on Earth goes toward grazing. Well, if we have meatless meat made in a factory, we won't need all that. ... With overpopulation -- we'll be able to feed many more people. I think the world can handle 15 billion people, especially as people migrate more to cities. So the point is, we probably won't need as much federal land. And a lot of that land is in places where people wouldn't want to work, anyway. We're talking mineral rights, just sheer mineral rights, worth trillions of dollars.

I'm making a long bet that the human being won't remain human, that machines will use fewer resources, and that our planet will be able to be pristine again. We won't be based on what we're doing now, things like agriculture and expanding and destroying the planet. Hopefully one day we'll have a lot of technology to make it a better place. Also, there's the whole Star Trek thing. At some point, maybe it'll be very interesting to get people off the planet.

You've mentioned on your website that you're proudly an atheist, and that's unique for a presidential candidate. But the way I'm reading it, transhumanism right now also seems faith-based. You're taking it on faith that technology will upgrade us and things will change in 100 years. How do you view your faith versus traditional religious faith?

You're right. Everyone always says, "Well, the Singularity seems no different than a religious experience." I can't deny that's true, because the Singularity is this concept that's beyond human understanding. With my idea of transhumanism, I try to stay in the next 10 to 15 years of what might happen. We'll have robotic arms that are better than human arms, and should we electively get them? If I can give you a robotic eye that can see gases and germs or stream media live, those are the things I really advocate for.

When I drove my bus, everyone was saying, "Oh, are you some kind of Christian?" And I said no, we're actually a science- and technology-based movement. But I think it's just a matter of getting the word out and convincing people. People hear the word [transhumanism], and instead of associating it with secularism, science and technology, they say, "Oh, that's just a movement." Sometimes when we hear "Greenpeace," we automatically think of activists who are over the top. And that's our impression of it.

We're trying to make sure that doesn't happen, so that the word and the movement are seen as something that's just a science thing.

But it sounds like, personally, you believe in 100 years things could radically change. We'll be uploading consciousness to machines and things like that. That sounds like an afterlife.

That's where my atheism falls apart. I say I'm an atheist in my campaign, but I'm actually a theodicist. What I believe is that there are a trillion galaxies out there. There's almost certainly aliens and artificial intelligences. I did my senior thesis in college [on] brains in a vat.

I'm a big believer in the idea we live in a holographic universe ... I think it's true. I can't prove it, but I have good arguments. And it would completely throw out this theist [mentality]. When I say I'm an atheist, what I'm really saying is I'm against fundamentalist conservatism that says Jesus died for me. I was raised Catholic, so I have my little battles to fight.

When you're bringing up the idea of gene editing, I think there are valid reasons to consider a moratorium. Scientists might need to take a step back when we reach certain milestones and really think about the ethical and moral implications of something. That could be something that's not tied to religion. Do you think we need to take that step and really think about things? Or should we constantly be moving forward?

You're 100 percent right. The problem, though, is that when we don't have something like my campaign or the Transhumanist Party there, the moratorium is so one-sided. It's like how it's important for me to run as an atheist, just to make a stand against what I find oppressive: the fact that virtually our entire government believes in this fundamentalist idea.

Or they say they do. It's a convenient narrative, like the one you're attempting.

I hope so. And if that's the case, we have much less to worry about. But I think it's important to sometimes set up a wall just to make sure the balance of the universe continues. That way you have a much more democratic picture. And I do a lot of aggressive activism that I really don't see as completely philosophically valuable, but I see it as necessary.

If I had a choice, I would say, "Well, you're right, a moratorium is fine, let's consider it." The problem is that some of the people calling for it are so hellbent on keeping their conservatism in the picture that I'm afraid they're going to do what George W. Bush did with stem cells, which is shut it down for seven years. I want to say, "No, wait a sec. If we have someone like Obama who wants to give it some funding, that's a good way to move forward with science." My worst fear is if someone like Ted Cruz got into office, and all of a sudden this entire gene-editing revolution occurred during his term, and he freaks out and says, "We don't want to be gods." That would be tragic for science, and in particular American science, because places like China and India are going to take off with it.

It's important to offer resistance sometimes just for resistance's sake.

Would you say you have more faith in technology than people?

I actually have more faith in technology. That said, I really believe technology is neutral. It's really what people make of it. But for me, technology is the offshoot of the most complex and intellectual side of ourselves. And I think it's very important to try to understand that technology in the future, especially when we create consciousness, might be the better version of ourselves. We should consider that technology and what we create in artificial intelligences could be better than biology.

I'm very much into merging with that. I would like to be a complete digital consciousness. I think that would be much more complex than this 3-pound bag of meat we have right now. But I realize that's [fraught] with danger as well. I don't want an AI that's smarter than me on Earth, unless I'm part of it or I'm one with it.

But if it's truly AI, we wouldn't really have a choice. If it's smart enough to know our biological flaws are too inefficient, it would probably rather be its own thing.

Technically, yes, but I'm hoping we'll have a method to remain an integral part of that. I'm not sure how that technology or science will work yet. I'm assuming if we ever flip the "On" switch for AI, we should have the 100 best people in the world connected to it so that we have some ability to control it. But that's impossible to know.

That's where all the fears come from. If AI were to actually happen, we wouldn't have any control over it, and it would be omniscient in a way.

That's why I'd never endorse turning it on. I think I'd rather find the perfect transhumanism world through a mixture of cybernetics, machine parts, these kinds of things, and skip the AI. It just might be too dangerous to do, at least until we have better knowledge.

But that's the thing that's definitely going to happen. Maybe even more so than uploading our consciousness.

It's very scary. We still have a 50/50 chance it'll be beneficial.

A lot of what you're talking about sounds great for the more affluent folks, but what will transhumanism do to equalize things in society? We still have issues with things like poverty and discrimination -- those seem like more legitimate concerns.

I worry about that, too. One of the things that changed in my campaign was when I started, I sort of took on a Libertarian perspective, and I started pushing left until I really became hard left, really embracing many of Bernie Sanders' ideas. I said, "You know what? I don't really want to be the guy that's responsible for creating a dystopia." And that's where it can go if we let the 1 percent get all these technologies and nothing else is affordable. The artificial heart that they have in France now is a good example. It's $200,000 -- they're still experimenting with it. But nobody can afford that, so is it only the 1 percent that doesn't have to worry about heart disease?

This is where a universal basic income swallows health care. We delivered the Transhumanist Bill of Rights. It decreed that aging is a disease and that everybody has a universal right to overcome aging and suffering through technology, if they want. And it's the government's job to provide that to society. And that's really when I lost a lot of my Libertarian followers.

I can't in good faith remain someone who's trying to bring about all this technology knowing a huge number of people aren't going to get it. And we must insist they get it. There must be universal rights set up with the UN, just like with education, that insist all transhumanist technology is distributed freely.

This interview has been edited and condensed.