A
Hey, guys, welcome to the debrief episode after our episode with Byron Tau on total surveillance. Total surveillance is what we're calling this episode. Means of Control was his book. Fantastic book. Yeah. What were your takeaways, David?
B
The big takeaway is just the paper-thin wall, and I think this is happening in a variety of different sectors, with financial freedom, and here with privacy. Just the paper-thin wall between the United States being a Western, liberal-values, democratic nation state, fingers crossed, and an authoritarian country. Calling the United States an authoritarian country is wrong, but when there's only a paper-thin wall between the two, it's still an extremely relevant question. You could see it.
A
They just have so much surveillance power now that they've never had before. And we as citizens, insert your Western liberal democracy here or any country, I don't think other countries are that much better when it comes to these things, and some are certainly worse, what's stopping them from using this against us? Right? It's just a pretty paper-thin veneer of protection here. Byron seemed to have some faith in Congress to put things right from a privacy perspective. I personally have less faith that that's the case. I feel very much like the neoliberal political order here has, for the past 40 years, just said: no, in the interest of control, national security, getting the baddies, child pornography, insert whatever you want here, we're going to take away the people's privacy. Oh, and when the people find out about a program, we'll just shut that one down and create another one they don't find out about. Again, David, I keep going back to this. I feel like it's just an economic thing. What do states want? Seeing Like a State, that's a book you often cite: they want to serialize things. They want to control them. Now the cost to serialize and control and collect data has basically dropped toward zero. In a world of bits it's so cheap, why not? It doesn't cost anything to do this in the scope of things. So they're going to continue to encroach, and no one's really going to stop them, because the capitalist machine that we have is not going to stop them. The system wants it; the system isn't going to stop itself. So the only counterforce we have here is the people. And mostly, I think, people are just trying to live their lives and get by, and they're not woke to this whole privacy mess we're in.
B
Well, this is why encryption is, like, the silver bullet. Encryption is actually the answer to this thing. It is the answer for individual privacy. And it benefits from the same natural tailwinds as the cheapness of data. Right now it's basically a commodity to invade people's privacy.
A
It makes it expensive again.
B
It makes it expensive again. And, you know, something like ZK is expensive now. Encryption is expensive now. It's not a commodity yet; it's cumbersome and bad UX now. But just in the same way that the arc of the cheapness of data bends toward the surveillance state, the arc of encryption bends toward the privacy of the individual. And so this is something that keeps me optimistic: actually, we have the power, we have the tools, we have the technology. It's less mature, but it does shift the balance back toward the individual. But just watch the United States try to ban it, because they're not going to want it. And that's really going to be the fight. So at least we have this very strong ally in technology, where the same forces that used to be against us are now for us.
A
It's actually our only hope. And I wish broader society realized this. I didn't realize it until I got into crypto and understood what cryptography actually gives us and the value of encryption. And, you know, the only reason it's not a lot worse right now is that we have encryption, right? It's just, if you are the director of the FBI, you don't want Apple to have the ability to encrypt somebody's iPhone. Because in that hypothetical, and these are actual arguments that previous FBI directors have made, it's: what happens if we have a situation where hundreds of thousands of people could die, and the only way to prevent it is to unlock this individual's phone, this terrorist's phone? And you're telling me, Apple, and you encryption advocates, that your precious cryptography and privacy is worth that? No, we need the key. We need the back door. We need the ability to just unlock that phone. Right? And that is very much the vantage point of the people who are, quote unquote, defending us. But if you take that to its extreme, you get a total surveillance state. You get a place where civil liberties aren't enforced and protected, and where bad actors can take control. So it's just a clash of these things. But thank God we have the encryption that we have today, or we'd be completely screwed. And the answer to privacy problems is more encryption. I don't actually think Congress is gonna do shit for this. Or maybe a little. It'll be, I don't know, window-dressing type things, GDPR-type things. Things that don't really help.
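To make the point above concrete: a minimal sketch, assuming Python and the widely used third-party cryptography package (neither is mentioned in the episode, and the message contents are made up), of what it looks like for an individual to encrypt data so that anyone collecting it without the key only ever sees opaque bytes.

```python
# Minimal sketch, assuming the third-party "cryptography" package is installed.
# Fernet is the authenticated symmetric-encryption recipe bundled with it.
from cryptography.fernet import Fernet

# The key stays with the individual; it is never handed to the collector.
key = Fernet.generate_key()
cipher = Fernet(key)

# A hypothetical piece of sensitive data a broker might otherwise scoop up.
message = b"location: 40.7128,-74.0060"

token = cipher.encrypt(message)   # what an eavesdropper or broker would see
print(token)                      # opaque bytes, useless without the key

# Only the key holder can recover the plaintext.
assert cipher.decrypt(token) == message
```

The asymmetry the speakers are gesturing at shows up directly: encrypting takes a few lines and microseconds, while reading the ciphertext without the key is computationally infeasible, which is what makes bulk collection expensive again.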
B
Yeah. The only experience I have with GDPR is the fucking pop-ups that slow my navigation through the web. But the conclusion of this conversation is like, okay, sweet, say we win the war on encryption, which is not a given. That might actually be really hard, just because when you fight on the side of encryption, you actually do fight on the same side as terrorists and drug traffickers and human traffickers.
A
Sure.
B
Those people are on your side. At first, right? At first, yeah. You actually support the long tail of catastrophic events, right? And so the Daniel Schmachtenberger conversation here is: okay, we can stop the total surveillance, total control, authoritarian state. We can prevent that from happening, but you actually increase the long tail of individuals being able to create significant destruction events inside of society. And that's actually the side that encryption does help enable. So it's like, which side do you want? Do you want to give up your privacy to prevent the long tail of people from being able to synthesize a new Covid or build an atomic bomb in their garage once that technology becomes available?
A
Yeah.
B
Like, which side do you want to pick? And so this is the other side of the equation, which we have to understand: this is what we are creating here.
A
Sure. That's why I like, kind of, a balance of power, right? No side completely wins out, but we just have this balance of power. I think that in order to get back privacy by default, which we really just don't have... I mean, he said you sound like a crank, and these are Byron's words: you sound like a crank if you're advocating for these types of things now, whereas 40 years ago it was just a given, right? You didn't have to argue that cash should be private, because of course it should be private. We have just given up all of these things over the last 40 years or so. And I think it's because previous generations had real, actual fascist authoritarian regimes to deal with. You think about the 1950s and sixties, and what was impressed upon society was a fascist Nazi regime and communist authoritarian Russia. That's so distant to us now; it feels like theory to us. I guess what I'm saying is, in order to get where we probably should be by default, maybe something bad has to happen first. People don't appreciate privacy because it's a tail-risk type of thing. Until something bad happens, until you have some regime that is taking out political vendettas against certain groups, what's the worst case? If our governments are benevolent, you get convenience, and who cares if they know a little bit about your personal life? It doesn't matter. It only matters...
B
Same thing with the idea in the crypto space that users don't care about decentralization, so let's all go use the more centralized, more convenient, faster, better-UX blockchain. But then when the day comes that states want to literally control these systems, the more centralized, more convenient blockchain is going to fall victim to that.
A
That's right. So we're preppers, David. We're preppers.
B
That's ethereum, axies. We're crypto preppers.
A
Well, I mean, that's why teaching people how to use these tools, like encryption, is super important. Having people have the ability and the know-how to custody their own assets: super important, just in case something goes wrong. And so I guess you'll just be kind of on the fringe of society if you care about these things, until something bad happens. Does that sound too doomer?
B
Now we can grave dance in our bunker.
A
It's encrypted. No, I mean, I wish. Yeah, I think we've talked about this many times before, but what do you think about the idea that Bitcoin and Ethereum's original sin was just not being private?
B
Fine. I feel fine about it.
A
Yeah. I mean, privacy is a trade-off too, right? Because with the transparency you sort of get other things. But...
B
The tech wasn't even ready. It's not like it was a question of, do we want to make Bitcoin private or do we not? We actually did not have that choice. Satoshi did not have that option.
A
You know this about Chainalysis, where they came from? I actually didn't know anything about the founding team of Chainalysis, that they came from kind of government circles.
B
Yeah, right. They're not crypto natives. They were narcs before they started Chainalysis.
A
Yeah, I mean, they came from kind of... what's the term you used? Total surveillance, total control, total information awareness.
B
Yeah, right, right.
A
That whole paradigm.
B
Yeah, yeah.
A
It's just... I guess that's what I'm saying.
B
No crypto native is like, you know what startup I'm gonna make? One that collects all the data about crypto.
A
They're data brokers. They're crypto data brokers. And it's just a pretty powerful business. And I guess there's an argument that if you didn't have groups like that, and you did have privacy in crypto from day one, it would have looked even worse. Yeah, we would have been strangled in the crib, basically. And this is just the compromise we're having to make over time in order to not have nation states wipe this thing out, or at least attempt to, anyway. I don't know. Lots of things here. Yeah. So, did you learn anything? Like, I don't think I was aware of the level of cooperation between corporations and government.
B
I don't think we landed the point that it's not just that there are companies that are data brokers; there are also companies who data-broke on the side. I don't know if data-broke is a word, but there are some companies, like Chainalysis, where that's their business model. And then there are some companies who just have a bunch of data, like Facebook. Facebook is Facebook; it has Instagram, blah, blah. But it's also a data broker in the sense that it is also doing this thing. And so the penetration of this market is so deep in tech that it is all-encompassing. Everyone is a data broker, because everyone has data.
A
Yeah.
B
So it's not just like a sidecar. It's like, kind of the whole thing.
A
I think the other point that's interesting is that even though the US and Western countries didn't intend to achieve this outcome of, let's say, a totalitarian regime, an outcome of total control, collecting all of the information for control, all of these things, we have built a similar machine, haven't we? We've built a similar surveillance machine. We just kind of did it as a byproduct of...
B
We did it with capitalism.
A
Yeah.
B
We did it with freedom.
A
Yeah. And now we do have a little bit of a wall, but it's just a very small firewall between taking that same machine and turning it against the citizens. There we go. We'll have to end it there. Thanks, guys.
B
This has been the debrief. Bye.