1 00:00:00,000 --> 00:00:04,000 Okay, let's unpack this. We are diving into a topic that, well, it really hits 2 00:00:04,000 --> 00:00:04,960 close to home 3 00:00:04,960 --> 00:00:09,280 for anyone concerned about digital privacy. Secure communication is fundamental. 4 00:00:09,280 --> 00:00:15,600 Exactly. And we've got sources talking about Tox. It claims to be a truly 5 00:00:15,600 --> 00:00:17,280 surveillance 6 00:00:17,280 --> 00:00:21,690 resistant way to message people. So if you've ever worried about those big apps 7 00:00:21,690 --> 00:00:22,400 maybe listening in 8 00:00:22,400 --> 00:00:27,200 or logging your chats, this deep dive is definitely for you. It is a critical 9 00:00:27,200 --> 00:00:28,240 subject, yeah. 10 00:00:28,240 --> 00:00:33,350 And our mission today really is to give you a clear way into understanding this 11 00:00:33,350 --> 00:00:34,000 technology. 12 00:00:34,000 --> 00:00:34,960 Make it accessible. 13 00:00:34,960 --> 00:00:39,040 Right. We want to simplify the core ideas of Tox, like its decentralized nature, 14 00:00:39,040 --> 00:00:41,520 how it's built differently, and its security model. 15 00:00:41,520 --> 00:00:42,800 So you don't just get what it is. 16 00:00:42,800 --> 00:00:47,600 But why it matters, why it represents maybe a necessary shift away from platforms 17 00:00:47,600 --> 00:00:48,240 that track so 18 00:00:48,240 --> 00:00:53,120 much. Okay, perfect. Now, before we jump into all that cryptographic freedom stuff, 19 00:00:53,120 --> 00:00:53,440 just a quick 20 00:00:53,440 --> 00:00:57,720 word about the supporter of this deep dive. Sure. This deep dive is brought to you 21 00:00:57,720 --> 00:00:58,560 by Safe Server. 22 00:00:58,560 --> 00:01:02,800 Safe Server is dedicated to hosting software and supporting you through your 23 00:01:02,800 --> 00:01:04,000 digital transformation 24 00:01:04,000 --> 00:01:07,850 journey. So they help make sure the infrastructure is there for projects like this.
25 00:01:07,850 --> 00:01:08,800 Exactly. You can 26 00:01:08,800 --> 00:01:13,940 find out more over at www.safeserver.de. Good to know. So here's where it gets 27 00:01:13,940 --> 00:01:14,560 interesting, 28 00:01:14,560 --> 00:01:19,280 the whole philosophy behind Tox. Our sources frame it as a direct response, 29 00:01:19,280 --> 00:01:20,240 almost a rebellion 30 00:01:20,800 --> 00:01:26,220 against digital surveillance. That makes sense. People were and are really fed up. 31 00:01:26,220 --> 00:01:26,880 Fed up with 32 00:01:26,880 --> 00:01:33,200 existing options that, well, the sources put it bluntly, they spy on us, track us, 33 00:01:33,200 --> 00:01:34,160 and censor us. 34 00:01:34,160 --> 00:01:37,770 And that underlying frustration, that's really the engine driving Tox's 35 00:01:37,770 --> 00:01:39,120 development. Right. Whether 36 00:01:39,120 --> 00:01:43,220 it's corporations wanting your data for ads or governments collecting logs. The 37 00:01:43,220 --> 00:01:43,600 problem is 38 00:01:43,600 --> 00:01:48,100 widespread, yeah. And Tox promises to be this immediate, kind of easy-to-use 39 00:01:48,100 --> 00:01:49,360 countermeasure. 40 00:01:49,360 --> 00:01:53,520 And the core pitch is powerful, isn't it? Software that connects you with friends 41 00:01:53,520 --> 00:01:53,680 and 42 00:01:53,680 --> 00:01:57,040 family without anyone else listening in. That's the dream for many. 43 00:01:57,040 --> 00:02:01,480 Now, what makes their approach, their philosophy, different from maybe other apps 44 00:02:01,480 --> 00:02:02,240 that claim to be 45 00:02:02,240 --> 00:02:07,280 secure? It seems like it's rooted in this idea of freedom, not just about cost. 46 00:02:07,280 --> 00:02:11,840 Exactly. And we really need to clarify what free means in the context of Tox. It's 47 00:02:11,840 --> 00:02:13,600 free software. Meaning? 48 00:02:13,600 --> 00:02:18,710 Meaning free, as in freedom.
You know, the freedom to use the software, look at the 49 00:02:18,710 --> 00:02:19,600 source code, 50 00:02:19,600 --> 00:02:20,960 modify it, share it. 51 00:02:20,960 --> 00:02:22,160 Okay, so transparency. 52 00:02:22,160 --> 00:02:26,000 Total transparency. And yes, it's also free in price, no charge. 53 00:02:26,000 --> 00:02:30,560 Which reinforces that idea you mentioned. It's made by the users for the users. 54 00:02:30,560 --> 00:02:34,960 That's the claim. The sources are quite clear. No corporate interests and no hidden 55 00:02:34,960 --> 00:02:35,840 agendas. 56 00:02:35,840 --> 00:02:38,240 It's built to be simple and secure messaging. 57 00:02:38,240 --> 00:02:42,000 And that open source aspect, that transparency, that's pretty key in the security 58 00:02:42,000 --> 00:02:42,640 world, isn't it? 59 00:02:42,640 --> 00:02:48,480 Oh, absolutely. It means anyone, any expert, any curious user can examine the code. 60 00:02:48,480 --> 00:02:52,320 Making it theoretically harder to hide back doors or sneaky tracking stuff. 61 00:02:52,320 --> 00:02:56,560 Precisely. The community, in theory, acts as a kind of constant auditor. 62 00:02:56,560 --> 00:02:59,680 If the code's visible, flaws are hopefully found faster. 63 00:02:59,680 --> 00:03:03,760 Okay, now let's talk practicality. A privacy app is great in theory, 64 00:03:03,760 --> 00:03:06,880 but it's only useful if people actually, well, use it. 65 00:03:06,880 --> 00:03:08,080 Right, it needs to compete. 66 00:03:08,080 --> 00:03:12,120 And that means it has to offer features people expect from their regular chat 67 00:03:12,120 --> 00:03:12,960 platforms. 68 00:03:12,960 --> 00:03:15,680 It's not just about secure texts, is it? 69 00:03:15,680 --> 00:03:20,240 No, not at all. The sources show it's aiming to be a full communication suite. 70 00:03:20,240 --> 00:03:22,400 Trying to be a viable replacement for the big names. 71 00:03:22,400 --> 00:03:24,880 That seems to be the goal. 
You get your instant messaging, 72 00:03:24,880 --> 00:03:26,480 obviously secure and encrypted. 73 00:03:26,480 --> 00:03:27,200 And it's stuff? 74 00:03:27,200 --> 00:03:30,080 But also completely free and encrypted voice calls. 75 00:03:30,080 --> 00:03:30,800 Okay. 76 00:03:30,800 --> 00:03:35,560 And importantly, secure video calls, you know, for actually seeing people face to 77 00:03:35,560 --> 00:03:37,040 face, but privately. 78 00:03:37,040 --> 00:03:40,870 Right. I noticed the features list in the sources goes a bit beyond just chat 79 00:03:40,870 --> 00:03:41,440 though. 80 00:03:41,440 --> 00:03:42,160 Screen sharing. 81 00:03:42,160 --> 00:03:43,760 Yeah, screen sharing is in there. 82 00:03:43,760 --> 00:03:48,640 Securely share your desktop, maybe for a collaboration or helping someone out. 83 00:03:48,640 --> 00:03:49,920 And this next one caught my eye. 84 00:03:49,920 --> 00:03:54,160 File sharing. The sources say no artificial limits or caps. 85 00:03:54,160 --> 00:03:58,480 Now, wait a minute. That sounds huge. 86 00:03:58,480 --> 00:04:02,560 How can they promise no limits if there are no central servers managing things? 87 00:04:02,560 --> 00:04:06,160 Don't regular services cap file sizes because of server costs? 88 00:04:06,160 --> 00:04:10,560 They absolutely do. Server storage and bandwidth cost money, so caps are normal. 89 00:04:10,560 --> 00:04:12,400 So how does Tox bypass that? 90 00:04:12,400 --> 00:04:16,160 Well, this is where that peer-to-peer, that distributed architecture, 91 00:04:16,160 --> 00:04:17,760 really starts to show its strength. 92 00:04:17,760 --> 00:04:18,560 Oh, okay. 93 00:04:18,560 --> 00:04:22,640 When you share a file using Tox, you're not uploading it to a central server first. 94 00:04:22,640 --> 00:04:24,800 You're sending it directly to your friend's device. 95 00:04:24,800 --> 00:04:26,480 Peer-to-peer, peer-to-peer. 96 00:04:26,480 --> 00:04:31,200 Exactly. 
The transfer relies only on the internet connection between the two of you. 97 00:04:31,200 --> 00:04:35,760 So the only real limit becomes your own upload or download speed, 98 00:04:35,760 --> 00:04:39,520 not some arbitrary corporate limit designed to save them money. 99 00:04:39,520 --> 00:04:43,760 Wow. So the lack of a central server actually becomes a feature for big file 100 00:04:43,760 --> 00:04:44,480 transfers. 101 00:04:44,480 --> 00:04:48,720 In this case, yes. It also enables secure group chats, naturally. 102 00:04:48,720 --> 00:04:53,990 Right. For sharing messages, calls, video, even those potentially large files with 103 00:04:53,990 --> 00:04:54,800 a whole group. 104 00:04:54,800 --> 00:04:56,800 Correct. It's ambitious, like you said. 105 00:04:56,800 --> 00:05:00,810 It definitely establishes that the goal isn't just some niche, super secure tool 106 00:05:00,810 --> 00:05:01,840 for experts. 107 00:05:01,840 --> 00:05:06,560 No, it's aiming for a comprehensive, viable, surveillance-free replacement for 108 00:05:06,560 --> 00:05:08,240 everyday communication. 109 00:05:08,240 --> 00:05:11,520 All right. Let's get into the real deep dive part now. 110 00:05:11,520 --> 00:05:15,430 The technical magic, as the outline called it. How does it actually stop people 111 00:05:15,430 --> 00:05:16,480 from listening in? 112 00:05:16,480 --> 00:05:20,480 Okay. So it really boils down to two main pillars, technically speaking. 113 00:05:20,480 --> 00:05:21,120 Which are? 114 00:05:21,120 --> 00:05:23,040 Encryption and distribution. 115 00:05:23,040 --> 00:05:26,320 Okay. Let's take encryption first. How does that work? What makes it secure? 116 00:05:26,320 --> 00:05:32,240 So the foundational security, the scrambling of the messages, it's built using well-known, 117 00:05:32,240 --> 00:05:34,400 trusted, open-source libraries. 118 00:05:34,400 --> 00:05:36,720 Libraries. Like collections of code. 119 00:05:36,720 --> 00:05:40,000 Exactly. 
Code that handles the complex math of encryption. 120 00:05:40,000 --> 00:05:44,320 Specifically, the core uses something called libsodium. 121 00:05:44,320 --> 00:05:49,280 Libsodium. Why should, say, a beginner listening care about that specific name? 122 00:05:49,280 --> 00:05:55,160 Think of libsodium like a really high-quality, tested engine in a secure car. It's 123 00:05:55,160 --> 00:05:55,600 based on 124 00:05:55,600 --> 00:06:01,080 another respected system called NaCl, and it's known for being modern, fast, and 125 00:06:01,080 --> 00:06:02,400 very hard to break. 126 00:06:02,400 --> 00:06:04,720 So it's not some homemade encryption they cooked up themselves. 127 00:06:04,720 --> 00:06:08,240 No, no. It uses industry-standard, vetted cryptography. 128 00:06:08,240 --> 00:06:11,920 This is critical because it underpins that central promise we talked about. 129 00:06:11,920 --> 00:06:14,880 The only people who can see your conversations are the people you're talking with. 130 00:06:14,880 --> 00:06:19,510 Precisely. That's what strong end-to-end encryption powered by something like libsodium 131 00:06:19,510 --> 00:06:20,320 provides. 132 00:06:20,320 --> 00:06:24,880 Okay, so encryption protects the message. But what about the system itself? 133 00:06:25,520 --> 00:06:28,560 Protecting the network from being shut down or monitored. 134 00:06:28,560 --> 00:06:31,600 This feels like the big aha moment. 135 00:06:31,600 --> 00:06:34,880 Right. And that comes down to the second pillar, distribution. 136 00:06:34,880 --> 00:06:38,000 Meaning no central point. 137 00:06:38,000 --> 00:06:42,400 Exactly. Tox has no central servers. Think about the difference. 138 00:06:42,400 --> 00:06:47,200 Imagine trying to take down an old-style ham radio network where everyone connects 139 00:06:47,200 --> 00:06:48,080 directly 140 00:06:48,080 --> 00:06:50,880 versus trying to shut down a modern cell phone tower.
141 00:06:50,880 --> 00:06:54,960 The tower is one big target. Easy to find, easy to control. 142 00:06:54,960 --> 00:06:57,760 Right. But a distributed peer-to-peer network. 143 00:06:57,760 --> 00:07:01,200 The network is just the users connected to each other. There's no single hub. 144 00:07:01,200 --> 00:07:03,600 So that eliminates that single point of failure. 145 00:07:03,600 --> 00:07:07,200 Servers can be raided by authorities, right? They can be shut down legally or 146 00:07:07,200 --> 00:07:07,920 technically. 147 00:07:07,920 --> 00:07:12,160 Or a company or government can force them to hand over user data, logs. 148 00:07:12,160 --> 00:07:16,000 All of that becomes much, much harder if there's no central server to target. 149 00:07:16,000 --> 00:07:17,920 Where do you send the subpoena? Who do you raid? 150 00:07:17,920 --> 00:07:20,720 The whole system becomes way more resilient to that kind of pressure. 151 00:07:20,720 --> 00:07:25,120 It does. And there's that bonus practical benefit. Remember the server outages on 152 00:07:25,120 --> 00:07:25,760 big platforms? 153 00:07:25,760 --> 00:07:26,800 Oh, yeah. 154 00:07:26,800 --> 00:07:29,360 Well, if the network is simply made up of its users, 155 00:07:29,360 --> 00:07:35,040 it's much less likely to have a single massive outage. It's inherently more robust. 156 00:07:35,040 --> 00:07:39,690 OK, but hang on. If there are no central servers, how do I even find my friends 157 00:07:39,690 --> 00:07:40,240 online? 158 00:07:40,240 --> 00:07:44,560 Don't you need some kind of central directory like a phone book to connect? 159 00:07:44,560 --> 00:07:48,480 Oh, that is the classic challenge for any peer-to-peer network. 160 00:07:48,480 --> 00:07:50,880 It's a real technical hurdle. 161 00:07:50,880 --> 00:07:52,480 So how does Tox solve it? 162 00:07:52,480 --> 00:07:56,720 It uses a few techniques. The main one involves your unique Tox ID.
163 00:07:56,720 --> 00:07:59,200 It's a long string of characters, like a public key. 164 00:07:59,200 --> 00:08:02,080 OK, so I share my ID with my friend. They share theirs with me. 165 00:08:02,080 --> 00:08:05,440 Right. And then your Tox client uses the network itself, 166 00:08:05,440 --> 00:08:09,280 specifically something called a distributed hash table or DHT. 167 00:08:09,280 --> 00:08:11,360 DHT sounds complicated. 168 00:08:11,360 --> 00:08:15,380 Think of it like a decentralized address book spread across many users on the 169 00:08:15,380 --> 00:08:16,080 network. 170 00:08:16,080 --> 00:08:20,210 Your client uses the DHT, plus maybe a little help from some initial bootstrap 171 00:08:20,210 --> 00:08:20,640 nodes, 172 00:08:20,640 --> 00:08:24,000 publicly known starting points to find out your friend's current IP address. 173 00:08:24,000 --> 00:08:27,680 So those bootstrap nodes give it a starting nudge, help find the path. 174 00:08:27,680 --> 00:08:30,640 Exactly. They help initiate the connection. 175 00:08:30,640 --> 00:08:33,600 But once that connection is made, the actual communication, 176 00:08:33,600 --> 00:08:38,300 your messages, calls, files, flows directly peer to peer between you and your 177 00:08:38,300 --> 00:08:38,720 friend. 178 00:08:38,720 --> 00:08:40,640 It doesn't route through some central hub. 179 00:08:40,640 --> 00:08:44,730 Correct. Only the initial finding each other part gets a little help from those 180 00:08:44,730 --> 00:08:45,680 public nodes. 181 00:08:45,680 --> 00:08:47,760 The ongoing conversation is direct. 182 00:08:47,760 --> 00:08:50,560 And that radical distribution is really what sets it apart, 183 00:08:50,560 --> 00:08:54,400 even from other apps that might use encryption but still rely on central servers. 184 00:08:54,400 --> 00:08:55,760 That's the core difference, yes. 185 00:08:55,760 --> 00:09:01,920 Okay. So the goals are ambitious, true freedom, security through decentralization. 
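To make the friend-finding idea above a bit more concrete, here is a toy sketch in Python. It is an illustration only, not Tox's actual implementation: the ID layout (a 32-byte public key, a 4-byte anti-spam value, and a 2-byte XOR checksum, printed as 76 hex characters) follows public descriptions of the Tox ID format, and the XOR "distance" is the Kademlia-style metric that distributed hash tables generally use to route a lookup toward the node holding a target key.

```python
import hashlib
import os

# Hypothetical sketch of a Tox-style ID: 32-byte public key + 4-byte
# "nospam" value + 2-byte checksum, rendered as 76 hex characters.
def make_tox_id(public_key: bytes, nospam: bytes) -> str:
    data = public_key + nospam
    checksum = bytearray(2)
    for i, byte in enumerate(data):
        checksum[i % 2] ^= byte  # fold every byte into the 2-byte checksum
    return (data + bytes(checksum)).hex().upper()

# Kademlia-style XOR distance: smaller value means "closer" in the DHT.
def xor_distance(a: bytes, b: bytes) -> int:
    return int.from_bytes(a, "big") ^ int.from_bytes(b, "big")

# Toy routing step: from the nodes we know, pick the one closest to the
# target key; a real DHT repeats this hop by hop until the peer is found.
def closest_node(target: bytes, known_nodes: list) -> bytes:
    return min(known_nodes, key=lambda n: xor_distance(target, n))

friend_key = os.urandom(32)
tox_id = make_tox_id(friend_key, os.urandom(4))
print(len(tox_id))  # 76 hex characters

nodes = [hashlib.sha256(bytes([i])).digest() for i in range(8)]
nearest = closest_node(friend_key, nodes)
print(xor_distance(friend_key, nearest) == min(xor_distance(friend_key, n) for n in nodes))
```

In the real network the bootstrap nodes simply seed that `known_nodes` list, after which routing proceeds peer to peer.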
186 00:09:01,920 --> 00:09:06,000 The architecture sounds impressive, but we have to shift gears a bit. 187 00:09:06,000 --> 00:09:08,080 Time for the critical context. 188 00:09:08,080 --> 00:09:11,040 Yeah. Our sources include some really important caveats, 189 00:09:11,040 --> 00:09:14,460 some warnings about its current status that anyone thinking of using it needs to 190 00:09:14,460 --> 00:09:14,880 know. 191 00:09:15,440 --> 00:09:18,160 This is absolutely crucial. It cannot be stressed enough. 192 00:09:18,160 --> 00:09:21,600 The sources literally use bold text to emphasize this point. 193 00:09:21,600 --> 00:09:26,080 Tox is currently an experimental cryptographic network library. 194 00:09:26,080 --> 00:09:29,360 Experimental. What does that mean in practical terms for a user? 195 00:09:29,360 --> 00:09:35,120 If the underlying crypto math, like libsodium, is solid, where's the risk? 196 00:09:35,120 --> 00:09:37,360 The risk lies in the implementation, 197 00:09:37,360 --> 00:09:40,640 how all those secure pieces are put together into a working system. 198 00:09:41,600 --> 00:09:45,680 Experimental means the overall security model, the whole design, 199 00:09:45,680 --> 00:09:50,480 has not yet been formally audited by an independent third-party security firm. 200 00:09:50,480 --> 00:09:54,880 You know, specialists in cryptography or finding flaws in complex systems. 201 00:09:54,880 --> 00:09:58,560 So no official stamp of approval from outside experts yet? 202 00:09:58,560 --> 00:10:00,800 Not yet. They're very open about this, which is good, 203 00:10:00,800 --> 00:10:06,560 but it means users are explicitly warned. Use this library at your own risk. 204 00:10:06,560 --> 00:10:11,600 So even with a strong engine, libsodium, the blueprint for the rest of the car, 205 00:10:11,600 --> 00:10:15,690 how the doors lock, how the steering works, that hasn't been fully crash tested by 206 00:10:15,690 --> 00:10:16,320 outsiders.
207 00:10:16,320 --> 00:10:19,980 That's a decent analogy, yeah. The overall architecture, how connections are 208 00:10:19,980 --> 00:10:20,400 handled, 209 00:10:20,400 --> 00:10:24,400 precisely what threats it protects against, that's still being refined and debated 210 00:10:24,400 --> 00:10:24,720 within the 211 00:10:24,720 --> 00:10:28,960 community. The sources even reference specific ongoing discussions about defining 212 00:10:28,960 --> 00:10:29,440 the formal 213 00:10:29,440 --> 00:10:32,970 threat model. Trying to figure out exactly what kinds of attacks it should be able 214 00:10:32,970 --> 00:10:33,600 to resist. 215 00:10:33,600 --> 00:10:38,000 Correct. And they're also open about known weaknesses, which is important. They 216 00:10:38,000 --> 00:10:38,160 point 217 00:10:38,160 --> 00:10:42,320 to discussions about things like, what happens if someone steals your secret key? 218 00:10:42,320 --> 00:10:46,080 Which is vital for users to understand, especially if they're considering it for 219 00:10:46,080 --> 00:10:51,090 highly sensitive stuff. Absolutely. That level of transparency about ongoing work 220 00:10:51,090 --> 00:10:52,320 and potential issues, 221 00:10:52,320 --> 00:10:57,280 while maybe a bit scary, is actually essential for trustworthy security projects. 222 00:10:57,280 --> 00:11:01,520 Right. Now, despite that experimental label, the development process itself sounds 223 00:11:01,520 --> 00:11:02,000 pretty 224 00:11:02,000 --> 00:11:06,000 active and professional, doesn't it? It's all open source on GitHub. Very much so. 225 00:11:06,000 --> 00:11:15,920 They use multiple SAST tools. SAST, static application security testing? 226 00:11:15,920 --> 00:11:20,320 Yes. Tools like Coverity, Cppcheck, PVS-Studio. 227 00:11:20,320 --> 00:11:23,360 Okay. What are those? Why should a non-developer care?
228 00:11:23,360 --> 00:11:28,000 Think of SAST tools as automated code checkers, like Spell Check and Grammar Check, 229 00:11:28,000 --> 00:11:31,200 but for potential security bugs and common programming mistakes. 230 00:11:31,200 --> 00:11:33,120 So they scan the code automatically. 231 00:11:33,120 --> 00:11:37,770 Constantly. They scan the project's C code, looking for patterns that 232 00:11:37,770 --> 00:11:37,920 often 233 00:11:37,920 --> 00:11:42,400 lead to vulnerabilities. It's like having robot auditors doing a first pass. 234 00:11:42,400 --> 00:11:45,200 Finding potential problems before a human even needs to look. 235 00:11:45,200 --> 00:11:50,160 Exactly. Using these tools shows a serious commitment to code quality and finding 236 00:11:50,160 --> 00:11:50,560 bugs 237 00:11:50,560 --> 00:11:54,880 early, even before they get to that big formal third-party audit stage. 238 00:11:54,880 --> 00:11:58,720 That's definitely reassuring. Okay. So this brings us to maybe the most important 239 00:11:58,720 --> 00:11:59,040 question 240 00:11:59,040 --> 00:12:04,000 for you, the listener. We've got Tox offering this potentially radical digital 241 00:12:04,000 --> 00:12:04,800 freedom, right? 242 00:12:04,800 --> 00:12:10,400 Decentralized, aiming to resist surveillance. But that freedom comes wrapped in the 243 00:12:10,400 --> 00:12:11,280 known risks of 244 00:12:11,280 --> 00:12:15,520 using cutting-edge experimental software that hasn't finished its external security 245 00:12:15,520 --> 00:12:16,320 validation. 246 00:12:16,320 --> 00:12:19,440 That's the core tension. So if the overall security 247 00:12:19,440 --> 00:12:23,680 model hasn't been fully audited, what are the stakes? Is it just, oh, my message 248 00:12:23,680 --> 00:12:24,080 might not 249 00:12:24,080 --> 00:12:28,840 send reliably, or could someone's privacy genuinely be compromised in ways we don't 250 00:12:28,840 --> 00:12:29,600 know yet?
251 00:12:29,600 --> 00:12:34,400 The stakes are potentially high. Because that full independent verification isn't 252 00:12:34,400 --> 00:12:35,280 complete, 253 00:12:35,280 --> 00:12:39,280 there is a possibility that an unknown flaw in how the pieces are put together 254 00:12:39,280 --> 00:12:40,240 could exist. 255 00:12:40,240 --> 00:12:45,680 A flaw that could allow eavesdropping. Or maybe finding out who is talking to whom. 256 00:12:45,680 --> 00:12:50,400 It's possible. An undiscovered bug in the protocol implementation could potentially 257 00:12:50,400 --> 00:12:54,660 leak information. It's a trade-off you have to consciously make. So you gain that 258 00:12:54,660 --> 00:12:55,120 strong 259 00:12:55,120 --> 00:12:59,110 resistance to corporate or government seizure because there's no central point to 260 00:12:59,110 --> 00:12:59,680 attack. 261 00:12:59,680 --> 00:13:04,800 Right. But you personally take on the risk associated with the software's current, 262 00:13:04,800 --> 00:13:07,600 let's say, maturity level. Its experimental status. 263 00:13:07,600 --> 00:13:10,720 Okay. That leads us nicely to our summary takeaway then. 264 00:13:10,720 --> 00:13:16,880 We've seen Tox offers this pretty comprehensive vision for surveillance-resistant 265 00:13:16,880 --> 00:13:17,520 communication. 266 00:13:17,520 --> 00:13:19,840 Built on that peer-to-peer architecture. 267 00:13:19,840 --> 00:13:24,240 Aiming for genuine freedom and security by cutting out the middleman, the central 268 00:13:24,240 --> 00:13:24,800 server. 269 00:13:24,800 --> 00:13:25,200 But. 270 00:13:25,200 --> 00:13:30,320 But, and it's a big but, we have to weigh that significant promise against the 271 00:13:30,320 --> 00:13:30,720 critical 272 00:13:30,720 --> 00:13:34,800 fact that it is still an experimental library. It hasn't completed those formal 273 00:13:34,800 --> 00:13:35,200 independent 274 00:13:35,200 --> 00:13:36,320 security audits yet.
275 00:13:36,320 --> 00:13:38,720 Yeah. And connecting that to the bigger picture, 276 00:13:38,720 --> 00:13:42,720 it raises a fundamental question for all of us seeking more digital freedom. 277 00:13:42,720 --> 00:13:48,000 When you want that kind of true freedom, the kind that removes central points of 278 00:13:48,000 --> 00:13:48,960 control, 279 00:13:48,960 --> 00:13:52,950 whether corporate or governmental, how much individual risk are you willing to 280 00:13:52,950 --> 00:13:53,760 accept? 281 00:13:53,760 --> 00:13:57,440 Risk in the form of using software that's still evolving, still being tested. 282 00:13:57,440 --> 00:14:02,320 Exactly. How much risk is acceptable to you in exchange for that ultimate privacy 283 00:14:02,320 --> 00:14:03,440 and control? 284 00:14:03,440 --> 00:14:06,080 It's a core tension we'll likely see more of. 285 00:14:06,080 --> 00:14:09,230 Definitely something for you to mull over as you make your own choices about 286 00:14:09,230 --> 00:14:10,240 digital tools. 287 00:14:10,240 --> 00:14:11,920 An excellent thought to end on. 288 00:14:11,920 --> 00:14:14,880 And remember, this deep dive was supported by Safe Server. 289 00:14:14,880 --> 00:14:17,600 They help with hosting needs and digital transformation. 290 00:14:17,600 --> 00:14:21,360 You can find out more about them at www.safeserver.de. 291 00:14:21,360 --> 00:14:24,480 That's right, www.safeserver.de. 292 00:14:24,480 --> 00:14:27,280 Thank you for joining us for this deep dive into Tox. 293 00:14:27,280 --> 00:14:30,000 We hope you feel thoroughly informed and maybe a bit more equipped 294 00:14:30,000 --> 00:14:32,400 to think critically about your communication choices.
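A footnote on the "no artificial limits or caps" point from the file-sharing discussion: a direct peer-to-peer transfer needs no size cap because the payload is simply streamed in chunks between the two endpoints, with no server storing it in the middle. A minimal stdlib Python sketch of that chunked streaming, using a local socket pair to stand in for the two peers (this is not Tox's actual transfer protocol):

```python
import os
import socket
import threading

CHUNK = 64 * 1024  # stream in fixed-size chunks; nothing caps the total size

def send_payload(sock: socket.socket, data: bytes) -> None:
    # Announce the length, then stream the payload chunk by chunk.
    sock.sendall(len(data).to_bytes(8, "big"))
    for i in range(0, len(data), CHUNK):
        sock.sendall(data[i:i + CHUNK])

def recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = bytearray()
    while len(buf) < n:
        part = sock.recv(n - len(buf))
        if not part:
            raise ConnectionError("peer closed the connection early")
        buf.extend(part)
    return bytes(buf)

def recv_payload(sock: socket.socket) -> bytes:
    size = int.from_bytes(recv_exact(sock, 8), "big")
    return recv_exact(sock, size)

# A connected socket pair stands in for the direct peer-to-peer link.
sender, receiver = socket.socketpair()
payload = os.urandom(1_000_000)  # far larger than one chunk

# Send on a background thread so the receiver can drain concurrently,
# exactly as two separate peers would.
t = threading.Thread(target=send_payload, args=(sender, payload))
t.start()
received = recv_payload(receiver)
t.join()
```

The only throttle here is how fast the two endpoints can move bytes, which is the transcript's point: the limit is your connection, not a server policy.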