1 00:00:00,151 --> 00:00:05,081 [SPEAKER_00] So, imagine your organization gets a subpoena tomorrow. 2 00:00:05,590 --> 00:00:07,931 [SPEAKER_01] Oh, wow, the ultimate nightmare scenario. 3 00:00:08,231 --> 00:00:08,431 [SPEAKER_00] Right. 4 00:00:08,751 --> 00:00:17,734 [SPEAKER_00] Regulators are suddenly demanding access to every piece of internal communication, every financial discussion, your audit trails in the last five years. 5 00:00:17,774 --> 00:00:19,594 [SPEAKER_01] Yeah, that is a massive undertaking. 6 00:00:19,655 --> 00:00:30,198 [SPEAKER_00] And if you're relying on one of those monolithic tech giants, like Microsoft or Google Workspace or any proprietary vendor to manage all that, you have to ask yourself, who actually owns that data? 7 00:00:30,338 --> 00:00:30,978 [SPEAKER_01] Exactly. 8 00:00:31,038 --> 00:00:33,839 [SPEAKER_01] Who actually holds the keys to your compliance vault? 9 00:00:34,019 --> 00:00:34,499 [SPEAKER_00] Exactly. 10 00:00:34,699 --> 00:00:40,821 [SPEAKER_00] And relying on these massive opaque ecosystems, it's not just a huge drain on your IT budget, which it is. 11 00:00:40,861 --> 00:00:42,002 [SPEAKER_00] I mean, they are expensive. 12 00:00:42,102 --> 00:00:42,762 [SPEAKER_01] Oh, absolutely. 13 00:00:42,782 --> 00:00:44,082 [SPEAKER_01] The licensing fees alone. 14 00:00:44,102 --> 00:00:44,302 [SPEAKER_00] Yeah. 15 00:00:44,442 --> 00:00:47,203 [SPEAKER_00] But it's also a fundamental surrender of your data sovereignty. 16 00:00:48,263 --> 00:00:52,005 [SPEAKER_00] And that actually brings us to the sponsor of today's deep dive, Safe Server. 17 00:00:52,065 --> 00:00:52,305 [SPEAKER_01] Right. 18 00:00:52,585 --> 00:00:55,826 [SPEAKER_00] Because Safe Server essentially flips the script on enterprise communication. 19 00:00:56,326 --> 00:01:02,508 [SPEAKER_00] They help organizations find, implement, and run open source solutions that are tailored to your exact compliance needs. 
20 00:01:02,897 --> 00:01:05,878 [SPEAKER_01] Which is so critical when you're dealing with strict legal requirements. 21 00:01:05,898 --> 00:01:06,618 [SPEAKER_00] Yeah, exactly. 22 00:01:06,838 --> 00:01:21,441 [SPEAKER_00] Whether you're handling, like, email retention policies or protecting highly sensitive legal records, Safe Server guides you from the very first consulting phase all the way through to secure operation on servers located physically within the EU. 23 00:01:21,581 --> 00:01:25,222 [SPEAKER_01] The physical location in the EU is a huge deal for data protection. 24 00:01:25,502 --> 00:01:25,903 [SPEAKER_00] Huge. 25 00:01:26,323 --> 00:01:36,634 [SPEAKER_00] You get robust protection, you retain complete sovereignty over your encryption keys, and, like we mentioned, you completely eliminate those premium licensing fees from the proprietary giants. 26 00:01:36,814 --> 00:01:38,035 [SPEAKER_01] It's a win-win, really. 27 00:01:38,456 --> 00:01:38,716 [SPEAKER_00] It is. 28 00:01:39,016 --> 00:01:44,702 [SPEAKER_00] So you can start taking control of your own infrastructure right now by visiting www.safeserver.de. 29 00:01:45,641 --> 00:01:51,864 [SPEAKER_01] Because honestly, the question of who physically holds your keys, that's the definitive dividing line in digital security today. 30 00:01:52,005 --> 00:01:52,245 [SPEAKER_00] Yeah. 31 00:01:52,445 --> 00:01:56,627 [SPEAKER_01] I mean, if an external vendor holds the keys, you don't really have a secure system. 32 00:01:56,667 --> 00:01:58,168 [SPEAKER_01] You just have a permission slip. 33 00:01:58,288 --> 00:02:04,151 [SPEAKER_00] That is so true, a permission slip, which actually makes this the perfect entry point for what we are unpacking today. 34 00:02:04,672 --> 00:02:05,892 [SPEAKER_00] So welcome to the deep dive. 35 00:02:05,972 --> 00:02:07,013 [SPEAKER_01] Glad to be here for this one. 
36 00:02:07,308 --> 00:02:16,355 [SPEAKER_00] Yeah, our mission for this session is to provide a beginner-friendly, easy entry point into a really fascinating piece of software called Schleuder. 37 00:02:16,495 --> 00:02:17,176 [SPEAKER_01] Schleuder, yes. 38 00:02:17,376 --> 00:02:22,000 [SPEAKER_00] And we are pulling this directly from the official website and their GitLab repository. 39 00:02:22,100 --> 00:02:23,040 [SPEAKER_01] Great sources. 40 00:02:23,080 --> 00:02:36,591 [SPEAKER_00] So whether you are a complete novice trying to understand how secure group communication actually functions or you're an IT admin looking to upgrade your group's secure communications, we're going to decode the technical jargon for you. 41 00:02:36,671 --> 00:02:38,633 [SPEAKER_01] Because it can get pretty dense, honestly. 42 00:02:38,753 --> 00:02:39,774 [SPEAKER_00] Oh, incredibly dense. 43 00:02:39,894 --> 00:02:40,114 [SPEAKER_00] Yeah. 44 00:02:40,194 --> 00:02:48,141 [SPEAKER_00] Like the official documentation defines Schleuder as a quote, GPG enabled mailing list manager with resending capabilities. 45 00:02:48,181 --> 00:02:49,442 [SPEAKER_01] Yeah, that's a mouthful. 46 00:02:49,482 --> 00:02:49,662 [SPEAKER_00] Right. 47 00:02:50,022 --> 00:02:52,564 [SPEAKER_00] OK, let's unpack this because that sounds super technical. 48 00:02:52,744 --> 00:02:59,450 [SPEAKER_00] I like to think of it as like a highly secure digital bouncer or maybe a bilingual translator for group emails. 49 00:02:59,550 --> 00:03:05,438 [SPEAKER_01] That's actually a really good way to look at it because we need to break down that first half, the GPG enabled mailing list manager part. 50 00:03:05,498 --> 00:03:07,281 [SPEAKER_00] Right, because what does that actually mean? 51 00:03:07,581 --> 00:03:13,189 [SPEAKER_01] Well, it solves a very specific mathematically brutal problem in cryptography. 
52 00:03:14,253 --> 00:03:18,815 [SPEAKER_01] GPG relies on OpenPGP, which is an asymmetric encryption standard. 53 00:03:18,935 --> 00:03:20,976 [SPEAKER_00] Okay, asymmetric, meaning two keys. 54 00:03:21,056 --> 00:03:21,997 [SPEAKER_01] Right, exactly. 55 00:03:22,617 --> 00:03:23,898 [SPEAKER_01] Every user has two keys. 56 00:03:24,398 --> 00:03:27,539 [SPEAKER_01] A public key, which is like an open padlock you just hand out to the world. 57 00:03:27,559 --> 00:03:28,520 [SPEAKER_00] I can give that to anyone. 58 00:03:28,720 --> 00:03:29,040 [SPEAKER_01] Anyone. 59 00:03:29,740 --> 00:03:35,583 [SPEAKER_01] And then a private key, which is the unique physical key you keep hidden, and that's the only thing that unlocks the padlock. 60 00:03:35,976 --> 00:03:43,219 [SPEAKER_00] So if I want to send you a secure message, I take your public padlock, lock my message inside a box, and send it over the internet. 61 00:03:43,700 --> 00:03:49,022 [SPEAKER_00] And even if someone intercepts it, it doesn't matter because you are literally the only person with the private key to open it. 62 00:03:49,230 --> 00:03:49,810 [SPEAKER_01] Precisely. 63 00:03:50,031 --> 00:03:51,091 [SPEAKER_01] That's the mechanism. 64 00:03:51,412 --> 00:03:53,573 [SPEAKER_01] But now imagine trying to scale that up for a group. 65 00:03:53,794 --> 00:03:56,816 [SPEAKER_01] OK. Let's say your organization has a management team of 50 people. 66 00:03:56,976 --> 00:03:57,677 [SPEAKER_00] 50 people, right. 67 00:03:57,857 --> 00:04:07,344 [SPEAKER_01] If you want to send a single secure email to that entire group using standard OpenPGP, your email client has to encrypt that message 50 separate times. 68 00:04:07,424 --> 00:04:07,764 [SPEAKER_01] Wow. 69 00:04:07,845 --> 00:04:09,946 [SPEAKER_01] Using 50 different public padlocks. 70 00:04:10,326 --> 00:04:14,430 [SPEAKER_00] OK, that sounds like a total logistical nightmare for the person just trying to press Send. 
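The fan-out problem described above can be sketched in a few lines of Ruby. This is a toy illustration only: a self-inverse XOR stands in for real OpenPGP encryption, and the key names are invented. The point is purely structural: one message to 50 recipients means 50 separate ciphertexts, one per public key.

```ruby
# Toy stand-in for asymmetric encryption (XOR with a per-recipient key).
# Real OpenPGP is vastly more involved; only the fan-out shape matters here.
def toy_encrypt(text, key)
  text.bytes.each_with_index.map { |b, i| b ^ key.bytes[i % key.size] }
end

def toy_decrypt(cipher, key)
  cipher.each_with_index.map { |b, i| (b ^ key.bytes[i % key.size]).chr }.join
end

# 50 subscribers, each with their own (invented) key.
subscriber_keys = (1..50).map { |n| "#{n}-subscriber-key" }

message = "Q3 numbers attached"

# Without a list manager, the sender's client does all 50 encryptions itself.
ciphertexts = subscriber_keys.map { |key| toy_encrypt(message, key) }

puts ciphertexts.size                                  # => 50 separate ciphertexts
puts toy_decrypt(ciphertexts[0], subscriber_keys[0])   # only key 0 recovers it
```

And every time the roster changes, the sender's key list has to change with it, which is the administrative burden discussed next.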
71 00:04:14,810 --> 00:04:17,052 [SPEAKER_01] Oh, it's a massive computational bottleneck. 72 00:04:18,002 --> 00:04:20,583 [SPEAKER_01] But the administrative burden is honestly even worse. 73 00:04:21,143 --> 00:04:21,683 [SPEAKER_00] How so? 74 00:04:22,104 --> 00:04:22,824 [SPEAKER_01] Well, think about it. 75 00:04:23,104 --> 00:04:26,745 [SPEAKER_01] What happens if one of those 50 people loses their private key? 76 00:04:26,885 --> 00:04:27,265 [SPEAKER_00] Oh, right. 77 00:04:27,346 --> 00:04:28,446 [SPEAKER_00] Or someone leaves the company. 78 00:04:28,566 --> 00:04:29,066 [SPEAKER_01] Exactly. 79 00:04:29,346 --> 00:04:31,007 [SPEAKER_01] Or a new person joins. 80 00:04:31,487 --> 00:04:39,550 [SPEAKER_01] The entire group has to constantly manually update their local address books with the correct public keys for every single member. 81 00:04:39,810 --> 00:04:41,751 [SPEAKER_00] That's impossible to manage perfectly. 82 00:04:42,089 --> 00:04:48,174 [SPEAKER_01] It is, and if one person has an outdated key for a colleague, the whole security chain fractures. 83 00:04:48,534 --> 00:04:52,096 [SPEAKER_01] It's known as the N-squared problem of decentralized key management. 84 00:04:52,417 --> 00:04:58,201 [SPEAKER_00] Okay, so this is where Schleuder steps in to kind of fundamentally alter the architecture of the group, right? 85 00:04:58,221 --> 00:04:58,661 [SPEAKER_00] Exactly. 86 00:04:58,701 --> 00:05:07,628 [SPEAKER_00] Instead of this chaotic web where everyone manages 50 different padlocks, Schleuder acts as that highly secure digital bouncer I mentioned, standing at the door of the club. 87 00:05:07,868 --> 00:05:08,869 [SPEAKER_01] Right, the central point. 88 00:05:09,129 --> 00:05:11,932 [SPEAKER_00] Yeah, so the mailing list itself gets a single master padlock. 89 00:05:12,333 --> 00:05:18,160 [SPEAKER_00] If I want to email the management team, I just encrypt my message once using the list's master public key, and I send it to the bouncer. 
90 00:05:18,300 --> 00:05:19,621 [SPEAKER_01] And then the bouncer, Schleuder, takes over. 91 00:05:19,681 --> 00:05:22,965 [SPEAKER_01] It's the only entity that actually holds the private key for the list. 92 00:05:23,105 --> 00:05:24,307 [SPEAKER_00] OK, so it opens the box. 93 00:05:24,547 --> 00:05:31,416 [SPEAKER_01] Yes, it receives your locked box, opens it, verifies that you are a legitimate subscriber, and then it does the heavy lifting. 94 00:05:31,536 --> 00:05:32,477 [SPEAKER_00] Like re-encrypting it. 95 00:05:32,717 --> 00:05:33,358 [SPEAKER_01] Exactly. 96 00:05:33,518 --> 00:05:39,425 [SPEAKER_01] Schleuder automatically re-encrypts the message 50 times using the public keys of the current subscribers and then distributes it. 97 00:05:39,566 --> 00:05:40,206 [SPEAKER_00] Wow, okay. 98 00:05:40,246 --> 00:05:43,090 [SPEAKER_00] So the sender only ever needs to know one single key. 99 00:05:43,190 --> 00:05:50,693 [SPEAKER_01] Just one, and the administrator can update the subscriber list centrally, so that whole N-squared problem just completely disappears. 100 00:05:50,853 --> 00:05:51,793 [SPEAKER_00] That is brilliant. 101 00:05:51,873 --> 00:05:52,153 [SPEAKER_01] Yeah. 102 00:05:52,493 --> 00:05:56,995 [SPEAKER_00] But the documentation also emphasizes this gateway concept, right? 103 00:05:57,415 --> 00:05:58,855 [SPEAKER_00] The resending capability? 104 00:05:58,875 --> 00:06:00,436 [SPEAKER_01] Yes, the resending aspect. 105 00:06:00,516 --> 00:06:03,597 [SPEAKER_00] Which kind of introduces a fascinating vulnerability to me. 106 00:06:03,797 --> 00:06:04,017 [SPEAKER_01] Yeah. 107 00:06:04,257 --> 00:06:09,479 [SPEAKER_00] Because a steel vault is secure, sure, but only because nothing goes in or out. 108 00:06:09,559 --> 00:06:09,779 [SPEAKER_01] Right. 109 00:06:10,059 --> 00:06:11,920 [SPEAKER_00] But human communication doesn't work that way. 
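The list model the hosts describe can be sketched the same way. Again, this is a toy, not Schleuder's actual code: the XOR "crypto" stands in for GnuPG, and all names and addresses are invented. The shape is what matters: the sender encrypts once to the list key, and the list, as sole holder of that key, verifies membership and fans the message out to the current subscribers.

```ruby
# Toy XOR "encryption" standing in for real GnuPG operations.
def toy_encrypt(text, key)
  text.bytes.each_with_index.map { |b, i| b ^ key.bytes[i % key.size] }
end

def toy_decrypt(cipher, key)
  cipher.each_with_index.map { |b, i| (b ^ key.bytes[i % key.size]).chr }.join
end

class ToyList
  def initialize(list_key, subscribers)
    @list_key    = list_key      # only the list ever holds this private key
    @subscribers = subscribers   # address => that subscriber's key
  end

  # The sender encrypted ONCE to the list key. We decrypt, check membership,
  # then produce one fresh ciphertext per CURRENT subscriber.
  def resend(from, cipher)
    raise "not a subscriber" unless @subscribers.key?(from)
    plain = toy_decrypt(cipher, @list_key)
    @subscribers.to_h { |addr, key| [addr, toy_encrypt(plain, key)] }
  end
end

list = ToyList.new("list-master-key",
                   { "alice@example.org" => "alice-key",
                     "bob@example.org"   => "bob-key" })

inbound  = toy_encrypt("board meeting moved to 3pm", "list-master-key")
outbound = list.resend("alice@example.org", inbound)

puts outbound.size                                        # one copy per subscriber
puts toy_decrypt(outbound["bob@example.org"], "bob-key")  # bob reads it with his key
```

Because the subscriber hash lives in one place, adding or removing a member is a single central update rather than 50 address-book edits.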
110 00:06:12,772 --> 00:06:16,536 [SPEAKER_00] Like a legal team needs to receive emails from outside counsel. 111 00:06:17,097 --> 00:06:19,919 [SPEAKER_00] A journalist needs to get tips from an unencrypted source. 112 00:06:20,220 --> 00:06:20,820 [SPEAKER_01] Absolutely. 113 00:06:21,281 --> 00:06:32,873 [SPEAKER_00] So as a layperson, I have to ask, how does this gateway process an external, completely unencrypted email without shattering the security of the internal subscribers? 114 00:06:33,389 --> 00:06:36,491 [SPEAKER_01] It operates essentially as an automated cryptographic bridge. 115 00:06:36,811 --> 00:06:43,015 [SPEAKER_01] OK. Let's say an external vendor sends a standard plain text email to the Schleuder list address. 116 00:06:43,275 --> 00:06:43,996 [SPEAKER_00] Just a normal email? 117 00:06:44,016 --> 00:06:44,856 [SPEAKER_01] Just a normal email. 118 00:06:44,896 --> 00:06:52,241 [SPEAKER_01] OK. Because Schleuder sits on the server monitoring the traffic, it catches that plain text email before it ever hits the internal network. 119 00:06:52,481 --> 00:06:52,842 [SPEAKER_00] Oh, I see. 120 00:06:52,922 --> 00:06:59,546 [SPEAKER_01] It securely wraps the message, encrypts it using the internal subscribers' public keys, and delivers it securely to them. 121 00:06:59,907 --> 00:07:04,249 [SPEAKER_00] Meaning the internal team maintains their strict cryptographic discipline. 122 00:07:04,629 --> 00:07:07,030 [SPEAKER_00] Like, they only ever see an encrypted message in their inbox. 123 00:07:07,210 --> 00:07:07,731 [SPEAKER_01] Exactly. 124 00:07:07,771 --> 00:07:08,931 [SPEAKER_00] What happens when they hit reply? 125 00:07:09,031 --> 00:07:10,632 [SPEAKER_00] Because they're replying encrypted, right? 126 00:07:10,712 --> 00:07:10,912 [SPEAKER_01] Right. 127 00:07:10,952 --> 00:07:12,393 [SPEAKER_01] So the process just reverses. 
128 00:07:12,733 --> 00:07:17,255 [SPEAKER_01] The internal team drafts a highly secure encrypted reply, sends it to Schleuder. 129 00:07:17,575 --> 00:07:24,539 [SPEAKER_01] OK. Schleuder decrypts the message, translates it back into a standard plain text email, and sends it out to the external vendor. 130 00:07:24,639 --> 00:07:24,899 [SPEAKER_00] Wow. 131 00:07:25,019 --> 00:07:27,600 [SPEAKER_00] So the vendor just sees a totally normal email conversation. 132 00:07:27,780 --> 00:07:28,241 [SPEAKER_01] Exactly. 133 00:07:28,261 --> 00:07:29,882 [SPEAKER_01] They don't have to install OpenPGP. 134 00:07:29,922 --> 00:07:31,123 [SPEAKER_01] They don't have to manage keys. 135 00:07:31,644 --> 00:07:33,366 [SPEAKER_01] No technical training required at all. 136 00:07:33,906 --> 00:07:39,111 [SPEAKER_01] It completely decouples the internal security protocol from the external user experience. 137 00:07:39,372 --> 00:07:41,113 [SPEAKER_00] Decoupling that experience is so smart. 138 00:07:41,454 --> 00:07:44,517 [SPEAKER_00] But man, it must require a serious engine underneath. 139 00:07:44,764 --> 00:07:45,784 [SPEAKER_01] Oh, it absolutely does. 140 00:07:45,924 --> 00:07:52,667 [SPEAKER_00] Yeah, because looking at the GitLab documentation, the tech stack required to run this digital bouncer is pretty extensive. 141 00:07:52,947 --> 00:07:54,728 [SPEAKER_01] It is a lot to take in at first glance. 142 00:07:54,888 --> 00:08:08,072 [SPEAKER_00] It says it requires Ruby version 2.7 or higher, GnuPG for the cryptography, SQLite 3 for the database, OpenSSL, plus this specific array of Ruby gems like ActiveRecord and Sinatra. 143 00:08:08,112 --> 00:08:08,313 [SPEAKER_01] Yep. 144 00:08:08,533 --> 00:08:09,553 [SPEAKER_01] All of those are essential. 145 00:08:09,733 --> 00:08:15,287 [SPEAKER_00] I mean, for a beginner or an IT volunteer looking at this, it just seems like an overwhelming amount of moving parts to configure. 
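The gateway behavior described in this exchange, plain text wrapped on the way in and encrypted replies unwrapped on the way out, can be sketched as a single dispatch function. Everything here is illustrative (a toy XOR instead of GnuPG, invented addresses and keys), not the project's real implementation.

```ruby
# Toy XOR "crypto" again; only the bridging logic is the point.
def toy_encrypt(text, key)
  text.bytes.each_with_index.map { |b, i| b ^ key.bytes[i % key.size] }
end

def toy_decrypt(cipher, key)
  cipher.each_with_index.map { |b, i| (b ^ key.bytes[i % key.size]).chr }.join
end

LIST_KEY        = "list-master-key"
SUBSCRIBER_KEYS = { "alice@example.org" => "alice-key",
                    "bob@example.org"   => "bob-key" }

def bridge(message)
  if message[:encrypted]
    # Internal -> external: unwrap the reply into a normal plain text email.
    { to: message[:to], body: toy_decrypt(message[:body], LIST_KEY), encrypted: false }
  else
    # External -> internal: wrap the plain text once per subscriber,
    # so the internal team only ever sees ciphertext in their inboxes.
    copies = SUBSCRIBER_KEYS.to_h { |addr, key| [addr, toy_encrypt(message[:body], key)] }
    { copies: copies, encrypted: true }
  end
end

inbound = bridge({ body: "invoice attached", encrypted: false })
puts toy_decrypt(inbound[:copies]["alice@example.org"], "alice-key")   # invoice attached

reply    = toy_encrypt("received, thanks", LIST_KEY)
outbound = bridge({ to: "vendor@example.com", body: reply, encrypted: true })
puts outbound[:body]   # received, thanks -- the vendor sees ordinary plain text
```

The external party never touches a key; the internal subscribers never see plain text on the wire.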
146 00:08:15,347 --> 00:08:17,432 [SPEAKER_00] I'm intimidated just reading it on behalf of the listener. 147 00:08:17,755 --> 00:08:19,256 [SPEAKER_01] It's totally fair to feel that way. 148 00:08:19,296 --> 00:08:24,239 [SPEAKER_01] If you attempt to compile all those dependencies from scratch, it is extremely complex. 149 00:08:24,439 --> 00:08:24,659 [SPEAKER_00] Right. 150 00:08:24,959 --> 00:08:26,180 [SPEAKER_01] But the developers know this. 151 00:08:26,520 --> 00:08:30,383 [SPEAKER_01] They've provided simplified installation packages for major platforms. 152 00:08:30,523 --> 00:08:31,423 [SPEAKER_00] Oh, OK. Like what? 153 00:08:31,663 --> 00:08:39,128 [SPEAKER_01] They've specifically tested on Debian version 12, codename Bookworm, as well as CentOS 7 and Arch Linux. 154 00:08:39,628 --> 00:08:42,270 [SPEAKER_01] They essentially prepackage that complex environment for you. 155 00:08:42,550 --> 00:08:43,851 [SPEAKER_00] OK, that is a huge relief. 156 00:08:44,472 --> 00:08:50,557 [SPEAKER_00] But there is one highly specific, fascinating detail from the readme that we really have to talk about. 157 00:08:50,618 --> 00:08:51,759 [SPEAKER_00] The entropy requirement. 158 00:08:51,779 --> 00:08:52,039 [SPEAKER_00] Yes. 159 00:08:52,740 --> 00:08:58,865 [SPEAKER_00] The developers explicitly recommend running a random number generator daemon called haveged on the server. 160 00:08:59,886 --> 00:09:06,192 [SPEAKER_00] They say this is to ensure the system won't get blocked by, quote, lacking entropy, especially during key generation. 161 00:09:06,419 --> 00:09:09,141 [SPEAKER_01] Entropy is such a crucial concept in cryptography. 162 00:09:09,241 --> 00:09:09,741 [SPEAKER_00] It really is. 163 00:09:09,821 --> 00:09:18,007 [SPEAKER_00] It's like, to explain entropy to a beginner, you basically need enough chaotic random ingredients to bake a truly unpredictable cryptographic cake. 164 00:09:18,067 --> 00:09:19,108 [SPEAKER_01] That's a great analogy. 
165 00:09:19,208 --> 00:09:22,270 [SPEAKER_00] Because if you don't have enough entropy, the system can literally just stop. 166 00:09:22,450 --> 00:09:23,210 [SPEAKER_00] It gets blocked. 167 00:09:23,250 --> 00:09:29,655 [SPEAKER_01] What's fascinating here is how this quirky technical requirement connects to the broader picture of true security. 168 00:09:30,820 --> 00:09:35,065 [SPEAKER_01] Well, encryption relies on genuine randomness, not just math. 169 00:09:35,806 --> 00:09:37,728 [SPEAKER_01] Algorithms are entirely deterministic. 170 00:09:38,309 --> 00:09:42,254 [SPEAKER_01] If you put the same starting numbers in, you always get the exact same output. 171 00:09:42,294 --> 00:09:47,480 [SPEAKER_00] Which means if a hacker knows your starting numbers, they can reverse engineer your supposedly secure key. 172 00:09:47,600 --> 00:09:48,060 [SPEAKER_01] Exactly. 173 00:09:48,261 --> 00:09:49,802 [SPEAKER_01] You need true chaos. 174 00:09:50,002 --> 00:09:50,182 [SPEAKER_01] Right. 175 00:09:50,522 --> 00:09:54,205 [SPEAKER_01] But a computer is a machine literally built to eliminate chaos. 176 00:09:54,685 --> 00:09:56,826 [SPEAKER_01] It processes logic sequentially. 177 00:09:56,967 --> 00:09:57,147 [SPEAKER_00] Right. 178 00:09:57,467 --> 00:10:03,151 [SPEAKER_01] So to find unpredictable inputs, a standard server looks for microscopic physical variation. 179 00:10:03,191 --> 00:10:04,252 [SPEAKER_00] Like what kind of variations? 180 00:10:04,332 --> 00:10:10,616 [SPEAKER_01] It might measure the exact milliseconds between keystrokes on a keyboard or tiny temperature fluctuations of the processor. 181 00:10:10,956 --> 00:10:14,819 [SPEAKER_00] But a headless server sitting in a data center somewhere doesn't have a keyboard or a mouse. 182 00:10:15,070 --> 00:10:15,650 [SPEAKER_01] Exactly. 183 00:10:15,730 --> 00:10:18,592 [SPEAKER_01] It's just quietly processing network requests. 
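The determinism point is easy to demonstrate with Ruby's own seeded PRNG: feed it the same seed and you get the exact same "random" stream, which is why key generation has to draw on an entropy pool an attacker cannot reproduce.

```ruby
# Same seed in, same "random" numbers out: a deterministic PRNG.
a = Random.new(42)
b = Random.new(42)

stream_a = Array.new(5) { a.rand(1_000_000) }
stream_b = Array.new(5) { b.rand(1_000_000) }

puts stream_a == stream_b   # true: identical seed, identical sequence

# By contrast, SecureRandom draws on the OS entropy pool -- the very pool
# that can run dry on a headless server, which is what a daemon like
# haveged keeps topped up.
require "securerandom"
puts SecureRandom.hex(16)   # unpredictable; different on every run
```

If the seed (the "starting numbers") is guessable, every key derived from it is guessable too, no matter how strong the algorithm.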
184 00:10:18,732 --> 00:10:20,654 [SPEAKER_01] It isn't generating enough physical chaos. 185 00:10:20,694 --> 00:10:24,736 [SPEAKER_00] And when that entropy pool drains, the cryptographic engine literally halts. 186 00:10:24,816 --> 00:10:25,036 [SPEAKER_01] Right. 187 00:10:25,336 --> 00:10:27,398 [SPEAKER_01] It refuses to generate a predictable key. 188 00:10:27,658 --> 00:10:28,839 [SPEAKER_01] So the process blocks. 189 00:10:28,979 --> 00:10:30,840 [SPEAKER_01] And that's where haveged becomes critical. 190 00:10:31,240 --> 00:10:35,743 [SPEAKER_01] It constantly analyzes unpredictable timing variations in the processor itself, 191 00:10:36,352 --> 00:10:38,475 [SPEAKER_01] acting like an artificial chaos blender. 192 00:10:38,735 --> 00:10:39,155 [SPEAKER_00] I love that. 193 00:10:39,716 --> 00:10:48,767 [SPEAKER_00] We spend millions building hyperlogical data centers only to realize our security fundamentally depends on measuring the physical imperfections of the silicon itself. 194 00:10:49,047 --> 00:10:50,269 [SPEAKER_01] It's beautifully ironic. 195 00:10:50,329 --> 00:10:50,789 [SPEAKER_00] It really is. 196 00:10:53,058 --> 00:11:00,102 [SPEAKER_00] Moving from the physical hardware to the actual human beings writing the code, the ethos surrounding Schleuder is equally fascinating. 197 00:11:00,222 --> 00:11:02,643 [SPEAKER_01] The mission statement is incredibly powerful. 198 00:11:02,783 --> 00:11:03,123 [SPEAKER_00] It is. 199 00:11:03,223 --> 00:11:05,044 [SPEAKER_00] Like code doesn't just write itself. 200 00:11:05,704 --> 00:11:11,007 [SPEAKER_00] And the developers explicitly state they give their time and knowledge to help people with daily private communication. 201 00:11:11,507 --> 00:11:18,471 [SPEAKER_00] But also, and this is a direct quote from the sources, in the struggle for personal emancipation, social and economic justice, and political freedom. 
202 00:11:18,511 --> 00:11:22,473 [SPEAKER_01] It's so rare for a technical manual to have such a clear philosophical stance. 203 00:11:22,593 --> 00:11:22,873 [SPEAKER_00] Yeah. 204 00:11:23,254 --> 00:11:31,582 [SPEAKER_00] We're not endorsing any political side here, obviously, but just looking at their intent, it's clear they see open source software as a real tool for social change. 205 00:11:31,982 --> 00:11:32,522 [SPEAKER_01] Absolutely. 206 00:11:32,563 --> 00:11:37,227 [SPEAKER_01] They recognize that privacy is a fundamental prerequisite for political organizing. 207 00:11:37,407 --> 00:11:38,788 [SPEAKER_00] And here's the pushback. 208 00:11:39,269 --> 00:11:42,152 [SPEAKER_00] The reality of open source development is harsh. 209 00:11:42,212 --> 00:11:42,712 [SPEAKER_00] Very harsh. 210 00:11:43,633 --> 00:11:46,876 [SPEAKER_00] If you look at their GitLab page, which was created back in January 2017, 211 00:11:48,638 --> 00:11:49,999 [SPEAKER_00] it has over 1600 commits. 212 00:11:50,759 --> 00:11:53,460 [SPEAKER_00] But right there, prominently, is this bold plea. 213 00:11:53,940 --> 00:11:55,301 [SPEAKER_00] Maintainers wanted. 214 00:11:55,701 --> 00:11:57,282 [SPEAKER_01] Yeah, that part is sobering. 215 00:11:57,622 --> 00:12:02,924 [SPEAKER_00] The current team openly admits they have, quote, hardly any time left for the project. 216 00:12:03,344 --> 00:12:04,725 [SPEAKER_00] They don't want the project to die. 217 00:12:05,205 --> 00:12:09,107 [SPEAKER_00] But for a sustainable future, they say it needs new humans to care for it. 218 00:12:09,547 --> 00:12:13,109 [SPEAKER_01] And that exposes a huge vulnerability in our digital infrastructure. 219 00:12:13,569 --> 00:12:16,670 [SPEAKER_01] Software isn't just a static object you build once and leave on a shelf. 220 00:12:16,910 --> 00:12:17,471 [SPEAKER_00] No, it rots. 221 00:12:17,691 --> 00:12:18,151 [SPEAKER_01] Exactly. 
222 00:12:18,451 --> 00:12:19,552 [SPEAKER_01] Operating systems update. 223 00:12:19,792 --> 00:12:21,613 [SPEAKER_01] Underlying libraries find vulnerabilities. 224 00:12:21,633 --> 00:12:23,313 [SPEAKER_01] If it isn't actively maintained, it breaks down. 225 00:12:23,654 --> 00:12:27,735 [SPEAKER_00] And digital freedom relies on actual human labor, often unpaid human labor. 226 00:12:28,096 --> 00:12:30,917 [SPEAKER_01] Which is a precarious position for everyone relying on these tools. 227 00:12:31,349 --> 00:12:39,797 [SPEAKER_00] So building on this call for new humans, if a beginner or a new maintainer actually wants to interact with the Schleuder ecosystem, how do they get started? 228 00:12:40,317 --> 00:12:43,620 [SPEAKER_01] The developers have structured it with dual options for administration. 229 00:12:43,640 --> 00:12:44,461 [SPEAKER_00] OK, what are they? 230 00:12:45,001 --> 00:12:49,485 [SPEAKER_01] For traditional server admins, there is schleuder-cli, which is a command line interface. 231 00:12:49,866 --> 00:12:51,307 [SPEAKER_01] You manage everything in the terminal. 232 00:12:51,765 --> 00:12:54,868 [SPEAKER_00] But that's not great for an office manager. 233 00:12:55,248 --> 00:12:56,089 [SPEAKER_01] No, not at all. 234 00:12:56,289 --> 00:12:59,692 [SPEAKER_01] Which is why they have an API built with that Sinatra gem we mentioned. 235 00:12:59,872 --> 00:13:00,232 [SPEAKER_00] Oh, right. 236 00:13:00,372 --> 00:13:03,415 [SPEAKER_01] And that allows for an optional web interface called schleuder-web. 237 00:13:03,855 --> 00:13:09,640 [SPEAKER_01] So users who prefer a browser can manage things graphically instead of using terminal commands or request keywords. 238 00:13:09,720 --> 00:13:11,061 [SPEAKER_00] OK, so it's accessible. 239 00:13:11,682 --> 00:13:12,782 [SPEAKER_00] But what about the developers? 240 00:13:12,863 --> 00:13:14,484 [SPEAKER_00] Is it a chaotic environment for them? 241 00:13:14,768 --> 00:13:15,429 [SPEAKER_01] Not at all. 
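The dual-administration pattern described here, one core exposed both to a command line and to a web API, can be sketched as follows. The command names and routes are invented for illustration; they are not the real interface of schleuder-cli or schleuder-web.

```ruby
# One shared core that both interfaces call into.
class ListStore
  def initialize
    @subscribers = []
  end

  def subscribe(address)
    @subscribers << address unless @subscribers.include?(address)
    @subscribers
  end

  def list
    @subscribers
  end
end

STORE = ListStore.new

# CLI-style entry point (hypothetical command names, not the real CLI).
def run_cli(argv)
  command, address = argv
  case command
  when "subscribe" then STORE.subscribe(address)
  when "list"      then STORE.list
  end
end

# Web-style entry point -- the role Sinatra plays for the real web interface.
# A POST route calls the very same core method the CLI does.
def handle_request(method, path, params)
  if method == "POST" && path == "/subscriptions"
    STORE.subscribe(params["email"])
  elsif method == "GET" && path == "/subscriptions"
    STORE.list
  end
end

run_cli(["subscribe", "alice@example.org"])
handle_request("POST", "/subscriptions", { "email" => "bob@example.org" })
puts handle_request("GET", "/subscriptions", {}).inspect
# ["alice@example.org", "bob@example.org"]
```

Because both front ends are thin wrappers over the same store, an admin in the terminal and an office manager in a browser are always operating on the same subscriber list.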
242 00:13:15,749 --> 00:13:16,910 [SPEAKER_01] It's highly structured. 243 00:13:17,311 --> 00:13:22,916 [SPEAKER_01] They use a tool called RSpec for automated code testing, and they're always trying to extend that test coverage. 244 00:13:22,936 --> 00:13:23,396 [SPEAKER_01] That's smart. 245 00:13:23,637 --> 00:13:29,542 [SPEAKER_01] Plus, they've adopted a formal code of conduct, and it's all licensed under the GNU GPL 3.0. 246 00:13:29,702 --> 00:13:31,424 [SPEAKER_00] Which keeps it permanently open source, right? 247 00:13:31,444 --> 00:13:32,565 [SPEAKER_01] Yeah, exactly. 248 00:13:32,986 --> 00:13:37,290 [SPEAKER_01] If we connect this to the bigger picture, this structured, welcoming documentation 249 00:13:37,770 --> 00:13:44,876 [SPEAKER_01] is exactly how open source projects try to lower the barrier to entry for beginners and potential new maintainers. 250 00:13:45,016 --> 00:13:47,578 [SPEAKER_00] They need it to be welcoming if they want people to stay. 251 00:13:47,818 --> 00:13:48,519 [SPEAKER_01] Absolutely. 252 00:13:48,559 --> 00:13:54,544 [SPEAKER_00] Which honestly, seamlessly transitions us back to the practical realities of deploying this kind of software. 253 00:13:54,884 --> 00:13:58,647 [SPEAKER_00] Because implementing these solutions requires strategic planning. 254 00:14:00,168 --> 00:14:02,590 [SPEAKER_00] And that brings us back to our sponsor, Safe Server. 255 00:14:02,818 --> 00:14:04,380 [SPEAKER_01] A vital partner in all of this. 256 00:14:04,640 --> 00:14:05,021 [SPEAKER_00] Truly. 257 00:14:05,361 --> 00:14:13,030 [SPEAKER_00] Whether you are a business, an association, or any other group, there is so much to gain by switching to an open source solution like Schleuder. 258 00:14:13,251 --> 00:14:16,333 [SPEAKER_01] The gains in compliance and data privacy are massive. 259 00:14:16,353 --> 00:14:16,793 [SPEAKER_00] Massive. 260 00:14:16,993 --> 00:14:17,153 [SPEAKER_00] Yeah. 
261 00:14:17,173 --> 00:14:21,476 [SPEAKER_00] And the significant cost savings over those proprietary giants we talked about earlier. 262 00:14:21,536 --> 00:14:23,717 [SPEAKER_01] You really can't overstate the cost factor. 263 00:14:23,938 --> 00:14:24,178 [SPEAKER_00] Right. 264 00:14:24,618 --> 00:14:26,939 [SPEAKER_00] And SafeServer can actually be commissioned for consulting. 265 00:14:27,320 --> 00:14:32,943 [SPEAKER_00] Whether the right fit for your organization is Schleuder or maybe a comparable open source alternative, they help you figure it out. 266 00:14:33,103 --> 00:14:34,364 [SPEAKER_01] They handle the heavy lifting. 267 00:14:34,824 --> 00:14:35,265 [SPEAKER_00] Exactly. 268 00:14:35,625 --> 00:14:38,126 [SPEAKER_00] All the way to secure operation on EU servers. 269 00:14:38,567 --> 00:14:42,169 [SPEAKER_00] You can find more information at www.safeserver.de. 270 00:14:43,035 --> 00:14:48,156 [SPEAKER_01] You know, exploring Schleuder leaves me with a lingering question for our listeners to ponder on their own. 271 00:14:48,176 --> 00:14:49,637 [SPEAKER_00] Oh, what's that? 272 00:14:49,857 --> 00:15:04,561 [SPEAKER_01] If tools that are deeply vital for our political freedom, our compliance, and our daily privacy are largely sustained by the dwindling free time of volunteer maintainers, what does that mean for the long-term fragility of our global digital infrastructure? 273 00:15:04,921 --> 00:15:08,226 [SPEAKER_00] Wow, that is a heavy thought to end on, but so important to consider. 274 00:15:08,827 --> 00:15:11,291 [SPEAKER_00] Thank you so much for joining me on this exploration of the sources today. 275 00:15:11,311 --> 00:15:11,912 [SPEAKER_01] Super pleasure. 276 00:15:12,032 --> 00:15:13,194 [SPEAKER_00] And thank you all for listening. 277 00:15:13,274 --> 00:15:14,737 [SPEAKER_00] We will see you on the next Deep Dive.