
Governing polyproto

· 4 min read
bitfl0wer

There's precedent suggesting that governing federated software as a one-man band is a bad idea. This raises the question: how would polyproto be governed?

{/* truncate */}

I am in a somewhat interesting situation

Polyphony and polyproto are, at the moment, "just hobbies" for me. However, I do still really want this all to succeed long-term. These wishes do involve a lot of planning ahead, including thinking about passing on the torch, so that this fire does not die with me. A pretty uncommon thing to say about a hobby, I think.

It is pretty clear to me that I do not want to be the sole ruler of all things polyproto. I simply don't have the time, energy, and skills to manage everything myself, should polyproto really become something that is used by a larger audience. polyproto must be an open standard, and in my opinion, making a standard truly open means eventually letting go and involving other people in decision-making processes. This is what I have been thinking about a lot lately, and I wanted to use this article as a space to share my thoughts and wishes for the future.

Democracy

To make a long story short, I believe that establishing a non-profit foundation or association that concerns itself with governing and developing polyproto is the way to go. The tasks of such an association would be:

  • Coordinating and overseeing work on the protocol: As polyproto sees increased usage, changes and additions to the core protocol will most definitely need to be made to align the protocol with real-world needs and wants voiced by users and developers alike. The polyproto association would have the final say over which changes are included in the next version of polyproto by reviewing, commenting on, and accepting/rejecting change proposals submitted by people/groups.
  • Foster a collaborative environment: I believe it would really suck if there were 12 (exaggeration) different p2-extensions defining what a Discord-like chat application should look like. The polyproto association should make efforts to foster, unify and certify "official" extensions and standards that all developers should use. I am not 100% sure what this would look like. Imagine someone develops a p2-extension defining what a Google Photos-like service should look like. This person now comes to the polyproto association wishing to get a "seal of approval" by having their extension certified. The association members notice that this extension is pretty awesome. What happens now? Should this extension be "absorbed" into the association somehow? Should only this exact version of the extension receive certification, requiring the developer to contact the association again whenever the extension is updated, to get the updated version certified as well? And if this is the case, say there happens to be a disagreement between the association and the developer on the contents of the extension update: what then? Does the association eventually "hard fork" the extension? This seems very complex, and I think that cases like these should be considered, given that sadly not all human beings get along with each other. In any case, the polyproto association should make it easier to get your extension proposal reviewed and seen by other people, so that extensions are created not in a vacuum, but through collaboration.
  • Act as a giant documentation resource: The association should be a great source of information for interested developers. Documentation and explanations of concepts should be available in various levels of complexity and should always be up-to-date.

Closing words

The time for something like this has not yet arrived. What this project needs in its infancy is rapid development and the freedom to change a great number of things quickly.

What I need right now though is to get healthy again. As I am writing this, the flu is still doing a number on me. Bleh.

Stay healthy and stay safe!

NLnet grant application

· 12 min read
bitfl0wer

The NLnet foundation is a non-profit organization that supports open-source projects. They have a grant program that funds projects that align with their goals. On behalf of Polyphony and the polyproto project, I have submitted an application for a grant of 10,000€ from the NLnet foundation in their funding round of October 2024.

{/* truncate */}

Should we be successful in our application, the grant will be used to fund the development of the Polyphony and polyproto projects, which would rapidly increase the development velocity of both projects and bring them one big step closer to being ready for a public alpha release.

The application required a bunch of different, interesting questions to be answered. I would like to share the answers with you, as they give a good overview of what we are working on, what we are planning to do, and considerations we have made in the past.

Can you explain the whole project and its expected outcome(s)?

polyproto is a new federation protocol with the goal of offering account portability and an effortless experience for users and developers alike. It is part of the Polyphony Project, which aims to be a truly competitive alternative to centralized, proprietary chat services like Discord. We came up with the idea for polyproto because of a lingering frustration with current federation protocols and their sub-optimal suitability for such a project.

polyproto is not limited to chat applications, nor is it incompatible with federation protocols such as ActivityPub. It would technically be possible to write a polyproto + ActivityPub server and client to offer new possibilities to the currently existing Fediverse. We want to empower users, not split userbases further.

Our goal is to deliver the Polyphony chat service with polyproto at its core, build great SDKs for other developers to work with, and also directly work together with other developers to get alternative implementations of polyproto-based chat services for users to choose from. Polyphony should be the ideal federated and decentralized Discord replacement: a service that can be used by teenagers, the elderly, and anyone in between, and which ideally does not require any additional technical knowledge or proficiency to use.

Have you been involved with projects or organisations relevant to this project before? And if so, can you tell us a bit about your contributions?

I (Flori Weber, bitfl0wer) had been following the Spacebar-Chat (formerly "Fosscord") project for some time before deciding to start the Polyphony-Chat GitHub organization in March 2023. The contributions I have made to the Spacebar project were limited to additions and overhauls of the project's documentation, because I lacked knowledge of TypeScript, the programming language primarily used in the Spacebar organization. Of course, I have prior experience in software development and software design through my work at Deutsche Telekom MMS GmbH, but this is my first project with such a topic. I am part of a software development community called "Commune", which is a home for individuals and groups who are also interested in federated and/or decentralized social media, as well as "reclaiming" the web from the hands of large corporations. There, I have access to like-minded individuals who are also very much interested in polyproto and in seeing polyproto succeed.

Requested Amount

$10,000

Explain what the requested budget will be used for? Does the project have other funding sources, both past and present?

There are three key things that I (Flori Weber, bitfl0wer) have been wanting to tackle for months now, but have never been able to do, because of a lack of available personal time or expertise. These three things are:

  1. Writing new material and, if necessary, reworking existing material that
    • Describes, in a condensed form, what polyproto is about, targeted towards people who do not have [a lot of] existing knowledge in the topics of federation and decentralized social networking concepts ($250-400)
    • Describes, in a condensed form, how polyproto works, targeted towards developers who might be interested in learning more about the inner workings of polyproto, without needing to read the entire protocol specification document ($250-400)
  2. Starting to build some sort of "brand" by
    • Commissioning an artist to create a recognizable logo for polyproto ($200-400)
    • Commissioning a frontend developer to build a landing page for our project, since non-developers do seem to prefer information hosted outside of GitHub and plain looking documentation pages. This landing page would also host the written material mentioned in 1. ($500-1500)
  3. Paying (freelance) developers to expedite this project's journey to completion, where we could use additional brains on the following tasks:
    • Paying developers to start integrating our polyproto crate into our "chorus" client library and "symfonia" server (~$2000)
    • Stabilizing and extending the symfonia server to host a first, publicly usable instance (~$1500)
    • Getting additional help from UI/UX designers and frontend developers to build a client mockup, then having this mockup translated into a client prototype which can be hosted alongside the symfonia server (~$2000)
    • "Overhaul"/Refactoring tasks which we, as a group of mainly university students working part time jobs in addition, simply did not yet have the time to get to (~$1000-1500)

This totals $7,700 or $9,700, depending on whether the lower or higher estimate is used. Additionally, I would like to use the funding to renew the domains polyphony.chat and polyproto.org for some years and top up our prepaid e-mail server hosted at Uberspace. The domains and the e-mail server make up most of our current operating costs, ranging between $7-12 a month. This might not sound like a lot in the grand scheme of things, but I am currently paying for these out of my own pocket as an undergraduate with little income, so being able to potentially reduce monthly expenses is a nice prospect.

We currently have no additional/other sources of funding and never have had any. Receiving funding from NLnet would thus be our first source of funding.

Compare your own project with existing or historical efforts

Spacebar Chat

As previously mentioned, Spacebar Chat is also working on an open-source Discord replacement in the form of an API-compatible server. However, they do not seem interested in making Spacebar Chat a federated chat application with good user experience, as their primary focus is to reverse-engineer and re-implement the Discord API spec-for-spec.

From talking to Spacebar maintainers, their code has reportedly accrued noticeable amounts of technical debt, which made it undesirable for most of the maintainers to continue development on the server. Being friends with some of the now mostly inactive maintainers, I considered forking the server repository to have an already mostly working starting point. However, due to the reports of technical debt and our organization's unfamiliarity with JavaScript/TypeScript, we decided to start from scratch, in Rust.

The accumulated knowledge that Spacebar contributors and maintainers have collected in the form of documentation has already been of great use to our project, and we are contributing back by updating documentation and creating issues when we find disparities between the behaviours of our own server implementation and theirs.

XMPP

Much like XMPP, I have decided to make polyproto an extensible protocol.

I am of the opinion that XMPP's biggest downfall is how many extensions there are: a server aiming to be compatible with other implementations of XMPP-based chat services practically has to implement all of the XEPs to be a viable choice.

polyproto is actively trying to circumvent this by limiting polyproto extensions (P2 extensions for short) to

  • either be a set of APIs and behaviours, defining a generic(!) version of a service. A "service" is, for example, a chat application, a microblogging application or an image blogging application. Service extensions should be the core functionality that is universally needed to make an application function. In the case of a chat application, that might be:

    • Defining message group size granularity: Direct messages, Group messages, Guild-based messages
    • Defining what a room looks like
    • Defining the APIs and behaviours required to send and receive messages
    • Defining the APIs and behaviours required to perform commonly sought after things, such as reacting to a message with an emoji
    • etc.

    The goal is that all different polyproto-based chat applications should then implement this shared behaviour. Of course, developers may absolutely add their own behaviours and functionality which is perhaps exclusive to their specific implementation. Core functionality remains commonly defined however, which should make all polyproto-based chat applications interoperable in these defined, common behaviours.

  • or describe a major technological addition, which can be used in the "requires" section of another P2 extension. This "requires" section can be thought of like the dependency list of a software package.

    Technological additions might be:

    • Defining APIs and behaviours needed to implement the MLS (Messaging Layer Security) Protocol
    • Defining APIs and behaviours needed to establish and maintain a WebSocket connection, and how to send/receive messages over this WebSocket connection.

By using clay-brick-sized building blocks instead of more LEGO-sized building blocks like XMPP does, we hope to mitigate this problem that we perceive, while still offering an extensible yet well-defined platform to build on.
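To make the "requires" idea from above concrete, here is a minimal Rust sketch of how a P2 extension's dependency list could be modeled and checked. All type names and extension names here are invented for illustration and are not part of any actual polyproto specification.

```rust
use std::collections::HashSet;

/// A P2 extension: either a generic service definition or a major
/// technological addition that other extensions can build on.
/// (Hypothetical model -- not the polyproto spec's actual format.)
struct P2Extension {
    name: &'static str,
    /// Names of other P2 extensions this one depends on, analogous to
    /// the dependency list of a software package.
    requires: Vec<&'static str>,
}

/// Returns the direct requirements of `ext` that a server advertising
/// `supported` cannot satisfy.
fn unmet_requirements(ext: &P2Extension, supported: &HashSet<&str>) -> Vec<String> {
    ext.requires
        .iter()
        .filter(|dep| !supported.contains(**dep))
        .map(|dep| dep.to_string())
        .collect()
}

fn main() {
    // A Discord-like chat extension building on two technological additions:
    let chat = P2Extension {
        name: "polyproto-chat",
        requires: vec!["polyproto-websocket", "polyproto-mls"],
    };
    let supported: HashSet<&str> = ["polyproto-websocket"].into_iter().collect();
    let missing = unmet_requirements(&chat, &supported);
    println!("{} is missing: {:?}", chat.name, missing);
}
```

A client or server could run a check like this before attempting to federate, so that incompatibilities surface up front instead of as failing requests.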

Matrix/Element

Matrix is perhaps the closest we have yet gotten to federated chat software aimed towards a general audience. However, as a strong believer in user experience, especially in how first impressions impact new, non-technical users, I believe that Matrix falls flat in this regard. A lot of people's first experience with Matrix is the infamous "Could not decrypt: The sender's device has not yet sent us the keys for this message". The protocol and its sub-protocols are vast and complicated and use bespoke cryptography protocol implementations such as Olm and Megolm, which, in the past, have already been the cause of high-caliber vulnerabilities (see: Nebuchadnezzar and, more recently, Soatok's blog post).

Matrix is truly impressive from a technical standpoint. Its minimally centralized architecture fills a niche which especially people already interested in technology seem to enjoy. However, this invariably means that user experience has to be compromised. It is my opinion that while Matrix is relatively good at what it does, it is not a good fit to be a potential Discord replacement.

As for a comparison: we are taking a radically different approach than Matrix. Matrix aims for eventually-consistent federation of events using cryptographically fully verifiable directed acyclic event graphs, whereas polyproto, and by extension Polyphony, prioritizes usability above all, intentionally disregarding highly complex or novel data structures in favor of cryptographic verifiability through digital signatures and a simple public key infrastructure.

What are significant technical challenges you expect to solve during the project, if any?

Currently, our trust model acknowledges that a user's home server is able to create sessions and session tokens on the user's behalf, and is thus able to listen in on unencrypted communications or, in the case of a truly malicious admin, even send messages on behalf of the user. This is not a novel problem, as it also affects all Mastodon ActivityPub servers in existence. Given that this potential abuse risk has not been a large issue in the Fediverse, we expect it to also not be a major problem for us. However, I would like to find additional mitigations or even a solution for this problem during further development of polyproto.

Another area that will likely need more work is my current design for how to connect to multiple servers at once: Currently, I expect every client to hold a WebSocket connection with each server that they are communicating with, at once. Depending on the amount of traffic, this could lead to constantly high resource consumption for clients. If this turns out to be the case, I am sure that we can find plenty of software- and protocol-side adjustments and improvements to implement - though it is still a potential technical challenge.
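The connection model described above can be sketched as a small piece of client-side bookkeeping. This is an illustrative sketch only: `Connection` is a stand-in for whatever WebSocket handle a real client would hold (e.g. one from tokio-tungstenite), and none of these names come from the polyproto specification.

```rust
use std::collections::HashMap;

/// Stand-in for a live WebSocket handle to one server.
#[allow(dead_code)]
struct Connection {
    server: String,
}

/// One client holds one connection per server it participates on.
#[derive(Default)]
struct GatewayPool {
    connections: HashMap<String, Connection>,
}

impl GatewayPool {
    /// Open a connection to `server`, reusing an existing one if present.
    /// Resource consumption grows linearly with the number of servers,
    /// which is exactly the scaling concern described above.
    fn connect(&mut self, server: &str) {
        self.connections
            .entry(server.to_string())
            .or_insert_with(|| Connection { server: server.to_string() });
    }

    fn open_connections(&self) -> usize {
        self.connections.len()
    }
}

fn main() {
    let mut pool = GatewayPool::default();
    pool.connect("home.example");
    pool.connect("guild-a.example");
    pool.connect("guild-a.example"); // reused, not duplicated
    println!("open connections: {}", pool.open_connections()); // prints "open connections: 2"
}
```

Possible protocol-side mitigations would then live inside `connect`, for example lazily opening connections or multiplexing through the home server.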

My last major area of concern is how well transmission and de-/serializing of the X.509 based Identity Certificates will work. I am optimistic about this however, since the X.500 series of RFCs are extremely well documented and already deeply explored, so that even if challenges arise in this area, I am certain that there is enough literature on the exact problem we might be facing, and enough people to ask/talk to.

Describe the ecosystem of the project, and how you will engage with relevant actors and promote the outcomes?

As the commercialization of Discord.com steadily increases, it is becoming clear that people are looking for a usable alternative. This is an audience that we are hoping to capture. Our Polyphony Chat service is Discord API compatible, so that actors may use the Polyphony client to interact with both Discord.com and polyproto-chat-based instances, and that existing bots and automations could potentially be ported over very easily. This essentially gives people looking for a Discord replacement exactly what they are looking for, as there should be little to no additional concepts, behaviors or patterns that users have to learn or re-learn to use our service.

As previously touched on, we are blessed to have already made a great number of connections to like-minded developers working on similar projects, who are looking optimistically towards polyproto as the tool to use to federate. I have also received explicit permission from Spacebar maintainers to promote our projects on their Discord guild, which currently counts 3600 members.

polyproto extensions

· 6 min read
bitfl0wer

polyproto is a new federation protocol. Its main focus is enabling seamless participation of one actor on many different servers. The core specification lacks routes for sending any sort of user-generated data anywhere, though. What is up with that?

{/* truncate */}

To federate is to be familiar

If any application wants to participate in the network of polyproto services, it has to speak the same language as those other services. When you want to send a message to a server that you are authenticated on, your client needs to know exactly what that HTTP request has to look like. This is nothing new. One take on a solution for this problem stems from the people working on the ATProtocol, who created Lexicon. From the atproto website:

:::quote "Lexicon TL;DR"

Lexicon is a global schema system. It uses reverse-DNS names like "`com.example.ping()`". The
definitions are JSON documents, similar to JSON-Schema. It's currently used for HTTP endpoints,
event streams, and repo records

The core of polyproto is supposed to be infinitely adaptable, to be flexible enough to be used for just about anything, which is why I do not want to force a fixed set of routes onto every single polyproto implementation.

Lexicon sounds interesting and really versatile! However, as mature as the idea itself might be, it is pretty complex and does not yet seem to have good community support in the form of libraries/crates to aid in working with this new schema system. I also do not want to force polyproto implementations to use a (potentially very complex) Lexicon parser and dynamic routing system thingymajig - although having "no rules" means that if you want to build a polyproto service which uses Lexicon, you absolutely can.

We need a common foundation

I am a big proponent of defining a set of (mutually independent) protocol extensions, which include additionally needed behavior and concrete HTTP routes for building a specific application. This has the following benefits:

  • If you'd like to build a polyproto chat client, and there's a polyproto-chat extension, you simply need to add the additional things required by that extension. No need for complex parsing! Code only what you need and do not care about the rest.
  • Mutual independence means being able to combine extensions however you'd like. You could, for example, create a chat app with integrated microblogging functionality.
  • Developers are free to come up with whatever they want. How about ActivityPub x polyproto? Since polyproto doesn't define a message format, this is absolutely possible!
  • Simplicity! polyproto and its "official" extensions will always just have plain old REST APIs, for which tooling is readily available. Why bother with something fancy and dynamic, when this does the trick?

On the other hand, everyone now has to agree on one extension to use for a specific application. You cannot, for example, participate on servers which use an extension completely different from the one that your client implements.

...the polyproto foundation. Get it? sigh

To develop, provide and maintain polyproto and some major "official" extensions (such as polyproto-chat), creating a non-profit foundation is likely a good idea for a future where polyproto is actually being used in the real world.

This could sort of be seen like the XMPP Standards Foundation, which develops and maintains XMPP extensions. Unlike XMPP's extensions however, official polyproto extensions should always be major additions in functionality. As an example: XEP-0084 is the official XMPP extension for User Avatars. An entire 12-point document, which describes one simple feature!

polyproto extensions should always either be a major technological addition which can be taken advantage of by other extensions (examples would be WebSocket gateways and Messaging Layer Security), or a document describing a set of routes which define a particular application use case (a Discord-like, a Reddit-like, a Twitter-like, and so on). Having official extensions adhere to these rules ensures that polyproto will not become a cluttered mess of extensions, and that it and its extensions stay easy to understand and implement, as less documentation has to be read and written.

Is this a bottleneck for me as a developer?

If you are a developer, you might ask yourself:

Implementing common chat behaviour sounds cool in terms of intercompatibility, but doesn't this
limit what I can do with my application? I have planned for a cool feature X to exist in my chat
service, but that doesn't exist in the protocol extension!

Extensions should be a usable minimum of common behavior that all implementations targeting the same "class" of application must share. Implementations can absolutely offer all the additional special/unique features they'd like, though. polyproto clients implementing the same extensions can be treated as clients with a reduced feature set in this case. What is crucial, however, is that the additional features do not prohibit "reduced feature set clients" from using the behavior described in the extension, if any sort of federation or interoperability is wanted.

Example: What works
In your implementation of a chat service, users can send each other messages with special
effects, such as fireworks, confetti and similar. A different implementation of polyproto-chat
is unlikely to see these special effects on their end. However, they can still see the messages'
text contents, send replies to the message, and do all sorts of other things as described in
this hypothetical polyproto-chat extension.
Example: What doesn't work
In your implementation of a chat service, users can send each other messages with special effects,
such as fireworks, confetti and similar. Your implementation requires every client to send
information about the special effect they'd like to send with a message - otherwise sending the
message fails. If this is the case and you haven't implemented a sort of "adapter" for other
polyproto-chat clients, these clients will not be able to send any messages to servers running
your chat software. This conflicts with the behaviour required by the polyproto-chat extension
and is therefore unacceptable.

Also keep in mind that, through clever engineering, it might be possible to write adapters for behavior that is required in your implementation but conflicts with the base extension. Picking up the "What doesn't work" example again, the implementer could simply "translate" message sending requests made to the polyproto-chat endpoints and add the required "special effects" information, stating that messages sent through polyproto-chat endpoints have no special effects added to them.
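As a hedged sketch of what such an adapter could look like in Rust (all type and field names here are invented for illustration, not taken from any real polyproto-chat extension): requests arriving on the generic chat endpoint are translated into the implementation's internal message type by filling in a default for the mandatory special-effect field.

```rust
/// A message as a generic polyproto-chat client would send it.
struct ChatMessage {
    author: String,
    content: String,
}

/// The hypothetical implementation's internal message type,
/// where an effect is mandatory.
struct FancyMessage {
    author: String,
    content: String,
    effect: Effect,
}

#[derive(Debug, PartialEq)]
#[allow(dead_code)]
enum Effect {
    None,
    Fireworks,
    Confetti,
}

/// The adapter: messages from plain polyproto-chat clients
/// simply get "no effect" attached, so sending never fails.
fn adapt(msg: ChatMessage) -> FancyMessage {
    FancyMessage {
        author: msg.author,
        content: msg.content,
        effect: Effect::None,
    }
}

fn main() {
    let incoming = ChatMessage {
        author: "xenia@example.com".to_string(),
        content: "hello from a plain polyproto-chat client!".to_string(),
    };
    let stored = adapt(incoming);
    assert_eq!(stored.effect, Effect::None);
    println!("{}: {}", stored.author, stored.content);
}
```

The important property is that the adapter only adds information; it never rejects a request that is valid under the base extension.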

Closing words

I am of the opinion that, while this way of having extensions might not be the most technologically advanced solution, it certainly offers many possibilities while being easy to understand and implement.

These are my current plans, ideas and thoughts for making a v1 of polyproto extensible. If you have any thoughts on this matter, please do let me know! You can contact me via email or by writing a message on our Discord.

Thank you for reading! :>

Happy pride month! 🏳️‍🌈🏳️‍⚧️💛🤍💜🖤

Work on polyproto and a "vacation" ⛱️

· 8 min read
bitfl0wer

In this little update post I write about what I've done in the last couple of weeks alongside talking about taking just a little break (don't worry, y'all are not getting rid of me!)

{/* truncate */}

It's been more or less two weeks since the last post - time for the next one!

A good amount of commits have been made since "X.509 in polyproto" was published. Let's break them down a little, shall we?


Certificate Signing Requests

The polyproto crate can now be used to create very basic - but to the best of my knowledge fully RFC compliant - Certificate Signing Requests! This is cool, because Certificate Signing Requests are how all Actors (Users) in polyproto will request a Certificate from their home server. The generated CSRs can be fully verified using the OpenSSL/LibreSSL CLIs, which is very important, as these two applications are the industry standard when it comes to working with cryptographic standards like X.509.

Specifically, polyproto uses the well-defined PKCS #10 standard to pack up and transport all the needed CSR information to your future home server.
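For readers unfamiliar with PKCS #10, the following Rust sketch names the pieces a CertificationRequest carries, per RFC 2986. It is a deliberately simplified, illustrative model and not the polyproto crate's actual API; real code would use proper ASN.1/DER types.

```rust
/// The data an actor assembles and signs when requesting a certificate.
/// Field types are simplified for illustration; a real implementation
/// uses ASN.1/DER structures throughout.
struct CertificationRequestInfo {
    version: u8,                       // always 0 in PKCS #10 v1.7
    subject: String,                   // the requester's distinguished name
    subject_public_key: Vec<u8>,       // the public key to be certified
    attributes: Vec<(String, String)>, // extra info, e.g. requested extensions
}

/// The full CSR sent to the (future) home server: the signed info block,
/// plus a signature proving control of the matching private key.
struct CertificationRequest {
    info: CertificationRequestInfo,
    signature_algorithm: String,
    signature: Vec<u8>,
}

fn main() {
    let csr = CertificationRequest {
        info: CertificationRequestInfo {
            version: 0,
            subject: "CN=xenia,DC=example,DC=com".to_string(), // made-up subject
            subject_public_key: vec![0u8; 32],                 // placeholder key bytes
            attributes: vec![],
        },
        signature_algorithm: "ED25519".to_string(),
        signature: vec![0u8; 64], // placeholder signature bytes
    };
    println!("CSR for {} ({})", csr.info.subject, csr.signature_algorithm);
}
```

The home server's job is then to check the self-signature, validate the supplied information, and, if everything checks out, issue an ID-Cert binding the subject to that public key.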

The next steps here are:

  • Creating validators for the information supplied in the CSRs
  • Implementing methods to create an ID-Cert from a CSR
  • Writing great documentation for what exactly the data inside of the ID-CSR has to look like to be valid

...and as you might have already guessed, I am already working on all of these things! :) They just take time™️


Cleaning up

As fun as designing APIs and software architecture is for me, I don't yet always get all of it right on the first try. This is fine though, as long as you recognize the mistakes you've made, learn from them, and clean up the mess you've made.

I noticed that, as well-meant as some of the traits and trait bounds I've added were, they made implementing polyproto's base types and traits a lot harder than needed. I've been chipping away at the unnecessary and redundant bits, removing some of these traits entirely.


Updating the specification document

I really wanted to get started on a reference polyproto implementation before finishing the specification document. This might seem a little counterintuitive, but my thought process was that implementing the crate in code would force me to think about everything from scratch again, which would make it much easier to spot mistakes I potentially made when writing the specification document. These mistakes would primarily be:

  • Information that is there, but unimportant
  • Information that is important, but not there
  • Information that is important, there, but wrong

This turned out to be right. I have added a lot of "TODO"s and "FIXME"s to the specification document since I started working on the polyproto crate. All of these TODOs have since been worked on and removed! This doesn't mean that the specification document is now perfect, but it's already better than before, and it'll only get better as I continue to work on the crate!

Another notable thing that happened is removing the auth part from the core polyproto protocol! You might be thinking "whaaaat? does that mean that there will be no authentication in polyproto??", but I can assure you that that's not what this means. Removing the authentication endpoints from the core protocol means that polyproto extensions can now choose authentication technologies and methods for themselves, instead of being forced to implement a bunch of REST-based authentication endpoints they might not even want or use anyway.

I would like to thank @laxla@tech.lgbt for this idea! :> Collaboration and feedback are truly great things, and I am happy to have such a nice group of people on Discord and Matrix who are genuinely interested in the silly thing I/we want to do with Polyphony and polyproto :)


Now for the perhaps biggest and probably most important announcement:

Taking a little break for my silly mental health


It just dawned on me that March 8th marks the one-year anniversary of Polyphony!! That's genuinely so cool, and means that this is the project I have worked on the longest out of all of my personal projects.

So yeah - it's been almost a year now! And not a lazy one for me, either.

bitfl0wer's (mine) commit graph for the past year. A lot of green squares, signalling a lot of commits/activity that has been made. The text above the graph reads: "2,858 contributions in the last year"

"Content warning"
The following paragraph covers the topics of anxiety and depression. If you would not like to read about this, feel free to
scroll down until you see a big green box with a check mark. The box indicates that it is safe for you to read again!

Big shocker: I am 👻👻👻👻 depreeeeeeessed 👻👻👻👻👻, and have been for the past... 4-6 years of my life. In that time, I have experienced the absolute lowest points of my life. Luckily, I have the absolute privilege to have a great therapist who I have been with for 2 years now, and I am also on medication which already does a good job (most of the time) at taking the edge off the depression.

As it has been explained to me by my therapist, medication should only be a crutch, though. It should not be the tool you should solely rely on for the rest of your life to deal with extreme (social) anxiety and depression. Other, non-medication-related options should be tried, to potentially get you to stop having to take medication to feel non-completely-absolutely-positively-awful every day.

One of these options is therapy, and, as I've mentioned, I've already been doing that for 2+ years now. It has helped me a great, great deal already, and I can absolutely encourage anyone reading who is feeling similarly to how I've described and who is in the lucky position to get (or at least be put on a waiting list for) therapy, to take the first step. It isn't easy; it can actually feel really really scary at first. But do believe me when I say that a good therapist can absolutely help you to get better.

But one hour of therapy a week can sadly only do so much. This is why I, with the encouragement of my friends, loved ones (particularly my lovely, lovely girlfriend) and my therapist, have decided to admit myself into a mental health clinic that specializes in the treatment of depression, anxiety disorders and the like.

Safety checkpoint reached!
It's now over! :)

Starting on March 20th, I will be leaving my everyday life, my girlfriend, my friends, laptop, work, personal projects and everything else behind to go there, and hopefully leave a good chunk of the bad parts of me behind when I come back.

The clinic is far away though, and leaving absolutely everything behind for a month or possibly a little longer is really, really scary to me. However, I think and hope that the metaphorical plunge into icy water will be worth it for me and my mental health.

When I come back, I'll be better than I was before, which will also mean that I can hopefully be more happy and productive in all aspects of my life, including Polyphony.

If you're reading this on or after March 20th, then see you on the other side :) I hope the grass is greener there!

:::quote "BEGPOSTING ON MAIN"

I am lucky and extremely privileged to have been growing up in Germany, a country with a (mostly) functioning social welfare system and universal health care.
If this wasn't the case, I'd likely be absolutely unable to afford to put myself into such good care. Germany doesn't pay for everything though, and the
train rides to and from the clinic will likely be expensive for me, as will the 10€ daily fee for staying at a clinic (capped at 280€).

I can currently afford this without financially ruining myself, so don't worry about that. However, this whole endeavour will take a good chunk out
of my current savings. Thus, if you'd like to [donate to my ko-fi](https://ko-fi.com/bitfl0wer) to help me cover the costs, it would mean a lot to me! \<3

Please only do so if you are in a stable financial standing yourself, though. As I said, with or without tips, I'll manage. :)

X.509 in polyproto

· 7 min read
bitfl0wer

This blog post covers a bit about how and why X.509 is used in polyproto, and how we try to make the process of implementing your own server and incorporating it into an existing network a little easier.

{/_ truncate _/}

:::quote "Author's note"

Before knowing and reading about the X.500- and PKCS-series of RFCs, I legitimately thought
that implementing my own certificate standard for polyproto would be a good idea! Looking back,
this is **incredibly** naive. But learning new things and improving myself is one of the
biggest joys I experience when writing software, so this humbling experience was totally worth
it for me, personally.

polyproto is a federation protocol that uses X.509 Public Key Infrastructure (PKI) to prove and federate your identity across a whole network of decentralized services.

X.509

Specifically, polyproto leverages the already well-documented and widely used X.509 standard at its core. X.509 was chosen over OpenPGP because of its comparative simplicity: OpenPGP's Web of Trust often requires active user input to assign trust levels to users and their keys, which is not in line with our ideas and goals for user experience in a decentralized system. Ideally, decentralization and federation are as seamless as possible for the end-user, and X.509, with its Certificate Authority (CA for short) model, is the better fit for such a goal. In fact, X.509 can be so seamless to the end-user that you have probably forgotten that you are already using it right now!

HTTPS (SSL/TLS) certificates are likely the most popular form of digital certificate out there, and they're implemented in a way where the only time we humans ever have to think about them is when our browser tells us that a certificate from a website we're trying to visit is no longer valid.

This popularity is great news for polyproto, because it means that mature tooling for all sorts of programming languages exists today, along with tutorials and documentation, teaching potential implementers how everything works.

How polyproto uses X.509, briefly

In polyproto, home servers act as Certificate Authorities, while each client you connect from has its own end-user Certificate, issued by your home server. With certificates, you can prove your identity to any person or server at any time. Certificates are also used to verify the integrity of data sent across the polyproto network.

If servers and clients have well-implemented cryptography, it should be extremely unlikely - if not impossible - for non-quantum-based, non-supercomputer-cluster home servers to alter the contents of a message before passing them on to the recipient.

:::quote "Author's note"

:nerd: Technically, polyproto and X.509 absolutely support Post-Quantum Hybrid Digital
Signatures. If these Hybrid Digital Signatures use well-made Post-Quantum Signature schemes
and are implemented well, polyproto also offers post-quantum-computing resilience. There
seems to be very little easy-to-understand reading material on hybrid schemes out there.
The best and most accessible definition or explanation of hybrid schemes I could find is
[this one, in the document "A Hybrid Signature Method with Strong Non-Separability"](https://www.ietf.org/archive/id/draft-nir-lamps-altcompsigs-00.html#name-non-separability).

In short, clients generate a PKCS #10 Certificate Signing Request (CSR). This CSR includes some information about the client. In polyproto's case, this information is:

  • session ID
  • federation ID
  • algorithm used to generate the public key attached to the CSR
  • the public key attached to the CSR
  • a signature which is verifiable using the attached public key, validating all of the aforementioned information

This CSR is sent to your home server, which verifies this information and in turn responds with a polyproto X.509 Certificate (ID-Cert).
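The CSR contents listed above can be sketched as a plain Rust struct. This is only an illustration of what information travels in the request — the names and types here are hypothetical and do not come from chorus or the polyproto specification, and a real implementation would use DER-encoded PKCS #10 structures rather than ad-hoc concatenation:

```rust
// Illustrative sketch of the data carried in a polyproto CSR.
// All field and type names are hypothetical, not a real API.
struct CertSigningRequest {
    session_id: String,    // identifies the client session
    federation_id: String, // e.g. "xenia@example.com"
    algorithm: String,     // identifier of the public-key algorithm
    public_key: Vec<u8>,   // the public key attached to the CSR
    signature: Vec<u8>,    // self-signature over the fields above
}

impl CertSigningRequest {
    /// The byte string the client signs: every field except the
    /// signature itself. A real implementation signs the DER encoding.
    fn signed_payload(&self) -> Vec<u8> {
        let mut buf = Vec::new();
        buf.extend_from_slice(self.session_id.as_bytes());
        buf.extend_from_slice(self.federation_id.as_bytes());
        buf.extend_from_slice(self.algorithm.as_bytes());
        buf.extend_from_slice(&self.public_key);
        buf
    }
}
```

The key property is that the signature covers all of the other fields, so the home server can verify that whoever holds the private key really produced this exact request.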

Home servers get their root certificate by self-signing a CSR. Unlike actor/client certificates, the home server root certificate features X.509 extensions such as the "Basic Constraints" attribute, marking its certificate as a CA certificate, allowing the home server to sign CSRs using this certificate.

But it's not all perfect

Root Certificates in the context of HTTPS and the modern, SSL/TLS protected web are a big source of centralization. This centralization might be necessary to a degree, but it inevitably means less plurality, and way more hoops to jump through, should you also want to be a CA.

To give context for those who might need it: essentially, every certificate for every website out there must be traceable back to one of the root certificates installed on your internet-capable device's operating system or web browser. This creates an incredible amount of centralization, because one Root Certificate Authority is directly responsible for hundreds of thousands, if not millions, of websites. This dependency on a few privileged Root CAs has been monetized, which is why getting an SSL/TLS certificate for your website used to cost money (and depending on who you are, it might still be that way). Nowadays though, Let's Encrypt exists, offering free SSL/TLS certificates, with the caveat that these certificates are only valid for three months at a time.

What can we do about this?

To keep open polyproto networks open for everyone, polyproto should make centralization to the degree of modern-day SSL/TLS infeasible.

An approach we are taking is limiting the length of the certification path.

In X.509, to validate and trust a certificate, you must also trust all the other certificates leading up to the Root Certificate of the Certificate Tree.

Example
To trust `Leaf Certificate 1`, one would have to also trust the certificates held by the
`Middleman CA`, `CA 1` and the `Root CA`.

This path from the certificate you are actually trying to validate to the Root Certificate is referred to as the certification path. By arbitrarily limiting the length of this path, it becomes harder for one certificate authority to issue and manage a great (1,000,000+) number of certificates, due to the increasing amount of processing power required to handle web requests and to verify and sign CSRs.

In polyproto, the maximum length of this certification path is 1, meaning a Root Certificate may only issue leaf certificates. Cutting out middlemen makes it hard to scale to monstrous levels of centralization, as the control one CA can have over the entire network is limited.
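A verifier enforcing this depth limit could look roughly like the following sketch. This is heavily simplified — real X.509 path validation also checks signatures, validity periods, name constraints and more — and the `Cert` type is a stand-in, with `is_ca` representing the "Basic Constraints" CA flag:

```rust
// Minimal model of a certificate chain, ordered root-first.
// `is_ca` stands in for the X.509 "Basic Constraints" CA flag.
struct Cert {
    is_ca: bool,
}

/// Accepts a chain only if it is exactly [root CA, leaf]:
/// a certification path of length 1, with no intermediate CAs.
/// (Simplified: real validation also verifies signatures, etc.)
fn chain_is_valid(chain: &[Cert]) -> bool {
    match chain {
        [root, leaf] => root.is_ca && !leaf.is_ca,
        _ => false,
    }
}
```

Under this rule, a chain containing any middleman CA is rejected outright, which is exactly what makes the SSL/TLS-style reseller hierarchy impossible to build on top of polyproto.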

All of these factors combined should always make developing or hosting your own home server a viable option.

:::quote "Author's note"

To clarify, this does not mean that polyproto servers will only be able to handle a small amount
of users, or that polyproto is designed for small-userbase scenarios. A well-implemented
and fast home server implementation should, with the given resources, be able to handle a great
number of registered users. This shallow-depth trust model should aid in stopping trust
hierarchies with great amounts of influence over the network from forming.

However, real-life power distribution scenarios can be unpredictable, which
means that the efficacy of limiting the certificate path length as a measure to prevent
centralization can only be proven once polyproto is deployed in the real world.

If you have any questions or feedback, feel free to reach out to me via email at flori@polyphony.chat. OpenPGP is supported, and my public key can be found on keys.openpgp.org.

Account migration in polyproto

· 6 min read
bitfl0wer

Account migration is an important and difficult thing to get right in federated systems. In this blog post, I will outline how I imagine account migration to work in polyproto, and what benefits this approach brings.

{/_ truncate _/}

It seems that striking a good balance between user experience, convenience and privacy has been a difficult task for many federated systems when it comes to account migration. polyproto's approach to how data is distributed and stored, and how identities are managed, makes it possible to have a very smooth and secure account migration process.

The problem

Using Mastodon as an example: when a user wants to move from one instance to another, they have to create a new account on the new instance and re-follow all the people they were following on the old account. All the toots and other data from the old account are left behind, and there is no way of porting them over to the new account. This problem has been around for a long time, and it is not just a problem with Mastodon, but with many other federated systems as well.

How polyproto works, briefly

In polyproto, your federation ID, e.g. xenia@example.com, is what identifies you. If you want to use this identity on a client, your client will generate a key pair for a certificate signing request, and send this request to your home server. Given that you didn't provide any invalid data, your home server will sign the certificate, and send it back to you.

Any data you send to anyone - be it a chat message, a social media post, or anything else - is signed using your private key. This signature can be verified by anyone using your public key, which is part of the certificate you received from your home server. To check a certificate's validity, you can ask the home server for its root certificate and verify the signature on the certificate you received.

This means:

  • All the data you send is cryptographically tied to your identity
  • Anyone can verify that the data was actually sent by you
  • Anyone can verify that the data was not tampered with by anyone else
  • Everybody can verify that you are who you say you are

This is even true when you are sending data to a different server than your home server.

Migrating an account on polyproto

Low data centralization

Fundamentally, the process of migrating an account in polyproto relies mostly on changing data ownership, rather than moving data around. This works best in scenarios where data is highly distributed, and not stored in a central location.

Example
This might be the case in a social chat messaging system
similar to Discord, where messages are stored on the servers of the people hosting the chat rooms.

When you want to move your account from one server to another, you:

  1. First, create a new account on the new server
  2. Then, you configure the new account to back-reference the old account
  3. Next, if you are able to, you tell your old home server about the move
  4. Last but not least, you verify to the servers storing your data that you are the same person as the one who created the old account. The servers then update the data ownership to your new account. This is done by using your old private key(s), in a way that does not reveal your private key(s) to anyone else.
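The steps above can be sketched as a small state machine. This is purely illustrative — none of these type or method names come from the polyproto specification, and the actual cryptographic ownership proof is elided — but it makes one property of the design explicit: notifying the old server is a side effect, not a required step, so the flow completes without it:

```rust
// Hypothetical sketch of the migration flow; names are illustrative
// and the cryptographic ownership proof (step 4) is elided.
#[derive(Debug, PartialEq)]
enum MigrationState {
    Started,
    NewAccountCreated,
    BackReferenceSet,
    OwnershipReassigned,
}

struct MigrationFlow {
    state: MigrationState,
    old_server_notified: bool,
}

impl MigrationFlow {
    fn new() -> Self {
        Self { state: MigrationState::Started, old_server_notified: false }
    }
    /// Step 1: register the new identity on the new home server.
    fn create_new_account(&mut self) {
        self.state = MigrationState::NewAccountCreated;
    }
    /// Step 2: point the new account back at the old one.
    fn set_back_reference(&mut self) {
        self.state = MigrationState::BackReferenceSet;
    }
    /// Step 3 (optional): the old server does not need to be reachable.
    fn notify_old_server(&mut self) {
        self.old_server_notified = true;
    }
    /// Step 4: prove ownership using the old key(s); data-holding
    /// servers then re-attribute the old account's data.
    fn reassign_ownership(&mut self) {
        self.state = MigrationState::OwnershipReassigned;
    }
}
```

Note that `reassign_ownership` can be reached whether or not `notify_old_server` was ever called — mirroring the point made in the note below that the old server may be offline or uncooperative.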

If applicable, your friends and followers will also be notified about the move, keeping existing relationships intact.

note
This entire process does not rely on the old server being online.
This means that the process can be completed even if the old server is down, or if the old server
is not cooperating with the user.

However, including the home server in the process improves the
general user experience. If you, for example, have included your federation ID as part of another,
non-polyproto social media profile, the old server can automatically refer people to the new account.

Moving data

Should data actually need to be moved, for example when the old server is going to be shut down, or if the centralization of data is higher, the migration process is extended by a few steps:

  1. Using the old account, your client requests a data export from your old home server.
  2. The old home server sends you a data export. Your client will check the signatures on the exported data, to make sure that the data was not tampered with.
  3. You then import the data into your new account on the new home server.

Conclusion

polyproto's approach to account migration is very user-friendly, and does not require the user to do anything that is not already part of the normal usage of the system. The process is also very secure, as it relies on the cryptographic properties of X.509 certificates, and also works across a highly distributed data model, which, in my opinion, is how the internet should be.

The biggest drawback to this approach is that there are a whole lot of web requests involved. Depending on the amount of data, this can take some minutes or possibly even hours.

It is also worth noting that all of this does not require any new or young technology. polyproto relies on X.509 certificates, which have been around for a long time, and are widely used in many different applications. This means that the technology is well understood, and that there are already many great tools in all sorts of programming languages available to work with it. From my point of view, there is no need to reinvent the wheel.

I hope that this article has given you a good understanding of how account migration works in polyproto. If you have any questions or feedback, feel free to reach out to me via email at flori@polyphony.chat. OpenPGP is supported, and my public key can be found on keys.openpgp.org (click to download pubkey).

Porting chorus to WebAssembly + Client Update

· 2 min read
bitfl0wer

What the current state of GUI libraries in Rust means for Polyphony and chorus, and why we are porting chorus to WebAssembly.

{/_ truncate _/}

Hi all!

To make this part of the post short: the web-based client will be worked on before the native one, if there will ever even be one. The reason is that no currently available native Rust GUI library meets the standards I'd like to see when building an application I am putting my name behind. I'd like to have

  • accessibility
  • great styling
  • cross compilation
  • memory safety

and the current state of Rust GUIs essentially tells me to "pick three", which is unacceptable to me. A WebAssembly based application is the best we'll get for now, and I am fine with that.

Compiling to WebAssembly isn't all that easy though: the wasm32-unknown-unknown target intentionally makes no assumptions about the environment it is deployed in, and therefore does not provide things like a `net` or filesystem implementation (amongst other things). Luckily, adding support for this compilation target only took me a full 40-hour work week [:)], and we are now the first Rust Discord-API library (that I know of) to support this target.
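Supporting a target that lacks `std::net` usually means swapping implementations at compile time with `cfg` attributes. The sketch below only illustrates that pattern — the function bodies are placeholders, not chorus's real code:

```rust
// Sketch of per-target implementation swapping via cfg attributes.
// Bodies are placeholders; a real crate would route wasm32 requests
// through the browser (e.g. via wasm-bindgen) instead of std::net.

#[cfg(target_arch = "wasm32")]
fn http_backend() -> &'static str {
    // No std networking here: a real implementation would call
    // into the browser's fetch machinery.
    "browser-fetch"
}

#[cfg(not(target_arch = "wasm32"))]
fn http_backend() -> &'static str {
    // Native targets can use an ordinary socket-based HTTP client.
    "native-sockets"
}
```

Because the split happens at compile time, the rest of the library can call `http_backend()` without caring which target it was built for.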

You might not have yet heard much about WebAssembly: In the past, web developers could only really use three languages - HTML, CSS, and JavaScript - to write code that browsers could understand directly. With WebAssembly, developers can write code in many other languages, then use WASM to convert it into a form the browser can run.

This is particularly helpful for programs that require a lot of computing power, like video games or design software. Before, running such programs in a browser would be slow or impossible. WebAssembly can make these run smoothly, right in your web browser.

Overall, WebAssembly is expanding the kinds of applications that can be run on the web, making the web a more flexible and powerful place to work and play. Compiling Chorus for WASM allows us to leverage this fairly new technology and bring all of Rust's benefits into a web context.

The next blog post will likely be about progress with the web-based client. See ya until then! :)

Getting started with the Polyphony Client

· 2 min read
bitfl0wer

{/_ truncate _/}

Us labeling Chorus to be in a public-alpha state was really great news for me, for a lot of reasons! It marked a point in Polyphony's history where, after all these months of work, we agreed that what we have is good enough to be shown to the public, and that's always a nice thing when you invest so much of your free time into a project. The other main reason why this is such a great thing is that this alpha state (at least to me) means that the public API is kind-of stable, or at least stable enough that I, the project lead, can rely on the fact that all the public methods will not, in fact, be replaced in 4 days.

This means that I can finally start working on the Client! And I have done that! For the past 2? 3? days, I've been tinkering around with Iced-rs (a really, really great UI framework for Rust, written in Rust) and the client repository to create the 'skeleton' of the application. While this is definitely not trivial, especially since I have no prior experience in desktop application development, it's also not too hard either.

While Iced is not mature yet, and "how-to" guides, as well as the promised Iced-book, are still largely missing, the maintainers have done a great job with providing a LOT of code examples and solid rustdocs. It's a fun library/framework to work with, and the Elm-inspired approach of dividing up State, Messages, View- and Update-Logic feels really intuitive and seems to make sure that your Application will never end up in an unexpected state.
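The Elm-inspired split can be shown in plain Rust without pulling in Iced itself. This is a minimal sketch of the pattern, not Iced's actual API — all state changes flow through one `update` function driven by `Message` values, and the view is a pure function of the state:

```rust
// Minimal Elm-style core in plain Rust (not Iced's real API):
// State + Messages + update logic + a pure view function.
#[derive(Default)]
struct State {
    counter: i64,
}

enum Message {
    Increment,
    Decrement,
}

/// The ONLY place state is ever mutated.
fn update(state: &mut State, message: Message) {
    match message {
        Message::Increment => state.counter += 1,
        Message::Decrement => state.counter -= 1,
    }
}

/// The view renders purely from state; here it just builds a string.
fn view(state: &State) -> String {
    format!("count: {}", state.counter)
}
```

Because every mutation goes through `update`, the set of reachable states is exactly the set of states `update` can produce, which is what makes it hard for the application to end up somewhere unexpected.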

That's all I have for today. Thanks for reading this! Here's a video of multi-user login already working ^^

chorus Alpha 0.1.0

· 2 min read
bitfl0wer

We are alpha now! As of 2 days ago, the first Alpha of Chorus, Version 0.1.0, has been released for everyone to look at and use on crates.io!

{/_ truncate _/}

So, is the library complete now? No. And yes! It's, well, complicated... Let me explain!

Chorus is at a point where I can comfortably say that, if you take voice-support out of the calculation for a bit, the foundation feels rock-solid, easy to work with and easily expandable. However, to stay with our house/building metaphor for a bit, the walls aren't painted yet, there's barely any furniture and not all of the electrical outlets have been installed yet.

Okay, enough with this bad metaphor; what I meant to convey is that a lot of the API endpoints have not yet been implemented, and there are at least a few points we haven't addressed yet - like Gateway Error Handling, to name an example.

But for an early Alpha, this, in my opinion, is absolutely acceptable. Implementing API endpoints is something that even someone entirely new to Rust could probably do, given how much we've streamlined the procedure, and the other stuff can comfortably be fixed without having to make any major changes to the internals.

I, for one, am currently experimenting around with the Polyphony Client, which, by the way, will likely be written with Iced as a GUI Framework, not GTK. I have no prior experience in GUI/Desktop Application development, but I am feeling more confident than ever and I'm eager to learn all there is to know about these topics.

That's that! Seeya next time. Cheers,

Flori

Self-updating structs, moving blog posts to GitHub, and more

· 3 min read
bitfl0wer

Introducing self-updating structs, explaining how they work, and what they are good for. Also, moving blog posts to GitHub, and other improvements.

{/_ truncate _/}

It has been a while since the last update post - 1 month, to be precise! I haven't gotten around to writing one of these, mostly because of personal time and energy constraints. However, now that these resources are finally replenishing again, I figured that it is once again time!

Moving Blog Posts to GitHub

This is a pretty self-explanatory point. I thought that opencollective would find more use by me and other polyphony-curious folk; however, this didn't go as planned. Also, opencollective made their Discord embeds really poopy, which is why I am moving all the blog posts over to GitHub.

A big one: Self-updating structs

Ideally, you want entities like Channels, Guilds, or Users to react to Gateway events. A Gateway event is basically a message from Spacebar/Discord to you, which says: "Hey, User x has changed their name to y!". If you can reflect those changes immediately within your code, you save yourself from having to make a lot of requests and potentially getting rate-limited.

This is exactly what self-updating structs set out to solve. The first implementation was done by @SpecificProtagonist and me (thank you a lot again, btw) on the 21st of July. However: this implementation, being in its infancy, had some design flaws, which made it pretty clear to me that this whole thing needed to be thought through a little better.

The second iteration of these self-updating structs was finished... today, actually, by me. It saves memory compared to the first iteration by storing unique objects only once, instead of once per reference. While this way of doing things is really efficient, it has also been a pain in the ass to make, which is precisely the reason why this took me so long. I've learned a lot along the way though.
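The core idea can be sketched with standard-library shared pointers. This is not chorus's actual implementation — just an illustration of the "store once, reference everywhere" principle: every place that refers to a user holds an `Arc` to one shared object, so applying a gateway event once updates every reference:

```rust
use std::sync::{Arc, RwLock};

// Sketch of the idea behind self-updating structs (not chorus's real
// API): one shared object per unique entity, referenced via Arc, so a
// single gateway-event update is visible to every holder of a reference.
struct User {
    name: String,
}

type SharedUser = Arc<RwLock<User>>;

/// Apply a hypothetical "user changed their name" gateway event.
fn apply_name_update(user: &SharedUser, new_name: &str) {
    user.write().unwrap().name = new_name.to_string();
}
```

Cloning the `Arc` is cheap (it only bumps a reference count), which is what makes it viable to hand the same user object to every message, guild member list, and so on, instead of duplicating it per reference.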

The public API has also gotten a lot better in "v2". This is mostly because I am a big believer in writing tests for your code, and through writing what are essentially real-world-simulation examples, I noticed how repetitive or stupid some things were, and thus could improve upon them.

Having this whole thing finished is a big relief. This self-updating thing is an essential feature for any Discord/Spacebar compatible library, and I think that we implemented it very nicely.

Documentation and other improvements

@kozabrada123 took it upon himself to re-write a lot of the code's documentation. Thanks for that! This will massively improve the ease of use of this library - both when developing for and with it. koza also improved our CI/CT pipeline by incorporating build-caching into it, which speeds up builds.

This has been the last month of Polyphony. In the coming weeks, I will be working on

  • Implementing self-updating-struct behavior for every struct which needs it
  • Fixing bugs
  • Adding more features, like emojis, 2FA, Guild Settings, etc.!

See ya next time!