Karissa McKelvey on the Web of Commons

Karissa McKelvey from the Dat Project provides an overview of the new decentralized Internet and the need to insert commons thinking and practices into this new space. This text is based on Karissa's 2017 Full Stack Fest keynote and was originally published in the Dat Project's Blog.


Karissa McKelvey: In the 18th and 19th centuries it was thought that property ownership was the only way to protect common resources such as grazing pastures. As Garrett Hardin famously put it: “The individual benefits as an individual from his ability to deny the truth even though society as a whole, of which he is a part, suffers.”

It was thought that communities that act only in rational self-interest destroy the common pool resource they are sharing. This is described as “the tragedy of the commons”: that isolated, autonomous individuals will always choose the path that is best for them as individuals.

Elinor Ostrom introduced a new body of research to challenge this. Over 40 years of research, she showed that Hardin had exaggerated the problems involved in managing a commons. In 2009, Elinor Ostrom became the first woman to win the Nobel Prize in economics. She talked about how people actually are able to come together to form norms and rules that sustain their mutual cooperation. For example, she went to Nepal and studied how people there were managing their own irrigation systems. She found that communities that follow eight principles, a sort of blueprint, can self-govern and sustain a shared resource without depleting it.

What about applying this to the internet? Before her death in 2012, Ostrom published a book with Charlotte Hess called Understanding Knowledge as a Commons. This book laid the groundwork for thinking of digital knowledge as a commons (that is, the digital artifacts in libraries, wikis, open source code, scientific articles, and everything in between).

The Internet as a Commons

Looking at the internet as a commons — as a shared resource — allows us to understand both its unlimited possibilities and also what threatens it.

What threatens the internet? Right now, private companies that control large parts of the internet are trying to prevent the internet of commons. If products fail or are deemed not economically viable (for example Vine or Google Reader), the whole suffers. Monopolies, like Google, are able to keep their power by influencing the political landscape. In the internet of commons, however, monopolies would no longer be in control, and users would be trusted to self-govern the commons.

Decentralization is the most recent proposal for a technological means to get away from this and give power back to users. In a decentralized world, users get to control the contracts of the website, can choose to fork that website, re-host data to fix broken links, evade censorship, and overall take ownership of their data. Freedom of expression, privacy, and universal access to all knowledge should be inherent to the web. But right now, those values are not.

Locking the Web Open

Thinking of the internet as a commons allows us to think of different ways we can moderate and grow spaces, allow innovation to flourish, and improve the quality of knowledge and information sharing. As Brewster Kahle puts it, decentralization ‘Locks the Web Open.’

I’m not just dreaming of a new world with Brewster Kahle about the future of the internet. The internet of commons is here today. Peer-to-peer (p2p) applications already exist, are being built, and are used by real users as we speak — you can build one too! Secure Scuttlebutt, for example, is a completely p2p protocol for syncing data. Patchwork is a social networking application built on top of the Secure Scuttlebutt protocol. People can join a public server and make friends, then use a gossip approach to find friends of friends. Many early adopters come from IRC and have started using Patchwork instead. It’s immensely successful as a little protocol and you can build something with it today.

Dat is inspired by BitTorrent and built in a similar fashion to Scuttlebutt. It is a decentralized protocol for storing, versioning, and syncing streams of potentially very large datasets. We’re a non-profit, funded by grants and, so far, we’ve operated more like a research lab than a company.

A foundational part of what we’ve been doing for the past three years is working with university labs, libraries, researchers, and universities to help them manage their scientific data. Scientific articles and their related data are a very specific and yet good use case for a commons approach to the internet.

As companies privatize data they create silos or they put up paywalls, and prevent the growth of the commons — another kind of enclosure. This means that certain people with power close the pathways into the commons so that they can profit from it… but it actually detracts from everyone’s ability to use it and also prevents its ability to flourish. Innovation suffers, as fewer people have access to the knowledge and it is much harder to produce contributions that could improve that research. The rationale given for companies to create paywalls is that it is expensive to collect, store, organize, present, and provide bandwidth for the billions of pages of articles and datasets.

Decentralization is a clear way we can reduce the costs of this hosting and bandwidth — as more people come to download the articles and data from the journal or library or university, the faster it gets. The dream is that universities could turn their currently siloed servers into a common resource that is shared amongst many universities. This would cut costs for everyone, improve download speed, and reduce the likelihood that data is lost.

Decentralization of data produces challenges though — just like a torrent, data that is decentralized can go offline if there aren’t any stable and trusted peers. In the case of scientific data, this is an immense failure. To mitigate it, we invoke the human part of a commons — the data will be commonly managed. For example, we can detect how many copies are available in different places, just as BitTorrent clients do, and compute a health score for a dat — for example, a dat hosted at the Internet Archive, University of Pennsylvania, and UC Berkeley is probably very healthy and has a low probability of ever going offline, while a dat hosted by 5 laptops might go down tomorrow — even though there are more peers. When a dat becomes less healthy, the community can be alerted and make sure the resource does not go down. Decentralized tech and decentralized humans working together to use commons methodology in practice.
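To make the idea concrete, here is a minimal TypeScript sketch of how such a health score might be computed. The peer classification, weights, and threshold are illustrative assumptions, not the Dat project's actual health algorithm:

type Peer = { id: string; kind: "institution" | "laptop" };

function datHealth(peers: Peer[]): number {
  // Assumption: institutional mirrors (archives, universities) have far higher
  // uptime than personal machines, so they carry more weight.
  const weight = (p: Peer) => (p.kind === "institution" ? 1.0 : 0.1);
  const score = peers.reduce((sum, p) => sum + weight(p), 0);
  // Treat three "institution equivalents" as fully healthy (an arbitrary threshold).
  return Math.min(1, score / 3);
}

// Three institutional mirrors: healthy.
console.log(datHealth([
  { id: "internet-archive", kind: "institution" },
  { id: "u-penn", kind: "institution" },
  { id: "uc-berkeley", kind: "institution" },
])); // 1

// Five laptops: more peers, but a much lower score.
console.log(datHealth([1, 2, 3, 4, 5].map((n) => ({ id: "laptop-" + n, kind: "laptop" as const }))));
// ~0.17

A score that drops below some agreed level is what would trigger the community alert described above.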

Along with this, what we get by design is that links can last forever, no matter what server they are hosted on — using a decentralized network based on cryptographic links and integrity checks allows many servers to host the same content without security risks, a property not present in HTTP.
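As a rough illustration of why integrity-checked links remove the need to trust any particular host, here is a simplified TypeScript sketch. Real dat:// links identify a signed, append-only log rather than a single file hash, so this is a simplification for illustration, not the actual Dat link format:

import { createHash } from "crypto";

// The link itself commits to the content, so the fetcher verifies the bytes
// instead of trusting whichever server supplied them.
function contentAddress(bytes: Buffer): string {
  return createHash("sha256").update(bytes).digest("hex");
}

function verify(link: string, bytesFromAnyPeer: Buffer): boolean {
  return contentAddress(bytesFromAnyPeer) === link;
}

const original = Buffer.from("an article and its data");
const link = contentAddress(original);

console.log(verify(link, original));                        // true
console.log(verify(link, Buffer.from("tampered content"))); // false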

This concept of decentralization isn’t new. The internet was built upon the concept of it being very resilient, that if a node failed, it’d find another way to get information to other computers. The internet was originally decentralized, but over time it became clear that centralized parties were needed to fund and maintain websites on the internet. The move towards decentralization is almost a yearning for the past, a way to get around this really centralized section of internet history.

Building the Future

A way we’ve been thinking about building protocols for decentralization is to look at how current popular protocols were developed and mirror those methods. Today’s most popular transfer protocols were developed by people like Tim Berners-Lee (the web, at CERN) and Vint Cerf (TCP/IP, at DARPA), who worked in research labs. They gave away their protocols for free to the public, as products of scientific inquiry. The secret sauce of what they did was to craft open standards that can be used and reused without permission, to prioritize usability, and to keep barriers to access low. Even Google was founded by two folks in a university lab, who published their search algorithm, PageRank.

Today, I look at the decentralized landscape in the context of what these people were doing back in the day and wonder if we’re continuing their legacy. Ideally, new decentralized protocols could be built into browsers that people already use today. Alongside http://, we imagine dat:// for viewing websites or data from a distributed network (which you can now do with the Beaker Browser!).

I look at initial coin offerings (ICOs) and new blockchain companies that claim to be revolutionizing the way we work on the internet, and I’m not seeing this same model. I’m seeing white papers that are published, and sometimes even implemented in open source. But if you look at what they propose, many offer siloed networks that are privatized, with money being invested into specialized coins that create new enclosures. A big component of these ICOs is trust-less networks, which remove the human elements of trust and social groups from the network.

Decentralization then, is not just a technological problem, it is also a human one. Researchers at MIT have been looking into many of these decentralized tools and are reaching similar conclusions — the technical problems are hard but we must solve the social and people problems if we want to progress: “Decentralized web advocates have good intentions, but there is no silver-bullet technical solution for the challenges that lie ahead.”

To top it off, over $1.6 billion was invested in these ICOs in the past year alone. Where are we going? Is the future of decentralization going to be rooted in paywalls and coins, with the management of those coins and that technology trusted to a single individual or group? Is that really where we want to end up?

With a commons approach to the decentralized web, the best path forward is guided by where we came from. I am much more excited about creating protocols that are easy to use, develop with, and extend without asking for permission and without paying or having much money at all. That means that they are driven by the community, built for the public good, and given away for free. If the company or organization dies, the protocols should still be usable. Any blockchains involved should not be tied to a particular for-profit company. I should not be tying my data to any one coin or blockchain for fear of enclosure. The protocols should be optimizing for science (broadly speaking, as in developing knowledge) and mutual collaboration rather than optimizing for profit. Let us not recreate the problem we are trying to solve.

Photo by n.a.t.u.r.e

This Machine Eats Monotheistic Meta Memes

Some scuttlebutt about Scuttlebutt

🦐 —  hey squiddo, I can’t remember if we talked about Scuttlebutt yet. are you familiar? just a good one to have on your radar, v cool people with excellent tech and zero hype and bullshit

🦑 — Hmm interesting, is Scuttlebutt running in production for something yet? It’s like a service to run other things on, no?

🦐 —  secure scuttlebutt (ssb) is a very low level protocol. works like gossip: messages spread between peers. uses the internet if it is available, but doesn’t need it: local wifi, bluetooth (coming soon), or USB sticks are enough.

identities have logs. log = a sequence of messages. they’re cryptographically authenticated so you can guarantee who said what. identities can follow each other. you replicate the logs of your peers. no central server, no off switch, no delete. so if you want to find me, you need to find one of my peers first. creates peer-to-peer archipelagos of friends and data connected by their relationships.
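For readers who like code, the log described above might look roughly like this TypeScript sketch; the field names are illustrative, not the exact SSB message schema:

interface LogMessage {
  author: string;           // public key of the identity that owns this log
  sequence: number;         // position in that identity's append-only log
  previous: string | null;  // hash of the preceding message, null for the first
  content: unknown;         // app-defined payload: post, vote, chess move, ...
  signature: string;        // signs the fields above, so peers can verify authorship
}

// Replication is then roughly: for every identity I follow, ask my peers for
// any messages with a higher sequence number than the latest one I already hold.
function missingRange(myLatest: number, theirLatest: number): [number, number] | null {
  return theirLatest > myLatest ? [myLatest + 1, theirLatest] : null;
}

console.log(missingRange(41, 45)); // [42, 45]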

data can be of any type. apps decide what types of messages they pay attention to. e.g. Patchwork is a social media app, with a few hundred daily active users. other apps: a chess game, distributed github clone, soundcloud clone, blogging client, events, calendar, loomio clone, etc etc etc.

it is exciting because there is a steadily growing community, like great new developers showing up every week or two. and it is the only decentralised tech project I know of that is populated by really gentle, caring, community-building, good politics, critically aware but having fun kinda people

🦑 —Aha very cool, I’ll dig into it more and start following what’s going on. Sounds like a very interesting concept!

🦐 —  its dooope. still bleeding edge in many places, so let me know if you get stuck on the way in

but it is getting to the point now where it is more than just my ultra nerd friends in there having a nice time. e.g. here’s a web view of a newsletter summarising activity in the scuttleverse this past week.

🦑 — So if you were to think about applications to what we’re doing with our festival community, what would they be?

🦐 —  think of all the apps you currently use, but imagine they work offline-first

I think it could be a cool on-site mesh network for the festival, to start with, and then people will be delighted to find they can still stay in touch later, because it uses the internet if it is available

🦑 — How does it work, with regard to timing, when it cannot be ensured that messages are received in order?

🦐 —  that’s right, you can’t guarantee order, there’s a lot of little weirdnesses like that which pop up in a purely subjective universe. messages always reference messages before them, so you can infer order

but yeah sometimes in discussions you will see “oh sorry I didn’t have your message when I wrote my comment”. but actually so far that seems mostly to be a feature, a constant reminder that you are just one subjective agent, there is no official arbiter of truth, everyone has a different experience of the world.

you’d be surprised at how much uptime there is when you have a few peers in a web of tight relationships, there’s nearly always someone online. so you don’t notice it much

you also will see missing messages, like, ‘someone wrote a comment here but they are outside of your network so you can’t see it’

which again, sounds like a bug, but I experience it as a feature. it’s very subtle but you keep getting these reminders that there is no single source of truth.

🦑 — Hmm right, so you need to have done explicit individual authentication with each and every other party?

🦐 —  some of the peers are special, they’re called “pubs”. practically the only special thing about them is they are guaranteed to have much higher uptime than your average peer and they can hand out “invites”. If you redeem an invite, that means you follow them, and they automatically follow you back. they work a bit like servers, but not much

so if you connect to a pub that I’m connected to, you’ll be able to find me

then you’ll see a list of people that I’ve followed, and you can choose if you believe the name and avatar is who you think it is

there’s not an emphasis on real world identity verification, but it could be done. most people use real names but a decent fraction also enjoy pseudonyms

🦑 —Ah right, and if a pub sees your activity, and I’m connected to the pub, I see your activity?

🦐 —  yep, but there are people who follow no pubs, and they have a fine experience too, so long as there are a few friends of friends

🦑 — Gotcha. Yeah, there are definite interesting advantages of this, for sure

🦐 —  you can also extend your range, they call it “hops”. by default hops is set to 2, so when you follow me, you replicate my feed, plus all my friends’ feeds. in Patchwork you can see the “extended network” which will show you everything public from the friends of your friends.
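As a rough sketch of what “hops” means in code (TypeScript, illustrative only, not Patchwork’s actual implementation), replication with hops = 2 is just a two-level walk over the follow graph:

type FollowGraph = Map<string, string[]>;

function feedsToReplicate(graph: FollowGraph, me: string, hops = 2): Set<string> {
  const result = new Set<string>();
  let frontier = [me];
  for (let depth = 0; depth < hops; depth++) {
    const next: string[] = [];
    for (const id of frontier) {
      for (const followed of graph.get(id) ?? []) {
        if (followed !== me && !result.has(followed)) {
          result.add(followed);
          next.push(followed);
        }
      }
    }
    frontier = next;
  }
  return result;
}

const graph: FollowGraph = new Map([
  ["me", ["alice"]],
  ["alice", ["bob", "carol"]],
]);

console.log(feedsToReplicate(graph, "me")); // Set { 'alice', 'bob', 'carol' }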

My tech knowledge is pretty patchy so I might be misrepresenting the details. I’m not the official source of truth. (there isn’t one.)

when you get deep into it, the main advantage i see is that it is agent centric (people, relationships), rather than location centric (documents, websites). so I have built up a web of relationships and content on my identity. When I move from Patchwork (social media) to Ticktack (blogging) to GitSSB (github clone), all my relationships and data come with me.

solves one of the common headaches of running online communities: you define the group once, and bring that definition with you to any app you want to use. seriously reduces onboarding friction

which means you actually have competition for social media interfaces, there’s no walled garden that owns your social graph

so the geeker types don’t use Patchwork, they use Patchbay, which has the same people and content, but a different interface that sacrifices some UX niceties but gets you closer to the code

🦑 — Right, but that also means that you become a carrier for a lot of messages that someone else with the right key could decrypt, ensuring more redundancy and coverage of data

🦐 —  so long as you keep your secret key, you can lose your computer and rebuild all your past data based on the copies your friends are keeping for you

as one of the ‘butts said, your friends are now the data centre.

🦑 — Ah. Yeah. Got it. That’s a huge advantage.

🦐 —  Can I have your permission to publish this conversation?

🦑 — Absolutely! If it’s useful to have my identity attached to the conversation, you have my permission for that too

🦐 —  thanks. i think i will recast you as a sweet emoji friend

🦑 — Yeees! Haha

Moving forward from Netarchical platforms in the post-Weinstein era

Brilliant reflections from Tara Vancil, originally published a few months ago.

Towards a more democratic Web

In the aftermath of the recent Harvey Weinstein revelations, Rose McGowan was suspended from Twitter for breaching its Terms of Service. Twitter made an unusual move by commenting on the status of a specific user’s account, something it publicly declares it normally does not do.

Many people who have suffered harassment on Twitter (largely women) are understandably fed up with Twitter’s practices, and have staged a boycott of Twitter today, October 13, 2017. Presumably the goal is to highlight the flaws in Twitter’s moderation policies, and to push the company to make meaningful changes in those policies, but I’d like to argue that we shouldn’t expect Twitter’s policies to change.

Twitter: a neutral platform or a curated community?

No matter if you’re a conservative, liberal, a woman, an apologist for a serial rapist (fuck you), or a Nazi (fuck you too), chances are good that at some point you’ll:

  1. Say something on Twitter that leads to your account being suspended, and/or
  2. Be frustrated by Twitter’s actions (or inaction) surrounding moderation

Twitter is a public space for conversation and community for millions of people, so for Twitter to suspend an account is akin to banning someone from the public square. That should not be taken lightly.

But we should also not take it lightly when someone is harassed into silence by speech that threatens violence. Threatening speech is no longer just speech – we must consider how that speech impacts other people’s voices.

And here lies the problem. Twitter cannot be both neutral platform and arbiter of good and bad speech. Nor do I want Twitter to be either of those things!

  • If Twitter acts as a neutral platform, then unless Twitter can provide very powerful tools to help users manage their feed and who they engage with, then the platform will be flooded with bots, harassment, racism, libel, and all flavors of filth. A purely neutral platform leads to a terrible experience for users.
  • If Twitter acts as the decider of good/bad content, then we all have to worry about whether or not our opinions align with what Twitter has deemed “appropriate”. Maybe they align right now, but what happens if Twitter gets new executives, or if someday Twitter’s leadership is pressured by powerful forces to silence people with beliefs like mine?

Neither of those situations are ideal, and currently Twitter is dancing somewhere between these two worlds, trying to be a neutral platform while selectively enforcing bans and suspensions.

Twitter’s stalemate

You may not agree with Twitter’s policies, but you can likely observe the forces at play here, and understand why Twitter’s moderation policies have appeared inconsistent, unfair, and sometimes downright wrong.

It’s because Twitter is not driven by doing the right thing. Twitter is motivated to avoid upsetting users to the point that they leave Twitter. Users leaving Twitter is bad for business.

For example, if Twitter suspends alt-right accounts that intentionally toe the line between American pride and white supremacy, then they lose a not-insignificant number of users who’ll cry “free speech haters”. If they don’t suspend those users, they risk losing the users who won’t stand for Twitter being used as a platform for harassment and racism.

It’s not going to get better.

Don’t hold your breath

Twitter’s executives likely think their moderation policies are driven by being fair and judicious, but those policies can’t escape the fact that Twitter’s bottom line depends almost entirely on engagement and ad revenue.

Unless we expect Twitter’s business model to change, then we shouldn’t expect their moderation policies to change. No matter what decisions Twitter makes regarding moderation, some large group of users will feel targeted, and will swiftly exit the platform.

Moreover, what could Twitter do that would be a reasonable solution? I don’t see any way out of this.

So what should we do?

Decentralize. Twitter is responsible for moderating who and what shows up in your feed because Twitter’s servers house the content that composes your feed. A centralized service like Twitter or Facebook has the choice to act as a neutral platform for speech, or set strict content guidelines and then work to uphold those policies. I don’t believe either option is a good choice.

The dream of a decentralized Web

I want to decide what is good content for me. I want help making that decision based on how people I trust have responded to that piece of content. I want to be able to mark another user as a porn bot or a Nazi, and I want people who follow me to be able to see that information, and to decide how to act on it.

And most importantly, I don’t want any single person deciding if another person has the right to speak. Expecting a “media monarch” like Twitter to make good decisions is too fragile, too risky. I want online media to work much more like a democracy, where users are empowered to decide what their experience is like.

Moving forward

A lot of people feel the same way, and several decentralized social media apps have bubbled up out of this mess.

You have many options if you’re ready to give up on Twitter.

MASTODON

Mastodon has been around for a while, but since it operates on a federated network, it’s not quite the flavor of decentralized I think we deserve.

In order to participate, you have to sign up to an instance, whose servers are run by somebody else. If you pick a good instance with a good administrator, you shouldn’t have any trouble, but you still have to depend on a single person to decide what you should or should not be allowed to see on your feed.

Running an instance is also hard and expensive work. It would be great if we could find a way to make social media apps both free and easy to use.

PATCHWORK

Patchwork is a peer-to-peer social media application with a rich community. It’s built on top of Secure Scuttlebutt, and acts as a standalone desktop application. It’s a little rough around the edges in terms of UI and performance, but the community is really great.

BUILD A PEER-TO-PEER SOCIAL MEDIA APP ON BEAKER

I work on Beaker, a peer-to-peer browser, and we’ve built APIs that give developers the ability to publish on the user’s “profile” and “timeline”.

Profiles in Beaker are just datasets that live on the user’s computer, and are transported over a peer-to-peer network. With Beaker’s APIs, applications can ask the user for permission to read/write to a user’s profile.

The best part is that because user data is separate from application code, there’s no one social media app we all have to agree upon. As long as we all structure our data in the same format, we’re each free to use any compatible application.
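As a rough illustration of that idea (the file layout and field names below are assumptions for the sake of example, not Beaker's actual schema or API), any app that agrees on the format can read and write the same user-owned data, sketched here in TypeScript:

interface Profile {
  name: string;
  bio: string;
  follows: string[];   // dat:// URLs of other profiles
}

interface Post {
  text: string;
  createdAt: number;   // unix timestamp in milliseconds
}

// Hypothetical shape of a user-owned archive: anything exposing readFile/writeFile.
interface ArchiveLike {
  readFile(path: string): Promise<string>;
  writeFile(path: string, data: string): Promise<void>;
}

async function readProfile(archive: ArchiveLike): Promise<Profile> {
  return JSON.parse(await archive.readFile("/profile.json")) as Profile;
}

async function publishPost(archive: ArchiveLike, text: string): Promise<void> {
  const post: Post = { text, createdAt: Date.now() };
  await archive.writeFile("/posts/" + post.createdAt + ".json", JSON.stringify(post));
}

Because the data format, not the application, is the shared contract, a different client could implement readProfile and publishPost against the same files and the user would lose nothing by switching.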

I work on Beaker because I think it’s the kind of Web we deserve. Keep your eyes peeled for the upcoming 0.8 release, where we’ll be releasing the Web APIs I mentioned above. Or if you live on the bleeding edge, you can try building the development branch. If you do, be sure to check out beaker://timeline :).

Screenshot of beaker://timeline in the Beaker browser

Photo by Donna McNiel

Coop-source: building decentralised open source with Protozoa

Mix Irving: Protozoa is a tech coop, and we write open source code. This is a little bit about what that means, and how open source is the foundation on which we’re building an aspirational future.

I recently published a new feature for Patchbay — an open source project I maintain. It allows you to easily @-mention people in the scuttleverse (a p2p social network). This isn’t straightforward because it’s a decentralized space where identity is subjective — many people have more than one use-name. (I gave a whole talk on subjectivity here).

I mention Mikey by a name that people in our coding context will be familiar with. Photo by: dinosaur
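To give a feel for why this is tricky, here is a small TypeScript sketch of resolving a typed name to candidate feeds; the names, ids, and data shape are made up for illustration and are not patch-mentions' actual code:

type FeedId = string;

// nameAssignments[assigner] maps a target feed id to the name that assigner uses for it.
type NameAssignments = Map<FeedId, Map<FeedId, string>>;

function mentionCandidates(query: string, assignments: NameAssignments): FeedId[] {
  const matches = new Set<FeedId>();
  for (const [, named] of assignments) {
    for (const [target, name] of named) {
      if (name.toLowerCase().startsWith(query.toLowerCase())) matches.add(target);
    }
  }
  // The UI still has to let the author pick which "mikey" they actually meant.
  return [...matches];
}

const assignments: NameAssignments = new Map([
  ["@me", new Map([["@feedA", "mikey"]])],
  ["@friend", new Map([["@feedB", "mikey"], ["@feedA", "dinosaur"]])],
]);

console.log(mentionCandidates("mik", assignments)); // [ '@feedA', '@feedB' ]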

Why is this significant?

I’m proud of this feature, but the more important parts of this story are the open source and the cooperative culture making this possible.

Open source

Over the next couple of days, I’m going to generalise this code into a module called patch-mentions. Then I’m going to propose changes to Patchbay, and a sister project called Patchwork, so that they use this new module.

That way everyone gets the new functionality, and when we find bugs in what I’ve done, we’ll all benefit from the solution. I’ll also be able to use this in subsequent contracting work — a massive speed and reliability boost for myself and for clients.

I’ve actually already done this with a couple of other modules patch-profile (an easy profile editor) and patch-settings (which manages client preference settings).

If you’re talking to a programmer about what excites them about open source, this is probably the heart of it. Every time someone generously shares their work, we save hours / days / weeks of work, and are able to channel these savings into things that matter — like making more accessible interfaces, or building a feature to flag abusive actors (next on my todo list).

Cooperatives

Protozoa is a worker owner cooperative. While I’m writing this, Piet and Dominic are hard at work on different things — contract work, and forming collaborations with other decentralised projects (we’re currently crushing on the Economic Space Agency a bit).

For us, contributing to the commons is an important part of our work — it makes all our future work better, and helps build working relationships with other excellent humans.

Here’s an example of that:

Tableflip are a UK based tech coop we’ve been working with. We met them through the scuttleverse and this is a calendar invite to a catchup planning more work in the future. By working together, we’re able to work on bigger and more exciting projects with the confidence that we can expand to support each other.

The cal invite here uses another tool we built, called gatherings. It’s a module Piet and I made to enable more community interaction. Other gatherings I’ve seen in the past week include: “Art~Hack Wellington: decommidify your creativity”, “Westhaven Car Boot sale”, and “Bad Ukulele Club”.

It feels amazing to have been able to support more connections between people, and to be doing business in a values-aligned space — I honestly don’t know TableFlip’s email addresses…

If you’re an excellent human with a rad project you’d like to collaborate on, we’d love to hear from you. I’m mix@protozoa.nz, or you can join the scuttleverse from scuttlebutt.nz

 

Photo by Picturepest
