The Future of Computing and Why You Should Care

Todd Weaver

Founder and CEO
PGP Fingerprint: B8CA ACEA D949 30F1 23C4 642C 23CF 2E3D 2545 14F7

(transcript follows)

Let me set the tone by using a quote from a great person of history:

“The ultimate tragedy is not the oppression and cruelty by the bad people, but the silence over that by the good people.” ~ Martin Luther King Jr.

Let me start by stating: I believe we can change the future of computing for the better. However, something is currently wrong with our digital world; something basic, something rotten at its core. I want to talk to you about what that is, how it came to be, and why we must change it. And I want you to care… because:

“A person who won’t care has no advantage over one who doesn’t care.” ~ Mark Twain

This talk comes in three parts:

Part 1: History

The history of the mistreatment of our digital rights.

Most Big Tech companies that abuse people are based in the US, so I will describe the history from that perspective. Some things you need to understand: governments write the rules of the game that society plays. There are always rules, and governments influenced by Big Tech are writing those rules. If you are somebody who wants no rules whatsoever, you will quickly realize that rules governing you will be written without your involvement.

My sage advice to you: Write the rules. Let’s write the rules that we want to see in an ethical society that respects freedoms and liberties.

Nearly everybody knows that exploitative Big Tech abuses our digital rights, because that abuse is at the core of their business. It is the root problem. It will not “get better” unless at least one of three things happens:

  1. Government regulation (that is ethical for society)
  2. Business models change (to something ethical for society)
  3. People switch (to something ethical for society)

Big Tech corporations, whose business model exploits humanity for profit, all suffer from a systemic toxin that erodes personal freedoms and strips away the digital rights we as a society demand. Big Tech corporations are already starting the marketing to try to differentiate themselves from it. But marketing alone will not remove the poison within their business model.

Minor disclaimer: you may ask, “But… aren’t you a company?” Actually, we are a Social Purpose Corporation (SPC). That is not just a series of buzzwords; it is a legal framework for a business, and it carries significant weight. It is the reason we can never exploit people for profit, and it is the reason we are unlike all the Big Tech companies that were formed to strip your digital rights in the name of maximizing shareholder value.

There was a recent article in Inc. magazine about us:

“Purism is what is called a ‘Social Purpose Corporation,’ which allows a business to prioritize social objectives over fiduciary duties.” ~ Christine Lagorio-Chafkin, Senior Writer, Inc.

Let me dive deeper into the problem. All corporations, including all the Big Tech giants, have a single goal: Maximize Shareholder Value. That’s it. That’s the only goal. But it is not just a goal. Under eBay v. Newmark, a lawsuit that set legal precedent, the principle amounts to this:

The law makes it literally malfeasance for a corporation not to do everything it legally can to maximize its profits.

So if given the choice between making $1 by exploiting people online and treating people ethically, the corporation must exploit people online for the dollar, or the board of directors and executives could face a lawsuit from any shareholder who claims they did not maximize the value of their shares.

The regulations at the foundation of Big Tech are forcing the exploitation of our digital rights.

Quoting Chancellor William B. Chandler III, who sums up the problem perfectly in his Delaware Court of Chancery opinion from the case in which eBay sued Craigslist for not maximizing the value of its shares:

“Having chosen a for-profit corporate form, the directors are bound by the fiduciary duties to promote the value of the corporation for the benefit of its stockholders.” ~ Chancellor William B. Chandler, III

We have centuries of legal precedent in the physical world, advanced by science and society and guided by our moral compass: trespassing laws, freedom of speech, privacy rights, protection against personal harm and abuse. In the digital world, we have nearly no rights. Big Tech trespasses on your data, restricts speech, and obliterates privacy entirely. Big Tech exploits people, causes harm, and inflicts abuse upon our society.

If somebody approached your bedroom window from outside, put up a camera, and started recording, you would immediately call the authorities and report the numerous laws broken. A case would be opened, arrests could be made, charges could be pressed, trials could ensue, and criminals could go to jail. In the digital world, none of that exists. You are forced to leak far more detail than a camera in your bedroom would capture, and you are forced to leak that personal data from your phone all the time.

Big Tech exploits you every millisecond of every day.

All future government regulation will be influenced, funded, and lobbied for by Big Tech. Could you imagine a future regulation under which Big Tech wins the right to cryptographically sign everything with their keys, under their control, on their products? What a nightmare scenario… Could you imagine your mobile phone under the complete control of Apple or Google?

We need to write the rules based on values we want in society.

AI algorithms from Big Tech have one input variable: $Maximize_Shareholder_Value. That translates directly into:

  1. Gather everything on all of society
  2. Keep people digitally captive
  3. Maximize exposure time
  4. Polarize opinion to elicit more profit

That is not what AI should be taught. Due to data manipulation, no two people in society are getting the same information; it is impossible to have a sane debate about any polarizing topic because we aren’t starting with a foundation of shared knowledge. What if the input request to AI algorithms was “Build an ethical society that respects freedoms and digital rights”? What would society look like then?
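
To make that concrete, here is a toy sketch in Python. It is purely illustrative, not any real company’s code; every class name, field, and weight in it is an assumption. It only shows how swapping the objective changes what a feed-ranking algorithm surfaces.

```python
# Toy illustration only: how the choice of objective reshapes a feed-ranking
# algorithm. All names, fields, and weights here are made up for the example.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    expected_watch_seconds: float  # proxy for "maximize exposure time"
    outrage_score: float           # polarizing content keeps people captive
    informativeness: float         # what a values-based ranker might reward

def rank_for_shareholder_value(posts):
    # Single objective: time-on-platform, with outrage amplifying attention.
    return sorted(posts,
                  key=lambda p: p.expected_watch_seconds * (1.0 + p.outrage_score),
                  reverse=True)

def rank_for_society(posts):
    # Hypothetical alternative objective: reward informative, non-polarizing content.
    return sorted(posts,
                  key=lambda p: p.informativeness - p.outrage_score,
                  reverse=True)

feed = [
    Post("Outrage bait", expected_watch_seconds=120, outrage_score=0.9, informativeness=0.1),
    Post("Useful explainer", expected_watch_seconds=60, outrage_score=0.1, informativeness=0.9),
]
print([p.title for p in rank_for_shareholder_value(feed)])  # ['Outrage bait', 'Useful explainer']
print([p.title for p in rank_for_society(feed)])            # ['Useful explainer', 'Outrage bait']
```

Same machinery, different objective, very different society.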

Maximizing shareholder value in a society that has nearly no digital rights guarantees the exploitation of that society. Why did we let this happen? How did we let this happen? I know why. Because… it is convenient to give up control. It is convenient for you to download a proprietary application that exploits you, agree to the legally binding terms of service you didn’t read, and blissfully believe Big Tech is helping you in the digital world. It is inconvenient to stand up for your freedom.

It seems we are offered a choice: convenience and being controlled, or inconvenience and freedom.

I believe we can have both convenience AND freedom. We can actually build technology that benefits society faster when it is based on principles we deem ethical.

Society’s technological genius is not lacking; its moral genius is. Trust in Big Tech is eroding rapidly. No Big Tech company has core values that uphold our digital rights. The largest challenge we will face is the marketing budgets of Big Tech, when they claim things like:

“We protect your privacy” ~ Big Tech
Actually, you exploit personal private data without a person’s knowledge.
“We use encryption” ~ Big Tech
Actually, it’s inside proprietary apps that you control.
“We are secure” ~ Big Tech
Actually, you hold the master keys controlling society.
“You can trust us” ~ Big Tech
Actually, you won’t let anybody verify anything.

Part 2: The present

Currently, Big Tech is maximizing shareholder value without values. The products, software, and services offered by Big Tech will continue to mistreat people unless we can establish what digital rights are and change society for the better.

Then we advocate, regulate, and build products that adhere to those digital rights.

Mark Twain famously wrote:

“It is curious that physical courage should be so common in the world and moral courage so rare.” ~ Mark Twain

I believe there are five fundamental digital rights:

  1. Right to Change Providers
    If a person wants to change a service provider, they can easily move to another. (Decentralized Services)
  2. Right to Protect Personal Data
    A person owns and controls their own master keys to encrypt all data and communication; nobody else does. (User-controlled Encryption; see the sketch after this list)
  3. Right to Verify
    Society has the freedom to inspect the source of all software used, and can run it as they wish, for any purpose. (Software Freedom)
  4. Right to be Forgotten
    A service provider only stores the minimal personal data necessary to provide the service. Once the data is no longer required, it is deleted. (Minimal Data Retention)
  5. Right to Access
    A person must not be discriminated against nor forced to agree to any terms and conditions before accessing a service. (Personal Liberty)
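
To make the second right concrete, here is a minimal sketch of user-controlled encryption in Python. It assumes the third-party cryptography package (pip install cryptography), and the variable names are illustrative rather than a description of any particular product:

```python
# Minimal sketch of user-controlled encryption (assumes: pip install cryptography).
# The point: the master key is generated and held by the person, never by a provider.
from cryptography.fernet import Fernet

# The user generates and keeps the master key on their own device.
master_key = Fernet.generate_key()
cipher = Fernet(master_key)

# Data is encrypted locally before it is ever handed to any service.
ciphertext = cipher.encrypt(b"my private message")

# Only the holder of the master key can decrypt it.
assert cipher.decrypt(ciphertext) == b"my private message"
```

A provider that only ever receives ciphertext has nothing useful to exploit, which is exactly why the master keys must stay in the person’s hands.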

If we can do those things, we can change the future of computing for the better.

Part 3: The future

As technology gets closer and closer to our brain, the moral issues of digital rights become clearer and clearer.

It started with computers, which we would leave and come back to. Then phones, which we always have on or near us, leaking personal data every millisecond, beyond human comprehension. Then wearables, which track very private details. IoT devices are everywhere (I have to stop and remind everybody: “The S in IoT is for Security” ~ Anonymous). And finally, surgically implanted devices.

A question to consider: which Big Tech company would you purchase your future brain implant from? This is coming.

However, I believe we can change the future of computing for the better. Let’s stand together and invest in, use, and recommend products and services that respect society.

What future will you choose?

Reposted from Purism
