ChatGPT and OpenAI
Tech-Twitter has been loving ChatGPT from OpenAI. There's a ton of content out there now that shows ways you can use ChatGPT for fun or for work. If you haven't already, it's worth going to their website here and giving it a try. One of the best things about this launch is how accessible they've made it. No waitlist. No "only fancy tech people get to try it for now" closed-beta. Just sign up and try it.
For this post, I'm going to skip the part where I make a "This post was written by ChatGPT!" gag and give you the basics of what's happening, a look at the underlying economic structure of OpenAI, and then talk briefly about what it could mean for the future.
To first point out what it actually is: ChatGPT is a language model that has been trained to interact in a conversational way. The dialogue format makes it possible for ChatGPT to answer followup questions, admit its mistakes, challenge incorrect premises, and reject inappropriate requests. ChatGPT is a sibling model to InstructGPT, which was trained to follow an instruction in a prompt and provide a detailed response.
To make that more digestible: ChatGPT is a really good chatbot. You know when you go on a website and there's a little chat bubble in the bottom right corner that you can interact with? It's basically that. Except, instead of only knowing a little bit about whatever website you're on and pretty obviously being a robot, it knows just about everything and it's more articulate than the average human.
The most notable limitation is that it doesn't have access to the internet. Its training data is essentially a giant snapshot of the internet with a cutoff in 2021, so it can't look up current events or tell you when your train is leaving.
What it definitely can do, though, is interpret requests like "this code is not working like i expect — how do i fix it?" and the results are pretty wild. It can ask for extra context. It can point out errors. It can even write the code to fix the error. It could have written the original code that you then asked it to debug...
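To make that concrete, here's a hypothetical version of that kind of exchange. The buggy function below is a made-up example (the classic Python mutable-default-argument trap), along with the sort of fix ChatGPT tends to suggest when you paste something like it in:

```python
# Hypothetical buggy snippet you might paste into ChatGPT.
# The pitfall: a mutable default argument is created once and
# shared across every call to the function.
def add_item_buggy(item, items=[]):
    items.append(item)
    return items

# The kind of fix it typically suggests: default to None and
# build a fresh list inside the function body.
def add_item_fixed(item, items=None):
    if items is None:
        items = []
    items.append(item)
    return items

print(add_item_buggy("a"))  # ['a']
print(add_item_buggy("b"))  # ['a', 'b'] <- surprise, the list persisted
print(add_item_fixed("a"))  # ['a']
print(add_item_fixed("b"))  # ['b'] <- fresh list each call
```

None of this is OpenAI's example code; it's just the flavor of bug-plus-explanation the model hands back.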
It's not perfect, but it's a massive step forward for how software is created and it has onboarded over 1M users in less than a week.
ChatGPT was created by OpenAI. OpenAI is an AI research and deployment company. Their mission is to ensure that artificial general intelligence ("AGI") benefits all of humanity.
This all started back in 2015 with Sam Altman and Elon Musk. You've heard this theme before from Musk, but generally he (and others like Sam) see AGI coming and want to make sure it's not evil. OpenAI was founded alongside a $1B commitment from some of the biggest names in tech to make AI helpful for everyone and not something that spirals out of control into some sci-fi horror movie with robot overlords. It sounds crazy, but that's really what they're doing.
We recently launched Secondaries on Stonks to invest in companies like OpenAI. If you think this is awesome and might want to invest, give it a look here.
What's interesting about OpenAI on the economic side though is how they're trying to balance mission and money. They started as a non-profit, but later converted to a "capped-profit" company. You can read their 2019 post about it here. This means it's not a charity, but it's also not your classic for-profit venture.
As OpenAI grew they started to run into a problem. They needed the brightest minds in the world to work with them and they needed billions of dollars in capital to accomplish their mission. Their pitch was: "Save the world!" Which is pretty solid.
As recent effective altruism storylines have shown though - *cough* SBF is a criminal *cough* - it's hard to count on goodwill forever and it's a mistake to try to ignore that part of people. It's ok to want to do good and do well. OpenAI leadership recognized that in order to get the best people and the money they needed they were going to have to pitch some upside to pair with the save the world bit. That's where the "capped-profit" comes in.
"Capped-profit" means that OpenAI, their employees, and investors can capture some of the value they create, but after a certain threshold the upside rolls back into OpenAI. More specifically, from their post: "Returns for our first round of investors are capped at 100x their investment (commensurate with the risks in front of us), and we expect this multiple to be lower for future rounds as we make further progress."
They also mention that the cap will be "negotiated in advance on a per-limited partner basis" from there. As far as I understand it, they've raised at least two $1B rounds and I'd guess there has been plenty of extra cash in between. The first was the initial commitment in 2015 from Sam Altman, Greg Brockman, Elon Musk, Reid Hoffman, Jessica Livingston, Peter Thiel, Amazon Web Services, Infosys, and YC Research. The second was a $1B deal with Microsoft in 2019.
Earlier this year though, some information got out that marked OpenAI shares at a $20B valuation in a "people familiar with the matter" kind of piece. The same article noted big hitters like Sequoia Capital, Tiger Global Management, Bedrock Capital and Andreessen Horowitz buying in. I'm not "familiar with the matter," but I don't think those firms do a ton of charity investments, so I'd expect the numbers make sense one way or the other.
To wrap things up, I wanted to touch on what this could mean for the future. The AI generated images and blog posts are cool, but the part that really matters is the AI generated code. You may have heard of the no-code movement that champions building software without knowing how to code. You may have heard of tools like Co-Pilot that used AI to help people code. This is all a part of the much longer narrative arc of software becoming easier and easier to make.
Back when Bill Gates got started with computers in 1968, the computer he used didn't have a screen, was painfully slow, and required you to input every little thing exactly right. Now, less than a lifetime later, you can tell a computer to code something, it will do it for you, and it will even fix the errors if you ask it to.
What does the world look like if everyone can code? What happens when the iteration cycle speeds up 10X? 100X? These are exciting questions that point me towards two things: the ability to actually write code will lose value over time, and the importance of curation and distribution will continue to grow.
In a world where anyone can make everything, is it all a commodity? We'll find out in the years to come, but it seems to be happening fast.