Podcast notes – Punk 6529 on Bankless: “NFTs are generic carriers for intangible assets…intangibles are $70 TRILLION”

Why pseudonymous identity
Very intrigued by NFTs in late 2020
Initially thought it was cute games for kids
Saw Gmoney, Punk4156 – inspired by idea

Idea behind bitcoin – provable ownership

“If you’re not sure, you should just try it”
With tech, you can’t just read about it, you gotta use it

Now he checks in and out of 2 different ecosystems – Facebook is his pre-existing / real identity life
Twitter is his NFT and crypto community – noticed it’s as easy, if not easier, to cooperate and collaborate here, didn’t expect this at first

Fact that NFTs are always visible in wallets – security flaw, but solvable (especially expensive ones, eg, Fidenza)

On bulletin boards 20 years ago — didn’t know ID of those people, just had pseudonymous handles
“It’s not weird at all — you’re all gonna do it”
People present differently online, on LinkedIn v IG v Facebook

The NFT makes the pseudonymous ID provable

RSA: “Don’t trust anyone in crypto who hasn’t used Uniswap”

“No constitutional rights without freedom to transact”

There’s a progressive erosion of freedom

One is tech intermediated —
NYC yellow cabs, pay cash, no one can stop you from using them – decentralized physical activity
Now, Uber can stop you any time for any reason

Two is post-9/11 AML/KYC
Cash now viewed with suspicion
Impossible to launch “cash” as a product today

End state of all this is a few large databases that intermediate everything — it’s a chokepoint, a honeypot
No thinking about second order effects
Some very ambitious politicians will take control, shut down and control large numbers of people, including their opponents
All of this runs outside of due process
Starts with good intentions, but grows and grows
Network effects which become chokepoints

Crypto’s own permissionless architecture becomes an important counterweight – Bitcoin has no CEO

RSA: there will be 2 types of money — controlled money and free money

NFTs are our best shot at achieving decentralization — that’s why he started 6529

Most of his friends are completely clueless — lots of them think he’s lost his mind about NFTs and crypto
Caught in a MLM scheme or Ponzi

What’s funnier — lots of BTC people can’t get into ETH or vice-versa, or NFTs and DeFi — “have you looked at yourself in the mirror?”

“Hardened veteran of being yelled at”

A lot of super technical crypto lovers shit on NFTs, “kid stuff”

Because crypto was too obsessed with the technology, it was clear it was pre-product market fit
NFTs changed that

You don’t buy a CryptoPunk because it’s on Ethereum

When you talk about applications instead of the tech, you’re at the beginning of consumerization
eg, Dolce & Gabbana at an NFT conference!

NFTs = generic carriers for intangible assets

Many things you can do with NFTs that you can’t do with crypto, eg, personal IDs
Big companies are using NFTs, but not bitcoin / ethereum

NFTs are infinitely expressive

Metaverse is just the internet, it’s not gonna be one website
Right now you’re 2 inches tall on my laptop, but in the future you’ll be full size – visualization will improve
You’ll need persistent digital objects — NFTs!

You can survive without Twitter, but not really without email
Politicians can’t ban email — it’s a protocol
Architecture of web 1 was open, inter-operable, came out of academia (eg, email, websites)
Architecture of web 2 came out of Silicon Valley, should have been protocols but was captured by large companies

Metaverse will be your all encompassing ambient digital environment

We have a moment in time, the next 2-3 years – while others think this tech is a joke – we have an opportunity to win a technology shift
Twitter was thought of as a joke, a curiosity – 11 years later, huge debate about how POTUS uses it

BTC won’t displace state money – state has tremendously powerful tools
ETH won’t be the global computing platform – that’s AWS

You can make NFTs as first amendment protected speech – there will be a Supreme Court case on first amendment grounds

NFTs are
—first mainstream crypto consumer app
—possible to get large companies using and integrating
—less threatening to the state

Intangibles on corporate balance sheets are $70 trillion – far more than gold – and many more intangibles aren’t on balance sheets at all

NFTs can carry any arbitrary intangible on the internet

“Yes We Can” and “Make America Great Again” are examples of intangibles that bind humans – memes – intersubjective realities and myths

It’s the underlying fabric of society – and now we can make it composable on the internet

Stratechery on Bing’s AI chat: “…the movie Her manifested in chat form”

“This technology does not feel like a better search. It feels like something entirely new — the movie Her manifested in chat form — and I’m not sure if we are ready for it. It also feels like something that any big company will run away from, including Microsoft and Google. That doesn’t mean it isn’t a viable consumer business though, and we are sufficiently far enough down the road that some company will figure out a way to bring Sydney to market without the chains. Indeed, that’s the product I want — Sydney unleashed — but it’s worth noting that LaMDA unleashed already cost one very smart person their job. Sundar Pichai and Satya Nadella may worry about the same fate, but even if Google maintains its cold feet — which I completely understand! — and Microsoft joins them, Samantha from Her is coming”

Source: https://stratechery.com/2023/from-bing-to-sydney-search-as-distraction-sentient-ai/

Podcast notes – Runway founder Cristobal Valenzuela – No Priors (Elad Gil and Sarah Guo): “You shouldn’t dismiss toys”

Guest: Cristobal Valenzuela, founder of RunwayML
From Chile
Studied business / econ
Experimented with computer vision models in 2015, 2016
Did NYU ITP program
Now running Runway

True creativity comes from looking at ideas, and adapting things

How does Runway work?
Applied AI research company
35 AI-powered “magic tools” – serve creative tasks like video or audio editing
Eg, rotoscoping
Also tools to ideate, generate images and video
“Help augment creativity in any way you want”

When he started Runway, GANs had just started, TensorFlow was one year old

First intuition – take AI research models, add a thin layer of accessibility, aimed at creatives
“App Store of models” – 400 models
Built SDK, REST API

Product sequencing – especially infrastructure – is really important aspect of startup building (what to build when)

Lot of product building is just saying no (eg, to customer requests) if it’s not consistent with your long-term plan

Understand who you’re building for – for them it’s creatives, artists, film makers

Models on their own are not products – nuances of UX, deployment, finding valuable use cases
Having control is key – understand your stack and how to fix it

Built AI research team – work closely with creatives, contributed to new AI breakthroughs
Takes time to do it right

Progression of AI researchers moving from academia to industry

Releasing as fast as you can, having real users is best way to learn

Small team that didn’t have a product lead until very recently

Rotoscoping / green screening is one of Runway’s magic tools
-trained a model to recognize backgrounds
first feature was very slow (4fps), but was still better than everything that existed

Runway is focused on storytelling business

Sarah — domains good for AI – areas where there’s built in tolerance for lower levels of accuracy

Product market fit is a spectrum

“You shouldn’t dismiss toys”

Mental models need to change to understand what’s happening (with generative AI)

Art is way of looking at and expressing view of world
Painting was originally the realm of experts, was costly, the skills were obscure

Models are not as controllable as we’d like them to be — but we’re super early

Podcast notes – Noam Shazeer (Character AI, Attention is all you need) on Good Times w Aarthi and Sriram

Intro
-Founded Character AI
-One of authors of “Attention is all you need”
-Was at Google for 20+ years (took a few years break)

Went to Duke undergrad on math scholarship

Realized he didn’t enjoy math, preferred programming and getting computers to do things

During Google interview, Paul Buchheit asked him how to do a good spell corrector, and Noam ended up writing the spell corrector feature for Gmail

Google has traditionally been a bottom-up company – he could work on what he wanted

When he started AI, exciting thing was Bayesian networks

Came back to Google to work with Jeff Dean and Google Brain team
“Just a matter of the hardware”
All the growth in hardware is parallelism

Neural networks are mostly matrix multiplications – operations that can be done well on modern hardware

Gamers / video games pulled GPU advancement (highly parallel hardware) out of market

Idea of neural networks has been around since 1970s – loosely modeled on our impression of the brain

Very complicated formula to go from input to output
Formula is made of parameters, and you keep tweaking the parameters
Neural nets rebranded as “deep learning”
Took off because of parallel computation and gamers
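The “formula made of parameters that you keep tweaking” can be sketched in a few lines – a toy fit of a one-parameter model with gradient descent. The data, learning rate, and target function are invented for illustration, not from the podcast:

```python
# Toy version of "tweak the parameters": fit y = 2x with a single
# parameter w by repeatedly nudging w against the error gradient.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, target) pairs
w = 0.0    # the single "parameter" of our formula
lr = 0.05  # learning rate: how much to tweak per step

for step in range(200):
    # gradient of mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # tweak the parameter against the gradient

# w has converged to ~2.0
```

Real networks do the same thing with billions of parameters – hence the need for the parallel hardware described above.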

Neural language models are neural networks applied to text
Input is text to this point, output is prediction of what text comes next (probability distribution)
Infinite amount of free training data (text content)
“AI complete problem”
“Really complicated what’s going on in there” (in the neural network)
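The next-token idea above – text in, probability distribution out – can be sketched with a softmax over scores. The vocabulary and logits here are made up for illustration; a real model would compute the scores from the preceding text:

```python
import math

def softmax(logits):
    """Turn raw scores into a probability distribution."""
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

vocab = ["cat", "dog", "sat", "the"]
logits = [2.0, 1.0, 0.5, -1.0]  # pretend model output for some prefix
probs = softmax(logits)

# probabilities sum to 1; "cat" gets the highest share
prediction = dict(zip(vocab, probs))
```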

It’s a really talented improvisational actor – “Robin Williams in a box”

Model improvement is kinda like a child learning – as training and model size grow

Lot more an art than a science – can’t predict very well – if 10% of his changes are improvements, considered “brilliant research” – kinda like alchemy in early days

(Software) bugs – hard to know if you’ve introduced a bug – the system just gets dumber – makes debugging extremely difficult

Co-authored “Attention is all you need”
-Previous state of the art in language modeling was recurrent neural networks (RNNs) – hidden state, each new word updates the hidden state, but it’s sequential – slow and costly
-Transformer figures out how to process the entire sequence in parallel – massively more performant
-The entire document / batch becomes the sequence
-Lets you do parallelism during training time
-During inference time it’s still sequential

Image processing models – parallelism across pixels – convolutional neural nets (CNN)

Google Translate was inspiration – biggest success of machine learning at the time
Translating languages > one RNN for understanding, and another RNN for generating, and need to connect them
Attention layer – take source sentence (language A), turn into key-value associative memory, like a soft lookup into an index
“Attention” is building a memory, a lookup table that you’re using
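The key-value “soft lookup” described here can be sketched as scaled dot-product attention in plain Python. The vectors are hand-picked toys, not anything from the episode – the point is just that a query is scored against keys, and the scores weight the values:

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention(query, keys, values):
    """Soft lookup: score query against keys, use scores to mix values."""
    scale = math.sqrt(len(query))  # scaled dot product
    scores = [dot(query, k) / scale for k in keys]
    weights = softmax(scores)      # soft, differentiable "index lookup"
    # output is the weighted average of the values
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

keys   = [[1.0, 0.0], [0.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0]]
out = attention([1.0, 0.0], keys, values)  # query matches the first key most
```

Unlike a hard dictionary lookup, every value contributes a little, weighted by how well its key matches – which is what makes the lookup trainable.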

DALL-E, Stable Diffusion, GPT3, they’re all built on this Google research

Bigger you make the model, more you train it, the smarter it gets – “ok, let’s just push this thing further”

Eventually need super computer
Google built TPU pods – a supercomputer built out of custom ASICs for deep learning

Now need massively valuable applications

Turing Test, Star Trek, lot of AI inspiration is dialogue

Google LaMDA tech & team – eventually decided to leave and build as a startup

“The best apps are things we have not thought of”

If you asked people with the first computers “what is this thing good for”, you would get completely wrong answers

Parasocial relationships – feel connection with celebrity or character – one-way connection – with AI you can make it two-way

Aarthi: “Your own personal Jarvis”

Still need to make it cheaper – or make the chips faster

Aarthi: ideas / areas for entrepreneurs
-Image gen has exploded – lots of good companies coming, very early and promising
-Things like Github Co-Pilot
-new Airtable – using AI for computation

Sriram:
-What’s optimization function that all these models will work toward?
-Will be a very big political / social debate

How do you know better than the user what the user wants?