A reminder for new readers. That Was The Week collects the best writing on critical issues in tech, startups, and venture capital. I selected the articles because they are of interest. The selections often include things I entirely disagree with. But they express common opinions, or they provoke me to think. The articles are only snippets. Click on the headline to go to the original. I express my point of view in the editorial and the weekly video below.
This Week’s Video and Podcast:
Content this week from @kteare, @ajkeen, @kwharrison13, @geneteare, @cmshroed, @lexfridman, @finkd, @rex_woodbury, @andreretterath, @azeem, @ttunguz, @sdw, @PeterJ_Walker, @aidanfitzryan, @jacqmelinek, @bheater, @jglasner, @tanayj, @kirstenkorosec, @wesg52, @garymarcus
Contents
Editorial: AI or Bust
Global Venture Funding In Q3 2023 Falls Again Despite Late-Stage Rebound Led By Huge AI Deals
China’s evolving global tech expansion – a lens from the Middle East
Lex Fridman with Mark Zuckerberg on the new AI initiatives
The Mobile Revolution vs. The AI Revolution
How to Not Miss the AI Train: Essentials You Need to Know
How AI Can Fight Inequality | Exponentially with Azeem Azhar
Centaurs & Cyborgs: The Jagged Frontier of AI
9 Open Source LLMs and Agents to Watch
OpenAI Unveils DALL-E 3 & Users Can Generate Images in ChatGPT
iPhone 15 Pro Max Camera Review: Depth and Reach
First Cut — State of Private Markets: Q3 2023
Alameda had a $65B line of credit and ‘unlimited withdrawals’
Meta’s New AI and VR Push Includes Celebrity-Influenced Chatbots and Next-Gen Smart Glasses
Ray-Ban’s new Meta smart glasses will be able to translate text
Secondary Investor Industry Ventures Raises $1.7B+ For Two New Funds
A Secondary Investor’s Take On The Current, Still-Unnamed Downturn
A few takeaways from ARM’s IPO
EV boat startup Arc wades into watersports with $70M in fresh funding
Do language models have an internal world model? – @wesg52
No – Gary Marcus
Editorial: AI or Bust
Another week with many talking points.
The Q3 venture capital data is out, and both Gene Teare at Crunchbase and Peter Walker at Carta have significant insights this week. In an essay of the week, Kyle Harrison steps back and examines the entire venture value chain, as we have discussed a lot in prior weeks. It is a great effort by Kyle and a very good read.
Also, Sam Bankman-Fried is now in front of a jury. Only two days in, the prosecution is mounting a strong effort to undermine his ethical stance, with his former CTO saying he was asked to write code enabling Alameda, SBF’s hedge fund, to take unlimited loans from FTX. It is not yet clear whether this was criminal. I will wait for the defense case before deciding. But these two days have been difficult for SBF.
Meta announced Quest 3, which takes an approach similar to Apple’s forthcoming headset (video passthrough lets you see through the mask into the room). Meta also announced the next version of its smart glasses with Ray-Ban. These look promising, to be honest. I might even pony up for a pair (prescription lenses too).
But for me, the story of the week is still AI. OpenAI announced DALL-E 3 and that it will be included in ChatGPT. ChatGPT is also gaining vision, interpreting images and drawings. I have even seen demos of it writing code from paper sketches to make websites. Audio input and output are also now live in the mobile app.
Rex Woodbury writes an overarching piece comparing the mobile revolution to AI. He concludes that AI will be far more significant than the already mighty impact of mobile.
This is exciting.
It means that the comparison in the title of this piece—the Mobile Revolution vs. the AI Revolution—is something of a misnomer. AI is bigger, a more fundamental shift in technology’s evolution. VR/AR, perhaps underpinned by Apple’s forthcoming Vision Pro, might be a mobile-scale revolution—a massive shift in distribution. That’s probably 5-ish years away. But AI is bigger.
……..
The internet, mobile, and cloud looked like their own distinct revolutions—but rather, they may have been sub-revolutions in the broader Information Age that’s dominated the last 50 years of capitalism. We’re now seeing a brand new sea change—one that only comes around every half-century.
In other words, we’re in for a helluva ride.
This is hard to disagree with. I use ChatGPT at SignalRank. In the past two weeks, I have reduced my code by 90% while improving the quality of the results. I’m probably a B+ SQL programmer. With ChatGPT, I became A++, measured by the code’s productivity, quality, and complexity. Refactoring twelve months of work in two weeks drew on all the learning during that time, but without AI, I could not have done even close to what I have delivered.
Gene’s piece on the Q3 funding environment shines a light on AI’s impact on funding in an otherwise awful environment.
AI companies raised more than $10 billion this past quarter, on par with Q2 2023, Crunchbase data shows.
The largest AI rounds went to OpenAI competitor Anthropic. The company raised $1.3 billion from Amazon and committed to using AWS and Amazon’s in-house chips to train models.
Cloud data company Databricks raised $500 million at a value of $43 billion in a deal led by T. Rowe Price. That was the largest priced round last quarter and marked an up round from Databricks’ $38 billion value in 2021.
The impact of AI on humanity seems to have no real upper limit. This isn’t the place to discuss all of the applications, but they will be many and large.
Tweet of the Week discusses some of the technical debates around LLMs. Gary Marcus is still comparing ChatGPT to a perfect AGI and finding it wanting, which is right. It is wanting. But the more astounding fact is what it is great at. We will all soon have digital AI assistants able to help with pretty much any task. Google filed patents this week on the mechanics of robotics, and I think AI in physical objects that can perform tasks is in our mid-term future too.
But my favorite read this week was Halide founder Sebastiaan de With’s review of the iPhone 15 Pro Max camera system. It is very detailed and lovingly put together. The camera is amazing. Enjoy…
Essays of the Week
The Value Chain of Capital
And The VC “Back-Scratch” Circle
SEP 30, 2023
Andrew Stanton, one of the filmmakers at Pixar known for directing Finding Nemo, WALL-E, and more, gave a TED Talk in 2012 where he talked about storytelling. At the beginning he tells a story about how an old Scottish man came by his colorful nickname. Now, I won’t repeat the story, cause my Mom reads this blog. But give it a listen. And just think about what’s in a name, and the story behind it.
That’s how I felt as I was reading the Wikipedia entry for the term “Ponzi scheme.” I always forget that the term comes from a specific person: Charles Ponzi in 1920. His scheme of paying back his first wave of investors with the cash from his second wave of investors earned him $220M in today’s dollars. It wasn’t the first known Ponzi scheme (that happened in 1869) and it wasn’t the biggest (that probably goes to Bernie Madoff, who caused the loss of $20B in cash and $65B on paper). But Charles Ponzi’s name would forever be attached to the scheme.
So what’s got me thinking about Ponzi schemes? Yet another attempt to understand incentives in the world of venture capital. This time it’s using Instacart’s cap table to unpack the value chain of capital.
What Is The Venture Capital Value Chain?
Last week, Instacart went public at a $9.9B valuation, down from an all-time high of $39B in early 2021. It was one of the first big IPOs since the market corrected, and there was a lot of interesting commentary. But my favorite was this video from Aswath Damodaran, a finance professor at NYU.
In that video, he shared this breakdown of the equity return from each of Instacart’s funding rounds:
The takeaway was that every investment round from the Series C in 2015 to the $39B Series I in 2021 has failed to beat the S&P 500 in terms of returns. And every round since 2018 is basically flat or negative.
The first reaction was, “well no one should have invested after that 2014 Series B.” Most investors are using benchmarks like beating the S&P 500, so in retrospect no one would want to make an investment anywhere it wouldn’t beat the market, right?
But here’s the dependency. All of Instacart’s funding up to and including the Series B? $55.7M
Now, this is just a rough estimate because we don’t have visibility into all the cash inflows and outflows over the business’s life. But as of yesterday, the company had raised $2.9B in total funding, had existing debt of $774M, and had an ending cash balance of ~$1.8B. Roughly, that means the company had to burn ~$1.8B to get to the size of business it is today.
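To see where that figure comes from, here is the back-of-the-envelope arithmetic the paragraph implies, sketched in Python (my own illustration, not the essay’s; all amounts are approximate):

```python
# Rough burn estimate implied by the figures above (amounts approximate).
equity_raised = 2.9e9    # total equity funding raised
debt = 0.774e9           # existing debt
ending_cash = 1.8e9      # cash balance today

cash_consumed = equity_raised + debt - ending_cash
print(f"~${cash_consumed / 1e9:.1f}B consumed")  # ~$1.9B, i.e. the ~$1.8B cited above
```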
Granted, Instacart also hit free cash flow positive for the first time in 2022. So I don’t even want to make the argument of this business being a cash inferno like others I’ve written about (like WeWork or Hopin). Instead, the reality is that you can’t just “not raise” the capital post-Series B. Instacart, at least in its current state, needed at least ~$1.8B to generate that return for those early investors.
The headlines will talk about investors like Sequoia, who invested in Instacart’s $8.5M Series A in 2013 at a $75M valuation. That position at the IPO is worth ~$1B (not accounting for dilution and future purchases). Exceptional by any and every standard! But that’s not the whole story. Sequoia invested $300M in total to build a stake worth ~$1.4B. Still a great 5x return. But for that Series I investment at $39B? Not as great in isolation.
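To make the contrast between the first check and the blended position concrete, here is a rough calculation using the approximate figures above (again my own illustration; it ignores dilution, fees, and the timing of each purchase):

```python
# Approximate figures from the essay; everything here is back-of-the-envelope.
series_a_check = 8.5e6          # Sequoia's 2013 Series A investment
series_a_value_at_ipo = 1.0e9   # rough value of that position at the IPO

total_invested = 300e6          # Sequoia's total capital across all rounds
total_value_at_ipo = 1.4e9      # rough total value of Sequoia's stake

print(f"Series A alone: ~{series_a_value_at_ipo / series_a_check:.0f}x")      # ~118x
print(f"Blended across rounds: ~{total_value_at_ipo / total_invested:.1f}x")  # ~4.7x

# Implied multiple on the capital deployed after the Series A
later_invested = total_invested - series_a_check
later_value = total_value_at_ipo - series_a_value_at_ipo
print(f"Post-Series A capital: ~{later_value / later_invested:.1f}x")         # ~1.4x
```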
Now, again, I’m not criticizing Sequoia’s investment in Instacart. A 5x return on $300M of capital is what dreams are made of. But Sequoia illustrates two things about the value chain of capital that I’ll explore more below:
Capital Dependency: A company’s “terminal value” can look great for early investors, while being dependent on later investors
What’s It Worth: The success of an early investor’s return is determined based on what a later stage investor decides the company is worth
Sequoia has the benefit of having invested along the way, though 85% of their $1.4B return came from that initial Series A investment. But what if you’re Fidelity or T. Rowe investing for the first time at that $39B Series I? You’re completely in the hole.
And that’s not just true for one excessive 2021 round. Remember that up to the Series B (the last round to beat the S&P 500), the company only raised $55.7M. But they would go on to burn at least $1.8B to get to where they are today. That means that the return for Sequoia and a16z in the Series A / B is dependent on an additional $1.7B of unprofitable capital.
Capital Dependency
A lot of people like to use language like “passing the bag” and “bag holders” to describe the “capital value chain” that I’m outlining. But that isn’t necessarily quite right. In crypto, bag holders are often people who are left with nothing, whose investments are the only driver of enrichment for earlier participants in the chain. But that isn’t always the case in venture.
Look back at our Instacart example. The early investors hold a much higher return than later investors, but all of them still own shares in a near-profitable ~$10B business with $1B+ in revenue. That’s not just a bag of hot air, it’s a real business.
……..Much More
Global Venture Funding In Q3 2023 Falls Again Despite Late-Stage Rebound Led By Huge AI Deals
Gené Teare @geneteare, October 5, 2023
Global venture funding in the third quarter of 2023 reached $73 billion — up a bit quarter over quarter and down 15% from the $86 billion invested in Q3 2022, Crunchbase data shows.
Despite the uptick quarter over quarter, this is the second lowest quarter since funding started to slide in 2022. The startup world is now five to six quarters into the current funding decline, Crunchbase data shows.
Seed and early-stage funding continued to decline year over year, a clear signal that the venture markets are not opening up yet. Late-stage funding was up by close to 10% year over year and 30% quarter over quarter, as companies in strategic sectors — including semiconductors, AI, electric vehicles and sustainability — raised large fundings.
Easing of the IPO markets
September marked the first time in 18 months that venture-backed technology companies braved the public markets. Last month, two well-established venture-backed startups went public, a signal that the IPO markets could open up for tech listings in 2024.
Instacart — in the top 10 of the most highly valued private companies based on its 2021 funding value — went public at a steep 75% discount to its last private valuation. And marketing email personalization company Klaviyo, also in the top 100 unicorn companies until it went public, listed at a value of $9.2 billion, not far off from its last 2021 private value of $9.8 billion.
Both stocks have come down from their IPO listing prices, a signal that the public markets are especially price sensitive when it comes to new listings in the current economic environment.
AI funding boost
AI companies raised more than $10 billion this past quarter, on par with Q2 2023, Crunchbase data shows.
The largest AI rounds went to OpenAI competitor Anthropic. The company raised $1.3 billion from Amazon and committed to using AWS and Amazon’s in-house chips to train models.
Cloud data company Databricks raised $500 million at a value of $43 billion in a deal led by T. Rowe Price. That was the largest priced round last quarter and marked an up round from Databricks’ $38 billion value in 2021.
In related news, companies in the semiconductor sector raised $4.5 billion in Q3, with 96% of the investments into companies based in China, Crunchbase data shows.
China’s evolving global tech expansion – a lens from the Middle East
SEP 28, 2023
As a global venture investor – focused mostly on early-stage tech startups – the worlds I’m in are very different, but not utterly distant, from traditional conversations in Washington, DC about China and the world.
Here, in my nation’s capital, the discussions are appropriately very top down. The machinations of big government, financial institutions, NGOs and corporations are assuredly key drivers of the world as it is.
In the world of tech startups and innovation, conversations are invariably bottom up. New generations and their enterprises are setting the agenda for the world that is coming.
This blog, in many respects, has long been dedicated to the idea that among the greatest tech trends of our times (putting aside AI for now, which I’ll come back to) is not the tech itself but the near universal access to it. Billions of people in regions that still rarely make it to the top of the agenda in our top-down institutions have supercomputers in their pockets, have been using generative AI since its release, and are reshaping the very face of business and innovation globally.
Amid this rise, it has not surprised me both that so much is shared among innovators in rising markets and that China, itself a former emerging market, would become a model for them. For all the obvious cultural and geographic differences, they have all navigated challenges not contemplated in the West: particularly hard last-mile logistics, rapidly changing regulatory regimes, and educating millions of consumers who never had a bank account to use fintech, among others. It should come as no surprise that massively successful companies in China are often models for the rest of the world as much as Silicon Valley is.
These changes are part and parcel of that story. If you had asked me for a prediction pre-COVID, I would have told you that it was inevitable that China would enter these global markets aggressively in technology – through investment, acquisition and market expansion. Alibaba, in fact, acquired top players in eCommerce in Southeast Asia, Turkey and Pakistan, and Didi did the same in ride sharing in Latin America.
But, in fact, these, and of course TikTok, have been exceptions that have to date proven me wrong. Very little Chinese venture capital has been invested in tech startups abroad, and little money has gone into regional funds. Much of the operating expansion (WeChat in Latin America, say) has been first and foremost about being usable by the millions of Chinese abroad.
I recently unpacked why this is with some smart Chinese and global investors who have been exploring the Middle East as a case study. As one said to me, “Chinese tech and venture capital are there and not there,” and the best way to unpack that is in five categories:
The first are the large, tech-driven traditional businesses, comfortable for centuries in buying and selling commodities, manufactured goods and financial services. The rules of the road are tried and true, and they have been beating a path to MENA, and especially the Gulf, for years.
The second are the tech- and algorithm-first enterprises like TikTok, which are now everywhere. The essence of their service requires no more than a dozen or two dozen employees on the ground to localize it, and they have been off to the races in MENA as they are around the world.
The third are Chinese tech startups that have come to the region first and foremost hoping to raise funding from sovereign wealth funds, and perhaps for some market expansion, but success stories here are very few to date.
The fourth are Chinese tech venture capital firms that have been testing the waters for some years; MSA Capital, for example, has made some investments in Egypt and Saudi Arabia, but there is no sizable footprint.
And finally, there are the great tech infrastructure enterprises like Huawei and Alibaba Cloud, who are clearly looking for investment and market access in MENA as they are everywhere else.
What is going on? From what I am hearing, there are a few elements:
Video of the Week
AI of the Week
The Mobile Revolution vs. The AI Revolution
Rex Woodbury,
The iPhone came out in June 2007; Uber was founded in March 2009.
I’ve been thinking a lot about past technology revolutions and what they can teach us about our current AI revolution. If mobile offers any lesson, it’s that applications take time to develop. ChatGPT came out 10 months ago; this means we probably still have another six to 12 months before killer apps really start to emerge. And the same pattern will likely hold again when Apple launches its Vision Pro next year.
Here’s a chart of U.S. smartphone ownership after the iPhone came out—I’ve overlaid the foundings of WhatsApp (2009), Uber (2009), Instagram (2010), and Snap (2011).
Those are some of the greatest success stories in startup history: Facebook bought WhatsApp for a then-record $16B; Uber trades at a $95B market cap; Instagram would likely trade around ~$100B as a standalone company.
And where this chart ends—in March 2012—it was still early innings. At that point, smartphone penetration in the U.S. was hovering around 40%.
Here’s a chart that shows continued penetration, alongside the foundings of other major mobile companies: Tinder (2012), Robinhood (2013), TikTok (2015). These apps emerged five, six, and eight years after the iPhone launched, respectively.
The lesson here is that technology revolutions take time. Despite the hype for AI right now, we’re still early: while 58% of American adults have heard of ChatGPT, only 18% have used it. In recent months, ChatGPT monthly active users actually ticked down. I expect we’ll need more vertical-specific, user-friendly LLM applications for the technology to really break through. Many of those applications are being built or dreamt up right now.
This quest for understanding technological revolutions drove me to read Technological Revolutions and Financial Capital, an excellent book by the economist Carlota Perez (thank you to Rick Zullo for the recommendation). Perez wrote her book in 2002, shortly after the dotcom bubble burst. The book proved prescient in forecasting the 2000s and 2010s of technology, and I believe it offers some key insights for where we sit in 2023.
This week’s Digital Native explores how today’s AI revolution compares to past revolutions—from the Industrial Revolution of the 1770s to the Steel Revolution of the 1870s, the Internet Revolution of the late 90s to the Mobile Revolution circa 2010.
The key argument I’ll make is this: the AI Revolution isn’t comparable to the Mobile Revolution, as the latter was more a distribution revolution. Rather, AI is more comparable to the dawn of the internet. Or, more fundamentally, AI is an even larger-scale technology shift—it’s the dawn of a new discrete revolution that’s built not around computers acting like calculators, but computers acting like the human brain.
In short, we’re coming to a close of the “Information Age” that started in 1971, and we’re beginning a new era in technology.
Let’s dive in 👇
How to Not Miss the AI Train: Essentials You Need to Know
DDVC #55: Where venture capital and data intersect. Every week.
OCT 4, 2023
TL;DR
Perfect AI storm due to “magic triangle” of intelligent algorithms, powerful compute, and large-scale availability of data
I share 2 frameworks to assess AI: 1) Onion framework to understand terminology and 2) value chain framework to understand different propositions as well as full-stack/verticalized versus horizontal
AI will reduce the cost of experimentation, more specifically the marginal cost of generation, and thus allow companies to achieve more with less
Reduced entry barriers will increase new startup creation, eventually increasing cash demand more broadly across the startup ecosystem
Increased capital demand together with current cash shortage led to strong buyer’s market but will hopefully bounce back to more balanced dynamic
Companies need to access the potential of AI for 1) their workforce and 2) their own product, a simple framework below
Digitization of VC is still at day 0, providing a huge opportunity for first-movers leveraging data & AI to generate alpha, more details via
https://landscape2023.datadrivenvc.io/
—
Last week, I was invited to give two keynotes at Bits & Pretzels, one of the leading startup conferences in Europe. My Sunday session was all about “The Future of Startups & VC: Augmenting Humans with AI”, a topic that’s close to my heart and the essence of this newsletter. The Monday session, “How to Not Miss the AI Train: Essentials You Need to Know”, was intended to double-click on the previous day and provide a more comprehensive rundown of the recent AI hype, frameworks to properly assess AI opportunities, and a perspective on the impact of AI on our economy, startups, and us as VC investors.
Unfortunately, I needed to cancel the Monday session on short notice due to my very own Earlybird 🐣🤩 surprise. I appreciate that many of you were at the stage when it was canceled (thanks for your kind messages), so I’m happy to share the presentation with you today. Below, you’ll find the most relevant slides including key talking points.
“Magic Triangle of AI”: Algorithms + Compute + Data
It feels as if the perfect AI storm has just begun, yet in reality it has been 70+ years in the making. Only once all three elements of the so-called “Magic Triangle of AI” were in place did AI start to become reality: intelligent algorithms (starting in the 50s), powerful computing (starting in the 90s but waiting for its breakthrough in the 2010s via parallel computing and GPUs), and the broad availability of data (“data is the new oil”).
While the actual epicenter of the AI boom was probably the discovery of attention mechanisms via the “Attention Is All You Need” paper in 2017, I’m convinced that 2022 will be remembered as the year of AI’s breakthrough due to the launch of ChatGPT. ChatGPT and its ability to prompt LLMs via natural language (instead of code) removed significant friction and led to widespread adoption among prosumers.
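For readers who want to see the core idea of that 2017 paper in code, here is a minimal, illustrative NumPy sketch of scaled dot-product attention (a toy single-head version of my own, not the full multi-head Transformer):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Toy single-head attention: each output row is a weighted mix of V's rows."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V

# 3 tokens with 4-dimensional query/key/value vectors
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 4)
```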
How to make sense of AI in a structured way – 2 helpful frameworks
Following the craze of the past year and engaging in a variety of conversations around AI, I couldn’t help but notice a common trend—many people seem to confuse the fundamentals and mix up various concepts regarding AI. It became evident that amidst the excitement and buzz surrounding AI, there is a need for clear and simple frameworks to help demystify this complex field.
As described in this “AI Cheat Sheet” post, I’m a big believer in simplifying frameworks to make sure we speak the same language. My two favorites are below.
The “Onion Framework” for terminology
The “Value Chain Framework” for different propositions
What’s holding us back now?
By now it’s pretty obvious that compute infrastructure is the key bottleneck to scaling AI. Seemingly unlimited demand faces restricted supply. The result? Nvidia, as the leading AI-GPU provider, became one of the most valuable companies after posting year-over-year sales growth of 101%, to $13.5 billion for the three months that ended in July — a new record for the company. Of course, this also attracts geopolitical attention.
The impact of AI on our economy, the startups we invest in, as well as ourselves as VCs
What’s ahead?
It’s very much a crystal ball, but I strongly believe that Amara’s law will hold true for AI as it has for other technological shifts in the past:
“We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.” – Roy Charles Amara
Those are obviously just a few of the many effects I anticipate from AI. I’m particularly excited about the cost of experimentation, though. Let’s look at this in a bit more detail.
The impact of AI on the cost of experimentation
I wrote about “The impact of AI on the cost of starting and running a business” earlier this year, and we have started seeing the first data points for this trend. The cost of experimentation is defined as the resources required to achieve a specific milestone, e.g. product-market fit or $1M ARR, and clearly we can see that startups are beginning to achieve more with less.
How AI Can Fight Inequality | Exponentially with Azeem Azhar
Within five years, open-source AI will have raised the GDP of the poorest countries. That is the premise of my conversation with Emad Mostaque, founder and CEO of Stability AI, one of the companies behind Stable Diffusion.
Earlier this year, Goldman Sachs published a research brief titled “The Potentially Large Effect of Artificial Intelligence on Economic Growth”, in which the authors explore the future of the labour market. The study suggested that global productivity could see an impressive uptick, ultimately boosting global GDP by 7%.
In this conversation, we discuss what would need to be true to make this happen. Open-source AI is one part of the answer.
Centaurs & Cyborgs: The Jagged Frontier of AI
Venture Capitalist at Theory
Last week, Harvard Business School released its analysis of the impact of LLMs on the performance of 758 consultants. Three key insights emerged.
In the first experiment (inside AI’s capability frontier), consultants randomly assigned access to GPT-4 completed 12.2% more tasks on average & finished them 25.1% faster. Quality improved by 40%.
Lower-performing consultants benefit the most from AI augmentation, increasing performance by 43% compared to 17% for higher performers.
This rising-tide effect seems common across AI applications. The benefit to the lower quartiles is dramatic across sales, customer support, & consulting. This has broader implications, something I aim to write about later this week.
In the second experiment, consultants analyze business data to offer strategic recommendations. Here, AI reduces performance – those using AI are 19 percentage points less likely to produce correct solutions.
The “jagged frontier” conceptualizes how AI profoundly increases productivity on some tasks but provides no value or even diminishes performance on seemingly similar tasks.
As with any tool, we need to learn the best applications & techniques for hewing knowledge work effectively.
Last, the paper discusses two types of human/AI collaboration: centaurs & cyborgs. Centaurs ask the AI for high-level help. Cyborgs train the AI to act as a character. The difference remains a bit nebulous to me, but the last section of the paper, in Appendix E, highlights different techniques, some of which I hadn’t explored.
I haven’t yet asked an LLM to impersonate someone, but given the impact on education, I could imagine LLMs becoming sales & customer support trainers for new employees in those roles.
Especially since the impact on lower quartiles (presumably many of which are ramping) is so marked.
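For readers curious what “training the AI to act as a character” can look like in practice, here is a minimal sketch of a persona-style system prompt using the OpenAI Python SDK (my own illustration, not from the paper; the persona, prompts, and model name are assumptions):

```python
# pip install openai  (reads OPENAI_API_KEY from the environment)
from openai import OpenAI

client = OpenAI()

# Cast the model as a specific character: a skeptical prospect for a new sales hire.
messages = [
    {
        "role": "system",
        "content": (
            "You are a skeptical procurement manager at a mid-sized logistics firm. "
            "A junior sales rep is practicing their pitch on you. Push back on vague "
            "claims, ask about pricing and integration, and stay in character."
        ),
    },
    {"role": "user", "content": "Hi! I'd love to show you how our routing software cuts fuel costs."},
]

response = client.chat.completions.create(model="gpt-4", messages=messages)
print(response.choices[0].message.content)
```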
9 Open Source LLMs and Agents to Watch
In the past year, there has been a surge of interest in large language models and LLM agents. As large language models continue their ascent across multiple fields, they will begin to branch off and become more domain-specific to tackle complex problems that general LLMs aren’t well suited for.
So let’s take a look at some interesting and new open-source LLMs and LLM agents that we are following:
Open Interpreter:
Open Interpreter is an open-source project that gives large language models a natural-language interface for running code (Python, JavaScript, shell and more) locally on your own machine, letting them work with local files and information from a variety of sources.
The project is still in its early stages, but it has the potential to revolutionize the way that open source LLMs are used. If successful, it could lead to LLMs being used in a wide range of new applications, from customer service to medical diagnosis.
LLama2.c:
Llama2.c is Andrej Karpathy’s minimalist take on Llama 2 inference: a small, single-file engine written in pure C rather than Python. The C implementation makes it fast and memory-efficient, and it is simple to build and run from the command line, which also makes it a useful learning resource.
Fooocus:
Fooocus is an open-source image generation tool built on Stable Diffusion XL. It is designed so that users can focus on the prompt while the software handles model settings and optimizations automatically, making high-quality image generation accessible without deep technical knowledge.
The Fooocus project is still young, but by hiding the complexity of the underlying models it has the potential to change the way many people use generative image tools.
Code Llama:
Code Llama is an LLM that has been trained to write and generate code in a variety of programming languages, including Python, Java, and C++. It is not a replacement for coders; rather, it can be used to generate code for a variety of tasks, such as creating web applications, developing mobile apps, and writing scripts, freeing up valuable time for developers to focus on more complex projects and planning.
It can also be used for specific purposes, such as implementing a particular algorithm or solving a particular problem. Code Llama is a powerful tool for both experienced and novice programmers.
Llama-gpt:
Llama-gpt is a self-hosted chatbot that offers a ChatGPT-like experience powered by Llama 2, running entirely offline so that no data leaves your machine. It can be used to draft a variety of content, such as blog posts, articles, and stories, which makes it appealing to writers, bloggers, and marketers looking to supercharge their productivity. It is still under development, but it is a promising option for anyone who wants a private alternative to hosted chatbots.
OpenTF:
OpenTF is a community fork of Terraform, created to keep an open-source implementation of the popular infrastructure-as-code tool available after HashiCorp moved Terraform to the Business Source License. The project aims to remain a drop-in replacement for Terraform while giving the community the flexibility to customize it, add features, and support platforms and workflows that the upstream project may not prioritize.
Vall-E-X:
Vall-E-X is an open-source text-to-speech model that can mimic human voices. It is still under development, but it has the potential to change the way we interact with computers. Currently, we interact with computers through a variety of interfaces, including keyboards, mice, and touchscreens, but these interfaces are limited in their ability to convey natural human language.
The project hopes to overcome these limitations by allowing humans to interact with computers using natural language. This would make it much easier for us to give computers instructions and ask them questions. It would also make it possible for us to have more natural conversations with computers. Vall-E-X is still in its early stages of development, but it has the potential to change the way we interact with computers in the future.
AI Town:
AI Town is a project that aims to create a virtual world where LLMs can interact with each other and with humans. For example, we could see how LLMs interact with each other in a social setting, and how they respond to different prompts and questions from humans. This information could help us to better understand how LLMs learn and think, and how they can be used in a variety of applications.
Additionally, AI Town could be used to create new forms of entertainment and education. For example, we could create virtual worlds where LLMs act as tour guides, or where they provide educational content.
Seamless Communication:
Seamless Communication is a project that aims to create a system that can automatically translate between different languages. This would allow people from all over the world to communicate with each other more easily and potentially in real-time.
These are just a few of the many new LLMs and LLM agents that are being developed. LLMs have the potential to revolutionize many different industries, and we are excited to see what the future holds for this technology.
Conclusion
It’s becoming important to keep up with any and all changes associated with open source LLMs. The best place to do this is at ODSC West 2023 this October 30th to November 2nd. With a full track devoted to NLP and LLMs, you’ll enjoy talks, sessions, events, and more that squarely focus on this fast-paced field.
Eye On AI: OpenAI And Anthropic Aim For Big Valuation Spikes, Visa Looks To Join Generative AI Gold Rush
October 4, 2023
It’s been a busy week once again in AI, especially when it comes to investing in the sector and valuing it.
Just about a week after announcing a new investment worth up to $4 billion from Amazon, AI startup Anthropic is apparently looking for more.
The startup, a ChatGPT rival with its AI assistant Claude, is in talks with investors — possibly Google — about raising another $2 billion in funding at a valuation between $20 billion and $30 billion, according to The Information.
That seems like quite the jump in value. It was just in May that Anthropic raised a $450 million Series C led by Spark Capital. Anthropic did not release a valuation at the time, but reports that surfaced before the announcement said it was raising the round at a pre-investment valuation of $4.1 billion.
Anthropic is not alone. Late last week it was reported that its chief rival OpenAI was in talks to sell shares on the secondary market at a staggering $90 billion valuation.
Remember that it was just in early spring when OpenAI picked up about $300 million from the likes of Sequoia Capital and Andreessen Horowitz at a valuation of around $29 billion.
That means in about five months the startup behind the artificial intelligence tools ChatGPT and DALL-E has seen its valuation jump 3x.
That’s especially amazing considering how many startups have slashed their valuations this year as private investors try to value companies at least a little more like public investors do. (You can see how off private investors were on companies like Instacart, once valued at $39 billion as a private company and now with a $7.3 billion market cap following its IPO last month.)
Of course, OpenAI is the poster child for the AI craze right now and seemingly can value itself at what it wants, when it wants. Perhaps the same can be said about Anthropic.
However, others are also attempting the same kind of valuation jump.
Character.AI, which allows people to create their own personalized AI chatbots, is seeking to raise “hundreds of millions of dollars” in a round that could value it at more than $5 billion, Bloomberg reported last week.
It was just in March when the Palo Alto-based startup closed a $150 million Series A at a $1 billion valuation led by Andreessen Horowitz. Four months later, reports started to circulate that Character.AI was already looking to raise more money.
OpenAI Unveils DALL-E 3 & Users Can Generate Images in ChatGPT
This week, OpenAI unveiled DALL-E 3, the latest version of DALL-E, as part of a research preview. According to a blog post from the AI startup, DALL-E 3 “understands significantly more nuance and detail than our previous systems.”
For enterprise and ChatGPT Plus customers, DALL-E 3 will be available next month via the API and Labs. To the delight of users, DALL-E 3 is built natively on ChatGPT. This will allow users of ChatGPT to generate tailored, detailed prompts for DALL-E 3. This could potentially unlock even more creative features and further push the limits of generative AI art.
Of course, if the prompt provided by ChatGPT isn’t up to par, one could always ask the chatbot to make changes and tweaks based on what you’d like to see. What’s interesting, though, is that according to OpenAI, even identical prompts deliver improved results in DALL-E 3 compared with DALL-E 2.
Also, due to potential liability and its commitment to the responsible use of AI, OpenAI has taken steps to limit DALL-E’s ability to generate violent, adult, or hateful content. For example, DALL-E 3 is designed to decline requests that ask for an image in the style of a living artist.
This is likely a result of growing pushback from the art community over the use of human-created works to train AI models. On another note, creators can also opt their images out from training OpenAI’s future image generation models.
Why does this matter? As OpenAI notes, modern text-to-image systems tend to ignore words or descriptions. This forces users to learn prompt engineering. So it seems that OpenAI is attempting to bridge the skills gap between casual users and expert users.
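For developers, here is a minimal sketch of what calling DALL-E 3 through the API could look like once access rolls out, using the OpenAI Python SDK (my own illustration; the prompt and parameters are assumptions, and availability depends on your account):

```python
# pip install openai  (reads OPENAI_API_KEY from the environment)
from openai import OpenAI

client = OpenAI()

# Request a single image from the DALL-E 3 model.
result = client.images.generate(
    model="dall-e-3",
    prompt="A watercolor illustration of a robot sketching a website on paper",
    size="1024x1024",
    n=1,
)

print(result.data[0].url)  # URL of the generated image
```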
News Of the Week
iPhone 15 Pro Max Camera Review: Depth and Reach
Sebastiaan de With reviews the latest telephoto-equipped camera out of Cupertino in our annual iPhone 15 Pro Max camera review.
Sebastiaan de With, Oct 4, 2023
Apple is in the strange position of having to slowly improve a product while also trying to reinvent it.
Some say their success requires making small, evolutionary steps seem revolutionary. I don’t quite agree with that.
As iPhones become better and better over the years, small steps eventually bring tip-over points, when technology starts to enable things that we couldn’t imagine years before. These developments enable not mere steps, but leaps forward: the iPhone X’s all-screen form factor and Face ID; iPhone 7’s Portrait mode; last year’s Dynamic Island and 48 megapixel main camera.
So here’s iPhone 15 Pro Max. This year brings a leap in materials and silicon, but marks an evolutionary photography step. Or does it?
iPhone 15 Pro Max
I have to get this out of the way: I find physical camera design important. It seems superficial, but camera design has been a playground and muse for artists and designers through the history of photography. There’s nothing more magical to design than a box that traps light and converts it to creativity.
Few companies appreciate this, but Apple certainly did this year. The new Pro line departs from last year’s jewel-like appearance. Gone are the reflective, shiny polished stainless steel rails, replaced with a titanium frame whose almost imperceptible brushed finish feels fantastic and grippy. The rounded edges make it comfortable in the hand and contoured to your fingers. It might be an illusion, but even the clearance and contour of the buttons make the entire thing feel more tactile.
The 15 Pro deserves a place next to the inimitable iPhone 4 which Steve Jobs himself described as related to a ‘beautiful old Leica camera’. This iPhone feels like a camera.
I chose the iPhone 15 Pro Max this year, as its optical zoom lens extends to 5× the default camera, or a 120mm-equivalent focal length. Its smaller, non-Max sibling retains its excellent 3× lens.
My previous Large iPhone Experiences — especially the iPhone 12 Pro Max — were enough to make me prefer the smaller screen sizes for every subsequent release. Even my standard-sized iPhone 14 Pro felt borderline too large and heavy.
I went into the 15 Pro Max expecting a little discomfort with a giant slab of glass and metal in my hand, but to my surprise, it felt manageable. That titanium frame and the rounder edges really make a difference. If next year’s standard-size models offer the same telephoto lens, I’m not sure I’ll downsize. It’s that comfortable.
Tactile feedback improves camera usability, and there’s a reason that dedicated cameras still have physical buttons: it’s hard to make tapping glass feel satisfying. This year we gained a button, which we’ll dive into later…. Much Much More →
First Cut — State of Private Markets: Q3 2023
October 4, 2023
Peter Walker
Every quarter, Carta releases information on the startup ecosystem in our State of Private Markets report. It can take a few weeks for rounds to be recorded on our platform, so we produce a full analysis after we get the final numbers.
However, we publish a “first cut” of data as close to the end of the quarter as possible. This initial work focuses on startup valuations and cash raised across venture stages.
Here’s how Q3 is shaping up for U.S. startups:
Valuations see slight increase: Median pre-money valuations ticked upward or remained flat for most venture stages from Q2 to Q3. The change was most positive for Series D and E+ rounds, though valuations at those late stages remain well off their 2021 highs.
Round sizes holding steady: Median round sizes saw little change from Q2 save for some substantial growth at Series A.
Although the final numbers on total rounds and capital raised are not yet available for Q3, it may prove to be a slightly weaker quarter than Q2 for overall venture activity.
We’ll publish our full set of quarterly data in the coming weeks. To receive the full report direct to your inbox, sign up for our Data Minute newsletter.
To see the valuations and round size data below split into primary and bridge round figures, download the addendum below.
Seed & Series A
Series B & Series C
Series D & Series E+
SBF’s ‘Good Faith’ Defense
By Aidan Ryan
Oct. 4, 2023 12:09 PM PDT
Sam Bankman-Fried’s criminal trial kicked off in a Manhattan courtroom on Tuesday with jury selection, followed by opening statements from both the prosecution and the defense earlier today. I went down to check out the action in person on Wednesday, where we got the first in-depth look at how Bankman-Fried and his lawyers will defend the case.
Mark Cohen, one of Bankman-Fried’s lawyers, repeatedly used the phrase “good faith” when referring to the former FTX CEO’s actions during his opening statement. He acknowledged that a number of facts the government mentioned in its opening statement did happen, including that FTX customers wired money to bank accounts controlled by Bankman-Fried’s Alameda Research trading firm and that certain parts of the FTX codebase were unique to Alameda, but argued they weren’t secret or done in bad faith. He also argued that FTX was a startup that overlooked some parts of its business, including not having a risk management team or a chief risk officer, but that Bankman-Fried had no intent to commit any crimes.
Ultimately, Cohen argued, it was a “perfect storm” of drops in crypto prices that hurt FTX and Alameda and caused a run on the bank. Notably, the prosecutor emphasized to jurors that FTX was not a bank. He also said that Bankman-Fried told Caroline Ellison, the former CEO of Alameda, to hedge its exposure but she did not, which he argued led to problems later.
Bankman-Fried faces seven charges in total, including defrauding FTX customers who have billions of dollars in claims in FTX’s Chapter 11 proceedings. He’s also charged with defrauding investors, which I covered last week, and defrauding lenders of Alameda Research, the crypto hedge fund he co-founded in 2017.
The opening statements have already resurfaced the highly leveraged crypto lending world, which benefited firms like Alameda, Three Arrows Capital and Genesis when prices were going up. But after the collapse of a popular stablecoin called terra last year, crypto’s web of lending toppled like a house of cards—ultimately leading to the collapse of FTX and Alameda.
Prosecutors have charged Bankman-Fried with two counts related to Alameda: wire fraud and conspiracy to commit wire fraud on lenders to the crypto hedge fund. The superseding indictment alleges that Bankman-Fried provided Alameda lenders with false and misleading information so they wouldn’t recall or stop issuing new loans and that he conspired with Ellison to defraud the lenders.
The indictment alleged that FTX, at Bankman-Fried’s direction, told customers to deposit funds into bank accounts that were controlled by Alameda, which in turn transferred the assets to other accounts it controlled to use for its own purposes. That allegedly led to a multi-billion dollar hole at FTX. Bankman-Fried and other executives also allowed Alameda to skirt controls on FTX, which gave it privileged access to the crypto exchange and gave it access to billions of dollars in credit, according to prosecutors. This all occurred even as Bankman-Fried assured customers, investors and the public that FTX and Alameda were completely separate.
When Alameda was hit with calls from its lenders to repay its loans in June 2022, after the collapse of terra and the decline in value of other crypto assets, Bankman-Fried allegedly approved sending billions of FTX customer funds to Alameda to cover the trading firm’s losses. Even after Alameda repaid these loans, Bankman-Fried allegedly worked with Ellison to provide false information to lenders that were still owed money by FTX. That included misleading investors on the use of customer funds and personal loans from Alameda to executives that included Bankman-Fried, who was loaned $1 billion, according to the indictment.
Ellison was also charged with wire fraud and conspiracy to commit wire fraud on lenders to Alameda. She pleaded guilty to seven criminal counts in December and is expected to be a key witness against Bankman-Fried in his trial. Ellison’s former co-CEO, Sam Trabucco, left the trading firm a few months before the collapse of FTX and Alameda. He hasn’t been seen or heard from publicly since the collapse, when he tweeted “much love to everyone” and that he hopes “the road ahead is brighter.”
Friends of Binance
There’s still another big court case in crypto: the Securities and Exchange Commission’s lawsuit against Binance and its U.S. offshoot, Binance.US.
Both crypto exchanges, which are both owned by Binance CEO Changpeng Zhao, asked the court to dismiss the case last month. They argued that in suing the exchanges and Zhao, the SEC overstepped its authority and its lawsuit has “no foundation” in current securities laws.
Some big crypto names agreed with Binance and submitted amicus briefs in the case, including the crypto investment firm Paradigm, which is not a backer of either Binance or Binance.US. In its brief, lawyers for Paradigm wrote that the SEC exceeded its authority in its lawsuit against Binance.
Those endorsements are notable because U.S. crypto firms have traditionally been wary about aligning with Binance, which is also the subject of a long-standing Department of Justice anti-money laundering investigation. But it makes sense these crypto companies would be chiming in now—the Binance case could have wide-reaching implications for the industry, particularly in the absence of any federal framework for crypto regulation.
Payments company Circle, which issues the USDC stablecoin, also filed a brief, though it explicitly said its comments weren’t in support of any of the defendants. Instead, lawyers for Circle took aim more broadly at SEC allegations involving Binance’s own stablecoin, writing that the SEC “raises serious legal questions affecting digital currency and the U.S. economy more broadly” and arguing that stablecoins are not a type of security.
Overheard
“If I were a better person, I would have been deeply distressed by all of this. It took about a nanosecond before I thought, ‘Oh my God. This is an incredible story,’” said Michael Lewis, the author of books including “Liar’s Poker,” “Moneyball” and “The Big Short,” in an interview with 60 Minutes promoting his new book about Sam Bankman-Fried.
The book, called “Going Infinite,” chronicles Bankman-Fried’s rise and fall after Lewis met with the former crypto executive more than 100 times. New York Times nonfiction critic Jennifer Szalai wrote in her review that “this isn’t a book of investigative journalism; this is Lewis’s account of being a fly on the wall — a perspective that’s all well and good when your subject isn’t a billionaire savant who is charged with defrauding people who trusted him.”
Alameda had a $65B line of credit and ‘unlimited withdrawals’
FTX co-founder and CTO Gary Wang takes the stand
Jacquelyn Melinek @jacqmelinek / 2:35 PM PDT • October 5, 2023
The Sam Bankman-Fried trial gained steam after a somewhat sleepier first half of the day, during which prosecutors and the defense asked a witness, a former FTX developer, about the technical details of the crypto exchange as well as Alameda Research.
But that changed around 4 p.m. when FTX co-founder and CTO Gary Wang took the stand, wearing a wrinkled suit. Prior to Wang taking the stand, there was a 15-minute break during which Bankman-Fried looked visibly irritated.
Bankman-Fried’s parents were also there. During the break, they went to their son seemingly in an effort to provide support. At one point his father, Joseph Bankman, patted his mother, Barbara Fried, on the back, said something and laughed. She didn’t laugh back but continued to look away toward her son.
On the stand, Wang admitted that he committed wire fraud, securities fraud and commodities fraud. He added that Bankman-Fried, Nishad Singh and Caroline Ellison were the individuals he committed the crimes with.
Wang, Singh and Ellison pleaded guilty in late December 2022 as part of a deal to cooperate with the government and testify during this trial.
Wang said that Alameda Research, the crypto trading firm he said he and Bankman-Fried started prior to launching FTX, was given “special privileges.” Those privileges included large lines of credit, unlimited withdrawals and the ability to run negative balances. Wang said the “unlimited funds” came from FTX customers; a special code was added to customer transactions that funneled the money to Alameda.
He shared during his testimony that he was in charge of writing and reviewing code. And while Bankman-Fried did not write the code, Wang said Bankman-Fried did tell him and other developers what to implement. “Sometimes we talked [disagreements] out, but in the end, it’s Sam’s decision,” Wang said.
Negative balances, unlimited withdrawals
Because of these special privileges, Alameda had a $65 billion line of credit, Wang said. “Normal large businesses have single to double digits [of credit] in the millions.” By the time the two businesses filed for bankruptcy in mid-November 2022, Alameda had withdrawn $8 billion, Wang said.
These internal financial advantages were not disclosed to the public, he shared.
Alameda and FTX were both started by Bankman-Fried and Wang. They split ownership of Alameda 90% and 10%, respectively, while at FTX Bankman-Fried held roughly 65% of the equity and Wang about 17%. Singh also had 5% equity in FTX, and a number of outside investors held other positions, Wang noted.
The ownership percentages never changed, he added. At the time, both Wang and Bankman-Fried were billionaires.
During his time at the companies, Wang also disclosed, Alameda “loaned” him around $200 million to $300 million. But the money never went to his bank account; it instead went into investments that FTX made in other companies.
Naming the business
The founders also picked the company’s name strategically, since other businesses might have negative associations with companies that have crypto jargon in their titles. “Alameda” derived from Alameda County in California, and “Research” was chosen because it “sounds prestigious” and is not a crypto-related name, Wang said.
The initial funds for Alameda came from Bankman-Fried personally as well as various lenders. Wang said Bankman-Fried also believed it would be easier to get bank accounts, rental leases, investors and so on, with a more “normal” company name.
The prosecutors aired a clip of Bankman-Fried from April 2021 on a podcast, where he explained Alameda’s name. “If we named our company Shitcoin Day Trader’s Inc., no one would do business [with us],” he said at the time.
Wang’s testimony is expected to continue on Friday morning until midday, according to prosecutors at the trial.
Meta’s New AI and VR Push Includes Celebrity-Influenced Chatbots and Next-Gen Smart Glasses
Meta is partnering with Microsoft to integrate Bing’s real-time search information across Meta AI, WhatsApp, Messenger and Instagram.
Published Sept. 27, 2023
By Andrew Hutchinson – Content and Social Media Manager
Meta has announced a range of new AI-based features at its 2023 Connect conference, headlined by its AI chat assistant tool, which will essentially integrate a ChatGPT-like bot within all of Meta’s apps.
And what’s more, it’ll be available in different personalities, based on celebrity voices, in order to enhance familiarity and engagement.
There’s a heap to dig into and consider. Here’s a summary of all the key updates announced at Connect day one.
AI Chatbots
First off, as noted, the big announcement of the day, which Meta’s hoping will be its leap into generative AI, is its new AI assistant.
As explained by Meta:
“Meta AI is a new assistant you can interact with like a person, available on WhatsApp, Messenger, Instagram, and coming soon to Ray-Ban Meta smart glasses and Quest 3. It’s powered by a custom model that leverages technology from Llama 2 and our latest large language model (LLM) research. In text-based chats, Meta AI has access to real-time information through our search partnership with Bing and offers a tool for image generation.”
Partnership with Bing? That’s unusual.
In a separate post, Microsoft outlined how Bing data will be shared with Meta AI:
“We’re thrilled to announce that we’ve begun to work with Meta to integrate Bing into Meta AI’s chat experiences enabling more timely and up-to-date answers with access to real-time search information. Bing’s integration extends to Meta AI and a few of Meta’s other AIs available to message with in WhatsApp, Messenger, and Instagram.”
So real-time information to fuel Meta’s AI responses. Wonder what Microsoft’s getting in return?
The new AI chatbot will function much as you would expect, providing the capacity to ask questions and get immediate answers to queries in-stream.
Which is pretty much in line with the latest wave of AI chatbots, though it is also worth noting that Meta tried this exact same thing back in 2015.
This was Meta’s first chat assistant tool, called “M”, which was designed to provide direct answers and prompts in-stream, and usher in the new age of chatbots on Messenger.
Except, it had problems, with the bot sometimes producing questionable replies, meaning it had to be heavily monitored by actual humans in order to keep it in line.
Meta eventually ended the “M” experiment in 2018, due to high maintenance and low usage, with the latter being more of a concern for Meta at this stage, as it tries the same again…. More
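Neither Meta nor Microsoft has published the plumbing behind the Bing integration described above, but the general pattern, an LLM whose prompt is grounded with live search results, is easy to sketch. Everything below is hypothetical: search_bing and generate are placeholder functions standing in for whatever the companies actually use.

```python
# Minimal, hypothetical sketch of "chat assistant grounded by live search".
# search_bing() and generate() are placeholders, not real Meta or Microsoft APIs.

def search_bing(query: str, top_k: int = 3) -> list[str]:
    """Stand-in web search: return the top_k result snippets for a query."""
    return [f"[stub result {i} for: {query}]" for i in range(1, top_k + 1)]

def generate(prompt: str) -> str:
    """Stand-in LLM call (imagine a Llama-2-class model behind an API)."""
    return f"[model answer conditioned on a prompt of {len(prompt)} characters]"

def answer_with_live_search(user_message: str) -> str:
    """Fetch fresh search snippets and fold them into the model's prompt."""
    snippets = search_bing(user_message)
    context = "\n".join(f"- {s}" for s in snippets)
    prompt = (
        "Answer the user using the search results below; "
        "say so if they are insufficient.\n"
        f"Search results:\n{context}\n\n"
        f"User: {user_message}\nAssistant:"
    )
    return generate(prompt)

if __name__ == "__main__":
    print(answer_with_live_search("Who won the game last night?"))
```

If Meta’s production system resembles this at all, it presumably adds ranking, safety filtering and caching on top; the sketch shows only the grounding loop.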
Ray-Ban’s new Meta smart glasses will be able to translate text
Brian Heater@bheater / 11:14 AM PDT•September 27, 2023
Meta just took the wraps off its latest smart glasses at today’s Connect event in Menlo Park. We were able to spend a bit of time with the new wearables and were impressed by what they can do — specifically livestreaming, the biggest missing feature from the last gen.
Turns out Ray-Ban Meta will be getting additional features through a software update next year. The “multimodal” update brings some intriguing real-world features, including the ability to identify landmarks in front of you and translate signs.
It’s not quite the live voice translation Google promised on its still unreleased smart glasses a few years back, but it definitely paints the picture of how these sorts of glasses will become more capable moving forward. Of course, the lack of a heads-up display limits potential functionality, so the devices do most of the work through voice commands.
Secondary Investor Industry Ventures Raises $1.7B+ For Two New Funds
September 27, 2023
Technology investor Industry Ventures announced that it has closed on more than $1.45 billion for its latest secondary fund, which will focus on buying minority stakes in later-stage venture-backed companies.
In addition to the new secondaries fund, Industry Ventures Secondary X, the firm also raised over $260 million for a buyout fund. That fund, Industry Ventures Tech Buyout II, will focus on small software company buyouts and emerging software buyout funds.
The fundraises come at an unusual juncture in the late-stage startup ecosystem. After pouring in record sums during the technology bull run that peaked in late 2021, late-stage investors have pulled back sharply in the past several quarters.
Valuations of many heavily funded private companies, meanwhile, are far below peak. Steep declines in the shares of publicly traded technology companies have played a big role in pushing down private valuations as well.
Liquidity is also tougher to come by. Industry Ventures calls the present moment “a time of burgeoning market demand for investor liquidity.”
The strategy for San Francisco-based Industry Ventures’ secondary fund will rely on buying stakes from existing investors in the most promising VC-backed private companies at compelling prices. To this end, the firm has a lot of experience.
Over a 20-year history, Industry Ventures says it has completed more than 600 secondary investments. This includes over 400 secondary venture capital fund limited partnership interests and 170 direct secondary company share purchases, collectively covering over 5,500 venture-backed companies.
To date, the firm says the largest exits for the secondary strategy have come from investments in Uber, Alibaba, ZipRecruiter, Nubank, Marqeta, Roblox, Trustwave, Twitter (now X) and Upwork.
A Secondary Investor’s Take On The Current, Still-Unnamed Downturn
Joanna Glasner @jglasner – October 3, 2023
Hans Swildens is convinced the tech and startup market downturn of the past few quarters is missing one crucial element: a name.
Twenty-three years ago, we had the Dot-Com Bubble. In 2008-2009, it was The Great Recession. But what do we call the steep reversal in valuations that followed the late-2021 market peak?
“The Unicorn Collapse?” proposes Swildens, whose firm, secondary-focused investor Industry Ventures, was active during both prior downturns. In the current, still unnamed market correction, the firm is busier than ever, closing on a record $1.7 billion-plus across two new funds last week.
Swildens later decides “unicorn collapse” is too negative a catchphrase. While valuations of heavily funded startups have receded sharply, most are still active, cutting burn and pushing for profitability. As an investor looking to acquire stakes in many such companies, the math only works if one sees upside from current levels.
I suggested the “Unicorn Reverse Stampede,” but then concluded that doesn’t work either. Mythical creatures capable of winged flight probably don’t stampede. Later, “The Big Crunch” came to mind, a concept in cosmology that is essentially the reverse of the Big Bang. Or perhaps simply “The Great Reversal.”
The great startup secondary sale
So far, the lack of a definitive name for the current cycle has not been a drag on deal flow. For venture secondary investors, who buy portfolios in venture capital funds or stakes in private companies, there’s virtually unprecedented demand from sellers.
“The venture market is a lot bigger than it was 10 years ago. So when you look at a portfolio from a large LP (limited partner), it’s a lot bigger than it was,” Swildens observed.
The cast of industry players looking to offload stakes is also quite wide. At Industry Ventures, deal flow sources include corporate investors, pension funds, endowments and hedge funds. Some VC funds are also seeking liquidity, as are shareholders in individual private companies.
Valuing private shares, however, isn’t as simple as looking at the last known valuation, or adjusting based on public market comps. For one, venture shares may come with attached rights, such as ratchet clauses and liquidation preferences, that make them more valuable than ordinary shares. Startups pioneering new technologies and business models, meanwhile, may have few relevant comparables for setting valuations.
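To make the point about attached rights concrete, here is a deliberately simplified, hypothetical example of a 1x non-participating liquidation preference in a downside exit. Every number is invented, and real preference stacks (multiple series, participation, seniority) are messier; the sketch only shows why the same ownership percentage can be worth very different amounts depending on the rights attached.

```python
# Illustrative only: how a 1x non-participating liquidation preference can make
# a preferred stake worth more than the same ownership held as common shares.
# All numbers are invented for the example.

def preferred_payout(exit_value, invested, ownership_pct):
    """Holder takes the greater of its 1x preference or its as-converted share."""
    as_preference = min(invested, exit_value)    # preference can't exceed the exit
    as_converted = ownership_pct * exit_value    # value if converted to common
    return max(as_preference, as_converted)

def common_payout(exit_value, ownership_pct):
    return ownership_pct * exit_value

exit_value = 80_000_000   # company sells for $80M, well below its funding peak
invested = 50_000_000     # the preferred investor put in $50M...
ownership = 0.25          # ...for a 25% stake

print(f"Preferred stake (1x pref): ${preferred_payout(exit_value, invested, ownership):,.0f}")
print(f"Same stake as common:      ${common_payout(exit_value, ownership):,.0f}")
# Preferred stake (1x pref): $50,000,000
# Same stake as common:      $20,000,000
```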
A maturing market
For all the complications around valuing private company stakes, one development has made things easier: There are more players in the space.
Marketplaces for buying private company shares, including Nasdaq Private Market, Forge and EquityZen, have grown up over the past decade. For the more heavily traded private company shares, values aren’t always too different from what public markets might confer. Swildens points to the case of Instacart, which traded privately prior to its IPO and in some recent cases was actually valued a bit higher pre-IPO than it is now.
The broad pullback in most tech companies not named Apple, Nvidia or Microsoft over the past couple of years, however, has made its way to late-stage portfolios. Stakes in later-stage startups are generally selling well below their peak 2021 valuations.
Early-stage companies, however, haven’t seen the same impact. For startups that raised their last round at seed, Series A or Series B, Swildens said, it’s common to see valuations remain at the price from their last round.
This time it’s not different
By many measures, the current downturn looks a lot like prior historic pullbacks, Swildens notes. We’re seeing valuation resets, companies adjusting business plans for leaner times, and a much weaker IPO market.
However, this time around there are a few differences. Extreme interest rate hikes in a short time span were a development few had foreseen. Geopolitical anxieties are also impacting decisions around holding stakes in foreign countries, presumably China in particular.
So, just as every downturn is different, every recovery plays out in its own way as well. For now, aside from a few enormous AI deals, the Great Correction isn’t showing signs of turning the corner to the next boom. When it does, secondary investors are angling to be holding the best cards in the house.
A few takeaways from ARM’s IPO
OCT 2, 2023
This week, I wanted to touch on a few interesting takeaways from Arm’s business journey and IPO that took place a few weeks ago. Arm is a very important company in the semiconductor and broader technology ecosystem despite not being as well known as it should be.
I. A winding company journey
Arm began in 1990 as a joint venture between Apple, the now-defunct Acorn Computers and VLSI Technology.
The original JV’s plan was to develop a processor that was high performance, power efficient, easy to program and readily scalable, which remains Arm’s mission today. Intel had considered and rejected the simpler RISC (reduced instruction set computer) architecture at the time; Arm set out to use it, aiming to become a global standard.
“Arm’s first CEO, Robin Saxby, had vast ambitions for the then-startup. ‘We have got to be the global standard,’ he told his colleagues. ‘That’s the only chance we’ve got.’” (From Chip War)
While it couldn’t compete with Intel’s x86 architecture on the PC in those early days, its energy efficiency made it well suited to portable devices such as PDAs and some of the early phones of the era.
The company went public in 1998 and remained a public entity until 2016, when SoftBank purchased it for $32B and took it private.
II. Luck and Market Timing
Probably the most important moment in Arm’s 33-year journey was Intel’s decision not to make chips for the Apple iPhone.
At the time, Intel had recently struck a deal to make chips for Apple’s Mac computers. Shortly after that, Jobs went to Intel CEO Paul Otellini with a new pitch: make chips for the first iPhone. Intel declined, as discussed in the excerpt below from Chip War.
“Would Intel build a chip for Apple’s newest product, a computerized phone? All cell phones used chips to run their operating systems and manage communication with cell phone networks, but Apple wanted its phone to function like a computer. It would need a powerful computer-style processor as a result. ‘They wanted to pay a certain price,’ Otellini told journalist Alexis Madrigal after the fact, ‘and not a nickel more…. I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100× what anyone thought.’ Intel turned down the iPhone contract.”
Apple then looked elsewhere for its phone CPUs and turned to Arm’s architecture, which was in fact optimized for energy-efficient devices such as phones. Apple still uses Arm-based chips today.
It was a case of good market positioning and timing coupled with good luck for Arm.
III. A hidden monopoly
Today, 70% of the world’s population uses Arm-based products, and ~30B Arm-based chips are shipped per year. That’s almost 4 per person in the world!
Arm chips are used in all kinds of devices from PCs to watches to tablets to TVs to XR headsets.
But it’s the smartphone market where Arm really dominates. In the mobile applications processor market (i.e., the primary mobile chip), Arm has a greater than 99% market share!
A consumer can buy an iPhone or a Samsung or LG phone — but they’re all going to end up relying on an Arm-based chip.
IV. An asset-light business model
Arm doesn’t design or manufacture chips. It develops the IP and building blocks, such as the Armv9 instruction set architecture, which others use to design chips that they then manufacture themselves or through a foundry.
Its approach is therefore asset-light but R&D-heavy, continually pushing forward the IP it has developed. It also makes for a flexible business model: Arm can license its IP to a wide range of customers, including fabless semiconductor companies, integrated device manufacturers (IDMs) and system-on-a-chip (SoC) designers.
The origins of this model go back to Saxby, the company’s very first CEO, more than 30 years ago. He believed silicon was a bit of a commodity, like steel, and didn’t want to build chips. Instead, he wanted to sell the architecture to fabless design firms that would customize it for their own purposes and then outsource the manufacturing to a foundry (such as TSMC). This vision of a disaggregated chip industry largely played out, and Arm has stuck to the approach he envisioned.
Arm makes money in two ways:
License revenue, which represents revenue related to licensing, software development tools, design services, training, support and so on. The products Arm licenses essentially enable companies to design and manufacture the chips they need for their use case. Arm has ~230 customers that license its technology, on a variety of limited-term or more flexible arrangements.
Royalty revenue, which represents royalties on every Arm-based chip sold, charged either as a fixed fee per chip or as a portion of the chip’s average selling price (typically 1-2% of ASP). Over 60% of Arm’s revenue comes from royalties.
Arm made ~$2.7B in revenue in each of 2023 and 2022, on roughly ~30B chips shipped per year, implying that in aggregate it makes about $0.09 per chip in licensing plus royalties.
Gross margins are ~96% (Arm is essentially selling licenses and IP, apart from some support), and operating margins are ~25%.
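A quick back-of-the-envelope check of that per-chip figure, using only the rounded numbers quoted above (so treat the output as illustrative, not as anything taken directly from Arm’s filings):

```python
# Back-of-the-envelope check of the figures quoted above (all approximate).

revenue = 2.7e9          # ~$2.7B total revenue
chips_shipped = 30e9     # ~30B Arm-based chips shipped in the year
royalty_share = 0.60     # "over 60%" of revenue comes from royalties

revenue_per_chip = revenue / chips_shipped
royalty_per_chip = revenue * royalty_share / chips_shipped

print(f"Total revenue per chip:   ~${revenue_per_chip:.3f}")   # ~$0.090
print(f"Royalty revenue per chip: ~${royalty_per_chip:.3f}")   # ~$0.054

# Cross-checking against the "1-2% of ASP" royalty rate: a ~$0.054 average
# royalty implies a blended chip ASP somewhere around $2.70-$5.40, consistent
# with a mix dominated by cheap embedded and mobile parts.
```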
V. Operating model congruent to business model
Arm is essentially in the IP business, and the most important driver of that business is continued R&D to generate and improve the IP it then sells.
Given this, R&D is by far the company’s biggest line of spend, at over $1.1B per year…. Much More
Startup of the Week
EV boat startup Arc wades into watersports with $70M in fresh funding
Kirsten Korosec@kirstenkorosec / 12:40 PM PDT•September 27, 2023
Arc is making a splash with investors as it wraps up deliveries of its limited-edition $300,000 electric boat and eyes its next target: watersports. And, specifically, the kind that requires a wake.
The Los Angeles-based electric boat startup, which designed, built and has now delivered a limited edition run of the Arc One, recently raised $70 million in a Series B round from a bevy of returning investors, including Eclipse, Andreessen Horowitz, Lowercarbon Capital and Abstract Ventures. New investor Menlo Ventures — specifically long-time partner and self-proclaimed boating enthusiast Shawn Carolan — also joined in. Arc has raised more than $100 million, to date.
Flush with fresh capital, co-founders Mitch Lee and Ryan Cook are planning to scale up with a new higher-volume electric boat designed for wakeboarding, wakesurfing and other watersports such as tubing.
Lee and Cook, a former SpaceX engineer who is also Arc’s CTO, founded the company in January 2021 with a plan to develop and sell electric watercraft at various price points and for various use cases. They started by focusing on the design and development of a purpose-built hull and purpose-built battery packs, a plan that attracted early investment from Will Smith’s Dreamers VC, Kevin Durant and Rich Kleiman’s Thirty Five Ventures and Sean “Diddy” Combs’ Combs Enterprises. The first boat was the Arc One, a 24-foot aluminum boat that produces 500 horsepower and can run for three to five hours on a single charge. The boat is also equipped with software (wireless updates are possible) and modern touches like a touchscreen. Lee said the company has produced fewer than 20 Arc One boats, the last of which should be delivered this fall.
“The Arc One was a bootstrapping tool,” said Lee, adding that it gave the company a jumpstart on production and intellectual property and helped it build out its brand. “This round of financing is really to get us into mass manufacturing of a wakesports boat that is actually designed to help substantially fund our operations. Our goal as a business is to make better boats and sell them for a profit.”
Arc plans to move into a larger 150,000-square-foot facility in Torrance, California later this year as part of that goal. The startup, which will continue to design and build its boats (and the software) in house, is also hiring. Nearly 30 positions are open at the company.
X of the Week
Muddles about Models
Why it’s a stretch to say that large language models represent “literal world models”
GARY MARCUS, OCT 5, 2023
In a new paper that is lighting X on fire (1.6 million views and counting), called Language Models Represent Space and Time, Wes Gurnee and Max Tegmark argue that “modern LLMs acquire structured knowledge about fundamental dimensions such as space and time, supporting the view that they learn not merely superficial statistics, but literal world models”, based on some analyses of the alleged capacity of LLMs to understand geography.
Although I have little doubt about the results, I sharply disagree with their argument. Of course, as Fei-Fei Li correctly pointed out on X, it all depends on what you mean by a “model”, but here’s the crux, same as it ever was: correlations aren’t causal, semantic models.
Finding that some stuff correlates with space or time doesn’t mean that stuff genuinely represents space or time. No human worth their salt would think there are dozens of cities thousands of miles off the East Coast, throughout the Atlantic Ocean.
Correlating where cities are with what’s said about them is no substitute for a map.
It’s also not even remotely new. Although Gurnee and Tegmark seem impressed with their results (“we attempt to extract an actual map of the world!”), the fact that geography can be partially, if imperfectly, inferred from language corpora is actually already well known. A thread on X points out that there are in fact lots of earlier results that are essentially similar, like this barely discussed one from 2017, using a variety of older techniques:
To which I will add this one, more primitive, from 2009, hinging on an earlier text-driven technique called Latent Semantic Analysis:
You don’t need a large language model to discover that Dallas and Austin are commonly used in sentences about Texas and that Chicago and Milwaukee often appear together in sentences about the Great Lakes. (And it is not even clear that LLMs do this any better than the many other older techniques studied in 2017.)
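Mechanically, the headline analysis in papers like this is a linear probe: regress a place’s latitude and longitude on the model’s internal activations for that place’s name, then see how well held-out places land. The sketch below uses random stand-in “activations” rather than a real LLM, so it illustrates only the shape of the method, not Gurnee and Tegmark’s actual pipeline or results.

```python
# Sketch of a linear probe of the kind used to claim "LLMs represent space":
# regress (latitude, longitude) on per-city activation vectors.
# The activations here are random stand-ins, NOT real model internals.

import numpy as np

rng = np.random.default_rng(0)

n_cities, hidden_dim = 500, 256
coords = np.column_stack([
    rng.uniform(-60, 70, n_cities),     # stand-in latitudes
    rng.uniform(-180, 180, n_cities),   # stand-in longitudes
])

# Pretend "activations": a noisy linear image of the coordinates.
mixing = rng.normal(size=(2, hidden_dim))
activations = coords @ mixing + rng.normal(scale=5.0, size=(n_cities, hidden_dim))

# Fit an ordinary least-squares probe from activations to coordinates.
train, test = slice(0, 400), slice(400, None)
W, *_ = np.linalg.lstsq(activations[train], coords[train], rcond=None)
pred = activations[test] @ W

mean_abs_err = np.abs(pred - coords[test]).mean(axis=0)
print(f"Held-out mean abs error: lat {mean_abs_err[0]:.1f} deg, lon {mean_abs_err[1]:.1f} deg")
```

The limitation Marcus is drawing out is exactly what the sketch cannot show: a probe can recover coordinates reasonably well without telling you anything about whether the model actually uses that information downstream.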
§
I asked Ernie Davis about the new paper, and he wrote back a couple of hours later with a razor-sharp email, pointing out that even if you charitably called the findings that Gurnee and Tegmark documented a “model”, the system as a whole probably doesn’t actually use that (much) downstream.
Figure 1 of the paper places a fair number of North American cities in the middle of the Atlantic Ocean. OK, so if you take one of those cities and ask for the distance to some city that is correctly placed, does it use the positions as computed by this model? If it doesn’t — and I’d bet very long odds that it doesn’t do that — then that seems to me evidence that the model is not being used. Or, if someone wants to claim that it isn’t evidence, then they would have to explain that.
The whole point of models is to be able to use them downstream for other purposes, e.g., in robotics, navigation, computer graphics, and so on. Models that can’t be used downstream are scarcely worthy of the name and are in no position to solve the reliability problems that plague current AI.
Importantly, as Davis put it, just because you can approximate some feature (like location) doesn’t mean you can reliably use it semantically for inference (is Newark off the coast of the US?).
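Davis’s proposed test is easy to state concretely: take a city the probe misplaces, ask the model for its distance to a correctly placed city, and check whether the answer tracks the probe’s coordinates or the real ones. Here is a hedged sketch of the scoring half of that test; the probe output and the model’s answer are made-up numbers standing in for what a real experiment would produce.

```python
# Sketch of Ernie Davis's proposed check, with invented numbers throughout.
# If the model's stated distance matches real geography rather than the probe's
# (mis)placed coordinates, the probed "map" probably isn't being used.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

true_newark = (40.74, -74.17)    # roughly correct coordinates
probe_newark = (38.0, -60.0)     # hypothetical probe output: mid-Atlantic
true_boston = (42.36, -71.06)    # a city the probe places correctly

dist_true = haversine_km(*true_newark, *true_boston)
dist_probe = haversine_km(*probe_newark, *true_boston)
model_answer_km = 320            # invented "model says about 320 km"

print(f"True Newark-Boston distance:  {dist_true:,.0f} km")
print(f"Probe-implied distance:       {dist_probe:,.0f} km")
uses_probe = abs(dist_probe - model_answer_km) < abs(dist_true - model_answer_km)
print("Answer tracks the probe's map" if uses_probe else "Answer tracks real geography")
```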