Trillion dollar Tesla

Hark, all ye who pay attention to the stock market, for Elon Musk’s wheels-focused company crossed the $1 trillion market cap threshold today.

Yeah, it finally happened, so the Equity team quickly scrambled for the microphones. Chris put together the show, allowing Alex and Kirsten to dive into the matter. Kirsten, in case you aren’t familiar with her, is TechCrunch’s transportation editor — her crew handles everything that moves under its own power for the team. She’s tremendous.

Aside from the obvious market cap point, we got into:

  • What news drove Tesla higher today, leading to its new valuation record?
  • How is Tesla’s overall financial performance looking?
  • Has the model mix at the company changed over time?
  • And, because Alex was curious, why Tesla cars all look the same and when the Cybertruck will come to market?

In short, it was a very fun Twitter Space. Cheers, and Equity is back Wednesday with our regular programming.

Equity drops every Monday at 7:00 a.m. PST and Wednesday and Friday at 6:00 a.m. PST, so subscribe to us on Apple Podcasts, Overcast, Spotify and all the casts!

With Real Tone, Pixel 6 aims to improve your portraits, whatever your skin tone

It makes sense that phone manufacturers are paying extra attention to how faces show up in photos, and the new Pixel 6, announced by Google today, introduces a suite of new AI-powered tools to make humans show up better than ever. The two highlights are Face Unblur, which helps reduce blur on moving faces, and Real Tone. The latter is AI-powered post-processing magic, powered by Google’s brand-new Tensor chip, that aims to make faces of all skin tones show up as well as possible.

Whether you’re taking selfies or someone-elsies, the vast majority of photos taken with a smartphone are of human beings. Traditionally, it has been extremely hard to get the exposure to look good when multiple faces appear in a photo, especially if those faces have different skin tones. The new Pixel 6 brings a layer of computational photography to the mix to ensure that everyone who appears in the photo looks as good as they can. The Pixel team worked with a diverse set of expert image-makers and photographers to tune the white balance, exposure and algorithms, which it claims ensures that the photos work for everyone, of every skin tone.

Google highlights that it sees Real Tone as a mission and an improvement on its camera systems, rather than a conclusive solution to the challenges facing photographers. The company has invested substantial resources into ensuring that all people — and particularly people of color — are better represented in the way cameras capture their faces.

“My mother is a dark-skinned Black woman, my father is a white German. My whole life there’s been a question: How do we get one picture where everyone looks good,” said Florian Koenigsberger, Advanced Photography product marketing manager for the Android team, in a briefing interview ahead of the release of the new phones. “The new camera is a step along the journey. Google’s diversity numbers are not a mystery to the world, and we knew we definitely had some shortcomings in terms of lived experience and who could speak authentically to this.”

The camera team worked with photographers, colorists, cinematographers, directors of photography and directors to get a deeper understanding of the challenges in lighting and capturing a diverse set of skin tones — and in particular people with darker skin tones. Among others, the team leaned on the experience of a broad spectrum of professionals, including Insecure’s director of photography Ava Berkofsky, photographer Joshua Kissi, and cinematographer Kira Kelly.

“We focused on bringing this really diverse set of perspectives, not just in terms of ethnicity and skin tones, but also a variety of practices,” said Koenigsberger. “The colorists were actually some of the most interesting people to talk to because they think of this as a science that happens in the process of creating images.”

The Google product team gave these imaging experts cameras and challenged them to shoot in extremely difficult situations: mixed light sources, backlighting, interiors, multiple skin tones in one image and more.

“We had to learn where things fall apart, especially for these communities, and from there we can figure out what direction we can take from there,” Koenigsberger explains. “The imaging professionals were very frank, and they were directly in the room with our engineers. I helped facilitate these conversations, and it was fascinating to see not just the technical learnings, but also the cultural learning that happened in this space. I am talking about ashiness, darker skin tones, textures. The nuances for mid-tones can vary.”

The process starts with the camera’s facial detection algorithms. Once the camera knows it is looking at a face, it can start figuring out how to render the image in a way that works well. In testing across devices, Google’s team found that the Pixel 6 consistently performed better than those from competing manufacturers, and even the older-generation Pixel phones.

It isn’t immediately clear how the feature works in practice: whether it makes global edits (i.e., applies the same adjustment across the entire image) or whether the AI edits individual faces as part of its editing pass. We hope to take a deeper look at this aspect of the camera functionality very soon.

The camera team highlights that the work done in this space made the training sets used to create the camera algorithms 25 times more diverse. The Real Tone feature is a core part of the camera algorithms, and it cannot be turned off or disabled.

Google’s Pixel 6 camera smartens up snapshots with AI tools

Google’s latest flagship phones have an impressive set of automated, AI-powered tools to help make your photos look better, with smart blurs, object removal, and skin tone exposure. While we’ll have to test them out to see if they work as advertised, they could be useful for everyone from pixel peepers to casual snapshot takers.

The new cameras themselves are pretty impressive to start with. The main rear camera, shared by the Pixel 6 and Pixel 6 Pro, is a 50-megapixel beast with decent-sized pixel wells and an f/1.85 equivalent aperture (no, it doesn’t capture as much light as an f/1.8 on a DSLR, but it’s still good). The ultrawide one, also shared, is 12 megapixels and f/2.2 on a smaller sensor, so don’t expect mind-blowing image quality. The 6 Pro gets a 48-megapixel telephoto with less low-light capability but a 4x equivalent zoom. They’re all stabilized and have laser-assisted autofocus.

Basically, if you want the best quality in any situation, stick to the main camera, but if you’re sure about your light, go ahead and fire up the wide or zoom. It sounds like all the new camera features work on all the cameras, but generally speaking, the better the shot to start with, the better the final result.

The simplest tool to use is probably “face deblur.” How many times have you gotten the perfect shot, except it’s not quite sharp? The Pixel camera always captures multiple exposures automatically (it’s part of the ordinary process of taking a picture now), and combines the main shot from one camera with a clear shot of the face captured by another. To use it, you just tap on a shot in your gallery that isn’t quite sharp, and if there’s a “face deblur” option: boom.

Comparison of two images, a blurry one and one where the face is sharpened.

Image Credits: Google

OK, it’s definitely kind of weird to have only the face sharp in a blurry photo, as you can see in the sample, but look: do you want the picture or not? Thought so.

Also in the blur department are two new “motion modes.” One is an “action pan” that assists in capturing a moving subject like a passing car clearly, while blurring the background “creatively.” That means it applies a directed zoom blur instead of the handheld blur it would normally have, so it looks a little ‘shoppy, if you will, but it’s a fun option. The other one is a long exposure helper that adds blur to moving subjects while keeping the background clear. Helpful for doing something like headlight streaks without a tripod. These will be found in their own motion mode area in the camera app.

An image on the beach before using 'magic eraser' and after, with background people removed.

Image Credits: Google

“Magic Eraser” is the most obviously “AI” thing here. If you take a picture and it’s great except someone just walked into the background or there’s a car parked in the scenic vista, it’ll help you zap those pesky real-world objects so you can forget they ever existed. Tap the tool and it’ll automatically highlight things you might want to remove, like distant people, cars, and according to the example they provided, even unsightly logs and other random features. Driftwood, though, on the beach…really? Fortunately you can pick which to throw in the memory hole, no pressure, or circle unrecognized objects and it will do its best to dispose of them.

“Speech Enhancement” isn’t for images, obviously, but when you’re in front camera mode you can opt to have the device tone down the ambient noise and focus on your voice. Basically Krisp by Google. If it works anywhere near as well you will probably want to use it all the time.

“Real Tone” is an interesting but potentially fraught feature that we’ll be looking into in more detail soon. Here’s how Google describes it: “We worked with a diverse set of expert image makers and photographers to tune our AWB [auto white balance], AE [auto exposure], and stray light algorithms to ensure that Google’s camera and imagery products work for everyone, of every skin tone.”

Photo of a family with dark skin sitting on the beach.

They look great, sure… but they’re models.

Basically they wanted to make sure that their “smart” camera’s core features don’t work better or look better on certain skin tones than others. This has happened many, many times before and it’s an insult and embarrassment when billion-dollar companies blow it over and over. Hopefully Real Tone works, but even if it does there is the fundamental question of whether it amounts to lightening or darkening someone’s skin in the photo — a sensitive matter for many people. “This feature cannot be turned off nor disabled,” Google says, so they must be confident. We’ll be testing this and talking with developers and photographers about the feature, so look for a deeper dive into this interesting but complex corner of the field.

It’s not entirely clear how many of these features will be available beyond the Pixel line of phones or when, but we’ll let you know what we find out.

Frozen coffee startup Cometeer raises $35M Series B and launches its product in earnest

Gloucester, Massachusetts-based Cometeer has been around for nine years. In that time, the company has built up a mad scientist’s lair’s worth of coffee scientists, equipment and processes to jolt some fresh life into the industry. Based out of a former frozen seafood facility, the company has created a multimillion-dollar proprietary production line to turn beans into flash-frozen little “pucks,” sealed in capsules to keep their flavor intact. The 10x-strength brew is then ready to use.

Pick beans. Roast them. Grind them. Add water. Drink. Coffee really doesn’t have to be complicated, but every year a dozen new startups come jittering along to try to find new and innovative ways to inject some flavor and caffeine into the drab, meaningless existence of a technology journalist. Most of those startups are safely ignored, because the vast majority of them will be gone by the time you think of writing a “where are they now” round-up at the end of the year. Still, when a fistful of investors pump a total of $100 million into an upstart, you’d best believe that even the most under-caffeinated reporter begrudgingly shoves some toothpicks to prop open their eyelids, and pays attention.

The previous round was $50 million, closing in April of 2020. In the current round of financing, the company harvested $35 million from D1 Capital, Elephant, Tao Capital, Addition Ventures, Avenir, Greycroft Partners and TQ Ventures, along with a number of coffee-expert angel investors. The company declined to disclose the valuation of the funding round.

To brew a cup, you “melt” the puck by dropping it in a cup of hot or cold water, wait a bit, and you’ve got a fresh cup of joe on the go. The only thing you need is some water, and a way of heating that water if it’s hot coffee your little heart desires. The capsules stay fresh for up to three years if you keep them in the freezer, and will survive for about three days in the refrigerator.

Frozen Cometeer Capsule

The Cometeer capsules are flash-frozen in liquid nitrogen to preserve the flavor. In your freezer, they stay fresh for about three months. Image Credits: Cometeer

As for the coffee itself, the magic starts with the beans:

“Our roasting partners are the backbone of Cometeer. Equally as important as superior tasting roasts, considerations amongst our roasters is their support of coffee farmers, and commitment to direct trade purchasing at equitable prices multiple times the fair trade minimum,” explains Cometeer’s co-founder and CEO Matt Roberts. “We are focused on building out a diverse group of roasting partners with unique backgrounds, sourcing techniques and roasting styles. Alongside these partners, we look to support the de-commoditization of the coffee industry.”

Cometeer has seen extreme growth over the past couple of years, growing from 12 to 120 employees since its previous round of funding. For now, the company is focusing on its direct-to-consumer play.

“While we are focused on direct to consumer right now, we are trialing on-premise with George Howell’s café in Boston and are piloting B2B coffee solutions with focus on corporate gifting,” explains Roberts.

The company shut down its waitlist today, making the coffee available to anyone who has a credit card and a hankering for a new frontier in Java technology. The capsules are two-buck pucks, with a price tag of around $2.00 each: the base shipment is 32 capsules for $64.

Watch Google unveil the new Pixel live right here

Google is set to announce new Pixel phones today. The company is holding an event at 10 AM PT (1 PM in New York, 6 PM in London, 7 PM in Paris). And you’ll be able to watch the event right here as the company is streaming it live.

Google already said that it plans to unveil its own Tensor chip for the Pixel 6 and Pixel 6 Pro. The company has also shared a ton of details about the new phones.

The Pixel 6 will have a matte aluminum finish and a 6.4-inch display. The Pixel 6 Pro will have a bigger, 6.7-inch display and a polished aluminum finish.

As for cameras, the regular Pixel 6 will have two camera sensors while the 6 Pro will feature three different camera sensors. And if you’ve seen a photo of those devices already, you already know that they feature a camera bump like you’ve never seen before.

But specifications only tell you one part of the story. It’s going to be important to listen to what Google has to say about chipset performance and camera features. We’ll discover all that during Google’s event.