Wow, People Didn’t Buy Live Mastering Classes.

My annual Black Friday sale over at BrentOzar.com just finished. Every year, I try different experiments with it – changing bundles, adding lifetime memberships, adding buy-now-pay-later, etc. One of this year’s experiments was to add live classes back into the mix.

When I launched them in October, I explained that the time had come to update the material for SQL Server 2022 and Azure SQL DB, and as long as I had to record them myself to update the material, I might as well teach the classes live.

I didn’t offer a Live Class Season Pass as I have in the past. That option was wildly popular because people could just pay once and attend any class that their schedule permitted. However, I didn’t wanna offer that this time around because I was going to teach very few live classes, and I was only going to teach them in one time zone (US-friendly). Instead, I just let people buy tickets for a specific class on a specific date.

Driving the pink-wrapped Speedster in Vegas

And… only 3 people did.

One person bought a single ticket for a single date of Mastering Index Tuning, another bought one for Mastering Query Tuning, and a third bought one for Mastering Server Tuning. That’s it.

You might say, “Well, Brent, maybe the stuff you teach is no longer relevant, or people have heard enough of you.” But here’s the wild part: the recorded class sales were fine. It was only the live classes that simply didn’t sell.

I did sell a handful of seats to the cheaper Fundamentals Week class, so that one will stay on the schedule, and I’ll continue to sell tickets for that.

But my schedule just completely freed up for 2024, hahaha, since I don’t have to be home on specific dates to teach the Mastering classes. I love the bejeezus out of that. I’ll start recording the updated Mastering classes next week, doing that on my own schedule, and updating the recordings in folks’ accounts. That gives me more time to work on new stuff.

And to drive around. By the way, we wrapped Sabine in metallic pink for the Barbie movie this summer. It turned out really well.

Bought a Defender 130 Outbound.

I loved my ’83 Cadillac, but I quickly ran into problems with it. I wanted to have a comfy car for 3-4 hour road trips to nearby places like LA and Palm Springs. I pretty quickly figured out that’s a bad idea in a 40-year-old Caddy, especially one with the HT4100 engine that doesn’t fare well in the desert. If I somehow had a giant warehouse garage, I’d have kept it, but I couldn’t justify keeping a car that I could only use in the fall/winter, and only around town.

When thinking about a road-trip replacement, I kept coming back to my Icelandic Land Rover Defender. I looooved it: comfortable, spacious, reliable, capable, and it looked fantastic. We used the heck out of it. Sadly, the US doesn’t allow you to import cars from other countries until they’re 25 years old, so we had to sell it in Iceland when we moved back to the US.

Land Rover just unveiled an even better version, the 130 Outbound, specifically focused on overlanding and road trips. I bought one of the first ones available in the US:

It’s big, but it doesn’t try to be a 3-row 8-person school bus. It only has 2 rows of seats, and the 2nd row folds down nicely. The result is a huuuuuge fairly-flat load floor. On our first road trip with it, we were able to bring home a ton of bags from the Cabazon outlet stores, and yet still had enough room for Yives to lay down and take a nap on the foam mattress we keep in the back.

Amping up the funkiness, the Outbound’s rear sides have body-colored panels rather than glass windows. Because I bought a black one, the black panels aren’t obvious yet – they look like windows, but they’re not. Those panels will be more obvious once I wrap it. I’m still deciding between satin black like the launch press photos, or a subtle camouflage pattern done with a combination of matte and gloss black.

I leaned into the “James Bond villain goes offroading” look with 4-spoke Mondial 20″ wheels from Chelsea Truck Company. The satin black finish makes ’em look like steel wheels, but they’re actually light forged alloys.

I won’t be doing any electronics work on this one: it’s just gonna be a workhorse. It’s got wireless CarPlay, heated & cooled seats, the advanced off-road options, and a tow hitch. Like our Icelandic Defender, this one will lead an adventurous life.

Bought a 1983 Cadillac Fleetwood Brougham.

I love huge American land yachts, and I think this car is a wonderful time capsule from the 1980s:

That’s the epitome of 1980s American luxury: metallic light brown, long lines of chrome, padded vinyl roof, whitewall tires. It’s a weird combination of very clean, simple lines – but wildly over-jeweled. It’s like a movie starlet wearing a simple dress, but … with jewelry on her wrists, fingers, ankles, ears, and neck.

When you open the doors, the kitsch luxury continues with acres of thickly padded brown leather:

It’s just a nice, chill, quiet, 1970s place to be. It’s like your favorite dark old steak restaurant. You know it’s nowhere near as good as the new place, but… you’ve got memories together.

This ain’t a hot rod. The Cadillac HT4100 engine was an infamously bad design that produced less than 150 horsepower. That’s not nearly enough to motivate this two-ton monster that’s actually longer than a modern Cadillac Escalade.

These 80s luxury cars are surprisingly affordable – you can pick ’em up for $10k-$15k, even in good condition like this. It’s only got 15,000 miles on it! These are never going to be rare or investor-quality though because so many of ’em were made, and so many were sold to retirees who took good care of ’em.

They’re just nice, fun, chill time capsules, and I look forward to taking it to dinner on the Strip.

How the Company-Startup Thing Worked Out For Me, Year 11: Massive Changes

Every year, I post a retrospective about what it’s been like to start up a company. If you want to catch up, check out past posts in the Brent Ozar Unlimited tag. This post covers year 11 of the company: May 2021 to April 2022.

Normally I do a blog post that weaves a few things together into a story, but this year, let’s just dump the rollercoaster into a bullet point list:

  • June: Stack Overflow got acquired, and Erika and I had a financial windfall thanks to Jeff & Joel’s very generous stock option grants. We almost built a house in Iceland – we bought two plots and picked a builder, but Erika and I couldn’t agree on the details.
  • July-August: our time in Iceland drew to a close and we started planning to come back to the States. I decided that Black Friday 2021 would be the last year for my Live Class Season Pass, and I would gradually start retiring from live classes before my 50th birthday.
  • September: while still in Iceland, Erika and I decided to divorce. We’d been together for ~21 years.
  • October: we moved back to the States, into separate places in San Diego.
  • November: my annual Black Friday sale brought in about $1.5M, the best one so far. However, to minimize the number of weeks I was stuck in front of a camera, I had redesigned the teaching schedule in a compact, challenging way. I would teach European classes Mon-Weds starting near midnight my time, then US classes Weds-Fri. That let me get down to teaching just 1 week per month, but those long-hour, time-zone-flipping weeks were mentally and physically demanding.
  • December-January: I closed on a condo in Mexico thanks to Erika’s help, and I moved down there. I planned to stay there permanently and coast into retirement there.
  • April: after really missing people and cars, I bought a house in Las Vegas and moved back to the US. That also changed my retirement strategy – I would need to keep working if I wanted to buy fancy cars.

This year was one of the most mentally challenging years of my life.

Some years, I feel like I’m steadily making progress towards a goal. Looking back at Year 11, I was running around like a chicken with my head cut off. I don’t have a nice, methodical story to tell for Year 11. I think if any one of a number of things had happened just a little differently, Year 11 would have had an entirely different outcome. Year 11 was like one of those Seconds From Disaster shows where a whole bunch of little things add up to a surprisingly bad consequence.

So let’s just zoom out and talk big picture.

The road to Thorsmork is not for all cars.

~18 months into the pandemic, I ran out of gas.

Erika and I were living in Iceland when my health suddenly took a weird turn. I coughed constantly and started dry heaving – often enough that I had a hard time keeping my asthma medicine down. At first, I couldn’t pinpoint the root cause: a volcanic eruption had made the air quality worse, I couldn’t get my normal asthma medicine in Iceland, and our diet had been changing a lot.

Looking back, I can see that it was stress. I was out of gas, and it was taking a toll on me physically.

That’s a weird thing to write because Erika and I were about as lucky as anybody could be during the pandemic. We were gallivanting around Iceland, making a great living teaching online classes, and didn’t seem to have a care in the world. However, the divisive politics of the time were leading to tough relationships with family members, we struggled with what we wanted out of life long term, argued a lot about our future, and I was coming to terms with my pansexuality.

Photo by Interactive Sports on Unsplash

Starting the divorce process obviously made the stress worse in the short term. I had a very clear vision that my life for the next several months would be running on a track, jumping over a series of hurdles. None of the hurdles were large, and none of them were life-or-death. I just needed to do the best job I could of making forward progress, jumping over each hurdle in order, and continuing on even if I didn’t clear one of them as well as I’d like.

The hurdles were things like taking an inventory of our assets, getting my own place set up, writing the marketing material for the Black Friday sale, teaching the first week-long stint of my Mastering classes back to back, closing on the Mexico condo, etc.

The finish line was December 11, 2021. On that date, I’d be done teaching the first round of classes after the Black Friday signups, and I sprinted down to Mexico.

I hit the finish line in Cabo,
and realized I couldn’t live there.

I loved Cabo and that condo, but two things became clear to me within a few months.

I have that effect on people.

My friends and family weren’t traveling due to the pandemic. I’d bought the place hoping they would fly down for visits. Nobody wanted to fly to another country given all the COVID-19 travel restrictions – they kept asking me to come visit them in the States instead. <sigh> I guessed that would probably change in the long term, but it was really just a guess. Looking back, it was a good guess – my parents have gotten older, and are no longer able to do long international flights.

I missed cars more than I thought. Cabo is a terrible place for cars. The huge potholes, dirt roads, and salty air conspire to turn them into beaters in just a few years. I wanted to have a couple/few nice cars, do road trips, and attend Cars & Coffee events, but that wasn’t going to happen in Cabo.

So I decided to keep my home base in the US. Where to live? My heart will probably always be in Southern California, but a nice house there with plenty of garage space was wildly unaffordable in 2022. I looked for a place with low state taxes, great driving roads, great restaurants, and easy travel, and ended up with Las Vegas. I bought a 5-bedroom Vegas house that needed some work, so I could make it my own.

I was so excited by the prospect of a 3-car garage that I bought, uh, five cars. Near the end of Year 11, I had a Ferrari 328 GTS, Porsche 944 Turbo, Porsche Speedster Replica, Jaguar XKR-S, and a Range Rover. Went a little overboard there, especially for a guy who telecommutes, and today I’m still trying to balance that out. Looking back, I think I was overcompensating by diving into my hobbies because I was hating what I was doing for a living: teaching live training classes. Well, I loved teaching, but…

People loved the Live Class Season Pass,
but they just didn’t show up for class!

All through the pandemic, I’d been selling the hell out of my Live Class Season Pass. People paid once, and then could attend as many of my live classes as they wanted, whenever worked for them. In the Year 10 post, I wrote about how hard I worked to build a kick-ass set of live online classes.

On one hand, it worked really well: I sold a lot of Live Class Season Passes.

On the other hand… the students didn’t show up for class.

Over and over, I would teach a live class and only 5-10 people (out of hundreds of buyers) would show up. The students were just thinking, “I’ll attend some other time.” And… they never did. It’s hard to be upset about that as a teacher, because I was making great money, but it was just extremely demotivating to bring my A game only to have a handful of people actually show up.

During Year 11, I grew to actively resent the students for not showing up. I hated every time I had to show up for class and teach an empty room. First world problems, right? I should have been happy for the easy money, but… yeah, no.

So I closed out Year 11 in a grind.

Grin and bear it, I told myself, and saw October 2022 as my next finish line in the hurdle race. The Nov 2021 Live Class Season Pass buyers had a right to attend live classes for a year, and I’d scheduled classes through Oct 2022. But I was done with that, and I couldn’t wait to teach the last live online class, shut that down, and then never offer a Live Class Season Pass again.

My resentment towards my work led me to work less than I’d ever worked before, I think. I really hated what I was doing, and as a result, I stopped working hard on my blog posts, the First Responder Kit, PollGab, and pretty much anything else that required me to be in front of a computer. I just couldn’t get away quickly enough, and I needed all the recharging time I could get. That recharging went on through Year 12 as well, and seeing friends & family & conferences definitely helped. I knew I liked people, but I didn’t realize how much I liked being with people until a couple years into the pandemic.

I didn’t plan it at the time, but Vegas ended up being the perfect place to recharge those batteries. In Vegas, it’s like the pandemic never happened. We have a good circle of friends, we go out all the time, we hug and kiss cheeks, we taste each others’ drinks. When I travel to other places, I’m reminded that a lot of the US is still struggling with that, and it worries me that so many people are probably still in the bad mental state that I was in during Year 11. The feels. Everybody needed a good hug back then, and we still need good hugs now.

Bought a 1964 Porsche 356 Coupe

I’m madly in love with Sabine, my Porsche 356 Speedster replica built from a VW Beetle. I love the simplicity of the air-cooled engine experience, the looks, the ride, everything. The only drawback has been constantly saying to admirers, “Yes, but it’s a replica.”

So I’ve had my eye out for a genuine 356 coupe for a while, something that I could drive in the winter when the open-air Speedster makes less sense. (The Speedster technically has a heater and a top, but you don’t wanna use either one.)

This mysterious 1964 356 coupe popped up on BringATrailer, and right from the start, it was likely to go for cheap because it had sat idle in a garage for 25 years.

Pretty quickly, the commenters pointed out that something really fishy was going on. The vehicle identification number (VIN) seemed to have been ground down and re-applied, a classic sign that a car has been stolen or rebuilt from a variety of parts. The Kardex (like a certificate of authenticity for Porsches) said the car was originally light ivory, but the car’s hidden metal spots suggested that it was originally sky blue. The “SC” badge on the rear of the car implied that it was a high-performance version, but that wasn’t backed up by the Kardex either, and it was unlikely that the car’s current engine or transmission were original.

All the questions stopped serious collectors from bidding.

But me? I’m not a serious collector. I’m a driver. I don’t enter my cars in concours shows to showcase their originality. I drive them and get them dirty. I instantly thought to myself that the “SC” badge stands for Stolen Car.

So I rolled the dice and bought it. To get it running, I’m going to need to purge & rinse the fuel system, probably replace all of the hoses, lines, and tires that dried out over a couple of decades of inactivity, and redo the brakes. (Fortunately, the 64s have disc brakes all the way around, which is a nice upgrade from the Speedster’s drums.)

That’s it, though. I’m not going to fix the body or interior issues – I’d rather leave it original.

I do need to solve the mystery of what “original” means for this one, though, by getting pictures of all the real VINs, engine numbers, etc and then tracking those down with my mechanic and Porsche.

This does officially bring me to 7 cars, which is just a little too many: two Jag XKR-S’s, the Ferrari 328 (whose restoration is coming along), the Speedster replica, the VW Type 3, this new 356, and a Range Rover.

So I’ll be selling my 2016 Range Rover and my 2013 Jaguar XKR-S hardtop. If you’re looking for a 550 horsepower monster with incredible audio, radar, and laser upgrades – one that’s been on Doug DeMuro’s channel – email me at BrentO@BrentOzar.com.

Bought a Modified VW Squareback

This adorable little 1971 Volkswagen Type 3 has such an excellent story.

About 20 years ago, it was abandoned at a hotel in Vegas. A towing company carted it away and notified the Californian owner, but the owner didn’t want to bother with it. It was broken down, had a flat tire, and was covered in dents and dings.

It sold at auction for $75.

Most $75 cars are destined for scrap or being sold as parts, but this one got lucky. Its next two decades were spent under the ownership of VW fanatics who saw its potential. A Porsche 914 2.0L engine and 5-speed transmission went in along with an air suspension, new seats, and most recently, an amazing one-of-a-kind paint job.

Photos don’t do it justice – it looks like it’s worn out and rusted, but that’s a brand new paint job with four coats of clearcoat. In person, it looks soaking wet, like it’s coated with shiny water.

Why now? Well, it’s about the time of year to take Sabine, my Porsche Speedster replica, out of the dining room for the summer driving season, and I wanted something else to put in there instead. I’d originally wanted to put the Ferrari in there, but Donnie Callaway is still restoring that, so this VW will go in. That’s perfect because there are a bunch of little things I want to do to make it my own before I take it out in the fall and hit car shows with it.

The interior is definitely nice, but I think I can take it to the next level with a beach/Hawaii/tiki theme. Let’s have some fun with this one.

Built a Home Theater

As I got older, I watched more online streaming and went to theaters less and less. I can’t remember the last time I went to a theater in person. COVID certainly didn’t help.

When I was shopping for houses in Vegas, I wanted space to build a home theater. I didn’t want anything fancy like stadium seating, red curtains, or signs. I just wanted a dark room with comfy seating, good audio, and a giant screen. The house I ended up with had five or six bedrooms, depending on how you read the floor plan – way, way more than I need. However, two of the small bedrooms were the perfect shape & layout to be merged into one bigger room and turned into a home theater.

Before the work, the two small bedrooms were just tiny boxes that both looked like this:

And then after the installation:

Home theater

Normally the room is pitch dark, of course – for the photo, I just opened the blinds and the blackout curtains, and turned on the light. (And I mean it’s *really* dark and quiet in there – we’ve had friends who slept over and said it was like a womb, hahaha.)

We have 3 rows of seating altogether – a low front pair of comfy seats, and then two cheap modular couches from Amazon.

The first step was hiring a contractor to demolish the wall between the two rooms, make some electrical changes, turn the entry into just one door, run conduit for the video & audio cables, fix the drywall, and paint it all black. Given the economy – everybody was in a rush to build & flip houses – it was surprisingly hard to find a contractor who was willing to actually show up for anything less than a whole house renovation.

After that dusty mess finished, the room was a blank black slate, and it was time to pick home theater gear. It took a while to get in – turns out a lot of people were building home theaters during the pandemic, go figure.

Projector: Epson LS11000. This laser projector does 4K HDR and renders with sub-20-millisecond latency. (Low latency was important to me because I wanted to play Dead By Daylight in there – although that game’s only 1080p, not 4K.)

One of the cool features of this projector is vertical lens shift: you can mount the projector above or below where you want the image to be displayed. I could mount the Epson to the ceiling, keep it out of the way of the rest of the room, and still have a nice, clean, rectangular picture with low latency.

Without that feature, a projector has to kinda be in the middle of the image, or else it uses digital processing to correct the image – and that adds latency and fuzziness to the picture. (This is why theater projectors are at the back of the theater, and they’re basically aiming at the middle of the screen.)

Screen: 150″ Silver Ticket screen. It’s best not to order these until after the projector actually arrives and gets mounted. That way you can figure out exactly how big the projector’s image will be once it’s mounted in your room, and have a minimum of unlit screen around the image.
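If you want to sanity-check the screen size before the projector even shows up, the arithmetic is simple: image width = throw distance ÷ throw ratio. Here’s a rough sketch of that math – the throw ratios below are placeholder numbers, not the LS11000’s actual spec, so pull the real zoom range from your projector’s manual.

```python
# Rough screen-size sanity check. The throw ratios below are placeholder
# numbers, not the Epson LS11000's actual spec sheet values.
def image_width_inches(throw_distance_in, throw_ratio):
    # throw ratio = throw distance / image width
    return throw_distance_in / throw_ratio

distance = 14 * 12  # projector lens to screen: 14 feet, in inches
for ratio in (1.35, 2.0, 2.8):  # hypothetical zoom range, wide to telephoto
    width = image_width_inches(distance, ratio)
    diagonal = width / 0.87     # ~0.87 width-to-diagonal ratio for 16:9
    print(f"throw ratio {ratio}: ~{width:.0f} in wide, ~{diagonal:.0f} in diagonal")
```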

Receiver: Yamaha RX-V6A. I wanted Dolby Atmos audio, 4K 60-120fps video, Apple AirPlay 2 to stream video direct from iPhones/iPads, and a receiver that could automatically optimize audio for multiple listening positions. (Some receivers just optimize audio for one spot on the couch.)

Speakers: Jamo S 809 Cinema Pack with Atmos toppers. I’m very picky about my audio, but this is one aspect of the home theater where I actually went cheap. The speakers altogether were under $1,000. They just got really good reviews at their price point. I figured even if they were on the low end of home theater audio, they’d still be an order of magnitude better than anything I’d been streaming shows on in the last few years. The first time I fired up the opening scene of Baby Driver, I was sold. The Jamo speakers get the job done.

All in, it was less than $6K for the equipment – which was actually less than the construction part of the work!

Once it was all in, I called Vegas Calibration to come out and calibrate the projector for the best results. I tried doing it myself, but hooweee, it’s not a good idea to put a partially colorblind guy in charge of the calibration. It totally made a difference before & after.

I totally love it. I spend hours in there playing Dead by Daylight. The giant screen and the immersive audio are fantastic. I bet there are empty nesters out there in the audience who’ve wondered what to do with those extra bedrooms after the kids leave – give this a shot!

Using Stable Diffusion to Generate Presentation Art for Free

This is a brand spankin’ new topic for me here at this blog.

Whenever I start teaching something, there’s a struggle to figure out what I’m gonna share first. Should I cover how to install Stable Diffusion? Which toolkit I use? What hardware I use? How I pick a sampler? How to make bigger images in a certain aspect ratio? I can’t cover everything in one post, and even just laying out a table of contents is an impressive amount of work.

So here’s the deal, dear reader: when inspiration strikes me, I’m going to share things about Stable Diffusion that may be out of order. Later on, if I end up writing a lot – and there really is a lot to share here, depending on how advanced you wanna get – I’ll circle back and build a good Table of Contents. But for now, I’m just going to write things as the inspiration strikes.

I want a photo of a man holding a laptop, standing in a coffee shop.

Should be easy enough. I’m using Automatic1111’s Stable Diffusion webui, so I put “photo of a man holding a laptop, standing in a coffee shop” into the Prompt box. I’ve also tweaked a few settings – more on that later in the series.

Stable Diffusion settings

I click Generate, and 60 seconds later, I’ve got 16 photos to choose from. Free. Boom. Done.

Photos of men with laptops in coffeeshop, by Stable Diffusion
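By the way, if you’d rather script this than click around the web UI, here’s a minimal sketch of the same idea using the Hugging Face diffusers library instead of Automatic1111. The model name and settings are my assumptions, chosen to mirror the UI defaults above (Euler, 20 steps, CFG 7, 16 images) – they’re not anything baked into Automatic1111 itself.

```python
# Minimal sketch with Hugging Face diffusers (not the Automatic1111 UI).
# Assumes an NVidia GPU and the Stable Diffusion 1.5 checkpoint.
import torch
from diffusers import StableDiffusionPipeline, EulerDiscreteScheduler

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
# "Euler" sampling method, mirroring the setting in the screenshot above
pipe.scheduler = EulerDiscreteScheduler.from_config(pipe.scheduler.config)

prompt = "photo of a man holding a laptop, standing in a coffee shop"

# 4 batches of 4 images = 16 candidates, like the UI settings
images = []
for _ in range(4):
    out = pipe(prompt, num_images_per_prompt=4,
               num_inference_steps=20, guidance_scale=7)
    images.extend(out.images)

for i, img in enumerate(images):
    img.save(f"candidate_{i:02d}.png")
```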

The images aren’t perfect – to be honest, in many cases, the man isn’t actually holding the laptop. Why isn’t it exactly what we wanted, the first time? To understand what’s happening, I’ll use an analogy.

Let’s look at clouds together.

Sky filled with clouds, by Stable Diffusion

Let’s all go outside, lay down on the grass, and look up at the clouds.

I’m going to point at a cloud and ask everybody, “That cloud right there, that specific one, does that look like a man to you?”

Some clouds are obvious: we’re all going to say, “Oh yeah, that’s his head,” or perhaps we’ll say, “That cloud is a stick figure of a man.”

Other clouds are not quite as obvious. Some of us might say, “That’s a guy holding his arms out,” while someone else says, “No no, that’s just his face, and he has really big ears.”

The three components of what just happened:

  • Cloud: one specific thing we’re all looking at together
  • People: different people see different things in the same cloud
  • Prompt: “a man”, the thing we’re trying to see in the cloud

That’s exactly how AI art generation works.

Our components:

  • Cloud = seed number. When you generate art with Stable Diffusion, you usually start with a random seed number, which is like pointing at a random cloud in the sky. You can also repeatedly use the same exact seed number, which is exactly like pointing at the same cloud in the sky over and over. (The sky will change over time – and that’s caused by you updating your software, like PyTorch.)
  • Person looking = sampling method, like Euler, Heun, LMS, and so forth. For some clouds, all of the sampling methods produce roughly the same end point, but for many clouds, the interpretations are different.
  • Man = prompt. What we’re looking for in the final image.

If I ask you if it looks like a man, odds are, you can turn any cloud into a man in some way, shape, or form. Maybe it’s just his head, or his torso, or a stick figure.

The word “man” is really generic, and leaves a lot open to interpretation.

However, the more specific my prompt becomes, like “photo of a man holding a laptop, standing in a coffee shop”, the longer the people (sampling methods) have to stare and the more creative they have to get to turn that cloud into it – and often, they won’t be able to do it, because the cloud’s general shape just doesn’t match what you’re looking for. That’s why we often generate lots of images based on lots of random seeds – we’re asking the AI to look at different clouds.
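To make the cloud part of that concrete in code: a fixed seed reproduces the exact same starting noise (the same cloud) every time, while fresh random seeds point at new clouds. A hedged sketch, reusing the pipe and prompt from the earlier diffusers example rather than the Automatic1111 UI:

```python
import random

# Same cloud: a fixed seed reproduces the same starting noise, so the same
# prompt + sampler + steps hands back the same image every time.
gen = torch.Generator("cuda").manual_seed(1234)   # any fixed number will do
same_cloud = pipe(prompt, generator=gen,
                  num_inference_steps=20, guidance_scale=7).images[0]
same_cloud.save("same_cloud.png")

# Different clouds: fresh random seeds, which is what the UI's seed of -1
# does for every image it generates.
for _ in range(3):
    seed = random.randrange(2**32)
    gen = torch.Generator("cuda").manual_seed(seed)
    img = pipe(prompt, generator=gen,
               num_inference_steps=20, guidance_scale=7).images[0]
    img.save(f"cloud_{seed}.png")
```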

Generating specific art with AI requires a little work.

If I specifically want a man holding a laptop – and for this blog post, let’s say that’s important to me – then I have a few different options. We need to go back to that list of images that we built, and pick the closest match. That first one was a good start:

Photo of man holding laptop, standing in coffeeshop, by Stable Diffusion

But there are a few problems with it:

  • There’s an extra hand in there
  • I’m not a big fan of denim shirts over blue jeans
  • His face seems kinda weird

In order to make that image better, I have four common options:

  • I could ask different people (sampling methods) to look at the same cloud (seed), and they’ll produce slightly different images
  • I could ask the AI to generate additional variations based on ever so slightly different clouds (seeds)
  • I could change my prompt a little, like excluding extra hands, specifying the kinds of clothes I want him to wear, and ask for a handsome man – this technique is known as “prompt engineering” or “prompt whispering”
  • I could do my own post-processing, like with Photoshop

All of these are valid results, but just to pick one and show how it works, I’m going to take this same cloud (seed number), and have different people look at it to tell me what they see.

Let’s get variations by asking different people to look at the same cloud.

In Automatic1111, I’m going to click on that image I liked, and below it, the image’s details are shown:

man holding a laptop, standing in coffeeshop
Steps: 20, Sampler: Euler, CFG scale: 7, Seed: 2020086530, Size: 512×512, Model hash: 81761151, Batch size: 4, Batch pos: 0

The seed number is the random cloud that the AI looked at. I’ll copy that seed number, and paste it into the Seed box in the UI, replacing -1, which picks a random seed each time. Now, I want to have several different people (sampling methods) look at that same cloud. Here is the relevant part of Automatic1111’s interface:

Sampling steps and sampling method

Sampling steps = how long we’ll spend squinting at the cloud, trying to come up with an image that matches the prompt.

Sampling method = the person looking at the cloud. Each algorithm starts with the same static image (driven by the seed number), but has a different way of interpreting what it sees.
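If you’re scripting this with diffusers instead of the UI, swapping the sampling method means swapping the scheduler while keeping the seed fixed – different people, same cloud. The name mapping below is my assumption (and only covers a few samplers), and it reuses the pipe and prompt from the earlier sketch:

```python
# Different people looking at the same cloud: same seed, different schedulers.
# The mapping from UI sampler names to diffusers schedulers is my assumption.
from diffusers import (EulerDiscreteScheduler,
                       EulerAncestralDiscreteScheduler,
                       HeunDiscreteScheduler)

samplers = {
    "Euler": EulerDiscreteScheduler,
    "Euler a": EulerAncestralDiscreteScheduler,
    "Heun": HeunDiscreteScheduler,
}

seed = 2020086530  # the seed copied out of the image details above
for name, scheduler_cls in samplers.items():
    pipe.scheduler = scheduler_cls.from_config(pipe.scheduler.config)
    gen = torch.Generator("cuda").manual_seed(seed)
    img = pipe(prompt, generator=gen,
               num_inference_steps=20, guidance_scale=7).images[0]
    img.save(f"same_cloud_{name.replace(' ', '_')}.png")
```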

Now, I could try different methods and steps individually, but further down the UI, Automatic1111 offers a better way: X/Y plots.

This tells Automatic1111 to generate different combinations of images in a grid – and obviously, you’re gonna wanna click on this to see the full details:

Each column is a different person looking at the cloud. Overall, most of the sampling methods come up with roughly the same kind of image – just like if you ask several people to look at the same cloud and come up with a specific prompted image, they’re going to see roughly the same subject arrangement, but the devil is in the details.

Each row is a longer period of time that the person spent looking – from 5 steps to 100 steps. More steps usually mean better image quality, but they also mean longer processing time. As you look at a single column, and look at whether images got better with more processing time, you’ll find that each sampling method has a different “good enough” area where it doesn’t make sense to keep doing more processing time.

DPM Fast and DPM2 a Karras are exceptions. Those sampling methods introduce new noise in each processing step, so their images keep changing with each different number of steps. That’s useful if you like the overall idea, but you want fresh inspiration with each step.

In a perfect world, I’d find the sampling method that produced the best quality results in the fewest steps, so I could produce images quickly. For me, that’s usually Euler – it produces good-enough images in just 10-20 steps – good enough that I know whether I want to bother digging more deeply into that seed or not.

In the real world, I often just shove a seed number into a config like the above one, let it crank away for a while, and then pick the best image out of the batch.

However, a lot of those samplers produce pretty redundant results. Let’s pare it down to a simpler, faster test with fewer options:

  • Sampling methods: just my 4 favorites: Euler a, Euler, LMS Karras, and DPM2 a Karras
  • Sampling steps: 15, 20, 25

That’s just 12 images (4×3), and my older gaming laptop with an NVidia 3060 can generate that grid in about 60 seconds:

Photos of man holding laptop, standing in coffeeshop, by Stable Diffusion
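Outside the UI, you can hand-roll the same kind of grid with a couple of loops: fix the seed, iterate over schedulers and step counts, and paste the results side by side. The scheduler mapping and the Karras-sigma flags below are my assumptions and depend on your diffusers version – this is a sketch of the idea, not Automatic1111’s actual X/Y plot script:

```python
# A hand-rolled X/Y grid: 4 samplers x 3 step counts, one fixed seed,
# pasted into a single image with PIL. Karras flags may vary by version.
from PIL import Image
from diffusers import (EulerDiscreteScheduler,
                       EulerAncestralDiscreteScheduler,
                       LMSDiscreteScheduler,
                       KDPM2AncestralDiscreteScheduler)

samplers = {  # assumed mapping to the UI's sampler names
    "Euler a": (EulerAncestralDiscreteScheduler, {}),
    "Euler": (EulerDiscreteScheduler, {}),
    "LMS Karras": (LMSDiscreteScheduler, {"use_karras_sigmas": True}),
    "DPM2 a Karras": (KDPM2AncestralDiscreteScheduler, {"use_karras_sigmas": True}),
}
step_counts = (15, 20, 25)
seed = 2020086530

grid = Image.new("RGB", (512 * len(samplers), 512 * len(step_counts)))
for col, (name, (cls, extra)) in enumerate(samplers.items()):
    pipe.scheduler = cls.from_config(pipe.scheduler.config, **extra)
    for row, steps in enumerate(step_counts):
        gen = torch.Generator("cuda").manual_seed(seed)
        img = pipe(prompt, generator=gen, num_inference_steps=steps,
                   guidance_scale=7).images[0]
        grid.paste(img, (col * 512, row * 512))
grid.save("xy_grid.png")
```

At 512×512 per image, that grid comes out to 2048×1536 pixels – big enough to eyeball each sampler’s “good enough” step count.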

So my workflow looks something like this:

  1. Put in a prompt for an image I want, using these settings for fast image generation: Euler, 20 steps, CFG Scale 7, 4 batches, 4 images per batch. Click Generate, and I get 16 candidate images back in about 60 seconds.
  2. If I don’t see any images that are good enough, go back to step 1 and refine my prompt.
  3. If I see an image that’s a perfect fit, we’re done here.
  4. If I see a candidate image that’s close but no cigar, run it through an X/Y plot with my 4 favorite sampling methods and 15-20-25 steps. I get 12 refined versions of the image I liked.
  5. If I see an image that’s a perfect fit, we’re done here.
  6. If I see a candidate image that’s close but no cigar, I have one last trick: variations.

Let’s get variations of an image by looking at similar clouds.

Let’s say that in that last pass of 12 images, I really liked this one, DPM2 a Karras with 25 steps:

Man holding laptop, standing in coffeeshop, by Stable Diffusion

I like that his shirt doesn’t exactly match the same denim as his pants, he’s holding the laptop, there’s not something weird under his arm, and his face is normal. (That’s surprisingly hard to get with AI.) His hands and arms are just a little weird, but we’re really close.

I’ll change Automatic1111’s settings by turning off the X/Y plot (set the Script dropdown to None), and then:

  • Set the Sampling Steps & Sampling Method to match whatever picture style I liked the most – in this case, 25 steps with DPM2 a Karras
  • Set batch count & batch size to 4 & 4 (to generate 16 images quickly – not all video cards support batches of 4, more on that another day)
  • Click the Extra checkbox next to Seed, and the Variations part shows up:
Seed and variation config in Automatic1111

I’m going to change Variation Strength to 0.05, meaning each pass changes the image just the tiniest little bit. In cloud terms, it nudges the original cloud we’re looking at a little each time – it randomizes the original starting noise slightly, which randomizes the end result slightly too. Because DPM2 a Karras isn’t the fastest sampling method, building these 14 variations takes about 4 minutes:

Variations based on slightly different clouds
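If you’re scripting with diffusers, there’s no Variation Strength knob, but you can approximate it by nudging the starting noise yourself: blend the base seed’s noise with a tiny bit of a different seed’s noise and pass the result in as the latents. The linear blend below is a simplification of what Automatic1111 actually does (and it reuses the earlier hypothetical pipeline), but it illustrates the idea:

```python
# Approximate Variation Strength outside the UI: blend the base seed's
# starting noise with a little bit of a different seed's noise, then pass
# the blended latents to the pipeline. A linear blend is a simplification
# of Automatic1111's interpolation, but the idea is the same.
# (Keep the scheduler set to DPM2 a Karras, as in the grid sketch above,
# to match the picked image.)
strength = 0.05
base_seed = 2020086530
shape = (1, 4, 64, 64)  # latent shape for a 512x512 image with SD 1.x

base_noise = torch.randn(
    shape, generator=torch.Generator().manual_seed(base_seed))

for i in range(8):
    variation_noise = torch.randn(
        shape, generator=torch.Generator().manual_seed(base_seed + 1 + i))
    latents = (1 - strength) * base_noise + strength * variation_noise
    img = pipe(prompt, latents=latents.to("cuda", dtype=torch.float16),
               num_inference_steps=25, guidance_scale=7).images[0]
    img.save(f"variation_{i:02d}.png")
```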

I’ll be honest, dear reader: none of these images are perfect. I’m probably going to want to tweak the prompt a little in order to encourage the AI to build more accurate hands – but that’s a story for another blog post. You get the general idea.

Sure, that photo is boring.

But you can get anything you want – as long as you’re willing to put in the work to:

  • Hone the prompt over time
  • Generate a lot of starting point images by looking at a lot of clouds (seed numbers)
  • Try asking different people (sampling methods) to look at the clouds
  • When you find something close, run a few more variations of it to find one that’s just chefs_kiss.gif

For inspiration, hit Lexica.art and start bookmarking styles of art you like. You can pick a style that’s all you – it doesn’t have to be boring stock photos like the above ones. Maybe you want your presentations to revolve around evil rabbits or isometric art or wolf detectives or cartoon pirates. Your presentations can be as beautiful and quirky as you like.

Here’s how to get started.

I really do wish I could teach you everything inside a single blog post. Automatic1111’s tooling is way too hard for most folks to pick up as their first tool. Here are the tools I recommend folks try, in order:

  • Windows: NMKD Stable Diffusion GUI – a simple all-in-one download with everything you need – as long as you’ve got an NVidia GPU with at least 4GB memory on it. (Not 4GB in your computer, mind you, but 4GB on the card itself.)
  • Apple Silicon: DiffusionBee – like the above, a simple all-in-one download.
  • Web page, any platform: Automatic1111 – and yes, the instructions say it’s an automatic installation, but trust me, it’s not as easy as the above two. Start with those first.
  • Command line, any platform: InvokeAI – if you want to generate images at the command line or via an API, this is really powerful.

Resources for learning more:

Things are changing furiously quickly in this space, with tons of new tools and techniques popping up constantly. Prompt engineering – figuring out how to write the right prompts to get the images you want, quickly – is an amazing new field.

Bought Another Blue 2013 Jag XKR-S.

About two years ago, I got a 2013 Jaguar XKR-S hardtop, and I love it. It’s comfortable, fast, loud, useful, and pretty. It’s the definition of what a grand tourer should be: a combination of luxury and performance that makes it easy to drive long distances at speed. I do way more road trips than track days, although that might change now that I’m in Vegas.

But originally, when I first fell in love with the XKR-S, I wanted the convertible version.

So I kept my eye out on the market – mostly out of curiosity – watching prices and availability. When this French Racing Blue convertible came up on BringATrailer, the bidding was really slow and low. I told myself that if it was at $50K or below in the final two minutes, I’d pull the trigger. Bidding stalled at $48K, and neither of the other bidders’ profiles looked like they were really serious, so I bid exactly once at $48,500 – and won.

So now I’ve got both the hardtop and the convertible. Same year, same color.

It was funny seeing them next to each other for the first time – it was immediately obvious under good lighting that they’re wildly different shades of blue right now. That’s because the hardtop’s paint was recently color-corrected, something I’ll wanna do to the convertible just so they look great sitting side by side.

I don’t know yet whether I’ll keep both of them long term. They’re very rare, and it’s kinda fun to own two of them. Jaguar never published detailed numbers on how many were built, but the current best guess is that across the entire 4-year production run, 394 convertibles and 941 hardtops were sold worldwide. That sounds like a lot, but only a percentage made it to the US, and of those, only a percentage were French Racing Blue. For example, in 2012, out of the 136 convertibles sold worldwide, only 25 were allocated to the US.

Would I buy a third XKR-S? Well, there’s one other special edition of the XKR-S: the XKR-S GT, a track-focused version with the same engine, but a rougher ride for racing usage. Only 45 of them were built, 30 for the US & Canada, and they were only available in white. I’ve never driven one, and at the prices they usually go for, it’s just not my bag. Like I said, I do more road trips than track days.

So that puts me at two Jags, the Speedster replica, and the Ferrari 328 GTS. I only have a 3-car garage, but the Ferrari is at Donnie Callaway’s getting restored, so that buys me a little time on my decisions.

Doug DeMuro Reviewed My 944 Turbo.

After I bought my 1988 Porsche 944 Turbo off Cars & Bids, I contacted Doug to see if he wanted to review it. The video just went live:

Doug remarks a lot in there about the car’s amazing condition and originality, and that’s certainly one of the big things that sold me on it.

He sums it up with, “This is an exceptional car. It drives well, it’s fun, it’s exciting, and the Turbo version is pretty fast, and I just think there’s a lot to love here.”

I agree!