
GroupBy’s One Year Anniversary and What’s Next for 2018

[Image: preliminary podcast artwork]

Last year, I unveiled GroupBy.org, a new free community event where anybody could submit a session on any topic, and the attendees would vote to see what sessions made the cut.

I had a lot of questions when we started. Would people submit sessions? What kinds of sessions would readers vote for? Would they prefer attending live, or getting the recordings?

When the event first took off and got a lot of abstracts and attendees, I purposely took my hands off the wheel and started letting the audience decide where things went. I stopped doing marketing, stopped pushing for abstracts, and watched to see what the community latched onto. (After all, this is just a fun experiment for me, not something that needs to see an ROI.)

With 5 events and over 50 sessions under our belt, let’s look at a few metrics and talk about what I learned.

What I learned about the session submission process

I wanted to make it easy for people to submit an abstract, get feedback on it, and have it automatically stay eligible for several rounds of voting across successive events. I didn’t want speakers to have to re-submit every time – I know as a speaker, I usually submit some of the same talks over and over. No sense in reinventing the wheel.

So 115 abstracts submitted across 5 events might sound low, but I was happy with it. We had more than enough to have competitive voting. We’re not overwhelmed with abstracts, but we don’t need to be.

However, I had to modify the “automatically re-submit it every time” part – we ended up with a few abstracts that presenters weren’t refining based on feedback and that weren’t doing well in the voting, so it didn’t make sense to keep ’em around.

I’m comfortable with where this is at, and we’ll stick with it for 2018.

What I learned about the peer review process

Only 486 comments have been left. That 486 number is deceptively high, too, because it includes comments left on sessions after they went live.

In the beginning, abstract review was loud & lively, but it has definitely tapered off over time. The new-abstract feedback is pretty dang silent. I let this go to see if folks would pick it up, and they definitely didn’t – not their fault, the site just doesn’t give reviewers the tools and incentive they need to talk about an abstract.

I think if this is going to be a big goal going forward, I need to make it easier for reviewers with a better web site structure. To me, success for GroupBy means growing the next generation of speakers, and we need collaborative abstract review to make that work well. I’m going to put some thought into this during my next personal retreat.

What I learned about voting

We started with readers rating sessions individually, then taking the top-rated sessions. That was a lot of work for readers, though, so we switched to just letting them vote for their favorite 10. I’m very happy with how that’s going – the events have been a neat mix of SQL Server topics.

When I first unveiled the anything-goes format of the event, I was actually worried that the topics might stray far from SQL Server, like into pure development (C#, Java) or into systems administration. That didn’t happen – the top-voted sessions focused on the Microsoft data platform.

I like this a lot, and I have no plans to change it in 2018.

What I learned about the event itself

When I first started it, I was the only cohost because I wanted to show folks what I meant by cohost. Later in the year, I invited past presenters to be cohosts (since they understood how the event worked), and that worked really, really well. At last week’s event, I stayed online the whole time, popped in for intros & breaks, and was there as a backup, but I wasn’t on camera for any of the sessions. I’m really happy with that because it makes the event more sustainable – it can keep working even if I’m not around.

For audience Q&A and chat, we started by using a hashtag on Twitter, but that experience was miserable. Folks don’t want to set up a hashtag monitor column in their Twitter client, they forget to include the hashtag, they can’t ask long, detailed questions with threads, etc. We switched over to using the #GroupBy channel in SQLslack.com, and that’s been great – ~600 folks in there now. (Plus it gets more people into the community Slack, which is good – this whole thing is all about getting more people into a sustainable community.)

The session length was a really big experiment. We started with sessions beginning every 2 hours, but length was anybody’s guess – presenters could do anything from 45 minutes to 90 minutes. I wanted to encourage a really casual event with long onscreen chats during the breaks.

However, presenters thought they HAD to produce 90 minutes’ worth of material. While I’m all about free training, I also want this event to be about new speakers getting their feet wet, and 90 minutes is just way too long for first-time presenters. For December’s event, I let presenters pick their session length, encouraging them to do shorter ones.

For 2018, the events will start at 8AM Eastern, and a new session will start every hour. (That means sessions need to max out around 45-50 minutes.) In my own online training experience, that’s felt like the max length of time people can go without a bio break. New presenters will get a single 45-minute spot, and repeat presenters will get the choice of 1 spot or 2. Even if they choose 2 spots, though, it’s still going to be 45 minutes, then a 15-minute bio break & chat, then another 45-minute spot.

What I learned about the recordings

Folks are definitely consuming ’em. Stats for 2017, not including the December events:

I really wanted presenters to have a page they could be proud of, with multiple ways for readers to consume their work: video, audio podcast, and transcription. I’m happy with the video & audio, but the transcription is just a big wall of text. We’ve tried having volunteers edit it, but it’s a lot of work, and we need a good, consistent result. Further, we need a timely result, because…

In 2018, each session recording will have one coordinated release date, when the video, the audio podcast, and the transcription all go live together.

Our super-awesome podcast partner, Digital Freedom Productions, has already put most of the building blocks into place, but the bottleneck is me (or the volunteers) doing the headings & screenshots in the transcription. To fix that, I’ll hire a part-time editor from the SQL Server community – more on that soon.
