Data By and For the People

I’m the rare human who loves public speaking. Yes, I get nervous, of course, but I also get a huge charge out of it. So this Slack message from my coworker Susy came with a special dose of serotonin:

I soon huddled with another colleague, Rosana (like Susy, Rosana is far more familiar with crowdsourcing than I am). I got into my best Michael Krasny consultant-curiosity groove to beat back the imposter syndrome and, I hope, helped craft the panel topic: Crowdsourcing: Data By and For the People, to be hosted, fittingly, at Mozilla’s Community Space in SF.

Fortunately the conference organizer Epi loved our topic, and we enlisted Megan to be on the panel, along with Christian Cotichni of HeroX, Nathaniel Gates of Alegion, and Anurag Batra of Google.

Since the link to our panel doesn’t include what we wrote up to describe it, I’m pasting it here so you can get a sense of what it was really about:

Per CSW’s website, by “engaging a ‘crowd’ or group for a common goal — often innovation, problem solving, or efficiency,” crowdsourcing can “provide organizations with access to new ideas and solutions, deeper consumer engagement, opportunities for co-creation, optimization of tasks, and reduced costs.” 

But is this a fair value exchange for everyone involved? The above solves a number of problems for companies, but does it help contributors? And what role does crowdsourcing play in social equity?

As products and services increasingly incorporate Artificial Intelligence (AI), crowdsourcing has a critical role to play in ensuring new technologies and algorithms serve society equally. To quote The Verge: “Data is critical to building great AI — so much so, that researchers in the field compare it to coal during the Industrial Revolution. Those that have it will steam ahead. Those that don’t will be left in the dust. In the current AI boom, it’s obvious who has it: tech giants like Google, Facebook, and Baidu.” If we build the next generation of AI apps using data from a few select players, we risk creating a world that serves the needs of a few corporate entities vs. the needs of all.

If we crowdsource data to train the next generation of AIs, we stand in a much better position to deliver products and services that incorporate the needs of many vs. a few.

This panel will explore how different organizations are approaching crowdsourcing, and dive into the specific implications around rewarding contributors and the social responsibility of organizations that use crowdsourcing.

We organized a prep call that went great – we got into some of the thorny topics and surfaced some healthy panel-bait discomfort. But by far the most memorable part was at the end, when one of the panelists (we’ll let the reader guess) announced s/he had to “go to another part of campus” and “just wanted” to say that the published topic – the one we had just prepped for, Crowdsourcing: Data By and For the People – really shouldn’t be about ethics at all, because nothing really “goes anywhere” from ethics discussions. Instead, we should delve into the “intricacies of crowdsourcing itself.”

Just before s/he dashed off to grab a campus bicycle, I reminded the call that the organizer loved the topic, and I was super grateful that another panelist chimed in to say it was precisely why s/he agreed to be on the panel.

I quickly developed a strong energy for the day of the show.

And it went fine. Granted, we were one of just a few panels that weren’t in the main building (so away from all the traffic), and we were at the tail end of the conference, at 3:00 pm on a Friday. So we were heartened by the ten or so folks who did show up and listened attentively.

We tackled the time this way:

  • How do you tie into crowdsourcing? 
  • How do you see contributors benefiting?
  • How about the economics?
  • How about ownership and meaningful influence?

And the takeaway? Our closing point was: if you get others’ data, use it only for the intended use case. And as Megan reminds us, “be sure the intended use case is clear; ‘consent’ doesn’t mean anything if people don’t understand what they’re opting into. And if it changes, that’s okay! Just let people know and require them to consent again.”

Personally, I’m quite gratified we didn’t decide to unilaterally change the terms of service on our panel topic, either.

UX vs. DX

I was introduced to Estelle Weyl through my colleague Ali, who suggested Estelle as a speaker for Mozilla’s speaker series. I was intrigued by Estelle’s teaching on the differences between how we as humans perceive the speed and performance of our web browsers (vs. the precise, technical “reality”).

She was of course great, and her final slide also called out another important distinction:

So how fun was it when, a bit over a year later, she invited me to moderate a panel on, yep:

https://forwardjs.com/schedule

We had Tomomi Imura of Slack on board, and were super fortunate to recruit Sarah Federman, newly of Atlassian, and Jina Anne.

This group was so amazing, they agreed to meet on a holiday before the event to huddle. It was there that the subtitle emerged:

We realized we hadn’t intended it, and while we didn’t want to make the Lakoff mistake, we did think it was cool.

So, whiskey it was.

Oh, also: the conversation was as great as these women. Estelle and Tomomi had previously posted different ways to tackle this. Estelle defines DX as “the methodologies, processes, and tools (frameworks, libraries, node modules, pre- and post-processors, build tools, other third-party scripts, and APIs) by which developers develop web applications for their user base.”

And, because developers are often users too (think developer tools and, of course, frameworks), Tomomi approaches DX in a way that exhorts developer-tool makers to keep the developer experience – as users – in mind.

So as a group, we broke this down further, looking at why some developers may be tempted not to think about the UX (whether those users are developers or not, per above), and instead adopt a “resume-driven development” approach (h/t Estelle again) that favors showing off knowledge of sexy new frameworks over delivering a solid UX.

There are also work-culture pressures to deprioritize UX. “Ship fast, or first, or cheap, user-be-whatevered” can be a hard force to combat when it comes from management.

But, as others pointed out, developers can still make the choice not to be overly reliant on tools or frameworks, so they can choose the best route for the end users. Individual engineers can ask forgiveness vs. permission in adopting a user-centric, front-loaded design approach from the start. Finally, to steal (again) from Estelle:

Taking the time to do it right the first time is “fast to code”. Refactoring is not. If you keep the six areas of concern — user experience, performance, accessibility, internationalization, privacy, and security — at top of mind while developing, your finished product will be usable, fast, accessible, internationalizable, private, and secure, without much effort.

Estelle Weyl

Women Do Tech

Cross-posted from Mozilla

This June, two of my worlds collided beautifully when my employer, Mozilla, announced its sponsorship of a prize for the most privacy-respecting Women Startup Challenge finalist in the EU. On the side, I’d been volunteering with the organizers, Women Who Tech, for three years. So how did this all come together? And why?

When I joined Mozilla in 2011 to help run WebFWD, I was excited to support open source startups and their founders. The role was a great marriage of my experience with venture and startups and my desire to support innovation globally. As my role at Mozilla has evolved, my passion to support technologists globally has grown: today in my day job, I get to help our own developers around the world be more productive, and I’m still helping others “outside” Mozilla as a mentor with WXR Fund and Hackers/Founders.

In 2015, when I met the organizers on a shared distribution list, they were (and have since remained) focused on solving one big, persistent problem: less than 2% of all venture funding goes to women-led startups. Note that’s in the U.S.; the EU is a bit better, at 11%, but still far from ideal.

Compelled by the scope (and maddening nature) of the problem (and the tenacity and skill of the Women Who Tech team), I raised my hand. First, I helped recruit some online event panelists, including Julie Wainwright and Rebecca Eisenberg. Later, I helped design the startup challenge and have acted as an online and in-person judge. There I saw firsthand the caliber of the participating teams, which made me further lament the wasted opportunity that the current funding environment poses — not only for women founders, but for all the people they could serve if they only had the funding. Everyone loses.
Judging the first Women Startup EU Challenge in May 2017 @ London City Hall

When a broad mix of humans are behind technology, it leads to better outcomes, both in product and people. And, if you read through Mozilla’s Manifesto, you’ll see that Mozilla cares deeply about not just technology, but how technology impacts humans. While funding is hardly the only disparity between men and women in tech, it is significant, as it determines who will be driving what solutions for our future. For all of these reasons and more, I’m thrilled to see the visions of Mozilla and Women Who Tech come together.