Women + AI Summit 2.0: What Stayed With Me
The toughest part of the Women + AI Summit 2.0 wasn't deciding what to attend. It was accepting what I'd have to miss.
The schedule was so packed it was impossible to do everything "right." There were too many sessions I wanted to attend, too many people I wanted to talk to, and very little room to breathe between them. At one point, I made a conscious choice to step out.
I ended up sitting with two of my favorite people to play hooky with, Sunny Eaton and Lori Gonzalez, talking instead of listening. The conversation drifted, as it often does at good conferences, from tools to consequences. We circled around how systems like ChatGPT complicate the idea of a "reasonable expectation of privacy." We treat these tools like private conversations, even though they aren't. That gap between how these systems feel and how they actually work is where much of the risk lives.
Almost everyone else stayed put. The sessions were too good to miss. And still, that conversation lingered. It was a reminder that even at a tightly programmed conference, some of the most meaningful moments come from choosing where to spend your attention.
That tension between structure and spontaneity defined the weekend for me. In a way, it mirrored the larger conversations we were having about AI itself: how much to automate, when to pause, and how to choose deliberately in the face of overwhelming possibility.
What Kept Showing Up
Looking back at the schedule, it would be easy to describe the summit as a progression from talks to workshops to hands-on building. But what stayed with me were the questions that kept resurfacing.
One of the clearest throughlines was AI literacy: not fluency with tools, but understanding. How these systems behave. Where they fail. And how much agency we hand over when we use them. Several talks traced turning points: fear giving way to curiosity, skepticism shifting into discernment. There was a shared recognition that opting out isn't neutral. Literacy allows engagement to be intentional rather than reactive.
As the day shifted from listening to building, the emphasis moved from tools to workflows. The most interesting conversations weren't about clever outputs. They were about boundaries and judgment. Not just what can be automated, but what should be.
Ethics showed up not as philosophy but as practice, especially around data quality and provenance. "Bias in, bias out" wasn't a slogan. It was a warning. The concern wasn't only what AI produces, but what we feed it: whose experiences are represented, which sources are trusted, and how quickly flawed assumptions scale once embedded in a system.
That thread carried directly into access to justice. AI wasn't framed as a magic fix. If anything, there was a sober recognition that poorly designed systems can widen gaps as easily as close them. Access to justice wasn't a mission statement. It was a design constraint.
Beneath all of it was governance, not as a future policy question but as something already underway. The people choosing vendors, setting internal standards, and defining acceptable use are shaping the future in real time. Governance defaults to whoever is in the room.
Taken together, the summit wasn't about celebrating AI. It was about accountability. About engaging with technology in ways that hold up over time.
What We're Taking Home
I didn't leave with a list of tools to try. I left with a clearer framework for approaching AI work.
Literacy comes before leverage. Adoption is an organizational design problem, not just a training issue. Ethics starts with inputs, not outputs. Access to justice has to be built into systems from the beginning. And governance is already underway, whether we acknowledge it or not.
None of that is flashy. But it is foundational.
If there was a shift, it was this: move deliberately. Build the capacity to pause. Ask better questions before accelerating. The long-term impact of AI won't be determined by how fast we move, but by how thoughtfully we do.
Women in the Loop
Which is why it feels important to name something I've deliberately held until now: this was a conference centered on and led by women.
That mattered, not as branding, but as posture.
At many AI conferences, there's a YOLO energy: build fast, deploy faster, sort out consequences later. The emphasis is on scale and upside, with risk treated as friction.
That wasn't the posture here.
Instead of "What can we build?" the questions more often sounded like "What should we build?" and "Who does this affect?" There was comfort with uncertainty. Openness about tradeoffs. A willingness to admit what hadn't worked.
Even the design choices reflected that care. Speakers had walk-up songs. Dolly Parton's 9 to 5 marked transitions. Sessions were labeled mini, midi, and maxi, by scale rather than hierarchy. None of it felt gimmicky. It felt intentional. Human.
This wasn't an absence of ambition. It was a different kind of ambition, one oriented toward durability, impact, and trust.
Representation didn't just change who was speaking. It changed what felt worth discussing.
Shaping What Gets Built
Cat Moon and her team at Vanderbilt Law created more than a conference. They created a space that modeled a different way of engaging with AI: curious, accountable, and deliberate.
I left not feeling pressured to adopt more tools, but clearer about the responsibility that comes with adopting any of them. In a field that often rewards speed, that felt like a necessary pause.
If this is where AI conversations are headed (more reflective, more inclusive, more honest about tradeoffs), it's a direction worth investing in.
The future of AI isn't shaped in the abstract. It's shaped in moments and weekends like this one.
That's why it matters who's in the room when decisions are made.