DevXCon San Francisco

A few weeks ago I attended DevXCon in San Francisco with a colleague of mine. Since my previous role was as a systems engineer, with community outreach being more of a hobby, I'd never attended an event like this. Because it was local, I figured it would be a great opportunity.

The event began with a talk by Donnie Berkholz on "(How much) do developers really influence? Reset/reality check" (slides). Setting the stage for the value of developers and how to reach them, the talk quickly cited the Microsoft Hosting and Cloud Study 2015, where he highlighted page 42 of the report, on Stakeholder Decision Making Authority. Several of the top influencers and primary decision makers, according to the report, were IT infrastructure managers, IT architects and software developers. While CEOs and CIO/CTOs had similar numbers, it was clear that the individual contributor had a say in what software was being used. He then cited 451 Research's Q1 2014 DevOps Study, which showed that the top two ways that developers learned about new tooling were word of mouth and trade and blog articles.

The rest of his talk went over various successful examples of companies meeting developers where they are and systems that worked for engagement including:

  • Rewards to high performing members of their communities
  • Prioritization of high quality documentation so developers could succeed
  • Making changes in the product as you notice your user base moving to a new base technology

He then sketched out a story around the tooling that developers used, stressing that while most open source projects unintentionally silo their tooling, developers use all kinds of programming languages and frameworks, operating systems, automation techniques, and more. As a result, the most effective evangelism and advocacy work tends to be done when you’re going to broad spectrum conferences where you’re meeting with developers working on various technologies.

Grace Francisco then gave a talk titled “Gloriously Global!” where she offered various tips for working with a global team and effectively scaling your team when working with a global community, including:

  • Daily individual stand ups that are shared across the team asynchronously (record, share)
  • Two team-wide syncs scheduled so no one is left out timezone-wise
  • Host team off-sites; actual face time from time to time is essential
  • Having a firm, clear rationale for all travel (your team will be scrutinized more due to the nature of the work)
  • Make sure work you do and share is syndicated across platforms (blog, social media, wherever your community is)
  • Build a program to support external champions which includes swag, training, recognition, tools they need to succeed

She also spoke some about how you go about finding developer advocates and evangelists, explaining that the skill set may be challenging to find (technical, able to work externally and openly, speak, write, and do community relations). She suggested asking internally first to see if there are engineers looking for a career change, and then looking to post-sales engineers who are accustomed to public-facing, technical work with customers.

After the keynotes, I went to a couple talks on APIs. The first, by Tristan Sokol, dove into the use of Swagger Codegen to create SDKs against APIs from reference files, after reviewing the challenges of SDK creation by a small team. He admitted the downsides of using an automated tool that, by its nature, can’t do a great job of catering to the specifics of every language it provides an SDK for, but argued that it’s better than nothing. The next talk, by Romain Huet, provided a series of things to consider when building an API for developers to build against, including making sure the business case is clear. He also advised that a quick on-ramp, good documentation, and perhaps even on-site demonstrations of usage were valuable to adoption, and talked about making sure error messages for incorrect use are helpful to getting the developer on the right track. Finally, a status page about your API to keep users informed about outages is important.

Continuing in the API theme, Erin McKean gave the first afternoon keynote, talking about "Supporting new developers and your API" where she drew experience from the successful API that Wordnik provides. She mentioned that early on they learned a lot of students were using the API, many of whom were part of university classes, so they made some decisions based on supporting those students where they were. Other tips for supporting them included providing a simple, documented sandbox where people could play with the API without doing any damage, and reviewing error logs to see where people are struggling so you can make improvements to documentation as needed. Like others at the conference, she talked about champions in your community and suggested seeking them out to write blog posts, share workarounds, and even talk about other products when yours doesn't have a goal of supporting something that some customers want.

Jono Bacon spoke next on "Measuring the Health of your OSS Community" where he worked to dispel the myths around flashy dashboards that seek to measure communities. He reviewed tangible (easy to track in a dashboard) and intangible types of contributions, noting that intangible things like happiness, personal development, relationships and having a rewarding experience are the things that frequently keep people engaged with a community. He shared some strategies for "measuring" these things, including: engagement in person (how many people have their laptops open? are sitting in front?), engagement on social media, and constructive participation during open Q&A sessions. He also explained that it's important to track the path of a contributor in your community when looking for general trends, while also making sure it's always easy for new contributors to gain status and recognition, including having reputation "points" decay when involvement decreases so that older community members don't have an unfair advantage.
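
The reputation decay he described is easy to prototype. Here's a minimal sketch; the exponential scheme and the 90-day half-life are my own assumptions, not anything prescribed in the talk:

```python
def decayed_reputation(points, days_inactive, half_life_days=90):
    """Decay reputation points exponentially: after one half-life of
    inactivity a contributor retains half their points, so long-idle
    members don't permanently outrank newly active ones."""
    return points * 0.5 ** (days_inactive / half_life_days)

# A veteran with 1000 points who goes quiet gradually loses standing
print(round(decayed_reputation(1000, 0)))    # fully active: 1000
print(round(decayed_reputation(1000, 90)))   # one half-life: 500
print(round(decayed_reputation(1000, 180)))  # two half-lives: 250
```

Any monotonic decay curve works here; the point is that standing reflects recent involvement, not accumulated history alone.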

He also made several key points:

  • Focus on the outcome rather than the process
  • Review on-boarding complexity
  • Identify, track and try to improve failure states
  • Follow retention of community members
  • Keep an eye out for growth stagnation

This talk led nicely into the next, from Bear Douglas on “Building positive developer support experiences” where she outlined a similar series of things you want to keep an eye on:

  • Adoption of your product/project
  • Success rate of integration
  • Customer Satisfaction (CSAT)
  • Willingness to promote (Net promoter score)
  • Retention over time
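
Of the metrics in that list, Net Promoter Score is the most mechanical to compute: respondents rate you 0-10, and NPS is the percentage of promoters (9-10) minus the percentage of detractors (0-6). A quick sketch, with made-up survey scores:

```python
def net_promoter_score(scores):
    """NPS = % promoters (9-10) minus % detractors (0-6).
    Passives (7-8) count only toward the total responses."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical survey: 5 promoters, 3 passives, 2 detractors
sample = [10, 9, 9, 10, 9, 8, 7, 8, 5, 6]
print(net_promoter_score(sample))  # 30.0
```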

She reminded us that when a new person comes to your community, it’s often best to assume something has gone wrong. They are frequently coming to the community to solve a problem or get a question answered, and they may be stumped or frustrated. She stressed the importance of keeping that in mind when working with new community members and to cultivate real empathy on the team supporting them (not just a playbook). Listen to what they’re struggling with, engage in real dialog, and make sure you follow up and follow through so they don’t feel patronized or forgotten. She also stressed the importance of a public road map so community members can understand where a feature they’re eager to see or contribute to is in the priorities of the team and aren’t left frustrated by what they see as lack of, or delayed, progress.

What I really liked about this talk is that she also talked about the energy required to be this supportive. No one is happy, supportive and empathetic all the time. Don't let your team members feel isolated; make sure there are team stand-ups and lunches that remind them they're not alone. Avoid trash-talk style venting, since it can easily spiral out of control and create a negative atmosphere (in spite of how good it may feel to the frustrated person in the moment!). Conversely, when there is good feedback, make sure that gets back to the product team so they know where they are succeeding. Back to team care: make sure there is a mechanism for handing off support when you're too spent or upset to handle it; someone with a fresh perspective can help turn things around, transforming a bad situation into a celebration-worthy success. She also suggested making sure high points are recognized and highlighted, and goodies are sent out to great community members, which makes everyone feel good.

Some more tips gleaned throughout talks as the day progressed included:

  • Make sure support is as public as possible (private email threads expend time and energy, and are only valuable to the individual you’re working with)
  • Keep an eye on the percentage of questions being answered by the community outside your company vs. inside
  • Make it an on-boarding task for new engineers in your company to introduce themselves to the community, not just the company
  • Encourage employees to be open first, share things externally as much as possible
  • Be proactive about pairing up community members asking questions with community members who have expertise (question gets answered, experts feel valued, everyone wins!)
  • Define a path for a contributor to become a more serious, recognized contributor (reminded me of the value of Ubuntu Membership) and give them the tools to succeed in that path
  • Uncover obstacles that community members encounter and work specifically to get people unstuck and reward them for success
  • Communicate your work internally, sharing successes and failures so those inside the company feel included in the community
  • If you have a Code of Conduct (and you probably should!), build, use and enforce it properly

I think one of the overarching themes of the talks, especially later in the afternoon, was a focus on what to measure and how. While broad ideas like those from Jono and Bear can be a guiding force, there is no blanket solution for everyone to use, and you can't just copy what others have done. You need to be attentive to your business, community and product to know where it makes sense to focus. The path from new user to paying customer is rarely a straight road, so work through collected data to learn your most successful outlets and make sure the time, money and effort are put into those diverse places (events, blog posts, support engagement), tailored for your particular community.

It was a fun event and I'm glad I went. I hope that as I move forward in my relatively new role as a formal developer advocate I can continue to learn from the practitioners who have come before me and work to implement their lessons in the communities I'm part of. Thanks to the organizers and everyone who came to lend their expertise to this event, and in their communities every day.

More photos from the event can be found here: https://www.flickr.com/photos/pleia2/albums/72157681120795004

Cross country by train: Part 2

Continuing on from my last post, our cross-country journey continued as we entered Iowa on Sunday morning. After a day of no proper meals due to not feeling well, I’m happy to say that I managed to consume about half my breakfast of eggs, potatoes and a biscuit.

As this second bit of the ride on the California Zephyr took us into the midwest the scenery did calm down from the breathtaking views in California and Colorado. Still, it was incredibly green and there were some lovely hills here and there. We also saw a fair number of horses and cows, along with fields that I expect would be growing a whole bunch of our food later in the season.

Leaving Iowa, we crossed the Mississippi River into Illinois. I’d seen the Mississippi River on trips to New Orleans and St. Louis, but this was the first land crossing I’d done, and the furthest north. Even up there it’s quite a river.

The ride through Illinois went smoothly; we saw several wind farms and had a nice fresh air break in Galesburg. We soon left the countryside farms of western and central Illinois and made our way to the final destination of this train: Chicago.

I've been through the Chicago airport several times, and spoke at a conference south of there in Urbana-Champaign, but I'd never properly been to Chicago. In spite of not quite feeling 100%, we did have just over three hours before we had to catch our next train and I made a point to venture outside of Union Station. But first I beheld Union Station itself; it was a nice place, and we were able to spend the rest of our time in the Amtrak lounge there.

I took a walk around Union Station and then walked across a small river to get over to where Willis (formerly Sears) Tower is for some pictures. The fresh air was nice, but it was getting a bit warm outside on this side of the Mississippi! After all those mountains and glimpses of snow, I was suddenly reminded that it was the end of May.

Our next train boarded just before 7PM and whisked us off to Washington DC. This time we were on the Capitol Limited, and to mix things up we decided to go with a Roomette rather than the Bedroom on this segment. I probably wouldn’t do it again unless traveling solo. The privacy of our own sleeper space was nice, but we’re not small people and it was pretty tight quarters for the two of us. The top bunk, where I slept, was a bit smaller than in the bedroom and I was extra thankful for the don’t-fall-out net at the edge of my bed. It was only for a night and a half day though, and we weren’t tired of each other’s company, so it worked out fine.

We had dinner shortly after boarding; we were a bit disappointed to learn it had the same dining car menu as the California Zephyr. Still, the steak was good, even if I could only finish half of it. It was also enjoyable watching the sun set as we rode through South Bend, Indiana.

I didn’t sleep exceptionally well on that leg of the journey. We had a stop in Toledo, Ohio in the middle of the night that I was inexplicably awake for. I then woke up around 6AM in time to wander around the platform for a few minutes during our stop in Pittsburgh. I think this journey was a bit bumpier than much of the journey on the Zephyr and it was delayed by about an hour due to freight congestion in the corridor.

As Monday morning came into focus, I’m delighted to report that I finished my entire breakfast, though I did skip anything resembling cheese or a buttery biscuit, going instead with plain scrambled eggs and potatoes.

We made our way through more of western Pennsylvania, into Maryland and then West Virginia. It was a lot of familiar views, green forests and fields. I'd vacationed in West Virginia before, but we hugged the Maryland border in the flat areas, so there wasn't much to see until we got to Harpers Ferry, which is incredibly charming.

Our final stop was Washington DC! We had once again taken the entire Amtrak line. The station in DC was probably the most amazing of the trip. It's huge and beautiful, and included a large food court with beautiful stairways taking you between floors. I was finally feeling well enough to have a more normal meal, so I got a beef salami and cheese crepe in the food court along with a bottle of water. The tucked away Amtrak lounge wasn't nearly as nice as the new one in Chicago, so I spent most of my time wandering around the station taking pictures, and a few minutes outside to see the Capitol Building and get some pictures of the station from the outside. It was definitely May weather out there; my jacket was too much and I retreated back to the air conditioned station pretty quickly.

The final leg of our Amtrak adventure took us on the Northeast Regional. Sleeper cars behind us, we had regular business class seats on this train for the quick two hour, largely urban trip from Washington DC to Baltimore and Wilmington to finally Philadelphia!

We departed the train at the familiar 30th Street Station in Philadelphia. We had just over an hour there, taking time to relax for a bit before taking one final train, this time on the local Philadelphia transit, SEPTA regional rail. My father-in-law graciously picked us up at the station in Trevose just after 7PM and deposited us at the townhouse, just over a mile from the station.

And there we were! Coast to coast on trains. Would we do it again? Absolutely. It was expensive and time-consuming, but I love trains and the chance to fully disconnect, catch up on reading, and see the country is worth it from time to time. We weren't overly social on the train, but chats during meals with fellow travelers were always nice, learning about them and their reasons for taking the train (usually either for the experience like us, or because flying was too stressful or too much of a hassle). Even before we were off the train I was thinking about the next one I want to take. The 46 hour Empire Builder that travels north of our route, from Portland to Chicago? Or perhaps a more southerly route on the 40 hour Southwest Chief between Los Angeles and Chicago, or between the same two cities but taking 65 hours to traverse Texas on the Texas Eagle? I'll definitely be looking for an opportunity to take the Coast Starlight up the California coast. So many great choices.

More photos (almost 450 of them!) from our journey are in an album here: https://www.flickr.com/photos/pleia2/sets/72157684493832435/

Cross country by train: Part 1

Over Memorial Day weekend MJ and I decided that instead of flying to Philadelphia, we'd get ourselves a pair of Amtrak tickets and take the train instead. We had always wanted to, but the trip lasts three and a half days and it's difficult to rationalize such a long journey when we have precious little time off. Ultimately, we realized we could take the long weekend to do it, and besides, when else would we get the opportunity? Since we had a home at the end of our trip, we didn't need much luggage, which was important for the final legs of the journey that lacked checked-luggage options.

Our trip took us all the way from San Francisco to Philadelphia. It consisted of a town car to the first station, a bus, three Amtrak trains (California Zephyr, Capitol Limited, and Northeast Regional), a SEPTA regional train, and the final mile to our townhouse in my father-in-law’s car. We switched Amtrak trains in Chicago and Washington DC before arriving in Philadelphia where we picked up the SEPTA regional.

We left early on Friday morning, taking a 7:50AM bus from the Amtrak station at the temporary Transbay Terminal in San Francisco. This connector bus was part of our cross-country train ticket, a necessity since the train doesn’t come to the San Francisco side of the bay.

The bus delivered us to Emeryville station where the California Zephyr begins the journey across the country at 9:10AM.

The train arrived on time, and in spite of the process of people herding that hasn’t changed since the 19th century (a woman shouting out where people should stand in line based on the car number they’re in), we all seemed to get where we needed to go and boarded the train.

For this 51 hour journey to Chicago we went with a bedroom on one of the sleeper cars. It includes a couch that folds into a bed, another bed above it that folds down bunk-bed style, a chair and a private little toilet/shower room, which I ended up being incredibly grateful for. Big windows in the room allow for great views throughout the journey.

California was beautiful. I spent Friday morning doing as much work as I could before we lost internet access, which I had via a hotspot (no WiFi on the train). I got a lot done, but by the time we got deep into the mountains it was too spotty to do much online and I switched to reading. We’ve visited the Sierra Nevadas before, but going by train was a whole different experience. It was also fun to be so far up in the mountains where there was still snow on the ground.

We crossed into Nevada in the late afternoon, first going through Reno and then through the vast stretches of brown-colored landscape that nonetheless rivaled some of the other mountains I've seen around the country.

There were also stretches in Nevada where water poked its way in, showing us a wet and green foreground, with the backdrop of rolling brown hills, a scene we enjoyed during dinner.

Meals on the train are served communally, so for each meal we’d have a couple across the table for us to chat with while we ate. As bedroom customers our breakfast, lunch and dinner were included in the price of our ticket. The train was pretty full throughout our journey, so for breakfast and lunch we typically had to get on a waiting list which they’d call out numbers for as they seated groups. Dinners were reservation-based, which you’d book in the afternoon.

The first meal we had on the train was lunch. Sadly for us, the menu was a bit pork-heavy (lots of bacon), but it was ok to eat around the pork options, both of us having a hamburger for our first meal. It wasn’t the best hamburger I’ve ever had, but it was fine. They served dessert with lunch, I went with the lemon tart. For dinner on the first night a salad was waiting at the table when we arrived, and then we both had steak which was quite good! I had shrimp with mine. For dessert I made it about halfway through a chocolate cake before realizing I wasn’t feeling very well.

The night didn’t go well for me. The train rode through Utah as I had intestinal problems that took me to the bathroom frequently, but couldn’t be explained by motion sickness. No one else on the train was sick, so I’m thinking I boarded the train with a bug or some food poisoning. By morning I was through the very worst of it but my stomach was still upset, so I skipped both breakfast and lunch as we made our way through Colorado. I wasn’t feeling well enough to spend much time reading, but just taking in the sights and relaxing was perfect for me, dozing off here and there as I tried to give whatever sickness I had some room to recover.

And enjoy the sights I did. The first parts of Colorado we saw were a bit more green than Nevada, but still had the beautiful brown/red hills.

We had a late afternoon stop in Fraser, Colorado where MJ insisted I get off the train for a few minutes for some fresh air. It was a brilliant idea, in spite of being sickly and pale as I hobbled off the train, the fresh, cool air made me feel a lot better. From there it was further up into the mountains as we approached Denver, giving us some of the best glimpses of Colorado mountains and even a bit of active May snowfall as we came out of one of the tunnels!

That evening we made our first major stop at a Union Station, this one in Denver. We had about a half hour at the station, which gave us time to leave the train and explore a bit. It was a really nice station, clean and open, with lots of fancy shops and trendy places to eat. It was a bit too trendy for me though; I wanted some simple crackers to munch on and ended up with a small, five-dollar box of fancy crackers instead. Still, they did the trick.

We re-boarded the train around 7PM, just in time for our dinner reservations. I still wasn't in shape to eat much, so dinner that evening for me was a dinner roll and some sparkling water. It was OK though; in spite of not feeling well, I was still enjoying the trip, and if I was going to be curled up on a couch not feeling well, it might as well be one on a train!

Over night we crossed into Nebraska, waking up the next day in Iowa.

DevOps Days in Salt Lake City 2017

I was in Salt Lake City for OpenStack Days Mountain West back in December, it was the first time I’d ever been to SLC and I certainly didn’t expect to return so quickly. Still, back in early March one of the organizers for Salt Lake City DevOps Days reached out to me and asked if I’d be interested in giving a keynote for the event. After some brain-racking as to an appropriate topic, I happily agreed to join them to talk about “The Open Sourcing of Infrastructure” which is part history lesson, and part learn-from-history lesson.

But before I talk about that, let me say a few words about Salt Lake City. When I was there in December I didn’t have a great opportunity to really take in how beautiful it was there. I walked around Temple Square and admired the Christmas lights and buildings, acknowledged the mountains, but my heart was elsewhere as I worked through a difficult time. This time I was in a better place. As I rode past the city and into South Jordan, UT where the conference was being held, I really got to check out the scenery. The whole area is surrounded by mountains, which were snow-capped even in May. It’s really something to wake up to, and be reminded of every time you look out the window. Beautiful mountains, right there!

The conference itself was held at Noah's Event Venue, a great space that easily accommodated the 400 attendees, with a large auditorium on the ground floor, and several rooms throughout the space for open spaces and workshops in the afternoon. The sponsor room could have been bigger; it was a bit overwhelming crowd-wise when I ventured in a couple times, and the sponsors were squished in pretty close to each other. Everything else went well though: the lunch lines moved quickly, and the outdoor-ish space where we ate gave us a lovely view of the mountains (and was even better when they brought in some heaters the evening of the second day!).

This is the second year of this conference, and last year they established a tradition of having a stuffed animal mascot. Last year it was a unicorn and this year it was a yak (a la yak shaving). Obviously I had to get my picture taken with both of them. They also sat up there on the lectern during my talk, hooray!

Talk-wise, there were a few that stood out for me. The first was the opening keynote for the event. They brought in Ross Clanton, formerly of Target, but now at Verizon. I had the opportunity to meet and chat with him and the closing keynote speaker, Gwen Dobson, at the speaker dinner prior to the event. As we figuratively compared prep-for-our-keynote notes prior to the conference, I was certainly eager to hear from both of them.

Ross began his talk by giving some DevOps methodology background, but the meat of what was interesting to me was the strategies used inside of Target and Verizon to really drive the DevOps model. Executive buy-in was essential, but from there you also need management to take training seriously, in several forms. You don't teach an organization to adopt DevOps by reading a book and expecting an overnight transformation. Instead, you need varied methods of moving the organization forward and celebrating wins. He suggests:

  • Encourage collaborative learning environments where peers teach peers as much as instructors do, and it’s OK to fail and ask questions
  • Run internal DevOps days, bring in a couple outside speakers but also internal folks who have expertise and stories to share
  • Host six-week immersion engagements (“Learning Dojos”) where teams work on their actual backlog using DevOps strategies and have the freedom to learn and ask questions, while solving real problems, not examples created by instructors
  • Gamification of team progress, where teams get points for various DevOps skills and capabilities they’ve started incorporating into their work and are rewarded (Verizon has the DevOps Cup, like the Stanley Cup, awarded each year!)
  • Even if you aren’t winning a DevOps cup, make sure management knows how important it is that they acknowledge and celebrate any positive progress made toward the adoption of DevOps principles
  • Don’t fight people who resist change in your organization, instead do awesome things with your allies, make progress, and most of the nay-sayers will join you eventually

Later that morning we heard from Rob Richardson on “CI/CD on the Microsoft Stack.” This was interesting to me because in spite of my own aversion to proprietary software, I do understand that CI is important for the entire software industry and had been remiss in ever looking into what is available for developers on Windows doing .NET programming. His talk walked the audience through setting up a CI/CD pipeline using TeamCity for CI and hooked into Octopus Deploy for CD (note: both proprietary) that are available and have support for Microsoft-focused environments, and specifically .NET in the case of Octopus Deploy.

Now, I won’t say that this is immediately valuable to me in a practical sense, since I don’t use any of these tools and am uncomfortable building infrastructure tooling around proprietary solutions anyway, but I was appreciative for the broadening of horizons. I learned that there are easy CI/CD options for folks working in the Microsoft world, and adoption of them by people outside of my open source bubble will make the software world better for all of us.

That evening I met up with a couple colleagues from Mesosphere who were attending the conference! Sam Agnew works in sales and joined us from his home base of Denver to meet with folks at the conference and Tim Harper works in engineering on Marathon remotely from a city just south of SLC. After the evening social at the event venue, we all went out to enjoy a nice meal of Mexican food and some drinks. They’re both super friendly and easy to talk to, so it was fun to get to know them a bit. I also found great value in chatting with them about Mesosphere and DC/OS, they believe in the company and products as much as I do, but don’t have the Silicon Valley slant on their opinions and observations about where we’re going.

The second day I gave the opening keynote. During this talk I guided the audience through the past couple decades of infrastructure with an eye on the shift from proprietary to open source software. From there I focused on what we’re open sourcing on the operations side today, and things to consider as we once again become dependent on proprietary technologies, even if they are “in the cloud” this time.

I stopped short of flat out telling people not to use proprietary tooling, or to never consider building their applications into proprietary, hosted APIs and tool kits. A lot of companies successfully do this and a lot of the sponsors at the event make their money by providing hosted products that make sense for them. Instead I implored them to think about their choices carefully, and provided a list of things to think about, including the risk of vendor lock-in, price increases, security and reliability concerns and understanding if/how your (and your customer’s!) data will be used by the vendor. Looking back, these were the same things we asked ourselves a decade ago when we shifted to using Linux as the infrastructure platform of choice. Slides from my talk are up here (PDF).

Thanks to Sam Agnew for taking a picture during my talk!

The final talk that really stood out for me came from Rob Treat who spoke on “Production Testing through Monitoring.” There is a lot of focus in the DevOps world around CI with testing, but the truth is you’ll never find all bugs through testing. He shared a handful of funny (but serious!) examples where once in production, users did things that the developers never thought of that caused serious production issues. This wasn’t because they weren’t testing, but instead because our imagination will simply never come up with every potential use, or misuse, of the software we’re building. This is where monitoring and metrics become essential.

As someone with an operations background who really likes monitoring (I run Nagios at home), this seemed obvious to me, but he took it one step further to make it something worth noting: You don’t just monitor basic things like CPU heat, processes running and return codes (in fact, you might be tracking too much of this kind of stuff), you also track things that make sense for your particular business. This returned me to the talk by Jeffery Smith at the DevOps Days in Seattle where he stressed the importance of IT actually understanding the business.

Rob demonstrated by walking us through an example of using metrics as they tried to figure out why traffic and sales were lower than normal for a couple days. After looking through a bunch of technical reasons, they finally overlaid email bounce statistics over the data and learned that for a couple days, bounces were higher than normal. Since much of the company’s sales traffic is driven by these emails, that caused a clear problem on those days. Having the data to draw that conclusion was vital, but they wouldn’t have known to collect that data if they hadn’t been tuned into how the company drives sales and the fact that tracking something like email bounces would be valuable.
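The technique Rob described can be sketched in a few lines: once you collect a business metric like email bounces alongside sales, even a crude comparison surfaces the bad days. This is my own illustrative sketch with made-up numbers and thresholds, not anything from his talk.

```python
# Hypothetical sketch: overlay a business metric (email bounces) on
# daily sales to flag days where both moved badly at once. All data
# and thresholds here are invented for illustration.

def flag_anomalies(sales, bounces, sales_drop=0.8, bounce_spike=1.5):
    """Flag days where sales fell below 80% of the period average
    while bounces rose above 150% of the period average."""
    avg_sales = sum(sales) / len(sales)
    avg_bounces = sum(bounces) / len(bounces)
    flagged = []
    for day, (s, b) in enumerate(zip(sales, bounces)):
        if s < avg_sales * sales_drop and b > avg_bounces * bounce_spike:
            flagged.append(day)
    return flagged

sales   = [100, 98, 102, 55, 60, 99, 101]   # units sold per day
bounces = [20, 22, 19, 80, 75, 21, 18]      # bounced emails per day
print(flag_anomalies(sales, bounces))       # days 3 and 4 stand out
```

The point isn't the arithmetic, it's that you can only write this check at all if you already knew email bounces drive sales and started recording them.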

Huge thanks to the organizers of this event. They did a great job making us feel welcome and making sure we had everything we needed. As speakers we also got amazing Utah-themed gift baskets which they graciously offered to ship to us (couldn’t bring it on the plane due to liquids involved, and I didn’t check a bag). The attendees were great too, everyone I spoke to was very friendly, even after they found out what strong feelings I have about using open source and open standards, hah!

More photos from this event here: https://www.flickr.com/photos/pleia2/albums/72157681808549041

Outdoor Caligula, trains, MST3K and eateries

Back when I lived in a house in Schwenksville, Pennsylvania, I would often bring Caligula outside with me in the warmer months to work in the garden or just generally relax outside. He had a 50 foot lead that allowed him to explore, but not get close to the road or into the poison ivy-ridden woods. He enjoyed these visits to the outdoors, chasing chipmunks and laying in the grass in the sun. Simcoe was less interested in outdoor time, in spite of numerous attempts, she was always a bit too afraid and didn’t like wearing a harness.

Young Caligula, gardening in Pennsylvania

Fast forward to today. Caligula has been living in a high rise in downtown San Francisco for over seven years! We haven’t brought him out during all this time. I’d loosely mention taking Caligula out to a park here and there, but Simcoe didn’t like being left alone and she’d often react badly when we brought Caligula home from the vet (hissing, growling, for days!). And I figured she still wouldn’t be interested in coming along for the outdoor adventures. Now that we have just Caligula, it was time to revisit outdoor adventure plans. This past weekend we brought him to Golden Gate Park, where we found a quiet patch of grass not too close to anyone else and enjoyed some food (picked up from a Mexican food truck) as Caligula wandered around on a short leash.

We weren’t sure what to expect. I’d never brought him to a public park before, and I’m sure the car ride over wasn’t his favorite thing, but he loved it. My often lazy cat spent the hour and a half there wandering around our blanket, and then dragging me around so he could explore further.

Caligula in Golden Gate Park

Eventually we rounded off our day as the wind picked up and it got a bit cooler, but I’m really happy that he had such a nice time. I know I’ve been pretty down since losing Simcoe, and I think he’s really missed having his snuggle buddy. It was a good way to cheer all of us up.

I’ve mentioned that 2017 has been a tricky year for me, but I’ve started to feel better. Instead of spending so much non-work, non-traveling time watching TV, I’ve transitioned back into reading. My interest in other hobbies has picked up too; I’ve started moving away from so much computer work and decided to get more serious about my interest in model trains. When I was in Philadelphia last time I picked up a starter train set at a toy show, and I’ve now started to refresh my memory on some of the other basics. I subscribed to Model Railroader magazine, and am now somewhat overwhelmed with how much opportunity there is to learn and explore. I’m also struck by the fact that hobby-wise I’ve mostly focused on digital and outward-focused projects. This will be one of the first that gets me back to hardware, but it quickly occurred to me that it can be pulled into a bunch of the electronics projects I’ve idly wondered about over the years. Arduinos and sound-activated controls for a model railroad set? It’s totally going to be a thing!

Increasing the scale, we decided to go back to Philadelphia over the week of Memorial Day. As we were musing about travel, my interest in trains distracted me into talking about cross-country railroad trips, and MJ seriously suggested we finally do it for this trip. After geeking out over routes for a couple hours, MJ secured tickets for us on the California Zephyr, which we’ll take the entire length, from Emeryville to Chicago, in one of the bedroom compartments. From there we’re taking the Capitol Limited to Washington DC in a Roomette and then the Northeast Regional to Philadelphia in Business Class seats. How long does this trip take, you ask? We’re leaving from San Francisco’s temporary TransBay Terminal at 7:50AM on Friday the 26th and arriving in Philadelphia at 5:15PM on Monday the 29th. From there we’re taking the SEPTA regional rail from 30th Street Station in Philadelphia up to Trevose, where the train drops us just over a mile from our townhouse. So it takes a long time, and the train is not cheaper. Traveling how we are, in the bedroom and roomette is actually considerably more than flying. For us, it’s all about the experience. I’ve not seen much of the center of the country, and there are beautiful places I’m missing out on. Taking a train through it over the course of a few days is a pretty exciting proposition, and I’m really looking forward to it.

With all this train stuff, I realized over the past year how much more adventurous I’ve gotten with rail-based public transit. I’m slowly starting to default to it where it makes sense time-wise, and I’m sad about missed opportunities to take it in the past.

I also recently finished reading Train by Tom Zoellner. He takes several journeys on train lines all around the world, and weaves a tale that blends his experience on these routes, conversations he has with fellow train passengers and a hefty dose of history about each line, and those which are naturally related to it in some way. It was a beautifully written book, and made me even more excited about our cross-country journey! I recently finished the audiobook for Ringworld. I read the book years ago, but never really got into the series. I decided this time around to buy the series as audiobooks and start making my way through them. I also got an audiobook of If the Oceans Were Ink: An Unlikely Friendship and a Journey to the Heart of the Quran, which has so far been incredibly engaging. Back to the pages, I’ve been reading Madeleine L’Engle’s The Arm of the Starfish and my second book by Brené Brown, I Thought It Was Just Me (But It Isn’t).

But OK, I’m not just spending lots of wholesome time reading. The new season of Mystery Science Theater 3000 (MST3K) came out several weeks ago and I’ve been doing my best not to binge watch. I slowly made my way up to 1105, the episode that has my name in the credits because of the Kickstarter campaign. I then went through the next few pretty quickly, they’re just so good! And MST3K has been an important part of my life since I discovered it in the late 90s on the SciFi channel. I don’t remember how I found it, I must have just stumbled upon it in my general watching of the SciFi channel. It’s what made me join my first IRC server to chat with fellow fans. It was there that I met my ex-husband who introduced me to Linux, and I dove into IRC client scripting and creating websites. Later I helped a pile of fellow fans run an MST3K fan site, which was tricky after the show stopped airing, but gave me my first experience scouring the internet for stories, which I later used in my work on the Ubuntu Weekly Newsletter.

I had my doubts about a reboot of the series: on the one hand we had many of the original cast and crew members participating, but on the other they suddenly had big names and cameos being announced as part of the project, and there was a real risk of the show getting more serious than I would enjoy. Thankfully, my fears were not realized. The show is just as silly and campy as it ever was. They didn’t let a budget or big names go to their heads, and it has the feel and jokes that I came to expect from MST3K.

At home things are chugging along. As I write this on an early Friday morning before work Caligula is in super snuggle mode and is curled up against me. He’s been like this since we lost Simcoe. We think he’s lonely, as my trip to SLC this week didn’t leave him the happiest (MJ was at work all day). There is a temptation to get him a new kitten friend, but every time I think about it I get sad and realize I’m not ready for it. Plus with all my travel lately I don’t really have the time to train a new kitten, who will have claws.

Speaking locally, this past month we’ve seen the closing of two Italian establishments in our area. A.G. Ferrari has closed all Bay Area locations. It’s a shame, that was my go-to spot for fresh Parmesan cheese and Italian bread. Umbria, my favorite Italian restaurant in the city, and conveniently on our block, has closed. We made our way down there on their final night, finding ourselves in the midst of other random diners, as well as family and friends wishing the owner a fond farewell. There were speeches, stories, hugs, and tears, which we were included in. Thankfully this is not the end of the story for them! They’re moving up to Glen Ellen in Sonoma, with progress being tracked on their #WheresGiulio website. We’ll have to visit when they finally open, but I’ll really miss having such a great local place.

We’ve also been carving out bits of our weekend to actually catch up on boring adult things. Our dining area has always been a den of chaos, and I’ve finally started tackling that by picking up a new piece of Ikea furniture so we have a place to pack things into. The chaos still mostly exists, but it’s starting to be tamed and some things are now put away, hooray!

I think this weekend will be a stay in one. I have a ton to do here before I depart for two weeks. And a busy work week is on the horizon with attendance at DevXCon on Monday and a journey (ferry + car service) up to Napa on Wednesday to speak at a conference on Thursday. Then the rise-with-the-sun trek over to the TransBay terminal Friday morning to catch that train across the country. It’s all exciting stuff though, I wouldn’t trade next week for a boring one even if I could.

DevOpsDays Seattle 2017

At the end of April I made my way up to Seattle for DevOpsDays Seattle. It occurred to me upon arrival that while I’ve spent the past several years very close to DevOps circles and methodologies, this was my very first DevOpsDays! The crew organizing the Seattle event was definitely a great introduction; in spite of the gender ratio that always plagues these events, attendee-wise I felt safe and welcome. They also had a diverse selection of speakers without sacrificing quality (Something I tell people all the time is totally doable! Here’s the proof!).

Bonus: My walk to the event both days gave me a great view of the Space Needle. So pretty.

The two-day event had the format of a single track all morning, a talk just after lunch, and then Ignite-style talks (5 minutes, 20 auto-advancing slides). From there attendees had the option of one last talk in the main auditorium, or to join fellow attendees in a more interactive series of open spaces (unconference). Put together by the attendees, unconference topics were whatever people had proposed earlier in the day and wanted to have round-table discussions about with their peers at the conference. The open spaces then continued through the end of the day.

I won’t give an overview of all the talks, but I do want to highlight a handful that stood out for me.

The first day we heard a talk from Suzie Prince titled “Continuous Integration: A Bittersweet Love Story”. I wasn’t sure what to expect from this talk, but I was eager to hear from her since CI is so near and dear to my heart. She began by discussing two of the most important things about CI: collaborating on master/trunk (rather than your own branches) and committing code daily (or more!). Coming from the OpenStack world, this wasn’t news to me. Yeah, this is how we did CI! Great!

The big reveal of this talk was that that’s not how everyone does it. In fact, based on some research she did last year, most people do CI wrong and suffer in ways they wouldn’t if they were doing CI properly. The research asked a variety of questions about what people knew about CI and what the pain points are. It was quite astonishing for me to hear some of the results; it sounds like we’ve done a poor job as a community of explaining CI and making sure organizations are implementing it correctly. A blog post about their findings is up here: No One Agrees How to Define CI or CD.

Full video of the talk is available on YouTube, here. I recommend watching it if you’re interested in this topic, her presentation and slides do more justice to the topic than my summary!

My talk was that afternoon. It was my first time giving a Day 2 Ops talk, and I spent a lot of time while preparing it working out how to communicate the right message without being patronizing. Essentially, things get complicated when looking at cloud-native systems where you have an underlying platform (whether it be bare metal or a cloud provider), then whatever you’re running your application in (container?) and then your app itself. You need to be able to get metrics about what all the layers are doing, maintain some kind of monitoring system that understands the setup and can dynamically adjust as your system grows, have a way to access logs and troubleshoot problems down all the layers, and have a system for maintaining everything. Plus, you want to give the appropriate access to everyone in your organization based on what they are working on: developers want access to their applications, while operators of the physical cluster may need access to the infrastructure but need to know less about the applications.
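As a rough illustration of the layering idea (my own sketch, not anything from the talk), tagging every metric with the layer it came from lets one monitoring pipeline answer questions for both the app developer and the cluster operator:

```python
# Hypothetical sketch: a single metric store where each reading is
# tagged with the stack layer it came from. Names and values are
# invented for illustration.

from collections import defaultdict

def record(store, layer, name, value):
    """Record one metric reading under a (layer, name) key."""
    store[(layer, name)].append(value)

def layer_summary(store, layer):
    """Average each metric recorded for a given layer."""
    return {name: sum(vals) / len(vals)
            for (l, name), vals in store.items() if l == layer}

metrics = defaultdict(list)
record(metrics, "infrastructure", "cpu_percent", 40)
record(metrics, "infrastructure", "cpu_percent", 60)
record(metrics, "container", "restarts", 1)
record(metrics, "application", "request_latency_ms", 120)

# An operator slices by "infrastructure", a developer by "application".
print(layer_summary(metrics, "infrastructure"))
```

Real systems attach these labels in the metrics pipeline itself, but the access-control point from the talk falls out of the same structure: you can grant each role visibility into just the layers it needs.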

I had some good talks with folks after this talk, several admitted their organizations accepted the turn-key offering of easily running apps and really got into trouble when things went sideways and they had to debug the actual issue down the stack. No one cares about metrics, logging and troubleshooting until something goes wrong, but more care should be put here in the planning stages, since it does take time and attention, and ultimately it’s all pretty important.

Slides from my talk are up here (PDF) and the video is on YouTube here. I’d like to give this talk again; based on feedback from folks who have seen it, I could use a more formal checklist of things to consider when building a cloud-native system. Plus, I’ll add some discussion of integration with existing platforms. We all run complicated things with many moving pieces, and no one wants yet-another-tech-specific-dashboard or non-standard tooling that only works when it’s assumed to be running in isolation.

The second day opened with a talk from Jez Humble on “Continuous Delivery Sounds Great But It Won’t Work Here”. This was a really fun and inspiring talk (though I had heard some of the examples before). He began by going over the top reasons people claim they can’t do CD in their org:

  • We’re regulated
  • We’re not building websites
  • Too much legacy
  • Our people are too stupid

His general premise was that these “excuses” for not doing CD in an organization are surmountable with the right culture, and he walked the audience through examples that proved this:

  • Checks for compliance can be put into your CI pipeline
  • HP’s printer division wasn’t building websites either, but saw significant improvements once it adopted CD methodologies
  • Legacy applications should never hold the rest of the org back, and new things should be built to meet new goals (like CD!)
  • A car production line example showed how the same employees did higher quality work once their culture changed
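The compliance-in-the-pipeline point is worth a concrete sketch. The idea is to encode a regulatory rule as an automated check that gates a CI run instead of relying on a manual review; the rule and config format below are my own invention for illustration, not from the talk.

```python
# Hypothetical sketch: a compliance gate that a CI job could run
# against a deployment config before allowing a deploy. The rules
# and config keys here are made up for illustration.

def check_compliance(config):
    """Return a list of human-readable violations (empty = compliant)."""
    violations = []
    if not config.get("encryption_at_rest"):
        violations.append("encryption at rest is disabled")
    if config.get("log_retention_days", 0) < 90:
        violations.append("log retention below 90 days")
    return violations

config = {"encryption_at_rest": True, "log_retention_days": 30}
problems = check_compliance(config)
if problems:
    # A real CI job would exit non-zero here to fail the pipeline.
    print("FAIL:", "; ".join(problems))
```

The payoff is the same one Jez described: the auditor's checklist becomes code that runs on every change, rather than a quarterly meeting.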

Super interesting stuff. Video of his talk is available here.

I also want to highlight a talk by Jeffery Smith on “How to Elevate Your Contributions as an Ops Engineer”. He very correctly pointed out that IT teams are often very insular and so focused on the tech of the infrastructure that they don’t poke their heads out to really understand the business, or the specific value they’re providing. He walked through several examples of engineers in a company taking a broader view of the company and what it needed, and being able to make a direct impact on the bottom line since they understood where things were going. Plus, this helps you too. He suggested that specific technologies come and go, they get automated or commoditized, and suddenly knowing how to configure something is not as valuable. You bring value by understanding the industry and helping people outside your specific sphere get their work done too, and proving that up the chain. He’s a great speaker so I recommend watching the talk for yourself! It’s up here.

Then there were the Ignite-like talks! There were a bunch of great ones, but two really stood out for me, and since they’re only 5 minutes each and really fun, you should just go watch them:

Finally, huge thanks to the organizers of DevOpsDays Seattle. They were really friendly, and I got a kick out of my name being on the back of the conference t-shirts. Usually that’s where conferences put the sponsors! But sponsors get their names on plenty of things, this was a great way to make the speakers feel like rock stars :)

All the videos are up on a YouTube playlist here and more photos from DevOpsDays Seattle 2017 that I took are here: https://www.flickr.com/photos/pleia2/albums/72157680011962503

I’m now about to get on a plane to attend and speak at my second DevOpsDays. This time I’m headed off to Salt Lake City! Here I’ll be speaking on “The Open Sourcing of Infrastructure” on Wednesday morning.

My magical smartpen

If you’ve ever seen me in a talk at a conference, you know I take notes. It gives me a record to blog from later, and physically writing notes helps me with memory retention. I also carry around a paper notebook in my purse to jot down random stuff (cat’s weight at the vet, space measurements for when I go to Ikea, to-do lists created when we’re having brunch and planning out our afternoon). The problem with this is that the contents of these notebooks aren’t captured anywhere digitally. I’m not going to transcribe this stuff after I use it, so I end up thinking things like “it would be handy to know the size of that space next to the counter, but darn it, I left that notebook in my other purse!” or “I’d like to finish that blog post at work today, but I left my conference notebook at home.”

Enter the smartpen.

You write with this magical pen in a special paper notebook and suddenly you have paper notes AND they sync to an app on your phone. From there you can read and transcribe the notes, export them in various free formats, and auto-sync them with a handful of proprietary services.

A bunch of people have asked about my experience, so welcome to the rare blog post I’m writing about a product. I’m not being given an incentive by the companies I mention to write about it, and I probably wouldn’t write about it if I was.

My journey began when a colleague of mine clued me in to the existence of the Moleskine Smart Writing Set back in February while we were at Spark Summit East in Boston. From then on, I had a bit of a bug in my ear about it. I wandered over to the Moleskine shop nearby a few weeks later to try it out, and ended up semi-impulsively buying it there. I say semi-impulsively since I didn’t do as much research as I normally would have for such a thing, and in retrospect I could have gotten individual pieces (pen, notebook) for slightly less elsewhere. But it wasn’t much cheaper, and I did try the product out in their brick and mortar store, which was a valuable pre-buying experience and I want to see stores stick around, so I don’t mind spending my money there.

Regardless, I had it, and they had a no-return policy once I opened it, which of course I did as soon as I got home. It comes with the following things:

  • 176 page paper notebook with the special dots needed to work with the pen
  • The Moleskine-branded Neo smartpen N2
  • 1 pen tip ink refill
  • USB charging cable

I set up the Moleskine app, jotted down a few notes, and immediately realized I had made a mistake. You see, the pen is just a branded Neo smartpen and if you use the Neo smartpen app, you can use notebooks that aren’t made by Moleskine! Now, while I’d be happy to use just the lovely Moleskine paper pads (in spite of the tremendous price tag, they are nice), right now they only make them in the large size. Not awesome for my purse. Neo directly has lots of notebooks! Including the super cute N professional mini, which now lives in my purse. Oh, and the apps are nearly identical, Moleskine just branded theirs.

Moleskine Paper Tablet N°1 that came with the kit, Neo smartpen and Neo N professional mini

Now, the playing around was behind me and I had everything all set up, time to take this show on the road!

My first conference

I spoke at an Apache Flink conference in early April, and that was my first opportunity to use my shiny new smartpen. I charged it before hopping on the bus to the conference. I took a bunch of notes and it worked quite well.

The weight and size of the pen weren’t a problem for me, I didn’t really notice I wasn’t writing with a normal pen, though I admit I don’t have small hands. I was able to open up the app on my phone and watch writing happen, cool! Or just write a bunch and let it sync up later. The pen claims to store 1000 pages of writing, so syncing frequently doesn’t seem to be something that’s required unless you want to, but it does sync all the pending stuff for all notebooks when you do go to sync it.

I was pretty happy with this trial run, but it did immediately make me realize a few things about the pen that I wasn’t too keen on.

What I don’t like about it

The first three things I don’t like, but I think I can live with or work around:

  • The app isn’t great, it’s kind of confusing
  • All the auto-save options are proprietary (Evernote, Adobe Creative Cloud, Microsoft OneNote)
  • The notebooks are expensive, $30 for the large Moleskine, $14 for the little Neo notebook in my purse

In spite of the app being a bit of a mess, it is basically usable. I’m not sure I figured out how to properly get the backups going to Google Drive (I think I did…?). I’m somewhat worried about data transfer if I get a new phone and have to move content over from the app. The documentation isn’t great on the Neo smartpen website, so far I’ve noticed that it’s not always updated to reflect the latest version of the app. There are also a few little wizards that pop up to explain how to do things, they’re annoying until you realize you actually need them to use the app effectively, which is even more annoying.

In spite of not liking using a proprietary platform for auto-save I don’t have a practical problem with using them now and then, after all, I do use G Suite quite a bit. Practical concessions can be made.

All proprietary auto-save options :(

Plus, even if auto-save is going to a proprietary place, it’s not the only export option. You can export individual pages as PNG, PDF, SVG or TXT (it gets the OCR treatment) and then email them to yourself, upload them to Google Drive, or a few other places (depends on the apps you have installed).

The cost. Eh. I don’t go through these very often, so I can stomach the price of the notebooks once a year or so. Plus, they are really nice.

I could see any of the next three being a problem for me that causes me to stop using it:

  • I have to remember to CHARGE my pen (“what are you doing?” “charging my pen” “uh, ok, that’s a thing now”)
  • I have to remember to BRING my pen, and the special notebook
  • I can’t just use random cute notebooks, I have to buy expensive Neo smartpen notebooks

One of the reasons I attached myself to a paper and pen is because it’s simple and doesn’t require any technology. And I get free notebooks at conferences pretty frequently; it’s fun to use the various sizes and formats they come in, and starting a new notebook when I fill the last one up is a small pleasure. The complexity of now making sure I charge and have yet-another-device, and a specific notebook, is a challenge, particularly since I use the pen with both my conference and purse notebooks. If I leave the pen in the wrong bag? No notes for that day!

Finally, there are a few unknowns. What happens if my pen dries up in the middle of a conference? I can’t just grab another pen! I do have a spare tip, and you can order more, but I haven’t yet started carrying them with me. What happens if Neo smartpen goes away as a company? Or stops supporting my device? I can make backups, but it puts me in a tough spot for long-term support of my shiny new system. I also don’t know how well this all works if you have multiple pens, if I did decide to throw down another $150-170 for a second pen that only lives in my purse, can the app cope with two pens being linked? I don’t know! Can I switch which pen is going to which notebook? I don’t know! The inflexibility and confusing-ness of the app is quite a concern here, I’m somewhat worried that doing something unexpected will cause me to lose notes, or have a disjointed experience in the long run with notebooks being digitally split up.

General usage

That’s a lot to complain about, and I’m honestly not sure about this all long term, but the geek in me is in love. I love gadgets and it’s really cool to finally have a digital record of the copious notes I take at conferences. No more are they just stashed in a drawer, never to be seen again once I’ve completed a notebook!

It’s also so great to be able to leave my paper notebook in my conference backpack and not slog it back and forth to my desk or the office when I want to write a blog post that references them. I just load up the app in my phone to browse my notes, or have a peek via Evernote on my desktop. This also means that my conference notebook pretty much lives in my conference backpack, less risk of forgetting it. Also, if I lose it I’ll still have a digital archive.

I’ve now used it at Flink Forward, DevPulseCon and DevOpsDays Seattle. I can’t speak strongly to the battery life, since it’s been pretty reasonable so far and I didn’t charge it between the latter two conferences: it lit up when I needed it to and still had 80% charge at the end. I do also usually carry a little battery with me for emergencies for my phone, noise-cancelling headphones and other random devices anyway.

The automatic transcription is pretty decent, I have tried to be a bit less sloppy with my writing, but it’s confused by industry terms. It’s good enough to correct after the fact though, so it gets most of the job done and I just need to pop in for edits. This will be very useful if I do decide I want to formally transcribe anything I write.

In all, the experiment has gone decently well and I’m looking forward to skipping off to Salt Lake City tomorrow for conference number four with my shiny new pen and notebooks!

Quince and Hamilton

I love musicals. As a youth I started off with Disney full-length animated features, buying and becoming obsessed with the soundtracks. I then graduated to Rodgers and Hammerstein via the classic movies: South Pacific, The King and I and The Sound of Music… When Hamilton started picking up steam, I was right there to lend my ear to the original Broadway cast recording. Over and over again. When the Hamilton Mixtape came out in December I was thrilled. So good.

Then MJ surprised me with tickets to see it in San Francisco. I was over the moon! We went with a couple we’re friends with on Saturday.

Prior to the show, we had reservations at Quince for dinner. It’s the newest San Francisco inductee into the Michelin three star club, but it had been on the list with fewer stars for a few years. Now, we had just been to a Michelin-starred restaurant the weekend before, and going two weekends in a row is highly unusual for us. We might go to one per year; it’s an expensive meal and I like taking the weeks afterwards to enjoy the memory. I wasn’t going to say no to an amazing meal with some friends though ;)

In order to get to the show in time, we secured a 5PM reservation and let them know about our time constraint, shortening the typical 3 – 3 1/2 hour meal window to just 2 1/2 hours, which they were able to accommodate. We also learned that there was a nearby table with the same plans.

The meal was a multi-course set tasting menu, advertised as “Contemporary Californian and Italian” cuisine. The focus was on seasonal and local, with a handful of delicate pasta dishes and a couple featuring asparagus.

With a show ahead of me, I skipped the wine pairing and just had a single glass of Riesling to accompany my meal. It was a nice, sweet choice that went well with the dishes, all of which were as exceptional as expected. The caviar dish was probably my favorite, but my love for pasta made the whole meal quite enjoyable.

We made our exit just after 7:30 and got to the Orpheum Theatre just in time to get to our seats for the 8PM show. We had great seats, nearly centered on the stage: aisle seats in the front row of the Mezzanine.

I may have teared up when the show opened. And several other times throughout the show. It was everything I was hoping it would be! Satisfied left me Helpless. I really enjoyed the actor who portrayed Aaron Burr, he kind of stole the show for me.

Since it was my first time seeing the production played out (not just listening to the soundtrack) I was also able to catch a bunch of things, like how hilarious King George is, and the very opinionated portrayal of Thomas Jefferson, which landed him in “bad guy” territory.

We had a great night, I’m so glad we went.

It is possible to get tickets for showings at the Orpheum now, with a handful of available seats here and there. They also are still running the next-day lottery for the chance to win a pair of $10 tickets.

There are some more photos from the evening here: https://www.flickr.com/photos/pleia2/albums/72157680336767594

DevPulseCon 2017

Back on April 20th I had the pleasure of attending and speaking at my first DevPulseCon, put on by CodeChix. I’ve worked with CodeChix before, back in 2013 I did an OpenStack tutorial in Palo Alto. Then in 2014 I went with them on the road to help with the PiDoorbell workshop at PyCon in Montreal. These experiences were all very fulfilling. CodeChix founder Rupa Dachere has a great vision for all the events she works on and always manages to bring a great team together to execute them.

This conference took place over two days, the first made up of talks and panels, where I was participating, and a training day on the second. I was invited to give a tech talk on “Using DC/OS for Continuous Delivery” and to join an afternoon panel on “Getting Your Next Job – Groundwork You Need To Do Before You Start Interviewing.”

DevPulseCon 2017 was held in the upstairs event space at the Computer History Museum in Mountain View. Rupa did the event introduction, explaining that the event was made up of female engineers from various companies around the bay area. I go to women in tech-targeted events infrequently enough that I find myself really enjoying the environment. Walking into a whole room of highly skilled women who I can geek out with about infrastructure and tooling is quite the departure from what I’m used to at tech events.

The first talk of the morning was by Mansi Narula, Senior Data Architect at eBay, who spoke about NoSQL database platforms. She gave a high-level overview of Mongo, Cassandra, Couchbase and HBase and the basic rules around how they are all used at eBay. It was interesting to learn that internally they have a database selection tool that helps developers pick which database platform works best for whatever they’re working on, based on criteria they provide, like speed, reliability and the purpose of the data store.

My talk was up next. I began with a basic introduction to DC/OS and what it brings to the Continuous Delivery equation by simplifying a lot of the underlying infrastructure. Jenkins has an Apache Mesos plugin, but despite my own background using Jenkins in past roles, preparing for this talk was my first time really getting a close look at that particular plugin. The demo I did used a Python script to drive a simple pipeline: changes made to a repository are uploaded, tested, and deployed to a web server. I customized it some for the event, having it publish a “Hello world” type post specifically for DevPulseCon attendees. I concluded by covering some of the DC/OS 1.9 features I felt were particularly applicable to folks interested in running an infrastructure platform, including strides made with metrics and logging. I uploaded the slides here (PDF); they include links to some other resources and the demo I showed.
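For readers curious about the shape of that pipeline, here is a purely illustrative sketch in Python of the commit, test, and deploy stages described above. This is not the actual demo script (that’s linked from the slides), and all the function names here are made up for illustration; the real demo ran through Jenkins on DC/OS.

```python
# Toy model of the demo's pipeline stages: commit -> test -> deploy.
# Illustrative only; the real demo used Jenkins atop DC/OS.

def commit_change(repo, change):
    """Record a change in our (toy) repository, a plain list."""
    repo.append(change)
    return change

def run_tests(repo):
    """Pretend test stage: pass if every change is a non-empty string."""
    return all(isinstance(c, str) and c for c in repo)

def deploy(repo):
    """Pretend deploy stage: 'publish' the latest change to a web server."""
    return "Deployed: " + repo[-1]

def pipeline(repo, change):
    """Run one change through the full pipeline, stopping on test failure."""
    commit_change(repo, change)
    if not run_tests(repo):
        raise RuntimeError("tests failed; stopping pipeline")
    return deploy(repo)

if __name__ == "__main__":
    repo = []
    print(pipeline(repo, "Hello world, DevPulseCon attendees!"))
```

In the real setup, each stage runs as a Jenkins job scheduled onto the cluster rather than a local function call, but the control flow is the same idea.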

Thanks to Nithya Ruff for the photos of my presentation (source)

The final tech talk was given by Gloria W., titled “IoT: Yes You Can!”, where she broadly outlined the space of DIY internet of things and then dove into some details about how you might get started. She began by talking about the constant struggle of anyone developing in the IoT space: making sure devices are provided with power and some way to communicate. From there she spoke about some of the specific tooling available today, trending toward recommending open source solutions wherever possible. She talked about using Arduinos with sensors, and I was interested to learn about the MATRIX Voice, “an open-source VOICE RECOGNITION platform consisting of a 3.14-inches in diameter dev board, with a radial array of 7 MEMS microphones connected to a Xilinx Spartan6 FPGA & 64 Mbit SDRAM with 18 RGBW LED’s & 64 GPIO pins.” How cool! Kit-wise, she advised attendees to steer clear of proprietary development kits, since they try to push you onto their platform, and instead select ones that lean toward open source and open standards. The talk concluded with a raffle where she gave away some of the devices she had brought along.

The afternoon was spent with a series of panels:

  • Getting Your Next Job – Groundwork You Need To Do Before You Start Interviewing
  • Company culture that works for YOU (not just the men in your team) – AKA “work/life balance”
  • Promotions, Visibility, toxic environments and how to deal with them

I can’t share details about these sessions because the organizers did something really novel: they asked everyone to put away their devices and not share what was said in these panels outside the conference. It allowed panelists and audience members alike to be really honest about their experiences, solutions and advice without risking being quoted somewhere. Huge thanks to the event for providing a safe space for these kinds of discussions; it was helpful, and I think we sometimes suffer from not having enough of this in our industry.

The day concluded with a small after party in the lobby sponsored by Facebook. I am often shy at social events like this, but being a speaker helps: people came up to me to chat about CI/CD and the work we’re doing on DC/OS. I also met an attendee who I chatted with about OpenStack for a while. It was also nice to connect with some of the folks I already knew at the event, like Nithya, who I frequently fail to connect with at events and at home (both homes!). She spends time in Philadelphia with her new role, and yet our trips back east seem to rarely overlap. I was also amused that when I went to get a beer from the bar and declined a glass, they said “the men want glasses and women want the bottle, it’s usually the opposite!” Oh yes, I was in the right place at this event.

Apache Mesos and big, streaming data events

Over the past several months I’ve been getting into the swing of things with my new role as a Developer Advocate at Mesosphere. This began by attending Spark Summit East back in February, and really got going when I spoke with my colleague Ravi Yadav at Flink Forward in San Francisco early last month.

These very specific technology conferences are somewhat new for me. It’s true that I’ve been going to Ubuntu and OpenStack conferences for nearly a decade, but those communities are huge, with dozens of different projects inside them and various teams, companies and volunteers with varying motivations. It’s a whole different feel when you have a small concentration of folks working on a very specific technology directly and together. It’s also a great learning environment, since your attention is not split across a massive community and you can focus on learning how other people are doing things like deployments, scaling and whatever else is specific to that technology.

I wrote about the specific Flink Forward talk Ravi and I gave in a post on the DC/OS blog, but more generally it was great to meet community members operating in that space and talk shop about the technologies that surround our work. Professional photos from the event are here and I have my own album of pictures I took here. And in case you’re curious, a video of our talk is now online here and slides can be found here.

Ravi shows off a demo between my bits of speaking at Flink Forward, cc-by-sa 2.0 Silke Briel (source)

I’ve also been starting to help run some of the meetups that we’re hosting here at the office. Back in March I attended and MCed my first Apache Mesos meetup, Running Production Containers and Big Data Services Gets Even Better. The meetup was great for me since I’m still getting up to speed with all our projects, and it covered some features in the new releases. First up was Gilbert Song talking about “Mesos 1.2 Updates and Universal Container Runtime,” followed by a DC/OS 1.9 features talk by Sebastien Pahl. The event concluded with a presentation about Instana, a multi-layer monitoring platform geared toward container-based architectures where your environment is, by design, constantly changing (it is a paid product, but a 14-day trial is offered). A video from the event is up on YouTube.

The opportunity also arose to host a Women in Big Data meetup here at the office, where Amita Ekbote and Susan Huynh introduced Apache Mesos and DC/OS and gave a live demonstration of the IoT Pipeline. Suzanne Scala posted a write-up of the event, including the slide decks and other links, on the Women in Big Data blog, here: Big Data on DC/OS. I attend a lot of tech conferences and events, and they tend to be male-dominated, so I really enjoy these events where I can meet other women doing cool technical stuff. Plus, big data in particular is a space where people are doing some really interesting work.

I’m looking forward to helping out with more local meetups here at the office in the coming months, but also to speaking at some events of my own. I’m aiming for some East Coast events in early June that I’m pretty excited about.