All Things Open 2017

The goal of the trip down to Raleigh last week was to attend All Things Open. I attended back in 2014 and was eager to attend again, but was hampered by conflicting events. This year I committed to going, in spite of an unfortunate overlap with the Open Source Summit in Prague, which is important to me because MesosCon Europe happens on the Thursday and Friday of that week. This means I had to get on a plane immediately at the conclusion of All Things Open on Tuesday in order to make it to Prague by Wednesday night to begin my MesosCon EU activities. It was a long week.

But the week began in the most relaxing way possible as I stepped off the train in Raleigh Sunday night. I picked up my conference badge and was off to a pre-conference social to say my first hellos to fellow conference-goers. I was quickly able to find some friends, but the social venue was quite busy and I was eager to find some dinner.

On the way out of the social I collected some folks and five of us ended up at an empanada and tequila restaurant not too far from the conference venue. Highlight of the night besides good company? They had pumpkin pie empanadas for dessert. Heavenly.

The next morning the conference began with a series of keynotes. Todd Lewis, the grand architect of All Things Open and super nice guy, kicked everything off with welcomes and piles of gratitude. His personal kindness is one of the main reasons I love this conference so much. It’s also the largest open source conference on the east coast, this year drawing 3000 attendees, which is a new record for the event. That’s how I convince my employers to support the event year after year.

The first keynote came from Tim Yeaton of Red Hat, who began his talk with a fun walk through a typical day that brought him to Starbucks, Chili’s, Target and then on to a United flight, all companies that are major users of open source and are helping to drive the incredible growth throughout our industry. Jake Flomenberg of Accel joined us next to talk about the rise of open innovation, with companies like Intel and Goldman Sachs choosing open source software first and proprietary only as a last resort, a distinct change from what we were seeing a decade ago. He shared that there’s also a rise in venture capital backing of open source-driven companies who are driving innovation; it’s no longer just Red Hat making money in the world of open source business, as others are following the path of Project, Product and Profit.

Danese Cooper, who has worn many hats in her lengthy open source career but is currently the Head of Open Source Software at Paypal, then graced the keynote stage. She began her talk by walking us through a history of why many people got involved with open source in the early days, recounting a story from Mark Shuttleworth. He explained that his motivation behind investing his wealth in an open source project and company was that he cared and wanted to build something that mattered. The rest of us in open source software weren’t much different, even if we didn’t have millions to invest; my own passion drove my involvement through years of unpaid work. She shared that while many people today are getting involved for the very valid reason of being paid for their work, we can’t forget the lessons we learned along the way. To this end, we need to document our history and motivations so that future generations will know how we got to where we are, as well as mentor new people, serve as board members in open source organizations, and encourage good practices within our communities and companies. She concluded by stressing that we all must continue to follow our moral compass as we contribute, and that the money to pay our mortgages will follow. That’s certainly been true in my case.

Perhaps the most exciting keynote of the morning for me was from Sara Chipps, the CEO of Jewelbots. Jewelbots are tech-enabled friendship bracelets that have default modes to let you know when friends are nearby, but you can also extend them by writing C++ programs. She gave some history of the company, sharing that she wanted to build something for girls and noticed that they already had traditional friendship bracelets. By creating a tech-enabled one she discovered that the drive to extend the devices’ functionality would “trick” the girls into writing code. It worked: of the 10,000 units shipped, 44% of users have extended them by writing C++ code, an incredible conversion rate. Finally, all the code for the devices has been made open source and is available on GitHub. Super inspiring keynote, it makes me so happy to see things like GoldieBlox and Jewelbots in the market for girls these days.

The final keynote of the day came from Kelsey Hightower of Google and Kubernetes fame. His talk was entirely demo-driven as he stressed that containers are not hype; they do simplify infrastructure by putting applications in nice little boxes. But they aren’t a panacea, and while they help, we haven’t gotten to our future of flawless automation with them or any other infrastructure technology. Much to our delight, he was then able to use Google-driven voice commands to show off not just a Kubernetes deployment on Google’s cloud, but an on-the-fly upgrade, which he sees as the next big evolution in the maintainability of large clusters.

With the keynotes behind us, the first talk of the day I attended was by Alicia Carr on Home Automation. The talk itself was a tour of the home automation devices she used in her day to day life, exploring how they interacted with each other (or not!) and her favorites on the market. She addressed the topic of security, admitting that we’re still in the early days of home automation and it’s not yet where it needs to be to be entirely safe and fool-proof for full, mainstream adoption, but that the companies in the space were aware of these limitations and working on them. Unfortunately there wasn’t much open source to this talk; some of the devices did allow you to code against them, and trigger tooling like IFTTT can let you do interesting things. I’d really love to see a talk about the home automation ecosystem of free software hackers that’s emerged, and the open source considerations therein. What devices are the most friendly to hacking? Which can be independently secured most easily?

The next talk I went to was John Mertic on “Accelerating Big Data Implementations for the Connected World”, where he focused on the ODPi, which seeks to strengthen the engagement model between upstream projects and end users. Five years ago the industry started adopting tooling like Hadoop to do big data processing at a large scale, but even then we weren’t at the place we are today, with the emergence of the Internet of Things and vastly more data coming in than we ever imagined. He shared that the major challenge now is standardization. The options today for companies seeking to wrangle their data tend to be relying upon raw open source projects, which are frequently not productized enough for simple use, or being led by a vendor who may not have a standardization strategy in mind, instead steering you towards their own productized version of the open source software. Companies need a resource to help fill in the gap there, and this is where the ODPi comes in.

After lunch I went to Corey Quinn’s talk on AWS cost control. He’s a fun and engaging speaker so I always enjoy going to his talks, but this one was also incredibly informative. Though I’m not personally seeking to reduce AWS costs, it was fascinating to learn from his experience about what finance departments, C-level executives and company boards actually care about. Many consulting companies around AWS cost reduction focus on a series of technical strategies that can reduce cost, but his talk wasn’t about those; instead he focused on understanding what you’re running from a business perspective. It’s great to be efficient, but at the end of the day a large bill is actually fine if you can account for what you’re spending it on. Tracking workloads and having an itemized, defendable listing of costs goes much further than a painful cost reduction that knocks a few percent off of what is still an expensive black box as far as finance is concerned. Even better, when you do a project to start tracking your workloads, you’ll inevitably find services that aren’t being used and other extra fat that can be trimmed from your bill, so you end up with the cost reduction too. Given the audience, he also shared the Open Guide to AWS and his own Last Week in AWS newsletter, both valuable resources for folks administering AWS.
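As a small aside from me rather than anything Corey demonstrated, the “start tracking your workloads” step often begins with something as simple as flagging resources that can’t be attributed to any project at all. Here’s a minimal Python sketch of that idea; the “project” cost-allocation tag key and the region are assumptions for illustration, and it relies on boto3 with credentials already configured:

```python
import boto3


def find_untagged_instances(region="us-east-1", tag_key="project"):
    """Return EC2 instance IDs missing the cost-allocation tag we report against.

    Hypothetical example: tag_key and region are illustrative defaults,
    not anything prescribed in the talk.
    """
    ec2 = boto3.client("ec2", region_name=region)
    untagged = []
    for page in ec2.get_paginator("describe_instances").paginate():
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
                if tag_key not in tags:
                    untagged.append(instance["InstanceId"])
    return untagged


if __name__ == "__main__":
    # Anything printed here is spend you can't yet attribute to a workload.
    print("\n".join(find_untagged_instances()))
```

A list like this is exactly the kind of itemized, defendable artifact finance can work with, and chasing down the stragglers is usually where the unused services turn up.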

At 2:30 I gave my talk of the conference! I spoke again on The Open Sourcing of Infrastructure (slides here), taking the audience through the work we’ve done to choose open source as our infrastructure platform and how we can use those lessons moving forward in a world of proprietary cloud services. I gave a version of this talk back at FOSSCON in Philadelphia, but it’s one of those talks that’s always evolving based on audience feedback. The talk went well and afterwards I had a chat that will lead to some additional slides about security. Specifically, there have been high profile security vulnerabilities in open source software, and the major open source tooling that uses it tends to have patches out within hours of disclosure. We don’t have that with proprietary tooling; in fact, the latest WPA vulnerability has left millions of proprietary devices without patches, and with no indication to customers about when patches will be available, assuming they will be and that the devices and software can be upgraded at all.

The next talk I went to was from Elsie Phillips of CoreOS, in the business track, about building and selling an open core product. She walked through business models around training, support and consulting, sharing that these are still valuable avenues for making money, but that they generally aren’t enough and have very slim profit margins. There’s also a balancing act when you have open core software: making sure that the open source version is valuable and that the “Enterprise” version simply adds features, rather than the open source version being restricted in some way. She stressed that the proprietary version should add polish like automation and other tooling that is not impossible to do on your own with the open source version, but saves the company engineering investment if it’s done for them out of the box. I happen to work for a company that uses this model, so while my open source heart bristles at some of the decisions made about what goes into the open source product, we do all have bills to pay and I often find myself in agreement as to where the line is drawn.

The final talk I attended on this first day was by my Mesosphere colleague Alexander Rukletsov on Health Checking: A not-so-trivial task in the distributed containerized world. This talk drew from his experience with the overhaul of Apache Mesos health checks that occurred a year before. At first glance it seems like doing health checks inside of containers would be easy, since they live on a host that has access to them, but it turns out that it’s not. You have PID and networking namespaces to contend with that can turn even the simplest checks into something unexpectedly difficult. In a distributed system you also need to consider the costs and benefits of how broadly you scope your checks, noting that a global scoping may be easier to set up, but you then have network overhead, latency and potential for duplication across your cluster. It was a valuable talk from an operations perspective, allowing you to understand what goes into a check you’d generally consider “just an HTTP check” or similar.
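To make the namespace wrinkle concrete, here is a minimal sketch of my own (not how Mesos actually implements its checks) of an HTTP health check that has to join the container’s network namespace before it can reach a service bound to localhost inside the container. It assumes the host has nsenter and curl installed, that we already know the PID of a process inside the container, and that the check runs with enough privileges to enter the namespace:

```python
import subprocess


def http_health_check(container_pid, port=8080, path="/health"):
    """Return True if the containerized service answers its health endpoint.

    A naive check from the host ("just curl localhost:8080") would run in the
    host's network namespace, not the container's, so we use nsenter to join
    the container's netns first (typically requires root). container_pid,
    port and path are hypothetical inputs for this sketch.
    """
    cmd = [
        "nsenter", "--target", str(container_pid), "--net",  # enter the container's netns
        "curl", "--silent", "--fail", "--max-time", "2",     # fail fast on errors and timeouts
        "http://127.0.0.1:{}{}".format(port, path),
    ]
    return subprocess.run(cmd, capture_output=True).returncode == 0
```

Keeping the check local to the node like this reflects the trade-off Alexander described: a globally scoped check run over the network is simpler to wire up, but you pay for it in latency, bandwidth and duplicated checks across the cluster.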

All Things Open is a tiring conference and I was busy all day, so my day concluded with some nearby chicken and waffles before heading back to my room.

Tuesday opened with more keynotes! The keynotes began with a thoughtful and inspiring talk by Safia Abdalla who, among other things, called upon the audience to build friendships (not just mentorships) and to not just share code in our communities, but to share knowledge. Her talk was quickly followed by one from Matt Asay, a familiar executive in open source circles and someone whose opinion I respect even if I don’t always agree with his conclusions. As he reviewed key open source successes in the corporate world he spoke to pragmatism as we push for more openness in our industry. He explained that convenience will typically win out and there’s always some amount of lock-in. There is victory even when building into proprietary clouds: they are more open than what we came from, and most of what we’re building on top of them is still open source tooling. As someone who so strongly believes in open source as far down the stack as possible, I see his pragmatism argument, and even addressed it in my talk the day before (you can build against proprietary technology if you want, lots of companies do), but I’m glad for the industry that we have choice.

Burr Sutter joined the keynote stage next to give a very fun and accessible introduction to DevOps. He recounted stories of the improvements to code quality when developers shared in the responsibility for production, and stressed the importance of on-demand infrastructure, automation, and continuous integration and deployment for fast, high quality development. John Papa was up next to talk about the open web, covering some of the technology choices that should be considered when making modern websites, including ways to improve load time through priority loading of elements and searching code for module duplication and too many dependencies that could increase the size of the page being loaded. He concluded by asking us to be kind to one another; we may not see each other as we interact over coding sites and GitHub, but there are real people on the other end. The last keynote was from Jeff Atwood, the creator of StackOverflow and Discourse. His talk was a story through the creation of StackOverflow as a community and company, and how the more community-focused Discourse project came about. It was amusing to learn how deeply having children impacted his approach to solving problems, or not solving problems. He learned that not every problem is actually in need of a solution; sometimes we just need to be human and empathetic to one another, and that’s enough. With StackOverflow the goal is very strictly about solving problems. With Discourse it’s about building a community and having fun.

The first session I attended following the keynotes was from Mark Voelker on Interoperable Clouds and How To Build (or Buy) Them. I knew of Mark from his work in the OpenStack community, and his talk centered around the work he’s done and lessons learned while participating in the “OpenStack Powered” program, where products can be certified through a testing suite to carry that label. It ensures the OpenStack name means something, and that it is not haphazardly applied to products that replace pieces of OpenStack with their own tooling, run non-standard versions of the APIs or are otherwise not what the user may expect from something called OpenStack. He dove into some of the strategies the community uses for determining what should be tested and how they iterate on the guidelines with every release, drawing from customers and product creators to learn what they should be including, excluding or changing. These lessons extend to what’s popping up in Kubernetes Certified and other programs in this space.

After grabbing lunch with one of my DevOps + open source compatriots, I met up with Spencer Krum, one of my former colleagues on the OpenStack Infrastructure team and a current co-conspirator on open source infra work. He was doing live broadcasts for The Root Shell at ATO and I was invited to swing by to chat. It was fun; we talked about my current focus on containers and our work on open infrastructures. When we wrapped up I headed directly for the lightning talks that had already begun, making it in time to see Chris Short talk about DevOps and the joint official launch announcement with Jason Hibbets of the new OpenSource.com DevOps Team, which I happily joined a few weeks ago. I also really loved the Lego hacking lightning talk from Jen Krieger; DIY electronics hacking with Lego looks like a lot of fun and she showed off some neat projects.

Now, in spite of leaving much of my direct operations work behind when I changed jobs last year, Jenkins still looms large in my work today, as I am working on a CI/CD demo and frequently give talks about how containerization can be used in this space. As such, I went to a talk by Kristin Whetstone on “7 Habits of Highly Scalable Jenkins Administrators”. The habits included using the latest LTS version of Jenkins, using Jenkins Pipeline and container-based plugins for more self-service job deployments, parallelizing tests and participating in the Jenkins community. What I really enjoyed about this talk was that she not only described the problem space of each of these, but gave links to the specific plugins that would help you achieve your goal. The day was getting long, so I went with the fun choice for the next session and joined Gareth Greenaway for his “What the FOSS Community Can Learn from 80s Television” talk. It was a funny trip down 80s television memory lane with open source lessons, including: the importance of working as a team, avoiding MacGyver’s failure to document his solutions, improving the bus factor, reducing the impostor syndrome felt by new contributors, as Sam in Quantum Leap perpetually felt, and the importance of not simply creating new projects (characters) or forks (spin-offs) without a good reason.

In the late afternoon I met up with my friend Laura over at the booth she was running. The plan? A photo of the Ubuntu gurus we knew at the conference. Success! Not many of us are working on Ubuntu any longer, but the friendships we made in that project sure have lasted.


Our connection? Ubuntu!

With that, we were just one talk away from the end of the conference, and my need to get to the airport. The final talk I attended was by my friend and former colleague from Ubuntu work, Michael Hall. His talk had the interesting name of “The Revolution Will Not Be Distributed”, which you quickly learn is not about distributed infrastructure, but instead about the way that software itself is delivered. One of my first contributions to an open source project was packaging in Debian, and it was hard. I had a mentor, and throughout my work in Debian and Ubuntu, even as I helped others with packaging, I was always consulting the documentation and asking other people about the proper and expected way to do various tasks with Debian packages. At the time I hadn’t thought much about how this wouldn’t scale, and I was right alongside everyone else who worked on various efforts to teach more people packaging and to get software creators to package their software for distributions. This included various efforts that the community team at Canonical embarked on to get more software packaged, none of which resulted in a significant, sustained increase in the number of packages or packagers in the community.

Michael ran the numbers for us. There are about three million applications in the Android marketplace. The Ubuntu desktop has about three thousand. You can definitely argue that Ubuntu includes high quality applications that build a firm and sufficient foundation for a workable desktop, but we all know that there are gaps, and software we install outside of the standard repositories, whether by using a PPA, an install script, or compiling it ourselves. With this baseline gap, along with statistics about the current number of package maintainers per package throughout various distributions, he calculated that the number of maintainers and people-hours needed to reach Android’s scale was astronomical. Even a respectable fraction was not within reach. Instead of continuing to toil in the world of distributions with their own package formats, he looked to what is already happening with formats like Flatpak and Snappy, which make packages simpler to create and portable (especially with Flathub and the Snap Store), as well as the increased adoption of containers. His vision for the future of distributions was less about packaging software itself into DEBs and RPMs, and more about focusing on the integration of that software with the desktop environment for a more pleasant user experience. It was an interesting talk, and it certainly gave me a lot to think about with regard to our approach to the distribution of software.

More photos from the conference here: https://www.flickr.com/photos/pleia2/albums/72157686641873722

It was then off to the airport! I cabbed it over with my friend Stephen, Michael and David, and the latter two ended up being on the same flight as me. Even more amusing, our connections in Charlotte (not a small airport!) were leaving out of a trio of gates right next to each other. It was a lovely way to conclude my ATO adventure.

Huge thanks to Todd and the team who put on All Things Open. It’s a wonderful conference and I was happy to participate again this year.