Learning Systems Thinking: Essential Nonlinear Skills and Practices for Software Professionals
Transcript
What is Systems Thinking?
Charles Humble: Welcome to this episode of the GOTO Book Club. I'm Charles Humble, and I'm talking today with Diana Montalion. Diana served as principal systems architect for "The Economist" and Wikimedia Foundation. She is a passionate, insatiably curious web technologist, with a strong background in enterprise strategy, solutions architecture, dynamic team leadership, software engineering, coding, and creating ecosystems that grow innovation. She is also the author of a new book for O'Reilly, called "Learning Systems Thinking: Essential Nonlinear Skills and Practices for Software Professionals," which is out just now. And I want to say straight away, I love this book. I think it's really interesting. It's really thought-provoking. I think it deserves to sit alongside maybe the Donella Meadows classic, "Thinking in Systems," as absolutely essential reading for anyone working in this kind of space. So, I'm really thrilled GOTO asked me to interview Diana. Diana, welcome to the show.
Diana Montalion: Thank you. And okay, we're done. That's good. I feel like that's the best introduction I could ask for, and everything I'd want people to know.
Charles Humble: Brilliant. I'm delighted. So, the first chapter of the book is called "What's Systems Thinking?" And given that there isn't really a formal definition of systems thinking, what would your definition of systems thinking be?
Diana Montalion: I mean, there's a few ways that I think about this. One is that we are still figuring that out, that we don't have a lot of good vocabulary to use, or we have words that we've already defined, but when we use them in a systems circumstance, we mean something slightly different. Also, the book is called systems thinking because that makes it really accessible to people. But systems thinking is just one different way of thinking about systems of software. Pattern thinking, for example, the book could have been called that as well. So, systems thinking will be defined differently in academia, or in courses you might take on Udemy, or books that you might read. So, the way that I define systems thinking is a group of practices that, when practiced together, help us to take relevant and impactful action, even when we can't know what is the right thing to do, because there isn't one right thing to do. And so, systems thinking is about thinking about thinking, improving your own ability to think, improving our ability to think as groups, and also to think about and restructure the communication and thinking patterns in organizations, because that's what changes the technology. Right? Everything we think and talk about is in production, and there's nothing else in production. That's it.
So, when we want to change what's in production, we have to change the way that we think and communicate. And so, the practices of systems thinking help us to do that. They help us to bring a different toolset to the types of challenges we face, relationship challenges, systemic challenges, which is different from the way most of us have been trained to think, especially as engineers. We're trained in reductionist thinking, meaning we take complexity, we break it down into smaller parts, to make it accessible. But we forget that we're also designing, inherently, relationships between those parts and patterns, and all these other aspects that will give us outcomes we don't necessarily expect. So, systems thinking is somewhat antithetical to reductionism, and basically just adds more skills to our knowledge work. So, it's less about what it is exactly. There's not necessarily a template, and there are lots of gateways into the practice. So, it's very meta that systems thinking depends on your context. But there are some practices that are pretty fundamental and necessary, and they help us to have an experience of systems thinking.
Charles Humble: Right, yes. And just to be clear, you're not saying that reductionism isn't useful as an approach.
Diana Montalion: Right.
Charles Humble: You're saying that this is a sort of additional set of rules on top, right?
Diana Montalion: I was just gonna say, that, in linear thinking, the way that we tend to think, there tends to be a right and a wrong. It tends to be on a spectrum. And that is definitely not what we're saying when we talk about linear versus nonlinear thinking, for example. We need linear thinking and reductionism, and we need those practices to build software. This isn't an either/or. It's being able to discern which types of thinking approaches will help us accomplish what we're trying to accomplish.
Charles Humble: What are some of the practices that make up systems thinking? You said it's a sort of set of practices.
Diana Montalion: So, the first set of practices, or the first thing that really matters, is our own thinking. Each individual is a system of thinking. And if we don't have metacognition, the ability to think about our thinking, to notice our own thinking patterns, our fallacious ways of thinking, the places where our logic breaks down, our biases and things like that, we're not gonna get very far. And so, the ability to become aware of your own thinking patterns, and improve them through learning, through the types of things that support knowledge work, like deep work. And then, when it comes to the practices we do together, a lot of it doesn't sound like tech skills, but, for me, as a systems architect, they're the ones I couldn't do without. And that's... We are a really big "no" culture. Like, we tend to think of our role as pointing out what's wrong, right? Like the XKCD cartoon, where he's staying up late because somebody is wrong on the internet, and it's very important that we fix that. And that's great because we're problem solvers, right? That's wonderful. But when it comes to thinking well together, it very much works against us. So, it's learning about how to respond, using systemic reasoning. It's learning about how to approach diverse perspectives, and still synthesize them into something that's actionable.
So, the foundational practices sound easy, like systemic reasoning, meaning showing how you reached a conclusion, rather than just giving your opinion. Right? "Oh, this is a graph, and graphs won't scale." But how did you reach that conclusion? Why is that relevant to this circumstance? And then from there, there are the things that we start to recognize, like patterns, right? Enterprise integration patterns, the way that software parts integrate with each other, the ability to figure out where to take an action in the system that will actually resolve the root cause of the problem, rather than duct-taping and hacking in order to keep something working. Once we get there, it's a little bit more familiar to people who are working in event-driven architectures and things like that, where we can't help but practice systems thinking. On the biggest scale, it's about changing the goals and rules and structures that support the way that we build software, when those rules and structures and patterns are causing us to go in circles and keep rebuilding the same kind of linear outcomes that we're used to, and being able to create organizational approaches that help us operate as a really healthy system.
Recommended talk: Gamification, Sys. Thinking & the Power of Observability • Jessica Kerr & Jessica Cregg • GOTO 2023
Systems Thinking and Microservices
Charles Humble: I want to pick that up in a second, but I also want us to pick something else up, because I think, from a systems design point of view, there's an aspect of systems thinking which I think is quite interesting. A lot of people make assumptions about systems design, particularly in distributed systems and microservices-type architectures as we might have them now, where they think, well, if we're properly decoupled, and we've got asynchronous communication between our microservices, then the teams have the ability to be independent. And a lot of systems thinking is actually about the relationships between the microservices, which, perhaps somewhat naively, is quite often overlooked when people first start working in distributed systems. Do you think that's fair? Could you maybe unpack that a bit for us?
Diana Montalion: It's interesting, because sometimes when people say that, what they mean is, if we use a microservices architecture, for example, it means we don't have to be in relationship to other teams anymore. We don't have to communicate, we don't have to do this. And the thing is, you're still in the system. Even if you have teams that are able to act and make decisions independently, they are still part of the larger system, and the choices that they make still impact the relationships between the parts. You're still in relationships with the teams building the other microservices. And so, while a benefit of systems thinking is being able to create good boundaries in the system, that enable effectiveness and efficiency, which is what people are often describing, it can actually just go back to "Let's build silos, but with better words, and different tools." And so, if we don't shift our mindset away from this idea of each team is an industrial pipeline that pushes code to a piece of software, and instead, we think about ecosystems, how this ecosystem is both part of and not part of the bigger system, then I think that that approach can be really effective.
But I know for myself, early on, of course, I helped build the monoliths, because that's what we were all doing, right, 20 years ago. And then I often came back, later, to the same organizations, and helped to decouple these big balls of mud that we had ended up with. And the first time that we started building a microservices platform, about six months later, we realized these services were really tightly coupled, and they still had a do-one-thing-then-another-thing flow. And this is fine. It's just because we don't know asynchronous time as well as we know linear time. Right, asynchronous time, being able to generate knowledge flow, so that we could understand what information needed to be available to everyone, and what we could share within our own team, both in the people, and also in the tech, right? Quite often, people add an API, they take everything out of the relational database, they make it JSON, and they output it.
But of course, the people consuming that information, they don't think of the data as structured the way MySQL needs it to be structured. They think of it as a more semantic, user-friendly, consumable bit of information. And so, there's a bunch of mind shifts involved that, at least in my experience, I had to practice. I had to try it, and see the limitations of my understanding, and then try it again. And we got a lot better at it over time. And at how to structure schemas that were different from what we were used to, how to time things. So, yeah, I think, quite often, we prove Pirsig's quote, over and over, right? Which is, if a factory is torn down but the mindset that built the factory remains, that mindset will just build another factory. So, we're often building another factory, just using Kafka instead. We forget that our own minds are changing along with the architecture of our software. And it takes time for us to learn how to think differently, and how to communicate differently.
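To make that shift a little more concrete, here is a minimal sketch, in Python, of the kind of reshaping Diana describes: translating data that mirrors the relational schema into the semantic, consumer-friendly payload a downstream team actually wants. The table names, field names, and function are hypothetical, invented purely for illustration, not taken from any system discussed here.

```python
# Hypothetical sketch: reshaping a database-structured row into a semantic,
# consumer-friendly document. All names and fields are invented.

db_row = {
    "art_id": 4217,
    "art_title_txt": "Why this won't scale",
    "auth_fk": 88,
    "pub_ts": "2024-07-01T09:30:00Z",
}

authors_by_id = {88: {"name": "A. Writer"}}


def to_semantic_payload(row: dict, authors: dict) -> dict:
    """Translate database-shaped data into the vocabulary consumers use."""
    return {
        "article": {
            "id": row["art_id"],
            "title": row["art_title_txt"],
            "author": authors[row["auth_fk"]]["name"],
            "publishedAt": row["pub_ts"],
        }
    }


if __name__ == "__main__":
    # The output is organized around the consumer's concepts (an article with
    # an author's name), not the producer's foreign keys and column names.
    print(to_semantic_payload(db_row, authors_by_id))
```

The point of the sketch is the design choice, not the code: the schema the consumer sees is shaped by what they need to know, which is a small, everyday instance of cultivating knowledge flow rather than exposing knowledge stock.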
Charles Humble: Yes. That Pirsig quote is from "Zen and the Art of Motorcycle Maintenance," I think, isn't it? Yeah.
Diana Montalion: Yes.
Charles Humble: It's a classic book.
Recommended talk: Insights on Leadership & Innovation • Gene Kim & Charles Humble • GOTO 2024
Leverage Points and Counterintuitive Solutions
Charles Humble: Early on in your book, you introduce the idea of leverage points. So, these are places in a system where a small shift in one thing can produce big changes in everything. And then you talk about counterintuitiveness, and you say it's been the most powerful insight of your career. So, can you talk about that? What do you mean by counterintuitiveness? Why is understanding it so important?
Diana Montalion: I'll tell two quick stories, and then I'll bring them together to answer your question, because it was the stories that revealed it to me. So, Donella Meadows, who wrote "Thinking in Systems," which is definitely a huge inspiration for me and for a lot of us... I've spoken at conferences where it was the speaker's gift. Everyone got a copy of it. And what I really wanted to do is to take that and apply it specifically to our challenges. Right? Because the book is not about tech, which is good, because we see that systems are systems, right? We're not unique in our problems. So, Donella Meadows was in a meeting, and I don't remember the context, but people were talking about the system and talking about the system, and she was getting more and more frustrated. And finally, she just got up and said, "No, you don't understand the system." And she wrote down 12 places in the system where, if you make a change, you are increasingly likely to have an impact on the system. In that moment, decades of systems experience coalesced in her mind, and just came out in this list. That list of leverage points is in the book. And a lot of people have since built on, thought about, and used that list that came out of that moment of insight for her.
Simultaneously, Jay Forrester, who was a systems scientist, and who really got a lot of our understanding into a vocabulary we can relate to, was going into organization after organization and discovering that people generally knew where the pain point was. They knew where the system was struggling, but they were pushing it in the wrong direction. They were doing the wrong thing. They were making it worse, because, again, we have the mind that built the factory, and that's the same mind we use when we face new, novel systems challenges. So, these two stories together reveal how, when we experience an event, like a bug in production, right, or our production goes down, or there's a recurring problem, two things are true. We're usually blaming the wrong thing. And then, as a result of blaming the wrong thing, we're doing the wrong thing, and we're doing a thing that is making the problem worse. And there's no avoiding this. This is just part of our learning curve. And doing the wrong thing is called counterintuitiveness. It means that the thing we really need to do, we don't see, because it's outside of our realm of experience, right? It's going to be counter to what we've learned about software.
One of the most impactful practices we're developing in our industry, when we're practicing systems thinking, is that moment of pausing and wondering, okay, are we making this worse? What's happening here that might be inadvertently reinforcing the problem? And can we use tools, like the iceberg model and other systems tools, to understand the root cause better, so that when we make a change, we're actually making a change that will stick? And those changes will usually be one of the 12. I've taken Donella Meadows' list of 12 and changed the language to describe the things that we do, like adding caching, and scalability, these kinds of things, which are leverage points. So, in our usual day, we're trying to fix problems. When we shift a bit into systems thinking, we're trying to understand how the system itself is causing the problem, and reinforcing it and reinforcing it, and how people are reinforcing it and reinforcing it, to try and find that place where we can make a change that will actually stop the problem from recurring, without causing a whole bunch of other problems that we also don't want. And it's very, very difficult to do. It requires a learning mindset. It requires being comfortable with uncertainty, all the things that we work so hard against when we're trying to build software. But we need to bring some awareness to what we're blaming, what we're doing, and what we're absolutely certain is the way we do things here, the things we've always done this way, the "right" ways of doing it, because those are usually our system's blockers. And so, the word counterintuitiveness is the reminder that the way we've always done things might in fact be the thing that's causing us the biggest challenge now.
Charles Humble: There's so much in here that I think is so interesting. One of the light bulb moments for me in reading the book is that it encapsulated some stuff that I'd sort of been thinking but didn't have quite clear in my own mind. I think this kind of nonlinear thinking is absolutely essential, given the way we architect systems now. And also, as you get more senior, into leadership roles, where you're basically operating, you know, running a team of teams, effectively. But I'd also argue that it is something that isn't necessarily terribly valued. And something that your book crystallized for me was that organizations tend to value how much an individual knows, maybe because that's a thing that's quite easy to assess, you know, in a whiteboard test or whatever. But you actually are successful through how effectively information flows through an organization, and those two things are actually kind of in conflict. Because, you know, to have a higher stock of knowledge requires you to effectively hoard some of that knowledge. And I just found that really interesting. Could you talk about that a little bit? Because I just think it's a really interesting observation.
Diana Montalion: My experience was similar to yours, where I knew the pain, but I didn't really have the words for it. And it came from Larry Prusak, who is a professor of knowledge management. Russell Ackoff came from a similar kind of background. And Larry Prusak says that, basically, an organization that doesn't have knowledge flow will fail, and they won't even understand why. And so, these words: knowledge stock versus knowledge flow. Knowledge stock describes my expertise, or the expertise that's accessible to me. Knowledge flow is about how I enable other people to develop knowledge and make decisions and do things based on what I am outputting, or the systems we've designed to share information. In tech, as the relationships between software parts are becoming more complex, as we're not building a monolith, we can't possibly know enough about the system as a whole. I can't know it from the business perspective, the user perspective, but also, there are tools I personally didn't build that the system is using. And so, the cultivation of knowledge flow, as a practice, as well as a focus of our practice, is in line with the idea that we have to be synthesizing other people's expertise and other people's knowledge into our own all the time, in order to really understand what's happening, and to really make recommendations that are going to improve the system, rather than just solve a local problem.
For me, it was similar to what you're describing. It really got me wondering, where did our idea of management come from? Which got me into Taylorism, and scientific management, and all of these things that are not respectful of knowledge work, which is what we do, because code is knowledge. Code is taking our knowledge, using different languages, to build something that has our logic in it. And so, it really made me curious about, okay, how do we set up processes that support knowledge flow? How do we hire? Because a whiteboard test doesn't tell us anything about whether or not you being part of this organization is going to improve the knowledge system as a whole. We care about whether you know JavaScript well enough to do this implementation that you're doing, but all the JavaScript tool knowledge in the world, all the JavaScript expertise in the world, can't solve systems problems if we don't also understand the context of the problem, the way that people are interacting with it. There's a lot more to it than the code. And so, that whole area of cultivating knowledge flow became really core to systems thinking and technology, and moving away from valuing knowledge stock.
Especially because, for me, I've never worked in the same exact tech stack for a really long period of time. Things have changed so much over the last 20-plus years that I'm always having to learn the next one. I'm always having to learn more practices, more tools, and different ways of thinking. And so, my knowledge stock is of minimal value to an organization that's changing. What is important is that I'm able to generate new and novel knowledge as I learn more about the context and the problems that I'm solving. That's the real, I think, long-term key to our success.
Recommended talk: Thinking Fast and Slow • Linda Rising • GOTO 2019
Navigating Hierarchy and Systems Thinking: The Challenges of Persuasion and Leadership in Tech
Charles Humble: Yes, yes. Absolutely. Absolutely. Something else that I was thinking about, because it's something that I've found, and I've actually found it a lot, is that it is very difficult to apply systems thinking in an environment that is hierarchical. So, if you're in a situation where someone says, I don't know, "I'm the founder," or, "I'm the CTO, and we are doing this," I don't think you can use systems thinking effectively. At least I never have. I was curious as to what your experience of that would be, if you agree with me at all. And if you do, how would you sort of solve it?
Diana Montalion: That is the most common question I get asked in workshops. People say, "Yeah, but no one listens to me." Right? So, I realized, one of the reasons I wrote the book is so that we don't feel so alone, screaming into the wind all the time. Because there's a lot of us out there, but in our day-to-day life, there's not a lot of... It's hard, I guess, it's very challenging. And Donella Meadows says that we know from bitter experience that when we do discover a leverage point, when we do see counterintuitiveness, almost no one will believe us. And I call it the 18-month rule. You point it out, you make a recommendation, and it'll take 18 months before people go, "Oh, yeah. We always thought that." "Oh, yeah. That's the way to do it." In the beginning, it's like, "No, no, no. You don't know what you're talking about." And that's because... You see it, because you've done the work, and you see it. But there's also the next part of helping other people come to the same conclusion, to see it, right? So, on one side, I say, if you're gonna do systems work, this is part of the downside, right? And if you need, 18 months later, for people to say, "Charles has been saying this from the beginning," you should quit right now, because that will never, ever happen. Like, you'll make a recommendation that no one listens to, and then when they do listen, nobody comes back and says that you were right. So, there is a bit of cultivating, right, the willingness to be in that space.
And the thing is, you don't know if you're correct in... So, you're also re-examining during that time. So, on one side is recognizing and being...recognizing this process and being patient. On the other hand, systemic reasoning, which is the way of structuring recommendations, theories, ideas, and sharing them in systems thinking, requires consent, and it requires a proactive willingness to learn from each other. As soon as somebody in the room says, "I don't care what you say. I'm in charge. This is what we're gonna do," then you're practicing politics. You're practicing the politics of power. And that's not the same thing, and systems thinking does not change other people's minds. It is not very helpful in politics. And so, sadly, I don't really have a good answer to that question because I've had a similar experience, where...and we all will.
If somebody comes in and just stomps all over it, then you're not doing systems thinking anymore. What I can say, though, is the better you get at creating conceptual integrity, the more trust you build over time. And even if you're in a situation that's not open to that, that isn't going to change, and you can't really move that, your own career will tend to lead you increasingly into situations that do see value in your thinking and your skill set, and want to apply them to their challenges. So, even though you can't change the CEO, who is very hierarchical, and make them see structure differently, over time, I think it leads us into better and better, more satisfying opportunities. And that, for me, despite the scars that I have, that you have too, that we all have, it's definitely more satisfying. Right? Over time, it's much, much more satisfying work. And there are places to make the impact. It's just not always in those meetings.
Charles Humble: Right. Yes. I just really want to echo something back there, because I think, when you feel like you're seeing something... I mean, some of the hardest things that I've had to deal with in my career are those moments where I'm seeing a thing, and no one else is seeing it, and I can't persuade them, for whatever reason. It's a really difficult, lonely place, I think. And you add to that as well that the whole point of this is there is ambiguity, right? So, there's a bit of you going, "Well, maybe they are all right, and I'm wrong." And it's just really, really uncomfortable. And I think there's a lot to be said for just being aware that, yeah, a lot of people have that experience. It's not just you.
Diana Montalion: No, not at all. It's interesting because, just before I met with you today, I met with previous colleagues who are just so good at helping to create these types of systems of support. And they're not strictly on the tech side like I am, but partnering with them enables me to do so much more. And a lot of the conversation was about what you're saying right now, that we need other people who are wrestling with the same challenges. And so, often, I make stealth groups, kind of a systems support group, and I didn't know to call it that. But now I look back and go, "That's what it always was, a systems support group," because there is a lot of ambiguity. And also, it's pretty awful to be told you're wrong all the time. We don't seem to have a problem telling people they're wrong in tech, all the time. And especially if you're like me, and you come from a population that is very, very, very lowly represented in tech, right, there are very few people that look like me, that do what I do, or think, maybe, the same way. And so, we need the types of things that help us to not burn ourselves out, and also to not take the system's challenges so personally.
And yet, at the same time, we also need to learn the skills to not... One of the things that I've heard a lot is people will say, "Oh, you just need to understand them more," or, "You just need to empathize more," or, "As a leader, you just need to..." And then the rest of the sentence sounds more like parenting than leadership, right? It sounds more like doing the emotional labor for people who aren't doing emotional labor. And one of the most valuable things I've learned in my career is to stop doing that, whenever I can, because then you're in a Sisyphus role. You're just pushing the same rock up the hill, and it keeps coming back down. People good at systems thinking often end up in glue roles, roles like that, cat-herding roles. But in systems thinking, no one gets to be a cat. Everybody has to have their own craft of building conceptual integrity together. And so, it's a big challenge to avoid ending up in a role where you're trying to glue together things that don't want to be glued together. Or, as an architect, how do I create a relationship between this legacy system and this new thing that a team just went renegade and built? After it's done, figure out how they're in a relationship. Like, "Oh, we probably wanted to talk about this before we built it." So, I feel you. We need each other to help us to stay healthy while we're doing this work, because it's not easy, in a number of ways.
Recommended talk: Dynamic Teams: Reteaming Patterns & Practices • Heidi Helfand & Charles Humble • GOTO 2024
The Journey of Writing a Systems Thinking Book
Charles Humble: Before we close, I wanted to ask a couple of questions about the book itself. So, I was interested, did you get any pushback to the idea of writing it? Because I know I tried to pitch something not a million miles away from this, and was basically told, "Oh, it's not really relevant," which I found very odd. I didn't really know how to respond to that at all. But I was just curious, when you pitched it, were you getting pushback from people, or were they sort of quite alive to the idea?
Diana Montalion: What's interesting is that the proposal came out of an interview. I was interviewed, and was told, "You should write a book about this." And I said, "I can give you a proposal in a couple of days, because I have this in my head." And in that instance, when the proposal went to the publisher, the publisher was like, "I don't know what to do with this." Like, there's too much. Is it self-help, business, tech? And it was really challenging for me, because I'm like... I have been a technologist for 20 years, and I'm saying these are the core skills I need to do the work on my resume. But book publishing, marketing... And also, we have very strong opinions about what tech is, that it's very concrete, that it doesn't include these other kinds of skills. And then, I was in a meeting with the O'Reilly team about something else, and someone mentioned that I had been thinking about the book, and they asked me for it. And so, I sent them a proposal. And I have to say that I have had very few experiences in tech that were so supportive of finding my voice in a way that really could help people think about what we were thinking about.
I expected there to be pushback. And instead, I got pushback, but in the direction of really taking a risk. Really say it. Like, don't couch it, don't equivocate, don't hide it under the couch and sort of do it, but really, really bring it out. And everyone, from my acquisitions editor, David Michelson, to my development editor, Shira Evans, who was incredible, to the production editor, Clare Laylock, even the copy editor, who gave me 800 notes, made it stronger. So, I couldn't have written the book if the publisher had kept pushing it towards, "Well, you need templates and checklists," which I can't give you because I don't know your situation. Like, I don't know which models will be the most effective, but I can help you discern which models, and I can give you some places to start. So, I really have to give O'Reilly a lot of credit.
The thing about writing a book like this is you're onto yourself. Like, I changed so much, because I do all the things I write about, but now I'm writing about them, and I'm aware of them, and it was really challenging for me. I learned too much about myself. And my editor was really supportive of the process as well as the product, which is really unusual. So, it definitely took a team to do this, and it took support and encouragement from other writers too, like Andrew Harmel-Law, who's also writing an O'Reilly book at the same time. So, this is almost in print. If I had known what the journey was going to be, I don't know that I would have. But O'Reilly, they wanted this. They knew the pain that you and I are talking about, and they wanted to begin to speak to it, and have more diversity in approaches. And so, they really opened the door, and made it possible. And I'm very grateful to them for that.
Charles Humble: That's really lovely to hear, actually. You slightly preempted a question I had in my head as well, which was, I'm imagining that writing a book about thinking makes you really examine your own thinking. I mean, I appreciate that you think about your thoughts all the time. This gets very meta. But you said you discovered so much about yourself from writing the book. What were some of the things you discovered?
Diana Montalion: For example, the further along I got, the more anxious I was. And I got to the point where I took an online anxiety assessment, and it was on a scale of 1 to 26. And I scored a 24, but I wasn't totally honest about one of the questions, because I felt like I maybe was being too dramatic. And so, in figuring out the rising anxiety, I was diagnosed with ADHD, which might be ADHD and a bit on-spectrum. Like, it's a Venn, you know, neurodiversity is a Venn diagram. But that's one example of how I'd had these sort of functional ways of operating; like any system, my personal system had functional ways of operating. And it's very counterintuitive, because neurodiversity looks different in someone raised female, someone who's had decades of practice with it, than it does in, say, an 8-year-old boy. It would look very different. And yet, I couldn't escape it, because I kept bringing myself to the practice, and because I had deadlines, and because I really couldn't avoid the... I couldn't distract myself. And so, that was one example where I'm a different person coming out on the other side of the book than I was going into the book.
Just like every leverage point, it was counterintuitive. And it's not fun. I do not like it. I have a coach who specializes in people who are neurodiverse in tech, who's wonderful. But she's also like an archer. She aims right at the very thing that's the hardest for me. Right? Like, "I wanna do that strategy stuff." And she's like, "How about drinking water?" That's the fundamental practice that I find very hard to do. And so, I really am increasingly comfortable with my own uncertainty about how my own mind works, and how my own process works, and how many structures I still have in my mind, rules that I follow, that don't really work for me, and are causing some of the discomfort that I have, and all of that.
And also, I'm much more interested now in enabling change in the industry approach, whereas before, I tended to care about my impact on a particular organization or a particular system. Now, I'm trying to find vocabulary that we can grow together. And some of the things that I've launched now, like the System Crafters Collective, are about how I can create communities of people that can think well together, so that we can all figure this out. Because I'm certainly not going to be able to, in my lifetime. And also, I'm limited by my own experiences and my own biases. And so, it is a collaborative experiment. So, it's very interesting to have done something so solitary, and come out of it understanding how important it is to do this together, to think and improve as a community. So, I'm having to build my social skills, when my preference is, like, four days reading a book instead.
There are aspects of what I do that I considered outside of tech, because the tech industry considered them outside of tech. One of my pet peeves is soft skills versus hard skills, because the soft skills aren't soft. They're harder. And there's some derision in this. Bringing more of my whole approach to software development has made me better at it. But I tried to be who I was supposed to be, to be effective in an industry where I was pretty underrepresented. And now, I feel more comfortable talking about things that don't necessarily sound like tech skills, but they absolutely, at least for me, have been critical. So, those are three examples, but the first one is definitely the biggest one.
Charles Humble: Diana, that's absolutely brilliant. Thank you so much. It's been wonderful to talk to you. Really fascinating. The book, as I say, is out on O'Reilly now.
Diana Montalion: Thank you. I'm so glad that we got together.
Charles Humble: Me too. Yeah. The book is out on O'Reilly now. I do urge everyone to go and get a copy, because I really do think it's a fantastic book. And it's been an absolute honor to talk to you. Thank you so much, Diana. I really appreciate it.
Diana Montalion: Thank you, Charles Humble. I appreciate your questions. They were wonderful.
About the speakers
Charles Humble (interviewer)
Freelance Techie, Podcaster, Editor, Author & Consultant
Diana Montalion (expert)
Systems architecture, thinking and design