[00:58.820 --> 01:00.700] Welcome everybody. [01:00.700 --> 01:10.100] We are going to have a series of executive panels in this room, which I'm really excited about, because this is the first time ever we've created an executive track at CypherCon. [01:10.300 --> 01:14.020] So there'll be folks in this room, I think, 9 to 10. [01:14.020 --> 01:16.780] There's a break from 10 to 11 when the keynote happens. [01:16.780 --> 01:19.660] And then 11 to 12, there's another one in this room. [01:19.780 --> 01:25.280] Other executive panels we'll be covering from, I think, 11 to 1. [01:25.280 --> 01:32.160] We have a side event we're going to, and then we'll be back from 1 to 2. [01:32.160 --> 01:34.420] No, it's 2 to 3. [01:34.480 --> 01:37.880] So we have a number of panels that we'll be taking you through. [01:37.880 --> 01:43.080] And I'm just so excited because I think this is something that's been missing from CypherCon, and Michael and I created it together. [01:43.080 --> 01:45.560] So I'm really looking forward to diving in. [01:45.560 --> 01:49.420] So this event is Architecting the Enterprise. [01:51.020 --> 01:58.280] And the folks that you're seeing today in several of these conversations are from a group called the Chief Architect Network. [01:58.280 --> 02:03.400] So what we are is a Fortune 500 set of practice leaders of architecture. [02:03.400 --> 02:09.620] So all those little dots that you see on the map are not just members, but groups of members. [02:11.900 --> 02:18.400] And what we're looking at is basically where those 450 folks are distributed across the globe. [02:18.400 --> 02:21.600] I like to say it's New Zealand to Hawaii the long way. [02:21.900 --> 02:22.980] So it's a lot of fun. [02:22.980 --> 02:29.120] We're actually going to London at the end of the month for a quarterly event. [02:29.120 --> 02:33.980] About 40 of us are gathering, sharing practice, knowledge, and insights.
[02:33.980 --> 02:35.700] It'll be a really good time. [02:36.280 --> 02:39.760] So today we're going to be talking about architecting the enterprise. [02:39.760 --> 02:45.120] The other two are cybersecurity and architecture, and AI and cyber. [02:45.120 --> 02:48.880] So those are the three that you're going to hear, in order, in this room. [02:49.360 --> 02:52.820] So let's talk about what architecting the enterprise is all about. [02:52.820 --> 02:59.880] EA is a pivotal piece within organizations that are seeking to create digital competitive advantage. [02:59.880 --> 03:13.220] We're going to discuss how EA can align with business strategies and how it can both drive efficiency and foster innovation, sharing insights on building and sustaining a robust EA capability that supports evolving business needs. [03:13.240 --> 03:16.740] So Gokula is our moderator and a good friend of mine. [03:17.000 --> 03:23.040] And Jason is... I'll let you introduce yourself in a moment as well. [03:23.660 --> 03:25.900] And we actually have Sandesh as well. [03:25.900 --> 03:26.820] Come on up, Sandesh. [03:26.820 --> 03:27.440] Welcome. [03:28.460 --> 03:30.080] Great to have you here, my friend. [03:31.340 --> 03:32.060] Absolutely. [03:32.120 --> 03:38.360] So Gokula, your call, would you like me to stay up and be a panelist with you, or would you... so that'll give us four? [03:38.520 --> 03:39.260] All right. [03:39.260 --> 03:40.060] I'll hang out. [03:40.060 --> 03:42.060] But Gokula, take the mic as our moderator. [03:42.060 --> 03:43.180] Excited to hang out. [03:45.120 --> 03:45.840] Absolutely. [03:46.500 --> 03:48.940] And why don't you two introduce yourselves while we do that. [03:48.940 --> 03:49.700] Jason, go ahead. [03:49.700 --> 03:50.420] Certainly. [03:50.420 --> 03:52.560] My name's Jason Labassi. [03:53.060 --> 03:56.540] Right now I'm a consultant over at Bank of America. [03:56.540 --> 03:58.020] And prior to that, I was over at J.P.
[03:58.020 --> 04:07.140] Morgan for 20 years, working in both architecture and solution delivery, mostly across the payments and credit card industry. [04:09.910 --> 04:10.950] Hello. [04:11.150 --> 04:12.230] Hello, everyone. [04:12.230 --> 04:13.710] My name is Sandesh Gawali. [04:13.710 --> 04:17.930] I'm director of strategic architecture and advisory at Salesforce. [04:18.270 --> 04:24.430] Just as a disclaimer, a new norm that came in at our company: all the views are my own. [04:24.430 --> 04:28.290] They have nothing to do with my company or my employer. [04:29.950 --> 04:31.530] Check, check. [04:31.710 --> 04:33.390] Hi, I'm Gokula Viswap. [04:33.530 --> 04:38.090] I've been in the data, analytics, and AI world for 30-plus years. [04:38.270 --> 04:40.470] I'm kind of semi-retired now. [04:41.390 --> 04:45.670] Prior to this, I used to work for Direct Supply. [04:45.770 --> 04:49.730] My office used to be not too far from here, in downtown Milwaukee. [04:49.730 --> 04:53.730] I helped Direct Supply build their data and AI team. [04:54.630 --> 05:06.070] And prior to that, I used to run data, analytics, AI, and supply chain globally for McDonald's, and have spent a lot of time in various industries. [05:06.370 --> 05:18.370] To kind of kickstart our panel: architecture and governance, for whatever reason, have this negative connotation, I have seen. [05:18.370 --> 05:34.890] But I have experienced that if you do the architecture right, and if you do the governance right, it actually enables the company's strategy and the technology to come together to create the desired business outcome. [05:34.890 --> 05:47.950] And in many situations, it actually makes the company very resilient, able to deal with all kinds of fluctuations in the market, in technologies, in people, and it really builds a strong foundation.
[05:48.170 --> 05:57.250] And so today's topic is very near and dear to me, because I did focus on that throughout my career: getting the architecture and governance right. [05:57.990 --> 06:03.970] So for my panelists, I have a number of questions that I'm going to go through. [06:06.070 --> 06:11.470] And so the first question is actually a critical one. [06:11.470 --> 06:21.550] How can an enterprise architecture framework be designed to align with an organization's strategic objectives? [06:21.990 --> 06:28.570] As I said, I think if architecture is done right, it could be a strategic enabler for the business. [06:28.570 --> 06:44.850] And it really does translate the business goals into the technology and operational capabilities that you need to ensure that IT investments and the digital transformations that are going on will be successful. [06:44.850 --> 06:54.810] So panelists, how have you designed enterprise architecture frameworks to align with your organization's strategic objectives? [06:55.230 --> 06:56.110] You want to go first? [06:56.110 --> 06:56.390] Yeah. [06:56.390 --> 06:57.750] Okay, sure. [06:57.750 --> 07:02.770] As Gokul mentioned, the strategic architecture has to be aligned with the business. [07:02.770 --> 07:08.910] A lot of the time, people confuse it with solution architecture and focus on particular views or particular areas. [07:09.090 --> 07:11.350] That's not how enterprise architecture works. [07:11.350 --> 07:24.930] It has to be built in a way that it goes across all the areas, aligning to the business architecture and business outcomes, whether that's faster time to market, reducing cost and improving efficiency, or being innovative in the market, being competitive in the market. [07:25.090 --> 07:33.070] In order to build those, you have to look from the top down, as well as, the newer approach I usually recommend, from the bottom up. [07:33.070 --> 07:35.310] You want all the stakeholders involved, right?
[07:35.310 --> 07:45.390] From the top down you have the business alignment vision, but that vision and business alignment have to pass down to the team on the bottom who is actually executing on those frameworks and using those tools, right? [07:45.390 --> 07:51.730] So in addition to looking at it from the top-down approach, which is the traditional thought process, look at it from the bottom-up approach as well, right? [07:51.730 --> 07:59.430] That's where our enterprise architecture role comes into play, bonding those two teams together, bonding those two areas together, right? [07:59.430 --> 08:00.750] Acting as a glue. [08:00.790 --> 08:08.950] So in order to build those frameworks, you have to have the collaboration plan, have the right communication plan, as well as strategic alignment, right? [08:08.950 --> 08:12.350] Be an enabler rather than just a gatekeeper, right? [08:12.350 --> 08:19.590] That's the thought process I usually keep in mind whenever I speak with a customer and whenever I talk with executives at that level, right? [08:19.590 --> 08:21.750] You have to support this to a certain extent. [08:21.770 --> 08:27.770] You're empowering your ecosystem, from the developers to the BUs who use the platform, right? [08:27.770 --> 08:30.210] Rather than them looking at it as a hurdle down the line. [08:32.330 --> 08:36.630] Yeah, so I think exactly what Sandesh said. [08:36.630 --> 08:39.410] Number one, you don't want to be the ivory tower. [08:39.590 --> 08:45.070] If you're the ivory tower and you don't have relationships back with the business, that's not going to work. [08:45.070 --> 08:50.670] You always start with: what are the key business objectives, and what's important to the business? [08:50.870 --> 08:53.870] In finance or a regulated industry, right? [08:53.870 --> 08:57.910] Maybe some of those items are front and center. [08:57.910 --> 09:03.350] If you're a startup and you're focusing on external web services, right?
[09:03.350 --> 09:05.330] That's going to be front and center. [09:05.330 --> 09:13.670] If you've had a lot of security issues, that's going to be front and center, so you really need to understand where the organization is and focus on that. [09:13.670 --> 09:17.070] I think the second thing is really focusing on those relationships. [09:17.870 --> 09:22.890] Again, going back to the ivory tower, or what I like to call city hall, right? [09:22.890 --> 09:26.350] You don't want to be the city hall where people kind of go to you to get permits. [09:26.390 --> 09:28.150] You need to have relationships. [09:28.150 --> 09:30.450] You need to be front and center with the business. [09:30.550 --> 09:35.790] You also need to be front and center with security and front and center with developers. [09:35.870 --> 09:40.750] I think those are, at a high level, some important areas to focus on. [09:40.750 --> 09:46.410] I love what everyone has said, and there's a reason that we're all kind of in the same group. [09:46.510 --> 09:52.790] The other thing I would add is that tailoring is one of the most important things we do. [09:52.790 --> 10:05.810] If we just come into an organization and think we can take our box of tools and deploy it mindlessly, that is just creating an administrative center of excellence, or an administrative center of waste, honestly, right? [10:05.810 --> 10:07.410] Like, okay, here's the process. [10:07.410 --> 10:08.410] I'm going to review things. [10:08.410 --> 10:11.310] We think we as architects are the smartest people in the room. [10:11.330 --> 10:16.430] Therefore, what we think is the right answer, and if you guessed wrong, we're going to stop your project. [10:16.450 --> 10:21.010] What a silly, crazy, narcissistic approach, right? [10:21.010 --> 10:26.550] I think what we have to do is listen, just like my colleagues have shared, and understand: what is that business strategy? [10:26.550 --> 10:27.850] What are the key capabilities?
[10:27.850 --> 10:30.870] Capabilities is a word we use a lot in architecture. [10:30.870 --> 10:32.450] What we're really talking about is the what. [10:32.570 --> 10:40.350] This is anything from something as simple as email campaign management to product innovation, right? [10:40.350 --> 10:45.490] What are the things we need to do to achieve the strategy of growing market share? [10:45.490 --> 10:58.730] So as we break that into the things we need to do, and we start to take that across all of the strategies we're trying to drive, we can create a heat map of the areas, the capabilities, that are going to be the most important to drive the business forward. [10:58.730 --> 11:04.390] And we can also start to understand how architecture needs to be applied to support the business strategy. [11:04.390 --> 11:09.730] At the same time, we look at what our IT partners need, our CIO, and that leadership team. [11:09.730 --> 11:11.510] Are we struggling with technical debt? [11:11.510 --> 11:13.710] Are we struggling with a lack of standardization? [11:13.710 --> 11:18.290] Are we struggling with people building the same thing five different ways in different parts of the department? [11:18.290 --> 11:22.450] Okay, so now I'm going to be looking at overlaps and opportunities to consolidate. [11:22.450 --> 11:27.590] Or maybe we're growing so fast we can't keep up, and it's more important that we accelerate the business. [11:27.590 --> 11:29.850] We don't actually need to worry about consolidation. [11:29.850 --> 11:32.170] That's why it's so important that we localize. [11:32.310 --> 11:47.570] So as we start to understand the problems we're solving, both from a business enablement perspective and from a technology support perspective for our operations, that allows us to start to understand which tools in our toolbox we should apply, at what level of rigor, to support the business. [11:47.570 --> 11:48.610] Because that's what we're trying to do.
[11:48.610 --> 11:53.170] We're trying to make the CIO and their team win for their business. [11:53.170 --> 11:59.030] And as long as we keep that in a laser focus, we're going to be successful. [11:59.810 --> 12:00.950] All right. [12:00.950 --> 12:02.510] Sorry, that was a little long-winded. [12:02.510 --> 12:03.430] No, no, no. [12:03.490 --> 12:06.390] It's the right setup for the next question. [12:06.710 --> 12:13.470] What are the primary challenges in developing and maintaining a comprehensive enterprise architecture? [12:13.470 --> 12:34.030] In my experience, some of the areas that I've seen are stakeholder buy-in, legacy system complexity, tech debt, scalability, and governance versus agility, which are some of the areas that the panel members kind of touched on. [12:34.250 --> 12:52.550] If you look at what's going on right now, I kind of thought I would never go there, but AI is just disrupting so many layers of our technology, starting with machine learning, starting with gen AI. [12:52.650 --> 12:54.970] Now we are talking about agentic AI. [12:55.010 --> 13:00.790] That basically touches the data layer, the application layer, the security layer, the network. [13:00.790 --> 13:02.090] I mean, you name it. [13:02.090 --> 13:05.250] AI is kind of moving fast. [13:05.330 --> 13:09.370] And so that, again, is the governance versus agility. [13:09.370 --> 13:10.830] How do we adapt to that? [13:10.830 --> 13:14.250] So can you please share your thoughts on that? [13:14.250 --> 13:15.030] Sure. [13:16.110 --> 13:17.670] There's a lot of challenges. [13:17.670 --> 13:22.050] Enterprise architecture isn't easy at all. [13:22.050 --> 13:43.530] One of the big challenges I've certainly seen in the companies that I've worked for, from an enterprise architecture perspective, is you end up either being tied too tightly to the business, or sometimes tied too tightly to technology, or tied too tightly to the frameworks, [13:43.530 --> 13:45.010] like TOGAF.
[13:45.030 --> 13:48.910] And really working between those three is very challenging. [13:49.030 --> 14:00.530] If you're not understanding, A, where the executives and the business are, you're not going to be putting in the correct EA to guide the organization. [14:00.590 --> 14:14.810] But at the same time, if you're so far removed from the day-to-day delivery and the challenges of your users of the EA, it's not going to work, right? [14:14.810 --> 14:25.670] You're going to put all these standards and processes and other things in place, and it's not going to add value, because people are just going to avoid them. [14:25.710 --> 14:38.230] So you really need to have those relationships, both at the business side, at the team side, and, since we are at a security conference, certainly at the security side. [14:38.230 --> 14:44.290] Just to add to what Jason said, Gokul also mentioned a lot of things, right? [14:44.290 --> 14:56.670] But there are very few common points you can find, because in every organization it depends on the culture, or some level of communication gap between the executives and the business and the technology team on the ground, right? [14:56.670 --> 15:02.650] So if you look at the rapid pace of innovation, these days you see AI and data initiatives going on everywhere. [15:02.650 --> 15:13.350] That's the biggest challenge, because the organization is not enabled for, or does not have the capability to, even execute some of those strategic initiatives they're looking for, right? [15:13.350 --> 15:17.410] Like adopting AI, or what kind of model, what kind of security. [15:17.730 --> 15:22.370] This has exposed a lot of organizations to areas where they can face challenges. [15:22.370 --> 15:27.930] They don't know which way to go, they don't know what path to follow, what to adopt, and what tools to adopt, right?
[15:27.930 --> 15:39.470] So the newer-generation challenge I can add, apart from what Gokul and Jason mentioned, is that AI and data have become really the biggest challenge and focus area for the executives as well as the technology team on the ground. [15:39.670 --> 15:49.030] The other thing in terms of enterprise architecture is that enterprise architecture, as I mentioned, is always looked at as a solution with a particular focus area, which is wrong, right? [15:49.030 --> 15:58.570] The enterprise architecture team goes across, looking at the business and technology, driving the strategic innovations and business outcomes that align to the vision of the organization, right? [15:58.570 --> 16:00.130] Whether it's adopting or not, right? [16:00.130 --> 16:01.910] That's a confusion point, right? [16:01.910 --> 16:04.050] And that's what I call the skill gap. [16:04.050 --> 16:08.950] Whether it's AI skills or enterprise architecture skills, that's the skill gap. [16:08.950 --> 16:14.950] And the organization has to be rapidly evolving, iterating on the strategic plans they have, right? [16:14.950 --> 16:17.390] It cannot be just one solution fits all. [16:17.390 --> 16:29.110] It has to be customized and able to be adapted all the time for the needs that are evolving, throughout the innovation period or throughout the range of challenges on the path the organization is following. [16:30.550 --> 16:37.570] The thing I would add is there's a big cultural challenge we run into when we try to... you know, by the way, I actually didn't introduce myself. [16:37.570 --> 16:38.610] I'm Grant Ecker. [16:38.730 --> 16:47.850] I'm the global chief architect at Ecolab, and was previously a chief architect at Walgreens Boots Alliance, at Danaher, as well as at Medtronic. [16:47.890 --> 16:51.670] And I helped to build this network that we're all a part of. [16:51.930 --> 16:56.410] What I would say is one of the key challenges is a cultural challenge of trust.
[16:56.930 --> 17:03.850] I feel like the biggest challenge you run into is: why should I listen to you, and what value are you going to add? [17:03.890 --> 17:20.870] And if we start with creating foundational value for our colleagues in IT, and we get their trust, and we understand what challenges they're going after, then we can actually represent them in the business conversations we have, and bring them things that resonate for them earlier. [17:20.870 --> 17:24.610] Like, hey, this is an effort I just heard about that's going to touch your area. [17:24.790 --> 17:27.010] Who can I get engaged from your team? [17:27.090 --> 17:27.550] Right? [17:27.570 --> 17:36.290] And then I think the other piece we can really do to address the risk to trust is not trying to be the know-it-alls. [17:36.430 --> 17:39.730] Like, somebody else spends 40 hours a week on cloud. [17:39.750 --> 17:42.390] Why on earth would I think I know more about cloud than they do? [17:42.390 --> 17:43.510] That's insane. [17:43.510 --> 17:49.610] Like, let's get that person in the room and have them own the recommendation for their piece of the puzzle. [17:49.610 --> 17:58.350] And I should focus on how that piece connects to the pieces next to it, or anything that's obviously wrong, which is like 2% of the time, right? [17:58.350 --> 18:00.870] And if I can do that, then I can build trust. [18:00.870 --> 18:05.030] And with trust, all of a sudden, all the other doors start to get opened. [18:05.030 --> 18:12.930] But I can maintain that trust by also staying focused on what actually matters to the organization, and that includes IT and business focus areas. [18:12.930 --> 18:23.450] Because if I get distracted on some detail I care about that no one else cares about, I'm really wasting that trust, and I'm starting to actually burn the capital that I've worked so hard to build. [18:23.970 --> 18:25.190] Now, here's the thing. [18:25.190 --> 18:26.650] Gokul is an expert in this too.
[18:26.650 --> 18:28.430] So you can't just be a moderator, my friend. [18:28.430 --> 18:29.450] What do you think? [18:30.910 --> 18:51.030] Yeah, I think, so that actually leads to the next question I was going to ask, which is: how does the EA team collaborate with other departments? Which is, you know, you're touching on the collaboration and the culture, and making sure there's buy-in, there's seamless integration. [18:51.030 --> 19:04.730] I'm not talking about technical integration, but organizational integration and, you know, how we can approach enterprise architecture literacy. [19:05.090 --> 19:17.410] Do people actually understand, at every level, what enterprise architecture is, and what do we need to do in order to gain their confidence, their trust, so we can collaborate and work together? [19:17.410 --> 19:21.710] We have to make sure that we drive the EA literacy. [19:21.710 --> 19:24.850] So how have you done that? [19:24.850 --> 19:26.850] How have you driven collaboration? [19:26.850 --> 19:32.910] How have you driven EA literacy in your organization? [19:33.390 --> 19:34.910] Who wants to go first? [19:35.150 --> 19:36.050] Sure. [19:37.390 --> 19:41.310] So a lot of that starts with the why, right? [19:41.310 --> 19:52.030] In order for the different organizations to embrace you and understand you, you have to be able to communicate the why, like, what's the value add? [19:52.130 --> 19:56.810] And have multiple communication mechanisms for the why, right? [19:56.810 --> 19:59.750] You're going to have to have stakeholder meetings. [19:59.750 --> 20:03.310] You're going to have to have, you know, marketing material. [20:03.310 --> 20:04.990] You're going to have to have standards. [20:04.990 --> 20:10.650] You really have to tell everybody why this is in place, not just here are the rules. [20:10.650 --> 20:16.430] Because if you just have the rules, people are only going to half-heartedly follow them.
[20:16.590 --> 20:18.310] So I think that's critical. [20:18.330 --> 20:28.590] The second thing, and I've seen this work very well in different departments, is really having functional or business alignment. [20:28.730 --> 20:33.170] So, you know, in my world, I've spent a lot of time in financial services. [20:33.170 --> 20:38.610] So let's say you have an enterprise architect that's aligned to prepaid cards. [20:38.610 --> 20:42.350] You have an enterprise architect that's aligned to payments. [20:42.350 --> 20:49.610] You know, you build those different areas where you've got alignment, including security. [20:49.610 --> 20:50.950] Can't forget about security. [20:50.950 --> 21:00.810] I think security is really going to grow as an enterprise architecture need as we move into AI, probably more than anything else. [21:02.210 --> 21:03.170] Sure. [21:04.510 --> 21:11.550] Just to add to what Jason said, in terms of security, right, there are other practices as well, right? [21:11.550 --> 21:14.110] A cloud expertise group, right? [21:14.110 --> 21:15.350] A Center of Excellence. [21:15.690 --> 21:23.170] So empowering your teams, in terms of, like, federating some responsibility back to the individual teams and the business lines, is very important. [21:23.170 --> 21:25.630] That's what we call the shared responsibility model. [21:25.830 --> 21:34.010] EA comes into the picture not just as the gatekeeper, but actually helping and enabling them to proceed forward and make progress. [21:34.010 --> 21:43.870] To bring that value back, right, to the business, and even to the individual team who is building and working on the ground, you have to make sure they're collaborating together, right? [21:43.870 --> 21:47.050] And that's how you build a shared responsibility model.
[21:47.050 --> 22:02.170] I call it flexibility in the framework, right, where you're giving them enough power to actually do the right level of things by themselves, like building it, whether it's data, AI, or security, but there is representation from each team back to the EA group, [22:02.170 --> 22:03.250] like a Center of Excellence, right? [22:03.250 --> 22:07.110] You have a central team who is taking care of the responsibility for all of that, right? [22:07.110 --> 22:22.750] So that federated, flexible model does drive collaboration further, where your teams can make progress, and you don't have to communicate just in terms of documentation; it's actually a living practice, practiced every day. [22:22.770 --> 22:33.950] So flexibility in the framework, that's what I've implemented with many customers, where people on the ground see it as empowering, rather than your enterprise architecture being kind of a gatekeeper and a bottleneck in the process, right? [22:33.950 --> 22:35.250] So they start adopting it. [22:35.250 --> 22:43.130] That's how we drive it and communicate and collaborate across the EA group as well as the individual business application teams. [22:43.510 --> 22:49.830] I find that one of the most important things, building on what my colleagues have just shared, is creating community. [22:50.070 --> 22:51.430] Like, look at CypherCon. [22:51.430 --> 22:53.370] That's one of the best things about this place. [22:53.370 --> 22:57.010] I don't know about you, but for me, some of the best moments have happened in the hallways here, right? [22:57.010 --> 22:59.370] I think that's what's truly special. [22:59.370 --> 23:01.730] We need to create those hallways in our organizations. [23:02.050 --> 23:04.770] By default, teams will stay in their silos. [23:05.010 --> 23:11.510] What we want to do is create a sense of belonging and a sense of sharing knowledge, an expert hotline.
[23:11.790 --> 23:20.750] So what I like to do is create kind of a place where we have quarterly conversations as a large group, and then we find our interest areas. [23:20.750 --> 23:27.130] There might be some interested in data, there might be some interested in cloud, there might be some interested in business architecture, you name it, right? [23:27.370 --> 23:31.530] Anything that has enough critical mass to have a conversation, and then let's name a leader. [23:31.530 --> 23:32.750] They don't even have to be in my team. [23:32.750 --> 23:33.610] It doesn't matter. [23:33.610 --> 23:37.310] What matters is the conversation happens and we create that connection. [23:37.590 --> 23:42.830] Because ideally, we're going to create the connections for people to come together when a project comes in. [23:42.830 --> 23:46.690] It needs a little security, needs a little cloud, needs a little data, needs a little network, needs whatever, right? [23:46.690 --> 23:48.770] We're going to bring those teams together in that way. [23:48.770 --> 24:02.310] But I think, man, if we can bring those people together in a way where they're sharing knowledge and best practices, and they actually kind of feel like they're part of the same thing, then that's like, oh cool, I get to work with Steve and Mary again, [24:02.310 --> 24:02.710] right? [24:02.710 --> 24:07.190] And by the way, next time I have a problem in that space, I'm not going to wait for a project. [24:07.190 --> 24:09.990] I'm just going to write Steve, because I know him. [24:10.090 --> 24:17.090] So to me, that's one of the big areas where we can create that collaborative culture in our organizations. [24:17.490 --> 24:21.770] And now the thing that they can identify with is: oh, I'm part of that architecture community. [24:22.350 --> 24:23.730] And they're proud of it. [24:23.730 --> 24:26.350] And to me, when that happens, that's kind of the magic.
[24:27.970 --> 24:31.410] Yeah, I'll kind of reflect on some of my experiences. [24:31.450 --> 24:42.630] I remember when I first joined McDonald's, there was no information governance policy. [24:43.610 --> 24:47.490] Now, if you noticed, I didn't say data governance policy. [24:47.490 --> 24:50.430] I said information governance policy. [24:50.530 --> 24:59.990] And information has many components; the first of the three key components of information is where the information gets created. [24:59.990 --> 25:04.650] That's where a lot of the data activities happen. [25:04.650 --> 25:11.430] Second is where sense is made out of the data. [25:11.650 --> 25:14.790] Meaning, how is it used? [25:14.990 --> 25:16.790] How is it put together? [25:16.790 --> 25:18.550] And the third is the usage. [25:18.550 --> 25:24.490] How does the business then use that information to make business decisions? [25:24.490 --> 25:30.830] And if you kind of look at that, every department had their own policy. [25:30.950 --> 25:44.810] And, you know, I remember as a new employee, I had to take like eight courses to, quote unquote, understand the various components of the information governance policy at McDonald's. [25:44.810 --> 25:47.910] And so I basically said, this is nuts. [25:47.910 --> 25:52.190] Every employee goes through this, and they have to redo this every quarter. [25:52.270 --> 26:02.750] We need to get everyone together, make this easy for people to understand, and have one information governance policy. [26:02.750 --> 26:17.230] So we'll have one training where people actually understand how security is connected to privacy, connected to the analytics that we do, connected to the data that we collect, et cetera, et cetera. [26:17.470 --> 26:22.430] And that's the kind of, you know, challenge you see: lack of collaboration.
[26:22.430 --> 26:32.830] So I got all the different stakeholders together, and we worked through it; it took us some time to do that, and we created the information governance policy. [26:32.830 --> 26:35.950] And we had a quarterly meeting for any changes we needed to make, et cetera. [26:35.950 --> 26:46.670] So now they don't see us as, oh my God, the network people are just waiting there, or the security people are waiting there, to block my application, or block my analytics, or whatever it is. [26:46.670 --> 26:57.050] Now they know that they're all working together, and they actually are very, very open to engaging this team well ahead of time. [26:58.070 --> 27:09.190] So coming back to AI: how have you adapted the enterprise architecture to the rapid rise of AI and Gen-AI initiatives and usage? [27:10.210 --> 27:11.790] So I'll go first. [27:13.090 --> 27:16.630] This has been the case with every customer I talk to every day now, right? [27:16.630 --> 27:24.850] It's like, hey, Gen-AI, we are trying to experiment, we are trying out some AI agent or something, some model we are trying to experiment with. [27:24.850 --> 27:31.150] The core foundation that's missing, I found out, is the right level of data governance practices in place. [27:31.150 --> 27:34.030] They don't have the right level of data literacy. [27:34.790 --> 27:45.790] One thing that Gokul mentioned is having that common understanding of what the data looks like, and how it's going to work in the larger picture, right, in terms of the vision we are trying to build. [27:46.010 --> 27:57.090] And there is a security concern also in terms of that, right: whether it's ethical, whether we are sharing, or are we trying to build something that might turn into compliance issues with our legal team. [27:57.090 --> 28:05.590] So that lack of understanding of the data is actually driving a lot of these enterprise architecture roles into trouble, I'll say, right?
[28:05.590 --> 28:12.670] I call it trouble because compliance is always looked at as a hurdle in the process, right? [28:12.670 --> 28:14.930] So it is really taking longer. [28:14.930 --> 28:20.430] And in order to compete in the market, you have to actually get these tools out of the door very quickly. [28:20.430 --> 28:21.770] You want to experiment. [28:21.770 --> 28:29.710] So what we call it is localizing the experiment, bringing the insights on top of that, and building that experience into your larger use cases, right? [28:29.970 --> 28:39.310] Like building in experimentation, like hackathons, and empowering the team so they understand the role of the data, and how they can leverage it and pass it on to the other teams, right? [28:39.390 --> 28:42.250] As I mentioned earlier, interoperability and a federated model. [28:42.250 --> 28:44.150] EA still has visibility into that, right? [28:44.150 --> 28:48.470] They still have the control, and they're still putting the right level of governance in the picture. [28:48.510 --> 29:01.250] But in order to empower the team, you have to go with a process where you're actually using that data and bringing the insights to leadership to get backing and support for those initiatives, while you're building something that makes sense for the business [29:01.250 --> 29:03.950] as well as for your technology teams out of the door. [29:04.490 --> 29:06.670] I'll build on what Sandesh is saying. [29:06.670 --> 29:07.830] He's spot on. [29:08.630 --> 29:10.770] I guess I agree with everything he said. [29:10.770 --> 29:13.430] So rather than just repeat it, I'll talk about a different dimension. [29:13.510 --> 29:17.650] So when AI was new, we had a very different role than we have today.
[29:17.650 --> 29:25.070] Today, there's probably an AI team, there's probably an AI center of excellence, there's probably those folks that we can go to, just like we go to the cloud team. [29:25.850 --> 29:31.950] When you go back maybe about 12 months, when AI was new, there was no team like that. [29:31.950 --> 29:36.470] And this is where architecture plays such a critical role in incubating innovation, right? [29:36.470 --> 29:42.530] We're going to help to figure out, okay, that's a shiny toy, but how does that connect to the business strategy or to our IT goals? [29:42.530 --> 29:48.650] We keep the groups focused: not just that everyone else is doing AI, so we need to do AI. [29:48.650 --> 29:55.230] No, let's do AI in a way that actually achieves our goals, and make sure that AI is the right tool for the job as well. [29:55.450 --> 30:06.610] And then it's about sort of incubating the right ways to do it, which is creating the governance councils, the AI governance group that might be a couple clicks down from your CEO, but we need legal, right? [30:06.610 --> 30:13.430] We need HR, we need compliance, we need, gosh, a whole set of groups, right? [30:13.430 --> 30:16.490] Contract review, data and privacy, right? [30:16.710 --> 30:18.150] Security, absolutely. [30:18.150 --> 30:19.110] Thank you, sir. [30:19.390 --> 30:21.770] Security is huge, because that's a big deal. [30:21.770 --> 30:32.810] You know, if a third party is making... Miro came out with a thing back when this was all new where, if you read the terms, they were getting the right to actually train a knowledge graph on your use of their AI. [30:33.130 --> 30:34.670] Absolutely not acceptable. [30:34.670 --> 30:47.630] I don't want my knowledge workers inside Miro using this tool to be getting a parlor trick of a summarization that then makes all of that proprietary work we're doing part of the knowledge graph of the greater world, right?
[30:47.630 --> 30:51.610] Now, it feels fine when you're just a tool user who's like, oh, look, an AI button. [30:51.610 --> 30:54.790] I'm going to click it and click through the user agreement that I didn't even read. [30:55.050 --> 31:07.990] But that's part of our role: to help the organization safely move into this space with the right controls, with the right view on strategy, with the right types of concerns that we need to be looking at. [31:07.990 --> 31:18.830] So that's what I've been excited to watch: that evolution from innovation into what Sandesh has shared, which is really stepping into how we now execute, just like any other domain. [31:19.730 --> 31:27.310] So I'm going to go back to the question: how have you adapted EA to the rapid rise of AI, gen AI initiatives and usage? [31:27.590 --> 31:38.730] And if you look at AI right now, in the audience, how many people could raise their hand and tell me that you know how many models are in your company? [31:40.230 --> 31:41.870] Great question. [31:41.910 --> 31:44.420] Does anyone know how many models are in their company? [31:45.090 --> 31:48.340] Do they know how many duplicate models are in their company? [31:51.020 --> 31:53.960] As long as it's supported by the tool they're using. [31:53.960 --> 32:04.280] So when you look at governance and EA, models are springing up like rapid applications, all over the place. [32:04.280 --> 32:21.140] And now you have a model that's maybe providing an answer when somebody calls up the help desk, and then you have a different model that's maybe on the UI, and then you look at risk and standardization: how many different models do you have out there? [32:21.140 --> 32:35.520] So you kind of look at data governance along with these models, and there's just a huge need for AI, or model, governance, just like you had data governance.
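The model sprawl the panel describes is, at its core, an inventory problem, and the first step of model governance is usually just a registry you can query. A minimal sketch in Python; the record fields and the duplicate rule here are illustrative assumptions, not any specific product's schema:

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass(frozen=True)
class ModelRecord:
    name: str        # internal name of the deployment (hypothetical field)
    owner: str       # team accountable for the model
    base_model: str  # underlying model or model family it is built on
    use_case: str    # what it serves: help desk answers, UI search, ...


def find_duplicates(inventory):
    """Flag deployments serving the same use case on the same base model,
    the 'duplicate models' nobody can count without a registry."""
    groups = defaultdict(list)
    for rec in inventory:
        groups[(rec.base_model, rec.use_case)].append(rec.name)
    return {key: names for key, names in groups.items() if len(names) > 1}
```

The same idea sits behind full-blown model registries; the governance value is simply that every model, including vendor-embedded ones, has a record that can be reviewed for risk and standardization.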
[32:35.720 --> 32:54.900] In the world that I'm in right now, they are very, very cautious on models, and EA spends an awful lot of time going through anything that is AI and putting all kinds of, I would say, standards around it, making sure it's captured, making sure it's reviewed. [32:54.940 --> 33:03.140] We're working a lot, not just inside the company, but with all of our vendors, to really understand how many models there are. [33:03.140 --> 33:21.180] And at first I thought there weren't really going to be that many models, but as you're working with vendors, the number of models that a company now has, and we're just at the beginning of the AI, I would say, launch, is just incredible. [33:23.640 --> 33:26.820] I would add one more twist to it. [33:26.820 --> 33:36.500] How do you know that you have successfully managed AI as part of enterprise architecture? [33:38.380 --> 33:52.400] And to me, if there is no shadow AI in your company, then you have successfully managed the enterprise architecture role in managing AI. [33:52.400 --> 34:00.320] Right now, that is one of the biggest challenges companies are facing: the whole area of shadow AI. [34:00.320 --> 34:08.020] Shadow AI is sometimes so dark that no one in the company knows it's happening, which is a little bit scary. [34:08.280 --> 34:24.540] But I think one of the dimensions that AI has brought in is that rapid evolution, and that rapid evolution shows that companies are using AI to create new business value, which is a good thing. [34:24.540 --> 34:34.280] The agility with which a board might be asking the CEO to adopt AI is what we need to manage. [34:36.200 --> 34:40.480] I asked one of the board members, to get an understanding. [34:40.480 --> 34:44.320] I was talking about literacy, AI literacy. [34:44.660 --> 34:47.120] What do you think AI is? [34:47.120 --> 34:50.120] And the answer I got was, ChatGPT.
[34:50.240 --> 35:00.240] And I went and asked a few other C-level people in the company, and the answer was not too far from ChatGPT. [35:00.240 --> 35:20.740] And that's where the literacy aspect comes in, whether it's in AI, whether it's in EA, any of the areas that somehow we expect business leaders to understand: we need to take the initiative to bring the literacy to the senior leadership in their language, [35:20.740 --> 35:22.740] as to what is AI? [35:22.740 --> 35:24.820] What can it do for our company? [35:25.200 --> 35:27.160] And how should we approach it? [35:27.160 --> 35:28.940] And what is enterprise architecture? [35:28.940 --> 35:34.540] And how will it help us implement AI solutions better and successfully? [35:34.540 --> 35:36.760] That's what I think our role is. [35:39.460 --> 35:46.480] Moving on to governance, what role does data governance play in EA? [35:46.580 --> 35:50.620] And how can organizations implement effective data management practices? [35:50.620 --> 35:51.800] By the way, time check? [35:52.500 --> 35:53.940] It's about a half hour past. [35:53.940 --> 35:55.460] We've got another 15 or so. [35:55.560 --> 35:57.860] So, going to data governance. [35:58.100 --> 35:58.760] Oh, I'll start us. [35:58.760 --> 35:59.660] Oh, do you want to start? [35:59.860 --> 36:00.460] Go ahead. [36:00.460 --> 36:08.180] So, for data governance, I think it's really important that we understand the significance. [36:08.720 --> 36:13.940] So, there's a lot of things that we're going to do across the organization, [36:13.940 --> 36:15.960] data just being one of those domains. [36:15.960 --> 36:25.180] We are really the detectors of things that are potentially either risky or important for our strategic posture.
[36:25.180 --> 36:41.280] So, I think one of our primary roles in that space is identifying what needs to be governed, if it isn't already, and which scopes coming across the bow need to be engaged, so that we don't allow them to move forward without the proper controls. [36:41.280 --> 36:56.280] It might feel like not that big of a deal in the moment, but when you look at the additive nature of projects that should go through data governance but don't, it basically creates an ecosystem where you no longer have trusted assets. [36:56.280 --> 36:57.480] That's the bottom line. [36:57.480 --> 37:10.440] Data governance allows you to have trusted data assets, and trusted data assets are the foundations for things like AI, for being able to monetize your strategy, for being able to create digital offerings. [37:10.440 --> 37:11.900] It's all based on data. [37:11.900 --> 37:20.640] So, data governance is: are we doing the things that allow us to create that data with quality, and to ensure that it's compliant, that it's accurate, etc.? [37:20.640 --> 37:28.760] So, that's one of our roles: really just setting that level of significance and ensuring we have alignment. [37:28.760 --> 37:30.380] Jason, sorry to jump in. [37:30.380 --> 37:32.380] Yeah, no, those were great points, Grant. [37:32.840 --> 37:46.600] When you look at data and data governance and why it's so important, especially in the AI era, one of the analogies I really like is: imagine walking into the largest library in the world, right? [37:46.600 --> 37:49.440] Every book that's ever been published is in there. [37:50.240 --> 37:57.160] But there are no sections, there are no catalogs, there are no numbers, but every book is in there. [37:57.580 --> 38:08.460] So, you have all the data, you have all the books, you know the book's in there, but how do you find it, right?
[38:08.460 --> 38:14.060] And data governance really is that process, especially with AI, right? [38:14.060 --> 38:24.380] From a library perspective, AI is that super great graphical computer system that's going to be able to get you your data and do all kinds of things with it. [38:24.400 --> 38:30.040] But if you can't find that book, it doesn't matter that it's in your library. [38:31.000 --> 38:47.060] The bigger the library is, the more data you have there, the harder it can sometimes be to find things, unless you've got, you know, data lineage, data catalogs, and all of the artifacts that go around data. [38:49.460 --> 38:53.400] This just brought up a recent experience with one of my customers, right? [38:53.400 --> 38:56.700] When I asked them, what is your definition of data governance? [38:57.240 --> 39:02.140] Their enterprise architecture group itself has a picture of, hey, how do we restrict access? [39:02.440 --> 39:04.360] That's just being a gatekeeper, right? [39:04.360 --> 39:10.880] How can we limit access to the right level, the right people, at the right time, right? [39:11.040 --> 39:13.300] That's just blocking everything, right? [39:13.300 --> 39:19.480] And that's not going to help as a foundation, like bringing the whole ecosystem to work together, right? [39:19.480 --> 39:22.680] So we have to help them understand, bring up the data literacy. [39:22.740 --> 39:25.640] When we talk about data literacy, it's not just data tagging. [39:25.640 --> 39:33.340] You have to create a heat map, or a map where you have capabilities mapped to the data, and there is the right level of interoperability across the group. [39:33.340 --> 39:42.480] And they do understand what their responsibilities are in terms of sharing that data with other business needs and, at a larger level, in terms of privacy and security.
[39:42.720 --> 39:48.640] So if you put all these links together, all we have to do is have the right level of data foundation in place, right? [39:48.640 --> 39:58.200] Whether you look for data governance in the enterprise architecture group, or you're passing that responsibility back to the individual teams with their siloed data. [39:58.340 --> 40:00.940] Definitely, keeping siloed data is not going to help you, right? [40:00.940 --> 40:07.760] You definitely want a picture where you build products that are innovative and work seamlessly across your channels and experiences. [40:07.980 --> 40:13.760] So we have to make sure the data is opened up, but with the right level of gates in place, the right level of governance, right? [40:13.780 --> 40:20.620] So it's not just blocking, and making sure, hey, who gets access, and making sure they should not be accessing the data. [40:20.620 --> 40:21.940] That's not the right way to do it. [40:21.940 --> 40:28.460] You definitely want to innovate on sharing the data across the teams, and you don't want to silo it into one place, right? [40:28.460 --> 40:41.740] So getting your definition and getting your data set in the right place, that's the first thing I'll make sure our enterprise architecture group, as well as all the stakeholders, top down and bottom up, understand and adopt over time. [40:44.020 --> 40:49.220] You know, just jokingly, I'll tell you about a real use case. [40:49.540 --> 41:01.400] Data governance is preventing anyone from executing select star dot star on the whole data warehouse. [41:01.840 --> 41:05.680] You know, that is the primary role of data governance. [41:05.680 --> 41:09.080] And this happened to me at McDonald's. [41:09.080 --> 41:19.140] I won't name the name, but some super duper data scientist from a management consulting firm came and basically wanted to run this query. [41:19.140 --> 41:20.080] And I said, nope.
[41:20.080 --> 41:24.620] And then he goes, you know, my partner is going to talk to your CEO. [41:24.620 --> 41:26.280] I said, please go ahead. [41:28.700 --> 41:33.400] And so I'm just giving an extreme example, right? [41:33.400 --> 41:53.480] But I think the role of governance is to really improve the usability of, accessibility of, and trust in all of the data that you have, for all your constituents that need data to run the business and change the business. [41:53.480 --> 41:55.780] Those are the two ways people use data. [41:56.400 --> 42:02.760] And change the business can go in many different directions, but those are the two areas. [42:03.200 --> 42:09.280] And that's where data governance is not a gatekeeper function, except for select star dot star. [42:09.740 --> 42:16.120] But you also need to manage the risk around privacy and other things. [42:16.120 --> 42:32.300] But if you have done the right level of collaboration and the right level of communication, these things get baked in at the solution design stage itself and do not become a gatekeeping function. [42:32.300 --> 42:38.500] I mean, you still need to check whether they did it or not, but it is done right in the beginning. [42:38.500 --> 42:54.960] And that is the success of EA, enterprise architecture: that people actually are thinking about doing those things because you have convinced them those are important for their applications, their databases, their analytics, their AI program. [42:56.210 --> 43:11.910] So can you provide an example of a successful EA initiative that has driven large business transformations and innovation and has created strategic business value? [43:12.590 --> 43:14.250] Sure, I'll go first. [43:14.250 --> 43:22.310] With a bunch of customers I've worked for, in my last company and this company as well, this is happening everywhere, right?
[43:22.310 --> 43:27.990] You see the innovation happening: data governance, adoption of new technology, a rapid pace of innovation. [43:28.230 --> 43:40.810] So one of the customers came up with, hey, we want a better time to market, because we want to compete in the competitive market; a lot of startups, a lot of other competitors are building these features that are cutting into our business, right? [43:40.810 --> 43:43.350] We are losing our competitive advantage in the market. [43:43.810 --> 43:48.310] And that's where we ended up implementing the EA framework for them, right? [43:48.330 --> 43:49.950] Because we wanted the right level of governance. [43:49.950 --> 43:53.330] You don't want to just start building things prolifically. [43:53.630 --> 44:01.410] You don't want to spoil everything and end up in a challenging situation where, rather than being productive, it becomes counterproductive, right? [44:01.410 --> 44:12.070] So we had to establish an EA governance board and the right level of practices, like adopting the right level of data governance and EA practices early in the process, right? [44:12.070 --> 44:22.130] And making sure of the thing I mentioned about flexibility in the framework, so that the healthcare organization was able to come up with an AI strategy and start experimenting, right? [44:22.130 --> 44:26.050] And that improved their time to market, because they started reusing a lot of stuff. [44:26.050 --> 44:33.110] They started adopting the framework, which actually helped them steer and leap forward in terms of how quickly they build an app, right? [44:33.110 --> 44:38.390] How quickly their developer ecosystem can use the AI tools to their advantage, right? [44:38.530 --> 44:46.450] At the same time, the EA team and other teams were using AI tools to their advantage to actually accelerate their part of the process, right? [44:46.450 --> 44:49.770] Probably looking at documentation, looking at the compliance rules, right?
[44:49.770 --> 45:04.950] Feeding those into their data model, and making sure the custom agent we built for them is actually spitting out results and giving you insights very quickly, rather than spending hours of discussion and coming back with a lot of to and fro between teams, [45:04.950 --> 45:05.570] right? [45:05.570 --> 45:19.490] So getting involved early in the process, adopting the right level of frameworks, and working together on a collaborative plan helped the customer actually come up with AI agents for the healthcare business, telemedicine, very quickly. [45:19.490 --> 45:26.550] Now 90% of their support team and their customer care is driven by the AI agent. [45:26.590 --> 45:30.710] And the business started seeing the value, and they started adopting more and more, right? [45:30.710 --> 45:52.790] It's not just AI; if you look at the initiatives, they combine a lot of enterprise architecture practices: working together with different groups and stakeholders, having the right level of governance and principles, while giving the right level of liberty to the teams who are experimenting with the tools and coming up with a strategy or experimentation that helps with the larger initiative within the organization. [45:53.190 --> 45:55.690] That's one of the use cases, right? [45:55.690 --> 46:00.870] There are other use cases I've seen in the financial world, a lot around fraud and security, right? [46:00.870 --> 46:05.830] And in order to prevent fraud, a lot of companies have already adopted some kind of model, right? [46:05.830 --> 46:09.110] They already have credit card fraud detection early in the process.
[46:09.350 --> 46:24.270] One of my customers wanted to implement AI on top of that; some part of it was still manual. We call it digital labor, where we actually started converting people who were reaching into their chatbot or their customer care, driving them back to a self-service model, [46:24.270 --> 46:24.390] right? [46:24.390 --> 46:33.810] I can go into my app, see the notification, and then start defining things my way: hey, I'm traveling, that's my legitimate expense, right? [46:33.830 --> 46:41.950] And if something goes beyond a certain extent, then I'm trying to prevent it: the criteria of what I typically spend in a particular area or a particular category, like food. [46:41.950 --> 46:45.910] You might have seen that happening today, but a lot of that is coming back to AI. [46:45.910 --> 46:50.310] Now the adoption of these AI tools is driving that in a quicker, faster, and better way. [46:50.310 --> 46:54.350] You see a lot of tools that are actually coming back in apps, banking apps, right? [46:54.350 --> 46:58.350] They come back with prompts saying, hey, you just made a transaction, right? [46:58.350 --> 47:02.590] But now they are driving a lot of insights back rather than sending a simple notification. [47:02.790 --> 47:12.350] So that's the market change I have seen in the customer, where they started spending less while improving the customer experience and getting a better share of the market in the process. [47:13.110 --> 47:17.990] Yeah, I'll go with two areas where I've seen it be very successful. [47:17.990 --> 47:25.310] The first was the product or functional alignment of EA across the organization, [47:25.530 --> 47:38.470] when we were able to take an EA group and say, okay, you're aligned to this area, you're aligned to this area, and there was full buy-in from the business stakeholders and the executives.
[47:39.310 --> 47:44.570] And one thing I want to touch on is, right, everybody's coming from a different angle or a different organization. [47:44.570 --> 47:48.970] At my previous company, we had 60,000 technologists, right? [47:49.130 --> 47:52.030] That's a massive organization, right? [47:52.030 --> 48:00.070] And EA for that organization is probably going to look a little different than EA for an 800-person technology shop. [48:00.070 --> 48:02.850] And those are just different viewpoints. [48:02.990 --> 48:11.090] I think the second one that I've seen that was very successful was a cloud-first standard for driving innovation. [48:11.090 --> 48:14.910] And, you know, everybody's like, oh, cloud's old, but it's cloud-first. [48:14.910 --> 48:29.070] But what was great was that the EA team actually provided a lot of the plumbing that enabled the teams to go onto the cloud. [48:29.070 --> 48:34.910] It wasn't just, hey, you go cloud-first, go figure it out, right? [48:35.270 --> 48:37.070] EA's about standards. [48:37.210 --> 48:40.630] They created a lot of the tooling for the teams. [48:41.170 --> 48:52.590] So the organization was able to move very quickly on a lot of new initiatives, getting to almost 100% cloud in a very short period of time, in a very, very large organization. [48:53.250 --> 48:59.470] As both Sandesh and Jason have talked about, a lot of the places where EA shines is transformation, right? [48:59.470 --> 49:02.810] We're talking about how we change how we go to market, how we do business, right? [49:02.810 --> 49:06.870] We're looking at how we change how we materially deliver IT as a service. [49:07.190 --> 49:11.870] In fact, I would argue that if your organization isn't changing, you may not need architecture. [49:12.070 --> 49:16.070] I know that's crazy to say, but good luck finding an organization that isn't changing.
[49:16.350 --> 49:28.210] So the examples I've seen are where we changed our operating model at Medtronic to have self-service sales for smaller shops, and doing that digitally. [49:28.210 --> 49:39.650] Other examples were bringing together two companies, M&A, where we had two ERPs: consolidating them, creating a data platform. [49:39.650 --> 49:41.450] This stuff didn't just exist. [49:41.450 --> 49:46.290] We created the ingest, curate, consume platform. [49:46.290 --> 49:48.170] We called it the digital information core. [49:48.170 --> 49:51.730] That had to first be envisioned and then created. [49:51.730 --> 49:54.750] So it's a lot of these kinds of big paradigm shifts. [49:54.750 --> 49:57.410] That's where architecture really shines. [49:58.170 --> 50:02.030] Yeah, I kind of agree with the panel members. [50:02.450 --> 50:09.550] The best value creation through EA happens in a transformation project. [50:09.850 --> 50:13.490] The first one I'll talk about is in the oil and gas industry. [50:13.490 --> 50:22.330] We digitally transformed the upstream business of a North American gas unit of British Petroleum. [50:23.710 --> 50:31.630] Upstream means that's where they actually find and drill and acquire. [50:31.630 --> 50:39.750] And downstream, this is lingo in the oil and gas industry, downstream is how they refine it and then bring it to the users. [50:40.330 --> 50:57.710] That whole area was so old that they used to carry, you know, nine millimeter tapes holding data and maps and all kinds of information that would degrade over time. [50:57.710 --> 51:11.590] And so it was a digital transformation, because if you drill at the wrong location, it is about a 15 to 18 million dollar expense. [51:11.750 --> 51:16.890] So you need to be really, really good and accurate in drilling. [51:16.890 --> 51:19.130] So that's kind of one area. [51:19.130 --> 51:23.150] The second one is the digital transformation of McDonald's.
[51:23.170 --> 51:33.250] And we literally were able to digitize every aspect of the operations at a McDonald's store. [51:33.250 --> 51:44.870] You probably do not know, but to be able to reach a McDonald's store from a digital perspective, it's amazing. [51:45.010 --> 51:46.950] And I think they're on the third journey. [51:46.950 --> 52:00.790] So we mapped it out in three journeys, a three-level journey; they're in the third journey, where they have now switched on the AI functionality in the chips that already came with all the equipment in the store. [52:00.790 --> 52:04.390] So that's the next level of transformation that's going to happen. [52:05.290 --> 52:12.630] So now we'll move to the next phase: questions from the audience to the panel. [52:12.970 --> 52:15.210] Just yell it out, we'll repeat it back. [52:34.000 --> 52:36.220] I'll repeat the question for the recording. [52:36.280 --> 52:42.700] So, you mentioned standards; what have we done to help make developers successful with standards? [52:42.820 --> 52:44.700] Anyone want to take a first shot at that? [52:44.700 --> 52:46.900] I'm going to say something terrible. [52:48.940 --> 52:51.600] OKRs and dashboarding, right? [52:51.600 --> 53:03.900] If you have a large organization, you know, and I have a lot of experience, if you're trying to steer a large organization, unfortunately, dashboarding is huge. [53:03.900 --> 53:06.860] Now there are ways that you can do dashboarding, right? [53:06.860 --> 53:16.160] You know, you can give ownership, where the tech director is the one presenting the dashboard, and they're presenting it to all their peers. [53:16.500 --> 53:21.760] Or you can use a dashboard with, you know, a baseball bat or a hammer, right? [53:21.800 --> 53:28.620] But it does steer the ship from that perspective. [53:30.160 --> 53:34.720] That's one simple way if you're really trying to herd the cats.
[53:34.720 --> 53:39.360] But going back to something I said earlier: why? [53:40.300 --> 53:44.640] I'm one of those people that needs to know why. [53:44.640 --> 53:48.260] Like, I will spend so much time trying to figure out the why. [53:49.540 --> 53:53.440] Now, if my boss tells me to do something, I'll execute that. [53:53.440 --> 53:55.660] And he's like, you know, we don't have time for why. [53:55.900 --> 54:00.840] But in my mind, I'm always following up and wanting to know the why. [54:00.840 --> 54:08.980] I think if you can articulate the why across the organization, it'll be a little less difficult to herd the cats. [54:11.240 --> 54:14.280] I'll add some on the enablement side, right? [54:14.280 --> 54:21.100] Like, GitHub Copilot is one that we really think is pretty fascinating these days, right? [54:21.100 --> 54:23.980] Like evaluating it, getting that in, in the right way. [54:24.700 --> 54:28.240] And then there are so many automated testing tools, et cetera, right? [54:28.240 --> 54:35.960] So solution selection, and getting the environment created to allow developers to be effective, I think, is a really important role of architecture. [54:36.300 --> 54:40.020] And then also, this is going to sound obvious: listening. [54:40.380 --> 54:41.860] Like, they know. [54:41.860 --> 54:44.260] They're going to tell you, this isn't working for me. [54:44.260 --> 54:48.560] We need to pay attention, and we need to test that theory with three or four others. [54:48.600 --> 54:50.420] And they're probably right. [54:50.420 --> 54:52.260] So how do we help them solve it, right? [54:52.260 --> 55:06.800] When there's something that's slowing them up, the change control process is broken, whatever, we can be the change advocates who actually help the development community move forward when they're stuck in the administrative layers of the organization.
[55:06.840 --> 55:09.740] Sometimes we are that administrative layer if we're not careful, right? [55:09.740 --> 55:15.960] But if we do our jobs well, we're the wall-busters for the folks that absolutely need to move. [55:16.780 --> 55:24.120] Just to add to what my panel colleagues said here, one thing I would like to highlight is culture. [55:24.360 --> 55:25.780] It's the nature of people. [55:25.780 --> 55:26.920] They are going to resist. [55:26.920 --> 55:33.140] They're going to try to change or experiment with something, whether you want to restrict it or encourage it, right? [55:33.320 --> 55:35.220] So use that to our advantage, right? [55:35.220 --> 55:41.720] Typically, if you look at the typical enterprise architecture books, you find people, process, technology, right? [55:41.720 --> 55:44.340] The people factor is really difficult to change. [55:44.360 --> 55:46.700] That's the fact that we see everywhere. [55:46.800 --> 55:53.820] It's not just the culture; it's the person, how they perceive the process, whether it's an enabler or a blocker for them, right? [55:53.820 --> 55:55.700] Kind of another hurdle they have to clear. [55:55.800 --> 56:03.380] So hearing them out and getting engaged early in the process is very critical. [56:03.700 --> 56:05.040] And hear them out, right? [56:05.040 --> 56:09.740] Typically, I call it out like: establish an EA help desk, right? [56:09.740 --> 56:13.220] Gokul also mentioned that earlier in the questions, right? [56:13.300 --> 56:15.180] That's really encouraging for the team, right? [56:15.180 --> 56:18.520] Hear them out on what they're trying to experiment with, right? [56:18.520 --> 56:22.460] Probably give them the tools that encourage them and boost their productivity, right? [56:22.460 --> 56:24.020] They're trying to experiment. [56:24.140 --> 56:28.360] The one thing I see from the people's mindset is they want to learn.
[56:28.360 --> 56:29.700] They want to experiment. [56:29.700 --> 56:33.020] And everyone wants to upskill themselves in the new world. [56:33.020 --> 56:36.140] If you try to restrict that, it's not going to work out. [56:36.140 --> 56:42.860] Rather, encourage it in a framework or way that you can govern, with the right level of dashboarding and OKRs, [56:42.860 --> 56:47.400] and provide the right level of empowerment to those teams and developers. [56:47.520 --> 56:50.760] I call it experimentation: hackathons, whatever you do. [56:50.840 --> 56:53.300] Let them come up with the ideas. [56:53.300 --> 57:00.060] Give them visibility and some buy-in from your stakeholders to encourage and motivate them. [57:00.340 --> 57:05.720] Help them understand why we are providing these tools and why we are putting in the right level of governance. [57:05.740 --> 57:13.780] That way they are motivated to adopt what EA is coming up with, encouraged to adopt it over time, and to scale it down the line. [57:13.780 --> 57:18.600] Now, one last comment: just make sure there is some type of exception process. [57:18.720 --> 57:29.400] I've seen a project where there was a standard, and it was a "you shall," and a department spent, I want to say, $12 million, [57:29.620 --> 57:37.100] with no business value, no nothing, just because of a "you shall." [57:37.140 --> 57:40.500] It didn't provide a more robust system or anything. [57:41.040 --> 57:53.520] So, sometimes when you are trying to steer 50,000 cats, there isn't a great process, but definitely enable an exception path in some way if you can. [57:54.080 --> 57:58.160] I would quickly add two more of my observations. [57:58.160 --> 58:02.940] Number one, don't aim for perfection in following standards. [58:03.020 --> 58:08.280] There will be many instances where standards won't be followed.
[58:08.280 --> 58:11.640] There are very valid reasons why they won't follow the standards. [58:11.660 --> 58:13.600] So that would be one. [58:13.860 --> 58:20.460] The why is very important; if they understand why, a lot of people will be okay with that. [58:20.460 --> 58:26.680] Second is to provide best-practice templates that they start with, [58:26.680 --> 58:29.700] so they don't just start with their own. [58:29.860 --> 58:31.840] Provide them with best-practice templates. [58:31.840 --> 58:48.880] And I think AI-assisted code development has been a good practice, because if you train the model with some of your best practices, templates, code, et cetera, it does a good job of preserving the standard. [58:49.280 --> 59:01.160] And last one, from my own experience: I had one of the best programmers, and he was the least likely to follow standards. [59:01.380 --> 59:09.360] So I actually had someone else take his code, once he was done, and standardize it. [59:09.360 --> 59:13.300] Because he wrote some of the best code I've ever seen. [59:13.300 --> 59:14.380] So anyway. [59:15.420 --> 59:17.100] Please go ahead. [59:52.790 --> 59:54.410] Let me repeat the question. [59:54.550 --> 01:00:00.790] How do you help somebody collaborate effectively when they might not feel like their job is safe, [01:00:00.990 --> 01:00:06.930] and they might hold back in collaborating because they feel like that protects them from some of those threats? [01:00:06.930 --> 01:00:07.950] You were about to answer. [01:00:07.950 --> 01:00:08.590] Sure. [01:00:08.930 --> 01:00:09.710] Thanks for that. [01:00:09.710 --> 01:00:15.430] That's an interesting question, because if you heard most of the answers, [01:00:15.430 --> 01:00:18.750] I'm combining a little bit of all those points and thoughts together. [01:00:19.090 --> 01:00:19.850] That's everywhere.
[01:00:19.850 --> 01:00:21.390] Everyone has that challenge. [01:00:21.390 --> 01:00:24.330] I want to go after something new, do something better, [01:00:24.330 --> 01:00:26.550] something that helps me in my career or profession. [01:00:26.730 --> 01:00:28.350] That's what I call upskilling. [01:00:28.510 --> 01:00:30.830] Everyone's striving for that. [01:00:30.830 --> 01:00:32.810] I want to go and experiment, [01:00:32.810 --> 01:00:34.170] and I want to learn something. [01:00:34.470 --> 01:00:36.750] As I mentioned, a flexible framework [01:00:36.750 --> 01:00:38.450] with the right level of governance. [01:00:38.550 --> 01:00:45.690] They should understand the impact, and from leaders, top down, there should be encouragement to do the experimentation. [01:00:45.690 --> 01:00:46.850] Sometimes there are bright ideas. [01:00:46.850 --> 01:00:59.410] In my experience, I've seen dev teams and developers come up with a really creative idea that became a product for the organization, adding probably another 10% of revenue on top of their original product. [01:00:59.410 --> 01:01:05.530] So sometimes you have to do that product experimentation, and encourage it rather than blocking it. [01:01:05.530 --> 01:01:07.170] How do we do it? With the right level of governance. [01:01:07.170 --> 01:01:09.830] Hear them out: hey, come back. [01:01:09.830 --> 01:01:12.250] If you have some idea you want to experiment with and learn from, [01:01:12.310 --> 01:01:13.490] there is representation. [01:01:13.490 --> 01:01:16.150] I mentioned the model of COE and C4E. [01:01:16.290 --> 01:01:22.450] C4E is a more federated model where there is representation from each of the teams in certain areas, [01:01:22.450 --> 01:01:26.910] say from network, from security, from compliance and governance.
[01:01:26.950 --> 01:01:41.570] Take one person from the LBU who is motivated to learn something, give them a rotating role, and have them come back and present at the regular quarterly or monthly architecture review board, where they represent their idea: [01:01:41.750 --> 01:01:45.070] I'm looking at some new tools that might help us. [01:01:45.070 --> 01:01:48.650] They're motivated to learn because they see it as upskilling. [01:01:48.970 --> 01:01:51.110] Definitely hear them out. [01:01:51.110 --> 01:01:54.130] Adopt it experimentally and see whether it really works out. [01:01:54.130 --> 01:01:57.330] If it doesn't work out, pass back the message why. [01:01:57.330 --> 01:01:59.630] Let them understand why it doesn't work in our organization. [01:02:00.030 --> 01:02:09.610] If they experiment by themselves in a smaller, iterative approach and come up with something that brings business value, that's probably a good idea to go back and propose to the business. [01:02:09.610 --> 01:02:16.810] So EA can act as a motivator in upskilling, nurturing those talents, and making sure people understand and adopt it. [01:02:16.910 --> 01:02:18.730] And there are ways to do it: [01:02:18.730 --> 01:02:21.410] running hackathons, [01:02:21.410 --> 01:02:25.790] running the community, providing best-practice standards and guidance, getting them involved. [01:02:25.790 --> 01:02:33.590] They get a sense of belonging, and they get motivated for larger impact, larger initiatives. [01:02:33.590 --> 01:02:40.530] So that's how I drive the encouragement for people who are looking for something new, something better, in terms of the ecosystem.
[01:02:40.710 --> 01:02:43.190] I think we're running out of time, unfortunately. [01:02:43.730 --> 01:02:53.310] I do want to close with one thing, and first I want to thank Gokula, Sandesh, and Jason for coming down and being a part of building out this brand-new executive track here at CypherCon. [01:02:53.310 --> 01:02:56.130] And I have a fun announcement that I wanted to share with you. [01:02:56.210 --> 01:03:03.110] Sandesh is actually leading our cybersecurity group for the Chief Architect Network. [01:03:03.110 --> 01:03:13.890] We have about 100 executives in that group, and he's doing that in partnership with Monster and in partnership with Rameshwar Balangaguru out of Unify. [01:03:13.890 --> 01:03:15.330] He's in Dallas. [01:03:15.470 --> 01:03:17.330] We're global, so we're all over. [01:03:17.330 --> 01:03:21.130] But just to thank you for that, and let's actually continue to build this out. [01:03:21.130 --> 01:03:24.910] These are some CypherCon cufflinks, my friend, which is kind of fun. [01:03:24.910 --> 01:03:50.330] But I'm really excited about this, and this is something that, to use his hacker name, Monster and I are really bullish about: creating this community and connecting the executive audience with CypherCon, because I think it's going to get all of our executives to continue to bring people into this community, and provide an opportunity for the folks in the hallways as well to have more reasons to be here. [01:03:50.330 --> 01:03:56.790] So I really want to invest in CypherCon and in this group, and I think what we've created is absolutely amazing. [01:03:56.790 --> 01:04:02.110] So thank you, everyone listening and in the room, for being a part of what's an amazing place. [01:04:02.110 --> 01:04:03.870] Thanks to all the panel members. [01:04:04.330 --> 01:04:05.050] Thanks. [01:04:05.050 --> 01:04:09.850] I should keep meeting Grant every time, because I get some kind of goodies every time I meet him.