WEBVTT

1
00:00:40.650 --> 00:00:45.470
Jeremy Epstein: Morning and welcome to the CISE Distinguished Lecture Series. We'll be starting in a few minutes.

2
00:01:39.640 --> 00:01:44.370
Jeremy Epstein: morning and welcome to the CISE Distinguished Lecture Series. We'll be starting in a couple minutes.

3
00:03:03.640 --> 00:03:07.420
Jeremy Epstein: Good morning and welcome to the CISE lecture series. We'll start in one minute.

4
00:04:02.450 --> 00:04:07.080
Jeremy Epstein: Good morning and welcome to the CISE Distinguished Lecture Series. We'll get started now.

5
00:04:07.090 --> 00:04:36.940
Jeremy Epstein: My name is Jeremy Epstein. I lead the Secure and Trustworthy Cyberspace program at NSF, and it is my pleasure and honor to welcome Laurie Williams as our distinguished lecturer for today. Laurie is a Distinguished University Professor at North Carolina State and co-director of the NCSU Secure Computing Institute and the NCSU Science of Security Lablet. She's worked with industry leaders, including

6
00:04:36.950 --> 00:04:50.959
Jeremy Epstein: Cisco, IBM, Merck, Microsoft, Nortel, and Red Hat. Laurie's research focuses on software security; agile software development practices and processes,

7
00:04:50.970 --> 00:05:07.419
Jeremy Epstein: particularly continuous deployment and software reliability, software testing and analysis. Among her many honors, Laurie is an IEEE Fellow and an ACM Distinguished Scientist. Closer to our home at NSF, she's a CAREER award winner, and is the lead PI

8
00:05:07.430 --> 00:05:36.220
Jeremy Epstein: for the SaTC Frontier program Enabling a Secure and Trustworthy Software Supply Chain. Her involvement in education research activities has included her role as lead investigator on NSF grants to improve the retention of women and minorities in computer science, using various educational practices like agile software development and pair programming. Laurie received her PhD from the University of Utah, an MBA from Duke,

9
00:05:36.230 --> 00:05:59.659
Jeremy Epstein: and a BS from Lehigh. Besides raising three children, hiking, and running, she's a marathon runner, having completed four Disney marathons and the original Athens, Greece marathon, and numerous half marathons. Just hearing about that makes me tired. And with that, I'd like to thank Laurie for joining us today, and turn the floor over to you.

10
00:05:59.890 --> 00:06:02.239
Laurie Williams: Great. Thank you so much, Jeremy.

11
00:06:03.040 --> 00:06:19.989
Laurie Williams: So, to continue on, you know, my bio, and making me a human, and maybe making you feel more human yourself: um, just to let you know, life unfolds and gives you lots of surprises. And so, as Jeremy mentioned, I have three different degrees from three different places in three different things.

12
00:06:20.000 --> 00:06:24.829
Laurie Williams: I started out as an industrial engineer, thinking I'd be an industrial engineer forever,

13
00:06:25.030 --> 00:06:39.830
Laurie Williams: and then I went to Duke and got a business degree, and thought that would be my last degree ever, and walked out of that last exam thinking I'd never take another exam again. And lo and behold, I went back and got a PhD in computer science from the University of Utah.

14
00:06:40.170 --> 00:06:56.290
Laurie Williams: Um, professionally, I've just worked in two places. I worked at IBM and at NC State. Um, when I was at IBM, I walked into work one day, still thinking I would work for IBM until the day I retired, and circumstances that day

15
00:06:56.300 --> 00:07:01.539
Laurie Williams: caused me to completely change that, and to decide to take a buyout

16
00:07:01.590 --> 00:07:06.590
Laurie Williams: and go back and get my PhD in computer science to be a professor.

17
00:07:06.630 --> 00:07:15.330
Laurie Williams: And so it shows, you know, you never know what's going to happen. One day when you wake up in the morning your whole life could change, and in this case it was a beautiful change.

18
00:07:15.410 --> 00:07:22.399
Laurie Williams: I do have three children, as Jeremy said. Two of them live in San Francisco. On the left, my daughter:

19
00:07:22.410 --> 00:07:45.589
Laurie Williams: she's a flavor chemist, and my son works for Apple as a software developer, and my son on the right got his PhD in cell biology, and he lives, thankfully, here where I do. The other thing that I think may be interesting is a passion project that I have. I call it academic vibrancy. Just a belief that I have that academics,

20
00:07:45.900 --> 00:07:53.780
Laurie Williams: particularly new assistant professors trying to get tenure, and PhD students, go through a lot of unique pressures.

21
00:07:53.790 --> 00:08:13.639
Laurie Williams: And so I teach at NC State a course called academic vibrancy, really focusing on the holistic academic as a person who needs sleep and good nutrition and exercise and meditation and community. And just looking at yourself as a holistic person.

22
00:08:13.650 --> 00:08:17.280
Laurie Williams: And so I'm happy to talk to anyone about that.

23
00:08:18.080 --> 00:08:23.690
Laurie Williams: Now we can go into the technical aspect, so talking about supply chain security.

24
00:08:23.700 --> 00:08:41.429
Laurie Williams: Um, any software, you know, here: phones, cars, the power grid. Um, the majority of the software in any product is third-party, open source software. You know, a lot of studies say about eighty percent of all software comes from third-party or open source software.

25
00:08:41.440 --> 00:08:49.280
Laurie Williams: Only twenty percent is what a company will write, and that is the innovation and the competitive advantage.

26
00:08:49.290 --> 00:09:07.519
Laurie Williams: In recent years we've seen a lot of um, supply chain attacks. Sonatype says that there has been about an average seven hundred and forty percent increase in supply chain attacks over the last three years. So it's really a very popular attack vector of recent.

27
00:09:07.570 --> 00:09:37.250
Laurie Williams: So what are these attack vectors that are being used with supply chain attacks? One is one that's been around forever, as long as we've had reuse of open source software: that is really just where, you know, a software developer makes a mistake, oops, puts something in there. Log4j is an example of that. That was just, you know, an unfortunate mistake that was put in the code. People discovered it, and then attackers jumped on it and exploited it.

28
00:09:37.260 --> 00:09:39.190
Laurie Williams: It caused all kinds of havoc,

29
00:09:39.200 --> 00:09:43.719
Laurie Williams: and so that's been going on. You know that type of attack has been going on forever.

30
00:09:43.730 --> 00:09:47.589
Laurie Williams: Um! Some new attack vectors, though. One is

31
00:09:47.600 --> 00:10:16.330
Laurie Williams: similar to the last one, but it's with malicious intent. So the prior one is an accident; this is saying, intentionally, an attacker wants to put something malicious into a code base so that they can then exploit it. So these are code dependencies as an attack vector, and even code dependencies as a weapon. So, putting vulnerabilities into the open source system or ecosystem, so then people will download them and enable the attacks.

32
00:10:16.570 --> 00:10:33.710
Laurie Williams: An example of that is protestware. This happened last year: when Russia and Ukraine started the conflict, there was node-ipc, where someone, basically what they did was check an IP address. If you were from Russia, your

33
00:10:33.720 --> 00:10:52.870
Laurie Williams: files got deleted. So that's a case where, you know, software is downloaded and used as an attack vector. So this is newer. Um, another newer one is using the build infrastructure as an attack vector. So Codecov and SolarWinds are examples of that, where the
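
NOTE
As a hypothetical sketch (invented names, not the actual node-ipc code), the protestware pattern just described boils down to a dependency that checks where it is running and takes a destructive action for matching regions:
```python
# Hypothetical illustration of the protestware pattern described in the talk.
# The country lookup and payload shape are invented; real incidents resolved
# the machine's public IP address to a country before acting.
import shutil
def run_payload(country_code: str, target_dir: str) -> bool:
    # Destructive action only for machines resolved to the targeted regions.
    if country_code in {"RU", "BY"}:
        shutil.rmtree(target_dir, ignore_errors=True)  # delete files
        return True
    return False
```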

34
00:10:52.880 --> 00:11:04.089
Laurie Williams: um, build infrastructure was the attack vector. And then the last one, because no talk would be complete without mentioning ChatGPT, OpenAI, large language models, generative AI:

35
00:11:04.100 --> 00:11:14.430
Laurie Williams: um, what's the potential to use these as a new attack vector in the supply chain? So the last three being the growth that we've seen.

36
00:11:15.450 --> 00:11:28.390
Laurie Williams: And because of all that, back in May of twenty twenty-one, there was an Executive Order on Improving the Nation's Cybersecurity, with specifically, in Section 4, a focus on the security of the software supply chain. So

37
00:11:28.400 --> 00:11:32.109
Laurie Williams: a lot of focus on the supply chain of recent.

38
00:11:32.130 --> 00:11:33.250
Laurie Williams: So

39
00:11:33.570 --> 00:11:47.409
Laurie Williams: People, like, with all of the, you know, there's the executive order. People can't sell to the US government unless they comply with the executive order. But in general, executive order or not, what do we do about all of these things that are going on

40
00:11:47.420 --> 00:12:04.000
Laurie Williams: um, with supply chain? So, like, in industry there's a lot of confusion, questions about what should we do, and what is everyone else doing? And so what I'll be talking about today is a study really focused on trying to answer those questions.

41
00:12:05.470 --> 00:12:11.870
Laurie Williams: The prime study I'll be talking about is a study that I did in conjunction with Synopsys,

42
00:12:11.880 --> 00:12:39.069
Laurie Williams: um, an empirical study. And um, it's based on, if people have heard of Synopsys, it sort of um, started out with Cigital's Building Security In Maturity Model, BSIMM. Um, and BSIMM was started by Gary McGraw, Sammy Migues, and John Steven thirteen years ago, actually almost fourteen years ago. And what BSIMM was was a study that said, you know, what should, quote unquote,

43
00:12:39.080 --> 00:12:50.960
Laurie Williams: um, organizations do to be more secure? How should they build software so that the software is more secure? And what they started at that point was then interviewing companies

44
00:12:50.990 --> 00:13:06.109
Laurie Williams: to ask, you know, what are you doing? And if you were a company that got interviewed, you would get a report that said: this is what you do, and this is what everyone else does, and you can look at the discrepancy between what you do and what everyone else does.

45
00:13:06.120 --> 00:13:19.190
Laurie Williams: And then also they contributed to the community a report, the BSIMM report, an industry report. And in that case, as their sample size got bigger, they had, even by domain, what are people doing for

46
00:13:19.320 --> 00:13:38.349
Laurie Williams: software security. And so um, the study that I've been working on is inspired by that. And it's looking at, specifically: what should we do to reduce software supply chain risk, and what are people doing? And so with that in mind, over the

47
00:13:38.360 --> 00:13:50.439
Laurie Williams: last nine, ten months, um, I have conducted lots of interviews, forty-three interviews with seven different companies, asking: what do they do

48
00:13:50.450 --> 00:14:01.430
Laurie Williams: in relation to reducing supply chain risk? I will say that these seven companies are early adopter, progressive leader types of companies.

49
00:14:01.440 --> 00:14:20.719
Laurie Williams: And so I'm going to show you the results of what people are doing, but, you know, keeping in mind that this is what the leaders are doing. So again, forty-three interviews. I still have more to come. I was hoping by the time this lecture came around I would be one hundred percent done, but I'm not one hundred percent done.

50
00:14:20.730 --> 00:14:27.509
Laurie Williams: But I don't think that the results I'm going to show you will change appreciably. I think that they're pretty stable.

51
00:14:28.640 --> 00:14:37.910
Laurie Williams: The other, so that's, you know, in conjunction with this. Another activity that I have is, you know, Jeremy mentioned this Frontier grant,

52
00:14:54.300 --> 00:15:09.119
Laurie Williams: um, the Secure Software Supply Chain Center. And, you know, so this is a group of researchers, academic researchers, focused on supply chain, and as part of that we've conducted six supply chain summits,

53
00:15:09.150 --> 00:15:35.589
Laurie Williams: um, and a supply chain summit is bringing together about fifteen people at a time from different government and industry organizations to talk about the complexities of supply chain. And what I have on here are kind of the headlines of papers we've written that document the summits. And we're going to, actually, probably in the future the strategy will be to

54
00:15:35.600 --> 00:15:49.889
Laurie Williams: um, disseminate the findings of the summits on arXiv. But we did write some papers early. And in these summits we bring, you know, fifteen people together and talk about, you know, what

55
00:15:49.900 --> 00:16:09.079
Laurie Williams: challenges are people having? What approaches are they using? Um, I'd say about sixty people. There are some people who have come to more than one, so about sixty different people, um, all kinds of industries. We have run two in Silicon Valley, two on the east coast, two with government. And

56
00:16:09.270 --> 00:16:29.959
Laurie Williams: while the results from these summits are not in the numeric results that I'll show, um, in my mind I'm looking for consistency or lack of consistency between what I'm hearing at the summits and what I'm hearing in the interviews, and I'll say I'm hearing similar things. So um, it substantiates what happened in the interviews.

57
00:16:30.830 --> 00:16:54.139
Laurie Williams: One sentiment which I'll bring out, because I'll bring it out again in a moment: the sentiment is, so, there's this executive order, um, and companies that want to sell to the US government need to comply with things coming out of the executive order, and that executive order was in place for all six of these summits. But really the sentiment of the people in the summits

58
00:16:54.150 --> 00:17:09.650
Laurie Williams: has been that the executive order is forcing industry to adopt security practices that should have been adopted twenty years ago. We want to be more secure; we don't want to just comply. And so that's the spirit of the people who attend the summits, which is a beautiful thing to hear.

59
00:17:10.730 --> 00:17:27.869
Laurie Williams: Um, I do want to give a disclaimer that with all of this, these seven companies and sixty people who came to these summits, I can't tell you who they are. Um, the summits, it's Chatham House rules, so Chatham House rules say I can't really reveal to you who was there.

60
00:17:27.880 --> 00:17:47.329
Laurie Williams: And then, as far as the study with Synopsys: eventually there will be a paper published, and um, you'll be able to get that paper. When the paper is actually written, then I'll get permission, or not, from the seven companies to reveal their identities or not; it's up to them,

61
00:17:47.570 --> 00:17:52.909
Laurie Williams: so I unfortunately can't share the names of the companies with you.

62
00:17:53.330 --> 00:18:09.529
Laurie Williams: So let's go back to the sentiment from the summits. Um, you know, people want to use the executive order to be more secure, and, I'll say, to reduce software supply chain risk. And so this is my view; this slide is completely my view of

63
00:18:09.540 --> 00:18:26.089
Laurie Williams: the progression of things. First of all, prior to the executive order, most people who worked in software security within an organization just wanted to get attention. You know, there's still so much attention on getting functionality out as fast as we can,

64
00:18:26.100 --> 00:18:38.960
Laurie Williams: and not as much attention on producing secure products. That's just, you know, the way it's been. The executive order has moved software security up the priority list. And that's a beautiful thing.

65
00:18:38.970 --> 00:18:52.239
Laurie Williams: However, there was a lot, in the second bubble there, a lot of, like: oh no, what do we do now! And so a lot of companies did a lot of hiring, and a lot of consternation and um, stress about

66
00:18:52.440 --> 00:18:54.189
Laurie Williams: this executive order

67
00:18:54.200 --> 00:19:08.990
Laurie Williams: When time went on, as I'm listening to all these companies: the first thing that came out of the executive order was the need to produce a software bill of materials. As

68
00:19:09.000 --> 00:19:17.020
Laurie Williams: the first thing, it was like, all of a sudden the executive order became software bill of materials. That's really the only thing people were

69
00:19:17.030 --> 00:19:18.789
Laurie Williams: completely focused on.

70
00:19:19.890 --> 00:19:30.640
Laurie Williams: Then, the end of last year, the beginning of this year, there was more concrete guidance given on

71
00:19:30.650 --> 00:19:39.590
Laurie Williams: the executive order, and something called self-attestation. So self-attestation is companies having to

72
00:19:39.660 --> 00:19:43.919
Laurie Williams: document that they followed good security practices,

73
00:19:44.070 --> 00:19:55.419
Laurie Williams: and so, and that was specified in a general way in the beginning of the executive order. But now it's becoming more concrete. It's still not completely concrete, but it's much more concrete now.

74
00:19:55.430 --> 00:20:10.789
Laurie Williams: And so that's the fourth bubble there. Um, and I'm saying SSDF. So the SSDF is a NIST document, the Secure Software Development Framework, that has been very much tied to the executive order. And it documents

75
00:20:10.800 --> 00:20:22.900
Laurie Williams: basically software security practices that should be done in order to produce a secure product, and the self-attestation is very tied to the SSDF,

76
00:20:23.090 --> 00:20:42.319
Laurie Williams: as I'll show you in a moment. And so I would say, in a lot of ways now, um, executive order equals the self-attestation, which is defined in the SSDF. The SSDF does say you need to produce a software bill of materials, so the executive order has broadened from just the SSDF.

77
00:20:42.780 --> 00:20:55.040
Laurie Williams: Here's where my opinion comes in: in order to reduce supply chain risk, we need to do more than just have good software practices.

78
00:20:55.050 --> 00:21:13.099
Laurie Williams: So software supply chain risk management um, is greater than just the SSDF, even though the SSDF is a lot for organizations trying to comply with it. But there are things that are important that are not in there.

79
00:21:14.630 --> 00:21:25.700
Laurie Williams: So again, people asking: okay, you know, we want to reduce supply chain risk. What should we do? What should we do for the executive order? What should we do in general to reduce supply chain risk?

80
00:21:25.710 --> 00:21:45.670
Laurie Williams: Fortunately there have been a lot of documents, good documents, that have come out of the focus on supply chain, two of them here, and I'll explain why these matter to the study that I'm working on in a moment. But here's the SSDF, which is NIST 800-218.

81
00:21:45.680 --> 00:21:58.180
Laurie Williams: That's an important document. Down below is NIST 800-161, which is on cybersecurity supply chain risk; that is hardware and software. It's holistic. It's not just software.

82
00:21:59.270 --> 00:22:09.829
Laurie Williams: SLSA, which is Supply chain Levels for Software Artifacts, is guidance that started out with Google but has moved into the OpenSSF,

83
00:22:09.840 --> 00:22:27.670
Laurie Williams: um, which is supported by the Linux Foundation. It's very focused on, I will say, closing up the build infrastructure attack vector, um, the Codecov and SolarWinds type of attack vector. So some guidance out of there.

84
00:22:28.050 --> 00:22:45.040
Laurie Williams: Some others: um, OpenSSF again has supported a document that started out of Microsoft, the S2C2F, the Secure Supply Chain Consumption Framework. This is as if you're consuming software, not producing software, so it takes a particular perspective.

85
00:22:45.050 --> 00:23:04.710
Laurie Williams: Um, Cloud Native has a nice document on software supply chain best practices. BSIMM, as I mentioned earlier, has lots of software security practices, some of them directly related to supply chain. And finally, OWASP for several years now has had their SCVS,

86
00:23:04.720 --> 00:23:18.790
Laurie Williams: the Software Component Verification Standard. So, there are other documents; these are especially ones that I had chosen because they're very much focused on supply chain and not the broader perspective of software development.

87
00:23:19.810 --> 00:23:36.020
Laurie Williams: And so, as I worked to structure this "what should people do and what are they doing" study that I mentioned earlier, um, I created what I've called the Proactive Software Supply Chain Risk Management framework.

88
00:23:36.030 --> 00:23:48.860
Laurie Williams: And the vision of the framework is to be a holistic framework, meaning not just focused on software development; a holistic framework that industry can use to proactively mitigate software supply chain risks

89
00:23:49.020 --> 00:23:58.230
Laurie Williams: through guided adoption of tasks. So, individual things that people do; task is a NIST word for individual things people can do,

90
00:23:59.600 --> 00:24:04.910
Laurie Williams: and it supports assessments and comparison against industry peers, standards, and guidelines.

91
00:24:05.790 --> 00:24:25.490
Laurie Williams: That's the vision. Um, basically, I will show you: I took the union of all of those documents that I just showed you, to look at, you know, what's in common and what's not, and to try to come up with really the superset of what organizations should do to reduce supply chain risk.

92
00:24:25.500 --> 00:24:43.750
Laurie Williams: So these are all of the documents I just showed you, and I have many pages. And anyone who's interested in actually getting this document, I'm happy to share the version zero point three of it. Actually, I'm on the zero point four version, but I haven't published it yet. And so

93
00:24:44.080 --> 00:24:51.490
Laurie Williams: ultimately, when I took the union of all of those documents, I came up with seventy-two tasks, and I'll explain the

94
00:24:51.500 --> 00:25:03.489
Laurie Williams: relationship in a moment. And for each task there's a name. There's an objective: like, why in the world would you want to do that? What are you trying to accomplish by doing this task? A definition of what the task

95
00:25:03.500 --> 00:25:15.389
Laurie Williams: is; the questions, which are my interview questions, but also questions that anyone can ask within their own organization; and on the right, the references of where that practice came from,

96
00:25:16.400 --> 00:25:18.569
Laurie Williams: or that task came from.

97
00:25:18.580 --> 00:25:27.189
Laurie Williams: I took it out and tried to make it bigger here. So on the left is the task name, on the right the references. So you can see, like, for organizational security requirements,

98
00:25:27.200 --> 00:25:29.919
Laurie Williams: the executive order calls for that.

99
00:25:30.440 --> 00:25:46.489
Laurie Williams: The SSDF calls for it, BSIMM calls for it, NIST 800-161 calls for it, and the Cloud Native foundation calls for it. Self-attestation says this is a practice, or a task, that an organization would need to attest to.

100
00:25:48.120 --> 00:25:49.320
Laurie Williams: one second.

101
00:25:50.600 --> 00:25:55.860
Laurie Williams: So this is, as I say, mapping all the things to all the things. It's a many-to-many mapping

102
00:25:56.010 --> 00:25:58.250
Laurie Williams: that took a long time to come up with,

103
00:25:58.530 --> 00:26:12.979
Laurie Williams: and I um, I was working with someone from NIST, um, Karen Scarfone, who explained to me the, you know, different types of mappings. NIST has, you know, a document where they describe all the different types of mappings. And so

104
00:26:13.170 --> 00:26:29.010
Laurie Williams: what I did is called bidirectional equivalence, which means each one of those is the same; they just call it different things, but they should all be the same. There are other mappings, like part-whole and others. But this is bidirectional:

105
00:26:29.020 --> 00:26:37.759
Laurie Williams: in theory all of those are for the same thing. They just say it in different words. And I will say, anyone listening, I would love to have feedback.
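
NOTE
The bidirectional-equivalence mapping just described can be sketched as a many-to-many lookup from each framework task to the source-document controls treated as equivalent wordings of it. All identifiers below are invented for illustration; they are not the actual P-SSCRM mapping.
```python
# Hypothetical sketch of a many-to-many task-to-source mapping.
task_to_sources = {
    "G1.1": {"EO-14028", "SSDF-PO.1", "NIST-800-161-XX"},
    "P3.2": {"SSDF-PW.4", "SCVS-1.3"},
}
def sources_for(task_id: str) -> list:
    # Each listed control is treated as an equivalent wording of the same task.
    return sorted(task_to_sources.get(task_id, set()))
```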

106
00:26:37.990 --> 00:26:45.730
Laurie Williams: I did this mapping primarily myself, somewhat based off of mappings that came out of the SSDF and whatnot,

107
00:26:45.750 --> 00:27:01.050
Laurie Williams: but a lot of it I did myself, and scientifically it's always good to have more than one person. Um, and you know, for the Microsoft document, S2C2F, Adrian Diglio, the author of that, checked my mapping, and I would love more of that checking, too.

108
00:27:02.930 --> 00:27:19.160
Laurie Williams: So, of the framework I mentioned: there are seventy-two tasks, the union of all of the things. Those seventy-two tasks are organized into practices, which are groups of tasks. So there are fifteen practices, which are the blue level.

109
00:27:19.170 --> 00:27:24.950
Laurie Williams: So, some examples: perform compliance. There are five tasks, if you look under governance:

110
00:27:25.060 --> 00:27:38.260
Laurie Williams: perform compliance, there are five tasks that make up the Perform Compliance practice. Or, over under deployment, there are six tasks that make up the Respond to Vulnerabilities practice.

111
00:27:38.310 --> 00:27:43.110
Laurie Williams: The practices are then grouped into four groups: a governance group,

112
00:27:43.350 --> 00:27:51.870
Laurie Williams: a product group, an environment group, and a deployment group. So there's an organizational structure for these seventy-two.

113
00:27:52.810 --> 00:28:08.079
Laurie Williams: And this is looking at a life cycle model of how this all fits in, so particularly at this fifteen-practice level. And the four groups are down along the bottom. There's a color code:

114
00:28:08.090 --> 00:28:29.940
Laurie Williams: like, green are done by a role called business management; red, a role architect or developer; blue, by IT. It's just kind of laying out: where would all these things be done? Some of them are across the whole life cycle, up at the top and the sides, but some of them relate to a particular time in the life cycle, all the way to the retirement of

115
00:28:29.950 --> 00:28:31.020
a product.

116
00:28:33.210 --> 00:28:46.619
Laurie Williams: So why in the world do I need to have all of these frameworks? Why couldn't I just use the SSDF? And this is a chart that kind of shows that all of them were needed. So there were four different groups,

117
00:28:46.630 --> 00:29:05.880
Laurie Williams: and on the left side are all the different frameworks that I mentioned before, and how many tasks were in each of the frameworks. So, seventy-two tasks total in the P-SSCRM. Um, but if you look at, like, the SSDF: it has thirty-four

118
00:29:05.890 --> 00:29:17.500
Laurie Williams: tasks, and most of them are in the product group, which makes sense. Product is developing a product; in SSDF the D is development, so it makes sense that the majority of

119
00:29:17.510 --> 00:29:29.849
Laurie Williams: practices are focused at the developer. What I have in red here are the prime drivers. So the prime driver of the product tasks came from the SSDF.

120
00:29:30.180 --> 00:29:31.930
Laurie Williams: Again,

121
00:29:32.170 --> 00:29:42.490
Laurie Williams: NIST 800-161 was a very influential document as well. It brought in the governance, a lot of the governance practices, and

122
00:29:42.500 --> 00:29:51.900
Laurie Williams: if you look at the twenty, so, twenty governance tasks: twenty of the twenty-three governance tasks came from NIST 800-161.

123
00:29:52.030 --> 00:30:03.179
Laurie Williams: The number in parentheses says that when I did the union of all of the frameworks, five of those governance tasks only appeared

124
00:30:03.290 --> 00:30:14.590
Laurie Williams: in 800-161. Another very influential framework was the Cloud Native software supply chain best practices document.

125
00:30:14.600 --> 00:30:29.260
Laurie Williams: A lot of the environment tasks came from them: thirteen, and eight of them only appear in the Cloud Native document. So it really was important to bring all of these different frameworks together to get a holistic view of what we should do.

126
00:30:30.050 --> 00:30:35.489
Laurie Williams: So now I'll give you some um, information on the findings.

127
00:30:37.430 --> 00:30:48.090
Laurie Williams: So this is a spider chart, and I won't expect you to get much off the spider chart. It's kind of busy, and I'm going to go into the details in the next slides.

128
00:30:48.100 --> 00:31:02.140
Laurie Williams: I'm going to give you, just, if you look at the top right, the ratings. So I interviewed people in the companies, and based upon what I heard for a particular task,

129
00:31:02.150 --> 00:31:19.840
Laurie Williams: I said: zero is, they don't do it. Zero point two five says they're starting to think about it; they're kind of getting started. Zero point five says, you know, they're in progress. Zero point seven five says, you know, they're firmly there. And then one says they do it, and in some cases

130
00:31:19.850 --> 00:31:21.260
Laurie Williams: exemplary.

131
00:31:22.190 --> 00:31:37.080
Laurie Williams: So when you look around the spider chart, I have the fifteen practices. They come from governance; you go clockwise: product, environment, and deployment. The name of the practice, and in the parentheses are,

132
00:31:37.850 --> 00:31:56.489
Laurie Williams: how many tasks are in that practice. That's the first number. The second number is how many SSDF tasks are in there. So if you look at compliance as an example, at the top: there are five tasks in that practice.

133
00:31:56.500 --> 00:31:57.880
Four of them

134
00:31:57.980 --> 00:32:12.790
Laurie Williams: are specified in the SSDF. The reason I'm, like, really focused on the SSDF is because people equate, kind of, the SSDF and self-attestation with compliance. There's so much focus on that.

135
00:32:12.800 --> 00:32:18.659
Laurie Williams: Um! And as I go through, you'll see that there are a lot of things that are not covered in that.

136
00:32:19.500 --> 00:32:25.720
Laurie Williams: So if you look down at the bottom, P3 is component and container choice.

137
00:32:25.860 --> 00:32:30.809
Laurie Williams: There are five tasks in there; only one of them is specified in the SSDF.

138
00:32:30.910 --> 00:32:46.339
Laurie Williams: So that's around the outside. On the inside, what you can see is how these seven companies are doing. The more toward the inside, the less well that practice is

139
00:32:46.350 --> 00:32:51.430
Laurie Williams: adopted at this point; the further out, the better adopted.

140
00:32:52.740 --> 00:33:00.589
Laurie Williams: The spider chart goes to 0.9, not to 1. So even though it looks like some are getting all the way to the end, there's another level,

141
00:33:00.600 --> 00:33:19.270
Laurie Williams: But if you look at the general case, the lighter pink is all practices and the darker pink is SSDF only. So in general, people are adhering more to the SSDF tasks than to all tasks. But I'll go through the individual detail now.

142
00:33:20.330 --> 00:33:22.350
Laurie Williams: So I'm going to go through

143
00:33:22.440 --> 00:33:34.679
Laurie Williams: each of the fifteen practices, and then the tasks that are in each practice. This is the governance group. Teams in the governance group

144
00:33:34.870 --> 00:33:42.190
Laurie Williams: perform the compliance practice, and there are five specific individual tasks in there.

145
00:33:42.270 --> 00:33:56.230
Laurie Williams: You'll get to know this format in a moment. The identifier in the framework is on the left, e.g. G1.1. If there's a star next to it, that means that task is in the SSDF.

146
00:33:56.350 --> 00:34:00.369
Laurie Williams: If there's not a star in it, then it came from someplace else.

147
00:34:00.750 --> 00:34:10.559
Laurie Williams: And then, under average, this is saying: of the seven organizations that I talked to, what's the average adoption of that specific

148
00:34:10.670 --> 00:34:11.889
Laurie Williams: he's

149
00:34:12.290 --> 00:34:23.840
Laurie Williams: and then the practice average is the average of the adoption of those five tasks. And then, finally, the SSDF average takes only the tasks that are in the SSDF.
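
The rating-and-averaging scheme described here can be sketched in a few lines of code. This is an illustrative reconstruction, not the study's actual tooling; the task IDs, star flags, and scores below are invented.

```python
# Each task gets an adoption score in {0, 0.25, 0.5, 0.75, 1.0} per
# organization. We average per task across organizations, roll the task
# averages up into a practice average, and separately average only the
# SSDF-starred tasks. All data here is hypothetical.

def practice_averages(tasks):
    """tasks: list of (task_id, in_ssdf, per_org_scores)."""
    task_avgs = {tid: sum(s) / len(s) for tid, _, s in tasks}
    overall = sum(task_avgs.values()) / len(task_avgs)
    starred = [task_avgs[tid] for tid, in_ssdf, _ in tasks if in_ssdf]
    ssdf = sum(starred) / len(starred) if starred else None
    return task_avgs, overall, ssdf

tasks = [
    ("G1.1", True,  [1.0, 0.75, 0.5]),   # starred: task is in the SSDF
    ("G1.2", False, [0.25, 0.0, 0.5]),   # task from another framework
]
per_task, practice_avg, ssdf_avg = practice_averages(tasks)
```

With these made-up scores, the SSDF-only average comes out higher than the all-tasks average, the same general pattern the spider chart shows.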

150
00:34:24.300 --> 00:34:29.360
Laurie Williams: That's how you can read it. So, my reaction to some of this: compliance is

151
00:34:29.370 --> 00:34:40.079
Laurie Williams: in general okay. People view the organizational requirements as having a defined secure development life cycle; that's the general case.

152
00:34:40.090 --> 00:34:57.819
Laurie Williams: People have been checking licenses for a long time now; it predates this supply-chain push. So people are fairly advanced, 0.65, with checking licenses. Not complete, but pretty advanced.

153
00:34:57.830 --> 00:35:06.989
Laurie Williams: But you can see low adoption of producing attestations. An attestation is proof, if you will, or

154
00:35:07.000 --> 00:35:26.640
Laurie Williams: a declaration that you used the good security practices you're supposed to use. Currently it's okay if an organization basically has a PDF signed by an executive that says, "I promise we use these." That's one form of attestation, moving toward more machine-readable or automated attestation,

155
00:35:26.650 --> 00:35:29.910
Laurie Williams: but producing it is at 0.13, pretty low.

156
00:35:29.980 --> 00:35:47.079
Laurie Williams: Delivering provenance is also part of the SSDF: saying you're going to provide information to the people you're supplying your software to, about your practices, about your environment, and information about where and when it was built, and whatnot.

157
00:35:47.090 --> 00:36:06.989
Laurie Williams: So, low adoption of delivering provenance information, and low adoption of delivering software bills of materials. People are starting to produce software bills of materials, which is the first thing required in the executive order. So over the last eighteen months or two years, people have become pretty advanced with

158
00:36:07.000 --> 00:36:14.019
Laurie Williams: using tools to produce the software bill of materials. But it's not really going beyond that as far as sharing and delivering it.

159
00:36:14.900 --> 00:36:16.500
Self-attestation:

160
00:36:16.740 --> 00:36:22.400
Laurie Williams: another one, still in the governance group, is developing security policies,

161
00:36:22.860 --> 00:36:24.390
Laurie Williams: and, you know,

162
00:36:24.400 --> 00:36:43.179
Laurie Williams: decently high adoption of developing security policies. One thing people do talk about is having a good code-review policy, where two people have to approve every

163
00:36:43.190 --> 00:36:52.360
Laurie Williams: pull request. But where they start to fall off, which is why it's important to do interviews and not surveys, is:

164
00:36:52.870 --> 00:37:10.240
Laurie Williams: do they check for security? They do two-person code review, but does it have anything to do with security? Maybe not. And, this comes in later too: do you enforce that the reviews are actually done? Maybe that's not happening as much either. The biggest

165
00:37:10.750 --> 00:37:29.629
Laurie Williams: problem in this group is asset inventory. From a supply-chain perspective, there's a lot of focus on: we need to know what we're using, what our build machines are, what our servers are, what our developer machines are; have an inventory of everything. And

166
00:37:29.810 --> 00:37:38.380
Laurie Williams: 0.29 says that's barely emerging. And the reason is that as people look toward producing this asset inventory,

167
00:37:38.390 --> 00:37:56.549
Laurie Williams: they're kind of overwhelmed: we use containers a lot, and they are spun up and spun down; a lot of ephemeral environments, a lot of cloud resources, a lot of change happening all the time. So there's really a lot of confusion about how to produce this asset inventory.

168
00:37:56.560 --> 00:37:59.189
Laurie Williams: it would be a business opportunity for someone
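
One way to picture the asset-inventory problem with ephemeral environments is as a continuous merge of snapshots from several sources of truth. This is a hypothetical sketch, not anything from the talk or the studied companies; the source names and asset IDs are invented.

```python
# Build a unified asset inventory by folding in periodic snapshots from
# multiple sources (e.g. a cloud API and a container orchestrator),
# recording which sources know about each asset and when it was last seen.
from datetime import datetime, timezone

def merge_inventory(inventory, snapshot, source):
    """Fold one source's current list of asset IDs into the inventory."""
    now = datetime.now(timezone.utc)
    for asset_id in snapshot:
        record = inventory.setdefault(asset_id, {"sources": set()})
        record["sources"].add(source)
        record["last_seen"] = now
    return inventory

inventory = {}
merge_inventory(inventory, ["vm-build-01", "vm-web-02"], "cloud-api")
merge_inventory(inventory, ["pod-ci-7f3", "vm-build-01"], "orchestrator")
```

An asset reported by more than one source ends up with all of them listed, which is exactly the cross-checking that ephemeral infrastructure makes hard to do by hand.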

169
00:38:00.240 --> 00:38:08.780
Laurie Williams: The next one is managing suppliers. This is not open-source software; this is vendor, third-party software.

170
00:38:08.790 --> 00:38:15.729
Laurie Williams: And in the general case, the people I've spoken with who develop contracts with vendors

171
00:38:15.740 --> 00:38:31.230
Laurie Williams: are good at imposing, let's say, all the things on the vendors: you have to use the same secure development lifecycle that we do, you need to give us a software bill of materials, you need to give us provenance, you need to give us attestation. So,

172
00:38:31.280 --> 00:38:34.730
Laurie Williams: you know, telling the vendors what they need to do.

173
00:38:34.740 --> 00:38:36.259
They're pretty good at that,

174
00:38:38.350 --> 00:38:48.639
Laurie Williams: As far as anyone actually fulfilling any of that, that's a different story. But the new contracts are reflecting the new requirements.

175
00:38:48.820 --> 00:39:05.179
Laurie Williams: Less mature is separation of duties, which is one of the tasks. That's really saying that more than one person should be reviewing these contracts and whatnot, and that's not happening as much.

176
00:39:05.190 --> 00:39:15.869
Laurie Williams: I have seen cases where there is a collaboration between the contract manager and the software security team, where that approval process is taking place. But in general it's pretty low.

177
00:39:16.730 --> 00:39:23.489
Laurie Williams: Training is a pretty good practice. I'd say it's in first place; well, it's tied for first place.

178
00:39:23.570 --> 00:39:24.960
Laurie Williams: So there's a

179
00:39:24.970 --> 00:39:39.810
Laurie Williams: another first-place practice as well. Training covers role-based training, contingency training (where contingency is: there's a security emergency, what should we do?), and gathering attack trends.

180
00:39:39.820 --> 00:40:07.880
Laurie Williams: And there is some room for improvement, particularly on the contingency training, and on gathering attack trends and communicating those trends to the organization. You'll see me reflect, in some cases, this differentiation between prevention practices, detection practices, and response practices. Preventing means preventing a security

181
00:40:07.890 --> 00:40:17.460
Laurie Williams: incident or vulnerability, as opposed to responding to one. Training is a prevention practice: if there's good training, we can prevent problems, so

182
00:40:17.470 --> 00:40:18.919
Laurie Williams: good to shift left.

183
00:40:20.150 --> 00:40:26.140
Laurie Williams: I think this is the last governance practice assessing and managing risk.

184
00:40:26.380 --> 00:40:44.400
Laurie Williams: And, you know, it's not too bad. Everyone knows they need to do all of these things. There was great awareness of the need to figure out which systems are more critical, and to track risks and decisions. But

185
00:40:44.410 --> 00:40:55.199
Laurie Williams: I would say what's lacking is a repeatable, objective process for doing risk management. That's where it can fall down.

186
00:40:55.260 --> 00:41:10.489
Laurie Williams: It's more a "we know what our crown jewels are, and we focus on them the most" kind of risk management. So there could be some work done on making these processes repeatable and objective. Security metrics is harder; it's a harder problem.

187
00:41:10.500 --> 00:41:23.120
Laurie Williams: And so that's the lowest of these. How do we know any of this is working? Are we getting more secure? That kind of thing. The whole industry would like to know better how to do that.

188
00:41:23.610 --> 00:41:31.340
Laurie Williams: All right. Now, moving into the product group. The first one is developing security requirements,

189
00:41:31.450 --> 00:41:47.899
Laurie Williams: only at about halfway here; halfway means we're making progress. It's similar between product security requirements and release integrity. These are proactive practices done early: not organizational

190
00:41:47.910 --> 00:42:01.040
Laurie Williams: security requirements but product-specific requirements, like architectural-level choices: using a memory-safe language, sandboxing, isolation, modularity, use of security features; doing that up front.

191
00:42:01.870 --> 00:42:10.699
Laurie Williams: And then providing customers assurance that your software is legitimate, especially via code signing. So there's some work that can be done there.

192
00:42:11.220 --> 00:42:19.989
Laurie Williams: Next is building security in: proactive software security practices so that you're building a secure product.

193
00:42:20.000 --> 00:42:37.110
Laurie Williams: We're pretty much getting there. The lowest one is secure by default: giving a product to a customer that is as secure as possible without requiring the customer to change settings to make the product more secure.

194
00:42:37.120 --> 00:42:44.839
Laurie Williams: That's the lowest. It's a security-versus-usability type of clash, and

195
00:42:45.880 --> 00:42:58.039
Laurie Williams: some people are still having trouble with the trade-off, the way everyone always has. The last one was interesting: in-house components.

196
00:42:58.470 --> 00:43:11.530
Laurie Williams: There's all this focus now on third-party components, open-source components, and scanning and updating them, and sometimes the in-house components get forgotten. So this is really putting the focus back on the fact that

197
00:43:11.540 --> 00:43:27.630
Laurie Williams: people develop in-house components and need to make sure they're scanning them, looking for CVEs, and making sure they're not abandoned: having in-house components be first-class citizens, getting the same treatment as third-party ones.

198
00:43:29.210 --> 00:43:34.489
Laurie Williams: The next one I have here is in second-to-last place, so low.

199
00:43:34.930 --> 00:43:38.049
Laurie Williams: This is managing components and containers.

200
00:43:38.530 --> 00:43:56.789
Laurie Williams: What I have on the bottom here is a kind of flow chart from a document that came out of various government organizations. Basically, what it's saying is: if you're going to bring a component or container in, follow this flow chart:

201
00:43:56.800 --> 00:44:07.809
Laurie Williams: make a good choice, scan it, and once you've scanned it and feel it's a good choice and issues have been fixed, put it in your own component repository and use it.

202
00:44:08.600 --> 00:44:24.619
Laurie Williams: And that's kind of what's in the P3.x tasks: make a good choice, only choose from trusted repositories, require signed commits from your components, then put them into your own vetted repository,

203
00:44:24.630 --> 00:44:33.239
Laurie Williams: and the last one is preventing component-vetting bypass: preventing a developer from using a component that hasn't gone through that process.

204
00:44:33.370 --> 00:44:50.890
Laurie Williams: There's really kind of low adoption in all of these cases, and I'll note that components are doing better than containers. There's not a lot of checking of the containers coming into an organization, where they come from and whatnot. So

205
00:44:50.900 --> 00:45:02.339
Laurie Williams: this is a concern, and I'll bring that concern up again, because it's probably the biggest attack vector in the supply chain: attacks through components and containers. And this set of practices here

206
00:45:02.790 --> 00:45:04.459
Laurie Williams: could use some more work.
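
A minimal sketch of the "prevent component-vetting bypass" idea: before a build may use a downloaded artifact, its digest must match the entry recorded when the component was vetted into the internal repository. This is an assumption-laden illustration, not the flow chart from the talk; the component name and bytes are invented.

```python
# Gate artifact use on an exact SHA-256 match against the organization's
# vetted-component list. Anything unvetted, or vetted but tampered with,
# is rejected before it reaches the build.
import hashlib

VETTED = {
    # component file name -> SHA-256 of the exact bytes that were vetted
    "examplelib-1.4.2.tar.gz": hashlib.sha256(b"vetted release bytes").hexdigest(),
}

def vetting_gate(name, artifact_bytes):
    """Allow use only if the artifact is the exact vetted version."""
    return VETTED.get(name) == hashlib.sha256(artifact_bytes).hexdigest()

ok = vetting_gate("examplelib-1.4.2.tar.gz", b"vetted release bytes")
tampered = vetting_gate("examplelib-1.4.2.tar.gz", b"something else")
unvetted = vetting_gate("mystery-0.1.tar.gz", b"anything")
```

The same check works for container images by pinning image digests rather than tags, which addresses the container gap mentioned above.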

207
00:45:05.010 --> 00:45:19.940
Laurie Williams: The next product grouping is discovering vulnerabilities, so these are about finding the vulnerabilities that are in there. In this prevent/detect/respond framing, this is a detect grouping.

208
00:45:19.950 --> 00:45:31.200
Laurie Williams: I have here that it's in third-highest place of the fifteen, so high is good. Code review is lower than you might expect.

209
00:45:31.320 --> 00:45:35.510
Laurie Williams: You could hear things like: we require code review for every

210
00:45:35.620 --> 00:45:54.119
Laurie Williams: pull request, but it's not necessarily related to security, and it's not clear how it's actually enforced that something happens. So we could definitely get better with that. People have lots of tools; the automated scanning tools task is almost complete, almost at 1.0.

211
00:45:54.180 --> 00:46:08.109
Laurie Williams: But having a lot of tools, yes; using them, which is the third task, not as much. Sometimes they're not run all the time; sometimes the vulnerabilities found are not fixed, and whatnot.

212
00:46:08.250 --> 00:46:11.640
Laurie Williams: Executable security testing is pretty good:

213
00:46:12.230 --> 00:46:21.600
Laurie Williams: a lot of external penetration testing and bug bounty, less so internal. It would be good to get some of this internal, so it's more shift-left, done earlier.

214
00:46:22.720 --> 00:46:27.730
Laurie Williams: Regular third-party compliance checking is sort of low,

215
00:46:27.840 --> 00:46:43.810
Laurie Williams: and that's because it seems that once a vendor is approved, they're approved; there's not as much tracking to make sure they're following their contract and fixing vulnerabilities. And then also, once an open-source

216
00:46:43.820 --> 00:46:58.729
Laurie Williams: product is brought in, same thing: not a lot of review to go back and say this is still a good choice. The software composition analysis (SCA) tools will say if there's a vulnerability, but not necessarily if

217
00:46:58.740 --> 00:47:06.550
Laurie Williams: no one's maintained the component, like, two years after you approved it, has anyone maintained it since then? Not a lot of look-back.

218
00:47:08.470 --> 00:47:14.139
Laurie Williams: This is last place: managing vulnerable components.

219
00:47:14.160 --> 00:47:31.940
Laurie Williams: There are two tasks here. One is consuming the SBOM: having a software bill of materials and actually looking at it; no one does that. And then dependency update: you run the software composition analysis tools, and they tell you

220
00:47:31.950 --> 00:47:45.839
Laurie Williams: which components have a new version and which components have vulnerabilities, and this is drinking water out of a fire hose, which is what that picture is indicating: these tools just give you an overwhelming amount of information,

221
00:47:45.850 --> 00:47:52.500
Laurie Williams: and as far as people having a really good process for how to manage that, it's at 0.38,

222
00:47:52.550 --> 00:48:12.119
Laurie Williams: kind of low. It could be things like, well, Dependabot gives you a pull request every morning telling you all the components you should update, but people don't do that; there are risks associated with just accepting everything. So, not a good systematic process for how to handle this

223
00:48:12.130 --> 00:48:13.899
Laurie Williams: across the board.
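
Consuming an SBOM and taming the fire hose can be combined into one triage step: walk the bill of materials, cross-check each component against advisory data, and surface only the findings above a risk cutoff. This is a hedged sketch; the CycloneDX-like structure, component names, and CVSS numbers are all invented for illustration.

```python
# Cross-check an SBOM's component list against a map of known advisories
# and keep only findings at or above a CVSS cutoff, highest first.
def triage(sbom, advisories, min_severity=7.0):
    """Return (name, version, cvss) tuples for components worth acting on."""
    findings = []
    for comp in sbom.get("components", []):
        key = (comp["name"], comp["version"])
        for cvss in advisories.get(key, []):
            if cvss >= min_severity:
                findings.append((comp["name"], comp["version"], cvss))
    return sorted(findings, key=lambda f: -f[2])

sbom = {"components": [{"name": "parselib", "version": "2.0"},
                       {"name": "loggy", "version": "1.1"}]}
advisories = {("parselib", "2.0"): [9.8, 5.3], ("loggy", "1.1"): [4.0]}
high = triage(sbom, advisories)
```

A real pipeline would also weigh reachability and exploitability, but even this severity filter turns "update everything every morning" into a short, risk-ranked work list.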

224
00:48:14.330 --> 00:48:27.460
Laurie Williams: Now I'm moving into the environment grouping, making a secure environment. This grouping especially focuses on things like the SolarWinds and Codecov

225
00:48:27.490 --> 00:48:37.670
Laurie Williams: build-infrastructure attacks, an attacker getting in through the environment. The first practice is safeguarding artifact integrity.

226
00:48:37.830 --> 00:48:56.110
Laurie Williams: If you look across the board, advanced authentication is maturing: things like multi-factor authentication and developers using SSH keys. People are adopting those at least halfway, anyway.

227
00:48:56.120 --> 00:49:07.569
Laurie Williams: The branch-protection process could be better. What that is saying is: you can't advance in your branches unless you pass certain security tests,

228
00:49:07.580 --> 00:49:18.290
Laurie Williams: and there could be more security tests in that branch process, especially with a monorepo. And then just a comment about the last one, decommissioned assets:

229
00:49:18.300 --> 00:49:46.199
Laurie Williams: this was a practice I added to the model pretty recently, so I need to circle back with some of the companies; the sample size is pretty small, at 0.25. And it is not referenced in any of the documents that I talked about. As I was talking with people, especially Jay White at Microsoft (who was just someone I spoke with, so it's not necessarily Microsoft's position, and it's not in any of the frameworks),

230
00:49:46.270 --> 00:49:59.219
Laurie Williams: he really emphasized decommissioning assets. A system is wound down, a machine is done; if you just let it sit without decommissioning it,

231
00:49:59.230 --> 00:50:14.039
Laurie Williams: it might have passwords, down-level vulnerabilities, down-level components that no one ever updates. Just letting something sit is dangerous, so you need to have a process for decommissioning the assets,

232
00:50:14.180 --> 00:50:17.250
Laurie Williams: and there's a lot of maturity that needs to happen there
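A decommissioning sweep is easy to sketch once an inventory records when each asset was last seen: anything idle past a threshold gets flagged for formal teardown instead of being left to sit. This is an illustrative sketch only; the asset names, timestamps, and 30-day threshold are invented.

```python
# Flag inventory entries not seen within `max_idle_days` as due for
# decommissioning (credential revocation, wipe, removal from inventory).
from datetime import datetime, timedelta, timezone

def due_for_decommission(inventory, now, max_idle_days=30):
    cutoff = now - timedelta(days=max_idle_days)
    return sorted(a for a, rec in inventory.items() if rec["last_seen"] < cutoff)

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
inventory = {
    "vm-build-01": {"last_seen": now - timedelta(days=2)},    # active
    "vm-legacy-9": {"last_seen": now - timedelta(days=120)},  # forgotten
}
stale = due_for_decommission(inventory, now)
```

The flagged list would feed a human-reviewed teardown process rather than automatic deletion, since some assets are legitimately idle.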

233
00:50:18.100 --> 00:50:30.169
Laurie Williams: And then safeguarding build integrity, again especially focused on the Codecov and SolarWinds kind of thing; there are seven tasks here.

234
00:50:30.200 --> 00:50:42.830
Laurie Williams: About halfway, 0.49 overall, so some areas for improvement. Release policy verification: these are automated release policies

235
00:50:42.840 --> 00:50:59.440
Laurie Williams: that could be better, like treating these as code. These are infrastructure-as-code scripts, GitHub Actions, that kind of thing; better handled as code, better templated, better standardized across an environment would be good.

236
00:51:00.460 --> 00:51:06.599
Laurie Williams: Verifying the dependencies of the environment and the build could be better.

237
00:51:06.950 --> 00:51:26.139
Laurie Williams: People in general don't utilize the compiler, interpreter, and whatnot to fail right then; they allow them to just give warnings. Reproducible builds: not happening yet. Ephemeral builds are pretty good; ephemeral means spin up a build environment, build, and let it go. That's pretty good.

238
00:51:26.150 --> 00:51:33.189
Laurie Williams: SLSA advocated, initially anyway, hermetic and parameterless builds. Hermetic

239
00:51:33.210 --> 00:51:40.589
Laurie Williams: is saying: don't go out to the Internet in your build process. A lot of people don't do that yet. And parameterless. So, you know,

240
00:51:40.600 --> 00:51:46.439
Laurie Williams: people are getting better. I think that the focus on the build environment has been a good thing.
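
The reproducible-builds idea mentioned above boils down to a simple test: build the same sources twice and compare output digests. This toy sketch (my illustration, not SLSA's definition) shows how an embedded build timestamp breaks the check, and how normalizing it, in the spirit of a fixed SOURCE_DATE_EPOCH-style date, restores bit-for-bit identity.

```python
# Compare digests of two builds of the same source. A nondeterministic
# input (like a wall-clock build date) makes the digests differ; pinning
# it makes the build reproducible. The build function is a stand-in.
import hashlib

def build(source, timestamp):
    """Toy build step: artifact = source bytes plus an embedded build date."""
    return source + b"|built:" + timestamp

def reproducible(source, ts_a, ts_b):
    a = hashlib.sha256(build(source, ts_a)).hexdigest()
    b = hashlib.sha256(build(source, ts_b)).hexdigest()
    return a == b

src = b"program bytes"
nondeterministic = reproducible(src, b"2024-06-01T10:00", b"2024-06-01T10:05")
normalized = reproducible(src, b"1970-01-01T00:00", b"1970-01-01T00:00")
```

The value for supply-chain security is that any third party can rebuild and verify the published artifact, so a compromised build machine can't silently slip in extra bytes.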

241
00:51:47.520 --> 00:52:04.829
Laurie Williams: Secure software development environment: this is again tied for first. In general, IT departments are doing a good job at keeping the development environment safe, and this is a prevention practice. Probably the biggest

242
00:52:04.840 --> 00:52:19.659
Laurie Williams: concern: maybe there's role-based access control, but in a lot of cases there's world-readable access, potentially across the whole corporation. Not sure that's the best; maybe not enough least privilege. But

243
00:52:19.670 --> 00:52:25.940
Laurie Williams: pretty much everyone uses role-based access control, one hundred percent. Within that, though, there are some concerns.

244
00:52:27.730 --> 00:52:46.510
Laurie Williams: And then the final group, deployment: responding to and disclosing vulnerabilities. This one came in second place overall. In prevent/detect/respond, this is a response practice. This is a case where you have a problem; you sort of have no choice but

245
00:52:46.630 --> 00:53:03.120
Laurie Williams: to respond to it, so typically response practices have higher adoption. One of the ones with lower adoption is emergency fix; that was a task that came out of the S2C2F framework.

246
00:53:03.130 --> 00:53:11.939
Laurie Williams: An emergency fix is basically focused on: what if you have a component, and the component supplier won't fix it? Do you have a process

247
00:53:11.950 --> 00:53:21.730
Laurie Williams: Do fix it. What's your process to fix it? What's your process to contribute your fix. There's some rainfall improvement there.

248
00:53:21.950 --> 00:53:40.419
Laurie Williams: And it could also be more proactive: eradication. Eradication would say, there's a vulnerability type; don't just fix the one you found, fix that type throughout your code base. Not a lot of focus on that. One company had a good perspective, which basically says, you know, once

249
00:53:40.430 --> 00:53:59.050
Laurie Williams: bug-bounty people find a particular vulnerability type, they'll just keep finding it for you, and they'll keep wanting to get paid. So, kind of an acknowledgment: once they find something like that, we'd better eradicate it, or they're just going to keep bugging us.

250
00:54:00.060 --> 00:54:14.449
Laurie Williams: And the last one, also in deployment: monitoring intrusions and violations. System monitoring people are doing pretty well; build-process monitoring, not as much.

251
00:54:14.460 --> 00:54:25.269
Laurie Williams: So this is again the SolarWinds and Codecov case: someone is in your build environment, and you need to detect that intrusion. Some more work needs to be done there.

252
00:54:25.640 --> 00:54:29.279
Laurie Williams: And that's the end of the practices. Let me just give you some

253
00:54:29.290 --> 00:54:30.709
Laurie Williams: overall takeaways.

254
00:54:31.080 --> 00:54:34.729
Laurie Williams: These are the top ten practices or tasks.

255
00:54:34.890 --> 00:54:37.879
Laurie Williams: the most adopted.

256
00:54:37.960 --> 00:54:42.329
Laurie Williams: And one thing to look at is: of these,

257
00:54:42.380 --> 00:54:56.869
Laurie Williams: of the top ten, one is a governance task, one is a product task, five are environment, and three are deployment. Deployment tasks are the most reactionary, the most response-oriented.

258
00:54:57.880 --> 00:55:15.679
Laurie Williams: So again, it's typical that people do their response practices, because they have no choice. It's great that the environment ones, which are proactive at preventing intrusions, are up there; there's not as much focus on governance and product. So, some of the ones that are universally adopted:

259
00:55:15.690 --> 00:55:31.189
Laurie Williams: separated build output, which says put your build output in a separate place from your build input; strong authentication; monitoring changes to configuration settings; overall system monitoring; having automated security scanning tools;

260
00:55:31.210 --> 00:55:48.690
Laurie Williams: having role-based training; having a secure orchestration platform; having a risk-based vulnerability remediation process; having separate environments; and analyzing vulnerabilities. All of these are really well adopted.

261
00:55:49.060 --> 00:55:51.000
Laurie Williams: the bottom ten

262
00:55:51.200 --> 00:56:06.089
Laurie Williams: the least adopted: four from the governance group, three from the product group, three from the environment group, and none from the deployment group. So all of the deployment practices are adopted.

263
00:56:06.100 --> 00:56:11.740
Laurie Williams: So, things no one does: consume an SBOM. Nobody does reproducible builds.

264
00:56:11.910 --> 00:56:25.869
Laurie Williams: Barely does anyone deliver provenance information. Barely does anyone produce attestations. Barely does anyone require signed commits. Barely does anyone deliver software bills of materials.

265
00:56:25.900 --> 00:56:42.960
Laurie Williams: Barely does anyone prevent people bypassing the component-vetting process; barely preventing the bypass means developers can get components into the system without going through the official process.

266
00:56:42.980 --> 00:56:51.490
Laurie Williams: Not a lot of asset inventory; not a big use of the compiler in the build process to

267
00:56:51.950 --> 00:57:02.649
Laurie Williams: find vulnerabilities. And then last, not a lot of verification of dependencies of the environment in the build process. Those are the least adopted.

268
00:57:02.930 --> 00:57:07.260
Laurie Williams: Just a warning; this is how I feel:

269
00:57:07.410 --> 00:57:22.469
Laurie Williams: the biggest attack vector that gets a lot of attention these days is the use of dependencies as an attack vector and as a weapon. When a lot of people think about software supply-chain attacks, they think about this:

270
00:57:23.200 --> 00:57:26.320
Laurie Williams: putting in malicious commits. But

271
00:57:26.360 --> 00:57:36.409
Laurie Williams: really, the practices and the tasks that would mitigate this attack vector need to be adopted more than they currently are.

272
00:57:38.270 --> 00:57:46.350
Laurie Williams: So, what's next? I do have to finalize the interviews. Like I said, I think the findings I shared are stable,

273
00:57:46.450 --> 00:58:02.639
Laurie Williams: but I need to finish and publish the paper. Other things that I would like to do: map the tasks that I presented to you to specific attack vectors, not high-level but specific attack vectors,

274
00:58:02.650 --> 00:58:11.289
Laurie Williams: and actually (I have a partner, one of the forty-three people I interviewed, who would like to work with me on this) to take

275
00:58:11.300 --> 00:58:17.700
Laurie Williams: specific attacks that happened, and then go through the steps to say, This is what the attackers did.

276
00:58:17.840 --> 00:58:23.160
Laurie Williams: could these particular proactive tasks that I'm laying out have stopped it?

277
00:58:23.640 --> 00:58:32.840
Laurie Williams: So that would be an interesting exercise. And then I think it'd be good to track this over time, seeing whether people are adopting the practices.

278
00:58:34.360 --> 00:58:36.300
Laurie Williams: And that's it.

279
00:58:37.500 --> 00:58:49.390
Jeremy Epstein: Well, thank you very much, Laurie. That was a great talk. There are some questions in the chat, and I'll ask some of those in just a moment. Do you want to stop sharing?

280
00:58:49.400 --> 00:59:06.289
Jeremy Epstein: Yeah. Actually, the first thing I'll do, as a reminder: this lecture is being recorded, and it'll be posted; there's the web page where it'll be posted soon, in the next few weeks. Just a reminder for folks.

281
00:59:06.300 --> 00:59:24.239
Jeremy Epstein: So I did some work in this area actually, before BSIMM; you actually cited one of my papers in this talk. So it's interesting to me to see how far it's evolved. And one of the things:

282
00:59:24.920 --> 00:59:43.950
Jeremy Epstein: thinking back twenty or twenty-five years, when Bill Gates wrote the famous security memo that woke up all of Microsoft, that we have to pay attention to this security stuff. We've made a lot of progress in the past twenty or twenty-five years, not just at Microsoft but everywhere. But

283
00:59:43.960 --> 01:00:02.940
Jeremy Epstein: how quickly do you see this moving, so that your scores get to the point where a study like this is irrelevant, because everyone's getting a perfect score, everyone's getting an 800 on their SATs? How do we get to that point? How long will it take? And

284
01:00:02.950 --> 01:00:13.579
Jeremy Epstein: what, other than regulations, are the things that really are going to move us toward that 800 SAT score?

285
01:00:13.590 --> 01:00:25.920
Laurie Williams: Right. Yeah, I would guess, I mean, I think there could be an analogy between the Bill Gates memo and the Executive Order; I think both were very defining moments. But

286
01:00:25.930 --> 01:00:33.970
Laurie Williams: based upon the conversations I've had, I think there's a long way to go, because, as I mentioned, these seven companies I talked to,

287
01:00:34.070 --> 01:00:39.780
Laurie Williams: they are the leaders, and even they have a long way to go.

288
01:00:39.850 --> 01:00:43.150
Laurie Williams: And so, if you take the overall software industry,

289
01:00:43.640 --> 01:00:46.200
Laurie Williams: I think it'll be a while.

290
01:00:46.300 --> 01:00:55.980
Jeremy Epstein: So one of the things in the very limited study I had done fifteen years ago, or maybe a little more than that, was

291
01:00:56.120 --> 01:01:12.659
Jeremy Epstein: trying to look at the difference between how big companies and small companies dealt with these things. Do you have a sense in that regard? Is it going to take longer for those small and medium companies to catch up, or is the fact that they're more agile an advantage in this case?

292
01:01:12.730 --> 01:01:17.599
Laurie Williams: I do think there are trade-offs there. I mean, in the,

293
01:01:17.630 --> 01:01:28.020
Laurie Williams: in the summits, across the board, we have some small companies, and the people who seem to perform the best, and even do reproducible builds,

294
01:01:28.030 --> 01:01:44.169
Laurie Williams: are the smaller companies just starting out. Smaller companies just starting out, or even people within a company talking about their new products, can do much better if they start from the beginning,

295
01:01:44.210 --> 01:01:48.209
Laurie Williams: whether it be a small company, medium-size company,

296
01:01:48.760 --> 01:01:50.859
Laurie Williams: they seem to do better.

297
01:01:52.120 --> 01:02:18.779
Jeremy Epstein: So there are some great questions. I'm going to read some of the questions from the chat. The first question, well, two questions from the same person that are interrelated: Can you elaborate on security metrics? What and how to measure, evolution, current status, grand challenges, future direction? And how do the security metrics in the governance group relate to the security checks in the environment group?

298
01:02:19.140 --> 01:02:20.419
Laurie Williams: never happened.

299
01:02:20.430 --> 01:02:29.419
Laurie Williams: You may need to repeat some of them. So there are, like, two levels. I mean, when I think about security metrics, I think about two levels.

300
01:02:29.850 --> 01:02:43.900
Laurie Williams: One is, say, from the Frontier that we have: one of the things we are trying to do is ask, are we as an industry doing better? Are we getting better at

301
01:02:44.080 --> 01:02:46.589
Laurie Williams: preventing supply chain attacks?

302
01:02:46.600 --> 01:03:05.189
Laurie Williams: That's a really hard one, because you have to define what a supply chain attack even is. And people change, different asset levels, there's a bunch of change. That's a really hard one. When I talk to companies about security metrics,

303
01:03:05.200 --> 01:03:25.079
Laurie Williams: they're looking specifically at their escapes. They're looking at how many penetration-testing vulnerabilities they have, how many customer-reported vulnerabilities they have, how many static analysis findings. So they can look at their tools and look at the output of their tools

304
01:03:25.090 --> 01:03:33.159
Laurie Williams: and say, it looks like we're secure enough. They have a little bit more of a handle on that, but not on the high level.
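
The company-level tallies described here (findings from penetration tests, customer reports, and static analysis tools) can be sketched as a simple aggregation. This is purely illustrative; the sources, severities, and counts below are invented:

```python
from collections import Counter

# Hypothetical findings log: (source, severity) pairs of the kind a company
# might collect from pen tests, customer reports, and static analysis tools.
FINDINGS = [
    ("pentest", "high"), ("pentest", "low"),
    ("customer", "high"),
    ("static-analysis", "medium"), ("static-analysis", "low"),
    ("static-analysis", "low"),
]

def metrics_by_source(findings):
    """Count findings per source: the escapes-style tally described above."""
    return Counter(source for source, _ in findings)

if __name__ == "__main__":
    for source, n in sorted(metrics_by_source(FINDINGS).items()):
        print(f"{source}: {n}")
```

A real program would pull these rows from tool exports rather than a hard-coded list, but the "are we secure enough" view companies take is essentially this kind of roll-up.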

305
01:03:33.790 --> 01:03:36.130
Jeremy Epstein: So yeah,

306
01:03:37.170 --> 01:03:38.510
Jeremy Epstein: I think I'm: right.

307
01:03:38.520 --> 01:04:08.290
Jeremy Epstein: So let me, I'm not necessarily doing these questions in order. The P-SSCRM looks like a wonderful framework for securing the software supply chain. I was wondering if and how we can apply it in the development of open-source software. Have you seen relevant cases of how open-source projects apply these tasks? Among the four groups, governance, product, environment, and deployment, which ones may be more challenging for open-source projects?

308
01:04:08.300 --> 01:04:14.120
Jeremy Epstein: And I wonder whether, what are they called,

309
01:04:14.190 --> 01:04:19.189
Jeremy Epstein: the Open Source group that you mentioned at the very beginning? I wonder if they're working in that area.

310
01:04:19.200 --> 01:04:31.929
Laurie Williams: Yeah. So, as I've gone through the studies, I've presented the general framework to the OpenSSF and had the same question. And

311
01:04:32.090 --> 01:04:45.090
Laurie Williams: when I look at the governance practices, probably almost none of those would be adopted. But really, within the OpenSSF, when I presented to them, they would love a study like this done on

312
01:04:45.100 --> 01:05:04.460
Laurie Williams: open-source organizations specifically, and so I'm definitely open to doing that. So if anyone has a contact there... there was one open-source organization that said we could do it with them. I wanted to get through the companies first, because they were complicated enough. With open source, where you're trying to say,

313
01:05:04.470 --> 01:05:06.390
who does this, who do I talk to,

314
01:05:06.400 --> 01:05:18.829
Laurie Williams: it's harder. But I'm definitely open to it. My prediction would be very little adoption of governance, maybe more of the product practices,

315
01:05:18.870 --> 01:05:20.970
Laurie Williams: probably less so.

316
01:05:28.630 --> 01:05:32.229
Laurie Williams: But I'm open to that study, if anyone wants to contact me about that.

317
01:05:33.090 --> 01:05:41.090
Jeremy Epstein: You presented the results from seven companies, but there were forty-three interviews. Were the forty-three interviews from those seven companies?

318
01:05:41.100 --> 01:05:48.369
Jeremy Epstein: Okay. And are you planning to increase the number of companies beyond those?

319
01:05:49.150 --> 01:05:50.979
Laurie Williams: Um. So

320
01:05:51.880 --> 01:05:55.600
Laurie Williams: With proper support.

321
01:05:55.610 --> 01:06:00.279
Laurie Williams: So I did this in conjunction with Synopsys so far.

322
01:06:00.330 --> 01:06:04.309
Laurie Williams: And it's very time-intensive, I'll tell you that.

323
01:06:04.320 --> 01:06:07.269
Laurie Williams: It's very time-intensive, so it does require support.

324
01:06:07.450 --> 01:06:10.190
Jeremy Epstein: But it's very interesting, though

325
01:06:10.200 --> 01:06:11.090
Jeremy Epstein: here we have

326
01:06:11.100 --> 01:06:33.609
Jeremy Epstein: Having done it myself fifteen years ago, I certainly agree with you. Of all of the frameworks you mentioned and mapped, do you have a recommendation on which is the most effective to reference when implementing and validating technical controls for secure default configurations, either in pipelines or deployed software?

327
01:06:34.070 --> 01:06:37.019
Laurie Williams: So it's specifically secure

328
01:06:37.780 --> 01:06:40.089
Laurie Williams: deployment settings?

329
01:06:46.290 --> 01:06:48.330
Jeremy Epstein: they are made of. You know.

330
01:06:48.780 --> 01:06:50.910
Jeremy Epstein: Um,

331
01:06:52.240 --> 01:07:21.070
Jeremy Epstein: A question: how can academia help reduce software supply chain risk? The obvious answer is, join your research community at NCSU. But this is someone who's already a doctoral student; maybe they'll be interested in joining your team as a faculty member when they graduate. There are other teams besides yours working on this at other universities. Can you tell us anything about your competitors, so to speak?

332
01:07:21.080 --> 01:07:29.259
Laurie Williams: Yeah, competitors, collaborators, synergistic: we try to view it that way. So

333
01:07:29.660 --> 01:07:32.949
Laurie Williams: one is, there is the SCORED workshop.

334
01:07:32.960 --> 01:08:00.719
Laurie Williams: The SCORED workshop last year was at CCS, which was in L.A. This year it's in, Amsterdam? Copenhagen? And they have a deadline in eight days, so if you have something to submit quickly to get there, I would do that. It's a great place. Last year when I went, I was really surprised that there were forty people from worldwide at this workshop. So if you're interested in supply chain, I would totally

335
01:08:00.760 --> 01:08:06.090
Laurie Williams: try to get to the SCORED workshop. That's one thing. Within the US,

336
01:08:06.350 --> 01:08:07.939
Laurie Williams: we have

337
01:08:08.220 --> 01:08:27.109
Laurie Williams: the Frontier. And this is a contact-me kind of thing: we have quarterly meetings, and we invite people to our quarterly meetings. So if you're doing research in this area, we want to meet you, and we want you to come. We do that every quarter; we rotate the locations among our

338
01:08:27.120 --> 01:08:31.489
Laurie Williams: collaborating PIs in the group, so definitely contact me about that.

339
01:08:31.500 --> 01:08:40.490
Laurie Williams: And even worldwide, we have people from other countries call in and present at our quarterly meetings as well. So

340
01:08:40.500 --> 01:08:44.359
Laurie Williams: we try not to have competitors; we try only to have collaborators.

341
01:08:45.060 --> 01:09:00.050
Jeremy Epstein: So there's been a lot of talk in some parts of the US government that we need to focus, focus, focus on memory safety, and that's just a small part of your big picture here.

342
01:09:00.350 --> 01:09:11.339
Jeremy Epstein: Do you have a sense of how and why they're focusing on that area, and how to get a bigger vision than just memory safety?

343
01:09:11.620 --> 01:09:22.759
Laurie Williams: Yeah. So memory safety, out of the things that I talked about, would come in under product security requirements. So looking at things like memory safety, memory-safe languages,

344
01:09:22.770 --> 01:09:34.260
Laurie Williams: isolation and sandboxing, things like that. So there are other strategies, not just that one, that should be considered at an architectural level,

345
01:09:34.390 --> 01:09:39.519
Laurie Williams: so I would broaden it. But be thinking about ways to prevent

346
01:09:41.069 --> 01:09:44.099
Laurie Williams: the impact of supply chain attacks.

347
01:09:44.540 --> 01:09:48.780
Laurie Williams: It's important, but it's not the be-all and end-all, right?

348
01:09:49.590 --> 01:09:50.830
Jeremy Epstein: Um

349
01:09:51.430 --> 01:10:07.920
Jeremy Epstein: Can you share insights on the low scores for some of the SBOM-related tasks? It seems a little bit surprising, since SBOM has received a lot of attention and discussion since the Executive Order came out. And of course, Allan,

350
01:10:08.500 --> 01:10:09.890
Jeremy Epstein: where am I going?

351
01:10:09.900 --> 01:10:10.990
Jeremy Epstein: Yeah, I'm sorry.

352
01:10:11.000 --> 01:10:22.690
Jeremy Epstein: Allan Friedman has achieved fame and notoriety as Mr. SBOM. But people don't seem to be paying much attention to it, according to your results.

353
01:10:22.700 --> 01:10:41.609
Laurie Williams: Yeah. So what the results show, so it's my numeric results, the summits, and then I do go to, MITRE has a quarterly supply chain meeting, SSCA, I don't know what it all stands for, but anyway, which is a great meeting to go to as well. So from that overall picture,

354
01:10:41.620 --> 01:10:48.040
Laurie Williams: I'll say that people are getting better and better, and tools are getting better and better, at producing an SBOM.

355
01:10:48.720 --> 01:10:57.090
Laurie Williams: That was the first line of what the Executive Order asks for, so that's what people are responding to: producing an SBOM.

356
01:10:57.140 --> 01:10:59.880
Laurie Williams: the next step, which is,

357
01:11:00.000 --> 01:11:12.810
Laurie Williams: storing an SBOM and sharing an SBOM. There are just a lot of technical complexities, like, how do people share an SBOM between organizations,

358
01:11:12.820 --> 01:11:31.669
Laurie Williams: doing it securely, sharing it only with certain people, because there could be proprietary information in there that I only want to share with certain people. The government, like, should every agency contact Microsoft for the same SBOM, or is there a way to share it? These are things that are not worked out yet.

359
01:11:31.680 --> 01:11:36.889
Laurie Williams: And then the next phase is actually using the SBOM,

360
01:11:36.900 --> 01:11:38.599
using it for good.

361
01:11:38.610 --> 01:11:58.109
Laurie Williams: And there aren't really tools available yet, so there's an opportunity; I know that vendors know that. But actually using it for good... So the low scores, I'll say, are twofold. One is all of the technical aspects of moving beyond producing it,
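
A minimal sketch of the "using the SBOM" phase discussed here, assuming a hypothetical CycloneDX-style JSON document and an invented advisory list; real SBOMs and vulnerability feeds are far richer than this:

```python
import json

# A tiny, hypothetical CycloneDX-style SBOM (the real format has many more fields).
SBOM_JSON = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "components": [
    {"type": "library", "name": "log4j-core", "version": "2.14.1"},
    {"type": "library", "name": "requests", "version": "2.31.0"}
  ]
}
"""

# Invented advisory data: (name, version) pairs an organization might
# pull from a vulnerability feed.
KNOWN_VULNERABLE = {("log4j-core", "2.14.1")}

def affected_components(sbom_text, advisories):
    """Return SBOM components that match a known advisory."""
    sbom = json.loads(sbom_text)
    return [
        c for c in sbom.get("components", [])
        if (c.get("name"), c.get("version")) in advisories
    ]

if __name__ == "__main__":
    for c in affected_components(SBOM_JSON, KNOWN_VULNERABLE):
        print(f"affected: {c['name']} {c['version']}")
```

This is the response-side value Williams describes: when an advisory lands, an organization that has collected SBOMs can answer "are we affected?" by a lookup rather than a scramble.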

362
01:11:58.190 --> 01:12:07.369
Laurie Williams: to sharing it and using it; that's one. But then the other is, I'll say that I've heard more than one time from people,

363
01:12:08.290 --> 01:12:12.209
Laurie Williams: what attack vector does the SBOM mitigate?

364
01:12:12.880 --> 01:12:18.859
Laurie Williams: And so they're not really feeling the urgency, because they want to stop the attacks.

365
01:12:19.140 --> 01:12:24.539
Laurie Williams: So if you think about prevent, detect, respond, the SBOM is a respond one;

366
01:12:24.570 --> 01:12:30.990
Laurie Williams: it's not a prevent one. And since people are focused on preventing attacks, they're not as motivated.

367
01:12:31.000 --> 01:12:35.650
Laurie Williams: And not everyone has to produce an SBOM, only those who sell to the US government.

368
01:12:37.190 --> 01:13:01.729
Jeremy Epstein: Twenty-odd years ago, an executive at Johnson and Johnson told me that they weren't worried about attacks, because who would hack a baby powder company? That was an early and fairly naive perspective, and I know they've significantly changed their view over the intervening decades.

369
01:13:01.750 --> 01:13:08.599
Jeremy Epstein: But are there some organizations that are just so small and

370
01:13:09.250 --> 01:13:10.630
Jeremy Epstein: hmm

371
01:13:10.640 --> 01:13:31.509
Jeremy Epstein: insignificant from an attacker's perspective that they don't need to worry about it? I mean, there are companies that make software for pizza parlors or things like that. Are there some that can just say, yeah, there's a bunch of stuff, don't worry about it? Or does every company that's creating software really need to focus on it?

372
01:13:31.610 --> 01:13:34.120
Laurie Williams: I do think every software company

373
01:13:34.170 --> 01:13:36.189
Laurie Williams: needs to focus on it,

374
01:13:36.200 --> 01:13:55.789
Laurie Williams: because once something is in the supply chain, anyone can be attacked, whether it's a pizza parlor or whatever. We saw that back with the Mirai attack, whenever that was: every home monitor that had a default password got attacked, and then the Internet got attacked because of it. So

375
01:13:55.800 --> 01:14:03.019
Laurie Williams: I don't think anyone's immune to it. However, for sure, there's a lack of awareness.

376
01:14:03.090 --> 01:14:11.179
Laurie Williams: You know, I have talked to small vendors and explained, like,

377
01:14:11.400 --> 01:14:17.109
Laurie Williams: you can't just take a popular component off the Internet and assume it's good.

378
01:14:17.400 --> 01:14:30.989
Laurie Williams: They believe it is, like, oh, all the software that's in an open-source ecosystem is secure. No, not true, and they really do believe that. So there is a lot of work. And

379
01:14:31.000 --> 01:14:47.220
Laurie Williams: also, it's kind of a flip-flop. It used to be, if the download rate is high, a lot of people use it, therefore it must be secure, because everyone else uses it. But actually, now it's kind of like, if a lot of people download it, it's

380
01:14:47.230 --> 01:14:51.969
Laurie Williams: really prime to be a target of a supply chain attack, because

381
01:14:52.580 --> 01:14:56.890
Laurie Williams: the attack could be leveraged to a high degree. So a lot to watch for.

382
01:14:56.900 --> 01:15:24.040
Jeremy Epstein: Right. There's a value to security through obscurity, in the sense that if you're using something that everybody else is not using, even though it may have all these problems, attackers are less likely to have spent the energy to figure out how to attack you through it. Unless they're specifically aiming at you; if they're aiming at you, you're in trouble. But the goal is, as the old joke goes,

383
01:15:24.050 --> 01:15:27.389
You don't have to run faster than the bear. You have to run faster than your friend.

384
01:15:27.400 --> 01:15:28.490
Jeremy Epstein: Yeah,

385
01:15:28.500 --> 01:15:29.449
Jeremy Epstein: that's right

386
01:15:30.150 --> 01:15:31.240
Laurie Williams: there.

387
01:15:31.250 --> 01:16:00.180
Jeremy Epstein: There's another question here: in these frameworks and guides, there seems to be a lot of focus on individual components, almost like completing a checklist. The people with whom you have met, however, have expressed an earnest desire to do better than meeting the minimum requirements that checkbox compliance would achieve. How can these people and their organizations model the interactions at a system rather than component level,

388
01:16:00.190 --> 01:16:11.110
Jeremy Epstein: including human factors, to help identify and mitigate risks of system failure, which can occur even when every component is individually compliant?

389
01:16:11.120 --> 01:16:20.889
Laurie Williams: Right. So there are some individual tasks, like, threat modeling is a good example of one.

390
01:16:27.080 --> 01:16:37.790
Laurie Williams: So there are aspects that are system-wide: system architecture, system design checks that are in there.

391
01:16:37.800 --> 01:16:41.500
Laurie Williams: But there was another question besides that. What was it?

392
01:16:41.960 --> 01:16:45.570
Laurie Williams: So it is more holistic, it is more holistic

393
01:16:45.580 --> 01:16:47.189
Laurie Williams: of the human aspects.

394
01:16:47.200 --> 01:17:06.890
Laurie Williams: So you are absolutely right, and I've thought about that, the human aspects and how to get them in. In the Frontier that Jeremy supports, we have a whole focus on human aspects, because if you think about the adoption of any of these seventy-two tasks, every single one of them

395
01:17:06.930 --> 01:17:22.510
Laurie Williams: requires a human to adopt it, and humans still care about producing functionality and not being interrupted in their workflow and whatnot. So I would say, if you take all these seventy-two and put a circle around them, that circle would be the human,

396
01:17:23.940 --> 01:17:26.170
Laurie Williams: not an individual task.

397
01:17:27.200 --> 01:17:34.310
Jeremy Epstein: So, going back to the issue of smaller organizations, our government

398
01:17:34.320 --> 01:17:56.879
Jeremy Epstein: agencies, like, say, the Small Business Administration, are they a vehicle to get out some of the word that you're generating here? I don't worry about Microsoft and IBM and Google; they certainly have all these issues, but they also have the resources to address them. But how do we help

399
01:17:56.890 --> 01:18:07.189
Jeremy Epstein: the small companies? Is it SBA? Is it the Chambers of Commerce? What is it that can help them to even know that there is a problem?

400
01:18:07.200 --> 01:18:12.299
Laurie Williams: Yeah, good question. I think all of the above that you said,

401
01:18:13.790 --> 01:18:16.030
Laurie Williams: to get the word out. And

402
01:18:16.450 --> 01:18:23.519
Laurie Williams: and thinking about where to publish these types of things so that they're read by the right people.

403
01:18:23.660 --> 01:18:26.330
Laurie Williams: But yeah, the word definitely needs to get out

404
01:18:27.110 --> 01:18:42.030
Jeremy Epstein: Another question from the chat: on the build provenance discussions that you talked about, are there publicly available sources that academics can use to study build provenance, or the lack thereof?

405
01:18:42.460 --> 01:18:55.700
Laurie Williams: Well, for people interested in build provenance, I would certainly look at two things. One is the SLSA framework, S-L-S-A, Supply-chain Levels for Software Artifacts, and the in-toto framework,

406
01:18:55.970 --> 01:19:02.919
Laurie Williams: which is a cryptographic expression of provenance. So those are the two things I would look at.
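
A rough sketch of what consuming SLSA-style provenance can look like on the policy side; the statement, builder identity, and digests below are invented, and real in-toto verification also checks the signature on the attestation envelope:

```python
import json

# A pared-down, hypothetical SLSA-style provenance statement; real in-toto
# attestations are signed envelopes with many more fields.
PROVENANCE = json.loads("""
{
  "_type": "https://in-toto.io/Statement/v1",
  "subject": [{"name": "app.tar.gz",
               "digest": {"sha256": "c0ffee"}}],
  "predicateType": "https://slsa.dev/provenance/v1",
  "predicate": {"runDetails": {"builder": {"id": "https://example.com/trusted-builder"}}}
}
""")

# Invented policy: builders this organization chooses to trust.
TRUSTED_BUILDERS = {"https://example.com/trusted-builder"}

def check_provenance(statement, artifact_name, expected_sha256, trusted_builders):
    """Very rough policy check: right artifact, right digest, trusted builder."""
    subjects = {s["name"]: s["digest"]["sha256"] for s in statement["subject"]}
    builder = statement["predicate"]["runDetails"]["builder"]["id"]
    return (subjects.get(artifact_name) == expected_sha256
            and builder in trusted_builders)

if __name__ == "__main__":
    ok = check_provenance(PROVENANCE, "app.tar.gz", "c0ffee", TRUSTED_BUILDERS)
    print("provenance accepted" if ok else "provenance rejected")
```

The point of the frameworks Williams names is exactly this shape of question: did a builder I trust produce the exact bytes I'm about to deploy?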

407
01:19:04.390 --> 01:19:28.920
Jeremy Epstein: One of the things that academics obviously always worry about is, where do I publish the results? So are there places that folks should be looking? IEEE SecDev is one, and you and I worked together on the IEEE Security and Privacy Magazine. Are there other places that folks should be looking to understand the latest research?

408
01:19:29.130 --> 01:19:39.940
Laurie Williams: I mean, I think supply chain research can also be published in software engineering places, like ICSE, or,

409
01:19:39.950 --> 01:19:57.290
Laurie Williams: I haven't published there yet, but I think that it would be fine. But then I think all of the classic... so Will Enck, who's my partner in crime at NC State, he's the program chair for Oakland / S&P next year, and he put supply chain specifically in the call.

410
01:19:57.300 --> 01:20:02.030
Laurie Williams: So I think all of the classic security venues are great places to look.

411
01:20:02.290 --> 01:20:16.300
Jeremy Epstein: How about coming at it from the business side? Are there places where people publish the business side of supply chain, things that computer scientists might never have heard of?

412
01:20:17.660 --> 01:20:21.090
Laurie Williams: I can't say that I know the answer to that

413
01:20:21.100 --> 01:20:21.849
Jeremy Epstein: she's.

414
01:20:21.940 --> 01:20:32.570
Jeremy Epstein: I bet there are, because I know there are people in business schools looking at these issues from a completely different angle. Okay, cool.

415
01:20:32.890 --> 01:20:50.049
Jeremy Epstein: So if we were in person, let's see if this is going to work, if we were in person, I would be giving you this challenge coin. Let's see if I can turn it so you can see it. I'm going to take it out of the little...

416
01:20:50.060 --> 01:20:57.569
Jeremy Epstein: Can I do this without ripping it? Yeah. So we created these challenge coins in a,

417
01:20:57.910 --> 01:21:27.710
Jeremy Epstein: there, maybe. Yeah, there we go. It's got Grace Hopper, Lady Ada Lovelace, and Alan Turing on one side and the NSF logo on the other. This is the CISE challenge coin. If we were in person I would be handing it to you; since we're not in person, I will be mailing it to you. But this is a symbol of NSF's and CISE's appreciation of your participation in our distinguished lecture series today.

418
01:21:29.130 --> 01:21:30.789
Jeremy Epstein: Um: Okay. So

419
01:21:30.800 --> 01:21:31.690
Laurie Williams: okay,

420
01:21:31.700 --> 01:21:58.140
Jeremy Epstein: yeah, yeah, um. So we um have a break now until for Nsf. Folks are welcome to join at three o'clock this afternoon, where we'll have more of an informal discussion where I won't be asking all the questions, but there'll be more of an informal discussion. Um, if there are questions from participants

421
01:21:58.150 --> 01:22:15.490
Jeremy Epstein: who are not Nsf. Folks uh i'm sure and worry would be happy to, uh respond to to email or or something like that. Um, Any Any closing thoughts, Lori, before we let people get some lunch or whatever meal is appropriate.

422
01:22:15.500 --> 01:22:17.689
Jeremy Epstein: Whatever is appropriate for your time of day.

423
01:22:17.700 --> 01:22:34.970
Laurie Williams: Yeah. The only closing thought I'll have is, send me an email. I'm happy to share the framework, as I mentioned before. From a research standpoint, there are only collaborators, no competitors. I'm interested in hearing what you're interested in, so that we can all work together.

424
01:22:35.140 --> 01:22:35.790
It's

425
01:22:35.800 --> 01:22:48.930
Jeremy Epstein: well, thank you again, Lori. I look forward to seeing you at three o'clock. Hope to see many of the Nsf. Folks at three o'clock, and to everybody Nsf. Or not. Nsf: Thanks for participating in Today's distinguished lecture.

426
01:22:49.760 --> 01:22:51.669
Laurie Williams: All right. Thank you

427
01:22:51.680 --> 01:22:54.259
Jeremy Epstein: all right. I'll see you later. Bye bye.

