This post is a quick recap of Global Drone Security Network (GDSN) #3.
We were honoured to host a presentation from David Kovar of URSA. If you haven't watched his talk "Understanding CUAS - CUAS Test & Evaluation and CUAS Forensics", please visit our YouTube channel.
Understanding CUAS - CUAS Test & Evaluation and CUAS Forensics
Thank you very much for the opportunity. I really enjoyed participating the last time we did this. You've got a great set of presenters and a great audience, and the opportunity to engage in conversations with peers and with people who are facing these challenges is tremendously valuable. None of us are experts in everything; we are experts to a point, but none of us have learned everything that we need to know. We are constantly learning. Some of that comes from the sort of research and development that you and I were talking about, but a lot of it comes from conversations. We were talking about that trusted third party, and we were also talking about intelligence sharing. Participating in opportunities such as this, in your Slack channel, and via your platform is a really great way for everybody who's interested in these problems, and contributing to solving them, to have a voice and to help each other learn. So I'm really excited about that and very appreciative of everybody who's participating. If you just joined us and missed the first 15 minutes: my name is David Kovar, and I founded and run a company called URSA Inc. URSA stands for Unmanned and Robotic Systems Analysis. Our focus is on making sense of the behavior of unmanned systems, and in particular UAVs. That's my phone number, that's my email address. This deck will be available at the end of the presentation. There's a URL in here that goes to a series of blog posts that I did on this topic. So everything that you're about to see, you're going to get all these slides, but if you go to that URL, you're going to get a lot more information that digs into all the various aspects of what I'm about to talk about. You're welcome to reuse this information; the only thing I ask is that you cite our effort. And if you're interested in having conversations about it, do let me know. I'm happy to engage in those conversations. My background.
I've been doing digital forensics and cybersecurity investigations for a very long time now. For those not familiar with the terms: digital forensics, from my perspective, is about extracting information from individual devices and doing a really deep dive on each device. Cybersecurity, and particularly incident response investigations, are really about extracting enormous volumes of information from a very large environment, a Fortune 50 company, for example. I've done both of those and other cybersecurity-related work, and that really helped inform what we're doing now, which is extracting information from individual UAVs or Counter UAS systems, but also extracting information from the very much larger environment: 50 Counter UAS systems scattered around the United States, and UAVs that are found, or otherwise have their evidence extracted, all over the world, and bringing all that together to help do trend analysis, threat analysis, and all sorts of other things. I've got five-plus years of private sector UAV forensics and analysis; I suspect that puts me in the top some very small percentage in the world. I've really been doing this stuff for a very long time. There are other people out there that are as good as I am or better. I'm not saying that I'm the best. I'm just saying this is a passion of mine, and I'm very excited and thrilled to be able to participate in the community in doing this sort of work. In the lead-up to this, we were talking about trusted, unbiased third parties. That is very important to me, to our firm, and to the people I work with. We have a strong desire to understand and to show how systems work, and we try to do this in a supportive manner that does not represent the financial or other interests of any particular party. We've been developing a general purpose telemetry analysis platform for analyzing and visualizing telemetry data.
We can currently extract information from a wide variety of UAVs, from Counter UAS systems, from ADS-B data, which is manned aircraft telemetry, and from AIS data, which is marine telemetry. And we're looking at how to intersect all of that and tell stories of not just how UAVs are operating, but how all these systems are interoperating and relating to each other. So that's the end of the marketing pitch.
What are we here to talk about? Counter UAS systems are a really big thing, and they have been for probably five years now. There's been an enormous amount of VC money invested in them, and in 2019 Bard College did a report saying that there were over 250 Counter UAS products on the market. That number has only gone up since then. The problem is figuring out which of those systems work, and you have to define what "work" means. And then you've got to figure out: okay, against what threats, and in what environment? So you really need to have a formal test and evaluation process that helps guide your procurement process, and that's the primary thing that we're here to talk about. Given my background, it's important for me to remind everybody that test and evaluation and forensics sit on the same foundation. Both of those efforts really need to understand how the systems are performing and to be able to document that. They're working towards different ends, and they're looking at different data, but they sit on a similar foundation. I mentioned during the lead-up to this that we were way out in front of the bleeding edge of doing UAV forensics when we started. We're out in front of the need for Counter UAS forensics as well. The number of shootdowns, or the number of times where a Counter UAS system has interacted with a malicious UAV in a way that led to some sort of court case or prosecution, is very, very small at the moment. I think it's reasonable to assume that over the next couple of years, as Counter UAS systems come online within the military, within law enforcement, but also within the civilian sector, protecting oil and gas and things like that, we're going to have more cases where a Counter UAS system is involved with a malicious UAV, and it's going to go in front of a judge and jury. And so when that happens, or before that happens, we need to know how to do forensics.
Digital forensics not just on the UAV that was involved, but on all the other systems that were participating, so we can tell the full story. So that's why Counter UAS forensics. If you want to test and evaluate Counter UAS systems, there's an enormous amount of work involved in generating the data required to do the analysis. The analysis, while very difficult, is just a small part of a much larger effort. I was just involved in a conversation with some people on a Facebook group who were about to do some Counter UAS testing, which reminded me of this particular point. There's a lot of time and a lot of prep work involved in doing all the logistics: getting the people there, getting everything sited, making sure that there's enough network bandwidth for all the traffic that you're about to generate, that you've got real time communications between all the participating groups, and that you've got enough channels available so that everybody's not talking over each other. You need to have a well documented and practiced set of flight plans, and those flight plans should represent the threat models that you want to test against. And you need to have people who are capable of executing those flight plans over and over again, in exactly the same way, whenever possible. And you need to have regulatory approval, not just for the Counter UAS systems, because they may be emitting on certain frequencies which may interfere with other systems in the area, but also for the flight plans. If you think that the threat actor is going to fly within the regulatory envelope, then you don't need regulatory approval for your flight tests. But if you think that the threat actor may come in at 5000 feet and then drop down on top of your site, that particular flight test is probably in violation of the regulatory environment, and so you might want to get a waiver for doing it.
So there's an enormous amount of work involved in doing this stuff right. I'm only going to really talk about the data collection and analysis part of it. If you're interested in the other aspects, let me know; I'm happy to have conversations about those as well. What are we trying to accomplish? Does your Counter UAS system do what you expect? The informal way is exactly what I just did in marketing speak. It's really important that the vendors demonstrate, in a quantifiable manner, how their systems perform in a variety of environments, against a range of targeted flight profiles, and over time as the hardware, software, and configurations evolve. As Mike and I were discussing in the lead-up to this, this is the sort of information that vendors need to be able to share with people, and it's very hard to get them to share it. I would say to all the vendors, and they know this, but I'm going to say it anyhow: the more they are able to share this information in a standardized form, the easier it is for the people doing procurement to say, yes, we understand what your system is and is not capable of, and it aligns or does not align with our use cases. So a little more free-flowing information, with everybody on both sides being a little more transparent, is going to help with the procurement process. It's also important to share information from event to event. Most of these Counter UAS systems are based on some sort of computer system, often with a software defined radio, and there are lots of ways that the hardware and software evolve over time. So the test that you did six months ago in the Arizona desert may be very different from the test that you're doing six months later on a Pacific island. But having the information from both of them helps inform the end user and the vendor how these different systems behave in different environments, which is very important to everybody participating.
Another way of looking at what we're trying to accomplish is: can we do this? And you're going to say, okay, what is "this"? This is an analysis of the UAV's track versus where the Counter UAS system thought the UAV was at a particular point in time. In the upper left, we're comparing the Counter UAS system's perception, which is the red track, versus the TSPI track, and the TSPI is essentially the ground truth; we'll get into that. I mapped this from the real world into ECEF Cartesian coordinates for doing some analysis, which is why it looks as it does. It shows that the UAV flew a little bit sideways and then flew in a straight line, whereas the Counter UAS system thought it was wandering around. The upper right is just showing where it was flying in the real world. The lower two charts show the two dimensional and three dimensional error over time. Early in the flight there was actually a fair bit of error, 250 meters off; towards the end of the flight the error margin was down under 150 meters, and then it got down to around 20 meters. The chart on the right hand side tells a little bit of that story.
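That track-versus-truth comparison can be sketched in a few lines of Python. This is a minimal illustration, not URSA's actual pipeline; the function names are my own, and it uses the standard WGS 84 geodetic-to-ECEF conversion before measuring the straight-line error between a Counter UAS location report and the matching TSPI ground-truth point.

```python
import math

# WGS 84 ellipsoid constants
A = 6378137.0                 # semi-major axis, meters
F = 1 / 298.257223563         # flattening
E2 = F * (2 - F)              # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert WGS 84 lat/long/altitude to ECEF Cartesian coordinates (meters)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + alt_m) * math.sin(lat)
    return (x, y, z)

def error_3d(perceived, truth):
    """Straight-line distance in meters between a Counter UAS location
    report and the matching TSPI point, each given as (lat, lon, alt)."""
    return math.dist(geodetic_to_ecef(*perceived), geodetic_to_ecef(*truth))
```

From there, plotting `error_3d` for each time-matched pair of reports gives exactly the error-over-time charts described above.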
As the UAV got closer to the sensor, the system got more accurate. And there's a variety of other similar sorts of analysis that you can do. The take home point here is that you need to be able to do this sort of analysis, and to do it you need two things: you need to know what questions you want to ask, and you need to gather the information required to answer those questions. One of the things you need to understand, or agree on, is what Counter UAS effectiveness means. In the military, and I'm oversimplifying, so bear with me, Counter UAS effectiveness is essentially: did it stop the threat? There's one or more UAVs coming at us. Did we manage to stop them before they had their intended effect, whether that's ISR or dropping munitions or whatever? If the UAV is one mile out and has been waved off or somehow or other stopped, then the system was effective, and the nuance is less important. As Mike and I were discussing in the lead-up, in the civilian space, or when you're not on the battlefield, there are a lot of other factors that come into play to help determine what Counter UAS effectiveness means. Some of the things that affect it are rules of engagement. The US military, within the bounds of the United States, has very different rules of engagement than they do in a war zone. The regulatory environments are very restrictive, and Mike and I were talking about that in the lead-up as well. We have Counter UAS capabilities in the United States that would enable us to stop a lot more UAVs coming across the southern border, if we had the regulatory approval to do so. Or they would have the capability of stopping UAVs flying over a nuclear power plant; there's a bunch of great articles from an organization called The Drive about UAVs surveilling nuclear facilities in the United States over the last couple of years, and we're not able to stop them because of the regulatory environment.
We want to collect evidence, and we want to go to court with it. If you lase the UAV out of the sky and melt it down to slag, there's not a lot of evidence left. The cost of systems is a big factor, as is the fact that we're operating in non-battlefield conditions. So all of these go into determining what the effectiveness of the Counter UAS system is. I'm going to talk through this slide, but I think it's really important that if you're interested in this topic, you pick up my presentation and read through the slide. If you're doing tests, or thinking about doing tests, or you just want to understand what Counter UAS systems are capable of doing, read through all these points, really internalize them, and talk them over with the people that you're working with, whether it's your team or the vendor or whoever, so you're all on the same page about what these terms mean. This is a great thing to do as a tabletop exercise. It's a great team building exercise, and also really important to accomplish before you go off and start doing selection. The first term is detection. What does it mean to detect the UAV? This is a pretty good definition: the Counter UAS system reports that there's an object, and I want to emphasize that includes birds and commercial planes, and that's viewed as a detection. It detected something out there, and that's it. The next step helps refine what that "it" is and starts getting into the countering part. Once you've detected it, and you could have many detections, I've got radar images where there's one UAV and there are flocks of birds around it, so there's an enormous number of detections but only one UAV classified. Classification is when the Counter UAS system, with or without support from its operator or other systems, determines that the detection is a small UAS. It may be that the operator is sitting there with a pair of field glasses.
The operator looks out along the bearing the Counter UAS system reports and sees that it's a UAV coming in. That's still classification. We're not saying it's got to be classified by the Counter UAS system itself; it just needs to be classified before you move on to the next step. There are a lot of Counter UAS systems out there which are integrated systems. So you've got an RF detector, an acoustic detector, and then some sort of optical system that is slewed along the bearing to look at where the other system is saying there's a detection, and the optical system, using some sort of visual analysis, or even a human looking at the image, says: okay, that is a UAV. So classification needs to happen. It is also really important to locate that object in space. If you're defending your perimeter, you're a prison or a nuclear facility or whatever, your rules of engagement, your response plan, all of that depends on exactly where this incoming object is. Is it three miles out and holding steady? It may not be a threat, or it may be doing ISR. Is it coming in at 200 miles per hour, 10 meters off the deck, directly at us? That helps guide us towards what the intent is. One of the truths of doing test and evaluation is that if your bearing to the UAV is off by five degrees at three miles, that's a lot less important than if you're off by five degrees at 50 feet. So that's one of those analysis parts: if you just say, okay, the bearing accuracy is 5%, that's great, but you need to know 5% at what distances to really put that 5% in context. Location, and the accuracy of that location, gets more and more important as the object gets closer in. Location can be a three dimensional point, lat, long, alt; it could be a circle or sphere, so somewhere within this sphere; it could be a line or bearing; radar generally produces azimuth and elevation; or other forms.
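To make the five-degrees-at-three-miles point concrete, here's a back-of-the-envelope sketch (my own helper, not something from the talk) of how the same angular error translates into very different positional errors at different ranges:

```python
import math

def cross_range_error(range_m, bearing_error_deg):
    """Approximate positional error, in meters, produced by a fixed
    bearing error at a given range: error = range * sin(angle)."""
    return range_m * math.sin(math.radians(bearing_error_deg))

MILE = 1609.344   # meters per statute mile
FOOT = 0.3048     # meters per foot

# The same 5-degree bearing error at 3 miles versus at 50 feet:
far  = cross_range_error(3 * MILE, 5.0)   # roughly 420 meters off
near = cross_range_error(50 * FOOT, 5.0)  # roughly 1.3 meters off
```

So reporting a single accuracy figure without the range it was measured at hides a factor of several hundred in real positional error.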
So understand the accuracy and understand what that location is. A track is basically a compilation of location reports over a period of time. Make sure that the track consists of only the location reports from the same object, and that you're not confusing objects together. The second sentence in that definition is important. Tracks can be displayed as a line or a sequence of dots. The lines are really pretty, but there are two big problems with displaying a track as a line. The first is a classic: two points make a straight line. If you only have two detections and two locations, and they span five miles, you're going to have this beautiful straight line that shows the UAV flying in a straight line for five miles. If you switch that from a line to a sequence of dots, you will immediately understand that what you have is only two detection points, and you'll wonder what the heck was going on in between them. So keep in mind how you're visualizing the data, because certain types of visualization will cause the observer to come to conclusions that you may not want them to come to, or that actually work against your ultimate objective. I think it's important for people to deploy Counter UAS systems even if they don't mitigate. Understanding what's going on in your airspace, understanding what the threats are, understanding all that sort of stuff is important.
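That track definition, a time-ordered compilation of location reports from a single object, can be sketched as follows. The report fields (`object_id`, `time`, `position`) are hypothetical names for illustration:

```python
from collections import defaultdict

def build_tracks(reports):
    """Group location reports into per-object tracks.

    Each report is a dict with hypothetical keys 'object_id', 'time',
    and 'position'. Reports for the same object are kept together and
    sorted by time, so no track ever mixes two objects' locations.
    """
    tracks = defaultdict(list)
    for report in reports:
        tracks[report["object_id"]].append(report)
    for track in tracks.values():
        track.sort(key=lambda r: r["time"])
    return dict(tracks)
```

Rendering each track as its sequence of dots, rather than joining the dots into a line, is then an honest picture of exactly how many detections you actually have.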
So if your Counter UAS system only does detect, classify, locate, and track, that's pretty good unto itself. However, ultimately you're going to want to mitigate, and there are a variety of forms of mitigation. You can negate it, you can interdict it, you can neutralize it, you can destroy it, you can send it home. All these things are a form of mitigation. It's really important to be able to identify the point at which the Counter UAS system was actually stopping the UAS from accomplishing its mission. If the Counter UAS system says, we are now jamming the UAV, you might say that's the mitigation. But if the UAV continues on its course for another 30 seconds before that jamming actually has an effect on it, both of those points matter. When the mitigation started is important, but when the mitigation effort actually took effect is also very important. So when you're doing mitigation, make sure you're tracking both. If we say a system detects the UAS, most of us will understand it. But it's also important to understand where all of this information is presented: it can be in a log file, a user interface, an audible alert, all these things. So understanding that we detected it is important; understanding how that communication is conveyed to the observer or the operator, but also to the system doing the recording, is also very important. So agree on the terms, understand what they mean, and understand how that information is communicated. I mentioned TSPI devices. TSPI stands for time, space, position information. It is a device that's designed to accurately, and sometimes very accurately, record where that device is in time and space, and sometimes also its orientation. For Counter UAS testing, it's the definitive source of truth on which everything else depends. All these UAVs have their own log files which show where they are; you can get them from DJI, you can get them from Pixhawk, all of them record this information.
A couple of problems. One is that sometimes it's hard to get that information off; DJI encodes its logs and makes it difficult to get the log files off. The other two problems are related to the fact that the position information oftentimes isn't terribly accurate. I've seen multiple sources of position information in a DJI aircraft essentially disagree with each other, because some of it comes from the barometric sensor, some comes from the GPS, and some comes from predicting where the aircraft is going to be. All that stuff is challenging, and the GPS itself may not be terribly accurate. So having an external source of truth mounted onto the UAV to collect this information is, I think, really important. If the GPS onboard the aircraft is only accurate to 40 meters, then any conclusion you come to about how accurate the Counter UAS system is can never be more accurate than 40 meters. The other problem is that if you're jamming the UAV, then all of this time space position information becomes suspect as soon as that effect starts coming into play, as soon as the Counter UAS system starts having an effect on the UAV. This is one of those reference slides. If you're interested in TSPI pucks, and by the way, there are not a lot out there, we developed a prototype for the US Air Force, and we're working on getting funding for a production model. This is just some notes on what you're looking for in a TSPI device. Fundamentally, oftentimes once you do a test you don't have an opportunity to recreate it. You can't fly the aircraft again; there are other aircraft you need to fly, the vendors need to leave, whatever. If you don't collect this information correctly the first time, getting it again later is hard. So making the TSPI puck as fault tolerant as possible is really, really important. It should be sufficiently weather resistant and impact resistant to survive the normal operating environment.
If you're going to shoot the UAV down and it's going to fall to earth from 300 meters, the TSPI puck should survive it. You should have a real time data link when possible, which helps mitigate the problems of shooting it down, but still make it fault tolerant and physically robust. That's most of the first set of bullet points. It's got to be accurate. It's got to be in some sort of common format. It's got to be trusted. You want to reduce the SWaP; SWaP stands for size, weight, and power. I've got a really bad TSPI puck with me, though I won't pick it up. It was very large and had an impact on the flight performance of the UAV, and it increased the radar cross section. You want to do as little as possible to affect the flight characteristics and the physical characteristics of the UAV that you're mounting this on. The power should also be independent of the target; you don't want to draw power from the target either. Whenever possible, provide a real time downlink. That means you're getting the data as it's generated, which provides situational awareness that is really helpful for running these sorts of exercises. But also, if the TSPI puck lands in the river, or it gets lased, or whatever, you still get the data. So that's important. The data link should not interfere with, or be affected by, the Counter UAS systems in the exercise. If you have a 933 megahertz control frequency, and you're using the same 933 megahertz frequency for downlinking your data, and somebody jams that link as part of the mitigation, you've just lost your TSPI link connection as well. Human observers: we would love to fully automate Counter UAS test and evaluation, because humans are error prone and converting analog data, my speaking or my written notes, into digital form is hard. But human observers are really important when doing this sort of work. There's a lot of information that's not available from, or not recorded by, electronic sensors.
Seeing the UAV wobble in flight when the mitigation takes effect, for example, is an important thing to note, because it may not get captured in the digital data, or may be hard to get to. Recording when the audible alert is put out for the operators is important. Viewing the user interface of the Counter UAS system is important. And it's not in here, but it's in the long form blog posts: I would recommend that if you've got Counter UAS system displays, you do a video capture of them if at all possible, because then, when you've got questions two months later during data analysis about what the observer was seeing, you can go look at the video and say, okay, this is exactly what was going on. So it's worth keeping in mind. That's where the human observers come in. They can help us answer how many UAVs were actually in the air. I was participating in an exercise where there was a lot of confusion about how many UAVs were actually available to be detected by the Counter UAS system. You can get the humans to record this and do it accurately: which TSPI pucks go on which aircraft, which is very important for knowing which track goes with which aircraft; when the aircraft went in the river; did the Counter UAS system detect the UAV, mitigate it, and so on. All of this is information that's really important to have a human record and capture in some sort of digital form. Counter UAS data sources. There are a lot of sources, and which source you want to use for Counter UAS test and evaluation or forensics depends on a lot of different factors. I'm not going to tell you which one is best for your circumstances; you're going to need to figure that out based on what you're trying to accomplish and the resources you have available. The first of the four primary data sources is the vendor log files. These are computer systems, and they are generating voluminous logs.
If you've done incident response or digital forensics, you're familiar with this. This is very high fidelity information. It's logged in real time, but it's not available in real time. It's really poorly documented: some engineer is going to add a new log message, and it gets documented months later, if at all. So subject matter expertise is often required to really understand these log files. However, it's a really great source of information. Most of the vendors have APIs, application programming interfaces. You can tie into those and get a lot of real time information from the system. It's vendor curated, in the sense that the vendor is presenting information to the client, the software client, in a particular form, and only the information the vendor wants to share. But that is really the information you're probably looking for for doing test and evaluation. It is available in real time, which is really important. It is transient, though, so if you want to preserve the information that's available via the API, you, the test and evaluation people, should log it and preserve it. The API is well documented, it's user friendly, all that sort of stuff. So in general, if you can tie into this, that's the best option. Supporting a client for doing all the logging using the vendor's API does require software engineering, testing, and validation of that process, so it's non-trivial to get right, and there's a standards discussion to have around that. The vendor user interface is incredibly highly curated. The vendor is defining the experience, what information is shared, and how it's shared. It's real time, and it is transient: if you want to preserve it, you've got to do a video recording of it.
And it's the ultimate user experience. As I said, preserving this information to understand what the user experience was is really important. It may turn out that the vendor log files say we detected, we classified, we tracked, and we mitigated, but none of that information ever got to the user interface, and so ultimately the system failed, because the operator didn't get that information. So if you have the resources available, combining those two sources gives you much better context. Standard and proprietary integration layers are where I think we should be if we want to do standardized Counter UAS test and evaluation at scale. In the US there are two major command and control layers: one is FAAD C2, the other is MEDUSA. Basically, these are very well designed and standardized ways for various sensors to communicate with each other and with control systems. If all of your sensors can participate in one of those layers, and you can pull information out of it, it's probably the best source of information, because those standards make sure that all the data is normalized and that you're getting the same information from all the sensors. If you're playing at the DoD level, that's easy. If you're working down at the consumer and commercial level, in the sub-$100,000 sensor range, it's probably a little bit harder. Hopefully, that will change. Counter UAS systems generate enormous volumes of data, a lot of it of no interest to many of us, and different use cases will really determine what data you're looking at. Depending on what you're trying to accomplish, which of those four sources you choose is likely to be obvious. In the exercise I was working on, we worked with vendor log files and had a great experience. We had some challenges, and we had to do a lot of normalization, which I'll talk about.
And it's worth noting that the easier it is to obtain the data, the farther you are away from the unvarnished truth. So just keep that in mind. Garbage in, garbage out. I'm going to start moving a little more quickly through this. If your raw data is flawed, there's no hope for meaningful analysis, and you probably can't go collect it again. If it is questionable, if there's doubt about how accurate it is, then the analysts are going to spend a lot of time getting rid of all those questions, and that's not time well spent. Get it right the first time. If it was poorly organized, you know, you drop it on a thumb drive and you don't upload that thumb drive to the right place, all that sort of stuff, then the data is useless. You've got to make it available to people in a form they can trust. And then there's normalization: if it's not in the same formats, units, and geospatial reference models, then a lot of resources are required to normalize it before any meaningful analysis can be done.
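As a tiny illustration of the time side of that normalization, assuming every sensor emits ISO 8601 timestamps with explicit UTC offsets, which not every logger does, a helper like this brings mixed-zone records onto one clock:

```python
from datetime import datetime, timezone

def to_utc(timestamp: str) -> datetime:
    """Normalize an ISO 8601 timestamp carrying any UTC offset to UTC,
    so records from different sensors can be compared directly."""
    return datetime.fromisoformat(timestamp).astimezone(timezone.utc)

# Two sensors logging the same instant in different local zones:
a = to_utc("2021-06-01T10:00:00-04:00")
b = to_utc("2021-06-01T16:00:00+02:00")
# Once both are in UTC, a and b compare equal.
```

Logs without an explicit offset are exactly the "questionable data" problem above: the analyst has to reconstruct what zone the sensor was in before any comparison is trustworthy.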
So get the normalization right at the outset. Data normalization: to really compare this information, and we saw why we want to compare it, we want to compare where the Counter UAS system thought the target was versus where the target really was, we must use common frames of reference. There are simple ways of doing this, and there are very sophisticated adjustments; the effort is to bring everything into a common frame of reference. At the bare minimum, you should use the same time zone, I recommend UTC, and the WGS 84 reference model, with lat/long/altitude for the physical location of participating systems. It's worth noting that radar systems will not generate lat/long/altitude; they'll generate bearing and elevation. So you need to have some way of putting that information into the common framework. This is a slide that you should pick up from the deck afterwards and digest. It's talking about all the data collection steps. On the left side is what we had to do before the event even kicked off. In the middle is all the data collection that we did during the event. And then there was all the data collection after. So there are a lot of different elements of data collection that we had to get right for everything to come together correctly. This is part of the prep work, but it's also part of the point that if we get better at sharing what worked for one exercise with other people, then the next exercise will be easier to do and will generate better data. So this is the sort of thing that we can all collaborate on to help everybody get better. Data visualization: Excel just does not scale, but it's a great starting point. You can also drop things into Google Earth. There are a lot of ways you can get some basic analysis out of this stuff.
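For the radar reports mentioned above, one common way to start bringing an azimuth/elevation/slant-range measurement into a shared spatial framework is a local east/north/up conversion at the sensor. This is a sketch under textbook conventions (azimuth measured clockwise from true north, elevation above the horizon), not any particular vendor's format:

```python
import math

def radar_to_enu(azimuth_deg, elevation_deg, slant_range_m):
    """Convert a radar report (azimuth clockwise from true north,
    elevation above the horizon, slant range in meters) into local
    east/north/up offsets from the sensor, in meters. A geodesy
    library can then carry the point on to WGS 84 lat/long/alt."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    east  = slant_range_m * math.cos(el) * math.sin(az)
    north = slant_range_m * math.cos(el) * math.cos(az)
    up    = slant_range_m * math.sin(el)
    return (east, north, up)
```

With every sensor's reports reduced to one frame and one clock, the track-versus-truth comparison becomes a straightforward per-timestamp subtraction.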
But if you want to test 10 different systems over a period of a week, and then take those results and compare them to a similar exercise done in the Pacific Islands six months later, you need to do this at scale, you need to apply good software engineering to it, you need to put program and process around it. So there's a lot of work involved. I mentioned the state of visualization; this is a very simple form of it. This was done with Python; you can use MATLAB, you can do a lot of things. The take home here, though, is that I've worked with people that kept sharing a lot of conclusions via Excel spreadsheets, pointing at various cells and saying, this means that the system did this at this time. For the person doing the analysis, that probably makes sense. For everybody else who's looking at that Excel spreadsheet on the screen, it's noise; there's no way of following along with what the person is talking about. We're visual learners, and putting the information into well designed graphs is important. Anybody can produce a bad graph; spend some time having discussions with people about how you want to visualize the data, so it's actually representing and telling the story that you want to tell. Comparing Counter UAS track data. I mentioned most of this already. Ultimately, what we want to do is compare the Counter UAS system's perception to where the UAV actually was. This is relatively straightforward if you normalized everything; if you haven't, it's essentially impossible. Google Earth works. Plotting error over time and distance is relatively straightforward. So as much as I am trying to push us all towards a standardized, automated system, you can use Google Earth, you can use Excel, to get a pretty good understanding of how systems did or didn't work. So, the takeaways that I want to leave you with, and that's the definition of a takeaway. Figure out what your use cases are. Are you going to be able to mitigate a UAV?
If not, then you don't even need to worry about the mitigation stuff, or you can choose to de-prioritize it. What's the threat? Is somebody trying to bring contraband into your prison? Is somebody trying to do ISR, surveillance, over your nuclear reactor? What are the threats you're trying to counter? Make sure that you are selecting systems, generating test cases, and generating flight profiles that align with what your use cases are and what you perceive your threat to be. And bear in mind that those evolve. Test in a representative environment. So if you are putting a nuclear reactor on a Pacific island, I guess there are some benefits to doing that, though I'm not too sure who's going to be consuming your energy. Testing the systems in a mountainous environment or a desert environment isn't really gonna be relevant. Similarly, if you've got clear sightlines out for five miles, then your detection ranges are going to be different, and you may be able to use different sensors. So talk it through: whiteboard, tabletop exercise, all that sort of stuff. Make sure that everybody's on the same page in terms of what you're trying to test, what the threat is, what the environment is, and things like that. Select and fly the targets. Again, based on your threat model and your use cases, figure out what the adversary is likely to be flying, and figure out how they're gonna be flying it. You or somebody working for you, or a trusted third party, should go through the acquisition and operation of those targets. I have seen tests where the vendor brought a UAV to the test and flew their own flight profile, and the Counter UAS system worked perfectly. Obviously, that's a little bit rigged. If you just want a general snapshot of whether it works at all, great. But if you want to know if it works in your environment, you need to control the selection and then the operation. If you can operate the system, do so.
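Coming back to comparing track data for a moment: the "plot error over time" step boils down to computing, per timestamp, the offset between the ground-truth track and the Counter UAS system's perceived track. A minimal sketch, assuming both tracks have already been normalized to UTC and WGS 84 (the function and data layout are illustrative, not any particular vendor's format):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def track_error(truth, perceived):
    """Per-timestamp horizontal error (metres) between a ground-truth
    UAV track and a Counter UAS system's perceived track. Both inputs
    are {timestamp: (lat, lon)} dicts, already normalized to UTC and
    WGS 84; only timestamps present in both tracks are compared."""
    errors = {}
    for ts in sorted(truth.keys() & perceived.keys()):
        lat1, lon1 = truth[ts]
        lat2, lon2 = perceived[ts]
        # Equirectangular approximation -- fine for the short
        # separations involved in track-error analysis.
        mean_lat = math.radians((lat1 + lat2) / 2)
        dx = math.radians(lon2 - lon1) * math.cos(mean_lat) * EARTH_RADIUS_M
        dy = math.radians(lat2 - lat1) * EARTH_RADIUS_M
        errors[ts] = math.hypot(dx, dy)
    return errors
```

Once you have that error series, plotting it against time (or against range from the sensor) is a one-liner in any plotting library, and that graph tells a far clearer story than a wall of spreadsheet cells.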
There's an FAA exercise doing Counter UAS test and validation at five different airports in the United States. One of the interesting requirements was that the vendors are obligated to install the system and then walk away from it for a month. So any detection alert needs to get to the FAA via the system as it was installed; it's not gonna get tweaked on-site. Like, oh, wait a second, we didn't properly adjust the sensor for this operating environment, or we didn't understand that you're gonna be using fixed wings. That sort of test is really powerful. There are also times that you want the vendor to evolve the system on site, because doing test and evaluation is a great way for vendors to really improve their systems. So keep in mind what you're trying to accomplish and what your opportunities are. At a bare minimum, get the vendors to explain how their systems work, and to show you what the user interface is, so that when information is coming in via the user interface, you understand it. It's a great opportunity for that sort of information sharing. People traveled to the site; you know, take half a day to train everybody on how all the systems work. It's useful. If you can avoid it, don't run an exercise with just one vendor; get as many vendors as you can to the exercise at the same time. That gives you some opportunities to compare and contrast them. They have intellectual property, so you need to give them the opportunity to make sure that the other vendors are not learning how their systems work. So respect their needs as well as yours. But if you're gonna go through all the effort of running the exercise, get as many vendors there as you can reasonably support. Don't let the vendors reconfigure and modify the system during the exercise (without clearly logging it, at least).
If they change the nature or the configuration of their system during the exercise and you don't know that, then different results over the course of the exercise may be hard to understand without that context. We had a vendor that changed the orientation of one of their panels, and that had a significant effect on some of the results. So just make sure that you document that. Collaborate and share. I'm not saying don't sign non-disclosure agreements; you're probably gonna have to sign non-disclosure agreements. And this goes back to what Mike and I were talking about earlier on. So this is me; get a hold of me. Since I have a couple extra minutes, I'm going to show you one quick thing that is sort of related. There's a variety of Counter UAS systems deployed around Dallas Fort Worth International Airport in Dallas, Texas, well, near Dallas and Fort Worth. We have data from those sensors over a month's time, actually we have more than that, but this is looking at data from that period, and we combined it with ADS-B data. What this shows is all of the times that a UAV came within 2000 meters of a manned aircraft in this particular area; the red dots are within 750 meters. And there are a number of red dots. So this is one of those examples of: there is great value in Counter UAS systems, even if all they do is detect, track, and identify. There's no mitigation going on here. But it's helping the FAA and other people concerned about the airspace understand what's actually going on, so they can make informed decisions, so they can make regulatory decisions. And understanding how effective the systems are is still important for this. So the take home here is that, if you're doing Counter UAS work, there is great value in deploying the systems and sharing the data, as I am doing here, to help everybody understand what's going on.
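A rough sketch of the kind of proximity analysis described here: combine normalized UAV detections with ADS-B tracks and flag every encounter under a distance threshold. The data layout and function name are illustrative assumptions on my part; a production pipeline would use spatial and temporal indexing rather than brute-force pairing:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def close_encounters(uav_tracks, adsb_tracks, threshold_m=2000.0):
    """Flag every moment a detected UAV came within threshold_m of a
    manned aircraft. Inputs are lists of (timestamp, callsign, lat,
    lon, alt_m) tuples, already normalized to UTC and WGS 84."""
    # Group ADS-B points by timestamp so UAV points can be matched
    # against aircraft seen at the same instant.
    by_ts = {}
    for ts, callsign, lat, lon, alt in adsb_tracks:
        by_ts.setdefault(ts, []).append((callsign, lat, lon, alt))
    hits = []
    for ts, uav_id, lat, lon, alt in uav_tracks:
        for callsign, alat, alon, aalt in by_ts.get(ts, []):
            # Equirectangular horizontal offsets plus vertical offset.
            mean_lat = math.radians((lat + alat) / 2)
            dx = math.radians(alon - lon) * math.cos(mean_lat) * EARTH_RADIUS_M
            dy = math.radians(alat - lat) * EARTH_RADIUS_M
            dist = math.hypot(dx, dy, aalt - alt)
            if dist <= threshold_m:
                hits.append((ts, uav_id, callsign, round(dist, 1)))
    return hits
```

Running the same function twice, once at 2000 meters and once at 750 meters, gives you exactly the two populations of dots in the slide.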
Our analysis produces this: it's got the callsign of the aircraft, the callsign of the UAV, the distance, you know, just data crunching. So that is the end of my presentation. I will stop sharing and turn it back over to Mike.