Microsoft is rapidly integrating AI technology across its services under the Copilot brand. After introducing the AI-based Copilot assistant for the Office suite, the company is now turning its attention to cybersecurity.
What is Microsoft Security Copilot?
Microsoft Security Copilot is the company’s new assistant for cybersecurity professionals, designed to help identify and remediate security flaws and to make sense of the large volumes of data analysts handle every day.
Security Copilot combines OpenAI’s GPT-4 with Microsoft’s own security-specific model and presents a simple prompt box, much like any other chatbot. You can ask the assistant something like “Tell me about all the security incidents in my enterprise,” and it will return a summary of the relevant events. Behind the scenes, Security Copilot draws on the 65 trillion signals Microsoft collects every day about security activity and specific threats, a foundation that lets security professionals identify and track threats more accurately.
Microsoft Security Copilot was developed to assist security analysts, not to replace them. The assistant even includes a pinboard section dedicated to collaboration, where colleagues can share information. Security professionals can use the tool to help with incident investigations, or to quickly summarize events and assist in reporting security issues.
Security Copilot accepts natural language input, so security professionals can ask for a summary of a specific vulnerability, feed in a URL, file, or code snippet for analysis, or request incident and alert information. Every prompt and response is stored, giving investigators a clear, auditable trail of their work.
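To make the audit-trail idea concrete, here is a minimal Python sketch of recording every prompt and response so an investigation can be retraced and searched later. The SessionLog and Exchange names, the ask() wrapper, and the stand-in assistant are hypothetical illustrations; Security Copilot’s own interfaces are not public.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable, List

@dataclass
class Exchange:
    # One prompt/response pair, timestamped for later review.
    timestamp: str
    prompt: str
    response: str

class SessionLog:
    """Records every prompt and response so an investigation can be retraced."""

    def __init__(self) -> None:
        self.exchanges: List[Exchange] = []

    def ask(self, assistant: Callable[[str], str], prompt: str) -> str:
        # Send the prompt to the assistant and keep both sides of the exchange.
        response = assistant(prompt)
        self.exchanges.append(
            Exchange(datetime.now(timezone.utc).isoformat(), prompt, response)
        )
        return response

    def search(self, term: str) -> List[Exchange]:
        # Simple case-insensitive search over the stored trail.
        term = term.lower()
        return [
            e for e in self.exchanges
            if term in e.prompt.lower() or term in e.response.lower()
        ]

# Example with a stand-in assistant that just returns canned text.
log = SessionLog()
log.ask(lambda p: "[summary of the requested vulnerability]",
        "Summarize CVE-2023-23397 and the affected products.")
print(len(log.search("cve-2023")))  # 1 matching exchange in the trail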
Users can pin results provided by Security Copilot to a shared workspace or summarize them there, so colleagues can pick up the same analysis and continue the investigation. “This tool is like having a separate workspace for researchers and a shared notebook with the ability to share what you’re working on,” Chang Kawaguchi, Microsoft’s AI security architect, told The Verge in an interview.
One of the most interesting aspects of Security Copilot is its prompt book feature: a set of steps or automations that can be bundled together and run on demand. One such bundle could reverse engineer a suspicious script, so security researchers don’t have to wait for someone on their team to perform that type of analysis. You can even use the assistant to create PowerPoint slides with incident information and security charts.
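As a rough illustration of what bundling steps into a runnable automation might look like, here is a minimal Python sketch. The PromptBook class, the step wording, and the stand-in assistant are hypothetical, not Security Copilot’s actual feature or interface.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class PromptBook:
    """An ordered set of prompts an analyst can run as one automation."""
    name: str
    steps: List[str] = field(default_factory=list)

    def run(self, run_prompt: Callable[[str], str]) -> List[str]:
        # Execute each step in order and collect the assistant's responses.
        return [run_prompt(step) for step in self.steps]

# A hypothetical bundle for reverse engineering a suspicious script.
reverse_engineer_script = PromptBook(
    name="Reverse engineer a suspicious script",
    steps=[
        "Summarize what the attached PowerShell script does.",
        "List any network indicators (domains, IPs, URLs) it contains.",
        "Assess whether the behavior matches known malware techniques.",
    ],
)

# A stand-in assistant that just echoes each prompt; a real deployment would call the service.
for line in reverse_engineer_script.run(lambda prompt: f"[analysis for: {prompt}]"):
    print(line)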
When security researchers ask Security Copilot for information about the latest vulnerabilities, the results clearly cite their sources, much like Microsoft’s new Bing. The Redmond-based tech giant draws on information from the Cybersecurity and Infrastructure Security Agency (CISA), the National Institute of Standards and Technology’s vulnerability database, and its own threat intelligence database.
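Security Copilot’s own interfaces are not public, but one of the sources cited above, NIST’s National Vulnerability Database, does expose a public REST API. The sketch below shows how an analyst (or a tool acting on their behalf) might pull details for a single CVE from that database; the helper name and the example CVE are purely illustrative.

# Looks up one CVE record from NIST's public NVD 2.0 API.
import requests

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def lookup_cve(cve_id: str) -> dict:
    """Fetch a single CVE record and return its id, publication date, and description."""
    response = requests.get(NVD_API, params={"cveId": cve_id}, timeout=30)
    response.raise_for_status()
    vulnerabilities = response.json().get("vulnerabilities", [])
    if not vulnerabilities:
        raise ValueError(f"No NVD record found for {cve_id}")
    cve = vulnerabilities[0]["cve"]
    # Pick the English description if one is present.
    description = next(
        (d["value"] for d in cve.get("descriptions", []) if d.get("lang") == "en"),
        "",
    )
    return {"id": cve["id"], "published": cve.get("published"), "description": description}

if __name__ == "__main__":
    print(lookup_cve("CVE-2021-44228"))  # Log4Shell, used here purely as an example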
While Microsoft’s Security Copilot looks like other AI-based chatbots, it is limited to security-related queries. You can’t, for example, ask it for the latest weather in your city or what its favorite color is.
Security Copilot is Microsoft’s latest big push into artificial intelligence. The company offers Microsoft 365 Copilot for the Office suite and has expanded the use of Copilot on the GitHub platform to help developers write code.
At present, there is no sign that Microsoft is slowing its rollout of Copilot AI assistants across its products and services, so the technology will likely appear in more of its programs soon.
Currently, the new Security Copilot tool is available only to a select group of Microsoft customers, and according to Kawaguchi, there is no set schedule yet for a public release. The company plans to test the assistant with a smaller group of users first and, after ironing out potential problems, make it broadly available to everyone.