Microsoft’s AI Copilot for Security launches next month with pay-as-you-go pricing
Microsoft is launching its Copilot for Security next month, bringing its generative AI chatbot to the cybersecurity space. Copilot for Security is designed to help cybersecurity professionals protect against threats, but it won't carry a flat monthly fee like Copilot for Microsoft 365. Instead, Microsoft will charge businesses $4 per hour of usage as part of a consumption model when Copilot for Security launches on April 1st.
Powered by OpenAI’s GPT-4 and Microsoft’s own security-specific model, Copilot for Security is essentially a chatbot where cybersecurity workers can get the latest information on security incidents, summaries of threats, and more. Microsoft first started testing the chatbot nearly a year ago, and it draws on up-to-date threat information as well as the 78 trillion daily signals Microsoft collects through its threat intelligence gathering.
Copilot for Security includes a pinboard section for collaboration between cybersecurity employees and the ability to summarize events for reporting purposes. As with many other AI chatbots, you can use natural language prompts, feed in files for analysis, or even have Copilot for Security analyze code. All prompts are saved in a history log for later auditing.
The pay-as-you-go pricing is designed to let businesses scale their AI-powered cybersecurity efforts up or down as needed. “We will have one simple pricing model that covers both the standalone Copilot experience, and embedded experiences across the Microsoft Security product portfolio,” says Microsoft. “A consumption model means it will be easy to get started quickly and on a small scale, to experiment and learn with no upfront per device or per user charges.”
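To make the consumption model concrete, here is a minimal back-of-the-envelope sketch in Python based on the $4-per-hour figure quoted above. The usage hours are hypothetical, and real billing may meter usage differently than a simple hourly multiplication.

    # Rough cost estimate for Copilot for Security's pay-as-you-go model.
    # Assumes billing is a straight $4 per metered hour, per the article;
    # the usage figures below are hypothetical examples.

    HOURLY_RATE_USD = 4.00  # Microsoft's stated pay-as-you-go rate

    def estimated_monthly_cost(hours_used: float) -> float:
        """Estimate a month's Copilot for Security bill from metered hours."""
        return hours_used * HOURLY_RATE_USD

    # A small team experimenting with ~50 hours in a month:
    print(estimated_monthly_cost(50))   # 200.0
    # A larger deployment running ~500 hours:
    print(estimated_monthly_cost(500))  # 2000.0

Under those assumed numbers, a small-scale trial costs on the order of a few hundred dollars a month, which is the kind of low-commitment experimentation Microsoft's quote is pitching.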
Microsoft’s push for AI in cybersecurity comes as the company is under attack from Russian state-sponsored hackers. Nobelium, the same group behind the SolarWinds attack, managed to spy on the email inboxes of some Microsoft executives for months. That attack also led to some of Microsoft’s source code being stolen, with the hackers gaining access to the company’s code repositories and internal systems.