Explain the features of Amazon Detective

In this recipe, we will learn about Amazon Detective. We will also learn about the features of Amazon Detective.

Recipe Objective - Explain the features of Amazon Detective

Amazon Detective is a widely used service that makes it easy to analyze, investigate, and quickly identify the root cause of potential security issues or suspicious activities. Amazon Detective automatically collects log data from users' AWS resources and uses machine learning, statistical analysis, and graph theory to build a linked set of data that enables users to conduct faster and more efficient security investigations.

Amazon Web Services security services such as Amazon GuardDuty, Amazon Macie, and AWS Security Hub, as well as partner security products, can be used to identify potential security issues, or findings. These services alert users when something is wrong and point out where to go to fix it, but sometimes a security finding requires digging deeper and analyzing more information to isolate the root cause and take action. Determining the root cause of security findings can be a complex process that often involves collecting and combining logs from many separate data sources, using extract, transform, and load (ETL) tools or custom scripting to organize the data, and then having security analysts study the data and conduct lengthy investigations.

Amazon Detective simplifies this process by enabling users' security teams to easily investigate and quickly get to the root cause of a finding. Amazon Detective can analyze trillions of events from multiple data sources such as Virtual Private Cloud (VPC) Flow Logs, AWS CloudTrail, and Amazon GuardDuty, and automatically creates a unified, interactive view of resources, users, and the interactions between them over time. With this unified view, users can visualize all the details and context in one place to identify the underlying reasons for the findings, drill down into relevant historical activities, and quickly determine the root cause.
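To make the starting point of such an investigation concrete, here is a minimal boto3 sketch that pulls a few recent Amazon GuardDuty findings, the kind of alerts a security team would then pivot into Amazon Detective to dig into. It assumes AWS credentials are already configured and GuardDuty is enabled in the chosen region; the region name is an illustrative assumption.

```python
import boto3

# Assumption: credentials are configured and GuardDuty is enabled in this region.
guardduty = boto3.client("guardduty", region_name="us-east-1")

# A GuardDuty detector must exist before findings can be listed.
detector_ids = guardduty.list_detectors()["DetectorIds"]
if not detector_ids:
    raise SystemExit("No GuardDuty detector found in this region.")
detector_id = detector_ids[0]

# Fetch the five most recently updated finding IDs.
finding_ids = guardduty.list_findings(
    DetectorId=detector_id,
    SortCriteria={"AttributeName": "updatedAt", "OrderBy": "DESC"},
    MaxResults=5,
)["FindingIds"]

# Retrieve the full finding details; these are the alerts one would
# typically investigate further in Amazon Detective.
if finding_ids:
    findings = guardduty.get_findings(
        DetectorId=detector_id, FindingIds=finding_ids
    )["Findings"]
    for finding in findings:
        print(finding["Severity"], finding["Type"], finding["Title"])
```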


Benefits of Amazon Detective

  • Amazon Detective produces visualizations with the information users need to investigate and respond to security findings. It helps users answer questions like "Is this spike in traffic from this instance expected?" without having to organize any data or develop, configure, or tune their own queries and algorithms. Amazon Detective maintains up to a year of aggregated data that shows changes in the type and volume of activity over a selected time window and links those changes to security findings, providing easy-to-use visualizations.

  • Amazon Detective automatically processes terabytes of event data records about IP traffic, AWS management operations, and malicious or unauthorized activity. It organizes the data into a graph model that summarizes all the security-related relationships in the user's AWS environment, then queries this model to create the visualizations used in investigations. The graph model is continuously updated as new data becomes available from AWS resources, so users spend less time managing constantly changing data (see the short sketch after this list for inspecting the behavior graph programmatically).

  • Amazon Detective presents a unified view of user and resource interactions over time, with all context and details in one place, to help users quickly analyze and get to the root cause of a security finding. For example, an Amazon GuardDuty finding such as an unusual Console Login API call can be quickly investigated in Amazon Detective with details about API call trends over time and user login attempts on a geolocation map. These details enable users to quickly determine whether the activity is legitimate or an indication of a compromised AWS resource, making investigations faster and more effective.
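As a quick illustration of the graph model mentioned above, the boto3 sketch below lists the Detective behavior graphs visible to the calling account. It is a minimal sketch and assumes credentials are configured and Detective is already enabled; the region name is an assumption.

```python
import boto3

# Assumption: Amazon Detective is already enabled for this account and region.
detective = boto3.client("detective", region_name="us-east-1")

# Each behavior graph is the linked dataset that backs Detective's
# visualizations; an administrator account typically sees one per region.
response = detective.list_graphs()
for graph in response.get("GraphList", []):
    print("Behavior graph:", graph["Arn"], "created:", graph["CreatedTime"])
```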

System Requirements

  • Any Operating System (Mac, Windows, Linux)

This recipe explains Amazon Detective and the features of Amazon Detective.

Features of Amazon Detective

    • It provides interactive visualizations for efficient investigation

Amazon Detective provides interactive visualizations that make it easy to investigate issues faster and more thoroughly with less effort. With a unified view that brings all the context and details into one place, it is easier to identify patterns that may validate or refute a security issue and to understand all of the resources impacted by a security finding. Using these visualizations, users can filter large sets of event data into specific timelines, with all the details, context, and guidance needed to investigate quickly. Amazon Detective enables users to view login attempts on a geolocation map, drill down into relevant historical activities, quickly determine a root cause and, if necessary, take action to resolve the issue.

    • It consolidates disparate events into a graph model

Amazon Detective can analyze trillions of events from many separate data sources about IP traffic, AWS management operations, and malicious or unauthorized activity to construct a graph model that distills log data, using machine learning, statistical analysis, and graph theory, into a linked set of data for security investigations. The graph model is prebuilt with security-related relationships and summarizes contextual and behavioural insights, which enables users to quickly validate, compare, and correlate data to reach conclusions. Amazon Detective's visualizations are powered by the graph model, enabling users to rapidly answer investigative questions without the complexity of querying raw logs. For example, the graph provides the context and relationships around when an IP address connected to an EC2 instance, and the API calls that a role issued in a specific time period.

    • It automates data collection across all the AWS accounts

Amazon Detective automatically ingests and processes relevant data from all enabled accounts; users don't have to configure or enable any data sources. Amazon Detective collects and analyzes events from data sources such as AWS CloudTrail, VPC Flow Logs, and Amazon GuardDuty findings, and maintains up to a year of aggregated data for further analysis.
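As an illustrative sketch of bringing additional accounts into the behavior graph programmatically, the snippet below invites a member account with boto3. The graph ARN lookup, account ID, and email address are placeholder assumptions.

```python
import boto3

detective = boto3.client("detective", region_name="us-east-1")

# Assumption: the calling account administers exactly one behavior graph here.
graph_arn = detective.list_graphs()["GraphList"][0]["Arn"]

# Invite a member account so its CloudTrail, VPC Flow Log, and GuardDuty
# data is ingested into the behavior graph. Account ID and email are placeholders.
detective.create_members(
    GraphArn=graph_arn,
    Message="Please join our Amazon Detective behavior graph.",
    Accounts=[
        {"AccountId": "111122223333", "EmailAddress": "security-team@example.com"},
    ],
)
```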

    • It provides seamless integration for investigating a security finding

Amazon Detective is integrated with AWS security services such as Amazon GuardDuty and AWS Security Hub, as well as AWS partner security products, to help quickly investigate security findings identified in these services. With a single click from these integrated services, users can go to Amazon Detective and immediately see events related to the finding, drill down into relevant historical activities, and investigate the issue. For example, from an Amazon GuardDuty finding, users can launch Amazon Detective by clicking on "Investigate", which provides instant insight into the relevant activity for the involved resource, giving users the details and context to quickly decide whether the detected finding reflects actual suspicious activity.
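To show where such a pivot typically starts, the hedged sketch below uses boto3 to pull active GuardDuty findings aggregated in AWS Security Hub; from any of these findings in the console, the "Investigate" action described above leads into Amazon Detective. It assumes Security Hub is enabled with the GuardDuty integration turned on; the region and filter values are assumptions.

```python
import boto3

# Assumption: Security Hub is enabled and receiving GuardDuty findings.
securityhub = boto3.client("securityhub", region_name="us-east-1")

# Pull active findings that originated from GuardDuty; these are the
# findings from which users would pivot into Amazon Detective.
response = securityhub.get_findings(
    Filters={
        "ProductName": [{"Value": "GuardDuty", "Comparison": "EQUALS"}],
        "RecordState": [{"Value": "ACTIVE", "Comparison": "EQUALS"}],
    },
    MaxResults=10,
)

for finding in response["Findings"]:
    print(finding.get("Severity", {}).get("Label"), "-", finding["Title"])
```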

    • It provides simple deployment with no upfront data source integration or complex configurations to maintain

Amazon Detective can be enabled with a few clicks in the AWS Management Console. Also, there is no software to deploy, agents to install, or complex configurations to maintain. There are also no data sources to enable, which means users do not have to incur the costs of data source enablement, data transfer, and data storage.
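For teams that prefer automation over console clicks, enabling Detective can also be done programmatically. The sketch below creates a behavior graph with boto3; the region and the tag key and value are illustrative assumptions.

```python
import boto3

detective = boto3.client("detective", region_name="us-east-1")

# Creating a behavior graph is the programmatic equivalent of enabling
# Amazon Detective in the console for this account and region.
response = detective.create_graph(Tags={"Team": "security"})  # tag is an assumption
print("Enabled Detective, behavior graph ARN:", response["GraphArn"])
```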
