What is AWS Snowmobile?

This recipe explains what AWS Snowmobile is.

What is AWS Snowmobile?

AWS Snowmobile is an exabyte-scale data transfer service used to move massive amounts of data to AWS. Up to 100 PB can be transferred per Snowmobile, a 45-foot-long ruggedized shipping container pulled by a semi-trailer truck. Snowmobile makes it simple to migrate massive datasets to the cloud, such as video libraries, image repositories, or even entire data centers. Snowmobile data transfer is secure, fast, and cost-effective.

Following an initial assessment, a Snowmobile will be transported to your data center and configured by AWS personnel so that it can be accessed as a network storage target. When your Snowmobile arrives, AWS personnel will collaborate with your team to connect a removable, high-speed network switch from the Snowmobile to your local network, allowing you to begin high-speed data transfer from any number of sources within your data center to the Snowmobile. After loading your data, Snowmobile returns to AWS, where it is imported into Amazon S3.

To help protect your data, Snowmobile employs multiple layers of security, including dedicated security personnel, GPS tracking, alarm monitoring, 24/7 video surveillance, and an optional escort security vehicle while in transit. All data is encrypted with 256-bit encryption keys that you manage via the AWS Key Management Service (KMS), and Snowmobile is designed for data security and a complete chain of custody.

Benefits

    • Fast transfer even at massive scale

Even with high-speed internet connections, transferring extremely large amounts of data can take decades. Snowmobile can move 100 PB of data in a matter of weeks, plus transport time. Over a 1 Gbps Direct Connect line, the same transfer could take more than 20 years.
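The "more than 20 years" figure follows directly from a back-of-the-envelope calculation, sketched below (it ignores protocol overhead and link contention, which would only make the online transfer slower):

```python
# How long would 100 PB take over a single 1 Gbps link?
DATA_BITS = 100 * 10**15 * 8        # 100 PB (decimal) expressed in bits
LINK_BPS = 1 * 10**9                # 1 Gbps line rate
SECONDS_PER_YEAR = 365.25 * 24 * 3600

years = DATA_BITS / LINK_BPS / SECONDS_PER_YEAR
print(f"{years:.1f} years")         # about 25 years, ignoring overhead
```

At that rate even ten parallel 1 Gbps links would still need well over two years, which is why a truck full of disks wins at this scale.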

    • Strong encryption

Before it is written to Snowmobile, your data is encrypted with keys that you provide. All data is encrypted with 256-bit encryption, and you can manage your encryption keys with the AWS Key Management Service (KMS). Encryption keys used by Snowmobile's encryption servers are never written to disk. If power is lost for any reason, the keys are securely erased.
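A minimal Python sketch of the two ideas above: a 256-bit key is simply 32 bytes of key material, and keeping it only in a mutable in-memory buffer allows it to be zeroized on demand. This is illustrative only; it is not how Snowmobile's encryption servers or KMS are actually implemented.

```python
import os

# A 256-bit key is 32 random bytes; held in a mutable buffer,
# in memory only, so it can be overwritten rather than persisted.
key = bytearray(os.urandom(32))
assert len(key) * 8 == 256

def zeroize(buf: bytearray) -> None:
    """Overwrite key material in place, mimicking the secure-erase
    behaviour described above (a sketch, not real HSM logic)."""
    for i in range(len(buf)):
        buf[i] = 0

zeroize(key)
assert all(b == 0 for b in key)
```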

    • Rugged, durable, & more secure

Snowmobile is tamper-proof, waterproof, and temperature-controlled. Only AWS personnel can access the data container, and physical access is restricted by security access hardware controls. Snowmobiles are protected by video surveillance and alarm monitoring 24 hours a day, seven days a week, and can be escorted by a security vehicle while in transit.

    • Customized for your needs

Because physical sites may have different migration requirements, AWS will work with you to ensure that all of your requirements are met before Snowmobile arrives on site.

    • Massively scalable

A single Snowmobile can transport up to 100 petabytes of data in a single trip, which is equivalent to using approximately 1,250 AWS Snowball devices.
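The "approximately 1,250 devices" comparison is straightforward arithmetic if we assume the 80 TB usable capacity of a Snowball Edge Storage Optimized device (the capacity figure is an assumption here; AWS has shipped Snowball variants with other capacities):

```python
# Compare one Snowmobile against individual Snowball devices.
SNOWMOBILE_TB = 100 * 1000     # 100 PB in TB (decimal units)
SNOWBALL_TB = 80               # assumed usable capacity per device

devices = SNOWMOBILE_TB / SNOWBALL_TB
print(int(devices))            # 1250
```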

    • Easy data retrieval

Many organizations are concerned that once all of their data has been moved to the cloud, retrieving it will be both costly and time-consuming. Snowball and Snowmobile provide all customers with a fast, low-cost way to transfer data both into and out of AWS.

How Does AWS Snowmobile Work?

Before a Snowmobile is transported to a data center, an initial assessment is performed. When the Snowmobile arrives on-site, AWS personnel configure it so that it can be used as a network storage target, then collaborate with your team to connect a removable high-speed network switch from the Snowmobile to the local network. This starts the high-speed data transfer, which can run from any number of sources within the data center.

After the data is transferred, the Snowmobile is returned to AWS, where it is imported into Amazon S3.

Snowmobile employs multi-layered security to safeguard data. This includes security personnel, alarm monitoring, GPS tracking, round-the-clock video surveillance, and, if necessary, an escort security vehicle. All data is encrypted with 256-bit encryption keys, which can be managed with the AWS Key Management Service (KMS).

What is an AWS Snowmobile Job?

A Snowmobile Job is the process of migrating data from start to finish using a Snowmobile. It is made up of five steps:

    • Site Survey

In this step, AWS personnel collaborate with the company to understand their migration objectives, data center environment, and network configurations needed to create a migration plan.

    • Site Preparation

During this step, the customer identifies and makes available local services such as parking, power, and so on.

    • Dispatch and Setup

In this step, AWS personnel dispatch a Snowmobile to the location. The Snowmobile is then configured to be accessed securely as a network storage target.

    • Data Migration

In this step, data is copied from various sources in the data center to the Snowmobile.

    • Return and Upload

The Snowmobile then returns to a designated AWS Region, where the data is uploaded to the selected AWS storage service(s).
