What is AWS Snowmobile?

This recipe explains what AWS Snowmobile is.

What is AWS Snowmobile?

AWS Snowmobile is an exabyte-scale data transfer service used to move massive amounts of data to AWS. Each Snowmobile, a 45-foot-long ruggedized shipping container pulled by a semi-trailer truck, can transfer up to 100 PB. Snowmobile makes it simple to migrate massive datasets to the cloud, such as video libraries, image repositories, or even entire data centers, and does so in a way that is secure, fast, and cost-effective.

Following an initial assessment, a Snowmobile will be transported to your data center and configured by AWS personnel so that it can be accessed as a network storage target. When your Snowmobile arrives, AWS personnel will collaborate with your team to connect a removable, high-speed network switch from the Snowmobile to your local network, allowing you to begin high-speed data transfer from any number of sources within your data center to the Snowmobile. After loading, the Snowmobile returns to AWS, where your data is imported into Amazon S3.

To help protect your data, Snowmobile employs multiple layers of security, including dedicated security personnel, GPS tracking, alarm monitoring, 24/7 video surveillance, and an optional escort security vehicle while in transit. All data is encrypted with 256-bit encryption keys that you manage via the AWS Key Management Service (KMS) and is designed for data security and complete chain-of-custody.

Benefits

    • Fast transfer even at massive scale

Even with high-speed internet connections, transferring extremely large amounts of data can take decades. Snowmobile can move 100 petabytes of data in a matter of weeks, plus transport time. Over a dedicated 1 Gbps connection, the same transfer would take more than 20 years.
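The 20-year figure is easy to verify with a back-of-the-envelope calculation. The sketch below assumes an idealized, fully saturated 1 Gbps link with no protocol overhead, which makes it a best case for the network:

```python
# Back-of-the-envelope check of the transfer times quoted above.
# Assumes an idealized, fully saturated link with zero protocol overhead.

def transfer_time_seconds(data_bytes: float, link_bits_per_second: float) -> float:
    """Seconds needed to move data_bytes over a link of the given speed."""
    return data_bytes * 8 / link_bits_per_second

PETABYTE = 10 ** 15   # bytes (decimal units, as cloud vendors quote them)
GIGABIT = 10 ** 9     # bits per second

seconds = transfer_time_seconds(100 * PETABYTE, 1 * GIGABIT)
years = seconds / (365 * 24 * 3600)
print(f"100 PB over 1 Gbps: about {years:.1f} years")
```

Even under these generous assumptions the wire transfer takes roughly a quarter century, which is why shipping the data physically wins at this scale.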

    • Strong encryption

Before it is written to Snowmobile, your data is encrypted with keys that you provide. All data is encrypted with 256-bit encryption, and the AWS Key Management Service (KMS) allows you to manage your encryption keys. Encryption keys used by Snowmobile's encryption servers are never written to disk; if power is lost for any reason, the keys are securely erased.
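The key-handling property described above, a 256-bit key that exists only in memory and is wiped rather than persisted, can be illustrated with a minimal Python sketch. This is not Snowmobile's actual implementation; the function names and the in-memory buffer are assumptions purely for demonstration:

```python
import secrets

KEY_BITS = 256  # matches the 256-bit encryption keys described above

def generate_data_key() -> bytearray:
    """Generate a random 256-bit key in a mutable, in-memory buffer."""
    return bytearray(secrets.token_bytes(KEY_BITS // 8))

def erase_key(key: bytearray) -> None:
    """Overwrite the key material in place before discarding it,
    mirroring the 'securely erased on power loss' behavior."""
    for i in range(len(key)):
        key[i] = 0

key = generate_data_key()
assert len(key) == 32          # 256 bits = 32 bytes
erase_key(key)
assert all(b == 0 for b in key)  # no key material left behind
```

The point of the sketch is the lifecycle, not the cryptography: the key is generated in memory, used, and destroyed in place, and nothing is ever flushed to disk.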

    • Rugged, durable, & more secure

Snowmobile is tamper-proof, waterproof, and temperature-controlled. Only AWS personnel can access the data container, and physical access is restricted by security access hardware controls. Snowmobiles are protected by video surveillance and alarm monitoring 24 hours a day, seven days a week, and can be escorted by a security vehicle while in transit.

    • Customized for your needs

Because physical sites may have different migration requirements, AWS will work with you to ensure that all of your requirements are met before Snowmobile arrives on site.

    • Massively scalable

A single Snowmobile can transport up to 100 petabytes of data in a single trip, which is equivalent to using approximately 1,250 AWS Snowball devices.
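The 1,250-device comparison checks out arithmetically if you assume roughly 80 TB of usable capacity per Snowball device (capacity varies by device generation, so this figure is an assumption):

```python
# Sanity check of the "approximately 1,250 Snowball devices" comparison.
SNOWMOBILE_CAPACITY_TB = 100_000   # 100 PB, per the figures above
SNOWBALL_CAPACITY_TB = 80          # assumed usable capacity per device

devices = SNOWMOBILE_CAPACITY_TB / SNOWBALL_CAPACITY_TB
print(f"Snowball devices needed: {devices:.0f}")
```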

    • Easy data retrieval

Many organizations are concerned that once all of their data has been moved to the cloud, retrieving it will be both costly and time-consuming. Snowball and Snowmobile provide all customers with a fast, low-cost way to move data both into and out of AWS.

How Does AWS Snowmobile Work?

Before a Snowmobile is transported to a data center, an initial assessment is performed. When the Snowmobile arrives on-site, AWS personnel configure it as a network storage target and collaborate with your team to connect a removable high-speed network switch from the Snowmobile to the local network. This starts the high-speed data transfer, which can run from any number of sources within the data center.

After the data is transferred, the Snowmobile is returned to AWS, where it is imported into Amazon S3.

Snowmobile employs multi-layered security to safeguard data. This includes security personnel, alarm monitoring, GPS tracking, round-the-clock video surveillance, and, if necessary, an escort security vehicle. All data is encrypted with 256-bit encryption keys, which can be managed via the AWS Key Management Service (KMS).

What is an AWS Snowmobile Job?

A Snowmobile Job is the process of migrating data from start to finish using a Snowmobile. It is made up of five steps:

    • Site Survey

In this step, AWS personnel collaborate with the company to understand their migration objectives, data center environment, and network configurations needed to create a migration plan.

    • Site Preparation

During this step, the customer identifies and makes available local services such as parking, power, and so on.

    • Dispatch and Setup

In this step, AWS personnel dispatch a Snowmobile to the location and configure it to be accessed securely as a network storage target.

    • Data Migration

The data is copied from various sources in the data center to the Snowmobile in this step.

    • Return and Upload

The Snowmobile then returns to a designated AWS Region, where the data is uploaded to the selected AWS storage service(s).
