An Overview of AWS Cloud Data Migration Services

mahmoudeid

MESragelden

Posted on January 10, 2023


The Original Whitepaper

Introduction

The migration process

One of the most challenging steps in deploying an application infrastructure in the cloud is moving data into and out of the cloud securely. There are many different ways to lift and shift data to the cloud, such as:

  • One-time large batches
  • Constant device streams
  • Intermittent updates
  • Hybrid data storage combining the AWS Cloud and on-premises data stores

Choosing your migration method

Before you start your journey to the cloud, you need to answer the following questions to choose the best migration method:

What is the time allocated to perform data transfers?

What is the volume of data?

What network speed/bandwidth is available?

To calculate the number of days required to migrate a given amount of data, you can use the following formula:

Number of Days = (TOTAL_BYTES * 8 bits per byte) / (CIRCUIT_bps *
      NETWORK_UTILIZATION percent * 3600 seconds per hour * AVAILABLE_HOURS per day)

For example, if you have a Gigabit Ethernet connection (1 Gbps) to the internet, 100 TB of data to move to AWS, 80% network utilization, and 10 transfer hours per day, it will take around 28 days:

(100,000,000,000,000 Bytes * 8 bits per byte) /(1,000,000,000 bps * 80 percent * 3600
      seconds per hour * 10 hours per day) = 27.77 days 
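As a quick sanity check, the formula above can be wrapped in a small Python helper (the function name and parameters here are illustrative, not part of any AWS tool):

```python
def transfer_days(total_bytes: float, circuit_bps: float,
                  utilization: float, hours_per_day: float) -> float:
    """Days needed to move total_bytes over a circuit_bps link at the
    given utilization (0-1), transferring hours_per_day hours each day."""
    bits_to_move = total_bytes * 8
    bits_per_day = circuit_bps * utilization * 3600 * hours_per_day
    return bits_to_move / bits_per_day

# 100 TB over a 1 Gbps link at 80% utilization, 10 hours per day:
print(f"{transfer_days(100e12, 1e9, 0.80, 10):.2f} days")  # 27.78 days
```

Note how sensitive the estimate is to the transfer window: doubling the available hours per day halves the result, which is why overnight-only transfer windows stretch migrations so dramatically.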

Recommended migration methods

What is the number of repetitive steps required to transfer data?

What are the existing investments in custom tooling and automation in your organization?

Will you use self-managed or managed tools for the migration?

Use the following diagram for more insight:

Migration Methods (diagram in the original post)

Migration Tools

Self-managed migration methods

  • AWS S3 CLI

    • Use Case : Move small data sets to Amazon S3 buckets
    • Limits : Upload objects <= 5 GB in a single operation
    • Comments : Use multipart upload for objects > 5 GB
  • AWS Glacier CLI

    • Use Case : Move small data sets to AWS Glacier
    • Limits : Upload archives <= 4 GB in a single operation
    • Comments : Use multipart upload for archives > 100 MB
  • Storage partner solutions

    • Comments : http://aws.amazon.com/backup-recovery/partner-solutions/
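The 5 GB single-PUT limit is why multipart upload matters: S3 caps a multipart upload at 10,000 parts of 5 MiB to 5 GiB each, for a maximum object size of 5 TiB. A sketch of the part-size arithmetic (the limits are S3's documented values; the helper function itself is illustrative):

```python
import math

# Documented S3 multipart upload limits.
MIN_PART_SIZE = 5 * 1024**2   # 5 MiB minimum part size
MAX_PART_SIZE = 5 * 1024**3   # 5 GiB maximum part size
MAX_PARTS = 10_000            # at most 10,000 parts per upload

def plan_parts(object_size: int, part_size: int = 100 * 1024**2) -> int:
    """Return how many parts an upload needs, growing the part size
    when the object would otherwise exceed the 10,000-part cap."""
    part_size = max(part_size, MIN_PART_SIZE,
                    math.ceil(object_size / MAX_PARTS))
    if part_size > MAX_PART_SIZE:
        raise ValueError("object exceeds the 5 TiB multipart maximum")
    return math.ceil(object_size / part_size)

# A 50 GiB object with the default 100 MiB parts uploads in 512 parts:
print(plan_parts(50 * 1024**3))  # 512
```

Tools like the AWS CLI do this planning automatically; the sketch just shows why very large objects need larger part sizes.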

AWS managed migration tools

  • AWS Direct Connect

    • Use Case : Establishing a dedicated network connection from your premises to AWS
    • Limits : Network speeds from 50 Mbps up to 100 Gbps
    • Comments : It is not a data transfer service itself
    • Security Features : (image in the original post)
  • AWS Snow Family (Snowcone)

    • Use Case : Migrate your data fast and secure offline or when loading your data over the Internet would take a week or more
    • Limits : 8 TB HDD / 14 TB SSD
    • Comments : https://aws.amazon.com/snow/
  • AWS Snow Family(Snowball)

    • Use Case : Frequent large backlogs of data, physically isolated environments, no available high-speed Internet connection, disaster recovery, and data center decommissioning
    • Limits : 80 TB HDD / 1 TB SSD
    • Comments : https://aws.amazon.com/snow/
  • AWS Snow Family (Snowmobile)

    • Use Case : Exabyte-scale migrations that would take too long even with a fleet of Snowball devices
    • Limits : Up to 100 PB per Snowmobile
    • Comments : https://aws.amazon.com/snow/
  • AWS Storage Gateway

    • Use Case : File sharing, enabling existing on-premises backup applications, disaster recovery,mirroring and archiving data
    • Limits : It supports the file, volume, and tape storage interfaces, and the NFS, SMB, iSCSI, and iSCSI VTL protocols
    • Comments :
    • Security Features : (image in the original post)
  • Amazon S3 Transfer Acceleration (S3TA)

    • Use Case : Uploading to a centralized bucket from all over the world, transferring gigabytes to terabytes of data on a regular basis across continents, or cases where you are underutilizing your available Internet bandwidth when uploading to Amazon S3. You can use the online speed comparison tool to preview the performance benefit of uploading data from your location to Amazon S3 buckets in different AWS Regions with Transfer Acceleration.
    • Security Features : (image in the original post)
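Under the hood, Transfer Acceleration is opt-in per request: uploads simply target the bucket's global s3-accelerate endpoint instead of the regular S3 endpoint. A minimal sketch of the documented hostname convention (the helper function is illustrative, not an AWS SDK API):

```python
def s3_endpoint(bucket: str, accelerate: bool = False) -> str:
    """Virtual-hosted-style endpoint for a bucket. Transfer Acceleration
    swaps the standard s3 endpoint for the global s3-accelerate one;
    accelerated bucket names must be DNS-compliant and contain no dots."""
    host = "s3-accelerate" if accelerate else "s3"
    return f"https://{bucket}.{host}.amazonaws.com"

print(s3_endpoint("my-bucket", accelerate=True))
# https://my-bucket.s3-accelerate.amazonaws.com
```

SDKs expose the same switch as a client option, so application code normally does not build these URLs by hand.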
  • Amazon Kinesis Data Firehose

    • Use Case : Easiest way to load streaming data and store it in Amazon S3, Amazon Redshift, Amazon OpenSearch Service, or Splunk.
    • Limits : Record size (before base64 encoding) up to 1,000 KiB; buffer size from 1 MiB to 128 MiB; buffer interval from 60 to 900 seconds
    • Comments : (image in the original post)
    • Security Features : (image in the original post)
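Because Firehose rejects records larger than 1,000 KiB (measured before base64 encoding), it can be worth validating sizes client-side before calling the API. A small illustrative check (the limit is Firehose's; the function is not part of any SDK):

```python
MAX_RECORD_BYTES = 1000 * 1024  # Firehose per-record limit, before base64 encoding

def check_record(data: bytes) -> bytes:
    """Raise if a record would be rejected by the Firehose per-record limit."""
    if len(data) > MAX_RECORD_BYTES:
        raise ValueError(
            f"record is {len(data)} bytes; Firehose allows at most {MAX_RECORD_BYTES}")
    return data

check_record(b'{"event": "click"}')  # small records pass through unchanged
```

Oversized payloads are better split into multiple records, or compressed, before they reach the delivery stream.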
  • AWS Transfer Family

    • Use Case : Secure file sharing between an organization and third parties, provide a central location where users can download and globally access your data securely, facilitate data ingestion for a data lake
    • Limits : It supports SFTP, FTPS, FTP, and AS2 protocols
    • Comments : AWS Transfer Family demo (image in the original post)
    • Security Features : (image in the original post)
  • AWS DataSync

    • Use Case : Transferring files in opposite directions, using multiple tasks to write to the same Amazon S3 bucket, and allowing DataSync to access a restricted Amazon S3 bucket
    • Limits : To access your self-managed on-premises or cloud storage, you need an AWS DataSync agent associated with your AWS account
    • Comments : AWS DataSync demo (image in the original post)
    • Security Features : (image in the original post)
  • Third-Party Connectors


Conclusion

This summary went through the different AWS managed and self-managed migration options, and covered the use cases and security features of each service.

