What are the various Routing Policies in Route 53?

This recipe explains the various routing policies available in Amazon Route 53.

What is Amazon Route 53?

Route 53 is a DNS service that routes Internet traffic to the servers that host the requested web application. It is named after port 53, the TCP/UDP port on which DNS servers answer queries. In contrast to traditional DNS management services, Route 53 enables scalable, flexible, secure, and manageable traffic routing in conjunction with other AWS services.

Using the AWS Management Console, you can use Route 53 to perform three main functions: domain registration, DNS routing, and health checking.

How does Route 53 work?

1. A user launches a web browser and types www.site.com into the address bar.

2. The www.site.com request is routed to a DNS resolver, which is typically managed by the Internet Service Provider (ISP).

3. The ISP DNS resolver routes www.site.com's request to a DNS root name server.

4. The DNS resolver forwards the request for www.site.com to one of the .com top-level domain (TLD) name servers. The .com name server returns the names of the four Route 53 name servers associated with the site.com domain. The DNS resolver caches the Route 53 name servers for future use.

5. The DNS resolver selects a Route 53 name server and forwards www.site.com's request to that Route 53 name server.

6. In the case of simple routing, the Route 53 name server looks up the record for www.site.com in the site.com hosted zone and obtains its value, such as an alias pointing to an Amazon CloudFront distribution.

7. Once the DNS resolver has obtained the value (here, the IP address of the CloudFront distribution), it returns it to the user's web browser.

8. The web browser sends a request for www.site.com to the CloudFront distribution's IP address.

9. The CloudFront distribution returns the web page for www.site.com to the web browser, either from its cache or from the origin server. From the client's side, the whole exchange looks like a single name lookup, as the sketch below shows.
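
A minimal Python sketch of that client-side lookup (www.site.com is the hypothetical domain used throughout this recipe, so the call only succeeds against a record you actually host; the operating system's resolver performs steps 2 through 7 on the client's behalf):

import socket

# Hypothetical domain used throughout this recipe; replace it with a name
# that actually exists in one of your Route 53 hosted zones.
domain = "www.site.com"

# The OS stub resolver triggers steps 2-7 above: the ISP's DNS resolver walks
# from the root servers to the .com TLD servers to the four Route 53 name
# servers for the zone, and returns the record's value (IP addresses).
addresses = {
    info[4][0]
    for info in socket.getaddrinfo(domain, 443, proto=socket.IPPROTO_TCP)
}
print(addresses)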

What are the different routing policies available in Route 53?

Route 53 provides several routing policies for answering DNS queries efficiently. Once your domain is up and running, you can select the routing policy that best suits your needs. To get the most out of the service, however, you must first understand how each policy type works.

You select a routing policy when you create a record; the policy determines how Amazon Route 53 responds to queries (a minimal example of creating records with one of these policies follows the list):

1. Simple routing policy: Use for a single resource in your domain that performs a specific function, such as an Amazon EC2 instance that serves content for the example.com website.

2. Weighted: You can assign weights to resource record sets with this option. For example, you can specify 25 for one resource and 75 for another, which means that 25% of requests will be routed to the first resource and 75% to the second.

3. LBR (Latency-based routing): Use this when you have resources in multiple AWS Regions and want to route end users to the AWS Region with the lowest latency.

4. Failover: Used to configure active-passive failover. More information can be found in our blog post: Route 53 on Amazon: Health Checks and DNS Failover

5. Geolocation: Routes requests to specific endpoints based on the geographic location from which the DNS queries originate, which also lets you balance the load on your resources by region.

6. Multivalue response: Use this option when you want Route 53 to respond to DNS queries with up to eight healthy records chosen at random.

7. IP-based routing: You can use IP-based routing to create a series of Classless Inter-Domain Routing (CIDR) blocks that represent the client IP network range and associate these CIDR blocks with locations.
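
As an illustration of the weighted policy described above, the following boto3 sketch upserts two weighted A records for www.site.com. The hosted zone ID, set identifiers, and IP addresses are placeholders; substitute values from your own account.

import boto3

route53 = boto3.client("route53")

# Placeholder hosted zone ID -- substitute the ID of your own hosted zone.
HOSTED_ZONE_ID = "Z0000000000EXAMPLE"

# Two weighted A records that share the name www.site.com: roughly 25% of
# queries are answered with the first value and 75% with the second.
route53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={
        "Comment": "Weighted routing example",
        "Changes": [
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "www.site.com",
                    "Type": "A",
                    "SetIdentifier": "blue",  # distinguishes records sharing one name
                    "Weight": 25,
                    "TTL": 60,
                    "ResourceRecords": [{"Value": "203.0.113.10"}],
                },
            },
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "www.site.com",
                    "Type": "A",
                    "SetIdentifier": "green",
                    "Weight": 75,
                    "TTL": 60,
                    "ResourceRecords": [{"Value": "203.0.113.20"}],
                },
            },
        ],
    },
)

The other policies are configured in the same way: latency-based records use a Region field in place of Weight, and failover records use a Failover field of PRIMARY or SECONDARY together with a health check.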

Benefits of Route 53

    • 1. High availability, reliability, and scalability

Amazon Route 53 is built on AWS's highly available and dependable infrastructure and is designed to scale automatically to handle extremely high query volumes.

Route 53's DNS servers are distributed, which helps ensure a consistent ability to route your end users to your application. Route 53 is intended to provide the dependability required by critical applications, and it is backed by the Amazon Route 53 Service Level Agreement (SLA).

Because AWS services are tightly integrated, users can make changes to their architecture and scale resources to accommodate increasing Internet traffic volume without requiring significant configuration or management.

    • 2. Security

You can control who has access to which parts of the Route 53 service by managing permissions for each user in your AWS account. When enabled, the Route 53 Resolver DNS Firewall checks outbound DNS requests against a list of known malicious domains.
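
For instance, a scoped IAM policy can limit a user to reading and changing records in a single hosted zone. A minimal boto3 sketch, assuming a placeholder zone ID and policy name:

import json

import boto3

iam = boto3.client("iam")

# Hypothetical policy: allows reading and changing records only in one
# hosted zone, and grants no other Route 53 permissions.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "route53:GetHostedZone",
                "route53:ListResourceRecordSets",
                "route53:ChangeResourceRecordSets",
            ],
            "Resource": "arn:aws:route53:::hostedzone/Z0000000000EXAMPLE",
        }
    ],
}

iam.create_policy(
    PolicyName="Route53SingleZoneEditor",  # placeholder name
    PolicyDocument=json.dumps(policy_document),
)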

    • 3. Global network

Route 53's DNS servers form a global anycast network, which helps queries get answered at lightning-fast speeds. The DNS database is replicated between regions, making Route 53 a globally resilient service that can withstand the failure of one or more regions and continue to operate.

    • 4. Cost-effective

You only pay for the resources you use, such as the number of queries for each of your domains, hosted zones, and optional features like routing policies and health checks, at a low cost with no minimum usage commitments or up-front fees.

    • 5. Integrated routing policies

It is advantageous to route traffic based on various criteria such as latency, endpoint health, and geographic location. Route 53's flexibility allows multiple traffic policies to be configured and lets you control which policy is active at any given time.

    • 6. Compatibility with other AWS services

Domain names can be mapped to Amazon CloudFront distributions, Elastic Load Balancers, EC2 instances, S3 buckets, and other AWS resources using Route 53. Using AWS Identity and Access Management (IAM) with Route 53 helps control who is allowed to update DNS data.
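
As a sketch of that integration, the boto3 call below creates an alias A record that points www.site.com at a CloudFront distribution. The hosted zone ID and distribution domain name are placeholders; Z2FDTNDATAQYW2 is the fixed hosted zone ID that Route 53 uses for CloudFront alias targets.

import boto3

route53 = boto3.client("route53")

# Placeholder values -- substitute your own hosted zone ID and the domain
# name of your CloudFront distribution.
HOSTED_ZONE_ID = "Z0000000000EXAMPLE"
CLOUDFRONT_DOMAIN = "d111111abcdef8.cloudfront.net"

# Alias records map a name in your zone directly to an AWS resource; unlike
# ordinary records, they need no TTL.
route53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={
        "Changes": [
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "www.site.com",
                    "Type": "A",
                    "AliasTarget": {
                        # Fixed hosted zone ID for all CloudFront distributions.
                        "HostedZoneId": "Z2FDTNDATAQYW2",
                        "DNSName": CLOUDFRONT_DOMAIN,
                        "EvaluateTargetHealth": False,
                    },
                },
            }
        ]
    },
)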

