CORE COURSE VIII
CLOUD COMPUTING Credit: 5
UNIT – IV ANEKA
ANEKA
Manjrasoft focuses on creating innovative software technologies that simplify the development and deployment of applications on private or public Clouds. Its product Aneka plays the role of an Application Platform as a Service for Cloud Computing. Aneka supports several programming models, including Task Programming, Thread Programming, and MapReduce Programming.
HIGHLIGHTS OF ANEKA
Technical Value
 Support of multiple programming and application environments
 Simultaneous support of multiple run-time environments
 Rapid deployment tools and framework
 Simplicity in developing applications on Cloud
 Dynamic Scalability
 Ability to harness multiple virtual and/or physical machines for accelerating application results
 Provisioning based on QoS/SLA
Business Value
 Improved reliability
 Simplicity
 Faster time to value
 Operational Agility
 Definite application performance enhancement
 Optimizing the capital expenditure and operational expenditure
All these features make Aneka a winning solution for enterprise customers in the Platform-as-a-Service scenario.
ANATOMY OF THE ANEKA CONTAINER
Aneka is a platform and a framework for developing distributed applications on
the Cloud. It harnesses the spare CPU cycles of a heterogeneous network of desktop
PCs and servers or data centers on demand.
Aneka provides developers with a rich set of APIs for transparently exploiting
such resources and expressing the business logic of applications by using the preferred
programming abstractions.
System administrators can leverage on a collection of tools to monitor and
control the deployed infrastructure. This can be a public cloud available to anyone
through the Internet, or a private cloud constituted by a set of nodes with restricted
access.
The Aneka-based computing cloud is a collection of physical and virtualized resources connected through a network, which can be either the Internet or a private intranet.
Each of these resources hosts an instance of the Aneka Container representing
the runtime environment where the distributed applications are executed.
The container provides the basic management features of the single node and delegates all the other operations to the services that it hosts.
Fabric services directly interact with the node through the Platform Abstraction
Layer (PAL) and perform hardware profiling and dynamic resource provisioning.
Foundation services identify the core system of the Aneka middleware,
providing a set of basic features to enable Aneka containers to perform specialized and
specific sets of tasks.
Execution services directly deal with the scheduling and execution of
applications in the Cloud. One of the key features of Aneka is the ability of providing
different ways for expressing distributed applications by offering different
programming models; Execution services are mostly concerned with providing the
middleware with an implementation for these models.
Additional services such as persistence and security are transversal to the entire
stack of services that are hosted by the Container.
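To make this layering concrete, the following minimal Python sketch (not the actual Aneka API, which is .NET based) shows a container-like runtime that hosts named services and delegates operations to them; all class and service names are illustrative assumptions.

```python
# Illustrative sketch: a container that hosts named services and delegates
# operations to them, mirroring the layering of fabric, foundation, and
# execution services described above. Not the real Aneka Container.

class Service:
    """Minimal service contract: a name and start/stop hooks."""
    def __init__(self, name):
        self.name = name

    def start(self):
        print(f"[{self.name}] started")

    def stop(self):
        print(f"[{self.name}] stopped")


class Container:
    """Hosts services and delegates node operations to them."""
    def __init__(self):
        self._services = {}

    def register(self, service):
        self._services[service.name] = service

    def start(self):
        # Basic management duty of the container: bring up every hosted service.
        for service in self._services.values():
            service.start()

    def get_service(self, name):
        # Other components look services up by name instead of talking
        # to the node directly.
        return self._services.get(name)


if __name__ == "__main__":
    container = Container()
    for name in ("HeartbeatService", "MembershipCatalogue", "TaskScheduler"):
        container.register(Service(name))
    container.start()
```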
At the application level, a set of different components and tools are provided to:
1) simplify the development of applications (SDK);
2) port existing applications to the Cloud; and
3) monitor and manage the Aneka Cloud.
In a common deployment, an Aneka-based Cloud is constituted by a set of interconnected resources that are dynamically modified according to user needs, by using resource virtualization or by harnessing the spare CPU cycles of desktop machines.
If the deployment identifies a private Cloud, all the resources are in-house, for example within the enterprise. This deployment can be extended by adding publicly available resources on demand or by interacting with other Aneka public clouds that provide computing resources connected over the Internet.
ANEKA ARCHITECTURE
The Aneka architecture is designed to facilitate distributed computing in cloud
environments, providing a versatile and scalable framework for efficient resource
utilization. Below is a description of the key components and layers within the Aneka
architecture:
Application Layer: The top layer of the Aneka architecture is the application
layer, where users deploy their applications and tasks. Users interact with Aneka to
submit jobs, manage workloads, and access distributed computing resources.
Middleware Layer: Sitting between the application layer and the infrastructure
layer, the middleware layer comprises the core components of Aneka. It includes
services for job scheduling, task execution, resource management, and communication.
This layer abstracts the complexities of distributed computing, providing a unified
interface for application deployment.
Resource Layer: At the bottom of the architecture is the resource layer, which
encompasses the distributed computing resources available in the cloud infrastructure.
These resources include computational nodes, storage, and network components. Aneka
dynamically allocates and manages these resources based on the requirements of
running applications.
Job Scheduler: A critical component within the middleware layer, the job
scheduler allocates tasks to available resources. It considers factors such as resource
availability, load balancing, and priority to optimize the execution of jobs across the
distributed environment.
Execution Container: The execution container is a runtime environment that
hosts and executes tasks or applications. Aneka dynamically creates and manages these
containers on distributed nodes, providing isolation and ensuring that applications run
efficiently without interference.
Communication Services: Aneka includes communication services to enable
efficient data exchange and coordination among distributed components. This ensures
seamless collaboration between execution containers, allowing them to work together
in a coordinated fashion.
Monitoring and Management Services: These services provide real-time monitoring of the distributed computing environment, collecting data on resource usage, application performance, and system health. Users can access this information for performance optimization, debugging, and resource planning.
Security Services: Aneka incorporates security services to ensure the integrity
and confidentiality of data and resources. This involves implementing authentication,
authorization, and encryption measures to prevent illegitimate access and potential
security risks.
Integration Adapters: Aneka’s architecture includes adapters facilitating
integration with various cloud infrastructures and middleware components. These
adapters ensure compatibility and interoperability, allowing Aneka to work seamlessly
with cloud providers and technologies.
PLATFORM ABSTRACTION LAYER
The PAL is a small layer of software that comprises a detection engine, which
automatically configures the container at boot time, with the platform-specific
component to access the above information and an implementation of the abstraction
layer for the Windows, Linux, and Mac OS X operating systems.
The collectible data that are exposed by the PAL are the following:
• Number of cores, frequency, and CPU usage
• Memory size and usage
• Aggregate available disk space
• Network addresses and devices attached to the node
Moreover, additional custom information can be retrieved by querying the
properties of the hardware. The PAL interface provides means for custom
implementations to pull additional information by using name-value pairs that can host
any kind of information about the hosting platform. For example, these properties can
contain additional information about the processor, such as the model and family, or
additional data about the process running the container.
The Platform Abstraction Layer (PAL) addresses the heterogeneity of hosting platforms and provides the container with a uniform interface for accessing the relevant hardware and operating system information, thus allowing the rest of the container to run unmodified on any supported platform.
The PAL is responsible for detecting the supported hosting environment and
providing the corresponding implementation to interact with it to support the activity of
the container. The PAL provides the following features:
• Uniform and platform-independent implementation interface for accessing the hosting platform
• Uniform access to extended and additional properties of the hosting platform
• Uniform and platform-independent access to remote nodes
• Uniform and platform-independent management interfaces
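As an illustration of these features, the sketch below models a PAL-like layer in Python: a uniform interface, per-platform implementations, and a small detection step that picks one at startup. All class and method names are hypothetical and do not reflect the actual Aneka PAL API.

```python
# Illustrative PAL-like layer: a uniform, platform-independent interface
# with per-OS implementations chosen by a small detection step at startup.
import os
import platform


class PlatformAbstractionLayer:
    """Uniform view of the hosting platform."""
    def node_info(self):
        raise NotImplementedError

    def extended_properties(self):
        # Name-value pairs for custom, platform-specific details.
        return {}


class LinuxPAL(PlatformAbstractionLayer):
    def node_info(self):
        return {"cores": os.cpu_count(), "os": "Linux"}


class WindowsPAL(PlatformAbstractionLayer):
    def node_info(self):
        return {"cores": os.cpu_count(), "os": "Windows"}


def detect_pal():
    """Detection engine: pick the implementation for the hosting OS."""
    system = platform.system()
    if system == "Windows":
        return WindowsPAL()
    return LinuxPAL()   # default for Linux and Mac-like systems in this sketch


if __name__ == "__main__":
    pal = detect_pal()
    print(pal.node_info())
```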
FABRIC SERVICES
Fabric Services define the lowest level of the software stack representing the Aneka
Container. Fabric services in Aneka pertain to the underlying infrastructure and
resources that facilitate the execution of distributed applications. These services manage
the fabric of computing resources, ensuring efficient resource allocation, scalability,
and responsiveness to varying workloads. Fabric services are crucial in optimizing the
utilization of cloud resources for distributed computing.
Examples:
Job Scheduling: Efficient allocation and scheduling of tasks across distributed
nodes.
Resource Management: Dynamic allocation and deallocation of resources based
on demand.
Load Balancing: Equitable distribution of tasks to prevent resource bottlenecks.
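The short Python sketch below illustrates the kind of scheduling and load balancing listed above by assigning each task to the currently least-loaded node; it is a simplified illustration under assumed task costs, not Aneka's actual scheduling algorithm.

```python
# Simplified load-balancing scheduler: assign each task to the node with
# the smallest accumulated load. Task costs and node names are assumptions.
import heapq


def schedule(tasks, nodes):
    """Return a mapping task -> node, balancing load across nodes."""
    heap = [(0, node) for node in nodes]   # min-heap of (current_load, node)
    heapq.heapify(heap)
    assignment = {}
    for task, cost in tasks:
        load, node = heapq.heappop(heap)
        assignment[task] = node
        heapq.heappush(heap, (load + cost, node))
    return assignment


if __name__ == "__main__":
    tasks = [("t1", 4), ("t2", 2), ("t3", 3), ("t4", 1)]
    nodes = ["worker-1", "worker-2"]
    print(schedule(tasks, nodes))
```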
APPLICATION SERVICES
Application services in Aneka focus on providing tools and capabilities to
support the development, deployment, and management of distributed applications.
These services abstract the complexities of distributed computing, offering features that
enhance application performance, reliability, and adaptability in cloud environments.
Examples:
Task Execution: Services for deploying and executing tasks across the
distributed environment.
Data Management: Handling storage, retrieval, and manipulation of data within
the distributed application.
Communication Services: Facilitating communication and coordination between
distributed components.
FOUNDATION SERVICES
Foundation services form the core building blocks of Aneka, providing
fundamental functionalities that support the overall framework. These services include
essential components and tools that enable Aneka’s seamless integration, security, and
extensibility in diverse cloud computing environments.
Examples:
Security Services: Authentication, authorization, and encryption mechanisms to
ensure data and resource security.
Integration Adapters: Components that facilitate compatibility with various
cloud infrastructures and middleware.
Monitoring and Management: Real-time resource usage monitoring, application
performance, and system health.
RESOURCE MANAGEMENT
Resource management is another fundamental feature of Aneka Clouds. It
comprises several tasks: resource membership, resource reservation, and resource
provisioning. Aneka provides a collection of services that are in charge of managing
resources. These are the Index Service (or Membership Catalogue), Reservation
Service, and Resource Provisioning Service.
The Membership Catalogue is Aneka’s fundamental component for resource
management; it keeps track of the basic node information for all the nodes that are
connected or disconnected. The Membership Catalogue implements the basic services
of a directory service, allowing the search for services using attributes such as names
and nodes. During container startup, each instance publishes its information to the
Membership Catalogue and updates it constantly during its lifetime.
Services and external applications can query the membership catalogue to
discover the available services and interact with them. To speed up and enhance the
performance of queries, the membership catalogue is organized as a distributed
database: All the queries that pertain to information maintained locally are resolved
locally; otherwise the query is forwarded to the main index node, which has a global
knowledge of the entire Cloud.
The Membership Catalogue is also the collector of the dynamic performance
data of each node, which are then sent to the local monitoring service to be persisted in
the long term.
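The sketch below illustrates, in simplified Python, the directory-service behaviour just described: containers publish their information at startup and keep it updated, and queries are resolved locally where possible or forwarded to the main index node. Names and data structures are assumptions for illustration, not Aneka's implementation.

```python
# Simplified membership catalogue: nodes publish their services; queries
# are answered from the local view or forwarded to the master index.
import time


class MembershipCatalogue:
    def __init__(self, master=None):
        self._nodes = {}          # node_id -> {"services": set, "last_update": float}
        self._master = master     # master index node with global knowledge

    def publish(self, node_id, services):
        self._nodes[node_id] = {"services": set(services),
                                "last_update": time.time()}

    def query(self, service_name):
        """Find nodes hosting a service; fall back to the master index."""
        local = [node for node, info in self._nodes.items()
                 if service_name in info["services"]]
        if local or self._master is None:
            return local
        return self._master.query(service_name)


if __name__ == "__main__":
    master = MembershipCatalogue()
    master.publish("node-A", ["TaskScheduler", "StorageService"])
    local = MembershipCatalogue(master=master)
    local.publish("node-B", ["HeartbeatService"])
    print(local.query("TaskScheduler"))   # resolved by forwarding to the master
```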
Aneka defines a very flexible infrastructure for resource provisioning whereby
it is possible to change the logic that triggers provisioning, support several back-ends,
and change the runtime strategy with which a specific back-end is selected for
provisioning.
The resource-provisioning infrastructure built into Aneka is mainly concentrated in the Resource Provisioning Service, which includes all the operations that are needed for provisioning virtual instances. The implementation of the service is based on the idea of resource pools.
Resource provisioning is a feature designed to support QoS requirements-driven execution of applications. Therefore, it mostly serves requests coming from the Reservation Service or the Scheduling Services. Despite this, external applications can directly leverage Aneka's resource-provisioning capabilities by dynamically retrieving a client to the service and interacting with the infrastructure through it. This extends the resource-provisioning scenarios that can be handled by Aneka, which can also be used as a virtual machine manager.
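A minimal sketch of the resource-pool idea follows, assuming hypothetical pool names and capacities: a provisioning service draws virtual instances from pools in order of preference. This shows the general pattern described above, not Aneka's actual service.

```python
# Resource-pool sketch: each back-end is wrapped in a pool from which
# virtual instances are acquired and released. Pool names and sizes are
# illustrative assumptions.

class ResourcePool:
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity
        self.in_use = 0

    def acquire(self, count):
        granted = min(count, self.capacity - self.in_use)
        self.in_use += granted
        return granted

    def release(self, count):
        self.in_use = max(0, self.in_use - count)


class ResourceProvisioningService:
    def __init__(self, pools):
        self.pools = pools   # ordered by preference, e.g. cheapest first

    def provision(self, requested):
        """Satisfy a request by drawing from pools in order of preference."""
        plan = {}
        for pool in self.pools:
            if requested == 0:
                break
            granted = pool.acquire(requested)
            if granted:
                plan[pool.name] = granted
                requested -= granted
        return plan   # may be partial if total capacity is exceeded


if __name__ == "__main__":
    service = ResourceProvisioningService(
        [ResourcePool("local-virtualization", 4), ResourcePool("public-iaas", 16)])
    print(service.provision(10))   # e.g. {'local-virtualization': 4, 'public-iaas': 6}
```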
BUILDING ANEKA CLOUDS
Aneka is primarily a platform for developing distributed applications for clouds.
As a software platform it requires infrastructure on which to be deployed; this
infrastructure needs to be managed.
Infrastructure management tools are specifically designed for this task, and building
clouds is one of the primary tasks of administrators. Aneka supports various deployment
models for public, private, and hybrid clouds.
Infrastructure organization
This subsection provides an overview of Aneka Clouds from an infrastructure point of view. The scenario is a reference model for all the different deployments Aneka supports. A central role is played by the Administrative Console, which performs all the required management operations.
A fundamental element of an Aneka Cloud deployment is constituted by repositories. A repository provides storage for all the libraries required to lay out and install the basic Aneka platform.
These libraries constitute the software image for the node manager and the
container programs. Repositories can make libraries available through a variety of
communication channels, such as HTTP, FTP, common file sharing, and so on. The
Management Console can manage multiple repositories and select the one that best suits
the specific deployment. The infrastructure is deployed by harnessing a collection of
nodes and installing on them the Aneka node manager, also called the Aneka daemon.
The daemon constitutes the remote management service used to deploy and
control container instances. The collection of resulting containers identifies the Aneka
Cloud. From an infrastructure point of view, the management of physical or virtual
nodes is performed uniformly as long as it is possible to have an Internet connection
and remote administrative access to the node.
A different scenario is constituted by the dynamic provisioning of virtual
instances; these are generally created by prepackaged images already containing an
installation of Aneka, which only need to be configured to join a specific Aneka Cloud.
It is also possible to simply install the container or install the Aneka daemon, and the
selection of the proper solution mostly depends on the lifetime of virtual resources.
Logical organization
The logical organization of Aneka Clouds can be very diverse, since it strongly depends on the configuration selected for each of the container instances belonging to the Cloud. The most common scenario is a master-worker configuration.
The master node features all the services that are most likely to be present in one single copy and that provide the intelligence of the Aneka Cloud. What specifically characterizes a node as a master node is the presence of the Index Service (or Membership Catalogue) configured in master mode; all the other services, except for those that are mandatory, may be hosted on the master or located on other nodes.
A common configuration of the master node is as follows:
• Index Service (master copy)
• Heartbeat Service
• Logging Service
• Reservation Service
• Resource Provisioning Service
• Accounting Service
• Reporting and Monitoring Service
• Scheduling Services for the supported programming models
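As a rough illustration (not Aneka's actual configuration format), a node's role can be thought of as the set of services its container hosts, as in the Python sketch below; the master node is simply the one hosting the Index Service in master mode. The worker entry is a hypothetical contrast added for comparison, not taken from the list above.

```python
# Rough illustration only: a node's role expressed as the set of services
# its container hosts. The worker entry is a hypothetical example.

MASTER_NODE = {
    "IndexService": {"mode": "master"},
    "HeartbeatService": {},
    "LoggingService": {},
    "ReservationService": {},
    "ResourceProvisioningService": {},
    "AccountingService": {},
    "ReportingAndMonitoringService": {},
    "SchedulingServices": {"models": ["Task", "Thread", "MapReduce"]},
}

WORKER_NODE = {  # hypothetical non-master node
    "HeartbeatService": {},
    "LoggingService": {},
    "ExecutionServices": {"models": ["Task", "Thread", "MapReduce"]},
}


def is_master(node_config):
    """A node is a master if it hosts the Index Service in master mode."""
    return node_config.get("IndexService", {}).get("mode") == "master"


if __name__ == "__main__":
    print(is_master(MASTER_NODE))   # True
    print(is_master(WORKER_NODE))   # False
```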
ANEKA CLOUD DEPLOYMENT MODES
The general cloud deployment modes, namely private, public, and hybrid, are applicable to Aneka Clouds as well.
Private cloud deployment mode:
A private cloud deployment mode is mostly constituted by local physical resources and infrastructure management software providing access to a local pool of nodes, which might be virtualized.
Public cloud deployment mode:
It features the installation of Aneka master and worker nodes over a completely virtualized infrastructure that is hosted on the infrastructure of one or more resource providers such as Amazon EC2 and GoGrid. The deployment is generally contained within the infrastructure boundaries of a single IaaS provider. The reasons for this are to minimize the data transfer between different providers, which is generally priced at a higher cost, and to have better network performance.
Hybrid cloud deployment mode:
It constitutes the most common deployment of Aneka. In many cases, there is an existing computing infrastructure that can be leveraged to address the computing needs of applications. This infrastructure constitutes the static deployment of Aneka, which can be elastically scaled on demand when additional resources are required. In this scenario, the following capabilities become fundamental:
 Dynamic Resource Provisioning
 Resource Reservation
 Workload Partitioning
 Accounting, Monitoring and Reporting
In a hybrid scenario, heterogeneous resources can be used for different purposes. As discussed in the case of a private cloud deployment, desktop machines can be reserved for low-priority workloads outside the common working hours. The majority of applications will be executed on workstations and clusters, which are the nodes constantly connected to the Aneka Cloud. Any additional demand for computing capability can primarily be addressed by the local virtualization facilities, and if more computing power is required, it is possible to leverage external IaaS providers.
CLOUD PROGRAMMING AND MANAGEMENT
Aneka’s primary purpose is to provide a scalable middleware product in which
to execute distributed applications. Application development and management
constitute the two major features that are exposed to developers and system
administrators.
Aneka provides developers with a comprehensive and extensible set of APIs and
administrators with powerful and intuitive management tools. The APIs for
development are mostly concentrated in the Aneka SDK; management tools are
exposed through the management console.
ANEKA SDK
Aneka provides APIs for developing applications on top of existing
programming models, implementing new programming models and developing new
services to integrate into the Aneka Clouds.
The SDK provides support for both programming models and services through two main abstractions:
 The Application Model
 The Service Model
Application model
Aneka facilitates distributed execution in the Cloud by employing programming
model abstractions. These programming models serve as both a conceptual framework
for developers and provide the necessary runtime support for program execution on the
Aneka platform. The Application Model serves as a foundational layer, defining the
minimum set of APIs shared across all programming models. It acts as a common
ground for representing and programming distributed applications on Aneka.
Furthermore, the Application Model is tailored to meet the specific requirements
and features of individual programming models, allowing for a specialized and efficient
development approach.
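The following Python sketch conveys the idea of a common Application Model specialized by a programming model; the class names and methods are hypothetical, and the real Aneka SDK is a .NET library with a much richer API.

```python
# Illustrative sketch: a common application base that every programming
# model shares, specialized here by a task-style model.

class ApplicationBase:
    """Minimum API shared by all programming models: add units, submit, stop."""
    def __init__(self, name, configuration=None):
        self.name = name
        self.configuration = configuration or {}
        self.units = []     # work units, whatever the concrete model calls them

    def add_unit(self, unit):
        self.units.append(unit)

    def submit(self):
        # In a real middleware this would hand the units to the scheduling
        # services; here we only simulate local execution.
        return [unit() for unit in self.units]

    def stop(self):
        self.units.clear()


class TaskApplication(ApplicationBase):
    """A task-model specialization: work units are independent callables."""
    pass


if __name__ == "__main__":
    app = TaskApplication("render-frames")
    for frame in range(3):
        app.add_unit(lambda frame=frame: f"frame {frame} rendered")
    print(app.submit())
```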
Service model
The Aneka Service Model outlines essential criteria for implementing services in the Aneka Cloud. Services, hosted in a container, must adhere to the IService interface, providing properties like name and status, control operations, and message handling. The service life cycle encompasses states from initialization to shutdown.
A default base class, ServiceBase, simplifies implementation, offering features like basic property support, control operation implementation, built-in client infrastructure, and service monitoring. Aneka employs a strongly typed message-passing model, and developers can dynamically inject service clients into applications. Advanced configuration capabilities enhance service integration within the container workflow.
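The sketch below is a Python analogue of the contract just described: a name and status, control operations, and message handling, with a base class supplying defaults. It mirrors the concepts only; the real IService and ServiceBase are .NET types with different signatures.

```python
# Python analogue of the described service contract; not the actual
# Aneka .NET interfaces.
from abc import ABC, abstractmethod


class IService(ABC):
    @property
    @abstractmethod
    def name(self): ...

    @property
    @abstractmethod
    def status(self): ...

    @abstractmethod
    def start(self): ...

    @abstractmethod
    def stop(self): ...

    @abstractmethod
    def handle_message(self, message): ...


class ServiceBase(IService):
    """Default base class: basic properties plus simple lifecycle handling."""
    def __init__(self, name):
        self._name = name
        self._status = "Initialized"

    @property
    def name(self):
        return self._name

    @property
    def status(self):
        return self._status

    def start(self):
        self._status = "Running"

    def stop(self):
        self._status = "Stopped"

    def handle_message(self, message):
        # Concrete services override this with their own message handling.
        pass


class HeartbeatService(ServiceBase):
    def handle_message(self, message):
        return {"echo": message, "service": self.name, "status": self.status}


if __name__ == "__main__":
    svc = HeartbeatService("HeartbeatService")
    svc.start()
    print(svc.handle_message("ping"))
```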
MANAGEMENT TOOLS
Aneka operates as a pure Platform as a Service (PaaS) solution, necessitating
deployment on either virtual or physical hardware. As a result, Aneka places significant
emphasis on infrastructure management, encompassing the installation of logical clouds
on the designated hardware.
This foundational aspect is integral to Aneka’s management layer, which not
only oversees infrastructure but also boasts capabilities for effectively managing
services and applications within the Aneka Cloud.
Infrastructure management
Aneka utilizes both virtual and physical hardware for deploying Aneka Clouds.
The Resource Provisioning Service efficiently handles virtual hardware, acquiring
resources on-demand based on application requirements. In contrast, the Administrative
Console directly manages physical hardware by leveraging the Aneka management API
of the PAL. The core focus of these management features revolves around provisioning
physical hardware and facilitating the remote installation of Aneka on the designated
hardware.
Platform management
Infrastructure management provides the foundational layer on which Aneka Clouds are deployed. Platform management builds on it by orchestrating the creation of Clouds, deploying services on the physical infrastructure and facilitating container installation and management. Platform management features focus on organizing Aneka Clouds logically and allow hardware to be partitioned for different purposes.
Services play a key role in implementing core features, and the management
layer provides operations for essential functions like Cloud monitoring, resource
provisioning, user management, and application profiling.
Application management
User contributions to the Cloud are reflected through applications, with
management APIs equipping administrators with monitoring and profiling features.
These tools enable the tracking of resource usage and its correlation to users and
applications - a crucial aspect in cloud scenarios where users are billed based on
resource consumption.
Aneka offers capabilities to provide both summary and detailed information
about application execution and resource utilization. The Aneka Cloud Management
Studio, serving as the primary Administrative Console, makes these features easily
accessible to administrators.