Chapter 10: Service-Oriented Architecture — Software Design and Architecture Specialization, University of Alberta
Service-Oriented Architecture
Service-oriented architecture (SOA) mirrors everyday services: one piece of software provides functionality that other software can use. It involves a service requester (client) and a service provider (server), and it aims to build, use, and combine services rather than creating monolithic software suites. The approach applies both to internet-based (external) services and to internal services within an organization, although the specific implementations and outcomes differ by context.
Web Services
Internet or web services, available online, can be integrated to build comprehensive applications, such as those in the travel industry that combine flight, hotel, and car rental services. However, using external web services introduces trade-offs: developers must weigh the convenience of these services against qualities they cannot directly control, which shifts attention to non-functional requirements such as response time, supportability, and availability.
Large Organizations
In large organizations, in-house code can be converted into services that different business units can utilize to achieve organizational goals. Internal service-oriented architecture (SOA) promotes the development of reusable software services, enabling swift responses to opportunities and facilitating accessibility for new business units. Implementing an extensive SOA involves trade-offs, as it can be costly and challenging to support, leading to gradual integration of services. Despite the complexity, SOAs offer benefits such as modularity, extensibility, and code reuse, enabling the creation of new services through the combination of existing ones.
Service Principles
To create effective and reusable services, specific principles are crucial, including modularity and loose coupling for reusability and combinability. Additionally, services should be composable, allowing for the combination of modular services to achieve desired functionality. Ensuring platform and language independence enables the seamless integration of services across different technologies. Services should be self-describing, outlining their interfaces, inputs, and outputs. Moreover, services need to be self-advertising, making their availability known to potential clients through mechanisms like service catalogues or standards such as Universal Description, Discovery, and Integration (UDDI) for distributed applications using web services.
History of Web-based Systems
The evolution of web-based systems began with the development of ARPANET in 1969, leading to the birth of the Internet as a network of interconnected networks. Tim Berners-Lee's conceptualization of the World Wide Web in 1990 revolutionized the way information was linked and shared across the globe. The subsequent introduction of web standards such as HTML and HTTP enabled computer-based communication over the Internet. The advent of web browsers in the early 1990s significantly enhanced the accessibility and popularity of the web, with websites being composed of web pages requested by the browser from a web server. This client-server relationship operates through HTTP, which facilitates communication between the two entities.
The history of web-based systems, from the early Internet to the development of web services and applications, provides important context for understanding service-oriented architectures (SOA) based on web services. Here’s a detailed summary of the key points:
Internet vs. World Wide Web (WWW):
- The Internet and the World Wide Web (WWW) are distinct but interconnected concepts.
- The Internet, which began in 1969 with ARPANET, grew into the global network of interconnected computer networks.
- The WWW, introduced in 1990 by Tim Berners-Lee, built upon the Internet and established web standards and technologies.
Foundations of the WWW:
- The WWW introduced key web standards, including Hypertext Markup Language (HTML) and Hypertext Transfer Protocol (HTTP).
Static Web Pages:
- Early websites consisted of static web pages, where each page was a separate HTML file.
- These pages were stored on the web server and displayed to users as-is.
- Changes to static web pages required manual updates of HTML documents.
- Static websites were suitable for presenting unchanging information.
Dynamic Web Pages:
- Dynamic web pages, emerging in 1993, generated content at the time of access.
- When a dynamic web page is viewed, the web server triggers an application to process the request.
- The application can compute, query databases, or interact with web services to produce dynamic content.
- Changes to dynamic websites were easier to make, as modifications were applied in one place, impacting the entire site.
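The contrast with static pages can be sketched in a few lines. The following is a minimal, hypothetical application-layer handler (the function and data are illustrative, not from the course): the HTML is generated at the moment of access rather than read as-is from a file on disk.

```python
from datetime import datetime

# Hypothetical application-layer handler for a dynamic page: the HTML
# is generated at request time, not stored as a finished file.
def render_status_page(user: str) -> str:
    # In a real system this data might come from a database query
    # or a call to another web service.
    generated_at = datetime.now().strftime("%Y-%m-%d %H:%M")
    return (
        "<html><body>"
        f"<h1>Welcome, {user}</h1>"
        f"<p>Page generated at {generated_at}</p>"
        "</body></html>"
    )

page = render_status_page("alice")
print("Welcome, alice" in page)  # content is produced per request
```

Because the markup is produced in one place, a change to this function immediately affects every page it serves, which is the maintenance advantage noted above.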
Web Applications:
- Web applications provide interactive graphical user interfaces that run in web browsers.
- Unlike desktop applications, web applications are stored on remote web servers and accessed via web browsers.
- Web applications are platform-independent and accessible through web browsers, eliminating the need for local installations.
- They require internet access to communicate with backend servers and can offer a rich and interactive user experience.
Web Services:
- Web services are a crucial component for building complex web applications.
- Web services can provide real-time information, such as stock market data, weather reports, or currency conversion.
- These services can be treated as reusable components and integrated into various web applications.
- Web applications and web services communicate using open standards like HTTP, XML, and JSON, allowing machines to manipulate data.
- Web services enable asynchronous request-response patterns, allowing multiple services to process requests simultaneously.
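The requester/provider exchange over open data formats can be illustrated with a small sketch. The currency-conversion service below is hypothetical (names, rate, and payload shape are invented for illustration); in practice the provider would sit behind an HTTP endpoint rather than a local function call.

```python
import json

# Hypothetical provider side of a currency-conversion web service.
def convert_currency(payload: str) -> str:
    request = json.loads(payload)
    rates = {("USD", "EUR"): 0.9}  # illustrative fixed rate, not live data
    rate = rates[(request["from"], request["to"])]
    return json.dumps({"amount": round(request["amount"] * rate, 2)})

# Requester (client) side: build a JSON request, parse the JSON response.
response = json.loads(convert_currency(json.dumps(
    {"from": "USD", "to": "EUR", "amount": 100})))
print(response["amount"])  # → 90.0
```

Because both sides agree only on the JSON payload, the same provider could be reused by any number of web applications, which is what makes such services reusable components.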
Understanding this historical context helps illustrate the evolution of web-based systems, from static web pages to dynamic web pages, web applications, and the integration of web services into modern web development.
Web Technologies
The use of web technologies in the development of web-based systems involves the consideration of layered architecture, which helps organize the components and interactions in a systematic way. Here’s a detailed summary of the key aspects discussed in the lesson:
Layered Architecture:
- A layered architecture involves organizing components into different tiers to manage interactions and dependencies.
- In a web-based system, the layers typically include the presentation tier (divided into web browser and web server layers), the application tier, and the data tier.
- Layers serve distinct functions, with the lower layers providing services to the layers above.
Layers for Static Web Content:
- A static web page typically involves a layered architecture with a web browser layer, a web server layer, and a data layer.
- The web server retrieves HTML documents from the data layer and delivers them to the web browser, with no processing involved.
Layers for Dynamic Web Content and Web Applications:
- Dynamic web content and web applications require a layered architecture comprising the web browser, web server, application, and data layers.
- Dynamic web pages involve the generation of HTML documents through processing requests in the application layer and data interactions in the data layer.
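The flow through these layers can be sketched as a chain of calls, with each layer using only the layer below it. All names here are hypothetical stand-ins for real components (a browser, a web server, application code, and a database).

```python
# Minimal sketch of the tiers for a dynamic page (hypothetical names).

def data_layer(key: str) -> dict:
    # Data tier: stands in for a database query.
    store = {"greeting": {"text": "Hello from the data tier"}}
    return store[key]

def application_layer(key: str) -> str:
    # Application tier: processes the request and generates HTML.
    record = data_layer(key)
    return f"<p>{record['text']}</p>"

def web_server_layer(path: str) -> str:
    # Web server tier: maps the requested path to application logic.
    return application_layer(path.lstrip("/"))

# The web browser layer would issue this request over HTTP.
html = web_server_layer("/greeting")
print(html)  # → <p>Hello from the data tier</p>
```

Note how each layer depends only on the one beneath it, so the data tier could be swapped for a real database without touching the web server layer.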
Services View:
- A UML component diagram can be used to view a web-based system as a collection of services and service requester/provider pairings.
- This view emphasizes the use of external web services and the reinforcement of design principles such as separation of concerns and code reuse.
Challenges and Considerations:
- While the use of web services provides extensive functionality and the potential for code reuse, it also presents challenges in identifying the most appropriate services to use.
- Designers must carefully select the web services that align with the system’s requirements and effectively integrate them into the layered architecture.
Understanding the layered approach and the role of web technologies in web-based systems is crucial for designing efficient, scalable, and well-organized applications.
XML/HTML/JSON
This lesson focused on three common formats used in web systems, namely HTML, XML, and JSON. Here’s a detailed summary of the key points covered:
HTML (Hypertext Markup Language):
- HTML is primarily used to structure text on web pages, marking different elements to define their roles.
- It provides structure but no styling, and CSS is used for styling purposes.
- HTML documents consist of elements like <html>, <head>, and <body>, with the <head> containing metadata and the <body> containing the main content.
- CSS is used for applying styles to elements, helping to define the appearance of content within the HTML tags.
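A minimal illustrative document shows these pieces together: the head carries metadata and styling, while the body carries the visible content.

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Metadata: describes the page but is not rendered in the body -->
    <title>Example Page</title>
    <!-- CSS supplies the styling; HTML alone only supplies structure -->
    <style>h1 { color: navy; }</style>
  </head>
  <body>
    <h1>A heading</h1>
    <p>A paragraph of content.</p>
  </body>
</html>
```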
XML (eXtensible Markup Language):
- XML is used for storing and transporting data, allowing for structured information in both machine and human-readable formats.
- XML documents are defined with tags and can be customized with specific schemas for valid tags and their structures.
- It is commonly used to send structured data within web-based systems and applications.
JSON (JavaScript Object Notation):
- JSON is a data-interchange format that is both machine and human-readable, often used for data transfer between web browsers and servers.
- It can be easily converted to JavaScript objects and vice versa, making it a popular choice for passing data in web applications.
- JSON data is organized as name/value pairs, and JSON objects are enclosed in curly braces, with arrays of JSON objects enclosed in square brackets.
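The structural rules above can be seen directly in a short example using Python's standard `json` module (the sample data is invented for illustration):

```python
import json

# Illustrative JSON text: objects in curly braces, arrays in square
# brackets, data organized as name/value pairs.
text = '{"users": [{"name": "Ada", "active": true}, {"name": "Lin", "active": false}]}'

data = json.loads(text)          # JSON text → native objects
print(data["users"][0]["name"])  # → Ada

round_trip = json.dumps(data)    # native objects → JSON text
print(json.loads(round_trip) == data)  # → True
```

The easy round trip between text and in-memory objects is exactly why JSON is a popular choice for passing data between browsers and servers.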
Understanding the distinctions and applications of these formats is essential for web development, as they each serve specific purposes in handling and organizing data on web-based platforms.
HTTP
In this section, the focus was on understanding Hypertext Transfer Protocol (HTTP) and its important components and functionalities. Here is a concise summary of the key points covered:
URIs and URLs:
- Uniform Resource Identifiers (URIs) are used to identify resources, while Uniform Resource Locators (URLs) are a subset of URIs that are used to locate resources. URLs also provide the protocol and domain name or IP address for accessing the resource.
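The parts of a URL can be picked apart with the standard-library `urllib.parse` module (the URL itself is a made-up example):

```python
from urllib.parse import urlparse

# A URL supplies the protocol (scheme) and host needed to locate a resource.
url = urlparse("https://www.example.com:8080/catalog/item?id=42")
print(url.scheme)  # → https
print(url.netloc)  # → www.example.com:8080
print(url.path)    # → /catalog/item
print(url.query)   # → id=42
```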
TCP:
- HTTP is built upon the Transmission Control Protocol (TCP) to enable communication between clients and servers. This facilitates reliable, ordered, and connection-oriented communication.
HTTP Requests and Responses:
- HTTP requests consist of a request-line, headers, a blank line, and sometimes a message body, while HTTP responses consist of a status-line, headers, a blank line, and potentially a message body.
- Both requests and responses may include mandatory and optional headers, and the message body can contain various types of content such as HTML, JSON, or other encoded parameters.
Encoding:
- HTTP restricts the characters used in URIs, requests, and request bodies to be ASCII. Special or unsafe characters require encoding, typically with a “%” sign followed by their two-digit hexadecimal encoding.
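Both the message layout and the percent-encoding rule can be shown in one small sketch (the host, path, and query value are invented for illustration):

```python
from urllib.parse import quote

# Unsafe characters (here a space and an ampersand inside a query value)
# must be percent-encoded before they can appear in a URI.
query_value = quote("black & white", safe="")
print(query_value)  # → black%20%26%20white

# A GET request: request-line, headers, then a blank line (no body here).
request = (
    f"GET /search?style={query_value} HTTP/1.1\r\n"
    "Host: www.example.com\r\n"
    "Accept: text/html\r\n"
    "\r\n"
)
print(request.splitlines()[0])  # → GET /search?style=black%20%26%20white HTTP/1.1
```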
HTTP Methods:
- The most common request methods in HTTP are GET, POST, and PUT: GET retrieves a resource, POST submits data to a resource (often adding or modifying it), and PUT creates or replaces a resource at a specific location.
HTTP Statelessness and Cookies:
- HTTP is typically stateless, meaning it does not preserve the relationship between requests. However, HTTP cookies can be used by websites to store and update information about a user’s browsing session, enabling the server to track user interactions for various purposes.
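Python's standard `http.cookies` module illustrates both sides of this mechanism (the cookie name and value are invented):

```python
from http.cookies import SimpleCookie

# Server side: a Set-Cookie header lets the server persist a small piece
# of state across otherwise stateless HTTP requests.
cookie = SimpleCookie()
cookie["session_id"] = "abc123"
cookie["session_id"]["path"] = "/"
print(cookie.output())  # a Set-Cookie header containing session_id=abc123

# Client side: the browser returns the value on later requests,
# letting the server recognize the same browsing session.
returned = SimpleCookie("session_id=abc123")
print(returned["session_id"].value)  # → abc123
```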
Understanding these aspects of HTTP is crucial in comprehending how data is communicated and exchanged on the web, playing a fundamental role in enabling service-oriented architectures and web-based systems.
JavaScript
In this lesson, the focus was on understanding how JavaScript can be embedded into HTML documents to make web pages interactive. Here’s a concise summary of the key points covered:
JavaScript Embedding:
- JavaScript can be embedded within HTML using <script></script> tags, allowing modifications to various elements, attributes, styles, and content within the HTML document.
- JavaScript is an interpreted language, meaning it is executed by the web browser at runtime, which enables dynamic changes to the loaded HTML document.
Interactive Web Pages:
- JavaScript enables more efficient and usable interactions on web pages compared to basic HTML forms. It allows for partial form checking and processing on the client-side, reducing the need for server requests.
- The HTML Document Object Model (DOM) is utilized by JavaScript to modify elements on a web page, enabling the manipulation of content, structure, and style of the HTML document.
DOM Manipulation with JavaScript:
- JavaScript can be used to perform various actions, such as enlarging image thumbnails, hiding and revealing text (e.g., for spoilers), and dynamically modifying the elements and content on the web page.
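A minimal sketch of the hide-and-reveal idea shows the DOM being manipulated from script (element ids and text are invented for illustration):

```html
<p>
  <button onclick="document.getElementById('spoiler').style.display = 'block'">
    Reveal spoiler
  </button>
</p>
<p id="spoiler" style="display: none">The butler did it.</p>
<script>
  // The DOM lets a script change content after the page has loaded.
  document.getElementById("spoiler").textContent += " (revealed by script)";
</script>
```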
Utilizing Pre-made Scripts:
- Even without in-depth knowledge of JavaScript, pre-made scripts are readily available online from trusted sources. These scripts can be directly applied or slightly modified to suit the specific needs of a web page.
Embedding JavaScript into HTML offers a powerful way to enhance user experience and provide access to various services directly from a web page.
Distributed Systems Basics
This section highlights the significance of the rapid growth of the Internet and the impact of cloud services on the development of innovative web services. It emphasizes the changing landscape of client-server communications and the increasing prevalence of heterogeneous environments.
Middleware
Middleware facilitates communication between applications operating on different systems. It plays an essential role in enabling seamless interactions and data exchange between heterogeneous systems.
Distributed Computing: In distributed computing, multiple computers on a network collaborate and share resources to perform complex tasks. This brings both advantages and challenges in modern systems.
Middleware for Enhancing Communication: Middleware is crucial for improving communication between clients and servers, particularly in diverse and complex computing environments, because it simplifies the interaction between different components.
Remote Procedure Call (RPC)
Remote procedure calls are a form of middleware used by certain web services: they enable a client to invoke procedures implemented on a remote server, a capability central to distributed computing.
History of RPC: This section explores the historical development of remote procedure calls, discussing their origins, initial design, and the motivations behind their creation by Birrell and Nelson in the 1980s. It sheds light on the early stages of RPC and its evolution over time.
Basics of RPC: The basics of RPC section covers the fundamental components of an RPC system, including the client, server, and the Interface Definition Language (IDL). It explains how the IDL plays a critical role in specifying the available procedures and their parameters for the client and server to communicate effectively.
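The core idea can be sketched with Python's built-in XML-RPC modules (this is a modern stand-in, not the original Birrell/Nelson system, and the `add` procedure is invented for illustration): the client calls `add()` as if it were a local function, while the middleware ships the call to the server and returns the result.

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Server side: register a procedure that remote clients may invoke.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(lambda a, b: a + b, "add")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: the proxy makes the remote procedure look like a local call.
client = ServerProxy(f"http://127.0.0.1:{port}")
result = client.add(2, 3)
print(result)  # → 5

server.shutdown()
```

In a full RPC system, an Interface Definition Language would specify the `add` procedure and its parameters, so that client and server stubs could be generated for each side.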
Object Brokers and CORBA
This section provides a detailed explanation of object brokers and the Common Object Request Broker Architecture (CORBA) as a comprehensive set of standards for object brokers in distributed systems. It highlights the significance of CORBA in enabling seamless interaction between distributed objects.
Benefits of CORBA: The benefits of CORBA section emphasizes the advantages of using CORBA as a middleware system, highlighting its ability to handle distributed computing in an object-oriented paradigm. It discusses the significance of location transparency and the various options it provides for data handling and optimization.
Criticisms of CORBA: This part addresses the criticisms and challenges associated with CORBA, emphasizing the importance of implementing the standards correctly to avoid potential issues. It discusses the complexities related to location transparency and the impact of poor implementation on the overall performance of the system.
Ibrahim Can Erdoğan