Certainly, I admit to not being a programmer or a technical expert (not to use somewhat derogatory words like “geek” or “nerd”) per se. Still, my engineering background and years of experience as a functional consultant should suffice for understanding the advantages and possible perils of service-oriented architecture (SOA).
On the one hand, SOA’s advantages of flexibility (agility), component reusability, and standards-based interoperability have been well publicized. On the other hand, these benefits come at a price: the difficulty of governing and managing all these mushrooming “software components without borders”, which stem from different origins and yet are able to “talk to each other” and exchange data and process steps, while being constantly updated by their respective originators (authors, owners, etc.).
At least one good (or comforting) fact about the traditional approach to application development was that old monolithic applications would have a defined beginning and end, and there was always clear control over the source code.
By contrast, the new SOA paradigm entails composite applications assembled from diverse Web services (components) that can be written in different languages, and whose source code is hardly ever accessible to the consuming parties (other services). In fact, each component exposes itself only in terms of what data and processes it needs as input and what it will return as output, while what goes “under the hood” remains largely a “black box”, or someone’s educated guess at best.
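To make this concrete, here is a minimal sketch (in Python, with a purely hypothetical endpoint, namespace, and operation) of how a consuming party interacts with such a black-box service: it sends the declared input, receives the declared output, and sees nothing of what happens in between.

```python
# A minimal sketch of a consumer's view of a Web service: only the declared
# input and output are visible, never the implementation. Endpoint, namespace,
# and operation names below are hypothetical.
import urllib.request

SOAP_ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetExchangeRate xmlns="http://example.com/rates">
      <FromCurrency>EUR</FromCurrency>  <!-- declared input -->
      <ToCurrency>USD</ToCurrency>      <!-- declared input -->
    </GetExchangeRate>
  </soap:Body>
</soap:Envelope>"""

def call_black_box_service(endpoint: str) -> bytes:
    """POST the request and return the raw XML response; what happens
    behind the endpoint is entirely opaque to the caller."""
    request = urllib.request.Request(
        endpoint,
        data=SOAP_ENVELOPE.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.read()  # declared output: an XML document with the rate
```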
Consequently, SOA radically shifts (if not completely blurs) the well-established borders of software testing, since runtime (production) issues are melding with design-time (coding) issues, and the traditional silos between developers, software architects, and their quality assurance (QA) peers appear to be diminishing when it comes to Web services.
Transparency is therefore crucial to eliminate the potential chaos and complexity of SOA. Otherwise, the introduction of SOA will have simply moved the problem area from a low level (coding) to a higher level (cross-enterprise processes), without a reduction in problems. In fact, the problems should only abound in a distributed, heterogeneous multi-enterprise environment.
Then and Now
Back to the traditional practices and mindset: the software world considers design as development-centric (i.e., a “sandbox” scenario), and runtime as operation-centric (i.e., a part of a real-life customer scenario). But with SOA that distinction blurs, since Web services are being updated on an ongoing basis, thus magnifying the issues of recurring operations testing and management.
Namely, companies still have to do component-based software testing (to ascertain whether the code is behaving as expected) at the micro (individual component) level, but there is also application development at the macro (business process) level, since composite applications are, well, composed of many disparate Web services. In other words, programmers are still doing traditional development work, but that development work now involves infrastructure issues too.
For instance, what if a Web service (e.g., one providing exchange rates, weather information, street maps, flight information, corporate credit ratings, transportation carrier rates, etc.), which is part of a long chain (composite application), gets significantly modified or even goes out of commission? To guard against this, companies should be able to restrict the service’s potentially negative influence on the chain (process), with a signaling mechanism in place to highlight changes that may compromise the overall composite application.
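As a rough illustration of that idea (a generic circuit-breaker-style wrapper, not any particular vendor’s mechanism), a consuming application could guard each upstream service and suspend it once failures accumulate, while signaling operations staff:

```python
# A rough illustration of containing a misbehaving service in a composite chain:
# after repeated failures the wrapper stops calling the service and raises an
# alert, so the rest of the process can fall back or halt gracefully.
class ServiceGuard:
    def __init__(self, call_service, failure_threshold=3):
        self.call_service = call_service          # the real Web service call
        self.failure_threshold = failure_threshold
        self.failures = 0
        self.tripped = False

    def invoke(self, *args, **kwargs):
        if self.tripped:
            raise RuntimeError("Service suspended; change/outage flagged to operations")
        try:
            result = self.call_service(*args, **kwargs)
            self.failures = 0                     # healthy again, reset the counter
            return result
        except Exception:
            self.failures += 1
            if self.failures >= self.failure_threshold:
                self.tripped = True               # restrict further influence on the chain
                notify_operations("Upstream service failing; composite application at risk")
            raise

def notify_operations(message: str) -> None:
    print("ALERT:", message)                      # stand-in for a real signaling mechanism
```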
Functional testing in such environments is a challenge because, by nature, Web services are not visual like conventional, user-facing software applications. In place of a front-end or user interface (UI), some astute testing software can overlay a form that allows team members to see the underlying schema (data structure) of the Web service being tested.
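The principle behind such tools can be sketched roughly as follows (assuming a hypothetical WSDL file name): read the service’s schema and list the declared elements and their types, which is all that is needed to render a fill-in form for a tester.

```python
# A sketch of the idea behind schema-driven test forms: read the service's
# WSDL/XSD and list the declared elements and types. The file name is hypothetical.
import xml.etree.ElementTree as ET

XSD_NS = "{http://www.w3.org/2001/XMLSchema}"

def list_schema_fields(wsdl_path: str):
    """Return (name, type) pairs for every element declared in the document's schemas."""
    tree = ET.parse(wsdl_path)
    fields = []
    for element in tree.iter(XSD_NS + "element"):
        name = element.get("name")
        if name:
            fields.append((name, element.get("type", "unspecified")))
    return fields

for name, xsd_type in list_schema_fields("exchange_rate_service.wsdl"):
    print(f"{name}: {xsd_type}")   # e.g., FromCurrency: xsd:string
```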
Furthermore, testing SOA applications is problematic since it is not only difficult for a company to know if a particular Web service will deliver on its “contract”, but also, even if it does, whether it will maintain the company’s adopted standards of performance (e.g., under increased loads) and security while complying with its adopted regulatory policies.
Thus, modern SOA software testing tools increasingly provide support for multiple roles, whereby architects can codify policies and rules, developers check for compliance during the test cycle, and support and operations staff can check for compliance issues when problems occur. The new crop of SOA testing tools also increasingly supports a range of tests, including functional and regression testing, interoperability testing, and policy conformance. Unlike traditional software testing tools that inspect code, Web services testing tools deal with the quality of the Extensible Markup Language (XML) messaging layer.
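A simplified sketch of what testing at the messaging layer means in practice (with hypothetical element names and namespaces): the functional and regression assertions are made against the XML messages the service exchanges, not against its source code.

```python
# A simplified sketch of messaging-layer testing: assertions run against the
# response message itself. Element names and namespaces are hypothetical.
import xml.etree.ElementTree as ET

RATES_NS = "{http://example.com/rates}"

def assert_rate_response(xml_bytes: bytes) -> None:
    """Functional check: the response carries exactly one positive Rate element."""
    root = ET.fromstring(xml_bytes)
    rates = root.findall(".//" + RATES_NS + "Rate")
    assert len(rates) == 1, "expected exactly one Rate element in the response"
    assert float(rates[0].text) > 0, "rate must be a positive number"

def matches_baseline(xml_bytes: bytes, baseline_bytes: bytes) -> bool:
    """Crude regression check: compare the extracted rate against a stored baseline."""
    current = ET.fromstring(xml_bytes).findtext(".//" + RATES_NS + "Rate")
    baseline = ET.fromstring(baseline_bytes).findtext(".//" + RATES_NS + "Rate")
    return current == baseline
```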
And although both traditional and Web services testing tools deal with syntax, Web services require team members to have a higher-level awareness of business rules and service policies. This is because the highly distributed SOA environment makes keeping track of changes difficult and underscores the new complexity of SOA management.
In fact, change management before and after application development is essential to filter out redundant changes, prioritize changes, and resolve conflicting changes. Moreover, if a certain message between points A and B doesn’t get through in a real-life scenario, there has to be awareness of what needs to be done to rectify it, both now and in the future.
The numerous problems inherent in SOA outlined above have pushed the previously siloed areas much closer together: software lifecycle management, application performance management, and information technology (IT) governance, with change management acting as a core information source on all changes in the environment. This union should enable companies to discover which Web services and components exist, who their owners are, and which services and components are actually consumed, and by which applications and business processes.
Progress Software Nabs Mindreef
To be better positioned to deliver testing and governance products geared towards continuous testing and validation, which ensure the high reliability and quality of multi-tier, composite SOA applications, Progress Software Corporation recently acquired Mindreef. It is interesting to note how quietly the event passed; it was reported only briefly by ZDNet bloggers Joe McKendrick and Dana Gardner.
Mindreef was a privately held firm founded in 2002 by Frank Grossman and Jim Moskun, who leveraged their deep expertise in Microsoft Windows, Java, and device driver debugging and testing to create the Mindreef SOAPscope products for SOA testing and validation. The acquisition closed in June 2008, and Mindreef was folded into the Progress Actional product group.
Prior to being acquired by Progress Software in early 2006, Actional Corporation was a leading independent provider of Web services management (WSM) software for visibility and run-time governance of distributed IT systems in a SOA. Actional’s SOA management products were incorporated under the product name Progress Actional within Progress’ Enterprise Infrastructure Division, and the line is now a major element of the Progress SOA Portfolio.
In a nutshell, Mindreef has already been wrapped into the Progress Actional product group, since it addresses SOA management at the design and testing phase, while Actional primarily addresses SOA management at the production (run-time) phase (e.g., transaction tracing). Thus, Progress now has an expanded solution that addresses the quality and management of the full SOA lifecycle, from early concept and design through go-live implementation, on-boarding of new Web services, and overall SOA production management.
Frank Grossman, former chief executive officer (CEO) and founder of Mindreef, is now vice president (VP) of Technology for Progress Actional, reporting to Dan Foody, who is in charge of Progress Actional. For more information on the acquisition’s rationale, see the frequently asked questions (FAQ) page here.
Since so much product integration is still in the planning stages this soon after the announcement of the two recent acquisitions (the other being of Iona Technologies), Progress hopes to have new slide decks to accompany analyst briefings on virtually all of its products over the next several months. Look for follow-up blog posts from me at that time.
Zooming Into SOAPscope
Designed for easy use by architects, service and support personnel as well as SOA operations managers, the Mindreef SOAPscope product family comprises SOAPscope Server, SOAPscope Architect, SOAPscope Tester, and SOAPscope Developer.
Essentially, Mindreef products collect information about Simple Object Access Protocol (SOAP) transactions and use it to shed light on Web services communications. But while most such logging tools store data in pesky flat files, SOAPscope stores it in a relational database, making it easy to use even for folks who are not necessarily XML and SOAP experts.
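The general approach can be sketched in a few lines (a bare-bones illustration, not Mindreef’s actual schema): each captured SOAP exchange goes into a relational table, so anyone on the team can interrogate the traffic with ordinary SQL.

```python
# A bare-bones sketch of relational SOAP transaction logging (illustrative schema
# only): captured requests and responses go into a table instead of flat files.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("soap_log.db")
conn.execute("""CREATE TABLE IF NOT EXISTS soap_transactions (
                    id INTEGER PRIMARY KEY AUTOINCREMENT,
                    logged_at TEXT,
                    operation TEXT,
                    request_xml TEXT,
                    response_xml TEXT,
                    http_status INTEGER)""")

def log_transaction(operation, request_xml, response_xml, http_status):
    conn.execute(
        "INSERT INTO soap_transactions "
        "(logged_at, operation, request_xml, response_xml, http_status) "
        "VALUES (?, ?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(),
         operation, request_xml, response_xml, http_status),
    )
    conn.commit()

# Anyone on the team can now answer questions like "which calls failed today?"
failed = conn.execute(
    "SELECT operation, logged_at FROM soap_transactions WHERE http_status >= 400"
).fetchall()
```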
Mindreef SOAPscope Server was initially called Mindreef Coral, and was re-released under the current name in mid-2006. Like many software testing tools, this collaborative testing product includes a “play” button with which Web services are exercised based on specific scenarios. If services for some steps of the process scenario are not available, SOAPscope Server can even simulate them.
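To illustrate the general technique of service simulation with a toy sketch (this shows the idea, not SOAPscope’s internals), a stand-in endpoint can return a canned SOAP response so the rest of a test scenario keeps running even when the real service is down.

```python
# A toy sketch of service simulation: a stand-in HTTP endpoint returns a canned
# SOAP response so downstream steps of a composite scenario can still be exercised.
# Namespace and element names are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED_RESPONSE = b"""<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetExchangeRateResponse xmlns="http://example.com/rates">
      <Rate>1.1042</Rate>
    </GetExchangeRateResponse>
  </soap:Body>
</soap:Envelope>"""

class SimulatedService(BaseHTTPRequestHandler):
    def do_POST(self):
        # Consume the incoming request, then reply with the canned SOAP message.
        self.rfile.read(int(self.headers.get("Content-Length", 0)))
        self.send_response(200)
        self.send_header("Content-Type", "text/xml; charset=utf-8")
        self.send_header("Content-Length", str(len(CANNED_RESPONSE)))
        self.end_headers()
        self.wfile.write(CANNED_RESPONSE)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), SimulatedService).serve_forever()
```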