XML and Web Services In The News - 3 January 2007

Provided by OASIS | Edited by Robin Cover

This issue of XML Daily Newslink is sponsored by IBM Corporation



HEADLINES:

 Software Development: Simplicity Tops the 2007 Agenda
 Q&A from James Bryce Clark: Dynamic SOA Coming in 2007
 Justice Reference Architecture (JRA) Specification Version 1.3
 W3C Publishes Approved TAG Finding on the Use of Metadata in URIs
 Working XML: Understand the Various Approaches to XML Parsing
 New Year, New Look For PC-BSD
 Opinion: The Free Multimedia Opportunity


Software Development: Simplicity Tops the 2007 Agenda
Andrew Binstock, InfoWorld
Software development continued to move toward simplicity in 2006. Most evident was the widespread adoption of SOA, which has become the technology of choice for integrating systems of all kinds — in-house between departments, across stovepipe applications, and in B2B and B2C commerce. Web services became easier to develop and use as a result of the emerging popularity of REST (Representational State Transfer). REST does away with SOAP wrappers and other overhead, bringing Web services down to the simplest possible implementation: an XML file sent over the wire via HTTP. Four basic commands (the REST equivalent of CRUD), combined with resource files, make all actions simple to implement. REST is likely to replace much of socket-based communication and a good portion of SOAP-based services as well, especially in straightforward applications where solutions need to be cobbled together quickly. Java 6, released in December, added numerous features to simplify development. It also added better support for scripting languages. One language primarily associated with simplicity, Ruby, will tap these benefits with JRuby, a JVM implementation that should ship in 2007. Likewise, an elegant scripting language called Groovy, due to ship in early 2007, will put the fun back into Java programming. Frameworks such as Spring and Ruby on Rails continued gaining in popularity and commercial support, as developers and their managers came to accept the view that many business apps don't need the heavy enterprise aspects. By giving up some features, and especially scalability, these frameworks have enabled many sites to cut a swath through their backlog. The rise of lightweight frameworks, the continued popularity of scripting languages, and the success of SOA show that IT sites and developers are increasingly relying on simpler technologies.
See also: Atom references
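
For readers new to the REST style, here is a minimal sketch of what "an XML file sent over the wire via HTTP" looks like in practice: a plain Java HTTP GET against a hypothetical resource URL (example.com and the /orders path are invented for illustration), with PUT, POST, and DELETE filling out the rest of the CRUD mapping.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class RestGetSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical resource URL; a real service defines its own URI layout.
            URL url = new URL("http://example.com/orders/1234");

            // GET is the "read" verb of CRUD; PUT, POST, and DELETE cover the rest,
            // all addressed to the same resource URI. No SOAP envelope is involved.
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");
            conn.setRequestProperty("Accept", "application/xml");

            BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"));
            try {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);   // the response body is plain XML
                }
            } finally {
                in.close();
                conn.disconnect();
            }
        }
    }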

Q&A from James Bryce Clark: Dynamic SOA Coming in 2007
Rich Seeley, SearchWebServices.com
In this second part of a Q&A interview with James Bryce Clark (Director of Standards Development for OASIS), Clark talks about the future of semantic standards for making more intelligent use of the information organizations rely on for their business applications. Clark: [on knowledge representation] "Among the things that are happening now to bring KR, knowledge representation, into our field are all the taxonomy and ontology projects. There are also three business rules projects. Semantic markup people are doing great work. There are several candidates and it's hard to say who will win out because they have different models. But when our people on OASIS committees want to add semantic information to their models there are ways to do it. And it's very important to add that functionality. Also, a lot of the action in adding semantic content, in making business documents smarter, isn't necessarily happening through large complex academic exercises. There are a lot of fairly light methods for adding information that also seem to be gathering a lot of steam, because not everybody who wants to make their business information smarter wants to go down the road of rewriting everything they have, hiring four PhDs, and doing ontological research. Sometimes they just want to get a little more metadata or a little more organization. There are a number of really cool methods for doing that that are becoming mature now. One of them that is happily at OASIS is the Darwin Information Typing Architecture (DITA). Also there's BPEL, which is widely used. A lot of complex transaction engines and business rules engines are proprietary and have their own models. Nevertheless, they use BPEL for their exchange format. As for vendors of systems that orchestrate and run transactions with their own, not necessarily standardized, engines: most of them support BPEL now. That's something I didn't expect to see. It's become an important exchange format. A customer who commits to the Foo Company's method of running their transactions knows that if they ever want to switch out of Foo Company, the use of BPEL will help them move their data out in a standard format."
See also: Part 1

Justice Reference Architecture (JRA) Specification Version 1.3
U.S. DOJ Global Infrastructure/Standards WG, Draft Specification
Members of the United States Department of Justice GISWG Executive Architecture Committee (EAC) have announced the release of the Global Justice Reference Architecture (JRA) Specification Working Draft Version 1.3. The document is now available on U.S. OJP's Technology and Global Web Site, and is open for public comment through January 16, 2007. The document states a set of requirements for justice interoperability and then describes the Justice Reference Architecture Specification (its concepts, relationships, and high-level components) that satisfies those requirements. The document then illustrates the architecture through a set of actual scenarios. Finally, the document provides an initial elaboration of some of the concepts and components in the architecture. This report is intended as a resource for a technical audience, including Global Justice XML Data Model (Global JXDM) implementers, architects, developers, system integrators, and other justice and public safety technical practitioners. It provides the background and concepts, a strong foundation, required for the implementation of SOA. Justice Reference Architecture is a new term coined for the justice community, and it is derived from the OASIS Reference Model for Service-Oriented Architecture 1.0 (SOA-RM). The reader should refer to the SOA-RM for more detailed information about many of the concepts in this document. The JRA is intended to facilitate your SOA implementation by establishing a common language that can be used to exchange data with partner organizations. Interoperability continues to be a significant challenge and a high priority for the justice and public safety community: approximately 100,000 justice agencies have a critical need to share information across their various information systems, and this variety creates multiple layers of interoperability problems because hardware, software, networks, and business rules for data exchange all differ. The need for information sharing has led to this interoperability strategy and the Justice Reference Architecture (JRA).
See also: the reference page

W3C Publishes Approved TAG Finding on the Use of Metadata in URIs
W3C Technical Architecture Group, TAG Finding
W3C announced the publication of an Approved TAG Finding, produced by members of the W3C Technical Architecture Group (TAG). Edited by Noah Mendelsohn and Stuart Williams, this finding "addresses several questions regarding Uniform Resource Identifiers (URIs). Specifically, what information about a resource can or should be embedded in its URI? What metadata can be reliably determined from a URI, and in what circumstances is it appropriate to rely on the correctness of such information? In what circumstances is it appropriate to use information from a URI as a hint as to the nature of a resource or its representations? Simple examples are used to explain the tradeoffs involved in employing such metadata in URIs." From the Conclusions: (1) It is legitimate for assignment authorities to encode static identifying properties of a resource, e.g. author, version, or creation date, within the URIs they assign; this may contribute to the unique assignment of URIs. (2) Assignment authorities may publish specifications detailing the structure and semantics of the URIs they assign. Other users of those URIs may use such specifications to infer information about resources identified by URIs assigned by that authority. (3) The ability to explore and experiment is important to Web users. Users therefore benefit from the ability to infer either the nature of the named resource, or the likely URI of other resources, from inspection of a URI. Such inferences are reliable only when supported by normative specifications or by documentation from the assignment authorities. In other cases, users should be aware that their inferences may be incorrect and that the effect could even be malicious. (4) People and software using URIs assigned outside of their own authority should make as few inferences as possible about a resource based on its URI. The more dependencies a piece of software has on particular constraints and inferences, the more fragile it becomes to change and the lower its generic utility.
See also: other TAG Findings
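
As a small illustration of conclusion (4), the hypothetical Java sketch below pulls a filename-extension "hint" out of a URI path (the URI is invented for illustration); in line with the finding, such an inference is only a guess unless the assignment authority has documented the convention.

    import java.net.URI;

    public class UriHintSketch {
        public static void main(String[] args) {
            // Invented URI, used only for illustration.
            URI uri = URI.create("http://example.org/reports/2007/summary.html");

            // Reading a format "hint" out of the path is a common inference, but it
            // is reliable only if the assignment authority has documented it.
            String path = uri.getPath();
            int dot = path.lastIndexOf('.');
            String hint = (dot >= 0) ? path.substring(dot + 1) : "(none)";

            System.out.println("Format hinted by the URI: " + hint);
            // Authoritative metadata still comes from the server (for example the
            // Content-Type header of the response), not from the shape of the URI.
        }
    }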

Working XML: Understand the Various Approaches to XML Parsing
Benoit Marchal, IBM developerWorks
Where does all XML processing start? With parsing. Parsing is probably the most fundamental service available to developers: the parser reads the XML document, decodes the syntax, and passes meaningful objects to the application. The parser might also provide additional services such as validation (making sure the document conforms to an XML Schema or a DTD) or namespace resolution. This article introduces the various approaches to parsing and highlights their pros and cons to help you decide on the tools for your next XML project. It includes links to more articles so that, once you decide on a tool, you can study the technicalities of a given API. Granted, XML is not a very complex syntax, so you can be forgiven for thinking that you can hack your way through with regular expressions or other ad hoc means. In practice that seldom works: XML syntax requires support for multiple encodings and many subtleties, such as CDATA sections and entities. Home-made implementations almost never cater to all these aspects, and they create incompatibilities. Conversely, the parsers that ship with development environments were tested with an eye toward compatibility. Because the main reason to adopt a standard syntax like XML is to be compatible with other applications and toolkits, this is one case where it really pays to use a well-tested library. Most parsers offer at least two different APIs, typically an object model API and an event API (also called a stream API). The Java platform, for example, ships with both DOM (Document Object Model) and SAX (Simple API for XML). Both sets of APIs offer the same services: decoding of the document, optional validation, namespace resolution, and more. The difference is not in the services but in the data model used by the API. The API you use to read an XML document has a significant impact on the overall performance of your application, so take the time to familiarize yourself with the options and choose the best one for your platform, your programming language and, more importantly, your project. Generally speaking, event APIs consume fewer resources and can therefore be more efficient, but if you store the whole document in memory anyway, an object API is a better choice because it saves a lot of coding.
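
To make the contrast concrete, here is a minimal Java sketch using the DOM and SAX parsers that ship with the Java platform. It reads the same tiny made-up document both ways: DOM builds the whole tree in memory and lets the application query it, while SAX streams events to a handler and keeps nothing unless the handler does. The document content and class names are invented for illustration.

    import java.io.StringReader;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.w3c.dom.Document;
    import org.xml.sax.Attributes;
    import org.xml.sax.InputSource;
    import org.xml.sax.helpers.DefaultHandler;

    public class ParsingStyles {
        // Tiny made-up document, inlined so the example is self-contained.
        private static final String XML =
            "<catalog><book id='1'><title>XML Basics</title></book></catalog>";

        public static void main(String[] args) throws Exception {
            // Object-model (DOM) style: the parser builds the whole tree in memory.
            DocumentBuilder builder =
                DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document doc = builder.parse(new InputSource(new StringReader(XML)));
            System.out.println("DOM: title = "
                + doc.getElementsByTagName("title").item(0).getTextContent());

            // Event (SAX) style: the parser pushes callbacks as it reads; nothing
            // is retained unless the handler chooses to keep it.
            SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
            parser.parse(new InputSource(new StringReader(XML)), new DefaultHandler() {
                public void startElement(String uri, String localName,
                                         String qName, Attributes attrs) {
                    System.out.println("SAX: start of <" + qName + ">");
                }
            });
        }
    }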

New Year, New Look For PC-BSD
Sean Michael Kerner, InternetNews.com
Linux isn't the only open source operating system vying for the desktop; BSD, in the form of the PC-BSD effort, is too. PC-BSD builds on the FreeBSD 6.1 base to deliver an operating system more tailored to desktop users. Among the improvements in PC-BSD 1.3 is a new install wizard that allows multiple users to be added at installation time. The new install wizard also provides advanced options for firewall, network, and storage partitions. The new PC-BSD also sports a new look and a new base system using the KDE 3.5.5 desktop. Developers have also updated the Hardware Abstraction Layer (HAL) in version 1.3 for improved hardware access, but HAL isn't necessarily entirely reliable for PC-BSD 1.3 just yet: the release notes warn that this is the first release of PC-BSD that incorporates HAL support for the media backend. The PC-BSD project was acquired by iXsystems; since an open source project in and of itself can't really be acquired, what iXsystems actually acquired was intellectual property and copyrights related to the development of PC-BSD and associated domains, as well as the brains behind the operation, according to Matt Olander, CTO of iXsystems. Kris Moore, founder of the PC-BSD project, is one of those brains and is now working full time on the project.

Opinion: The Free Multimedia Opportunity
Neil McAllister, InfoWorld
With Windows Vista on the horizon, the fate of desktop Linux could rest on open media formats. As 64-bit processing becomes mainstream, the next major computing platform shift is due to arrive by 2008. If the open source community doesn't step up to the plate and address major impediments to widespread desktop adoption, Linux could be left behind. So say Eric S. Raymond and Rob Landley in their essay, "World Domination 201," published in November. The issue, they point out, is that Linux simply doesn't work out of the box for many of the things that an average computer user expects to do. Chief among these deficiencies is lack of support for many popular multimedia formats. They have a point. Multimedia has always been a difficult challenge for Linux, owing to the quagmire of proprietary codecs and file formats and the accompanying patents that protect them. Linux newbies would doubtless be surprised to learn that few distributions even bundle support for basic MP3 playback these days, out of fear of litigation. The war for free and open multimedia must be fought on two fronts. While it's important that Linux support all the capabilities that commercial operating systems offer, we must also continue to aggressively encourage the adoption of open formats and codecs wherever possible. Fortunately, we have been presented with an opportunity of the enemy's own making. Even as the computer industry is readying for a major platform shift in a few years, the content industry is likewise planning a major shift in how multimedia is delivered. New formats, including HD-DVD and Blu-ray, incorporate copy protection technologies that restrict how the content can be used but do nothing to improve the viewer's experience. Coupled with the draconian copy protection systems introduced with Windows Vista, these technologies make PCs less useful, less reliable, and more costly, according to an analysis by security expert Peter Gutmann.


XML.org is an OASIS Information Channel sponsored by BEA Systems, Inc., IBM Corporation, Innodata Isogen, SAP AG and Sun Microsystems, Inc.

Use http://www.oasis-open.org/mlmanage to unsubscribe or change an email address. See http://xml.org/xml/news_market.shtml for the list archives.

