Bryan's Blog 2007/04
No photos can do justice to this time of the year in the Chilterns, but we try:
NDG Access Control
I've wittered on about access control here for a while. Despite being frantic with various funding proposals (hence the silence), I've found time (with help) to knock out a description of NDG security. It won't make fun reading for those who like simplicity, but that's life: it's as simple as we can make it! James Snell only got one thing wrong in this statement:
Auth is and will continue to be the most significant issue with APP interoperability.
Why now? The 2007 e-Science AHM.
Practical Access Control using NDG-security
Access control in the NERC DataGrid (NDG) is accomplished using a combination of WS-Security to ensure message-level integrity, X509 proxy certificates to assert identity, and bespoke XML tokens to handle authorization. Access control decisions are handled by Gatekeepers and mediated by Attribute Authorities. The design of NDG-security reflects the reality of building a deployable access control system which respects pre-existing user databases of thousands of individuals who could not be asked to re-register using a new system, and pre-existing services that need to be modified to take advantage of the new security tooling. NDG-security has been built in such a way that it should be able to evolve towards the use of community standards (such as SAML and Shibboleth) as they become more prevalent and best practice becomes clearer. This paper describes NDG-security in some detail, and provides details of experiences deploying NDG-security in both the e-Science funded NDG and the DTI funded Delivering Environmental Web Services (DEWS) projects. Issues to do with securing large data transfers are discussed. Plans for the future of NDG-security are outlined, both in terms of application modification and the evolution of NDG-security itself.
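The Gatekeeper/Attribute Authority split in that abstract can be caricatured in a few lines of Python. This is only a sketch of the division of responsibilities: the class and method names below are my illustrative inventions, not the actual NDG-security API, and the real system exchanges signed XML attribute tokens rather than plain role lists.

```python
# Toy sketch of the Gatekeeper / Attribute Authority interaction.
# All names are hypothetical illustrations, not the NDG-security API.

class AttributeAuthority:
    """Maps authenticated identities to the roles their home site asserts."""
    def __init__(self, role_map):
        self._role_map = role_map

    def get_roles(self, identity):
        # In NDG-security this would be a signed XML attribute token;
        # here it is just a list of role names.
        return self._role_map.get(identity, [])


class Gatekeeper:
    """Grants access to a resource if the caller holds a permitted role."""
    def __init__(self, attribute_authority, permitted_roles):
        self._aa = attribute_authority
        self._permitted = set(permitted_roles)

    def is_authorized(self, identity):
        # Decision point: intersect the asserted roles with those the
        # resource's policy permits.
        return bool(self._permitted & set(self._aa.get_roles(identity)))


aa = AttributeAuthority({"alice": ["badc-registered", "coapec"]})
gk = Gatekeeper(aa, permitted_roles=["coapec"])
print(gk.is_authorized("alice"))  # True
print(gk.is_authorized("bob"))    # False
```

The point of the separation is the one the abstract makes: the Gatekeeper sits in front of a pre-existing service, while the Attribute Authority wraps a pre-existing user database, so neither has to be rebuilt from scratch.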
(Full paper: pdf)
Technical and social requirements for a putative AR5 distributed database of simulation and other data
It is highly unlikely that future large multi-model intercomparison projects involving multiple initial-condition and/or parameter ensembles from multiple institutions will be solved by centralised database solutions. Such centralised databases would need to have very high bandwidth to all possible data consumers, and require significant resources which would not be easy to obtain within existing national budgets. Fortunately, solutions to the problem of intercomparison that involve distributed data holdings with common metadata structures and interfaces are possible. There are already a number of possible components of such a solution deployed in a variety of institutions. However, a number of issues would need to be addressed before such solutions could be joined together to provide seamless access to data on an international scale. The issues range from agreeing on which technical solutions (the plural is important) should be used, to establishing trust relationships which could be supported not only by the scientists in the institutes involved, but also by their network and computer security administrators. Given that technical developments will most likely continue to be driven by individual funding programs, not by an overall project with internationally generated and agreed requirements, it will be important to understand that success will most likely depend on agreeing common interfaces and information models, not on deploying the same technology throughout.
US National Weather Service is ahead of the game
Over two years ago, I was pleased to note that the US National Weather Service provided forecast data in XML via a SOAP interface. In the intervening period they've moved on considerably: they now have a WSDL interface to their SOAP service, and a new WFS interface as well (hat tip: John Caron).
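For anyone tempted to play with the XML side, here's a sketch of pulling a temperature series out of the kind of document the service returns, using only the Python standard library. The sample fragment is schematic, in the spirit of the NDFD's DWML format rather than a verbatim response, so check element names against a real reply before relying on them.

```python
# Parse a DWML-style forecast fragment with the standard library.
# The SAMPLE document is a schematic illustration, not a real NWS response.
import xml.etree.ElementTree as ET

SAMPLE = """\
<dwml>
  <data>
    <parameters>
      <temperature type="maximum" units="Fahrenheit">
        <value>58</value>
        <value>61</value>
        <value>57</value>
      </temperature>
    </parameters>
  </data>
</dwml>
"""

def max_temperatures(xml_text):
    """Return the maximum-temperature values from a DWML-like document."""
    root = ET.fromstring(xml_text)
    # ElementTree's limited XPath supports attribute predicates like this.
    temps = root.find(".//temperature[@type='maximum']")
    return [int(v.text) for v in temps.findall("value")]

print(max_temperatures(SAMPLE))  # [58, 61, 57]
```

In practice you'd fetch the document from the SOAP (or now WFS) endpoint first; the parsing step is the same either way.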
If I ever find time I might have to build some toys to interface with their data!