NGS Innovation Forum 2008

I was recently at the NGS (National Grid Service) Innovation Forum 2008 to find out what existing users of the NGS were doing and to see the reaction to future plans for NGS Phase 3.  The first, very encouraging, point was that there were more users there this year than in previous years.  Secondly, these users were more diverse, with representation from researchers, e-science centres and support functions for researchers such as IS and research computing directors.

Day one started with presentations from researchers working on biomolecules in biology and physics, with representatives from other research areas amongst those at the event.  It has been particularly encouraging for JISC as a funder to see the NGS move from providing resource predominantly for those in the 'hard' sciences, such as physics and chemistry, to greater provision for those in the social sciences and the arts and humanities.  One message remains, however: if the NGS is to attract more users from a wider range of disciplines, it needs to offer easy-to-use alternatives to the command line for accessing the service.  The benefits are very tangible, with one presentation reporting that modelling time had been brought down from one month to six hours.

Michael Wilson then described how EGI (the European Grid Initiative) could involve the NGS amongst others, and sparked off a very lively debate on who would take the NGI (National Grid Initiative) role that EGI requires of the UK.  Whilst the UK and other countries have expressed an interest in EGI, there is still no firm commitment, and Michael's talk stressed that EGI was only a co-ordinating body for European provision of grid infrastructure, not a funding body for national facilities, as has previously been the case with bodies such as EGEE.  This meant there needed to be national commitment to ensure that the UK was appropriately represented.

From the European perspective we moved to Daniel Katz's presentation on TeraGrid, the American national grid.  A number of points in the presentation were particularly notable, the most interesting being the concept of Campus Champions.  Campus Champions help promote TeraGrid and grid usage within their campus in exchange for attendance at TeraGrid meetings and a t-shirt!  More to the point, they are people who want to encourage grid usage and who work with those who are new to the grid to help them carry out their research more quickly or simply do new research.  It is something we see happen on an ad hoc basis in the UK, but it gives food for thought on how we get Phase 3 of the NGS to encourage new users.  Also of interest for me, with my access management hat on, was TeraGrid's experimental use of InCommon to access grid resources.

After lunch, the programme moved on to grid technologies.  There was a good section on Condor for managing campus grids.  Whilst grids within an institution often don't get much attention, they form a vital part of the infrastructure available to researchers.  Hugh Beedie also pointed out that they could be a very effective green alternative to high performance computing, especially given modern machines' power efficiency.  Next up was a session on Clearspeed from Steven Young, who described the four maths acceleration cards of this type now installed at the Oxford node of the NGS.  At this stage there isn't much use of them, but they look promising for maths-intensive jobs.

The day finished with presentations on the training available on the NGS (from David Ferguson) and Andy Richards talking about NGS Phase 3.  Both provoked lively debate from the audience, and there was a great deal of interest in David's offer to run training on a regional basis.  If you couldn't attend the event and you are reading this, get in touch directly with the training team to find out about courses at http://www.nesc.ac.uk/training/.

Day two was a chance to tie up with the campus grid SIG and to look at the experience of those who had joined the NGS.  The overall conclusion seemed to be that whilst it wasn't easy to set up the software, the NGS had a very active support community that made the whole process easier, and that there were tangible benefits from going through it.  This led into how to make the NGS sustainable, which follows the general trend at JISC of projects moving to become services.  It was a topic that received a good deal of audience feedback, and I am hoping it can be followed up after the event, as it is not going to be easy to keep access as simple as possible whilst making sure that institutions are appropriately recompensed for what they contribute.

The day finished with presentations on new directions for the NGS.  Keir Hawker went through what data services were on offer, with a range of options from Oracle through to MySQL.  Mike Jones then went through how the SARoNGS project was working to allow users who were members of the UK Access Management Federation to get access to NGS resources.

So, what were the key points to take away from the meeting?  I think they were:

  • Research is global and the grid offers a good way of working collaboratively within a trusted infrastructure.  It will be interesting to see how this ties into ongoing work on interfederation and virtual organisations in the identity and access management area;
  • The NGS has a great deal to offer the researcher and they are very keen to engage with active researchers to help them carry out novel research or to make what they do more efficient;
  • There are no doubt potential users of the NGS who could benefit enormously from using it so it is well worthwhile attending a training event or one of the e-Research Roadshows to find out more;
  • Whilst there are resources to try the NGS that are free at the point of use, this model will not scale infinitely so there need to be equitable models for sustainability;
  • There is a growing community of researchers from an increasingly wide range of disciplines but there still needs to be a focus on growing that further;
  • The institution needs to get involved in helping its researchers access grid facilities as more and more research is collaborative in nature.  This means providing access not just to the NGS but also to grid resources on campus, so that researchers have a range of resources available to them.

All in all, it looks to be an exciting future for the NGS.  The next major decision point is whether the JSR approves the Phase 3 proposal.  My thanks to Andy Richards and the team at the NGS for a great event, for inviting me along, and for paying for my accommodation.

New Research Project: Privacy Value Networks

I spotted this in the Oxford Internet Institute newsletter; it may be of interest to those looking at privacy and identity.

The OII is to lead the £2m Privacy Value Networks project: one of three awarded funding under the Technology Strategy Board’s ‘Ensuring Privacy and Consent’ research programme. It will investigate the way the public thinks about privacy and how organisations can model the costs and benefits of processing personal information.

Project website: http://www.pvnets.org/

Project PI Dr Ian Brown said: "Privacy has become a major issue in the UK, with worries about the development of a surveillance society. We are delighted to have this opportunity to carry out research that will ensure businesses and government agencies fully understand privacy concerns, and can provide effective and efficient services that properly deal with them."

The project will look at privacy in a range of contexts. These include creating a sensor-enhanced Facebook to help understand how students might share or restrict automatically gathered information such as their location, current companions and activity. Researchers will also investigate how families share this type of information using a new mobile phone application, and how it might be used to improve the lives of children and the elderly while protecting their privacy and autonomy.

The project will look at the government’s own use of sensitive personal information in the Identity and Passport Service, and how it is interpreted by staff and passport applicants. It will also work with financial institutions to design privacy-friendly services that reduce the financial exclusion of those with limited or damaged credit histories.

Given it is sponsored by the TSB, who are doing quite significant projects in this area, I think it is one to watch over the next few months.  I feel it has some interesting tie-ins with projects such as FLAME and is going to provide useful input into the future work that JISC are looking to do on identity.

Grant 10/08: Project to Develop an Identity Toolkit

The title above makes this sound a little complex, but I'm really looking forward to some good responses to this grant (it started off as a call but has now moved into our new money-issuing process, hence the different name).  More details can be found here.

For those with quite long memories, the background to this was to take up a recommendation from the Identity Project and provide funding for the development of an identity toolkit that would help universities and colleges put an identity infrastructure in place.  Some institutions have already done work of this kind; Cardiff, for example, has done a good deal in this area.  What this grant aims to do is bring that experience together in one place, so that everyone can use it either a little or a lot, depending on where they are in the cycle of managing identity.

We’re hoping this is going to be a very useful piece of work as more and more institutions are joining the federation and having to address the subject of identity management as part of moving to using the federation to control access to resources.  Whilst it is not going to be a panacea it should form an important part of the future work on identity and access management that is going to go ahead over the next few years.

30-5-10

I visited Paul Walk over at UKOLN recently to talk about Shared Infrastructure Services (SIS), amongst other things, and one idea that came out of that discussion was the 30-5-10 idea.  I'll set a bit of background before ploughing into the idea itself.

Most projects at JISC do some really useful stuff that researchers, educationalists, developers and a whole range of other audiences can take and use for themselves (in response to the cynics, we also do really useful stuff on other projects that can't necessarily be used straight away but helps get us along the road to things that can ;-)).  The problem we often face is that the stuff we produce isn't used, because it isn't communicated in quite the right way or the target audience isn't aware of it.  As a programme manager that can get very frustrating, because sometimes you see an inferior alternative widget being used simply because the project staff, or the organisation they work for, are better at promoting it.

So, we come to 30-5-10.  It's intended for software or services that can be quite easily demonstrated, so good core candidates are some of the SIS projects and projects like NaCTeM.  The idea is this:

  • 30 seconds to get across what your project or the service(s) within your project do.  This could be used at a JISC meeting, when you're at a conference or wherever you meet other people who might be interested in what you are doing.  The reason for 30 seconds is that within that time you should be able to get across what your project or service does in a sufficiently compelling way that it piques the interest of those who may want to use it, so that they want to know more.  So, if we take NaCTeM's Termine service, the 30 seconds could go something like: 'Termine is a service supplied by the National Centre for Text Mining in Manchester to extract meaningful terms from a piece of text or a corpus of texts submitted to it.  It uses advanced text mining techniques to ensure that those terms are very accurate relative to the area the body of text came from.  Termine also ranks the occurrence of terms.  Possible uses include automated metadata extraction to tag the articles submitted.'  I'm sure that if someone from NaCTeM sees this they will have a few corrections, but it gives you an idea of what you would say;
  • 5 minutes to outline how to solve a problem your audience has.  So you have the person's or audience's interest.  What next?  You have a dialogue with them to understand how your widget could solve a problem they have, which makes what you have done relevant to them.  This involves actively listening to what they say, so that they spend more time talking than you do.  There's a lot on active listening on the web so I won't try to cover it here, but if you're asking open questions like 'What kind of things that you're doing do you think my widget would be useful for?' as opposed to 'Do you think this is useful?' then you're off to a good start; try to ensure you're not asking questions that have yes or no answers.  In my text mining example above, I'm a stressed new programme manager who hasn't much time to understand the background to committee papers, so term extraction helps me by pulling out the key terms that I can then research on the web, making me seem knowledgeable (well, more so than Sarah Palin 😉 );
  • 10 minutes to set up a quick demo that produces results.  Even if your service or project is quite complex and has lots of configuration options, you need to have something a developer can integrate pretty quickly, and 10 minutes is a good target.  My term extraction example above is a little unfair; I can submit text online and get answers in substantially less than 10 minutes, but it would be good if I could do that in a RESTful way, which I can't currently (see the sketch after this list).
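To make that idea concrete, here's roughly what I'd hope the 10-minute developer experience could look like: a minimal Python sketch of submitting text to a term extraction service over HTTP.  The endpoint URL and the response format are entirely my own invention (Termine doesn't offer this interface, as far as I know); the point is how little code should stand between a developer and a first result.

```python
# A sketch of the hoped-for 10-minute RESTful integration. The endpoint
# URL and the JSON response shape are hypothetical, not Termine's real API.
import json
import urllib.request

TERM_SERVICE = "https://example.org/termine/api/extract"  # hypothetical

def extract_terms(text: str) -> list[dict]:
    """Submit a piece of text and return ranked candidate terms."""
    payload = json.dumps({"text": text}).encode("utf-8")
    request = urllib.request.Request(
        TERM_SERVICE,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["terms"]

if __name__ == "__main__":
    # Assumed response shape: {"terms": [{"term": ..., "score": ...}, ...]}
    for term in extract_terms("Committee paper on grid middleware and VOs"):
        print(term["term"], term["score"])
```

If a developer can paste something of that size into their own code and see ranked terms come back, the 10-minute target has been met.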

So there it is.  I'd welcome comments from projects or others about how doable or sane this is, but please bear in mind that the whole premise is to get potential users quickly to a point where they have experienced your solution and are interested in taking it further.  They are then likely to have the patience to get to grips with that SOAP interface or spend a little more time discovering the nuances of what you've put together.

TERENA NRENs and Grids Meeting, September 2008

Introduction

I recently attended the NRENs and Grids Meeting in Dublin, kindly hosted by Trinity College.  It gathered together a European audience of those involved in providing national education networks (hence the NRENs bit) and those involved in developing grid software and hardware.  The JISC interest in this event was that we are currently working on a number of projects and programmes with a grid related element (such as the e-Infrastructure programme and new work that we are currently formulating under the capital programme).

The programme for the event can be found here, and the slides from the presentations are linked next to each programme item.  I'll not repeat what is on the slides in this blog entry; I'll just point to the presentations of particular interest and comment on why I found each one interesting.

Day One – Grids

The first day focused on developments in grids.  The session on eduGAIN was particularly useful in covering how eduGAIN works; it's quite a complex system but very effective, so I'd recommend the presentation as a 101 if you're new to it.  Items of interest were that eduGAIN is going to review using Shib 2.0 and that future developments include non-web-based apps.  Both are areas that JISC is actively involved in, so it would be worth following what is being done in eduGAIN.

The next presentation looked at easing access to grids via identity federations.  This was of special interest as we are currently doing the same thing through the SARoNGS project.  It meant we had quite a lot to share with the group, and after the coffee break Jens Jensen and I gave a short presentation on what we are doing under SARoNGS, receiving some useful feedback and some good contacts for sharing software resources and use cases.  My feeling is that this is a useful area in which to link up with other European countries, as there are common problems that can be addressed more quickly and effectively by multiple groups than by one group on its own.  For example, the SARoNGS solution is constrained by UK Federation policy on passing a unique user name and sharing information between service providers, meaning it cannot be IGTF compliant and is a little less secure.  Norway has similar issues, and we resolved to review what could be done in terms of a possible future change to policy that would allow a better technical solution whilst still meeting the original goals of that aspect of the policy.  I also talked with Christoph Witzig of SWITCH, and there is potential to work with them on aspects of MyProxy to make interoperability easier.

Authorisation developments in grids proved to be an interesting presentation by David Kelsey, giving an insight into future work under EGEE.  The main messages were that funding for EGEE is being scaled back, leading to much more focus on the specific elements of the infrastructure that need tuning, and that there is now an expectation from the EC that member states will fund grid work.  The reduction in funding has meant that technical work on middleware has been reduced, with a shift towards the authorisation framework and an analysis of how authorisation could be more effective.  There is a broader desire to have a common policy for VOs, which would mean that trust in them could be brokered in a similar way to IGTF.

To wrap up the day, there was a discussion session on what we all felt would be important to address around grids.  The overwhelming part of the discussion focused on levels of assurance, something we have already looked at under the ES-LoA and FAME-PERMIS projects at JISC.  The overall agreement was that this area needs to be addressed to allow new users onto the grid at a lower level of assurance, such as those with a federated ID as opposed to a digital certificate.  It's going to be interesting to see what happens over the next year or so as members of the group grapple with this issue.  There was also some discussion on attracting more and new users to grids.  It was generally agreed that we need to lower the bar slightly for those outside the traditional disciplines that use the grid (such as particle physicists and computational chemists).  Current initiatives in Europe suggest that many have joined JISC in looking at how this could be done and have been successful, SWITCH being one of the early ones with its IGTF-compliant VASH and SLCS solution.

Day Two – Virtualisation

Virtualisation is something we looked at previously under the NGS, but the time was not quite right.  Day Two showed plenty of evidence that it may now be time to go back to this area under the new round of capital funding to see what we can do.

Cloud Computing for On Demand Resource Provisioning looked at one potential method of providing virtualised resources in a grid environment.  The concept was to have a virtualisation layer that separates the virtual machine from its physical location.  Ignacio Martin Lorente explained how the University of Madrid was trialling OpenNEbula to do this, bringing into use machines that had previously not been on the grid as well as allowing for burst traffic by using resources such as Amazon EC2.  I won't try to explain how the whole thing works; it's much better explained in Ignacio's slides.  Setting up VOs on these virtualised resources can take as little as 20 seconds for a standard setup, meaning that environments can be set up and maintained easily without having to rely on being on a physical server.  Ignacio finished his presentation with a look at the RESERVOIR project under the EU Framework Programme, a three-year, 17m euro project to build a next-generation infrastructure for service delivery.  I think both of these projects are of interest to JISC, and it was useful to have examples of how virtualisation could work within an institution alongside a broader initiative to get cloud computing working across Europe.
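For those who like things concrete, the 'burst' decision at the heart of this model is simple enough to sketch.  This is just the shape of the logic in Python, not OpenNEbula's actual scheduler: place a virtual machine on a local physical host while there is capacity, otherwise lease capacity from an external cloud such as Amazon EC2.

```python
# Toy illustration of cloud bursting: prefer local physical hosts, fall
# back to leased cloud capacity when the local cluster is full. This is
# illustrative only, not how OpenNEbula actually schedules VMs.
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    free_slots: int  # how many more VMs this physical host can take

def place_vm(local_hosts: list[Host], vm_name: str) -> str:
    """Return the placement decision for one virtual machine."""
    for host in local_hosts:
        if host.free_slots > 0:
            host.free_slots -= 1
            return f"{vm_name} -> local host {host.name}"
    # All local capacity is in use: burst to the external cloud.
    return f"{vm_name} -> leased EC2 instance"

if __name__ == "__main__":
    cluster = [Host("node01", 1), Host("node02", 0)]
    for i in range(3):
        print(place_vm(cluster, f"vo-worker-{i}"))
```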

The presentation on the Challenges of Deploying Virtualisation in a Production Grid covered pretty much what it said on the tin.  Stephen Childs went through how Grid-Ireland had introduced virtualised environments into their grid using the open-source Xen hypervisor.  He also covered the results of a survey he had carried out on virtualisation.  The key points to come out were:

  • It is important to treat a virtualised environment in a production grid in exactly the same way as any other production environment.  Some of the virtual machines are going to be up for a long time, so they need patches, etc. in the same way as any other physical server;
  • Virtualisation is gradually gaining ground, and now that there is a choice of VM software from commercial to open source, it is starting to become an activity that is engaged in across European academic institutions.  However:
  • This activity is currently on a trial basis as people get used to what is involved in provisioning VMs as opposed to physical servers;
  • There has to be an awareness of where I/O is critical as Xen is especially weak on this at the moment, meaning a virtualised server may not be the best solution;
  • There need to be solid use cases for implementing virtualisation and it must be used appropriately.  The two main reasons for not using virtualisation in the survey were management issues and performance;
  • A VM host does not behave in the same way as a physical host in all cases – there may be issues with compatibility even if the setup is exactly the same;
  • Monitoring is still quite flaky.

Finally, Stephen outlined how Grid-Ireland has used Xen to install, effectively, 'grid in a box', where institutions simply needed to host the box they were given and management was carried out by Grid-Ireland.  This was a neat solution for the institution but involved quite a lot of management overhead for Grid-Ireland.

I thought this was a good presentation and Stephen is a useful person to talk with further about virtualisation (as further discussions over coffee proved).  He is going to look at putting the survey into a PDF format so that the results can be shared with others.

The remaining presentations covered physical infrastructure so, whilst interesting, were not quite as relevant to what we are doing in Innovation Group.

The final discussion covered future topics and certainly one that we raised was accessing data on the grid, which we are doing quite a lot of work on under the e-Infrastructure programme.

All in all, I think this is a useful group to keep in touch with as the topics they are addressing are ones that we are either currently working on or are interested in for the future.  The event provided a good opportunity to meet with others working in the same areas and share experience as well as get pointers to resources that we could use at JISC.

My thanks go to our hosts at Trinity College in Dublin, who worked very hard to make sure the event ran smoothly, with particular thanks to John Walsh for booking an excellent venue for dinner and being on hand to offer local knowledge (he even guided us back to the hotel from the restaurant!).

Ubiquity

There’s quite a lot of buzz around Ubiquity at the moment, which is probably most simply described as an attempt by Mozilla to take the mashup out of the domain of the web developer and into the hands of the user.  The product allows a user to create their own mashups without having to be fluent in web scripting and coding; all they need to do is install the appropriate client on their browser (currently Firefox only) and then type in what they want to do.

The applications in the demo are fairly simple at this stage, but it's easy to see how they could have quite a lot of use in education, helping take the drudge out of some common tasks and opening up what we can do by combining services.  As an ex social scientist I seemed to spend quite a lot of time combining stats and then displaying them on a map; it would be great if I had a 'widget' that would do that for me and take some of the spadework out.  That then frees me up to do a bit more of the interesting research that I really want to do.

Add a little more and it's a tool that could become extremely useful.  It's all released under an open source licence, so there is potential for Greasemonkey-style extensions that take it further.  We are slowly and painfully seeing the freeing up of data under Open Access and, as a result, a revival of the citizen scientist (see here).  Then we have tools and standards such as OAuth and OpenSocial that allow us to selectively release data about ourselves and grant permissions so that these services can do something for us.

Ultimately, I think it’s worth watching what Ubiquity is doing over at Mozilla Labs because it could start opening up some mainstream avenues for really useful mashup tools that save the researcher and educationalist a lot of time and let them get on with what they’d like to do.

Verisign PIP

Saw this on TechCrunch today and was intrigued.  OK, you are effectively maintaining an identity vault, but it further supports yesterday's post: the bigger vendors are starting to get into identity metasystems, often in a variety of ways.  Given they want to see these succeed commercially, maybe this will be the year when identity starts to get a little easier rather than more complex.

The downsides of Verisign's PIP (Personal Identity Portal) are that it still seems quite US-focused, you have to have an active browser session with PIP for it to work, and there is a limit to which sites it will manage details for.

The upsides are that it works with most of the main commercial sites (such as Amazon, Facebook and LinkedIn), you can have two-factor authentication if you so wish, and it's Verisign, so they've got a good background in dealing with security and trust.

In sum, another useful tool in the armoury of identity for the educationalist and researcher, even if it’s not going to be somewhere to store your federation credentials or that digital certificate to get at Grid resources.

Call for Participation: OASIS Identity Metasystem Interoperability (IMI) TC

One of the latest calls for participation that came my way was this one for Identity Metasystem Interoperability.  I'll fess up now and say this has been sitting in my inbox for a while waiting for me to have a look through it, hence this entry not being quite as current as it could be.

Firstly, what is an identity metasystem?  A good definition can be found (as always) at Wikipedia.  In brief, an identity metasystem lets a user manage all their identity credentials in one place.  So, if I'm a researcher with a digital certificate, a federation login and a username and password for a wiki or blog, I can manage them all from one interface instead of having to remember each set of details.
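To make the definition concrete, here is a toy Python picture of that 'one interface over many credentials' idea.  It is purely illustrative; real metasystems such as CardSpace or Higgins exchange standardised security tokens between identity providers and relying parties rather than storing raw secrets like this.

```python
# Illustrative only: one interface over many credential types. Real
# identity metasystems exchange security tokens; they don't keep a flat
# list of secrets like this toy vault does.
from dataclasses import dataclass, field

@dataclass
class Credential:
    kind: str     # e.g. "x509-certificate", "federation-login", "password"
    service: str  # what it grants access to
    secret: str   # placeholder for the actual key material

@dataclass
class IdentityVault:
    credentials: list[Credential] = field(default_factory=list)

    def add(self, credential: Credential) -> None:
        self.credentials.append(credential)

    def for_service(self, service: str) -> Credential:
        """The single interface: name a service, get the right credential."""
        return next(c for c in self.credentials if c.service == service)

vault = IdentityVault()
vault.add(Credential("x509-certificate", "ngs-grid", "..."))
vault.add(Credential("federation-login", "e-journals", "..."))
vault.add(Credential("password", "project-wiki", "..."))
print(vault.for_service("project-wiki").kind)  # -> password
```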

So what does this mean?  Well, last year we at JISC put out an ITT for some work looking at exactly this area and its applicability to higher and further education.  We felt at the time that there was a great deal to be gained from finding appropriate identity metasystems to manage identity for those in education and research, as we're all conscious of the ever-increasing number of identity credentials we are given.  We didn't get any responses we could fund, so it was put on hold until there was more capacity in the sector to respond.

OASIS's move to form the group is worth a look because it shows a wider interest in getting this working, after quite a lot of effort from Microsoft to promote CardSpace and infocards.  There is also the work of the Higgins project and Bandit's DigitalMe, and previous efforts, such as a demonstration at a Burton identity event, to show interoperability between all these systems.  Is now the time when identity metasystems will start being used rather than just being shipped with one of the most-used operating systems?  I think time will tell, and that users are taking quite a while to get used to this new thing called identity.  In the meantime, I hope that the TC on identity metasystems is a diverse one that reflects the needs not only of Microsoft but also of a wide range of users, including those in education and research.

Yahoo Fire Eagle Launched

Given the amount of buzz over 'the next big thing' in Web 2.0 (or are we now moving to Web 3.0?), which appears to be geo-location, it was inevitable that one of the bigger established players would soon launch a platform.  Hence Yahoo's Fire Eagle didn't come as much of a surprise when it launched.  As with all apps that take personally identifiable information, it lets you control how you manage your data and what you share.  In this case you can update the service with where you are, and that can then go to other services, such as BrightKite, that actually use the information.  BrightKite is probably a good example, as it allows users to interact based on where they are and what they are doing, and it also pushes location data back to Fire Eagle.  Sites like Dopplr are also on board, so you can share information about where you will be that can propagate across sites rather than being trapped in one site.
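The broker pattern at work here is simple enough to sketch.  The Python below uses made-up endpoint paths and skips the OAuth signing that the real Fire Eagle API requires; it just shows the two halves of the flow: something (a phone, say) pushes the user's location to the broker, and a consuming service reads it back with the user's permission.

```python
# Sketch of a location broker in the Fire Eagle mould. Endpoints are
# hypothetical and the real API's OAuth request signing is omitted.
import json
import urllib.parse
import urllib.request

BROKER = "https://example.org/location"  # stands in for Fire Eagle

def update_location(user_token: str, place: str) -> None:
    """Push the user's current location to the broker (e.g. from a phone)."""
    data = urllib.parse.urlencode({"token": user_token, "place": place}).encode()
    urllib.request.urlopen(f"{BROKER}/update", data=data)

def read_location(user_token: str) -> str:
    """A consuming service (a BrightKite, say) asks where the user is."""
    query = urllib.parse.urlencode({"token": user_token})
    with urllib.request.urlopen(f"{BROKER}/user?{query}") as response:
        return json.load(response)["place"]
```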

All this is great for the average busy researcher.  I can see where my colleagues are (providing they’re subscribing; big ‘if’) and arrange to meet up or they can contact me.  The mobile phone service is especially interesting as it simply pushes where I am to my services and there is no need for me to do anything.  Suddenly my social network becomes a hell of a lot more interesting and I’m meeting new colleagues who have similar interests and are in the same location.

The downsides are the usual ones for personally identifiable information (PII).  I'm now not just giving up information on what I am interested in but on where I am, and if that's being pushed out to a variety of services then they have that information too.  OK, they can promise to delete that information when I ask them to, and Yahoo are very good at giving the option of switching the service off, but that information is still out there in the public domain.  As we've seen recently with the Google/YouTube and Viacom legal case, once a user gives out their attention data into the public domain, it can have unexpected consequences.  In that case, attention data had the potential to become PII just by the sheer volume of it and its openness to data mining to create a unique profile.  Imagine what could happen with geo-location data, which has far more potential to uniquely identify an individual.

All in all, though, I think that geo-location services have a great deal of potential in higher and further education.  JISC now have quite an extensive geo portfolio; some of those services, such as Digimap, are already helping researchers, whereas others that are embryonic, such as GeoXWalk, are very close to providing a service.  Match up, say, GeoXWalk with a geo-tagging app such as Fire Eagle and location-aware instruments, and you can then start creating intelligent meta-tags for where data is created as well as when and with what.  That could create some pretty exciting new research with derived data, licence agreements permitting ;-).

First Look at Facebook Connect App

Facebook have published their first site that uses Facebook Connect.  Called RunAround, it allows runners to track their runs and involve their friends without having to add them manually to the site or fill out registration details.  It's great to see a practical application for Connect and to see some good privacy principles operating there as well.  A user has the option to register for the site and go down the site's registration process, or to use their Facebook details.  A user then actively consents to release information (in this case one-line stories) and brings with them the friends they have on Facebook who have already registered with RunAround.  It's early days yet, so I'm watching for other applications of Connect to see how it all pans out and how sites such as RunAround fare, but this all looks promising for limited disclosure of information to third parties that helps the user without breaching their privacy.

Another related development is Twitter definitely adopting OAuth, and Firefox likely to do so too (but straight into the browser).  For Twitter it will mean a much better way of allowing third-party apps to carry out actions on the user's behalf without the user having to hand over their username and password.  For Firefox it will allow browser apps to carry out actions on a user's behalf, which opens up something we've wanted to look at in JISC for a while: n-tier authentication and authorisation (even if at this stage it looks like being at one level).
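For anyone who hasn't met OAuth yet, the shape of the 'three-legged' flow is worth sketching, because it shows why the third-party app never needs the user's password.  The Python below is purely illustrative: the URLs are made up, the tokens are placeholders, and the request-signing step that the real protocol requires is elided.

```python
# Illustrative OAuth-style delegation: the app handles tokens the user can
# revoke, never the password. URLs and tokens are placeholders; the
# cryptographic request signing of real OAuth is omitted.
SERVICE = "https://example.org/oauth"  # stands in for e.g. Twitter

def authorise(app_key: str) -> str:
    # 1. The app asks the service for a temporary request token.
    request_token = f"request-token-for-{app_key}"  # placeholder

    # 2. The user approves that token at the service itself, logging in
    #    there -- never inside the third-party app.
    print(f"Visit {SERVICE}/authorize?token={request_token} and approve")
    verifier = "verifier-returned-after-approval"   # placeholder

    # 3. The app exchanges the approved token and verifier for an access
    #    token, which it stores and uses to act on the user's behalf.
    return f"access-token:{request_token}:{verifier}"

print(authorise("my-twitter-mashup"))
```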

A more interesting question is how people deal with these new capabilities.  We've already seen through the Identity Project and FLAME how identity is dealt with in FE and HE, and how users' attitudes to releasing personal information differ, as does their awareness of what they're doing.  DPIE 2 revealed that most users would like useful tasks done on their behalf with their personal information, such as registration details being filled in for them.  In a world with technologies such as Facebook Connect and OAuth, whilst we have the technology to allow users to retain their own personal information, do they necessarily know how to control it?  I think we need apps such as RunAround so users can get to grips with the technology on a fairly simple level and then do more as they feel more comfortable.  Hopefully we'll then be in a world where the user doesn't have to give up the crown jewels of identity and cede their username and password to be able to do simple tasks such as registration.