Saturday, November 5, 2011

Exploring the Dutch security ecosystem in one day!

My activities testing the UMA protocol gave me good insight into how companies specialized in identity management deal with these protocols.
The funny thing is, I had not yet looked at how IT-security companies view identity protocols like UMA, OpenID and OAuth. Functional testing and document reviewing is one thing, but penetration testing (pentesting) requires a different approach.
When I found out InfoSecurity Benelux 2011 was going to take place in Utrecht I registered and attended this exposition.
Why? To find out more about the possibilities in the Netherlands to learn and practise pentesting.
A mate and I spent a day exploring the Dutch security ecosystem, ranging from network to antivirus companies. And, more importantly, IT-security companies.
We visited stands, listened to keynotes and had valuable discussions with Dutch key players in IT-security.
Starting with the stands: they were organized like at any exposition, with the big network companies like Cisco having the biggest stands and the IT-security companies the smaller ones.
Also, like any ecosystem, companies (read: predators) were luring their customers (read: prey) with goodies, lovely ladies (yes, I saw those too) or an F1 racing car experience (seen that before).
In half an hour both our bags were full of security goodies and folders, and we had seen some very good-looking ladies (not only the promo girls).
Then it was time for business: exploring the pentest community.
Companies like Fox-IT (remember the DigiNotar-blog), Madison Gurkha (lockpicking isn't my thing :-) ) and Dionach were on our list and they did not disappoint us.
We also found out a lot of pentesting certifiers were there, like the already mentioned Dionach with their TIGER scheme, but also Certified Ethical Hacker (CEH) certifiers (TSTC) and 'free' online trainers (Certified Secure).
It reminded me of the earlier test exhibitions I visited, where visitors were blown away with the newest test approaches like ISEB, ISTQB, TMap and TestFrame.
IMHO, every approach has its advantages and disadvantages, and a good pentester should have sufficient knowledge of these different approaches to apply them when needed. However, we have to start somewhere, so more digging into this certification world will be necessary.
The afternoon was spent on listening to keynotes addressing recent security developments like the mobile banking facilities of a particular Bank, the security of social media and the history of PKI.
Very interesting stuff, and the presenters gave a clear insight in how they operate in their business with security.
Before we knew it, it was already 16:00 and the exhibition stands were being broken down. There was still one thing I had to do.
I had to visit the stand of CRYPSYS Data Security, a Dutch ICT security distributor for the Benelux with over 20 years of experience. And, more importantly, with a recent interest in my blog and tweets :-). So, I had to meet these people, although they're no pentest specialists.
Not wasted time, because CRYPSYS gave me a good understanding of how they do business and they were very patient with my questions. A company for me to watch and learn from.

Then it was over, a few drinks and back in the train going home.
It was a very interesting day at InfoSecurity Benelux 2011, discovering new challenges, learning interesting stuff and meeting great people.
Certainly a follow-up for 2012.

Thursday, September 22, 2011

A one stop NFC testing shop

As I expected a few months ago when blogging about Google Wallet
and NFC mobile payments, other companies are also venturing into the further development and implementation of this specific payment product.
One of the companies I have followed over the last months is Collis, a Dutch company with many years of experience in managing the introduction of new payment products.

Because testing is an important asset of Collis, I immediately thought of them when exploring the testing of mobile NFC payments.
For clarity: I have no commercial ties with this company, only enthusiasm for testing NFC mobile payments.
So, when following the news from the NFC World Congress, I found out Collis yesterday launched a Mobile Test Center for TSMs (Trusted Service Managers), which enables NFC solutions to be checked for compliance with specifications set by a wide range of industry bodies like MasterCard, VISA, but also the NFC Forum.
Not surprising, if you keep in mind this company does the same for credit card compliance for the already mentioned credit card companies, which are also huge stakeholders in the adoption of NFC mobile payments.
The NFC-TSM ecosystem is very complex, and trust is the key issue here. If its infrastructure is not trustworthy, it loses its stakeholders and it will be destroyed (compare DigiNotar and the digital certificate ecosystem).
Collis could work as a one-stop shop for testing all components of this ecosystem and contribute to the trust in NFC mobile payments, which could enhance its adoption.
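For fellow testers, here is a minimal sketch of what such spec-driven compliance checking could look like. The requirement IDs, limits and transaction fields below are made up for illustration; the real schemes (MasterCard, VISA, the NFC Forum) define their own requirements and test procedures.

```python
# Illustrative sketch of spec-driven compliance checking.
# Requirement IDs, limits and transaction fields are invented here;
# real schemes define their own requirements and test procedures.

REQUIREMENTS = [
    # (requirement id, description, check function)
    ("NFC-001", "amount stays within the contactless limit",
     lambda txn: txn["amount"] <= 25.00),
    ("NFC-002", "device responds within 500 ms",
     lambda txn: txn["response_ms"] <= 500),
    ("NFC-003", "PAN is never returned in cleartext",
     lambda txn: "pan" not in txn["response_fields"]),
]

def run_compliance_suite(transaction: dict) -> list:
    """Run every requirement against a recorded transaction."""
    return [(req_id, desc, "PASS" if check(transaction) else "FAIL")
            for req_id, desc, check in REQUIREMENTS]

sample = {"amount": 12.50, "response_ms": 310,
          "response_fields": ["token", "status"]}
for req_id, desc, verdict in run_compliance_suite(sample):
    print(f"{req_id}: {verdict} - {desc}")
```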

As a tester I agree with the approach of my Dutch colleagues at Collis, and I hope I can help them improve the quality and trustworthiness of the NFC mobile payment ecosystem.

Saturday, September 3, 2011

NFC-payments and PCI-compliance: a tester's adventure!

Summer 2011 is ending and the days are getting shorter in the Netherlands, so it's time to start blogging again.
This time I was in a dilemma: either report on the fraudulent Google-Iran DigiNotar certificate incident, or look at how NFC payments affect payment regulations and testing.
Well, because the former is still fresh and very much guesswork, I will share my thoughts on the theme that intrigued me this summer: testing mobile NFC payments.

So, where to start?
Why not first look at what testing methods already exist for payments, especially those focused on security.
I've been in the testing business for 8 years now, mainly for financial institutions, and I've seen a lot of compliance rules come by. One of these is for payment cards: the Payment Card Industry Data Security Standard, aka PCI DSS.
Hey, this seems like a good starting point for testing NFC payments with a contactless card or mobile phone.
Mind you, I have never tested this way; for the moment this is just my theoretical view on how to test NFC payments using the PCI DSS standard. And because it's a big quest, it will take a few blog posts to finish.
But what's PCI DSS and how does it relate to NFC payments?
First I have to find out what the purpose of PCI DSS is.
Its website says:

The PCI DSS is a multifaceted security standard that includes requirements for security management, policies, procedures, network architecture, software design and other critical protective measures. This comprehensive standard is intended to help organizations proactively protect customer account data.

Aha, OK, and are there any testing procedures an organisation should undertake to be compliant with the PCI security standards and get their benefits?
Oh yes: both vendors of PCI solutions and all entities that process, store or transmit account data must be validated for PCI compliance, except, according to Wikipedia, issuing and acquiring banks.
For vendors, PIN transaction security must comply with the requirements and guidelines specified in the following documents: a Device Testing and Approval Program Guide and the POI Modular Security Requirements.
The program guide reminds me of the Kantara Initiative interoperability testing programs I saw last year, so that experience comes in handy.
Like every testing program, it describes the purpose, the testing process in overview and in detail, and what to do if a security breach or compromise takes place. These are specialized security tests done by specialized evaluation labs like T-Systems, as seen on this list.
For organisations handling large volumes of transactions, validation of compliance is done annually by an external Qualified Security Assessor (QSA), or via a Self-Assessment Questionnaire (SAQ) for companies handling smaller volumes, like small webshops.
To avoid an SAQ, and lessen the burden, a webshop can outsource its credit card handling to a payment acquirer like PayPal. PayPal is then the one who must be PCI compliant, as long as the webshop does not store, transmit or process payment card information.
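To make these different validation routes concrete, here is a minimal sketch of the decision logic as I understand it. The transaction threshold below is illustrative only; the card schemes define the actual merchant levels and criteria.

```python
# Simplified sketch of PCI DSS validation routes per merchant.
# The threshold is illustrative; card schemes define the real levels.

LARGE_VOLUME = 6_000_000  # illustrative annual transaction count

def validation_path(handles_card_data: bool, annual_transactions: int) -> str:
    """Return which PCI DSS validation route applies to a merchant."""
    if not handles_card_data:
        # Card handling fully outsourced to an acquirer like PayPal:
        # the acquirer carries the PCI compliance burden.
        return "no direct validation; the acquirer must be PCI compliant"
    if annual_transactions >= LARGE_VOLUME:
        return "annual assessment by a Qualified Security Assessor (QSA)"
    return "annual Self-Assessment Questionnaire (SAQ)"

print(validation_path(False, 10_000))      # webshop outsourcing to PayPal
print(validation_path(True, 50_000))       # small webshop handling cards
print(validation_path(True, 8_000_000))    # large merchant
```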
This shows how complex the ecosystem is and how stakeholders are affected by PCI compliance.
How do NFC payments affect the relationship between PCI compliance and its stakeholders in the credit card industry?
IMHO, the primary change is the customer's method of authentication, but the underlying technology executing it should be PCI compliant. This means the device enabling NFC payments should be PCI compliant (meaning a different annual PCI compliance test for authentication for the vendor), and the same goes for the company or payment acquirer, if the credit card handling is affected.
Visa is even eliminating the requirement for US merchants (a European program is already in process) to annually validate their compliance with PCI DSS if 75% of the merchant's annual Visa transactions originate from chip-enabled terminals.
This is done to prepare the US payment infrastructure for NFC-based mobile payments. So, the NFC stakes are high for the credit card companies.
Not to forget, mobile payments also bring a new species (and not a small one) into the credit card PCI DSS ecosystem: the cell phone company.
It should also be PCI compliant, because it is part of the processing (I haven't seen a cell phone company as a customer of PayPal yet), and it can put the credit card bill on the phone bill or pay via an NFC chip embedded in the phone, like Visa's payWave or MasterCard's PayPass.

So, for a tester there is enough adventure in the credit card PCI DSS ecosystem. Different stakeholders, different chains and different tests to do. I look forward to it and will share my thoughts and experiences in this new ecosystem.

Saturday, July 9, 2011

A book review for a change: A clear look on Cloud Computing

A few weeks ago, Maurice van der Woude, Cloud Computing evangelist and fellow Dutchman, published a book: Een heldere kijk op Cloud Computing, Een onafhankelijke gids voor aanbieders, afnemers en twijfelaars (A clear look at Cloud Computing: an independent guide for providers, customers and doubters).
Unfortunately it's in Dutch, so my non-Dutch readers could be tempted to stop reading this blog post.
Understandable, but then you would miss my review in English.
So what's it all about then?
Maurice van der Woude, owner of Personal Consult, is a strategy advisor for (inter)national (corporate) organisations, specialized in Cloud Computing, especially SaaS.
I met Maurice at the launch of EuroCloud Netherlands in 2009, where, together with other Cloud Computing evangelists, he works to advance cloud computing in the Netherlands and Europe.
Over the last 2 years we have helped each other familiarize the public with Cloud Computing and identify possible risks and solutions.
So, when Maurice published the book, I bought it for a nice price and read it in about 2 hours, from a tester's viewpoint.
The book was created with the help of Nobel and EuroCloud Netherlands.
Its goal: to clarify the obscurities around Cloud Computing and to be a quick-reference book for the Dutch market, explaining in clear and practical language what Cloud Computing is and how we, as end-users, can use it. It is also an independent (!) reference work for the industry that wants to use Cloud Computing in its business. The author stresses that the book does not argue for or against using Cloud Computing; that is up to the end-user. I'm glad about this: enough books and blogs are already filling the news pool with cloud computing marketing, without adding anything!!

The following aspects of Cloud Computing (CC) are discussed in distinct chapters: Definitions; History; SaaS Models; CC and Politics; Producers; Processes; CC and Support; Sales; End-users; Data Security and Availability; Integration; Contract Management; Business or Technical; and 2 cases: Case 1: Foundation M, Crimefighting in the Cloud; Case 2: Be More Yourself, an organisation with ambition in the Cloud.

As a tester I consider interoperability an important issue too, but the author discusses it in the sales chapter, when describing vendor lock-in.
One issue here though: when I saw interoperability did not get its own chapter, I wanted to look it up in an index, but the book contains neither an index nor a glossary.
Perhaps this is because of the book's quick-reference function, but I still miss it, as the interoperability example illustrates.
Like I said, I read it in 2 hours; the book is written in clear Dutch and jargon is avoided, or explained where necessary (e.g. the different service models, the use of an SLA, or the laws around Cloud Computing).

It's a real quick-reference guide, explaining without going too much into detail and keeping the reader focused on the subject. An issue here is that when something is explained, the source is not always mentioned. Or does the author refer to the source list on page 1? But that is only a list, not an index of footnotes. This means the reader is forced to look for further reading on his own. A mental note for next time, perhaps.

It's written for Dutch industries and organisations, but it also touches on the USA, especially regarding laws (the Safe Harbour Principle). Companies are deliberately not mentioned by name unless unavoidable, like Apple's graphical interfaces in the eighties or the examples of the economic power position of, for instance, Microsoft.
Written from a strategist's view, it gives people guidelines for using Cloud Computing, like the checklist for avoiding vendor lock-in. Without being too positive or too negative, it stays independent.

The book ends by illustrating the use of Cloud Computing with two interesting cases, one in a business environment, the other in a nonprofit environment, highlighting the possibilities of Cloud Computing in these distinct settings.

After reading this book I had a better understanding of Cloud Computing and its use in the Netherlands. It's written in very accessible language, although I miss a detailed source list and a glossary, and I found a few spelling mistakes (the author may ask me for them :-) ).

A must-read for any Dutch business or IT professional interested in the use of Cloud Computing.

Thursday, July 7, 2011

UMA webinar 13 July: the draft specs, we got them!

Kantara Initiative's User-Managed Access Work Group aka UMA WG has announced the release of draft specifications for the UMA protocol.
UMA heralds a new era of user-centric access control for web-based applications such as social-networking sites, content-sharing portals and personal data lockers.
The UMA WG will demonstrate the capabilities of UMA in a public webinar on

July 13, 09:00 PDT / 12:00 EDT

All are welcome to attend.

Register for the webinar and find out more at the UMA WG homepage

Follow the group on Twitter: @UMAWG

And on Facebook: UserManagedAccess

We UMAnitarians hope to see you at the webinar!

Monday, June 6, 2011

Wave and Pay, your money away: it's the device that counts!

Infosecurity notes that UK banks are rolling out payWave and PayPass across London in preparation for the Olympics next year, when hundreds of thousands of visitors from around the world (many from Asia, where NFC payments are commonplace) will visit London with their cards. Next to this, Telefonica O2 has also announced plans to launch a mobile wallet system using NFC technology.
Hm, last week it was Google Wallet. By the way, Google Wallet links with MasterCard, O2 with VISA Europe.
Still, I'm more scared of using a mobile wallet app than an NFC-enabled credit card.
Why?
Wave & Pay with your creditcard is different in security than Wave & Pay via your smartphone app. Both creditcard and mobile wallet-app use NFC-technology, but your smartphone is, contrary to your creditcard, used for Internet browsing or accessing other data and applications and therefore is at significantly greater risk for exposure to malware.
What then if you let the software encrypt and transfer the data. According to Ira Winkler, president of the Internet Security Advisors Group , it's like putting an airbag on a motorcycle, the airbag (the encryption) may protect, but lots of other things can go wrong.

IMHO, mobile NFC(!) payments at this moment carry a higher risk than paying by cash, credit card or bank card.
All because the underlying device, the smartphone, is still not secure enough for these financial transactions. Just look at the Android infections at the beginning of this year.
Then again, European banks like Rabobank and ABN AMRO have already worked with mobile payments for years.
The USA should work more with their European counterparts on the security of mobile banking (banking and phones); then perhaps a secure app can be made, although even the secure element in Android is susceptible to reverse engineering.
Will it become a dream or a nightmare? Time will tell...

Sunday, May 29, 2011

Google goes NFC payments, oh la la!

When I was at IIW12, a presentation was given about the changing landscape in payments and banking.
PayPal was giving credit card companies like VISA and MasterCard a hard time keeping customers for their online payments via credit card.
Why use expensive credit cards when you have PayPal?
But the credit card companies are trying hard to keep their 'beloved' customers.
How? Well, they add NFC payments to the credit card landscape.
Users can pay for goods using NFC-enabled devices: either NFC-enabled phones with stored data that act as a debit/credit payment card (an example follows soon), or NFC-powered contactless payment cards they touch ('wave') against readers, like VISA's payWave.
However, American Express did not want to wait for the NFC-enabled devices and, in March 2011, launched "Serve", an app that turns a desktop, mobile phone and Facebook account into a virtual wallet. With Serve, customers can send and receive money, pay bills, or make digital purchases through a cloud-based peer-to-peer network.

Hm, lots of new online payment products, and lots to say about security and privacy, but while I was writing this blog post, Google came with an announcement.

All this NFC and mobile payment in the cloud also triggered Google to get involved.
So, on 26 May 2011 they launched Google Wallet (duh!!), together with Citi, MasterCard, First Data and Sprint as their partners.
Hm, MasterCard already had PayPass, but why not partner with Google to use its NFC-enabled Nexus S 4G?
Nothing new concerning NFC telephones, if you look at VISA's efforts and the ISIS project, but now Google is involved. OK, Google has its Google Checkout, but it is now also into NFC payments. For Sprint, this was the call to join Google Wallet and not ISIS.
Also important, because NFC payments adoption is higher in Europe than in the USA: Dutch public transport already uses an NFC-enabled card, comparable to the U.S. ORCA card, which I also saw in San Fran.
Hey, but wasn't the Dutch OV-chipkaart already hacked back in 2008?
That's why I was triggered when I saw the credit card companies using this technology!!
Even if Google and financial institutions are involved in the NFC payments network, I'm still cautious, because of my experience with the OV-chipkaart.

Why I am cautious I will discuss in my next post(s), where I will look at the security issues related to using NFC-enabled devices for payment, by card or by mobile phone.

Saturday, May 14, 2011

Internet Identity Workshop 12: seen by a Tester

A week ago the Internet Identity Workshop 12 took place in the Computer History Museum in Mountain View, California.
Three days (3-5 May) listening to and discussing the latest trends in Internet Identity protocols, enterprise identity management etc. from a user-centric view.
Boring, no way!!
First of all, it wasn't a normal conference with fancy presentations and an audience neatly listening and asking questions afterwards.
Nope, this was an unconference, where at the beginning of every day the schedule is built by the people who want to discuss or present thoughts on user-centric online identities.
This agenda can then be viewed on a big wall in the centre of the conference hall, which I thought was a very good and pragmatic way to schedule the proposed sessions.
Well, time to get my hands dirty, I thought, and on the first day I already hosted 2 sessions: 1 on security measures for identity protocol flows (always nice to test those :-) ) and 1 on the pros and cons of using OAuth in online banking (you never know in the future).
Very nice sessions, where I could discuss my thoughts as a tester with identity experts from different industries, like telco, finance and computer hardware.
However, I wasn't there only to gather info; together with XMLgrrl (the 1 and only :-) ) and the guys from Newcastle Uni. (great to see ya folks!), I did a little PR for UMA, which was very effective, because UMA was also discussed in sessions where no UMAnitarians were present :-).
Next to this, the Newcastle Uni. guys did a kick-ass iPad(!) demo of their SMART project. Great stuff to see.

But wait, there is more. I saw sessions about companies wanting to become relying parties, identity policies between the US and Europe, personal data stores, online vaults and many more.
And not to forget the Trust Frameworks, which are being developed for different industries and have complex flows to test.
For a bloke from Europe, the sessions about NSTIC were very interesting: what does the US government want to do with trusted identities in cyberspace?
Thanks for the helpful info there, guys. It made clear how the Americans want to deal with identity in cyberspace, although not every attendee agreed, which made for a nice discussion.

I could go on and on about IIW12, but I want to keep my blogs short.
I had a great time, learned a lot, and it's encouraging to see that IIWs are already taking place in Europe too. A great way to stay updated on the work in user-centric identities, which is getting more important every day for everyone involved in internet development.

Any questions about the IIW? Just send me an email or call me.

So, my Silicon Valley trip (and San Fran ;-) ) was fantastic; let's see where my next adventures will be.
Hmm, perhaps Hawaii??

Thursday, May 12, 2011

The Status Quo of OpenID development

Preceding IIW12, the OpenID Summit took place at the World Headquarters of Symantec in Mountain View, California.
Considering my prior interest in OpenID and its future layering on OAuth 2.0 (next to UMA!!), I was very interested in the status quo of OpenID development.
This Summit, presented by the OpenID Foundation as part of a 2011 series, focused on 'Balancing Security and the User Experience', very interesting for me as a tester.
Through 4 sessions (3 panel discussions and 1 presentation) the attendees were stimulated to think about and discuss the present state of OpenID, the changing authentication protocols, best practices, and the monetization (making money) of identity without traumatizing the customer.
Especially the latter is important for the adoption of OpenID and other identity protocols by enterprises and governments. No business case means no assurance of a possible return on investment, resulting in NO adoption by enterprises or government.
A business case alone is, in my opinion, still insufficient, because if the OpenID protocol is crap, no customer wants to buy it.
Well, you might guess what my question was: why not involve testing in the OpenID development lifecycle from the beginning, the specs, to improve the quality?
After all, I did this for the UMA protocol last year, and the UMAnitarians are very happy with it. Reactions at the OpenID Summit were positive; let's see what happens in the coming weeks.
But let's get back to the OpenID Summit. I won't give elaborate descriptions of how the panel discussions went (see the link above for more info and the panel members), but I will highlight some.

The first panel, chaired by Nico Popp, our Symantec host, discussed changing authentication methods like strong authentication, One Time Passwords (OTP) and PKI (smartcards), but identity proofing, biometrics and risk-based authentication (especially in banking!) were also addressed. Next to this, the different levels of authentication were explored.
I thought it described the evolution of authentication protocols well, and it was easy to follow if you had some knowledge of authentication.

The next session was done by the OAuth pros: Mike Jones, John Bradley and Nat Sakimura.
They gave us an insight into the status quo of OpenID development.
Next to the work done on JSON and JWT chain representation, especially the OpenID ABC framework and OpenID Artifact Binding were discussed. Very nice, because that's what I came for.
Regarding the rapid development of mobile phone authentication, more use cases will be made to extend OpenID development here.
Well done guys, I'm up to date again on the OpenID development.

The third session was all about best practices and was chaired by Eric Sachs from Google.
Especially the authentication of web 2.0 apps was discussed, in particular the minimal scope of the parameters of an ID check. I think they were Name, Email and Photo.
Also discussed was the combination of OpenID and HTTPS to ensure a secure exchange of data. Facebook, for instance, now gives its customers the opportunity to use this protocol.
But a lot still has to be done here to ensure the OpenID protocol functions well.
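To illustrate the minimal-scope idea from this session, here is a small sketch of how a tester might flag an app asking for more attributes than it needs. The attribute names are just the three I remember from the session; this is not any official OpenID API.

```python
# Illustrative only: the 'minimal scope' idea from the best-practices
# session. Ask a user for no more attributes than the app really needs.

MINIMAL_ID_SCOPE = {"name", "email", "photo"}

def excessive_attributes(requested: set) -> set:
    """Return the attributes that go beyond the minimal ID check."""
    return requested - MINIMAL_ID_SCOPE

print(excessive_attributes({"name", "email", "photo"}))             # set()
print(excessive_attributes({"name", "email", "photo", "address"}))  # {'address'}
```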

The last session, hosted by Don Thibeau, featured investors interviewing technology leaders about investing company money in identity, and technology leaders interviewing investors about venture investing in identity companies. The bottom line: is there an investment opportunity in identity management or online privacy right now? NO.
It still needs a well-defined business case and certainly won't give profits in the short term, although these aren't excluded in the long term.

Well, that was the OpenID Summit May 2011 for me: I learned a lot, had good pizza for lunch and went home with the feeling that OpenID development is ongoing, although it needs a good business case and a critical look from a tester's point of view.
OpenID Foundation, Symantec and Google, thanks for a great day!

The next blog will highlight my days at the Internet Identity Workshop 12 last week.

Monday, May 9, 2011

What's next in online identities? Cordny in Silicon Valley: a blog series

Last week, at the invitation of PIMN and with 4 other invitees, I spent a week in Mountain View (Silicon Valley, USA), visiting groundbreaking events on the development of online Identity and Access Management: the OpenID Summit and the Internet Identity Workshop 12.

Over the next few days I will describe my view of these events.
Separately, because both events, although related, have distinct goals and attract different crowds.

I thank PIMN and the organizers of the events above for the fantastic and educational time I had in Mountain View, and I look forward to seeing and working with them again at future events.

In my next blog I will discuss my view of the first event, which took place on Monday May 2nd 2011: the OpenID Summit.

Sunday, March 20, 2011

UMA meets EEMA

Last week I was on a PR mission in Leuven, Belgium, at the EEMA eID Interoperability Conference.
Together with 2 members of the Kantara Initiative I presented UMA to the EEMA delegates, investigated the possible use of UMA as part of the eID (electronic ID) and explored possible cooperation between Kantara and EEMA. We succeeded in all three.
This EEMA conference was organized to discuss specific areas of importance in the digital identity arena and to exchange ideas amongst its delegates.
This year it was mainly about industry, business and administrations dealing with privacy, which did not surprise me, given the enormous amount of attention paid to this difficult issue over the last year.
Companies like SafeNet, Verizon, IBM and CA shared their visions and solutions for eID issues, while institutions like Novay and the Fraunhofer Institute gave insight into their eID research.
Administrations were also represented, by different countries, EU consortia and agencies (e.g. ENISA, STORK, SSEDIC), giving the conference a diverse crowd consuming the latest intel on eIDs.
And in this crowd I was present with my UMA session, which was well received by the delegates, and new fruitful contacts were made.
UMA was a bit of an outsider, because most issues dealt with authentication, in contrast to UMA, an authorization protocol.
However, UMA is user-centric and interoperable, so much of the discussion was about its use in trust frameworks, between authentication protocols like OpenID and SAML and other authorization protocols like OAuth. After all, in an enterprise it's very important to know whether the person sharing data with you online really is that person (authenticated) and is also authorized by his company to share these things. If both functions are missing, this person is useless to you, costing only time, effort and, in the end, profit for your enterprise.
With UMA you have a 'doorman' dealing with the sharing of your data with third parties, relieving you of the hassle of doing this yourself.
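For the technically curious, a minimal sketch of this doorman idea. The roles and policy checks are heavily simplified and the names are mine; the UMA draft specs define the real endpoints and token flows.

```python
# Heavily simplified sketch of UMA's 'doorman': an authorization manager
# evaluates the resource owner's policies, so the owner is not bothered
# at request time. Names are illustrative, not from the UMA specs.

class AuthorizationManager:
    """The doorman guarding the resource owner's data."""
    def __init__(self):
        self.policies = {}  # resource -> parties the owner allows

    def set_policy(self, resource: str, allowed_parties: set):
        # The resource owner sets this policy once, up front.
        self.policies[resource] = allowed_parties

    def authorize(self, resource: str, requesting_party: str) -> bool:
        # At request time the doorman decides on the owner's behalf.
        return requesting_party in self.policies.get(resource, set())

doorman = AuthorizationManager()
doorman.set_policy("calendar", {"my-accountant", "my-doctor"})

print(doorman.authorize("calendar", "my-doctor"))      # True
print(doorman.authorize("calendar", "random-vendor"))  # False
```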

Together with my fellow UMAnitarians, I look forward to future implementations of UMA in online identity solutions, built together by industry, business and administrations.
All in favor of the person UMA is built for: the user who wants to control access to his online data!

Sunday, March 13, 2011

Feeling like Marco Polo

Over the last few years, understanding the world of online identities has been like an adventure to me.
Sometimes I feel like Marco Polo. He explored new countries and trades; my mission is to explore and test new ways people can share their online identities and resources.
Like Marco Polo, I meet extraordinary people, like UMAnitarians, OAuthians and OpenIDealists.
As Marco Polo had to master Chinese to understand his new companions, I have to learn XML, JSON, HTTP and different web protocols to understand mine.
That's why I'm grateful that people like XMLgrrl, Identity Woman and many more guide me in this exploration.
This week I will do some UMA trading in the form of a session at the EEMA eID Interoperability Conference in Belgium, and in May I will explore, together with a Dutch team, the Internet Identity Workshop 12 in the USA.

Like Marco Polo, I'm an adventurer and a tradesman; maybe Google Circles will be my next quest. Something I have to talk about with the UMAnitarians soon.

Sunday, February 27, 2011

A Tester's perspective: Privacy in Design by Microsoft

A month ago I promised to blog about the privacy solutions cloud vendors apply at this time.
This post will discuss Microsoft's efforts in handling privacy.
When googling for Microsoft and privacy, the first hit is a bullseye:
a portal about how Microsoft deals with privacy issues, with links to relevant information, ordered in a structured way. Regarding usability, a good start.
A portal is nice, but does it have info about how Microsoft deals with privacy issues?
Privacy by Design is a hot topic in the privacy community, and it is also embedded in Microsoft's business, in both development and operations.
Bold words, but how is this done?
First, Microsoft deals with privacy by following the Microsoft Privacy Principles, which address Accountability, Notice, Collection, Choice and Consent, Use and Retention, Disclosure of Onward Transfer, Quality Assurance, Access, Enhanced Security, and Monitoring & Enforcement.
An example of the use of these principles is the Privacy link
available at the Windows Live Hotmail site.
Wow, Privacy Principles, but who assures me, the user, that Microsoft lives by these principles?
Microsoft's Chief Privacy Officer (CPO, I just love those abbreviations) is responsible for managing the risks and business impacts of privacy laws and policies.
The CPO and his team had a great influence on Microsoft's new U-Prove (the successor to CardSpace) and the Tracking Protection in IE9.
OK, Microsoft is concerned about the user's privacy, are there any negative sides to its policy?
Well, you could mention the long development and eventual elimination of CardSpace in favor of U-Prove, but is this privacy-related? The Geneva project was, IMHO, always a bit mysterious, but when Microsoft bought Credentica in 2008, things started to make more sense. Then it becomes more a question of what to use for identity control, and whether it's usable.
Believe me, I have enough experience with software projects where the architect says his design is flawless, but during the end-to-end test the software's performance is just plain lousy.
Another reason to involve testers at the beginning of a project.

Concluding: Microsoft commits itself to privacy, but it's still an evolution of development and process, so do not expect miracles!
People at Microsoft are also just people.

Sunday, February 20, 2011

Got the flu last week, what did I miss?

Last week was my once-in-two-years out-of-the-office-because-of-the-flu week.
More simply put, I was bugged :-(.
No worries, I'm back on my feet and now I'm looking at what I missed on testing, SaaS, security and identity last week.
Fortunately, my fellow bloggers weren't ill and produced the daily/weekly news for me, like Frank Wray's Identity in the Cloud Weekly, Christophe Primault's The GetApp.com Daily, EPA's blog and Jaap Kuipers' PIMN. Great stuff guys, it saves a lot of googling.

If I exclude testing, to keep it short, what did I miss on SaaS, security and identity?
Well, one nice thing to mention on SaaS/Cloud computing is the webcast Maurice van der Woude, general director of EuroCloud Europe, gave on BrightTALK about Managing Hybrid Clouds from a Supplier and User Perspective. Next to explaining what a hybrid cloud is, he also discusses the interoperability needed in a hybrid cloud and the privacy issues. A very informative talk, suitable for both business and tech pros.

Moving on to security: the biggest news was the RSA Conference held in San Francisco, attended by some of my fellow UMAnitarians and also PIMN members.
For UMA: congratulations to the SMART team for winning an IDDY award from Kantara in the Proof of Concept category for their UMA development work! This is good news for possible adoption of UMA by the industry.
Another interesting RSA item to mention is the panel discussion, co-led by Ikuo Takahashi, on legal issues arising from international cloud computing. This means cloud computing is more and more seen by policy-makers as something that is going to happen, and its legal issues must be attended to. It now only depends on how this policy will be governed, and on what geographic scale: globally or per country?
Mr. Takahashi, thank you for your feedback on my questions about this; it gave me a lot of insight, which I will explore further in the coming weeks.

So, this is just my humble view of last week. One week knocked out by the flu, but luckily I can rely on my fellow bloggers, as they can rely on me, to keep the news posted.

Sunday, February 6, 2011

USA responds to the changing EU Data Privacy Directive, where's Asia?

Last week I blogged about the EU Data Privacy Directive being changed in response to the adoption and development of Cloud Computing.
I figured the USA couldn't lag behind, and I was not surprised that NIST, the U.S. National Institute of Standards and Technology, has issued two new draft documents on cloud computing for public comment, including the first set of guidelines for managing security and privacy issues in cloud computing. Next to this, NIST has developed a Cloud Computing Collaboration site on the Web to enable two-way communication between the cloud community and NIST's cloud research working groups.
So, it seems both the USA and the EU are initiating efforts to guide the secure adoption of cloud computing by industry and consumers.
Now I'm wondering about one thing: compared to Europe and the USA, what are the Asian countries doing to guide a secure adoption of cloud computing?
For a test pro like me it is very nice that guidelines are being made for the 'Western' countries, but a lot of the 'cloud' is built in the 'East', so I can't neglect that.


Asia is not unified like Europe or the USA, so government guidelines are not easily made for the many different countries that form Asia.
Private consortia like the Asia Cloud Computing Association (see Europe's EuroCloud) have been founded. But what about the Asian governments, are they making unified guidelines for Cloud Computing?
John Galligan, Microsoft Asia Pacific's regional director for Internet policy, discusses this, with an emphasis on Singapore, on futuregov.asia and zdnet.asia.

There are still challenges; one of the statements made here I want to quote:

'One significant concern regarding cloud technology is the uncertainty over the location where data is stored and how strong data protection is to safeguard against criminal intent.'

This is also the case in the Western world, and, as in the West, thorough IT security auditing by the Asian governments and private sector is necessary to test the security of their continuously innovating IT infrastructure.

Galligan also says: "It's very interesting when people start to look at reliability, the level of redundancy and individual's access to the system, it can move decision makers to understand that maybe their current infrastructure is not as stable and secure as they think it is."

OK, it's a response from an employee of a private firm, but, IMHO, this is the key problem with Cloud Computing right now: only by tackling these risks of reliability, redundancy and access can policy makers all over the world be moved to adopt secure Cloud Computing.

And that's a mutual challenge for all global parties involved in Cloud Computing: business, IT auditing, development and test!!

PS:
I'm no expert on Asian law; this example of cloud computing in Singapore does not necessarily hold for other Asian countries. It is only meant to illustrate an Asian response to Cloud Computing.

Saturday, January 29, 2011

Dealing with privacy in the cloud: the European Data Protection Directive

Yesterday, Friday 28 January 2011, was Data Privacy Day, an international celebration of the dignity of the individual expressed through personal information.
What a coincidence: the day before, I was invited by my dear friend Paolo Balboni to take part in "The Expert Panel on Cloud Computing and the Protection of Personal Data". Considering my critical tester's attitude towards software and my knowledge of user-centric web protocols like UMA and OpenID, Paolo thought I should have my say here.
I had to be in Amsterdam for another meeting anyway, so I gladly accepted the invitation.
What's it all about then?

The Istituto Italiano Privacy (IIP), together with the European Privacy Association (EPA), organized "The Expert Panel on Cloud Computing and the Protection of Personal Data".
The IIP and the EPA also published a working paper titled ‘Cloud Computing and the Protection of Personal Data: Privacy and the Global Web, Risks and Resources for the Citizens of the Internet’.
IIP and EPA are aware of the ongoing debate on privacy and cloud computing in the Netherlands. Therefore, they wanted to share their pan-European experience on the matter with the panel and learn about the Dutch experience.
Through presentations it became clear that both IIP and EPA want to produce a position paper, based on the input from the panel and their working paper, to address the issues of all parties involved in Cloud Computing and privacy in Europe.
This is a very hard nut to crack, because the European Community consists of many different countries with different laws and different privacy regulators.
However, there is the Data Protection Directive (officially Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data), a European Union directive which regulates the processing of personal data within the European Union. All members of the European Union must follow this Directive and implement it in their privacy laws.
But what happens when a cloud provider from outside the European Community does not follow the Data Protection Directive? Can it be caught?
No, it can't, if the cloud provider, as a data controller, is not based in Europe and does not use equipment in the EU.
Hm, data controller, what's that, and are there other data parties?
A data controller, according to the Data Protection Directive, is the one who determines the purposes and means of the processing of personal data (art. 2d); there is also the data processor, who processes personal data on behalf of the controller (art. 2e).
See where I'm going? In cloud computing it often remains quite unclear who the controller is and who the processor, and the Data Protection Directive is not yet clear on this.
Another privacy issue addressed in the panel discussion is the transfer of data outside the EU.
An EU customer has no idea of or control over where its data is located, and fears its data subject rights are not guaranteed.

These are privacy issues to be dealt with.
Therefore Directive 95/46/EC is under revision, to also address the issues of Cloud Computing.
ENISA recently published a study dealing with the legal and security issues of cloud computing, and the CAMM project will deliver a new business barometer in 2011 for the quality of the security profiles of Cloud Service Providers.

And then there will be the IIP/EPA position paper, aimed at addressing concrete data protection issues and suggesting solutions for a sustainable privacy-friendly cloud framework.
Input from cloud vendors is very much appreciated here.


Interesting times ahead for whoever is interested in the protection of personal data in the EU.

This post was mainly about solutions for privacy in policies; my next post will be about the privacy solutions cloud vendors apply at this time.

Sunday, January 16, 2011

Testing UMA means testing an individual's control over his own online data!

One of the reasons I joined the UMA WG was that I wanted to be involved in a project right from the specs, not only when it is time for system testing. Next to that, the concept of UMA fascinates me and is worth making me sweat!
The active discussions we have about the testability of the specs inspire me to improve my work as a system tester.
The implementations of UMA can be in countless domains: enterprise, government, education, e-commerce, etc.
This makes it a project where IT architects from different domains can work together making user stories and use cases, and improve this user-centric authorization protocol.
Yes, we also have OpenID and OAuth, but, IMHO, OpenID is for authenticating the user and OAuth for authorizing access.
UMA lets an individual control the authorization of data sharing and service access made between online services on the individual's behalf, as a layer on top of OAuth. It doesn't cover authentication, but it is very much dependent on OAuth and its possible changes, which the UMA WG monitors closely.
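To keep the three protocols apart, here is how I picture the layering as a tester. A rough sketch with made-up function names, not code from any of the specs:

```python
# Rough sketch of the layering, with made-up names. Not spec code.

def authenticate_user(credentials: dict) -> str:
    """OpenID's job: WHO is this? Returns a verified identity."""
    return credentials["claimed_id"]  # identity provider verifies this

def grant_access_token(identity: str, service: str, scope: str) -> str:
    """OAuth's job: may this service act for this user, in this scope?"""
    return f"token({identity},{service},{scope})"

def uma_allows(owner_policy: dict, requesting_party: str, scope: str) -> bool:
    """UMA's layer on OAuth: does the resource OWNER's own policy grant
    this requesting party this scope, without asking the owner live?"""
    return scope in owner_policy.get(requesting_party, set())

policy = {"my-bank": {"read-address"}}  # set once by the individual
user = authenticate_user({"claimed_id": "cordny"})
if uma_allows(policy, "my-bank", "read-address"):
    print(grant_access_token(user, "my-bank", "read-address"))
```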

A few years ago I started this blog because I wanted to share my thoughts on testing SaaS and identity. The latter because, IMHO, testers were mixing up authentication and authorization, which is disturbing, because they are important elements of web 2.0 and online user-interactivity.
I started with OpenID, but UMA drives me more, because it is fresh, very user-centric and can interoperate with OpenID through OpenID/AB, melting two of my favorite test subjects (authentication and authorization) into one.

I'm waiting for the day I can test an online user interface (say, banking :-) ) where an individual, with the help of the UMA protocol, can control the data he or she wants to share with third parties, on the individual's behalf.

Something worth sweating for!