From Russia, with data…

It was recently reported (Source: BBC News) that Russia is seeking to pass new laws requiring data about Russian citizens to be stored within the country, rather than in datacentres in the United States “where it can be hacked and given to criminals” (quoting MP Vadim Dengin).

At first glance this seems a fairly ridiculous stance to take, flimsily disguised as an attempt to protect the data of Russian citizens. Many skeptics believe it is really about control, and that it could lead to Russia becoming the next country behind an iron-curtain firewall – much like the one China has operated for years. A key question is how they could enforce this in any way that would actually benefit the Russian people.

Regardless of the motivations behind this move, there are potential implications for digital practitioners that need to be thought about going forward. For a start, if there is any possibility that a Russian citizen will use your application and you store any of their data, then that data will need to be held in Russia. A shrewd move if Russia plans on building datacentres, but from a practical point of view, would the rest of the world want their data stored in Russia?

One option would be to develop the system so that anyone based in Russia has their data stored in a Russian version of the database. But let’s be honest, it isn’t really practical to go down this route. Where does it end? Do you have a database for each country that requires one?
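To make the idea concrete, the per-country approach boils down to a routing layer that maps a user’s country to a data store. A minimal sketch follows; the shard names are hypothetical, purely for illustration:

```javascript
// Hypothetical sketch of per-country data routing. The shard names are
// invented for illustration; a real system would hold connection strings
// or database client handles here.
var SHARDS = {
  RU: "db-moscow.example.com",     // Russian citizens' data stays in Russia
  DEFAULT: "db-dublin.example.com" // everyone else
};

// Pick the data store for a user based on their ISO country code.
function shardFor(countryCode) {
  return SHARDS[countryCode] || SHARDS.DEFAULT;
}
```

The routing itself is trivial; the real cost is operational. Every shard needs its own hosting, backups, migrations and failover, which is exactly why a database per country quickly stops being practical.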

At the other end of the spectrum is the consideration that you have to rule Russian customers out of your experience if they have to do any sort of account creation. For some sectors that may not be a concern. The Googles, YouTubes and Amazons of the world may decide this is a risk worth taking. But what about the investment sector, for example? Russia has a lot of wealth, and ruling them out could be a big problem. Similarly, research becomes a lot more difficult. For an entity trying to undertake surveys, Russia may be a key demographic, but this law may well rule them out of being included.

What is the reality? We think this is likely to be a very hard thing for Russia to police, and most likely they really are only targeting big companies. The only real way to enforce it would be for Russian internet access to become locked down, ostensibly to “protect data” but in practice severely curtailing Russian freedoms online. If that happens, then any company serious about having an online presence in Russia would have to build a Russian version specifically for the purpose. Instead, what will most likely happen is that businesses will turn their backs on Russia, and so we won’t need to worry anyway.

Internet World 2014 – an agency owner’s viewpoint

I recently attended the Internet World expo at the ExCeL Centre in London. This was a return visit for me, but the first time I had attended as the Managing Director of an agency, so my aims for the day had changed somewhat. Previously, I had found the expo to be a very useful yardstick for what is important in the digital sector and what the current trends of focus are for both clients and agencies. It has also been very useful to look at what other companies are developing in terms of products and to align these with our own efforts. Most importantly, in the past as a project manager it allowed me to see what is at the cutting edge so that I could offer consultancy to clients about what they might want to consider for their own strategies.

This year I had new questions to answer, primarily whether Internet World would be a good platform for us to showcase our agency and our new CMS product. Additionally, I wanted to look at the key development trends in technologies similar to our own, to make sure we are competitive and focussing in the right areas.

In my previous visits to Internet World I had found a good balance between information and sales opportunities, with a wide variety of providers to talk with, product demonstrations to view and seminar talks to attend. Above all of this, each year there has previously been an overriding theme that brings the whole show together, with exhibitors complementing it and seminar talks designed to develop it further.

Attending this year with my new MD-tinted glasses on was a very interesting experience. In looking to answer my key questions I started by browsing the exhibitors. The first thing I noticed was that there seemed to be fewer than last year; the second was that quite a few of the agencies comparable to Siteset that I’d seen the previous year were absent this time around. This could be for a number of reasons, but one reading is that Internet World did not serve them well in terms of new business leads.

Looking to the product side of the fence, the usual big names were there this year. Sitecore were present, although notably with a more modest stand than last year. In fact, in general the larger players seemed more confined, whereas last year a number of the stands were a feature in themselves. This could have been due to the smaller space of this year’s expo.

In terms of CMS products (one of the products our agency offers), there were really only three present on the day: Sitecore, Kentico and Cantarus (although other proprietary offerings were also there). Last year there were many more CMS products of varying scales on show, and I again noticed a lack of PHP technology in this area. Much like the agencies that failed to make a return visit, it seems that perhaps this is not the right forum for emerging products to try to gain a foothold.

A new thing Internet World has worked on is the networking areas, which this year included a pitch area. Whilst this was an interesting listen, I think I got unlucky in that I always seemed to be there when the people talking hadn’t rehearsed beforehand. Similarly, the Marketplace had a pitch session, but the area was monopolised by people who wanted to use the beanbags as a place to catch up on emails rather than actually listen to the speakers, and so the focus was rather lost.

So what about this year’s seminar talks? This is one of the key ways for attendees to pick up on the latest trends and see what is happening at the cutting edge. In the past I have found it a really good way to see what other agencies are doing with their clients and products. Citrix was no exception, talking about their work towards the Internet of Everything. Whilst not particularly accessible to smaller agencies like ours, it was fascinating to see what the future might hold and to start thinking about how we might generate ideas for products that will fit within it. It was a real example of work right on the fringe. But this year that seemed to be the exception to an otherwise rather tame rule. A lot of the talks were very marketing-based, particularly those by agencies, where most of the seminar was dedicated to selling their services rather than actually focussing on the work. This was, in large part, a contributor to a day that lacked the seamless thread pulling the whole show together that previous years have had.

I realise that there is an element of the hypocritical about the way I am assessing the show this year. On one front I am asking whether an agency could go there and generate leads, but on the other I am criticising agencies for doing that very thing in seminars. But that is one of the fine lines that Internet World needs to draw, and one that some seminars achieve very well. The best seminars provide a good insight into the work, which then subtly promotes the agency behind it. When the talk itself becomes about promoting the agency, all else is lost. It was notable how few questions were asked at the end of each talk, possibly a reflection of people being less than keen to be sold to, or perhaps even having lost interest.

So did I answer my questions? If I’d asked those questions at last year’s show I would have come away clearly thinking that this year we should have been there ourselves, showcasing our work and our product. But this year the focus had shifted. The smaller product owners were not there and the talks were not at all focussed on quality content. Instead the seminars featured people both trying and failing to sell their brand or talking about topics that were not particularly new, unless they were so far ahead of the game that for most it was just an interesting look into the future.

My questions remain somewhat unanswered at this point, but I am no longer convinced that Internet World is the place to learn about the latest tech and approaches. I’m certainly not convinced that it is a good platform for an agency to generate leads.

Tablet adoption and Silver Surfers…

In a previous blog post we looked at how Millennials, those between the ages of 16 and 24, are accessing the internet more and more through their phones. It is now the case that mobile internet access (although not browsing) has surpassed desktop, and from the data we featured in our previous blog it is clear that this is being driven by the Millennial and Young-Adult (25-34) groups. The latest data shows that internet usage through apps is growing, with tablet and phone browsing staying strong as well. But is this also a generational thing, or is it across the board?

Age Group Browsing

At the opposite end of the spectrum to Millennials is the 55+ group, affectionately referred to as the ‘Silver Surfers’ (shown in the bottom-left quadrant of the above infographic). This group is generally the most affluent of the four, with plenty of disposable income. They are also less time-poor, and despite being better off are often more reluctant to spend their money.

Something that is immediately obvious in this age band is that the vast majority of internet access takes place via desktop or laptop (80%). Not surprisingly, the level of smartphone access is very low as well (4%), but tablet usage (13%) is an emerging trend in this age group. In Q1 of 2012 only 4% of the Silver Surfer group reported that their household owned a tablet device (Source: Ofcom), compared to 19% by Q4 of 2013 (Source: Ofcom). Even more significantly, of this 19% of Silver Surfers who own a tablet, 71% said they personally used the device.

There is a clear trend developing towards the adoption of tablets among the older generations, and the tablet may well become the mobile device of choice in this group. Intuitive interfaces and larger screens make for a more comfortable experience, whilst the cost barrier is less of a deterrent for this more affluent group. Most importantly, unlike the Millennials, they don’t feel the need to be online all the time whilst on the move.

So what does this mean for digital practitioners? Firstly, there is an emerging and affluent group adopting tablet devices (and possibly, in time, smartphones with larger screens) to access the internet. Secondly, this is a group who are methodical, cautious and time-rich. Thirdly, and most importantly, they are a group with different needs and interests to younger generations, and expectations to match.

In planning for the mobile internet and the internet of things, this generation is going to play an interesting and increasingly significant role. Responsive experiences are going to become as important on mobile as responsive design already is. Designing for this age group will be a different process from designing for Millennials. Similarly, there is a whole raft of apps that this age group would adopt that simply aren’t relevant to younger generations. This is only further supported by the release of health-related hardware and a Silver Surfer generation that is more health conscious.


IE – that is, the bane of our lives!

One of the obstacles to progress in digital is the limitations placed upon us by the tools available to users – the browsers and devices. In a perfect world all devices and browsers would use the same code base. We could write some code and we would know that every device and browser would treat it in exactly the same way. But of course we don’t live in a perfect world; each browser and device vendor treats code slightly differently, and this adds overhead to projects because we then have to test against each of them. This is why standards like HTML5 and CSS3 are introduced: they are supported across the board (by all the major browsers). But the other problem we have is with legacy versions of browsers. Unfortunately older versions of browsers don’t support newer versions of code, which leaves us with two choices when developing new things:

  1. Use older technologies which limit creativity and make it harder to achieve a good result
  2. Build something that has a ‘graceful fallback’ for browsers that are older

The problem is that you have to make a compromise either way: option 1 limits the experience, and option 2 adds budget and testing time. Of course, the third option is to ignore older browsers…which may or may not be at your peril.
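The usual way to manage option 2 is progressive enhancement driven by feature detection: probe for a capability, and serve the richest experience the browser actually supports. A small sketch of that decision, where the capability flags and tier names are illustrative (in a real page the flags would come from genuine feature tests):

```javascript
// Sketch of a graceful-fallback decision. In a real page the flags would
// come from feature detection, e.g. caps.fileReader = !!window.FileReader.
function chooseUploadExperience(caps) {
  if (caps.dragAndDrop && caps.fileReader) {
    return "html5";   // drag-and-drop, multiple files, progress bars
  }
  if (caps.xhr) {
    return "ajax";    // single-file upload without a full page reload
  }
  return "form-post"; // plain <form> submit: works in any browser
}
```

The point of structuring it this way is that every user gets something that works; newer browsers simply get more.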

So the online world is frustrating for developers, to say the least. But one particular browser has traditionally caused a lot more problems, and for each generation there is a new version to cause issues. Of course I speak of Internet Explorer, a browser that has become synonymous with cursing developers, irritated clients, inflated budgets and frustrating user experiences. When I first started out in digital the real problem was IE6, which at the time was still used by a lot of bigger organisations despite the fact that IE8 was out. Thankfully that relic of a browser was retired a few years ago, but with the emergence of HTML5 and CSS3, IE7 became the new IE6.

We are currently in the throes of developing a new CMS, which makes heavy use of AJAX, the JavaScript technique for updating pages in real time. This has presented us with plenty of problems not only in IE8 but in the latest versions as well. We are in the position where we test the system once in Firefox, Chrome and Safari, and then basically rebuild the code just so that IE will handle it in the same way. Frustrating doesn’t cover it!

A couple of years ago an Australian shopping site got so fed up with the overheads imposed by Microsoft’s browser that they decided to impose a tax on users of the older IE7 browser unless they switched to another browser. Obviously Microsoft weren’t impressed, but maybe they should have taken note of the reasons behind the move.

One of the reasons this has occurred is no doubt that for years Microsoft had an unchallenged share of the market. Pretty much every PC in the world came with Windows and IE installed, and for the most part people didn’t know there were alternatives. For Microsoft’s part, they have time and time again chosen to interpret the standards differently from the other browsers, causing issues for developers and bad experiences for users. An example is their handling of multiple file uploads. Most browsers let you hold down Ctrl and multi-select files by clicking on them. Microsoft decided, in their infinite wisdom, that in IE you would have to select a file, add it to the queue, then go back and browse for another file, add it, and so on. A ridiculous experience.
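In practice you end up feature-detecting your way around this: the HTML5 approach is a single file input carrying the `multiple` attribute, and a quick check tells you whether to fall back to IE’s one-file-at-a-time queue. A sketch of such a check; the `doc` parameter is passed in (rather than using the global `document`) purely so the function can be exercised outside a browser:

```javascript
// Sketch: detect support for the HTML5 `multiple` attribute on file
// inputs. Browsers that implement HTML5 file inputs expose a `multiple`
// property on the created element; older IE does not.
function supportsMultipleFiles(doc) {
  var input = doc.createElement("input");
  input.setAttribute("type", "file");
  return "multiple" in input;
}
```

If the check fails, you render the legacy add-one-file-then-another queue instead, so older IE users still get a working (if clunkier) upload.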

Microsoft’s perceived arrogance in this area is now being tested. They continue to make the same mistakes and cause the same problems as they always have, but they are being forced to change. One reason is that IE no longer has a vast majority stake in the browser market. Gone are the days of dominance: Firefox and Chrome have made massive advances, and on a Mac no one would contemplate using IE anyway…lest the machine shut itself down out of embarrassment. The other reason is that users are becoming more savvy. They realise that IE isn’t a very good experience compared to other browsers, and so choose not to use it. Of course, the growth in the use of devices that don’t ship with IE has also helped here. Hopefully Microsoft will take further note and try to converge rather than diverge, and let us develop better experiences more easily and cheaply than we currently can.

But a word of warning to finish with. For years Microsoft created a rod for its own back by choosing not to go with the flow, insisting on their own proprietary way of doing things. Sound familiar, Apple? Already in development we are seeing issues arise that are not seen in other browsers. For example, the way Safari deals with PDF downloads can cause issues without the right plugins. Apple only accept their own formats for certain file types as well. And of course, they are renowned as a company that rejects all tech except their own.

Of course it is unlikely there will ever be a completely unified approach to code, and with more and more devices being used the problem is only going to grow. Luckily they mostly stick to the standards, but there will always be problems for us to solve as developers, and that is why the job isn’t easy!

Shooting the messenger – a right to forget the point!

An EU court has today backed the right for people to request that Google amend some search results, in what is becoming known as the “right to be forgotten”. This story, reported on the BBC News website, raises some interesting questions and also possibly paves the way for a worldwide web police force, a role for which Google is the logical candidate.

But first let’s look at the court case, which presents a few problems and, much like the “cookie law”, has got quite a few people excited despite their having hold of the wrong end of a very large stick. For one, the EU Justice Commissioner, Viviane Reding, said in a post on Facebook that it is a “clear victory for the protection of personal data of Europeans”. I can’t help but feel she has massively missed the point here. Google themselves make the salient point: they do not control data, they only offer links to information freely available on the internet.

What Viviane is failing to see is that this ruling doesn’t solve any problem at all. If people want misleading, inaccurate or otherwise unfair information about themselves to be removed from the internet, then asking Google to remove it from search results doesn’t remove it from the internet; it merely stops it being shown in Google. The information itself is still out there for people to find. You might argue that if it isn’t shown in the most popular search engine (and indeed website) in the world then the problem is solved, but if you believe that you’re probably being a bit short-sighted. Why? Quite simply because the online world is changing.

An example of this change is this article, which talks about how the younger generation is consuming the internet through apps rather than browsers. This is significant because the web is moving towards an information-warehouse approach rather than a website-based one, where your app of choice is used to retrieve the information. If this is indeed where the net ends up, then Google will no longer be the majority search engine, and the information Google has obediently hidden will be found again. Not to mention that if the actual content is not removed, it only takes a couple of people to find it and share it and then it is all over the net.

So, what is this ruling actually doing? It is shooting the messenger for the ‘crimes’ of others. Google is suffering for being the biggest name on the web. It suits the cause of advocates, politicians and legal personnel to aim the gun directly at a big name rather than let this court case disappear into obscurity once the actual offending website is dealt with. More to the point, like the cookie law, it will get a lot of ordinary people who don’t fully understand how the web works riled up into supporting a piece of law that doesn’t actually solve the problem, just covers it with a plaster for a while.

But there is something else going on here as well, something many people have seen coming for a while. In placing the responsibility with Google (and presumably other search engines, although none have so far been mentioned) to manage and control this content, the court has effectively asked them to start policing the internet. As there is ambiguity around exactly when a person can validly ask for content to be removed, someone will need to make judgements on what is and isn’t allowed, and Google are best placed to do this. They have the biggest reach, the widest data access and the best understanding of content monitoring and assessment. Another perceived advantage is that Google are agnostic of governments and institutions, meaning they are well placed to make impartial judgements (in theory).

Whether or not Google ends up being this web police, this court case is a line in the sand. To date the internet has been largely uncontrolled and almost anything can be uploaded. But this ruling moves us a step closer to a situation where, either proactively or retrospectively, content is going to be monitored and potentially restricted or even removed. The age of the free internet, the ultimate safe harbour of freedom of speech, may well be coming to an end. Whether that is a good or bad thing, however, is a whole different question.

In the meantime, if you see something about yourself online that you don’t think should be there, don’t ask Google to remove it, ask the actual website. That will be much more effective in actually removing the content.

Putting the U in UX

We bandy round the term ‘UX’ or ‘User Experience’ quite freely, but what does it actually mean? People often talk of ‘good UX’ and point to examples where the experience is obviously very good, but few examples are shown that demonstrate why ‘UX design’ is so important.

Firstly, let me state what I believe good user experience design is. Quite simply, it is designing with the end user in mind, so that what they see and use is intuitive. It is very easy to design and build something that is completely functional but actually very difficult to use. It is even easier to design something that is incredible to look at but that the average user has next to no idea how to use.

So let’s look at how problems arise when this approach is not taken. I have a phone contract with EE, and having received some junk mail from them recently I decided to use their online portal. We’ll put aside the issue whereby clicking on their link loaded a dead page; after a stroppy tweet to them it seemed to be working again. But once I had logged in, the problems soon started. The main task I wanted to achieve was to see when I could upgrade my phone.


You can’t really fault the general design of the page. It is neat, easy to read, on brand and attractive. However, there is nowhere obvious on this screen that states that my portal access is currently restricted, except for the small ‘Access Level: Restricted’ link in the account details section. This would be useful, except that clicking the information icon doesn’t do anything!

So I carried on along my merry way and clicked on the upgrade options link. This loaded a new screen:


Success, this is what I want. Except that when I click on any of the links, literally nothing happens. And to make things worse, most of the navigation items in the header also don’t seem to work. It was only when I took a screenshot that the prompt text appeared, explaining that my account needed to be activated, but even that didn’t fit onto the screen properly.

Having stumbled blindly upon the problem I then returned to the account details screen and found the small link that allowed me to upgrade my account to full access.


But even now problems occurred. Having put my account number in, which seemed an odd step to have to take seeing as I was already logged into my account, I clicked OK and nothing seemed to happen. There was no on-screen acknowledgement and the content was simply blanked. I input the details again, and on clicking OK it said that the account was already activated. I returned to the account details screen to see that it did indeed now say I had full access. I was then able to discover that I wasn’t yet eligible for an upgrade (sigh).

This experience is a classic example of how little things have big impacts when it comes to user experience. This isn’t the snazzy, sexy, all-singing-and-dancing type of UX; it is the practical kind that almost all of us come into contact with. EE have spent a huge amount of time (and no doubt money) creating a nice brand, but they let themselves down massively here by simply ignoring the simple things. An absolute basic is to provide on-screen prompts, help and information that work and are obvious. I am a very savvy user and yet I found this frustrating and difficult to use. Less savvy people would no doubt have been on the phone to the helpline, or would simply have given up.

The EE portal is a classic example of where even 30 minutes of user testing and IA input would make the experience so much better. It is a lesson we should all learn from. UX is not just about the big things; it is fundamentally about the little things. A seamless experience is created when users feel they have everything they need to complete the task simply. In EE’s case the site is clunky and buggy, and they have tried to just ‘tick the box’ for things like prompt messages when they should have dealt with them properly.

As web designers it is our responsibility to put ourselves in the shoes of the users. That is hard when you are attached and close to a project for a long period of time, which is why user testing exists. It doesn’t take long to do, but it will make a world of difference to the users.

Running towards a virtual future.

I enjoy running and in the last couple of years I’ve started to enter a few races. More than anything else the driver for me is to stay fit and at the same time challenge myself. One of the best ways for me to do this is to enter events, as this gives me something to train for and thus prevents me wallowing on the sofa watching re-runs of Grey’s Anatomy.

Recently I entered my first virtual run. This isn’t some way of pretending you are running when you are actually sat on the sofa enjoying McDreamy and McSteamy. No, it is the growing phenomenon of races that, instead of having a fixed venue, simply have a window of time: to compete, you record and post your result during that period.

In this case I entered the Tweethearts 5k. This is a new one, set up by my wife and some of her friends to raise money for the very worthy Moonwalk later this year. It was an interesting experience. Barely two weeks earlier I had run in the Richmond 5k, a race in Richmond Park that attracts in the region of 150 runners and is a great occasion. The two experiences couldn’t be more different.

On the day of the Richmond race you arrive early and watch on, stretching in the park and trying to stay warm, queuing for one last go at the toilets, before congregating by the start line and watching the minutes tick by until the 10am start arrives. The klaxon sounds and off you go, a runner amongst others, manoeuvring for position, finding your pace and picking out the next runner in front that you want to catch and pass.

On the day of the virtual run I had breakfast, watched the morning news and even an episode of Game of Thrones before deciding that late morning would be the start time for my run. I got my running things together, took my time to select my playlist and stretch my legs out. I even waited a few minutes more for the rain to abate before heading out on to my usual route, devoid of any marshals cheering you on, only cars passing by.

Richmond is a proper race, where you compete alongside others, seeing them and sharing an experience with them. There are people around you cheering you on, and at the end you get handed your medal. It is a group experience that is entirely tangible from the moment it starts to the moment it finishes. The virtual run, by contrast, is something more personal. It is you and the road; you are there in that moment because you chose to be, and no one else is there with you.

For me the virtual run was an interesting experience. As a competitive person, the Richmond run is great: the irresistible urge to try to catch the person in front of me, to get higher up the finishing list, is part of the thrill. All of this is stripped away in the virtual run, and instead I am simply racing against myself. The pride comes in presenting my time to the rest of the community at the end of the day.

Virtual runs are very popular in the USA, but I can’t help thinking that they currently lack something that makes a running race what it is. In a world where our relationships, our entertainment, even our jobs are moving to a virtual model, the challenge for virtual runs is to capture the same excitement as the real races. The technology is there to allow us, as communities online, to have that excitement. Apps and websites could be created to allow real-time competition between entrants. For me that is the key. I want to compete with my fellow runners, to see how they’re doing and to see if I can better them…or even just better myself.

Would I do another virtual run? Yes I would. The experience was interesting, it opened up new groups of people online to talk to and interact with. Do I think the virtual run is the future of running races? Not yet. The experience needs to be refined. I like running and a virtual run, as much as any other, gives me something to train for. But it doesn’t yet provide the experience, the excitement, the challenge and the occasion that a real run does. The online experience needs to be evolved to fill this void, to fill us with the urgency and excitement that we would otherwise feel when being in that place with those people. The gaming world provides this, so why not bring some of that gamification into the virtual running world? It will happen, of that I have no doubt. For now I will have to be content with knowing that I ran a good time and waiting for my medal to arrive in the post!