Nexus 7 (2012) life after Lollipop

Update 13 March 2015

Google has released Factory Images of Lollipop 5.1. Part of me wants to rejoin the Google fold, but I will need to be convinced that the performance issues are addressed and that the update does not leave me with too little storage space. And I am really quite pleased with the changes described below, so it will take quite a bit of prompting to get me to re-flash.

Original Blog Post 7 March 2015

Like many other owners of the original Nexus 7 tablet I was really looking forward to getting the new version of Android, Lollipop. After all, one of the great things about a Nexus device is getting early access to the latest operating system without waiting for OEMs to put their spin on it.

Despite being one of the early generation of 7-inch tablets, the Nexus 7 has been a great tablet. Even with the cheapskate 8GB of storage we have primarily used it as a widescreen SatNav using the excellent CoPilot software. The reasons CoPilot is our SatNav of choice are twofold:

Downloadable maps: frankly, how anyone expects to navigate over any sort of distance without downloaded maps baffles me. CoPilot allows you to download maps for individual countries across Europe, and even with most of Western Europe on board there was still plenty of breathing room on the tablet.

But the principal reason is RV Mode. We don’t have an RV, we actually have a caravan, but when navigating the roads of Western Europe the main concerns are height, which is admirably covered by RV Mode, and being able to make the manoeuvres required. In RV Mode, not only do you get routed down roads that you can fit through, there is also none of the “make a U-turn” nonsense that would be impossible towing a 7 metre caravan. Instead, when you have missed a turn or need to re-route you get taken on a detour that works you round in a loop.

Plugged into the cigarette lighter and with the WiFi off, CoPilot has successfully navigated us around France, Italy, Germany, Slovenia, Austria, The Netherlands and of course the UK. We have only come unstuck (or rather almost stuck) when we have ignored the advice of the nice SatNav lady.

Beyond the SatNav duties we have used the Nexus 7 with BubbleUPnP to stream media from a DLNA server to an XBMC client and to ChromeCast. It has also been a useful Hangouts and Skype device.

My expectation was that the Lollipop update would give the old tablet a new lease of life. The reality was that the upgrade to Android 5.0 made the Nexus 7 (2012) so slow and unresponsive as to be useless; and this experience appears to be the norm.

I had toyed with the idea of spending the £200-ish to get a replacement tablet that would offer more grunt than the current one and so would hopefully make better use of the new OS. But there was nothing fundamentally wrong with the Nexus 7 and, as you can tell from my choice of the 8GB version, I am always on the lookout for a bargain.

The solution turned out to be pretty simple and relatively painless: downgrade the Nexus 7 back to a KitKat based distribution, Cyanogenmod 11.

Why Cyanogenmod?

There are the factory images from Google at Google Developers, and that would have been absolutely the safest choice. My concern was that I would be nagged to upgrade to Lollipop again, which would not work, so I am keeping this as a fallback position.

Flashing the Nexus 7 seemed like a great opportunity to try a new type of ROM. There are quite a few that target the Nexus 7 on Xda Developers, but while I wanted to experiment a bit, I really just wanted the Nexus 7 to work again with the minimum of hassle. Cyanogenmod has always been a well supported distribution and now that Cyanogen is becoming much more mainstream it seemed like this was also a safe choice.

The resources for Cyanogenmod are very well laid out and easy to find once you realise that the device name for the Nexus 7 (2012) is grouper. The version I chose from the Information: Google Nexus 7 (Wi-Fi, 2012 version) (“grouper”) page was the latest in the Release channel, described as stable for daily use. When I did my downgrade this was cm-11.

How to install Cyanogenmod 11 on Nexus 7 (2012)

The instructions provided by Cyanogenmod are very straightforward and complete. The main trick was figuring out that the device name for the Nexus 7 (2012) is grouper. Armed with this information you can get a complete list of all the relevant Cyanogenmod distributions.

In following the step by step instructions, however, there were a few areas where the instructions, which appear to be boilerplate text, did not quite match the reality. I also managed to miss a couple of steps. So I thought it might be worth rehearsing the steps that actually worked here.

From the Cyanogenmod site download the installation package (the cm-11 ROM zip) for grouper. You will also need the recovery image and the Google Apps package for the steps below, so it is worth collecting all three at this point.

My installation was made easier, I believe, by having access to a PC running Ubuntu so I could install the adb and fastboot packages (android-tools-adb android-tools-fastboot). There are lots of instructions for installing these on Windows and Mac (Lifehacker also has a summary), but having dedicated packages with a simple installation does help.
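On Ubuntu installing them is a one-liner:

    sudo apt-get install android-tools-adb android-tools-fastboot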

The first step was to enable developer options (and USB debugging) on the Nexus 7, then connect to the PC using a USB cable. Once connected, open the terminal and start the configuration.

This sequence sets up the connection, unlocks the bootloader, and then confirms that there is a live connection to the rebooted tablet. I found that running the commands as admin (with sudo) was the simplest way to perform these steps.
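For reference, a sketch of the sequence I ran (note that unlocking the bootloader wipes the device, so copy off anything you want to keep first):

    # confirm the tablet is visible over USB
    sudo adb devices
    # reboot into the bootloader and unlock it (this wipes the device)
    sudo adb reboot bootloader
    sudo fastboot oem unlock
    # reboot and confirm there is still a live connection
    sudo fastboot reboot
    sudo adb devices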

The next step was the recovery ROM. I had initially missed this step, so make sure you have downloaded the recovery ROM (ClockworkMod in my case) before continuing.

Make sure you give the correct path and name of the recovery image when running the flash command.
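The commands look something like this; the image file name here is illustrative, so substitute the name of the ClockworkMod image you downloaded:

    sudo adb reboot bootloader
    # flash the recovery image downloaded for grouper
    sudo fastboot flash recovery recovery-clockwork-grouper.img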

Reboot the device into recovery to verify the installation. The Cyanogenmod wiki instructions are:

Boot to recovery instructions: Hold Volume Up, Volume Down, & the Power button. Continue to hold all three until the screen flashes, then release all buttons.

I found you had to be alert to release the buttons as soon as the screen flashed. It also seemed to take quite a long time for the button presses to be registered.

Once in the Clockwork recovery the Cyanogenmod wiki instructions were pretty simple:

  1. Use the volume rocker to move between the options and press the power button to select
  2. Optionally back up the current ROM (I did not want a backup so I skipped this step)
  3. Select wipe data/factory reset

The wiki then offers a couple of options for flashing the ROM. I found only one worked: sideloading.

  1. Still in Clockwork recovery choose install zip > install zip from sideload
  2. On the PC run: sudo adb sideload cm-11.zip (note I renamed the Cyanogenmod ROM zip file for ease; use the name of your download)
  3. In Clockwork recovery choose reboot system now

You should now be enjoying the Nexus 7 booting into Cyanogenmod. Rather than linger I proceeded with the Google Apps package, and again the sideload method was the one that worked.

  1. Reboot into Clockwork recovery
  2. Choose install zip from sideload
  3. From the PC: sudo adb sideload gapps.zip
  4. Restart device

At this point you can start setting up the fresh new operating system.

How is it working out?

For my purposes there is no practical difference working within the Cyanogenmod environment. The things I noticed while setting things up were:

  • I chose to create and log in with a Cyanogenmod account
  • I then added the Google account
  • Downloading apps from the Play Store required a little thought so as not to clog up the Nexus again
  • The Cyanogenmod themes engine was quite diverting, but not all the ones I tried worked well on this device. I ended up with the Android L theme pack by tung91 which gives a pleasing pseudo-Lollipop experience


Performance is now back to KitKat levels. The tablet is responsive(-ish) and pleasant to use for the navigation and media consumption. I have been careful to only install apps that I really do use on the tablet rather than the full gamut.

One application that really seemed to slow things down was Google Chrome. Admittedly it was the Chrome Beta, but this chimed with comments I had seen on forums. The Cyanogenmod Browser would have been an adequate replacement, but in the brief period of use it seemed rather unpolished and old fashioned. Instead I reverted to my old Symbian favourite, Opera. So far it has been a good experience with fast rendering, good tab management and a nice interface. Giving up on the shared browser history and bookmarks might be a bit of a loss, but if it delivers a responsive user experience it is a sacrifice worth making.

Creating and Deleting Meetings: Adding and removing meetings in Outlook using PowerShell part 3

In the previous post Getting PowerShell ready to work with Exchange: Adding and removing meetings in Outlook using PowerShell part 2 we looked at preparing the script to create meetings in an Exchange calendar. Now that the connection has been created to the appropriate calendar we can start creating meetings (which is the easy bit) and deleting them when necessary (which is not so easy).

There are a number of tutorials on using PowerShell to create a meeting in an Exchange calendar, and include the necessary participants.  Two I found particularly helpful were Aman Dhally’s Powershell and Outlook: Create Calendar Meetings using Powershell function and Mike Pfeiffer’s Creating Calendar Items with PowerShell and the EWS Managed API (which includes an interactive script that is great for testing).

In our case the details of the appointment were captured through a separate system and held in a SQL Server table.  The script used a view of that table to select the appointments that had not yet been processed and read them into an ADO.net data table. This is not really the focus of this series of posts so I have not included the detail of accessing the data from the database nor of constructing the body of the meeting record using token substitutions. Instead we will focus on the mechanics of creating a meeting record.

Meeting Parameters

A meeting record consists of 5 parameters added to the Exchange Appointment object ($appointment):

  • The title of the meeting record: $appointment.Subject
  • The body of the meeting record: $appointment.Body
    • Note: this accepts HTML content by default (unlike a regular email record)
  • The meeting start date/time value: $appointment.Start
  • The meeting end date/time value: $appointment.End
  • And what makes it a meeting rather than an appointment is the inclusion of participants. In our case required participants: $appointment.RequiredAttendees

Typically these values will be generated by the PowerShell script, and in our case the values come from the SQL Server table. The following code snippet assumes that these values are in the $row object.
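A sketch of the creation step; $exchService is described in the notes below and the column names on $row are illustrative:

    # create a new appointment against the Exchange connection
    $appointment = New-Object Microsoft.Exchange.WebServices.Data.Appointment($exchService)
    $appointment.Subject = $row.Subject
    $appointment.Body    = $row.BodyHtml    # the Body accepts HTML by default
    $appointment.Start   = $row.StartTime
    $appointment.End     = $row.EndTime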

Notes:

The $exchService is the Exchange connection; see the previous post on creating this object.

Because RequiredAttendees is a multivalue construct it needs to be assigned with a pipe rather than a simple assignment as in the other parameters.
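For example (AttendeeEmail is an illustrative column name):

    # RequiredAttendees is a collection, so pipe the address(es) into Add()
    $row.AttendeeEmail | ForEach-Object { [void]$appointment.RequiredAttendees.Add($_) }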

The meeting record is created, and Required Attendees notified, when the $appointment.Save command is executed.
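In EWS terms that looks something like this; SendToAllAndSaveCopy is what triggers the invitation emails:

    # saving creates the meeting and sends the invitations to attendees
    $appointment.Save([Microsoft.Exchange.WebServices.Data.SendInvitationsMode]::SendToAllAndSaveCopy)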

Deleting a Meeting Record

Most tutorials stop with the creation of the meeting (or appointment), and in many cases this is enough.  Unfortunately in our case meetings were not only initiated in the online booking system queried by this script, they could also be deleted there. We therefore needed a mechanism for the script to delete them from the Exchange calendar too.

Designing a view in SQL Server to identify the meetings that had been cancelled was trivial. The real problem was at the Exchange end. In order for the script to delete the meetings we needed to be able to find which meeting records had been created in Exchange for each booking in the online system. This is where advice on the internet started to run very thin.

Step 1: Identify the Meeting Created in Exchange and store the identifier

There are a couple of different ways to identify a meeting in Exchange, but the one described here should retrieve a globally unique identifier that will persist even if a meeting is edited in Exchange. It also retrieves a text string that can easily be stored in whatever system underpins the script, in our case the table of bookings in SQL Server.

To capture the GUID we first need to extend the Exchange connection object with an extended property set
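The property in question is the meeting's CleanGlobalObjectId (property 0x23 in the Meeting property set, a binary MAPI property); a sketch of the setup:

    # define the CleanGlobalObjectId extended property and add it to a property set
    $CleanGlobalObjectId = New-Object Microsoft.Exchange.WebServices.Data.ExtendedPropertyDefinition([Microsoft.Exchange.WebServices.Data.DefaultExtendedPropertySet]::Meeting, 0x23, [Microsoft.Exchange.WebServices.Data.MapiPropertyType]::Binary)
    $psPropSet = New-Object Microsoft.Exchange.WebServices.Data.PropertySet([Microsoft.Exchange.WebServices.Data.BasePropertySet]::FirstClassProperties)
    $psPropSet.Add($CleanGlobalObjectId)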

With this in place we can then capture the GUID after the meeting is created
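Along these lines, where $psPropSet is the property set defined above:

    # re-load the saved appointment asking for the extended property,
    # then Base64 encode the binary value for storage
    $appointment.Load($psPropSet)
    $GoidValue = $null
    if ($appointment.TryGetProperty($CleanGlobalObjectId, [ref]$GoidValue)) {
        $CalIdVal64 = [Convert]::ToBase64String($GoidValue)
    }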

The variable $CalIdVal64 is a base 64 encoded text string which represents the binary GUID. The text string can now be stored with the booking information and can be used to find and delete the meeting if necessary.

The full code snippet is available at the end of this post.

Step 2: Using the GUID to delete a meeting from Exchange

As with creating meetings, the script uses a database view to access details of meetings that need to be deleted from Exchange. In the code snippet (sketched after the list below) these details are in the $row object:

  • $CalIdVal64 is the base 64 encoded GUID captured when the meeting was created
  • $CleanGlobalObjectId is the extended property object created when defining the Exchange connection
  • $Calendar is the Exchange calendar object the script binds to
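A sketch of the deletion step (the CalIdVal64 column name on $row is illustrative):

    # find the meeting whose CleanGlobalObjectId matches the stored Base64 GUID
    $filter = New-Object Microsoft.Exchange.WebServices.Data.SearchFilter+IsEqualTo($CleanGlobalObjectId, $row.CalIdVal64)
    $view = New-Object Microsoft.Exchange.WebServices.Data.ItemView(5)
    $found = $Calendar.FindItems($filter, $view)
    foreach ($item in $found.Items) {
        # bind to the full appointment and cancel it
        $a = [Microsoft.Exchange.WebServices.Data.Appointment]::Bind($exchService, $item.Id)
        [void]$a.CancelMeeting()
    }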

The meeting is cancelled when the $a.CancelMeeting() command executes and all Required Attendees are notified by email automatically.

The Full Script Skeleton

Because the script relies on an external store of meeting details it is difficult to provide a drop-in-and-run script.  It would be possible to write the commands to create and delete a meeting as functions, and this is what we have done in production, but it does not make the code significantly more reusable and it is more difficult to follow. So what follows should be seen as a skeleton that can be fleshed out with your own external data sources, or dismembered and put into separate scripts to create and delete meetings.
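With that caveat, here is the skeleton, reassembled from the snippets in this post and part 2; the file paths, account name and the $newBookings/$cancelledBookings collections are placeholders for your own configuration and data access:

    # --- EWS setup (see part 2); adjust the DLL path to your installed version ---
    Import-Module "C:\Program Files\Microsoft\Exchange\Web Services\2.0\Microsoft.Exchange.WebServices.dll"
    $username = "bookings@example.com"
    $passfile = "C:\scripts\password.txt"
    $secpass  = Get-Content $passfile | ConvertTo-SecureString
    $creds    = New-Object System.Management.Automation.PSCredential($username, $secpass)
    $password = $creds.GetNetworkCredential().Password
    $exchService = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2010)
    $exchService.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials($username, $password)
    $exchService.AutodiscoverUrl($username)   # or set $exchService.Url directly

    # --- extended property used to find meetings again later ---
    $CleanGlobalObjectId = New-Object Microsoft.Exchange.WebServices.Data.ExtendedPropertyDefinition([Microsoft.Exchange.WebServices.Data.DefaultExtendedPropertySet]::Meeting, 0x23, [Microsoft.Exchange.WebServices.Data.MapiPropertyType]::Binary)
    $psPropSet = New-Object Microsoft.Exchange.WebServices.Data.PropertySet([Microsoft.Exchange.WebServices.Data.BasePropertySet]::FirstClassProperties)
    $psPropSet.Add($CleanGlobalObjectId)

    # --- bind to the default calendar of the nominated account ---
    $folderid = New-Object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Calendar, $username)
    $Calendar = [Microsoft.Exchange.WebServices.Data.CalendarFolder]::Bind($exchService, $folderid)

    # --- create meetings for new bookings (replace with your own data access) ---
    foreach ($row in $newBookings) {
        $appointment = New-Object Microsoft.Exchange.WebServices.Data.Appointment($exchService)
        $appointment.Subject = $row.Subject
        $appointment.Body    = $row.BodyHtml
        $appointment.Start   = $row.StartTime
        $appointment.End     = $row.EndTime
        $row.AttendeeEmail | ForEach-Object { [void]$appointment.RequiredAttendees.Add($_) }
        $appointment.Save([Microsoft.Exchange.WebServices.Data.SendInvitationsMode]::SendToAllAndSaveCopy)
        # capture the GUID and store it against the booking record
        $appointment.Load($psPropSet)
        $GoidValue = $null
        if ($appointment.TryGetProperty($CleanGlobalObjectId, [ref]$GoidValue)) {
            $CalIdVal64 = [Convert]::ToBase64String($GoidValue)
            # write $CalIdVal64 back to the bookings table here
        }
    }

    # --- cancel meetings for cancelled bookings ---
    foreach ($row in $cancelledBookings) {
        $filter = New-Object Microsoft.Exchange.WebServices.Data.SearchFilter+IsEqualTo($CleanGlobalObjectId, $row.CalIdVal64)
        $found = $Calendar.FindItems($filter, (New-Object Microsoft.Exchange.WebServices.Data.ItemView(5)))
        foreach ($item in $found.Items) {
            $a = [Microsoft.Exchange.WebServices.Data.Appointment]::Bind($exchService, $item.Id)
            [void]$a.CancelMeeting()
        }
    }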


Getting PowerShell ready to work with Exchange: Adding and removing meetings in Outlook using PowerShell part 2

In the previous post, Adding and removing meetings in Outlook using PowerShell,  I discussed why we had chosen to use PowerShell to manage appointments in Outlook.  In this post we will look at how to set up PowerShell to perform these tasks, specifically:

  • enabling the EWS Managed API
  • using encrypted passwords with EWS
  • working with the calendar

The full script is listed at the bottom of the post.

Enabling the EWS Managed API

Microsoft’s EWS Managed API is the core mechanism for interacting with the Exchange Server. The code is now available on the OfficeDev github account which links to the Windows Installer downloads.  There is quite a lot of documentation on MSDN and other Microsoft outlets as well as third party sources.

Installing the package on the server where the PowerShell code will run was straightforward: download the MSI file and run the installer. The only thing to ensure is that the chosen version matches the Exchange server you are connecting to.

Once installed on the server, the EWS Managed API needs to be made available to PowerShell.  While it could be added to the profile, it is simpler to include it in the script.
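For example (adjust the path to match the version of the API you installed):

    # load the EWS Managed API assembly into the session
    Import-Module "C:\Program Files\Microsoft\Exchange\Web Services\2.0\Microsoft.Exchange.WebServices.dll"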

Once the Import-Module cmdlet has run all the API features are available to PowerShell.

Using Encrypted Passwords with EWS

The aim of the script is to create appointments in the primary calendar of an Exchange account accessed by service administrators.  In order to add appointments into the calendar the script needs to authenticate with the correct identity. We did consider running the script under that identity, but that would have meant granting permissions to other machines, databases and services that did not make sense.

Fortunately the EWS API allows the script to create a connection to an account by passing a plain text username and password.  Clearly this is not really acceptable, so we have used the technique for creating an encrypted password file described by Todd Klindt (the PowerShell v2 version). Once the encrypted password file was on the disk the trick was getting EWS to accept it.

Unfortunately this took a fair bit of trial and error.  Searching the web offered a number of different ways to create a connection, but most of them did not work. The solution was ultimately to create a credentials object using the secure string and then extract the password as a string variable that is then used to create the WebCredentials object. The four steps (combined in the sketch after this list) are:

  • get the contents of the encrypted password file (referenced by the $passfile variable) and turn it into a secure string

  • create a credentials object and extract the password

  • create an Exchange Service object (note this is for Exchange 2010)

  • use the password variable (and a username variable) to create credentials for the new Exchange Service object
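Putting the four steps together, a sketch (assuming $passfile and $username are defined earlier in the script; the final line stands in for however you point the service at your Exchange server):

    # 1. read the encrypted password file and turn it into a secure string
    $secpass = Get-Content $passfile | ConvertTo-SecureString
    # 2. create a credentials object and extract the plain-text password
    $creds = New-Object System.Management.Automation.PSCredential($username, $secpass)
    $password = $creds.GetNetworkCredential().Password
    # 3. create an Exchange Service object (Exchange 2010 in our case)
    $exchService = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2010)
    # 4. use the username and password variables to create the service credentials
    $exchService.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials($username, $password)
    # locate the Exchange server (or set $exchService.Url directly)
    $exchService.AutodiscoverUrl($username)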

Now that the Exchange connection is established we can bind to the calendar.

Working with the Calendar

The script then binds to the default calendar for the nominated account.  We have used this approach because most users are not too familiar with multiple calendars, and on this service account this default calendar is used primarily for managing these appointments.
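A sketch of the binding, reusing the $exchService and $username variables from above:

    # bind to the default Calendar folder of the nominated account
    $folderid = New-Object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Calendar, $username)
    $Calendar = [Microsoft.Exchange.WebServices.Data.CalendarFolder]::Bind($exchService, $folderid)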

An enhancement would be to check for the successful creation of the $folderid object, but as this is binding to the default calendar there is no real danger that the calendar will not be found as long as the credentials authenticate successfully.

We are now ready to create (and delete) entries in the calendar, which will be the subject of the next post, Creating and Deleting Meetings: Adding and removing meetings in Outlook using PowerShell part 3.

The script so far
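Assembled from the snippets above, the setup reads something like this (paths and account name are illustrative):

    Import-Module "C:\Program Files\Microsoft\Exchange\Web Services\2.0\Microsoft.Exchange.WebServices.dll"
    $username = "bookings@example.com"
    $passfile = "C:\scripts\password.txt"
    $secpass  = Get-Content $passfile | ConvertTo-SecureString
    $creds    = New-Object System.Management.Automation.PSCredential($username, $secpass)
    $password = $creds.GetNetworkCredential().Password
    $exchService = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2010)
    $exchService.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials($username, $password)
    $exchService.AutodiscoverUrl($username)
    $folderid = New-Object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Calendar, $username)
    $Calendar = [Microsoft.Exchange.WebServices.Data.CalendarFolder]::Bind($exchService, $folderid)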


Adding and removing meetings in Outlook using PowerShell

Over a couple of posts I plan to describe how I have used PowerShell to add and, more importantly, cancel meetings in a third party’s Outlook Calendar.

As is typically the case for these sorts of posts, it represents a distillation of and extension to a number of blog posts and forum answers that I have trawled through over a number of days working on this problem.  Unusually, I did not find any one source that provided the basis for the solution. Instead there are a number of sources that address various aspects of the problem, often several contributors saying pretty much the same thing. However, I have been struck by the relative paucity of PowerShell specific guidance on this sort of thing, and have found myself having to translate C# code into PowerShell on more than one occasion. Perhaps this indicates that this approach to managing meetings in Exchange is not best practice, but as PowerShell is starting to pervade all aspects of the Microsoft stack I thought it might be interesting to pass on my experience of stitching together hints and pointers from many sources.

The scenario

This development is part of a much larger service management system. As part of the service clients can book a range of appointments with relevant practitioners either directly online or through reception staff who will take the details and enter them into the online system on the client’s behalf. The online system handles the allocation of time slots among practitioners and appointment types, but here the requirement was that the practitioners wanted their appointments to appear in their personal Outlook calendar, and for cancelled appointments to be properly flagged.

In other settings it might have been appropriate to use a shared calendar or for practitioners to add a SharePoint calendar. Alternatively a separate interface within the online system could have provided personalised information to each practitioner. However these practitioners were often accessing their workload data remotely, and they have considerable autonomy in their choice of technologies to support their work, so in our setting it was necessary for the practitioners to have appointments in their own personal Exchange calendar without using any additional or added calendars.

The solution, inspired by a similar system developed by colleagues at the University of St. Andrews, was to create all the Exchange appointments in a central calendar managed by the admin team. The practitioner would then be added as a required attendee at the meeting. This would trigger the appointment to appear in the practitioner’s calendar. If the practitioner deleted or edited the appointment this would be flagged in the central calendar for the admin team to investigate. When appointments were cancelled in the central calendar the practitioner would see that the appointment had been cancelled in their calendar and could remove the entry.

Why PowerShell?

Given that the bookings that need to be synchronised with an Outlook calendar are made through an online form, PowerShell is not the only, or even the most obvious, option. However, in our setting it was an attractive one for various reasons:

1 PowerShell is self-contained

As we shall see it does require some additions, but fundamentally it does not rely on the CMS or any other system to perform these tasks; all it needs is access to the source data and the Exchange Web Services. While we run the script on the same host as the CMS this is not necessary and we could, in theory, have a dedicated VM just running our scheduled PS scripts.

2 it is easy to schedule

While the CMS has support for scheduling actions, as we have access to the host OS it is simpler to set up scheduled tasks in Windows than to create something that would run from the CMS. There are already other PS scripts scheduled so it also makes it simpler to choreograph them when they are all in the one place.

There is an argument for making the setting and cancelling of appointments a real time push from the online forms to Outlook, and this would be possible if we developed something within the CMS. However, in our environment there are disadvantages to a real-time sync. Very often the booking is part of a discussion, or at least deliberation, where the client might be exploring several options. In this case bookings can be made and then cancelled almost immediately as the client weighs up the implications of the booking. The appointment being booked is also at least one and often several days in advance for a fixed number of time slots, so the practitioners do not need to know immediately who is attending a particular session, they just need to know in time. There is no danger of the delay in synchronisation causing problems with bookings as all bookings are handled through the CMS.

3 it is easy to write

The relative ease or difficulty of any scripting task is primarily a function of the developer’s skill and experience in a particular language and environment. Working with the Microsoft stack there are a wide range of languages and frameworks to choose from. When you add the additional complexity of working within a software platform like SharePoint or DNN, developers experienced in the underlying language often find the environment makes life even more complex; just ask a .NET developer what it is like to work inside SharePoint for the first time.

In this environment PowerShell can play the role of common denominator or lingua franca. In a team where developers and devops work in different languages and on different software platforms on the Microsoft stack, it can be difficult to write custom code that can be supported by anyone in the team. As PowerShell becomes ever more ubiquitous across the stack, most professionals have some exposure to the language and can quickly get to grips with a self-contained script.

The anatomy of the script

The essence of the script is quite simple, performing two basic actions:

  • Read the CMS database for appointments that have yet to be created in Exchange and create them
  • Read the CMS database for appointments that have been created in Exchange and have subsequently been cancelled and cancel them

(in our production environment these two actions are actually separate and embedded in larger scripts that perform a number of other actions; I will highlight the code that is common to both)

In the following posts I will describe how to get PowerShell ready to work with Exchange (part 2) and how to create and delete the meetings themselves (part 3).

Giving up on OneNote

The Cloud, Subscription Software and Trust

After a brief flirtation with OneNote I have decided it is not for me.  Actually it is a great product and it works very well, even on the very mixed environment I choose to work in, i.e. Windows (mostly 8.1 and RT), Mac, Android and Ubuntu.  There are clients for most of the operating systems I use and there is always the web client, which also works very well.

So if it works so well with everything, why not commit to OneNote? I must confess that after searching for a solution that would work on all my devices AND offer offline editing and sync I had thought that OneNote would be the one.  However what has put me off boils down to an issue of trust.

The seeds of Doubt

After updating the OneNote client on my Macbook I have not been able to access the OneNote notebooks on the university’s Office365 OneDrive for Business.  I can still access them perfectly well on my Windows machines (both personal and at work) and on the web, but when I try to access them on the Mac client I am asked to activate with my Office365 subscription.  As far as I am aware I have a perfectly good subscription that works on these other devices, but for some reason I cannot access these notebooks on my Mac through the desktop client. I have commented on this in the Apple App Store and on the Microsoft Community site, and the lack of response probably indicates that this is an issue that other people are not facing. [Updated 9/12/2014] Further information actually indicates that the behaviour I am seeing is what is supposed to be happening and this is what Microsoft want. Sadly the only place I found this information was on OneNote-blog.de.  This being a deliberate change and not an error might mean that some of the text below is inaccurate, but I believe that this being a deliberate change/clarification by MS actually strengthens the substantive argument.

So what is the big deal? I can still access these notebooks via the web interface and my notebooks on the free OneDrive personal are still accessible so why give up on all that OneNote has to offer? As I said above, it all boils down to trust.

As far as I can tell the problem with the Mac client is that it is not finding the Office365 subscription properly.  In other words, a glitch in Microsoft’s authentication has locked me out of my content on this client.  I can still get into it in other ways, but what if the glitch prevented that?  If I am going to put a lot of content, and important content, into OneNote, I don’t want to be at the mercy of some company’s subscription processing system.  Fundamentally I want to own my content.

The trend seems to be towards subscription access to pretty much everything online. I am pretty content with the idea of paying for access to media, as this is quite similar to paying to listen to a personalised radio station, but I want to keep the stuff I really like so I know I can access it even if I don’t have a live subscription.

The idea of renting software is rather different.  When I do work round the house I will occasionally rent a specialist tool to perform a specialised task.  The regular day to day stuff, on the other hand, gets fixed with tools I own.  They may not be the best tools (and sometimes not even the appropriate tools) but they are my tools in my toolbox.

Leaving aside the drift towards making the bread and butter office productivity apps a subscription product that could stop working when the real owner determines the subscription has lapsed, my experience with OneNote on the Mac has brought home to me that

  1. Microsoft is storing my content in the cloud and is allowing me to update it and sync it to various devices
  2. Microsoft owns the tools that allow me to access my content and, in the case of the Mac, can choose to prevent me from accessing my content

The fact that Microsoft is storing my content is not too much of an issue by itself.  I use a number of different cloud storage services of different types. Where there is an issue is that my content in OneNote form can only really sit on Microsoft’s cloud services, whereas most of the other content I have can be swapped around on any of the cloud storage platforms. Well, I suppose technically speaking I could move the OneDrive Personal files around using another cloud service as long as they appeared to the client to be local files. Or at least that is the way the client works at the moment.

And there’s the real problem. All my content is locked away in a proprietary format in a way that, certainly in the case of OneDrive for Business, I don’t really understand. To a degree this is true of the other files I have, .docx .png .odt .html, they all to a greater or lesser extent need a program to make them usable, but the point is there is some choice. And that choice includes options that I can keep rather than rent.

So if not OneNote then what?

Keep it all in a bunch of word processor files

The beauty of OneNote, from my point of view, was that it provided a single place for a lot of structured notes about a lot of things. In most cases I could have written up the notes in a word processor, but the concept of separate but related pages is much nicer than either a section in a document or a completely separate file. I have worked with complex Word documents which included child documents, but that does not really match the semantic structure here and is more for managing the creation and maintenance of big documents rather than note taking.

I hear Evernote is really good

And I am sure it is, but I have never tried it. However in the context of this particular epiphany I am afraid that another subscription service is not that attractive.

What about Google Keep or Simplenote? These are both services I use for ephemeral notes that I don’t mind losing.  The structure is also very simple so they are easily exportable. But this simplicity means they are not really suitable for the more complex structured notes that OneNote can deliver.

What I really need is …

Reflecting on my dissatisfaction with OneNote I have begun to formulate a wish list for an approach to deliver what I had hoped OneNote would provide:

  • the data must be in a format that does not tie me in to one piece of software
  • I must be able to store the data wherever I need, in the cloud (and any cloud at that) or on my own storage (storage I have bought and own, not just rent)
  • all these storage options must be able to synchronise, and synchronise without relying on a particular provider
  • the content must be available, and updateable, on all the devices I use, OSX, Windows, and Android
  • and, as I live in an area of the UK which is not blessed by consistent 3G coverage let alone 4G, the content must be available offline on all the devices and especially an Android smartphone

I am not entirely sure what the solution is, but the plan is to follow up with posts that explore how close I get to achieving this.

Reviewing for Black Magic Solutions for White Hat SharePoint

I have been working on reviewing drafts for the EUSP Black Magic Solutions for White Hat SharePoint project. The drive behind the project is to share the amazing things that can be done at the client side of SharePoint without the need for server access, Visual Studio or hard programming. Instead these solutions use the cornerstone of web programming, javascript, to manipulate both the end user experience and the back-end SharePoint data store, producing applications and interfaces that stretch the functionality of SharePoint into territory you would expect to pay thousands of pounds for from “proper” software houses.

So far I have reviewed Paul Tavares’ SharePoint in Agile: Managing an Agile Development Project. Even as someone who spends quite a bit of my time working with jQuery this was a revelation. The application is elegant and effective in itself and also achieves the aim of inspiring me to think of lots of other applications for these techniques, not only in SharePoint but other web applications I work on.

This is another really valuable initiative from the Nothing But SharePoint team and everyone should look out for the final version as it becomes available. For anyone disappointed with what seem to be the limitations of SharePoint, or who feels that the barriers to development in the SharePoint corner of .NET are too high, there are bound to be plenty of solutions to try.

It has also been a reason to use Yammer for serious work.  Reviewing has been managed through the SPYam network.  While I have read a bit about Microsoft’s movement towards “social SharePoint” through Yammer, I had not really used it before.  Compared to other platforms/environments I have used for reviewing it has been a lot more free-form, and does tend to get a bit disjointed even though there are only a few active participants.  As a serious social space it is pretty nice, but I think I need to commit a lot more effort to get the most out of it and I do wonder how I would manage dipping in and out if I was in a much larger/more active community.  So I will have to add it along with Twitter, Facebook, LinkedIn and Feedly as places to keep track of.

Synchronising through the cloud, part 2

In 2010 I reflected on the trials of synchronising across four platforms: Ubuntu Linux, MS Windows 7, Android and Symbian S60v3.  At that time it was Symbian that was causing me problems, but this is no longer the case.

I now have a Nokia N8 Symbian^3 Anna phone as my main phone, and I find I am using it more than my HTC Android phone; or rather, it is my first port of call for anything media related. Not only is the camera fantastic, I find the slightly smaller screen better than the Desire and the audio is better.  For podcasts Nokia Podcatcher is better than the Listen app and for just listening to music I find the old fashioned LCG Jukebox very comfortable (even though the interface is not great with touch). And of course the sound quality, both speaker and headphones, is better on the Nokia.

With the Swype keyboard and QuickOffice it is also better for serious work on documents and spreadsheets.  Interestingly, I also use Swype as my main keyboard on the Android phone and I also have the full version of QuickOffice there, but the experience is not as good.  Although to be honest the Swype keyboard is great on both devices; it is the QuickOffice implementation that I just cannot get the hang of on the Desire.

The original post concentrated on synchronising content between all four platforms, or rather making the same files available.  In this domain, the main change is the emergence of Dropbox as the service of choice.

In the previous post it was Symbian S60V3 that was the problem, however I later discovered that with DropDav I could create a webdav connection in the default File Manager that allowed simple copying of files to and from my Dropbox account.  This is still available in the Nokia N8 and it is something that I still use. But, thanks to the All About Symbian podcast I have discovered CuteBox (currently free from the Ovi/Nokia Store) which matches or exceeds any Android app for convenience in accessing and updating Dropbox files.  In fact, because I so rarely use the HTC Desire to work with files any more this is now my main way of accessing Dropbox on the move.

There are plenty of other contenders in the cloud storage space, and I have accounts with box.net, SugarSync and Ubuntu One, but it is Dropbox that currently provides me with a solution for every environment I use.

Publishing InfoPath Forms to SharePoint 2007 “The following URL is not valid:”

We have been making quite a bit of use of InfoPath forms recently and have published successfully to a number of sites across our farm.  However yesterday I tried to publish to a new site and got this error:

The following URL is not valid: https://…….

After a lot of googling I confirmed that this was not an uncommon error, but none of the scenarios matched our situation; i.e. we already have a root site and have been publishing successfully for some time, and could still publish to other sites.

After trying some tests, the culprit seems to be the Office SharePoint Server Publishing Infrastructure feature!

  • trying to publish to sites which have Office SharePoint Server Publishing Infrastructure enabled throws this error
  • publishing to a subsite which does not have Office SharePoint Server Publishing Infrastructure enabled works fine

It would have been nice to have seen this documented somewhere when I googled, but I suppose that is the danger of just searching.

URI encode Source attribute in SharePoint 2007 Data View Web Part calling an InfoPath form

In the previous post, Javascript in Data View Web Part XSLT, I showed how to use javascript to do things that XSLT alone cannot do in the SharePoint 2007 Data View Web Part.

The primary motivation for investigating this was wanting to

  • add a DVWP that showed items from a Forms Library
  • include in that DVWP a link that would open the InfoPath web form
  • direct the user back to the DVWP when he/she closed the InfoPath web form (including any querystring that could be used to filter the DVWP)

Creating a link to an InfoPath web form is tricky enough because of the syntax, but the real challenge was URI encoding the current page URL so that it could be used as the Source attribute and direct users back to the DVWP.

This approach calls a javascript function from the link column in the DVWP that constructs the InfoPath friendly URL and redirects the browser to that address.

This function URI encodes both the URL and the query string and then inserts them as the Source attribute.

The function takes two values from the DVWP row:

  • trgt = @FileDirRef: the function uses trgt to calculate the subsite address for the /_layouts/FormServer.aspx URL using trgt.substring(0, trgt.lastIndexOf("/")) (typically the DVWP will be showing the contents of the form library, so this works)

  • fn = @FileRef: the full file name of the form that is to be opened
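A sketch of the kind of function described (the function name is illustrative; @FileDirRef and @FileRef are passed in from the DVWP's XSLT as server-relative paths without a leading slash):

    // redirect the browser to an InfoPath-friendly URL for the form,
    // with the current page (and querystring) as the Source attribute
    function openInfoPathForm(trgt, fn) {
        // the DVWP shows the form library, so stripping the last path
        // segment from @FileDirRef gives the subsite address
        var site = "/" + trgt.substring(0, trgt.lastIndexOf("/"));
        // URI encode the current URL and querystring for the Source attribute
        var source = encodeURIComponent(window.location.pathname + window.location.search);
        window.location.href = site + "/_layouts/FormServer.aspx" +
            "?XmlLocation=" + encodeURIComponent("/" + fn) +
            "&Source=" + source;
    }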

Javascript in Data View Web Part XSLT