I received an email from Google a few weeks ago informing me that soon the @gardiner.net.au accounts hosted in Google Apps will be able to access virtually all of the suite of regular Google Applications (e.g. Blogger, Reader, etc.). About time!
“Google Apps” is the name of Google’s hosted app service for a specific domain – not to be confused with Google Applications which you can access with just a regular Google account.
Up until now, Google Apps users were limited to a small subset (Mail, Calendar, Docs, Chat and Sites). This also meant that if you wanted to access non-Apps applications, you had to have a separate Google account, though it could have the same name as your Google Apps account.
This change is a good thing in that now the one account will be used to log in to both the App services as well as the other applications.
If you have a Google account with the same name as your Google Apps account, the non-Apps account will be renamed so that it is now unique. Any services/applications that were attached to the non-Apps account will remain with that account.
So while all the services that I used to access with my old Google account are still there, they are now attached to the renamed account instead.
One thing that is helpful in the interim is that Google have recently added the ability to switch between multiple accounts (Windows Live IDs have had this for some time). However, I’d prefer to have all my services under the same account, and unfortunately there is no automated migration path to transition application settings between accounts.
The only solution is to move settings over manually. For Reader, for example, I can export my feeds to an OPML file and then import them into the new account (which is fine for feeds, but doesn’t migrate your “shared items” or the “people you follow”). For other services like Blogger, Google Groups and Google Code, you need to re-register with the new account.
All this is a pain for me, but the other tricky part is that I’m not the only person using my domain. Google does tell me that there are other family members who are in a similar situation but for privacy reasons they won’t tell me exactly who or what those users are. I’ll have to wait for them to contact me so I can help them out. At least my domain just has a few users – this is going to be a much larger job for enterprise customers!
So did Sunday live up to Saturday’s standard? I think it did pretty well.
Today’s highlight was probably Lama’s talk on the .NET Micro Framework. If I were developing embedded systems, this sounds like a very attractive option (especially compared to the tools and languages that I understand a lot of embedded development still uses).
An excellent weekend. It would be great to get more people to attend as I think they’ve missed out on a great professional development opportunity.
Your next opportunity to quench your CodeCamp thirst will be on November 21-22nd at (the slightly delayed) CodeCampOz in Wagga Wagga.
Today was day one of CodeCampSA 2010. It’s been really enjoyable and I think the talks I saw today are probably some of the best I’ve seen in Adelaide.
It’s been quite a community atmosphere, with some good discussion and interaction between speakers and audience. What has pleased me the most is the technical depth that the talks have taken. This is a ‘developer’ conference after all, so it is great to have some meaty presentations that give you something to chew on.
My personal highlights are probably the talks by James Chapman-Smith (Lambdas, Monads, LINQ & the Reactive Extensions) and Omar Besiso (Entity Framework 4.0: A Guide on using POCO Self Tracking Entities). James in particular has inspired me to take a closer look at Func<>, Action<> and how LINQ can be used in more places than you would think.
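To give a flavour of the sort of thing James was talking about (this is my own little sketch, not code from his talk), here’s Func<>, Action<> and LINQ working together – Select will happily accept any compatible Func<int, int>, and LINQ operators work over any IEnumerable<T>, not just collections:

```csharp
using System;
using System.Linq;

class FuncDemo
{
    static void Main()
    {
        // Func<int, int>: takes an int, returns an int.
        Func<int, int> square = x => x * x;

        // Action<int>: takes an int, returns nothing.
        Action<int> print = n => Console.WriteLine(n);

        // Select accepts our named delegate directly, and
        // Enumerable.Range generates the input sequence lazily.
        var squares = Enumerable.Range(1, 5).Select(square);

        foreach (var n in squares)
            print(n); // 1, 4, 9, 16, 25 on separate lines
    }
}
```

Once you start noticing that delegates are just values you can pass around, LINQ turns up in a lot of unexpected places.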
Even my own talk seemed to go well, which is pleasing (having to restart my SQL instance notwithstanding!).
My only quibbles lie not with the event, but the facilities. I can’t believe in the year 2010 that UniSA can’t get an electrician in to fit more power outlets in their lecture theatres - 4 outlets for the entire theatre is ridiculous. The data projector in our room was also pretty disappointing – very washed out and hard to read (even from the front row, and yes we did try to adjust it without success). I feel sorry for the students who have to put up with this all the time.
But I’m not going to let that diminish my enthusiasm. Now I just have to wait to see what the speakers on Sunday’s agenda can produce!
I received news yesterday that I’ve been selected to be a Technical Learning Guide (TLG) for the Hands-on-Labs and Instructor-Led Labs at the TechEd Australia 2010 conference on the Gold Coast in late August (I was eligible for this because I’m now an MCT).
Rob suggested that I apply, and I’m glad I took his advice! As a TLG I will be asked to do one or more of the following activities:
- assist attendees with the self-paced hands-on-labs.
- assist attendees while they perform an instructor-led lab.
- present an instructor-led lab.
I’m really looking forward to this. And just in case you didn’t make the connection about the picture – they’re Girl Guide biscuits.
For some reason I’d overlooked the fact that the instance of Team Foundation Server I was running on my Hyper-V server was still the release candidate instead of the RTM version.
Upgrading turns out to be relatively painless. I followed Johan’s suggestions.
To backup the data-tier I fired up sqlcmd and ran the following:
BACKUP DATABASE Tfs_Configuration TO DISK = 'Tfs_Configuration.bak'
BACKUP DATABASE Tfs_DefaultCollection TO DISK = 'Tfs_DefaultCollection.bak'
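Before tearing anything down, it doesn’t hurt to confirm those backup files are actually readable. RESTORE VERIFYONLY checks a backup without restoring it (the file names here assume the same ones used above):

```sql
RESTORE VERIFYONLY FROM DISK = 'Tfs_Configuration.bak'
RESTORE VERIFYONLY FROM DISK = 'Tfs_DefaultCollection.bak'
```

Each should report that the backup set is valid.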
I uninstalled just the TFS component, then ran setup.exe from the RTM media and, after allowing the setup to proceed, chose the ‘upgrade’ option. It’s nice to see that they ask you to confirm you have done a backup before the upgrade can continue.
Coincidentally, like Johan I also had a problem with a TFS workspace being in use – though I believe this was because I had changed the TFS Build process to run as a user account (previously it was running as SYSTEM). I used a variation on the same command he used (possibly different because my servers are just in a workgroup rather than a domain).
tf workspace /delete "1_1_TFS;NT AUTHORITY\SYSTEM" /login:tfsserver\username,password
Note that the TF.EXE command ships with the TFS client bits – I ran it from the machine where I run Visual Studio 2010, as my TFS server only has the server components installed.