Thursday, December 31, 2009

OpenDNS

OpenDNS - do you need it?
OpenDNS provides a free service to home users.  It also has options for small businesses and large corporations.  The free service provides quite a few benefits for the home user:
  • Faster Web navigation
  • Parental Control
  • Smart advice on mistyped addresses
  • Protection from phishing attacks
  • More...
How to set it up
The OpenDNS web site provides very good step-by-step advice on how to set up OpenDNS.  If you have a router, you can select the brand and then the model, and it tells you how to configure each one.
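
If you'd rather poke at it from a terminal first, the whole change boils down to two DNS server addresses (these are OpenDNS's published public resolvers).  A minimal sketch - writing to /tmp purely for illustration; on a real machine the addresses go into your router's settings or into /etc/resolv.conf:

```shell
# Illustrative only: show the /etc/resolv.conf format for OpenDNS.
# Writing to /tmp so nothing on this machine actually changes.
cat > /tmp/resolv.conf.example <<'EOF'
nameserver 208.67.222.222
nameserver 208.67.220.220
EOF
cat /tmp/resolv.conf.example
```

Once your machine (or router) points at those addresses, browsing to a deliberately mistyped address should land you on an OpenDNS suggestion page rather than a browser error.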

My experience
As you know, I run Ubuntu (Karmic Koala) on my laptop at home.  The first thing I noticed was a dramatically faster experience in Firefox.  I believe the other users in my household - all of whom run various MS operating systems - also noticed an improvement.  I am not using any of the additional features like Parental Control or blacklisting.

It's free - so give it a go.  If you don't like it, just revert to your previous settings.

Tuesday, December 8, 2009

Open Source - no money, but still a cost

Today there is a vast amount of software available at no monetary cost.  The software is usually covered by some form of licence.  One of the most popular licences is the GNU General Public License (GPL).

A lot of Open Source software is created to run on open platforms such as Linux.  Installing software on Linux is a little less straightforward than installing software under Windows.  For a start, Linux has quite a few different "flavours".  I use Ubuntu Linux, but there are several other options available for the personal user as well as corporations.

Under Ubuntu, some software is available for installation via the Ubuntu Software Center (under the Applications Menu).  Ubuntu also has the Synaptic Package Manager (under the System > Administration menu).

Ubuntu is maintained and distributed by Canonical, a company with a goal of providing quality open source software.  The software available under the Applications Menu has been tested to work with Ubuntu.  This also goes for software under the Synaptic Package Manager, if it isn't listed as "universe" or "multiverse".  There are also a lot of debug versions of software to assist developers.
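
If you're curious where "universe" and "multiverse" come from, each repository line in /etc/apt/sources.list names the components it enables after the release name.  A little sketch (the repository line below is just an illustrative Karmic-era example):

```shell
# A typical sources.list line: "deb", the archive URL, the release name,
# then the enabled components.  "main" and "restricted" are maintained by
# Canonical; "universe" and "multiverse" are community-maintained.
line="deb http://archive.ubuntu.com/ubuntu karmic main restricted universe multiverse"
# Fields 4 onward are the components:
components=$(echo "$line" | cut -d' ' -f4-)
echo "$components"   # main restricted universe multiverse
```

So when Synaptic flags a package as "universe" or "multiverse", it is simply telling you which of these component pools the package came from.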

Other Open Source Software
There are various forums on the web that discuss open source software.  Some of the software is not available under the software manager on your platform.  The authors usually provide information on how to download and install their software.  The process is not as simple as under Windows, where you usually download an exe file and double-click to run it.  Linux has (arguably) a slightly tighter security model than Windows.  You need to install software as a superuser (usually via the sudo command).  This is a bit like having Administrator access under Windows.

Sometimes Open Source software requires other software to run correctly.  If you install using the Ubuntu Software Center or Synaptic, any prerequisites are installed along with the software you select.  This is great.  Sometimes under Windows, you install software and it needs something else, but you don't know what...

Keeping Current
Ubuntu has an Update Manager that checks whether there are any updates required for your platform.  But it does more than that.  It also checks for updates for any software you have installed from the Ubuntu Software Center or the Synaptic Package Manager.  The process seems much quicker to me than Windows Update.
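
For the terminal-inclined, the command-line equivalent of the Update Manager looks like this (illustrative only - these commands need root access and a network connection):

```shell
sudo apt-get update     # refresh the package catalogue from the repositories
sudo apt-get upgrade    # install any available updates for installed packages
```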

Is it Useful?
I have found open source software to be as good as, or better than, commercial software.  OpenOffice provides basically the same functionality as Microsoft Office.  OK, there are some areas that are different, but for the general user it works fine.  There are plenty of options available for multimedia software.  For Ubuntu, there is a site called Medibuntu that provides a host of great packages, installation instructions and discussion forums.

Give it a go!
If you are running Windows, it is possible to boot your PC from a USB stick with Ubuntu installed.  This means you can run Ubuntu without making any permanent changes to your machine.  You can also run Ubuntu as a "dual boot" option alongside Windows.  If there are software products you use under Windows that you absolutely cannot do without, there is software under Ubuntu, such as WINE, that lets you install and run many Windows programs.

The cost?  Your time.  You need to search for software, read what people are saying in the various web forums, then understand how to install it (together with any prerequisite software).  If you aren't used to doing this yourself, find an Open Source friend, or use the forums.  I have found the people in the forums very helpful to newcomers.  You can quite often find an old discussion that answers your questions.

Thursday, November 12, 2009

Cloud computing - storm front approaching?

Well, we've had a few years of "cloud seeding" with a lot of hype and some reasonable applications.  However, let's think about it a bit.

What is Cloud computing?
For those who may not have heard this term, or haven't found an explanation - here's mine.  Cloud computing is where you access an application or service via the Web and your data resides on the provider's servers.  Usually, all you need is a PC with Internet connectivity and you're off and running.  Sometimes you download some code or the like, but true cloud computing means you should be able to access your service from any PC with Internet connectivity.

The Pros
If you are an individual or small company, then a service provided via the cloud can be very cost effective.  You do not need to invest in any new hardware or software.  You do not need to worry about backing up your data or disaster recovery (read the Cons before disagreeing).  You can usually scale the service up and down to meet your business needs.  Quite often you get access to a quality of service that you would not otherwise be able to afford.

You get things up and running much quicker than the conventional approach of buy, install, deploy.  You do not need to worry about upgrades or whether you have sufficient licences.

Seems like a pretty sweet deal...

The Cons
A recent (October 2009) incident in the USA - the T-Mobile "Sidekick" outage - highlighted at least one significant pitfall.  A telco offered a cloud data storage service via a third party.  The third party was acquired by Microsoft.  At some point after the acquisition the service went "down"... and stayed down.  Yes, that's right.  If your business was relying on accessing this data - say goodnight sweetheart.  It appears that Microsoft has recovered most of the data - some weeks later.  So for those people storing photos and other non-time-critical data - fine.

It is harder to integrate cloud services with those on your PC and with other cloud services.  Whilst I am hopeful that some agreed integration standard will be devised - it's not here yet.

What happens if the service provider decides to increase your costs?  At some point they probably will - you just need a plan B for when they do.  Do you stay with them?  At what point will you "pull the plug"?  Indeed, can you pull the plug?

Summing Up
Cloud computing can be of great benefit to smaller organisations - or even larger ones if it is supplying a non-critical service (like on-line training).  If you have a small business and are considering adopting a cloud service, do yourself a favour and see if you can periodically download your data.  Also see what download formats are available, so that the data is easily usable if need be.

I think the Microsoft incident has probably put the brakes on a lot of organisations that were considering taking up cloud computing services.  Perhaps the best way forward is for larger organisations to adopt cloud computing as an internal model to reduce the cost of maintaining PCs on everyone's desk - but that's another story.

Tuesday, October 27, 2009

I've Gone Ubuntu!

Well sports fans, I've finally done it - running my laptop with NO Microsoft software.  OK, here's my approach - and it worked well for me...

The Dual Boot
I was running MS Vista on my laptop (64-bit) and it wasn't too bad, but a few things irked me.  At home I have an old Windows 2000 laptop acting as a print server for my Canon and Lexmark printers.  When I print to them from Vista it takes a---g---e---s to spool the print job - sometimes even timing out!

With Ubuntu (Linux), it just works - nice one.  By setting up dual-booting, my plan was to try to use Ubuntu as much as possible unless I really had to go back to Vista for something.  After 1 week, I was weaned off Vista.  I then took the brave step of nuking my laptop and giving it totally over to Ubuntu.

The Good (great even)
Everything I'm running on my laptop is "free".  There are varying levels of "freeness", so you need to understand the different licensing models.  But for personal use, it's pretty much free.

I use the latest version of OpenOffice, which has the main applications for word processing, spreadsheets, presentations, personal databases, etc.  The applications read even the latest MS Office format files, but they tend to save in slightly older formats - however, I haven't found that to be a problem.  There are heaps of other applications, but I'm not doing the "kid in a candy store" trick of going "app crazy".  I tried Evolution, the email and calendar application that comes with Ubuntu.  It works quite well, but I have now chosen to move to two separate applications: for email I use Thunderbird, and for calendaring I use Sunbird - both from the Mozilla (Firefox) stable.

Ubuntu has an Updates Manager and a software application manager that comes with a catalogue of applications that have been "passed" by Canonical (the company who helps to manage the maintenance and development of Ubuntu).

I have a PDA running Windows Mobile 6.1 that I used to sync with Outlook under Vista.  I am now syncing it with Thunderbird and Sunbird - but it wasn't a walk in the park...

The Bad
Fortunately for me, I have some ancient Unix knowledge that has stood me in good stead as Linux is a Unix-like operating system.  I first tried to sync my PDA with Evolution.  Although there were a few forum articles claiming success, I'm afraid with my device (HTC Touch 3G) I had interesting results.  Sometimes it would work fine - Eureka, I've done it!  Then I'd sync it another time and it would hang.  Then my PDA would run extremely slowly - even when not connected to my laptop and even after a soft reset.  Please note, I had NOT installed any software on my PDA...

I then installed Thunderbird and Sunbird, then imported my email and appointments from Evolution - easy.  I found another application called FinchSync that will sync with a Windows Mobile device - sweet.  It requires a small FinchSync server app to be installed on your PDA - easy.  Then you set up as many sources as you like to sync with your PDA - and it works.

The Verdict
If you don't know much about computers, then find a friend who reckons they do.  Go to the Ubuntu site (there are plenty of other Linux variants to choose from, too) and create a bootable USB stick.  The instructions are pretty easy to follow.  You can then create a dual boot PC.  This allows you to boot to your Windows operating system or Ubuntu.  From Ubuntu, you can still access your Windows documents, spreadsheets, etc.  Then it's up to you.  Once you are happy that you have got the hang of Ubuntu - and only then - you can wave goodbye to Mr Gates and the gang.  Oh, one last point.  I invested in a USB drive to copy all my files (documents, etc) "just in case".  Ubuntu supports USB drives, keyboards, monitors, printers, etc.

Cheers for now - happy computing!

Update
If you want to easily try out Ubuntu on your Windows machine (without trashing Windows and your files), have a look at Wubi.

;-D

Tuesday, September 29, 2009

Something's Wrong...

In IT you are assured that at some point things will not go as expected.  There are software bugs, hardware failures and yes, even "human errors".  Here's a little story about what NOT to do...

Some time ago, when I was working for a large organisation, they used removable storage disks on their large computers.  These disks weighed several kilos and had to be de-mounted and stopped before an operator could open the unit, place the disk cover over the disk, turn the cover until it engaged, then remove the disk and place it into a locking base.  So the operation is not one that is done quickly.

Now one night, an operator saw that there were errors coming from one of the disks.  An accepted procedure is to ask the operating system to "swap" it.  The OS determines a disk that is not being used.  It then stops both drives and informs the operator which drives to swap.  The operator then uses the above-mentioned technique to swap the disks.

This is fine for "soft" errors, where there may be a slight problem in read/write head alignment, etc.  However, operators are also trained to look for signs of a "head crash" before moving a disk to another unit.  A head crash means that the read/write heads, which skim close to the surface of the dozen or so platters that make up a disk, have come into physical contact with the surface of the disk.  This is definitely a "hard" error.  The disk is unusable and the unit should be checked by a technician.

Back to the story...  So the operator swaps the disks.  He then gets another error on the original disk.  So he swaps it again, and again, and again.  By the time he has finished he has wrecked at least half a dozen disks and potentially the same number of drive units - because he didn't check for a head crash before the first swap.  When he swapped the damaged disk into a new unit, its damaged surface had the potential to damage that unit's read/write heads.  And because he then swapped a new disk into the drive that had suffered the head crash, that drive's (possibly damaged) read/write heads could in turn cause a head crash on the new disk.

So what can we do?
Processes and procedures need to be in place to prevent problems and to guide people in resolving them.  With regard to software bugs, there are techniques to build in "defensive code" that gracefully handles potential problems instead of failing catastrophically.  Providing log information to assist people in resolving issues is essential.  The software development life cycle (SDLC) must have checks and balances at each stage to pick up potential errors before the next part of the SDLC.  The earlier a problem/defect is detected, the cheaper and easier it is to correct.  I have seen development teams push functionality into production knowing that it has errors, leaving it up to production support to fix.  Who is best equipped to fix the errors: the developers, or people who have only read documentation on how the functionality should work?

With hardware, we need to establish a monitoring regime to highlight potential problems as early as possible.  Each hardware device will have statistics available from the vendor on Mean Time Between Failures (MTBF).  This gives you the "expected" failure rate over your installed base of these devices.  You also need to know the Mean Time To Repair (MTTR): how long it typically takes to fix each type of hardware device.  Of course, there are the whole areas of backing up data, disaster recovery and business continuity - but I won't cover them in this post.
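
As a back-of-envelope sketch of how you might use those vendor figures (all the numbers here are made up for illustration): with 100 devices, each powered on 8,760 hours a year, and a quoted MTBF of 50,000 hours, you should plan for roughly 17-18 failures a year across the fleet.

```shell
# Expected failures per year = devices x hours-per-device / MTBF.
# All figures are hypothetical, purely to show the arithmetic.
mtbf=50000          # vendor-quoted Mean Time Between Failures (hours)
devices=100         # installed base
hours_per_year=8760 # 24 x 365
awk -v m="$mtbf" -v d="$devices" -v h="$hours_per_year" \
    'BEGIN { printf "%.1f failures/year\n", d * h / m }'
```

That number feeds straight into planning: combined with MTTR, it tells you how much repair work (and how many spares) a typical year will bring.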

Common Sense
  • Eliminate problems/defects as early as possible
  • Have clear procedures for people to follow (& educate them)
  • Know your software and hardware - monitor it
  • Be prepared before something goes wrong - it's too late to "wing it" in the midst of a catastrophe!

Friday, September 18, 2009

Interface Simplification

One of the common problems I see in large (and not so large) organisations is the proliferation of interfaces.  Let's not get hung up on the type of interface in terms of the enabling technology, but rather look at the complexity that builds over years and even decades.

Why does it happen?
Organisations either buy or build applications.  Then there is a decision to move information between the applications.  An interface is born!  Let's call the first 2 applications A & B.  Then another application (called C) wants to exchange information with A, but not quite the same information as B does.  So another interface is created - the A-C interface.

You can see how over years, this network of inter-connectivity between applications can reach quite daunting complexity.  Now when you change an application, as well as making sure the change doesn't break the application's current functionality, you must also ensure all of the interfaces involving this application, also continue to function correctly.  Then, when you move a change into production, you have to ensure that all changed applications are updated in unison.  If just 1 application change fails, you have to back out ALL changes - think of the cost and possible disruption to your business!

How to simplify
Start with an application - it is probably best to select the one with the most interfaces.  Identify the interfaces where the application provides outgoing information/data.  Then analyse the data with the goal of combining the total data needs of all recipients into a single, wider feed of information that includes the data elements required by every recipient.

For example, let's say an application has 3 outgoing interfaces to recipient applications called X, Y & Z.  X requires data elements D1, D2, D3 & D4.  Y requires D1, D2, D5 & D9.  And lastly, Z requires D1, D3, D6, D7 & D8.  By combining the needs of all recipients, we create an outgoing interface with data elements D1 through to D9.  We now have 1 interface instead of 3 - but there's more to do.

We need to ensure that the recipient applications only receive what they are expecting.  There are 2 approaches to this: change each recipient application to extract only the data elements it requires from the new, wider interface, or create an interface "hub" to do this work.
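
Here is a toy sketch of the "hub" approach using the D1..D9 example above.  The comma-separated positional fields are purely illustrative - a real feed would use named fields or a tagged format such as XML:

```shell
# The source emits one wide record carrying D1..D9; the hub then cuts
# each recipient's subset back out of it.
wide="D1,D2,D3,D4,D5,D6,D7,D8,D9"
echo "$wide" | cut -d',' -f1-4        # X receives D1,D2,D3,D4
echo "$wide" | cut -d',' -f1,2,5,9    # Y receives D1,D2,D5,D9
echo "$wide" | cut -d',' -f1,3,6,7,8  # Z receives D1,D3,D6,D7,D8
```

The nice property is that the hub is the only place that knows each recipient's subset, so adding a new recipient means one new mapping rather than one whole new interface.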

In Conclusion
The more complexity there is, the more work you need to do to regain control over your interfaces.  However, the benefits are worth it: reduced development costs when things change, reduced testing costs, faster delivery of existing information feeds to new recipients, etc.  There are other factors to consider, such as standardising interface formats, using XML, etc. - but I'll save that for another day!

Cheers, Pete

Wednesday, September 16, 2009

First post and welcome

Pete here.  I started out in IT when it was called "data processing", back in the late 1970s.  I began my IT career working for a bank on some pretty old iron.

The intention of this blog is to provide my views on many IT-related themes.  I'll try to shed some light on common problems and misconceptions.  I'll also inject some personal anecdotes of IT "incidents" that I've been involved with over the years.

Starting off, my aim is to blog on a weekly basis.

So - welcome to "IT for the common man"!