McLean IT Consulting

WORRY FREE IT SUPPORT

Call Us: 250-412-5050

Backups – Are They Working The Way You Think They Are?

April 24, 2011 By Andrew McLean

There is a term the IT industry uses to define a hard drive crash: disaster. Backups are the primary tool for disaster recovery.

I try to address backups at every site I visit, mainly because in my experience, in the vast majority of cases, backups are not configured (or not behaving) the way they are expected to. This can be due to a hardware failure or a misunderstanding about how the backups were configured from the outset. On the flip side, backups may be configured in such a way that it is not immediately obvious how to restore files from them if a disaster ever did occur.

I’ve seen situations where an external drive is attached to facilitate backups, and it may have worked for some time, but after a few years the drive no longer functions – yet the backup software gives no warning indicating the failure. The user continues on with their life, blissfully unaware that their backups no longer back up.

Still others will install an external drive but mistakenly configure backups to store on the internal drive, which obviously defeats the purpose of the activity.

Simple, intuitive backup software is widely available – often free, even for enterprise use. Hard drives are cheaper and have higher capacity than ever before.

For homes or businesses, options such as offsite storage can address issues that local backups cannot — issues like fires, floods or earthquakes. For the price of a couple of cups of coffee (or one expensive one) each month, you can have the peace of mind that comes with data redundancy.

Take a good look at your backup designs and verify that they do what you think they do. Don’t find yourself on the wrong side of a disaster.
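One minimal, generic way to perform that verification is a script that alerts you when the newest file in a backup folder is older than expected. This is only a sketch, not tied to any particular backup product; the folder name and age threshold are assumptions you would adapt to your own setup.

```python
import time
from pathlib import Path

# Hypothetical folder and threshold; adapt both to your own setup.
BACKUP_DIR = Path("backups")
MAX_AGE_HOURS = 26  # a little slack around a daily backup job

def newest_backup_age_hours(directory: Path) -> float:
    """Age in hours of the most recently modified file in the folder."""
    files = [p for p in directory.iterdir() if p.is_file()]
    if not files:
        return float("inf")  # no backups at all counts as stale
    newest = max(p.stat().st_mtime for p in files)
    return (time.time() - newest) / 3600

def backups_look_healthy(directory: Path) -> bool:
    """True if the newest backup is recent enough."""
    return newest_backup_age_hours(directory) <= MAX_AGE_HOURS
```

Run something like this on a schedule and have it email you when the check fails; the point is that a backup that nobody watches is a backup you cannot trust.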

For those who want managed, or at the very least monitored, offsite backup solutions, McLean IT Consulting offers affordable storage packages for homes and businesses. We are notified of any failed backup jobs, enabling us to respond to issues in a timely manner and keep your data safe.

Filed Under: Technology

iPhone Security Issue? Maybe…

April 23, 2011 By Andrew McLean

If you’ve been watching the news recently, you may have heard that the iPhone stores logs, on both the phone and the computer it syncs with, documenting locations where your phone has been. It is being called a security issue because this information is not encrypted and is easily accessible, and because no one was made aware that it was being collected.

Frankly, it strikes me as sensationalism. Cell phone carriers frequently log this sort of information to improve coverage and analyze usage patterns. The fact that forensic investigators can pull this information from my device means nothing to me as a law-abiding citizen, and the geographic coordinates contained therein are typically so inaccurate that I wonder what one could even do with them. I might even go so far as to call the iPhone’s location tracking a feature, if the logging accuracy were improved; lots of companies pay good money for GPS tracking devices on fleet vehicles, and it would be all the better if that information could be extracted from existing company phones.

In the strictly IT sense, yes, Apple should have disclosed what information is being logged on the iPhone, especially in a day when privacy is such a sensitive subject. But personally, I think people are getting a bit carried away over it.

Filed Under: Technology

Virtualization: What Is It? Why Use It?

March 8, 2011 By Andrew McLean

What Is Virtualization?

In the IT sense of the word, anything “virtual” is a representation of something that does not physically exist. Examples include Virtual Private Networks (VPNs), virtual memory, virtual reality, virtual game worlds such as that of World of Warcraft, and of course, as it pertains to this article, Virtual Machines (VMs).

Virtualization is the act of separating operating systems and software from the physical hardware they run on by placing an abstraction layer between them. Put simply, it turns the “computer” into an ethereal state that isn’t tied to the physical confines of the hardware.

Why Virtualize?

Take for example the images below.

Before Virtualization

After Virtualization

On the left we see the model traditionally applied to general computing: applications run on the operating system, which runs on the hardware. Efficiency is quite low; in most cases the processor is idle 80-90 percent of the time. Backups must be made to an external disk or network location from within the context of the computer being backed up, subjecting them to potential configuration errors and faults. Adding hardware may require that the system be powered down and physically maintained on-site. If usage requirements change, system resources cannot easily be altered or redistributed.

On the right, VMware provides the virtualization layer, separating multiple hosted operating systems and applications from the hardware and acting as a liaison between them. Available processor cycles are distributed between multiple operating systems running in tandem on the same hardware, thereby improving efficiency. Each VM can be paused, backed up and restarted independently of the others. Memory, network devices and even storage can be added or redistributed (even remotely) without interrupting the operations of the Virtual Machines. If the host hardware fails, the VM images can be migrated to a new physical host without regard to drivers or capacity. Virtual machine images can also be kept on standby in any state, allowing new, fully installed and configured server images to be activated within seconds as needed.
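The consolidation arithmetic behind that efficiency gain can be sketched in a few lines. The utilization figures below are illustrative assumptions, not measurements from any real data center.

```python
import math

# Back-of-the-envelope consolidation estimate; the figures used
# below are illustrative assumptions, not measurements.
def hosts_needed(n_servers: int, avg_utilization: float,
                 target_utilization: float = 0.7) -> int:
    """Physical hosts needed to absorb n_servers, each loaded at
    avg_utilization, without pushing any host past target_utilization."""
    total_load = n_servers * avg_utilization
    return max(1, math.ceil(total_load / target_utilization))

# Ten dedicated servers, each busy only 15% of the time,
# consolidate onto a handful of virtualization hosts:
print(hosts_needed(10, 0.15))  # → 3
```

Real sizing must also account for memory, storage and peak (not average) load, but the basic point stands: mostly-idle machines waste most of their hardware.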

Not only can the hardware host multiple Virtual Machines, it can host many different operating systems. Some businesses require different operating systems for separate departments and software needs: one company might need both a Windows Server 2008 domain controller and a Red Hat Linux web server. Once upon a time, that company would have to bite the bullet and invest in two separate hardware servers. Now all of this is possible on a single server.

A few years ago I held a position as a network administrator in charge of maintaining approximately thirty servers (most of them virtualized) at any given time, and virtualization afforded me flexibility and fault tolerance that were otherwise not possible. When testing a patch or other general system change, it was easy to click a button to save the system state, then make the change. If anything other than the desired result happened, I clicked another button and was back to the way things were before in a matter of seconds. The value of this “undo button” cannot be overstated.

Things To Consider

The IT world sometimes changes so fast that it takes a while to catch up with itself. Virtualization has been around for years, but traditional software licensing struggled to fit it into accepted definitions of a “computer”. Microsoft originally defined a license as permission to run on a single CPU. When multi-CPU computers came about, the wording had to be revised. Then, when virtualization came along, Microsoft at first required that a new license be purchased for every “instance” of its operating system. Eventually Microsoft rethought the entire licensing structure and loosened restrictions enough to allow unlimited virtual machines, provided the host operating system was the Datacenter Edition of Windows Server. What’s more, Microsoft went a step further and released its own virtualization software, Hyper-V, named for the hypervisor: the abstraction layer on which virtualization technology is built.

Microsoft’s Hyper-V is available as part of Windows Server 2008, and VMware Server 2 is available as a free download, suitable for relatively simple infrastructure. More robust options are available for additional licensing fees.

Filed Under: Technology

Cloud Computing: What Is It? Should You Be Using It?

March 6, 2011 By Andrew McLean

Unless you’ve been living under a rock for the last couple of years, you’ll have heard of “Cloud Computing”, but recent studies have suggested as many as 40 percent of IT professionals are bewildered by the concept of Cloud Computing and its varied definitions. Advertisements say you want it. Big business agrees. But what is it and how can it help your business? Does it live up to the hype?

What Is Cloud Computing?

The simple truth is that we’ve been using “Cloud Computing”, or “software in the cloud”, for years. In years past, a business that wanted to use a web application had to either purchase a server and deploy the application itself, or rent hosting space from a web host and deploy the application there. Services such as Hotmail, Google Docs, Office Live, MobileMe and Flickr are examples of cloud software everyone has been using for years. What defines them is that you are not required to “own” any IT hardware or software to use them, yet the information held therein is entirely your own. To host your own email services as a business, you must have the IT infrastructure and the know-how to use it. The same is true of services like Flickr, which doubles as both an image-centric social network and a “cloud storage” service, effectively backing up all of your images safely off-site. These technologies are a subset of Cloud Computing called SaaS, or Software as a Service.

I have my own company email hosted by Google Apps, which gives me the ability to access my own email on my own domain (@mcleanit.ca, not @gmail.com) from any Gmail interface. This affords me up to 7.5 GB of inbox storage space at the time of this writing, and frees up my precious web server storage to do what it does best: serve and store web content.

Why Use The Cloud?

One analogy I’ve heard that fits well: “if you only need milk, why buy a cow?” The advantage of Cloud Computing and SaaS is that instead of purchasing, licensing and maintaining on-site servers and software, companies can cut serious expenses in those areas; applications are run and shared from a centralized location over the internet. Corporate data centers are dramatically underutilized, with servers typically idle 85% of the time; this headroom exists so the system can handle unexpected surges of traffic at peak times (such as the holiday season). Cloud computing allows a business to pay only for what it uses. Leveraged correctly, this can almost entirely offload the overhead costs involved in providing IT services to employees or clients. All that remains is the cost of the service.
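To make “pay only for what you use” concrete, here is a toy comparison. Every figure below (hardware price, lifespan, admin cost, hourly rate) is a hypothetical placeholder, not a real vendor quote.

```python
# Toy cost comparison; every figure is a hypothetical placeholder,
# not a real vendor rate or hardware quote.
def onsite_monthly_cost(server_cost: float, lifespan_months: int,
                        admin_cost: float) -> float:
    """Amortized hardware plus fixed administration, per month."""
    return server_cost / lifespan_months + admin_cost

def cloud_monthly_cost(hours_used: float, rate_per_hour: float) -> float:
    """Pay only for the hours actually consumed."""
    return hours_used * rate_per_hour

onsite = onsite_monthly_cost(server_cost=4000, lifespan_months=36, admin_cost=200)
# A workload busy 15% of a 720-hour month, billed hourly:
cloud = cloud_monthly_cost(hours_used=720 * 0.15, rate_per_hour=0.50)
print(round(onsite, 2), round(cloud, 2))
```

The gap narrows as utilization rises; a workload that runs flat out around the clock can easily be cheaper to keep in-house, which is exactly why the decision depends on your usage pattern.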

In the example of Google Docs, this SaaS can theoretically replace desktop applications such as Microsoft Office or OpenOffice, with the added benefit of being accessible anywhere with an internet connection, as opposed to something stored locally on a work computer, which generally isn’t accessible remotely (unless made so by an online storage SaaS such as Dropbox). But Cloud Computing and SaaS don’t stop at desktop applications and remote storage. Services exist that encompass HR, billing and invoicing, CRM (Customer Relationship Management), employee training and web conferencing.

The other side of Cloud Computing is Utility Computing. Companies like Amazon, Sun Microsystems and IBM offer storage and virtualized servers that IT departments can access on demand. These virtualized network devices are accessible via the internet “cloud” and integrate seamlessly into your business network via VPN (Virtual Private Network). Other commoditized services include centralized firewalls and centrally managed antivirus systems. With all the buzz going on, more services and utilities are going “to the cloud” all the time.

visual representation of cloud systems

Why Might You Avoid The Cloud?

The marketing pitch, and the fervor with which it is delivered, would suggest these are simple turn-key business solutions, but they are certainly not appropriate in every case. The hype machine says “lower your TCO and rollout time using Cloud Computing”. But the truth is, even as outsourced, virtualized utility IT infrastructure, the complexities of the technology don’t disappear simply because it is hosted “in the cloud”. A service as simple as email, hosted and maintained by a company as big as Google, can suffer disastrous mistakes. Less than two weeks ago, Google (by way of a bug in a software update) accidentally lost or deleted 150,000 email accounts (affecting less than 1% of Gmail users). It should be noted that Google had backups and restored the accounts and emails to all users, but it took several days to do so. More advanced services have even more potential for similar issues.

Of course, hosting this infrastructure internally has the same potential for issues, but there is a large perceived difference between being internally responsible for mistakes and being let down by an outside service provider. Businesses are inherently more forgiving of themselves than of outside service providers.

Furthermore, many SaaS products still require an “Administrator”, often one holding certifications, to maintain the service. When new features are released, administrators are forced to re-certify with an exam, and this can happen three or four times a year! In fact, the push for Cloud Computing and SaaS is increasing the sophistication of systems, because of all the underlying technologies required to integrate them seamlessly with existing infrastructure.

Is Cloud Computing Right For You?

All businesses are different, and so are their needs. A local service business may not benefit from cloud-based firewalls or virtualized server clusters the way a large enterprise might, but a hosted email solution or an online invoicing system may be a simple and viable option. As always, business requirements will dictate whether Cloud Computing is right for you.

Image courtesy of centralasian

Filed Under: Technology

Powerline Networking: Communicating Via Electrical Wiring

March 5, 2011 By Andrew McLean

powerline diagram

I recently had the opportunity to work with a relatively rare and very new networking standard.

A residential client had a broadband internet connection that entered the house in one room, but they wanted network access in a room at the far end of the house. The catch: the client was opposed to the use of wireless devices, including cordless phones and microwaves, and running ethernet cable across the house was less than ideal. This left me with exactly one option: powerline communications.

Powerline Networking comes from a standard of the IEEE (Institute of Electrical and Electronics Engineers), known as IEEE 1901. For the sake of reference, FireWire is IEEE 1394, and Power over Ethernet (PoE) is IEEE 802.3af. The Powerline Network standard was published in February of 2011.

Unlike PoE, which abolishes the need for separate electrical adapters on networking devices by carrying both power and communications over the same ethernet cable, IEEE 1901 does the exact opposite: it eliminates the need to run ethernet cable at all by using the structure’s electrical wiring as the communications medium. This means no new wires are required, nor is Wi-Fi (which in consumer-grade routers is often unstable anyway).

How does it work? You plug a Powerline ethernet adapter directly into an electrical outlet and connect it to your router with an ethernet cable. Then you plug a second adapter into an outlet in the room you want to connect, and link your computer to it with another ethernet cable. The adapters operate by introducing a modulated carrier signal onto the electrical wiring: a pulse of “interference” that IEEE 1901 devices interpret as a communications signal.

Early devices were quite limited in speed and reliability, but technological advances led to the HomePlug AV specification of the IEEE 1901 standard. HomePlug AV allows for up to 200 Mbps of potential bandwidth, double the speed of the more common 10/100 ethernet connection. Many adapters can be added to the same electrical system, and any ethernet-capable device can use them without modification.
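As a quick sanity check on what that rated speed means in practice, here is a short calculation. The file size and link speeds are nominal figures; real-world powerline throughput is typically well below the rated maximum.

```python
# Nominal transfer-time comparison; real-world powerline throughput
# is typically well below the rated maximum.
def transfer_seconds(size_mb: float, link_mbps: float) -> float:
    """Seconds to move size_mb megabytes over a link_mbps link."""
    return size_mb * 8 / link_mbps  # 8 bits per byte

# A 700 MB file over 10/100 Fast Ethernet vs HomePlug AV's rated 200 Mbps:
print(round(transfer_seconds(700, 100)), round(transfer_seconds(700, 200)))  # → 56 28
```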

Filed Under: Technology


Contact Us

McLean IT Consulting Inc.
Serving Greater Victoria

P: 250-412-5050
E: info@mcleanit.ca
C: 250-514-2639


Our Mission

We seek to enrich and improve small and medium businesses by delivering best-in-class technology solutions, and offering a premier customer service experience. Contact Us Now!


Copyright © 2025