Our Blog

Recent blog posts

Posted by on in IT Tips

Easy and convenient but is it safe?

Public Wi-Fi is offered at many facilities, including airports, coffee shops, hotels, shopping centres and even the library.

Whether you're using a laptop, a tablet or a smartphone to browse the web on public Wi-Fi, your information can be at risk. Cybercriminals often target these wireless networks. They don't even have to be within the premises offering the free internet service; they only need to be within reasonable range of the wireless router itself. These criminals can use something called a sniffer: software that can intercept and gather all visible traffic on a wireless channel.

When a device connects to a wireless hotspot, a process called a four-way handshake is negotiated with the connecting device. WPA2 is the currently recommended security standard. It uses a pre-shared key (PSK), in the form of a text passphrase, to authenticate users and encrypt data.

A determined attacker can sniff the four-way handshake and recover the PSK by brute-forcing it offline. With the PSK and the captured handshake, that person can decrypt all the traffic destined for your device.

A recent survey from Norton found that around 60% of Australians feel safe online, and a massive 83% of respondents claimed to have used public Wi-Fi to log into their email accounts, share photos and videos, and even check their bank balances.

What would a cybercriminal do with your information? They could pretend to be you, which is identity theft. They could on-sell your personal information, including usernames and passwords, to other criminals. They could use your banking information to transfer money out to an external account, or pay for goods and services while pretending to be you.

What can be done to use public Wi-Fi safely?

  • limit your internet access to non-sensitive browsing (i.e. news and other sites that do not require you to enter a username and password)
  • use a VPN, which creates an encrypted tunnel for all data travelling from your device to the hotspot
  • ensure you have installed anti-malware software, ideally with real-time protection
  • verify that the hotspot you are connecting to is in fact provided by the establishment you intend to connect with
  • never use public Wi-Fi to download and install software on your device (that process could download malware instead of the legitimate software)

Better still, don't use public Wi-Fi at all! For the ultimate wireless security, bring your own portable 3G/4G internet device with you and use that instead of public Wi-Fi. Or use your mobile phone as a hotspot to access the internet.

Posted by on in IT Tips

Yes they do.

How frequently?

That depends on how the computer is used and how often. Computers that download a lot of files or install a lot of software, for example, need servicing more often.

Generally, between two and six times a year is recommended; the average system should be serviced at least four times a year.

What is serviced?

  • disk defragmentation
  • removing temp files
  • emptying the recycle bin
  • uninstalling unneeded software
  • tuning Windows start-up
  • installing operating system patches
  • updating device drivers
  • registry tuning
  • checking Windows virtual memory settings
  • malware update/scan
  • internet browser optimisation
  • fixing minor errors
  • updating utilities and programs such as graphics drivers, Flash, Silverlight, etc.
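If you'd rather script part of this than click, the "removing temp files" task above can be sketched in a few lines of Python. This is a minimal illustration only; the function name and the seven-day threshold are my own choices, not a standard.

```python
import os
import time

def clean_temp(folder, max_age_days=7):
    """Delete files in `folder` not modified in the last `max_age_days` days.
    Returns the number of bytes freed."""
    cutoff = time.time() - max_age_days * 86400
    freed = 0
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        # only remove plain files older than the cutoff; leave folders alone
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            freed += os.path.getsize(path)
            os.remove(path)
    return freed
```

On Windows you might point it at the user temp folder, e.g. `clean_temp(os.environ.get("TEMP", "/tmp"))`.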

What are the benefits?

  • identify any issues before they cause significant problems
  • speed up overall responsiveness
  • remove malware/spyware/adware
  • optimise system performance
  • free up disk space
  • allow workers to focus on getting work done rather than be hampered by system issues
  • improve overall productivity
  • improve response times (e.g. inputting data, using web applications, generating reports, answering customer queries)
  • encourage workers to perform at peak levels

For desktop computers it is recommended that the system be internally cleaned and dusted at least once a year.

What if it's not done regularly enough?

  • system performance slows
  • worker productivity drops
  • worker morale drops
  • response times get slower
  • errors start to develop
  • the threat of malware/spyware/ransomware infection grows
  • overheating from dust build-up may cause component failure

How long does it take to service?

Generally, each computer can be expected to take between 90 and 120 minutes, not including dusting (longer if there are significant issues, large software updates, or the system has not been serviced for a long time). Dusting may take around 40 minutes, as the system board and peripherals are wiped down before the full system is blown out with a compressor.

Posted by on in IT Tips

The information, technology and telecommunications industry is constantly changing. New products, software, methodologies and techniques are being developed all the time.

Our consulting services look at your business, taking into account your business objectives, your current I.T. setup, and your staff and their competencies. We take a systems view of your business: we analyse the individual parts that make up your I.T. setup. We then conduct an audit to determine which areas (if any) could be improved. We do this by looking at your existing hardware and software, procedures, methodologies and reports, and by talking to people. We then compile a detailed report which outlines the areas that can be improved.

To gain the best from technology, one needs much more than just powerful, capable equipment. The right methodologies and techniques, staff competencies and maintenance are also key factors. What works well for one organisation may or may not work best for your setup. This is where a customised solution may give you a competitive advantage.

Information is the lifeblood of many businesses, so it pays dividends to keep your systems, processes and competencies up to date. But how would you know that you've got the best combination in place?

Our consulting service advises you on what equipment, software, processes and competencies to implement, and the steps to take to get from where you are now to where you want to be.


Posted by on in IT Tips

This is a procedure that may need to be done whenever we upgrade a computer system. Invariably we have a cache of useful files on the old computer that needs to go onto the new computer. This blog focuses on transferring user data only. I discuss only high-level steps, just to give an overview of what's required; there are no step-by-step instructions here.

For programs (such as Microsoft Office and others) we would typically just download and install them again, or use the program vendor's DVD. Since there have likely been improvements to the software since it was last installed, we may also need to upgrade the program, or stay with the old version (not recommended in most cases).

It is not possible to simply copy and paste programs across the way we do with data. It simply won't work+.

The main practical options for transferring data from one computer to another are:

  • USB memory stick
  • CD/DVD
  • portable hard drive
  • direct/indirect network connection
  • cloud
  • hard drive caddy
  • network server
  • wireless connection

USB memory stick - an obvious choice. Capacity is limited to around 128GB* per stick; however, a stick can be used multiple times to transfer more data than its capacity.

A CD holds around 650MB of data whilst a DVD holds around 4GB. Provided both the old and new systems have these drives, this method can be used. The old system needs at least a CD burner, and ideally a DVD burner; otherwise, connecting a portable USB CD/DVD burner drive is another option. If the data to transfer is significant (say, larger than 30GB) this method may not be practical due to the large number of discs required, although it is still possible.

A portable hard drive is typically a USB-connected external device. Sizes range from a few hundred GB to several TB in capacity, which is more than enough in most cases! It is an easy and obvious method for bringing old system files across, especially when the data size is large (more than a few hundred GB).

We can use this method by connecting the drive to the old system. Once connected, Windows should see the drive automatically; then, in Windows Explorer, copy across the files we wish to transfer. Finally, connect the portable drive to the new system and bring across the required files.
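The same copy step can also be scripted rather than done by hand in Windows Explorer. Here is a minimal Python sketch; the function name is my own, and the drive letters in the example are hypothetical.

```python
import shutil
from pathlib import Path

def copy_user_data(source, destination):
    """Copy a folder tree (e.g. the old system's Documents folder) to the
    portable drive, preserving file timestamps."""
    src, dst = Path(source), Path(destination)
    # dirs_exist_ok (Python 3.8+) lets the copy be re-run to pick up missed files
    shutil.copytree(src, dst, dirs_exist_ok=True)

# Example only (drive letters will differ on your system):
# copy_user_data(r"C:\Users\Alice\Documents", r"E:\Transfer\Documents")
```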

A direct connection uses a crossover cable to connect the two computers to each other, whilst an indirect connection uses normal network cable (UTP) connected to a switch. In the latter case, both old and new systems use UTP cable to connect to the switch.

In both network connections, enable peer-to-peer networking by setting the same workgroup name on both computers. The default name is WORKGROUP; you can use this name or another if desired, just ensure both computers use the same workgroup name.

The computers should then be able to see each other in the Network. It is also necessary to enable folder sharing on the old PC (i.e. share the folders containing the files we wish to transfer), otherwise the new system cannot see or access the required files.

Use drag and drop, or Edit | Copy | Paste, in Windows Explorer to transfer the files.

Cloud - on the old computer, transfer data to the cloud using Windows Explorer and Cut | Paste. The cloud service may be one of the providers such as Dropbox, OneDrive, Google Drive, etc. Then access the same cloud service on the new computer and transfer the files down to it.

If the data is particularly large (> 100GB) it can take a while (several hours or more, depending on the amount of data, the computer and the internet speed) unless you are using fast internet (such as the NBN).
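As a rough guide, the transfer time can be estimated from the data size and the link speed. The figures below are illustrative assumptions, and the calculation ignores protocol overhead.

```python
def transfer_hours(size_gb, speed_mbps):
    """Estimate hours to move `size_gb` gigabytes over a link of
    `speed_mbps` megabits per second (overhead ignored)."""
    size_megabits = size_gb * 1000 * 8  # 1 GB ~= 8000 megabits (decimal units)
    return size_megabits / speed_mbps / 3600

# 100GB over a 10 Mbps ADSL-class link vs a 100 Mbps NBN-class link:
print(round(transfer_hours(100, 10), 1))   # 22.2 hours
print(round(transfer_hours(100, 100), 1))  # 2.2 hours
```

This is why the cloud option only really becomes practical for large transfers on a fast connection.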

Another option is to connect the old computer's hard drive directly to the new computer's system board, or make use of a hard drive caddy. The new computer "sees" both drives, so it's just a matter of using Windows Explorer to transfer files. Transfer speeds of a direct hard drive connection are significantly faster than network or internet-based options.

The network server option uses either a network server computer or a NAS (Network Attached Storage) device. Using these devices, the old computer would authenticate^ to the server computer; then it's just a matter of transferring the files over to a network drive on the server. If using a NAS, the old computer typically accesses the NAS's storage through a web interface to transfer the files over.

Then, on the new computer, log into the network server to retrieve the copied data from a network drive, or access the web interface to retrieve the files from the NAS.

The last option would typically apply to transferring data from an old laptop computer with wireless capability. In this case the old laptop should authenticate to a wireless ADSL modem, and likewise the new laptop. Both computers should then be able to "see" each other in the Network. Apart from this, the remaining steps are the same as the direct/indirect connection workgroup strategy discussed earlier; the only difference is that the connection is wireless rather than over UTP.

In this blog I've discussed eight separate ways of transferring data from an old computer to a new one. Most of the options are fairly straightforward to do yourself. The only ones that require some know-how or experience are the direct/indirect connection and the network server options.


+ Since Windows 95 it has been necessary to use the vendor's program installer, rather than simply copying program files across as was done with older Windows and DOS-based systems.

* A useful fact to know: if the stick uses the FAT32 file system (often the default format), Windows limits the volume to a maximum size of 32GB, and individual files to 4GB. Any larger means you would need to format the volume using the NTFS (or exFAT) file system.

^ Log in to the server if it's part of Active Directory (AD); otherwise, if the server is not part of AD, use the indirect connection/workgroup technique discussed earlier.

Posted by on in IT Tips

Previously I talked about the importance of, and necessity for, backing up corporate data.

Now I'll define several of the key terms used to describe voltage fluctuations in AC power.

A brown-out occurs when the supply voltage sags. This can last from a minute to several minutes, or even hours. A temporary dimming of the lights in a room can be an indicator of a brown-out.

[Graph: supply voltage (V) against time (minutes), showing the 240V mains sagging for a period during a brown-out before recovering.]

A blackout occurs when there is no mains power supplied at all. Nearly everything that uses mains power goes off in a blackout.

A spike is the opposite of a brown-out: the supply voltage increases beyond what an appliance is designed to handle. A spike usually lasts only one or two nanoseconds; if it lasts longer than this, it is called a power surge. A spike or surge may damage an appliance (such as a computer or router), because the excess voltage introduces heat. Components such as ICs, transistors, capacitors and wires can be burnt out.

[Graph: AC voltage against time (nanoseconds), as viewed through an oscilloscope, showing two spikes rising above the 240V supply.]

Spikes and surges are amongst the most damaging types of power fluctuation for sensitive electronic devices, including computers, network devices and servers.

A spike or surge may render an appliance like your computer useless. The cost may be more than just replacing the dead hardware: think about the time, effort and money required to replace the failed system and get it back to where it was before the power surge.

What's each hour of downtime worth to your business? I believe that the cost of acquiring and using UPSs for your business would be far less than the cost of downtime caused by power fluctuations.

The simplest form of protection from surges and spikes is a surge protector: a device which plugs into the mains outlet and which your appliance, such as a computer, plugs into. A surge protector protects against over-voltage surges and spikes in mains power; however, it does not protect against brown-outs or blackouts.

Power fluctuations are a leading cause of power issues on desktop systems, and can lead to malfunctions, crashes and loss of data.

A UPS, or Uninterruptible Power Supply, can provide backup mains power in the case of a blackout or mains outage. A UPS is rated by its VA (volt-ampere) or power rating: the higher the VA, the longer the backup time available at a given load. A UPS also provides a regulated output voltage independent of the input voltage.

Most UPSs provide only minutes of AC power backup. This is normally sufficient time for a system to back up, close running applications and safely power down in the event of an extended blackout, although this procedure is usually reserved for server computers rather than desktops.

There are different types of UPSs: some are used in industrial applications, others in the medical industry, some in military installations, and lastly in computing and communications. The ones described in this blog are for computing and communications applications.


A UPS uses a sealed lead-acid battery, similar in technology to a car battery.

- for a single desktop computer, the VA rating should be a minimum of 600VA or more

- for a server computer, the VA rating should be a minimum of 1000VA or more

- to protect multiple computers/devices from one UPS, additional VA capacity is needed


Laptop computers do not need UPS protection, as they already have a built-in battery for backup, but they may benefit from the use of a surge protector.
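A rough sizing check against the VA guidance above can be sketched as follows. The power factor and headroom figures are my own assumptions, not vendor data; always check the manufacturer's sizing guide.

```python
def min_va_needed(load_watts, power_factor=0.6, headroom=1.25):
    """Very rough minimum VA rating for a given load in watts.
    power_factor converts watts back to VA; headroom leaves ~25% spare
    capacity. Both figures are assumptions, not vendor specifications."""
    return load_watts / power_factor * headroom

# A desktop PC plus monitor drawing ~250W:
print(round(min_va_needed(250)))  # 521, so a 600VA unit fits the guidance above
```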

UPS products vary in platform (rack-mounted or free-standing), number of outlets (1 to 8 or more), type of outlets (IEC or standard mains plug), type of output (true sine wave or simulated sine wave), VA rating (the more the better), surge protection, battery backup, and computer interface and software.

Many UPSs provide a software interface and use a USB cable to initiate a safe shutdown of the computer in the case of an extended blackout.

The major brands which I've encountered and recommend are APC, Eaton and Powershield. Powershield is a locally designed Australian product; APC and Eaton are overseas-based brands that have been in the market for many years and have strong reputations.

Prices have steadily dropped over the years. My first UPS was a Sola (now Eaton) 600VA unit costing $800 back in 1995. Today one can buy a similar unit for between $100 and $200. Surge protectors are much cheaper, costing in the tens of dollars depending on brand, quality and configuration (one or multiple power outlets).

In a later blog I'll talk about how protecting network devices can benefit your setup.

Posted by on in IT Tips

In this blog I'll talk about how you can back up your data. Your data may be your documents, emails, spreadsheets, photos, accounts data, project files, etc.

Essentially, any files that are unique to you and/or your organisation. The simplest way to back up is to use Windows Explorer to drag and drop files onto a USB memory stick.

This is all well and good, but what if backups need to run regularly or on an automatic schedule? Broadly speaking there are two main options here: you can use commercial backup software, or you can write your own code to do the backup. For most users it will be the former. IT pros and computer nerds can write their own batch files, Windows Management Instrumentation (WMI) scripts or PowerShell scripts to do this.

An advantage of using backup software is that it can take a snapshot of the system areas of the computer (typically the boot record, registry and Windows system files) and also run backups to a schedule.

Other ways to back up data include writing your files to a CD-RW/Blu-ray disc, copying files to the cloud, copying files to an external USB hard drive, making copies of files on another computer*, and backing up to a NAS+.
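For the do-it-yourself route mentioned above, a minimal scripted backup might look something like this (shown in Python rather than a batch or PowerShell script; the function name and example paths are mine, not a standard tool):

```python
import shutil
import time
from pathlib import Path

def backup(source_dir, backup_dir):
    """Zip `source_dir` into `backup_dir` with a timestamped name and
    return the path of the archive created."""
    dest = Path(backup_dir)
    dest.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = dest / f"backup-{stamp}"
    # make_archive appends ".zip" itself and returns the full path as a string
    return shutil.make_archive(str(archive), "zip", root_dir=source_dir)

# Example only (hypothetical paths); schedule it with Task Scheduler or cron:
# backup(r"C:\Users\Alice\Documents", r"E:\Backups")
```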

If the backup is for a server computer, the typical options are tape drive technologies (such as DLT, DAT and LTO), a USB external hard drive, or an RDX drive.

For tape drives, the tapes are typically inexpensive (tens of dollars upwards), but the drive itself can cost more than a powerful desktop PC (around $2000 and over). A USB external hard drive, by contrast, will only cost in the hundreds, depending on brand, capacity and the number of drives used. Although tape drives appear to cost more overall, the cost may be justified by the durability of tapes over portable hard drives. Portable hard drives can be expected to last from a few years up to nine or more, depending on how they are used, stored and handled, whereas tapes can last two or more decades.

RDX drives are based on hard drive technology, so they don't use tapes. Although the drive is relatively cheap (< $200), the data cartridges can be quite expensive (> $350 each). Cloud is an obvious backup choice; however, it is only feasible provided the data does not exceed a few gigabytes at most. Once we're looking at tens of gigabytes, cloud can become impractical due to internet bandwidth and current access speeds.

Backup software for a server computer is usually a necessity, as it is required just to work with the tape drive. Older versions of Windows Server (before 2008) included backup software as part of the operating system install, but the newer versions do not.

In the case of portable hard drives, it's still a useful feature to be able to set a schedule and let the backup run automatically. It should also have a logging facility, which can be used to check how the backup went and whether any errors or issues were encountered. There are pros and cons to each backup solution besides the cost. Considerations other than cost include:

- media capacity
- media longevity
- media reliability
- backup to schedule
- ease of restoring data
- data access speed
- data security
- logging and reporting
- capability for incremental, differential backups

* such as another workstation or a server computer on a network
+ Network Attached Storage - a device used to host one or more hard drives that is connected to the network and can usually be accessed with a web browser, locally or via the internet
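The incremental backup capability in the considerations list above can be sketched as: copy only files changed since the previous run. This is a simplified illustration under my own naming, not a full backup tool.

```python
import shutil
from pathlib import Path

def incremental_copy(source, dest, since):
    """Copy files under `source` modified after timestamp `since` into
    `dest`, preserving the relative folder layout.
    Returns the number of files copied."""
    copied = 0
    src = Path(source)
    for path in src.rglob("*"):
        if path.is_file() and path.stat().st_mtime > since:
            target = Path(dest) / path.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, target)  # copy2 preserves timestamps
            copied += 1
    return copied
```

A real tool would record the time of each run so the next run knows what `since` should be, and a differential backup would instead always compare against the last full backup.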

Posted by on in IT Tips

In this blog I discuss an often-overlooked aspect of maintaining a computer: backing up files. Why is there a need to back up files? There are several reasons why this should be done; here are some key ones.


- hardware failure

- power outage/power surge

- user error

- software error

- loss of equipment

- malware attack
Hardware failure - this can be due to a subsystem failure such as a system board error, memory fault or hard drive failure, amongst other things. Any failure of a computer subsystem can render the computer unusable until it is fixed. It can also mean data loss if the data is not already backed up. For example, a failure of the graphics subsystem renders the system unusable, but your data should still be intact; a hard drive failure, however, could render your data inaccessible.

Power outage / power surge - mains voltage is subject to fluctuations, which can have an adverse effect on your computer if it is not properly protected. A power surge could cause a failure in the power supply and/or system board. A power outage, as the name suggests, could mean data corruption or data loss if the data is not already backed up.

User error - all too often, data is accidentally erased. Provided you haven't already emptied the recycle bin, your data might still be retrievable. For ultimate protection, however, backing up your files ensures you can still recover from a "user error".

Software error - many commercial programs have thousands of lines of code. Sometimes the software can "crash", causing you to lose data that was not already saved.

Loss of equipment - this can happen through your computer being misplaced or stolen. If you haven't already backed up your files, the data may be gone forever unless you can recreate it, and even then this entails time and effort without a backup at hand.

Malware attack - some types of viruses target data destruction, while others try to gather sensitive data. This is just another reason to back up your files.

This covers most of the likely scenarios of data loss. It should be clear that having a backup of your files at hand certainly beats having to re-create the data should the worst happen. For businesses it could mean the difference between staying in business and going bust.

In the next blog I discuss what needs to be backed up and how to back up your files.



Posted by on in IT Tips
How to Prolong the Life of Your Computer

Here are a few tips that can help to prolong the life of your computer.

- avoid using the computer when the ambient temperature is very warm to hot (> 28°C). Running a computer in a hot ambient temperature can shorten its service life, especially for laptop computers, where the electronics are closely packed inside.


- for laptop computers, ensure that you run on battery power down to nearly exhausted at least once a month. This ensures the battery is used regularly and can improve its condition over time. Conditioned batteries can last 300 to 500% longer than unconditioned ones.


- for desktop computers, ensure you are using a surge protector or Uninterruptible Power Supply (UPS) to avoid issues with surges, brown-outs and power spikes. Power fluctuations can cause problems for a system's power supply or, worse still, affect system board components. A surge protector can be used effectively with a laptop computer, but a UPS would not give any real benefit, since the laptop's own internal battery already acts as a UPS.


A surge protector prevents harmful power surges or spikes from reaching delicate electronics, where they can destroy components and/or shorten their life. A UPS, although more expensive, goes further and provides temporary AC power when the electricity goes out, even for a few seconds; a surge protector cannot prevent this, as it only protects against surges and spikes, not brown-outs or blackouts. Use of a UPS should be mandatory when you're running a server or workstation computer where multiple users' work is saved or significant project work is stored.


- keep the insides of the computer clean and free from dust. Dust build-up causes an accumulation of heat, which can affect system operation because electronic components are sensitive to excess heat: the system performs more slowly than normal, or becomes less reliable over time. Regular internal cleaning should be part of a desktop owner's maintenance schedule, at least every 12 to 18 months depending on prevailing conditions.


- service your computer periodically for optimal performance (2 or 3 times a year for most computers). This not only makes your system more responsive but can also increase the lifespan of your hard drive.

Posted by on in IT Tips
Network Sharing Options

File Sharing

Today I am going to discuss the options available if you need to share files on your work network. We share files for many reasons; here are a few: a group working on a project to collaborate and share ideas, a salesperson accessing inventory/pricing data, data input into an order-entry system for generating sales reports, engineers working on a blueprint design for an electronic device.

I'll talk about the pros and cons of each available method, as well as the costs.

I'll be making comparisons of the various options from the perspective of a Windows-based environment, although many of the pros and cons also apply in other personal computing environments, i.e. Apple, Android, Unix/Linux.


Network Attached Storage (NAS)

Network Attached Storage is a network device with built-in intelligence for hosting one to several hard drives. Access to data stored on the NAS is through a web-based application interface, i.e. a web browser. This is a cost-effective way to share files with users already connected to a network. It is possible to secure data using group and user permissions.


Pros:

- cost effective to purchase, i.e. hundreds rather than thousands of dollars

- can be configured and ready in a few hours rather than taking days

- easy to implement file and folder level permissions with the use of users and/or groups

- can be used to implement your own private cloud

- data can be accessed remotely from any device with a web interface

- allows use of RAID 0 and 1 to customise performance and redundancy

- cheap to run, i.e. low electricity consumption

Cons:

- does require set-up and configuration; the router/firewall may also need to be configured to enable access rules and port forwarding

- data can be lost if not backed up

- files can become out of sync, as version control is not automatic, i.e. one user's modification can overwrite another's

- higher-end NAS units can cost quite a bit more than a few hundred dollars

Peer to Peer Networking

In this model of networking, files are stored on workstation computers throughout the network; it is a decentralised way of accessing files. This method is only manageable for small networks of up to about five workstations.


Pros:

- may not require any further investment in hardware or software beyond what's needed for a desktop/laptop/tablet PC

- by its nature, files are already distributed across various computers, which creates redundancy, i.e. no single point of failure

- relatively easy to set up and configure, i.e. implemented using workgroups, which is the default setting

Cons:

- access can be slow, as workstations are not dedicated to file serving

- every user that needs access to files on a given workstation needs an account created on it, i.e. administrative overhead

- does not scale well to networks with more than about 5 workstations, due to the significant administration required

- security can be compromised, as files are stored throughout the computer network


Client Server Networking

This is the centralised model of network file sharing. In this model, server computer(s) serve the file needs of the workstation computers. It is more suitable for larger corporate networks that need centralised control of files, better administration and better file-serving performance.


Pros:

- better performance, as dedicated hardware is used for file serving

- centralised administration of folders and files

- better control of the uniformity of desktop rights and applications through the use of Group Policies

- easier to implement backup systems

- higher security of folders and files compared to peer-to-peer and cloud-based sharing

- automatically limits concurrent access to files, providing built-in version control

- auditing and reporting capabilities

Cons:

- the cost of implementing a server with OS, options and applications, plus installation and configuration, is quite significant, i.e. typically between $7k and $10k

- requires space to hold the server computer and monitor

- cost of electricity to keep the server up and running

- ongoing costs of administration, updates and maintenance, plus fixing any issues, can be significant

- potentially a single point of failure, i.e. if the server goes down for maintenance or due to a system failure, workers may not be able to access files, i.e. cannot work

- the better the server system (i.e. improved performance, upgradability, redundancy), the higher the cost



Cloud

Cloud technology seems to be the new buzzword in computing as many companies take their data to the cloud. There are a number of advantages, although it is by no means the be-all and end-all of file sharing for everyone.


·         no need to make capital outlay. based on pay for capability/capacity as you go basis

·         being on the cloud data can be accessed by any computing device connected to the internet

·         easy to configure and set up

·         saves space - no need for big heavy server computers plus associated options

·         potential cost savings as no need to pay for updates in equipment or software and no general maintenance required by end user



·         slower performance of apps compared to running using client server

·         service shut off - if timely payment of bill is not paid

·         security and data confidentiality concerns

·         apps running using cloud based technology may not have full feature set of conventional thick clients'

·         performance and availability depend on the reliability and speed of your internet access

·         cost savings can be overstated when you consider the number of employees multiplied by the lifetime of a server (say 4 or 5 years)

·         no automatic version control means data can be inadvertently overwritten


In this blog, I've talked about the major pros and cons of four different file-sharing technologies: NAS, cloud, peer-to-peer and client-server.



Posted by on in IT Tips
How to avoid Malware

Malware is a broad term for rogue software including viruses, worms, trojan horses, rootkits, spyware and adware. For a computer infected with malware, the effects can range from unwanted ads appearing, to data loss, to the computer crashing, or even identity theft. There are a number of ways to minimise the risk of your computer contracting malware.

- enable a firewall, using either Windows Firewall or a third-party product

- use up-to-date anti-malware programs and run periodic scans of your computer

- get the latest software updates

- limit user privileges

- understand how malware works

A firewall is a utility (software or hardware) that limits access to your computer from external sources. It does this by selectively blocking or allowing incoming connections from outside programs and networks.
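You can see the effect of this blocking with a short, stdlib-only Python sketch: a TCP connection to a port that a firewall blocks (or that simply has nothing listening) fails, while a connection to an open, listening port succeeds. The host and port below are illustrative, not specific to any particular firewall product.

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Attempt a TCP connection; a firewall rule (or no listener) makes it fail."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unreachable hosts.
        return False
```

Tools like port scanners work on exactly this principle, probing many ports in turn to map which ones a firewall leaves exposed.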

An anti-malware program is most effective when run periodically with the latest detection signatures. For most systems a weekly scan is enough.

Keep your software current by downloading the latest updates. This is necessary because malware authors are constantly looking for ways to bypass existing security measures. Updates can be automatic, as with Windows Update, or manual through user selection. Programs such as Internet Explorer, Firefox, Adobe Flash and Apple QuickTime release updates on a regular basis.

If a computer user does not have administrator rights, this becomes another layer of defence against malware, since malware running under an infected user's account is limited to that user's privileges.

One of the best ways to avoid malware is knowing how it is transmitted in the first place. Malware can be inadvertently downloaded onto your computer through program downloads from the web, email attachments, running infected programs and USB memory sticks.

Do not open attachments from people you do not know or trust, and be careful when you download programs from a website. Do not click on links in an email if you don't know who the sender is.
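One simple check you can automate is looking at an attachment's real file extension: attackers often use a "double extension" trick such as invoice.pdf.exe, hoping you only notice the .pdf part. A minimal sketch (the extension list here is illustrative, not exhaustive):

```python
import os

# Extensions Windows will execute; an attachment ending in one of
# these deserves extra suspicion, whatever comes before it.
RISKY_EXTENSIONS = {".exe", ".scr", ".bat", ".cmd", ".js", ".vbs", ".pif"}

def looks_suspicious(filename: str) -> bool:
    """Flag executable extensions, including double-extension tricks
    like invoice.pdf.exe, where the *last* extension is what runs."""
    ext = os.path.splitext(filename.lower())[1]
    return ext in RISKY_EXTENSIONS
```

Because `os.path.splitext` only looks at the final extension, `invoice.pdf.exe` is correctly flagged while `holiday_photo.jpg` is not.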

Phishing is a related threat, where the sender asks you to provide sensitive information such as personal details and/or credit card or banking details. A classic example is an email I received asking me to participate in a retailer's reward survey.

The email asks you to click on a link which takes you to a third-party website that mimics the retailer's site and then asks you to participate in a survey with a cash reward at the end. You can check the email's authenticity by calling the retailer or checking the retailer's website; companies often list scams and phishing activities run by unauthorised third parties using their name.
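A quick technical check before clicking is to look at which domain the link actually resolves to, since phishing links often embed the retailer's name in a look-alike host. This sketch (with made-up example domains) shows the idea using Python's standard URL parser:

```python
from urllib.parse import urlparse

def link_matches_domain(url: str, expected_domain: str) -> bool:
    """True only if the link's host is the expected domain or a
    subdomain of it; embedding the name elsewhere doesn't count."""
    host = urlparse(url).hostname or ""
    return host == expected_domain or host.endswith("." + expected_domain)
```

A spoofed link like `https://retailer.example.com.evil.net/survey` fails this check even though "example.com" appears in it, because the registered domain is actually `evil.net`.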

An often overlooked defence against the effects of malware is to back up your documents, including photos, music and projects you have worked on, because some malware works by deleting or destroying data.
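Even a very simple scripted backup is better than none. This sketch (folder paths are placeholders) copies a documents folder into a dated backup directory using only the standard library:

```python
import shutil
from datetime import date
from pathlib import Path

def backup_folder(source: str, dest_root: str) -> Path:
    """Copy everything under `source` into a dated folder under `dest_root`."""
    target = Path(dest_root) / f"backup-{date.today().isoformat()}"
    # dirs_exist_ok lets the same day's backup be re-run safely.
    shutil.copytree(source, target, dirs_exist_ok=True)
    return target
```

In practice you would point `source` at your documents folder and `dest_root` at an external drive, so the copy survives anything that happens to the computer itself.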


I shall discuss backups in a later blog.

Posted by on in IT Tips
My Computer Runs Slowly

In this post I discuss a number of factors that can affect the performance of your PC and make it run slowly:

·         CPU

·         Memory (RAM)

·         Application Software

·         System resources in use

·         Operating system configuration

·         Malware

·         Hard drive

·         Networking issues
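Before digging into any one factor, it helps to take a quick snapshot of a few of them. This stdlib-only sketch (field names are my own) reports the CPU core count and free disk space, and flags a nearly full drive, one of the common causes of slowdowns in the list above:

```python
import os
import shutil

def health_snapshot(path: str = ".") -> dict:
    """Report a few basic capacity figures for the drive holding `path`."""
    usage = shutil.disk_usage(path)
    return {
        "cpu_cores": os.cpu_count(),
        "disk_free_gb": round(usage.free / 1e9, 1),
        # Under 10% free space is a common rule of thumb for "nearly full".
        "disk_nearly_full": usage.free / usage.total < 0.10,
    }
```

For deeper measurements (per-process CPU and memory use, for example) you would reach for the operating system's own tools, such as Task Manager on Windows.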

IT Support Issues

Each week or so we'll be writing about common computer issues reported by clients. Some of the blog posts will address pre-emptive action you can take to reduce the risks of crashes, malware and downtime for your computer. If you have any particular IT issues you'd like covered, let us know.
