I’ve had issues when designing websites where the page appears to shift horizontally. This is a common problem, caused by the vertical scroll bar (used to move up and down a page) automatically appearing when the page content requires scrolling and disappearing when it is not needed.
This is not an issue in all browsers, as some always show the vertical scroll bar, so those users will never see the shift. I have used three methods to force the vertical scroll bar; the first isn’t valid CSS, but it seems to work in all browsers apart from Opera. I would recommend using the second or third.
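To illustrate the general idea, here is one commonly used, valid-CSS way of forcing the scroll bar (a sketch only; not necessarily identical to the three methods mentioned above):
CSS
/* Always reserve the vertical scroll bar so the page width never shifts */
html {
    overflow-y: scroll;
}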
All operating systems have some kind of scheduling ability: on Windows we have Task Scheduler, and on Linux we have cron. Both accomplish the same task, allowing us to run scripts or programs automatically based on different events, times and days. From my experience, people tend to call the task scheduling system on Linux various things, such as cron, cronjob(s), crond and cron daemon. All of these are fine, but the more correct name is either crond or cron daemon, as cron is a daemon process running in the background of the Linux operating system.
This is how I install crond onto my CentOS server edition, as it doesn’t seem to come preinstalled. This might not be true for other distributions, and the following command may not work for all of them.
Shell
yum install vixie-cron
If the yum command is not found on your system, then you most probably should be using the apt-get syntax instead.
Shell
apt-get install cron
If the installation package is not found, try searching your distribution’s package repositories; if that fails, there is always Google, or comment on my post and I will give you a hand.
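Once cron is installed, jobs are scheduled by adding entries to your crontab (run crontab -e to edit it). As a hypothetical example (the script path is made up), the following entry runs a backup script every day at 3am:
Shell
# minute hour day-of-month month day-of-week command
0 3 * * * /home/shane/backup.sh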
Gone are the days of hundreds of CDs and photo albums; most of us now have thousands of valuable, irreplaceable photos and videos. So why do thousands of us never create backup procedures or recovery plans? Remember, this data is irreplaceable: you’re literally throwing away memories, along with the hundreds of pounds you have spent building up your music collection.
We all want a simple, cheap, effective and reliable method of backing up data, and Dropbox could be the answer. I have been using it rigorously for the past three months and have already started recommending it to others. It’s a great tool, and let me tell you why.
The basics
You install a very small application on your computer and assign a folder; all files you place in this folder will automatically be uploaded to the Dropbox servers, where you can access them via Dropbox’s website interface or via another computer running the Dropbox application linked to your account. Any changes you make on your second computer will upload to Dropbox’s servers, and your other computer will download or perform the required changes so that both computers hold exactly the same data.
Syncing between multiple devices
You can install the Dropbox software on multiple computers and laptops, and as long as they are all using the same Dropbox account, any changes performed on one device will also happen on the others. For example, I have my main laptop and a netbook. When I edit a file located in my Dropbox folder on my main laptop, the change is uploaded to Dropbox; the software on the netbook detects that the version on Dropbox is different and syncs the data, so I have an exact copy on my netbook as well. You can potentially sync your files between hundreds of devices.
Free v Paid
There aren’t any differences between these apart from disk space; if you require more room, you will simply have to pay a small monthly fee or invite some friends to earn extra space. I use the paid service myself, as I require the extra disk space, and at only £9 a month to back up all my music, videos, images, documents and programming work, it’s not a bad offer, especially when your data is synced between multiple devices and is worth considerably more than the monthly cost.
Undo/History (Packrat)
As an extra feature, you can pay 3.99 a month to undo and view previous versions of your files, also known as history. This feature may not be beneficial for everyone, but I would recommend looking into it. Let’s look at my circumstances: I am a programmer, and all my work is automatically synced with Dropbox so I don’t lose data. The only issue is that if I accidentally delete or edit a file unintentionally, that change will upload to Dropbox, and the file is now gone or changed forever. With this extra feature, I can log into my Dropbox account, view all deleted files and recover them; I can also view and download previous versions of a file, so if I accidentally change a file I can simply retrieve an older copy.
This feature is also an added layer of backup protection. Let’s say someone gains access to your computer and deletes all the content in your Dropbox folder (and also removes any other method of recovering the files); that change will now occur on Dropbox and all your other devices. If the only copy of these files was what you had synced with Dropbox, you now have nothing and no way of recovering them. This feature would allow you to undelete the files: as long as the person doesn’t log into your Dropbox account via the web browser and permanently delete your files, you should be able to recover them just fine.
Shared web hosting is common and relatively cheap these days: pay a small monthly fee, from £1 all the way up to £35+, and you can easily set up a website with FTP, database and mail support. The features provided with shared web hosting vary widely, from providers offering hardly any to others providing fully automated installation scripts for one-click website setup. It’s a competitive market, with the leading giants competing with each other and the smaller single-person companies trailing behind, trying to make a living while never being able to match the giants without making a huge loss.
What people don’t understand is that a shared hosting package is for hosting websites only, with some providers offering a few extra features such as mail services. I have had numerous people ask me to set up various software, tools and services on their server, only to find out that this “server” is actually a hosting package on a shared web hosting server (which they don’t actually own or have control over). The response is normally: why not?
Why not?
Well, the answer is simple: you are technically only renting the ability to store your files on your host’s server and to use their already-installed web server software to distribute your website content across the internet. It is called shared hosting because multiple websites are hosted on a single server, meaning each of your host’s clients, including you, is locked down into their own little prison; your website doesn’t (or shouldn’t) have the ability to access any data outside of your dedicated area.
Giving you the ability to install software or reconfigure already-installed software would compromise the server’s security and reliability, and any changes you made would also affect every other client your host has on that server. Let’s say you change the PHP configuration and disable some PHP extensions because your website doesn’t use them: you have now potentially stopped every other website hosted on that server from working, all because of your small change.
Installing new software raises the same issue, but there is also the risk of overloading the server’s CPU and RAM, and most importantly of licensing problems and illegal activity. A hacker with access to the internals of a server and the ability to execute commands could easily find a way to break out of their dedicated space and wreak havoc on the server.
A shared hosting provider cannot risk decreasing the security and reliability of a server to enable you to install custom applications; at the end of the day, you do not own the server. Think of it the other way around: if you had a server hosting potentially hundreds of paying customers, would you want to risk their websites going down every two minutes, or have to tell them you have lost all their data?
The solution
The easiest method is to set up your own server. If you do not have enough money for this, you could always look at renting a dedicated server from a third party or purchasing a virtual private server (VPS). I use a VPS myself, which is essentially a mini dedicated server and gives you complete control of the machine and its software. Here are a couple of posts I have written related to VPSs which I would definitely recommend reading.
Website compression is an important feature which can help increase a website’s loading speed. Before you dive straight in, though, you need to understand how it works, as it will increase CPU usage, which could lead to a web server (especially a busy one) becoming slow, resulting in slow website speeds (and potentially a website which now loads even slower than it did with no compression).
On demand v precompressed content
When enabling compression such as Apache’s deflate or gzip, the content is compressed on the fly, meaning that when a visitor requests a page, all the content related to that page is compressed at that very moment and then sent to the client; in other words, with every page load the CPU recompresses the content for each client. This has its positives and negatives: stale content (on dynamic pages) is never sent to the client’s browser, but the server has to perform compression with every request, and on a busy server this can lead to a stall, as the CPU can’t process the compression requests faster than it receives them.
Precompressed content, also known as caching, has its benefits: the server compresses the content once and then stores it in a cache location, and any further requests for files which have already been cached are served from the cache, so the server doesn’t have to recompress them. The major issue with caching content is that dynamic pages will not work correctly if loaded from the cache; stale content (old versions of a page) will be sent to the client, which can lead to website features such as login pages not working.
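For reference, this is roughly what enabling on-the-fly compression looks like in Apache; a minimal sketch, assuming mod_deflate is loaded (the exact directives depend on your Apache version and configuration):
Apache
# Compress text-based content types on the fly
AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript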
Knowing what to compress
The first thing you must understand is what types of files and data you can usefully compress. It is quite common to find people enabling website compression and just compressing everything; this increases CPU usage, and potentially loading times, for no gain when the data is already compressed. A lot of data on a website is already stored in a compressed format, such as images and archive files: formats like zip, rar, jpg, png and gif are already compressed, so recompressing such a file doesn’t reduce its size greatly (you are talking about a couple of KB, or maybe a couple of MB on a huge file). All you will be doing is putting extra, unneeded stress on the CPU, which on a busy server will slow the overall performance. If your images, videos or archive files are large, you should find alternative ways to lower their size, such as decreasing the bitrate, audio quality, image quality, scale and various other options. You should read my post here related to oversized images.
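A common way to handle this in Apache is to exclude already-compressed formats from compression; a sketch, assuming mod_setenvif is available (adjust the extension list to suit your site):
Apache
# Don't compress files which are already stored in a compressed format
SetEnvIfNoCase Request_URI "\.(?:gif|jpe?g|png|zip|rar)$" no-gzip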
Is my website already compressed?
It is simple to detect whether your website is being compressed. One of the simplest methods is to use a gzip testing website such as http://www.gidnetwork.com/tools/gzip-test.php, which will tell you whether your website is being sent compressed and, if not, estimate the total size if it were. Alternatively, if you view the header data of your website’s response, it will include a Content-Encoding line; if there isn’t one, compression isn’t enabled. You may need to use an add-on such as Live HTTP Headers for Firefox to view the header data.
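If you have shell access, you can also check from the command line; a quick sketch using curl (replace example.com with your own domain):
Shell
# Discard the body, print the response headers and look for Content-Encoding
curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" http://example.com | grep -i content-encoding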
Should I or shouldn’t I?
Compression does have its benefits, but it comes with risks as well. I use it myself, but I have tested it rigorously on my server and have a CPU usage monitor installed; this allows me to keep an eye on the CPU usage, and if it ever goes too high from a sudden burst of traffic, I will know and can disable or decrease the compression level. One thing is for sure: I would never enable compression on a live server unless I knew for definite that the server could handle the extra load. I have seen numerous people say their server is struggling and the CPU is dying, yet as soon as compression is disabled it goes from performing like a school bus to a Ferrari almost instantly. Yes, the benefits can be great, but you must, and I mean must, test it first, especially if it’s an already busy server.
A very common issue on most websites is oversized images. Most people tend to leave them alone as their website appears to load fast, but they never think about the potential visitors they are alienating: those without fast internet, or on mobile internet devices, for whom pages can take a long time to load depending on signal. You may still be wondering: what is an oversized image? It’s an image which has been scaled down to fit onto a website; the issue with scaling down an image is that visitors still have to download the full-sized image, no matter how much you scale it.
Let’s say, for example, Harry has a website with an 800×600 image which is 2MB in size. The image is too big for his website, so he scales it by 50% to make it fit, and the image is now displayed at 400×300. The issue is that, even though the image has been scaled by 50%, any visitor to Harry’s website still has to download the 800×600 image, which is 2MB. If Harry instead resized the actual picture to 400×300, that could decrease the file size to around 1MB, resulting in faster and smoother loading speeds for his visitors.
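The fix is to resize the image file itself rather than scaling it in the page. As a hypothetical sketch using ImageMagick (the filenames are made up):
Shell
# Create a genuinely smaller file instead of scaling the original in the browser
convert photo-800x600.jpg -resize 400x300 photo-400x300.jpg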
Here is an example of an image I have lying around on my computer. The left side is the original size and the right is the new size. This is just an example, but it helps illustrate the point to you, my readers.
So we have the same image in two different sizes; the only thing which has been adjusted is the scale of the image. The one on the right is 50% smaller in both width and height than the one on the left.
As you can see from the file sizes, the right image is significantly smaller. This is definitely something all web developers and users should be aware of and trying to fix on their websites. Just remember: it may load fast on your home internet, but there are people out there with even slower internet, and this is the generation of portable internet. Do you really want to alienate thousands of potential visitors?
When making a request with fsockopen, the response you receive may not be exactly what you expected. Let’s say you make a request for a simple HTML page containing a single word, “hi”. You would expect the fsockopen response data to be just the word “hi”, and we do indeed receive the word, but above it are around six lines of extra data. This extra data is called header data and is added to any request or response automatically. When you sent your initial request, you would have provided some header data, which the web server uses to determine exactly what you’re requesting and the format of the data, so it understands the request. In return, the web server returns your data but also includes header data related to the response; this header data is normally picked up and processed by our web browser, so we never normally see it.
The way the header section ends is special: it always finishes with a double line break, which looks like the following.
Shell
\r\n\r\n
Due to this, it is easy to filter out the header data using the following line of code.
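Something along these lines will do the job; a minimal sketch, assuming the raw response is stored in $response and the header terminator is present:
PHP
<?php
// Split on the first blank line: headers before it, body after it
list($headers, $body) = explode("\r\n\r\n", $response, 2);
?>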
This is a simple function I have used numerous times to simplify the process of stripping the header data which is normally found at the top of most web server responses. I normally use this function if I am working with fsockopen and need to remove the response header before processing the response.
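A sketch of such a function (the name strip_headers is illustrative, not the original):
PHP
<?php
// Remove the HTTP header block from a raw response string.
// Returns the body only; if no header terminator is found,
// the input is returned unchanged.
function strip_headers($response)
{
    $position = strpos($response, "\r\n\r\n");
    if ($position === false) {
        return $response;
    }
    return substr($response, $position + 4);
}
?>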
Recently I have been creating a bunch of bash scripts to automate some processes. I normally run these scripts manually, so output is helpful, but today I was working on a script which runs via a cronjob every 12 hours. The issue with this is that if a script outputs any data whilst being executed via cron, the output is emailed to the user who ran it. This was causing an email to be sent to my inbox every 12 hours, which I didn’t want (yes, it allows me to verify the script has run, but I didn’t need to know this or want an email every 12 hours).
The answer is simple: all I had to do was redirect the stdout output to a file or a null location. This is achieved using the following command.
Shell
scriptname > /dev/null
What this does is send any output which was going to stdout to /dev/null, so we no longer see it. This however doesn’t hide error messages (stderr); either of the following two commands will redirect error messages as well.
Shell
scriptname > /dev/null 2>&1
scriptname > /dev/null 2> /dev/null
You can replace /dev/null with an actual file, such as /home/shane/run_log.log, and the output will be redirected to that file instead. The first line above directs both normal messages (stdout) and error messages (stderr) to the same location, /dev/null; the second command, however, can be used to direct normal messages and error messages to separate locations or files.
Shell
scriptname > /dev/null 2> /home/shane/run_error.log
The line above will suppress normal message output, but error output will be stored in run_error.log. Because we are writing to a file and not the shell output, we no longer receive the output in an email.
Today I came across a situation where I needed to loop through some data and open an HTTP request using a socket connection. Each loop iteration had to open a new connection, and the handle had to be stored in a new variable so that each connection could be read separately (opening a new connection using the same variable name would close the previous connection and create a new one).
It’s quite simple if you know exactly how many connections you need to open each time, as you can just define the maximum required variables. But what if one day it needs to open 2 connections and the next it needs 8, with no defined maximum? No one wants to write out 100 variables just in case, and it doesn’t meet any coding standards either.
A dynamic variable is what we need, and PHP has this built in, which has been a life saver, as we can declare many unique variables, but only as many as we need. Here are some examples; the first creates 10 dynamic variables (dynamic_variable_0, dynamic_variable_1, ..., dynamic_variable_9).
PHP
<?php
for ($a = 0; $a < 10; $a++) {
    ${"dynamic_variable_" . $a} = $a;
}
?>
The following code outputs the data from the 10 dynamic variables we created in the example above.
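A minimal sketch of how that might look (the output formatting here is illustrative), assuming the variables from the example above have been created:
PHP
<?php
// Rebuild each variable name and print its value
for ($a = 0; $a < 10; $a++) {
    echo ${"dynamic_variable_" . $a} . "\n";
}
?>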