Thursday, 21 July 2016 06:28

HTTPOXY VULNERABILITY

If you are running your website on a VPS, you need to be aware of the httpoxy vulnerability and take immediate steps to prevent it from being exploited on your server.

What is the HTTPOXY Vulnerability?

httpoxy is a set of vulnerabilities that affect application code running in CGI, or CGI-like, environments. It comes down to a simple namespace conflict: RFC 3875 (CGI) puts the HTTP "Proxy" header from a request into the environment variables as HTTP_PROXY, but HTTP_PROXY is also a popular environment variable used to configure an outgoing proxy. This leads to a remotely exploitable vulnerability. If you're running PHP or CGI, you should block the Proxy header now.

What can happen if my web server is vulnerable?

If a vulnerable HTTP client makes an outgoing HTTP connection, while running in a server-side CGI application, an attacker may be able to:

  • Proxy the outgoing HTTP requests made by the web application
  • Direct the server to open outgoing connections to an address and port of their choosing
  • Tie up server resources by forcing the vulnerable software to use a malicious proxy

httpoxy is extremely easy to exploit in basic form. Luckily, if you are affected, easy mitigations are available.
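The recommended fix is to strip the Proxy header at the web server itself (for example with Apache's mod_headers directive "RequestHeader unset Proxy early", or by clearing HTTP_PROXY in your FastCGI parameters). As a rough application-level illustration for a Ruby/Rack app - a sketch of the same idea, not the canonical mitigation - a hypothetical middleware could drop the variable before your code ever sees it:

```ruby
# Sketch only: drop the attacker-controllable "Proxy" request header,
# which CGI-style environments expose to application code as HTTP_PROXY.
class StripProxyHeader
  def initialize(app)
    @app = app
  end

  def call(env)
    env.delete('HTTP_PROXY')  # the client-sent "Proxy:" header arrives under this key
    @app.call(env)
  end
end

# config.ru (hypothetical):
#   use StripProxyHeader
#   run MyApp
```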


Saturday, 11 June 2016 05:26

How to improve search engine rankings

Search engine companies know very well that their rivals are only a click away, so their reputation for delivering the results you requested and need is a key factor in the continued success of their business. Their top asset is their search engine algorithm, which indexes and ranks pages based on signals. Signals are indicators that the page contains quality information. The signals used in ranking are constantly evolving, as search engines strive to provide better results than their competitors. Search engine companies also provide analysis tools, like Google's PageSpeed Insights. Increasing the number of positive signals for your page will improve its ranking. The best method of improving your ranking is to:

  1. Analyse the top-ranked pages;
  2. Replicate the positive signals used by those pages;
  3. Improve on their weak signals.

Let's look at some of the most important signals.

1. Content, Content, Content

Regularly updated quality content is the most important criterion for good search engine rankings. The content must be unique (simply cutting and copying from another website is a signal of poor quality), flow naturally, be spell checked, and be grammatically correct. Cross-link the content to other articles on your site, and to external sites that rank well for the keyword or phrase being referenced. Search engines like content which is unique, well organised, and coherent. Try to ensure that your page is aimed at a particular keyword or phrase, otherwise it confuses both the visitor and search engines.

In general, stuffing your content with keywords destroys the natural flow of the article and detracts both from its readability for visitors and from your quality score with search engines. Check your content with a Flesch-Kincaid readability checker, and aim for a score between 60 and 70.
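The 60-70 target refers to the Flesch reading-ease scale, and the formula behind it is public. As a rough illustration only (the syllable counter below is a crude vowel-group heuristic, not what a real checker uses), a small Ruby sketch could score a passage like this:

```ruby
# Flesch reading ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
def syllables(word)
  groups = word.downcase.scan(/[aeiouy]+/).length  # crude approximation
  groups.zero? ? 1 : groups
end

def flesch_reading_ease(text)
  sentences = text.split(/[.!?]+/).reject { |s| s.strip.empty? }.length.to_f
  words     = text.scan(/[a-zA-Z']+/)
  syls      = words.sum { |w| syllables(w) }.to_f
  206.835 - 1.015 * (words.length / sentences) - 84.6 * (syls / words.length)
end

puts flesch_reading_ease("Short sentences and short words score well. Long ones do not.").round(1)
```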

Ensure that the language used in your page is consistent with the audience you are trying to engage.

A page should ideally be no larger than 100 KB in size.

2. Page load speed

Page load speed is the time it takes for a page to load and be displayed on your device. Research indicates that you lose 50% of visitors if they have to wait more than 4 seconds; Google likes your web page to load in under two seconds. Mobile usage currently accounts for about 50% of searches, and that ratio is increasing rapidly. A page that loads quickly on the desktop does not necessarily load at the same speed on a mobile device, so it's important to analyse and optimise pages for mobile, tablet and desktop.

3. The user experience (UX)

Search engines can determine the quality of the user experience your page presents; of particular importance is whether the page is mobile friendly. Taking a common-sense point of view: would you rank a page at the top of the results for a search from a mobile phone if it does not display properly on a mobile device?

Amongst other things, search engines can work out:

  • whether the page has to wait until all the elements have loaded before displaying the content at the top of the page that is currently visible;
  • whether the text is legible;
  • whether buttons/links are sufficiently far apart to be tapped easily;
  • whether images are compressed and sized appropriately for the device;
  • whether plugins are used to display content that might not work across different platforms;
  • whether the page displays an intermediate page showing ads whilst the page is loading;
  • whether the page is configured so that the content fits within the size of the screen without excessive scrolling.

4. Social Media/Inbound Links

There is only one thing in life worse than being talked about, and that is not being talked about - Oscar Wilde

The same is true when it comes to search engine rankings. Once you have optimised all the other signals, it comes down to which page has the best quality inbound links.

What makes a quality inbound link?

Inbound links from pages which rank highly in search engines for a relevant topic give added credence, and hence improve your quality score. Links from non-relevant pages or link directory sharing sites actually have a detrimental effect on your quality score. Don't be tempted to buy links: sooner or later you're going to get found out, and recovering your reputation with search engines can take a significant amount of time.

5. Conformance to standards

The World Wide Web Consortium (W3C) defines standards for the web.

6. Security

If your website gets compromised (hacked) or defaced, it can have a serious impact on your online business. You face reputational damage with your regular visitors, but also the possibility that your website will be blacklisted by Google, other search engines and security companies (like McAfee SiteAdvisor), and getting removed from blacklists can take a considerable amount of time. In essence you have 48 hours to resolve the issue; after that period the damage to your online reputation (and ranking) rapidly escalates. Read more about website security

Wednesday, 27 April 2016 19:15

Why an effective website needs a plan

Many companies we speak to complain that their website does not really add value to the business, or that it's not meeting the objectives they had originally envisaged. They get persuaded that it's now out of date and needs a revamp. In fact, if you analyse the most successful websites (Google, Twitter, Facebook, eBay), they hardly ever get revamped; sure, they add functionality, but the fundamental look and feel is straightforward, uncomplicated, and constant.

Many branding agencies and web designers focus on delivering a site which the client likes (i.e. looks great), but there's a whole lot more they could be doing to educate clients about maximising the return from the website, which after all is its purpose.

The arguments against doing a business plan are: it's a brochure website, we don't need to go to all that effort; we are not an e-commerce site; or how can we predict the number of visitors to plug into the plan? Well, the answer is that you can and should develop a plan. It need not be complicated or long-winded, but it should set out the potential market size, forecast an initial market share, and set targets for attracting visitors to the site and turning those visitors into clients. You can then develop targets for returning customers, financial performance, etc. Cube have developed a methodology to maximise the return from websites - Traffic, Trust and Transactions™

So how do we go about it?

Step 1 Research, set goals and objectives

The first step is to do research to answer the following questions:

  1. What are people looking for on the web in relation to my product or services?
  2. How many of them are there?
  3. Where are they located?
  4. Who is our competition?
  5. Why would a visitor buy a product from me rather than from the competition?

There are tools on the web, many of them free, which provide accurate information on the exact keywords/phrases people are searching for, the number of searches for those phrases on a month-by-month basis, and where those people are located, even down to the district within a town.

So now we know the web-based market size, on a month-by-month basis, for our products. How do we distinguish the people who are just browsing from those who are ready to buy? Typically the person who is ready to buy uses a 4+ word phrase (browsers use a single word; comparers use a two- to three-word search).

We use those phrases to analyse who the competition is, their strengths and weaknesses.

Step 2 Great Website & Social Media

The website platform and the design need to have all the features you need to showcase your products or offering, and to convert visitors into customers or persuade them to execute a call to action. It's all about engagement. The design needs to be clear and uncluttered (clean design), responsive (presentation changes according to the device being used), and contain the most effective marketing and engagement tools, so you can engage your potential and existing customers to buy your products or offerings. The more you know about your visitors, the better chance you have of enticing them to buy, so visitor and customer intelligence tools are key to success. Many businesses spend a huge amount of time on ineffective social media; our website platforms feature tools which enable you to make the most of social media with the minimum amount of effort. Cube Creative have tuned platforms for WordPress, Joomla and Magento.

Step 3 Develop targeted content

The research undertaken in Step 1 enables the writing of effective landing pages to target the phrases searched for in our target geographies. We are not going to expand on what makes an effective landing page here, but suffice to say it's an evolving art.

Step 4 Analyse performance

Using tools such as Google Analytics and StatCounter, it's possible to see the number of people who visit the site; how they got there (what keywords were searched on which search engine, Twitter, Facebook); their geographic location and, if it's a corporate user, the company they work for; the pages they viewed on the website; how long they stayed on the site; the jump-off point; and the number of times they have returned to the site. There's a whole lot more to performance analysis, but we are going to save that for our clients.

Step 5 Increase our market share

We now know the market size, our share, and the number of conversions from visitors to clients. We can now set targets and implement strategies to increase our share of searches and conversions, develop new products that people are actively looking for, or move into new geographies where we know there is a need.

The information contained in this article is only really an introduction to what should be done and what is possible; if you'd like to know more, contact us.

Wednesday, 27 April 2016 18:36

How much should I pay for hosting?


€50 vs €400

Shared hosting packages are available on the internet for approximately €50 per annum; unfortunately many resellers are selling these for €400 per annum or even more.

Shared hosting is specifically designed to provide a simple platform which will work for most common website frameworks; it also means that more than 150 websites can be sharing the same IP address - but why is this important?

Search engines (Google et al) constantly try to ensure that their top search results provide what the user is really looking for. They use "signals" to determine quality, so the web pages with the highest quality score rank highly.

If you have taken the time and effort to write properly structured content which is easy to read, grammatically correct and interesting, that in itself will increase your score; and the more high-quality websites (those that rank well on search engines) link to your web page, the stronger the signal to search engines.

However, a slowly responding web page will undo all that good. After all, who these days wants to wait around for a slow page to load? Search engines, like the citizens of the internet, have a short attention span, so consistently slow pages equal a poor rank. The answer? Pay for hosting optimised for your website, and make sure that you're getting what you think you are paying for.

Wednesday, 27 April 2016 18:35

Making Ruby on Rails Scale


So what is scalability?

Scalability is the ability of an application to satisfactorily service and respond to the number of users or requests required. The variable is the number of users or requests, and "satisfactorily service" is the subjective part. Typically, the scalability of an application is defined by the choices made by the original designers and developers of the application, and the constraints given to them by the "customer".

Twitter, for example, was not designed from the ground up to be capable of supporting millions of users, and therefore they have had scalability and reliability issues. It's not their fault; they became victims of the success of the project. Having said that, it's not a trivial or cheap exercise to recover from a scalability issue (or a poorly written application) and come out with your brand reputation intact - so you have to hand it to those guys.

Best, then, to get it right from the outset (easier said than done).

Design

We don't want to get paralysed by endlessly chewing over the design, but it's much easier to fix problems at the design stage than in production. So we need to do some planning from the outset.

Rule 1 - Get the scalability parameters from the business plan and keep the client informed of the limitations of the design from the get-go

The client should have built a business plan for the project; if they haven't, it should be ringing your alarm bells. The business plan should dictate the scalability requirements for the design, and should give you sufficient information to determine the platform you should be using for the application.

Rule 2 - Challenge whether Ruby on Rails is the most appropriate platform for this application

Cube will be writing on the limitations of Ruby on Rails v2.x in a further blog.

Rule 3 - It's horses for courses, folks! Waterfall for big objectives, agile for small utility-type applications

If the application is sizeable, and/or involves a medium to large user base, then you're going to need at least some waterfall techniques for the design; sure, you can mix in some agile techniques where appropriate to keep things moving. But for big projects, pure agile development is going to lead you into a cul-de-sac in an articulated lorry, and it ain't going to be easy to escape from that one!

Rule 4 - Map out the objectives and make absolutely sure you understand the requirements for the application

So map out the objectives and goals for the application in written form; you'll need to speak to all the stakeholders, via interviews and group sessions if need be. If you can write it down easily, you have put sufficient thought into the application; if it's hard to write down, you need to think further.

Rule 5 - If you can't write it down easily, you haven't put in sufficient thought

The map is going to form the basis of the specification of the application. Once you've completed and reviewed the map, the application architect takes on the mantle. The architect will map out and agree with the client the actors, roles, processes and artifacts and, crucially, the interactions between them. You should then be in a position to develop and agree the views. These are best mocked up in pictorial form (whichever form suits the team best); you then need to review the pictorial views against the actor/role/process/artifact model. This technique is designed to ensure that the views will be usable in the real world, and it also provides good information about what data and methods will be required for the controller and model design.

Model-View-Controller Architecture

Shoulda been called the VCM model in Rails, since the model and view never directly interact. Below is my interpretation of the Rails architecture.

Ruby on Rails Architecture

Rule 6 - Stick with the Rails rules for Views, Models, and Controllers from the outset

The purpose of the view

The view is purely there to provide an interface to the user; it supplies and retrieves data to/from the controller (CSS, HTML, JavaScript, JSON, XML, CSV, PNG, etc.). So don't put any heavyweight code, such as complex validation, in there by embedding JavaScript. If you want to do asynchronous validation (by that I mean in real time, before the form is submitted), use a JavaScript function and generate it from the controller using one of the libraries (jQuery, MooTools) or plain Ajax (Ajax would be my favourite for this task). Views should not interact with anything other than the controller.

The purpose of the controller

The controller is like a traffic cop marshalling requests: it takes requests from the view, parses them, handles sessions and cookies, and submits and requests data from the model; it also serves to provide security for the application. It should be mean and lean; if it's not, you need to rethink and refactor.
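As a minimal sketch of what "mean and lean" means in practice (hypothetical names, classic Rails 2.x idiom rather than any project's real code):

```ruby
# Hypothetical example: the controller only marshals the request and
# delegates validation and business rules to the model.
class OrdersController < ApplicationController
  def create
    @order = Order.new(params[:order])
    if @order.save
      redirect_to @order            # hand back to the view layer
    else
      render :action => 'new'       # redisplay the form with validation errors
    end
  end
end
```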

The purpose of the model

Models validate, store and retrieve data from the database, and deal with the business logic. The model is where all the hard work is done; in the traffic analogy it's the articulated lorry - it does all the heavy lifting and transport.
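A matching model sketch (again with hypothetical names), carrying the validations and business logic the controller and view should never contain:

```ruby
# Hypothetical example: the model validates, persists and owns the business logic.
class Order < ActiveRecord::Base
  belongs_to :customer

  validates_presence_of     :customer_id, :total
  validates_numericality_of :total, :greater_than => 0

  # Business logic belongs here, not in the controller or the view.
  def apply_discount(percent)
    self.total = total - (total * percent / 100.0)
  end
end
```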

I can understand that sometimes you want to use a stored procedure to make the database server do the work, since it allows the logic to be split into an n-tier architecture. I'd resist that until I was absolutely sure there was no other way.

Object to Relational Mapping

The next exercise is to perform the object-to-relational mapping and database design - it's absolutely crucial you get this right. It's very difficult to play around with database design once you've got a couple of million rows in place, and some initially happy and expectant customers.

Rule 7 - Use the Rails conventions for Object, Table, and Relationship mapping

Unless you are porting a legacy database and have no other choice, stick with the Rails conventions for table and attribute naming. I've spent plenty of time regretting early decisions made thinking I was right and the conventions were wrong.
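A hypothetical migration illustrating those conventions - plural table name, an automatic integer id, and foreign keys named after the association:

```ruby
# Hypothetical Rails 2.x-style migration following the naming conventions.
class CreateOrders < ActiveRecord::Migration
  def self.up
    create_table :orders do |t|      # model Order maps to table "orders" by convention
      t.integer :customer_id         # belongs_to :customer expects exactly this name
      t.decimal :total, :precision => 10, :scale => 2
      t.timestamps                   # created_at / updated_at maintained by Rails
    end
  end

  def self.down
    drop_table :orders
  end
end
```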

Rule 8 - Perform a sensibility check on the design

Finally, perform a sensibility check of your design against the actor/role/process/artifact model, just to make sure that you haven't missed anything. If you can't make this diagram look clean and readable, then it's likely that the design needs more work.

Ruby on Rails

Let's talk about pure Ruby (as opposed to JRuby).

Ruby is an object-oriented interpreted language, developed in 1995 by Yukihiro "Matz" Matsumoto. Ruby is an extremely elegant language which allows the programmer an immense degree of freedom. That freedom, though, comes at a price: as an interpreted language, Ruby is not the greyhound of the language world (Ruby 1.9.0 vs C++ vs Java - 89.3, 1.6). But at least, with the advent of Ruby 1.9, we have the ability to take advantage of multiple OS threads; in versions prior to 1.9 we could only take advantage of a single OS thread.

The major limitation is the Global Interpreter Lock (GIL), which prevents more than one OS thread from executing Ruby code at a time. So what does this mean? Well, we can't take advantage of multiple CPU cores, and we have an IO blocking issue. The reason is the lack of certainty that application code is thread safe.
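A quick way to see the effect (a rough sketch; exact timings vary by machine and Ruby version): CPU-bound work split across threads in MRI takes roughly as long as running it sequentially, because only one thread holds the lock at a time.

```ruby
# Sketch: CPU-bound work gains nothing from threads under the GIL.
require 'benchmark'

work = lambda { 3_000_000.times { Math.sqrt(rand) } }

sequential = Benchmark.realtime { 2.times { work.call } }
threaded   = Benchmark.realtime do
  [Thread.new(&work), Thread.new(&work)].each(&:join)
end

puts "sequential: #{sequential.round(2)}s, threaded: #{threaded.round(2)}s"
# In MRI both figures come out roughly the same; an implementation without a
# GIL (e.g. JRuby) can spread the threads across cores.
```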

Thread Safety

So what's thread safety, and why has it not been implemented before? Threads share the same memory address space, so you can write to the same variable multiple times from multiple threads - so which one is the write (sic) one, and how do you implement thread safety? The answer is to take a lock at the start of the function/method and drop the lock at the end. The problem is that this is expensive in resource terms, 'cos you're blocking on the whole function. A more refined approach is to put locks around the writes to the variable; this is less expensive, but more complicated, and such locks ain't easy to debug. A simpler option is to make variables write-once - this is the approach adopted in the dataflow gem. Thread safety is only important for parallelism, so should you bother? If you need an application to scale, a simple method is to add more hardware, but that only works if you have built in the tools to make best use of the additional hardware.
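To make the locking idea concrete, here is a tiny sketch using Ruby's standard-library Mutex - the "lock around the write" approach described above:

```ruby
# Sketch: serialising writes to a shared counter across threads.
require 'thread'

counter = 0
lock    = Mutex.new

threads = 4.times.map do
  Thread.new do
    10_000.times { lock.synchronize { counter += 1 } }  # only one thread writes at a time
  end
end
threads.each(&:join)

puts counter  # => 40000, because no two threads ever updated counter at once
```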

Application Partitioning

Before threads, the majority of system architects took the fork-and-exec daemon approach. Let's assume that we have an application in which clients submit a form periodically, administrators perform administration, analysts analyse, and managers generate MIS reports. It's possible to partition the application so that different instances of the application handle each community; we may even adopt a different scaling strategy for each community.

Message Queuing

Applications sometimes require large blocks of code and complex database calls which block other, simpler operations whilst executing. One solution is to use message queuing. Basically, messages are passed to a queue; at the end of the queue is a background task execution server, which pops messages off the queue, executes the task, and passes results back via the message queue. It's possible to perform asynchronous operations within a page using this technique. A significant number of message queuing components are available, from Apache's ActiveMQ (using the Stomp protocol) onwards.
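The principle is easy to sketch with Ruby's standard-library Queue (an illustration of the idea only, not a Stomp/ActiveMQ client):

```ruby
# Sketch: a background worker pops jobs off a queue so slow work
# doesn't block the request/response cycle.
require 'thread'

jobs    = Queue.new
results = Queue.new

worker = Thread.new do
  loop do
    job = jobs.pop                                 # blocks until a message arrives
    results << "generated #{job[:report]} report"  # stand-in for the slow database work
  end
end

jobs << { :report => 'monthly MIS' }   # the web request just enqueues and returns
puts results.pop                       # the result is collected asynchronously later
worker.kill
```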

Sooner or later, though, the database is going to become the blocking factor, even with pooled connections. Rails (at least as of 2.2) does not inherently handle different database connections concurrently, and that's where you need to think about alternative approaches such as message queuing - but that involves making significant changes to your code. If you're doing this at the design stage, then great, but make sure that the messaging engine you are going to use is appropriate.

Data Partitioning

Let's say you've got a huge database, and it has become (or will become) the blocking factor; then you need to split the database up. You use a single database instance, or a cluster of instances, as an index server: you perform a lookup on the record you require from the index server, and then access the record from the appropriate database instance. Currently, this technique is beyond Rails, so you'll need to perform some fancy footwork to get it going robustly.

The De-coupled approach

An alternative approach to thread safety is to use a reinvention of the fork-and-exec daemon approach, using the HTTP server to handle and distribute incoming requests and generating multiple processes to handle them. The basic concept is to use Apache with mod-cluster to run several Mongrel servers. The principle is shown below. It's also possible to do this with several other web servers; an alternative is to use IBM's web server with JRuby.

Wednesday, 27 April 2016 18:34

2015 WordPress Vulnerability

The recent WordPress security alert advised users to upgrade to the latest version of WordPress, and published a list of those plugins that WordPress thought vulnerable. Users are advised to upgrade as soon as possible. In this particular case, however, keeping your plugins and WordPress up to date would not have protected you from infection. The core issue was a poorly documented subroutine/API which led developers to believe that they did not need to clean the parameters in the URL (which can be used by hackers to execute commands on the web server). The exploit can be used to leave behind nasty backdoors and malware which attackers can activate at their leisure - so the infection won't necessarily be caught by a malware checker.

We also found that many modifications to plugins and themes performed by web developers were vulnerable to the same security issue. Simply upgrading to the latest version of WordPress and its plugins won't necessarily help you if your site has already been infected. So how do you know if it has, and what can you do about it? The only real way of protecting yourself is to reinstall WordPress and all your plugins from scratch, install your theme (after checking it for infection), and import your data (after checking it for compromised comments etc.). That's going to be a long and expensive process - particularly if you have a big site.

We upgraded WordPress and its plugins in a timely fashion on a shared VPS with about 10 WordPress sites on it, and then went looking at themes and at modifications to plugins done by others. We found that even after the upgrade the sites were compromised; in our particular case it was a PHP injection attack. We cleansed it by searching every PHP file for the pattern of the attack and removing the offending code, and we also scheduled a search for the pattern every hour so we can see if we are still vulnerable. We then went looking on our dedicated VPSs and found exactly the same issues. We then had to reinstall WordPress from scratch on every site just to be sure that no trace of the infection was left behind.

The pattern we found was a PHP injection attack; it may not be the same for everyone, and the injected code is written to look as though it belongs there. Look for "Speedup php function cache" in all PHP files: the function uses base64 to inject whatever code the hacker likes into your PHP files.
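Not the exact script we ran, but a rough Ruby sketch of that hourly check - scanning every PHP file under a (hypothetical) document root for the marker string - would look something like this:

```ruby
# Sketch: report every PHP file containing the injection marker mentioned above.
MARKER   = 'Speedup php function cache'
DOC_ROOT = '/var/www'   # hypothetical path; point it at your own web root

Dir.glob("#{DOC_ROOT}/**/*.php").each do |path|
  File.foreach(path).with_index(1) do |line, lineno|
    puts "#{path}:#{lineno}: #{line.strip}" if line.include?(MARKER)
  end
end
```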

A strong message to all developers: always, always escape input/URIs from whatever source (<input>, GET, server variables, URIs), even if you think it's safe and has been escaped before.

The SS7 Vulnerability

A signalling system called SS7, used by virtually all mobile phone companies worldwide to connect between networks, has a vulnerability enabling cyber criminals and government agencies to listen to phone calls, read your texts, and find your location just by knowing your phone number - and there is nothing an end user can do to protect themselves against it. The increasing use of SMS codes to authorise changes to bank accounts and payments, and of two-step authentication for websites, means that yet another authentication method has been compromised.

What does SS7 do?

SS7 allows mobile phone networks to exchange the information needed for passing calls and text messages between each other and to ensure correct billing. SS7  also allows users on one network to roam on another network or in another country.

What can access to SS7 enable hackers/governments to do?

Cybercriminals and security agencies can transparently forward calls, giving them the ability to record or listen in to them;  read SMS messages sent between phones, and track the location of a phone.

Who is affected by the vulnerability?

Anyone with a mobile phone could be vulnerable, provided an attacker has access to their mobile network or one of its connected networks.

What’s being done about it?

Since the exposure of security holes within the SS7 system, certain bodies, including the mobile phone operators’ trade association, the GSMA, have set up a series of services that monitor the networks, looking for intrusions or abuse of the signalling system.

What are the implications for users?

One of the biggest dangers is the interception of two-step verification codes that are often used as a security measure when logging into websites, email accounts or banking, where verification codes are sent via text message.

Banks and other secure institutions also use phone calls or text messages to verify a user's identity; these could be intercepted and therefore lead to fraud or malicious attacks.

What can I do to protect myself from snooping via SS7?

There is very little you can do to protect yourself beyond not using the services.  For text messages, avoiding SMS and instead using encrypted messaging services such as Apple’s iMessage, Facebook’s WhatsApp or the many others available will allow you to send and receive instant messages without having to go through the SMS network, protecting them from surveillance.

For calls, using a service that carries voice over data rather than through the voice call network will help prevent your calls from being snooped on. Messaging services including WhatsApp permit calls. Silent Circle’s end-to-end encrypted Phone service or the open-source Signal app also allow secure voice communications.

Your location could be tracked at any time you have your mobile phone on. The only way to avoid it is to turn off your phone, or turn off its connection to the mobile phone network and rely on Wi-Fi instead.

What is the current situation?

Security holes within SS7 were first uncovered by security researchers and demonstrated at the Chaos Communication Congress hacker conference in Hamburg in 2014. In 2015 the hacking of Italian surveillance software vendor Hacking Team highlighted the continuing use of the SS7 system in government and criminal snooping. German researcher Karsten Nohl's demonstration for CBS's 60 Minutes - remotely surveilling a US congressman in California from Berlin - has brought SS7 under the spotlight once again, and Congressman Ted Lieu has called for an oversight committee investigation into the vulnerability.

Wednesday, 13 April 2016 11:54

Page Load Times

Page Load Speed & Why it matters

An important factor in the success or failure of your website is the time it takes to load a page. Google likes your page to load in under 2 seconds; just putting up a page with good quality content that loads quickly will work miracles for your ranking. Some website owners load a page during development and it appears incredibly quick - job done! Sadly, that's not always the case. Modern browsers save images and files into local temporary files when you first load a page, and use those locally held files (termed the cache) when the page is requested again. A first-time visitor, though, has to fetch those files from your web server, transfer them to their local machine, and then render the page (the process of displaying images and formatting text). We have seen web pages which take 35 seconds to load for a first-time visitor, but 1.7 seconds for a repeat visit. Research has shown that slow web pages (and by that we mean pages that take more than 4 seconds to load) lose 50% of visitors to rivals.

Why does page load speed matter?

Really fast pages make an impact before the visitor has even read the content; they convey a conscious and subconscious message about your company's approach to quality, customer service, and the importance of online services within your business. Search engines want to deliver a quality experience to their customers; their search algorithms avoid awarding a top ranking to slow pages, as page speed is an important signal that the page has the potential to contain quality content.

If your web page is slow, what message does that convey to potential customers (and search engines) about your company? Speed is an important indicator of a company's approach to quality, customer service, and online services.

What causes slow pages?

Poorly Written Websites

HTML errors

Modern browsers are incredibly tolerant of errors, so whilst a page may look great, it might not be written very well. When there are errors on the page, it takes browsers time to work out how best to render it. In addition, when your page is visited by search engines, they analyse the page to see if it contains errors; it's one of the quality indicators (or signals, in SEO parlance) they use to help their ranking algorithm determine your quality score.

Image size

The larger an image, the longer it takes to transfer across the internet, right? Yes, that's true, but just because an image looks small on the screen does not mean that the actual image file is not huge. We have seen a website with a logo which measured 100 pixels wide on the screen, but was 10,000 pixels wide in the image file - the HTML code simply tells the browser to resize the image. We typically find that the majority of images on a website have not been optimised, and this can seriously affect page load time. Search engines can calculate the size at which an image is displayed on the screen, and the size the image file actually is; if the two differ widely, it marks down the quality score for that page. In theory you should have a different image file for each device size (i.e. mobile, desktop, tablet).
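As an illustration of pre-generating device-sized variants - a sketch assuming the mini_magick gem (and ImageMagick) are installed, with hypothetical file names:

```ruby
# Sketch: create one appropriately sized copy of an image per device class.
require 'mini_magick'

{ 'mobile' => 320, 'tablet' => 768, 'desktop' => 1200 }.each do |device, width|
  image = MiniMagick::Image.open('logo.png')  # hypothetical source image
  image.resize "#{width}x"                    # fit to the target width, keep aspect ratio
  image.write  "logo-#{device}.png"
end
```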

File Sizes

Computers ignore whitespace (spaces, newlines, tabs) in programming code, but programmers and software engineers prefer nicely formatted text, which is easier to read. The additional overhead caused by this whitespace can be hundreds of kilobytes or even megabytes of unnecessary data. We'll discuss in the next article how to keep the software engineers, internet users and website owners happy.

Design

Poor design is often responsible for poor performance. Bloated themes and templates are common on the internet; they typically look great, but contain huge images and massive amounts of code which the typical website never needs. Some website designers also take shortcuts, copying website styles they have used before, which increases page load time. Designs should be lean and contain only the styles and code necessary to run the website. For each file the website requires, the browser has to make a separate request to the web server, which has to find the file, load it, and transfer it across the internet; it's imperative to minimise the number of files required by a website, and thankfully there are techniques to optimise even poorly designed websites. None of this means that you must have a text-only website - you can have an image- and video-rich website, but you must design it with this in mind.

Cutting and Copying Text from word processors

When you copy and paste text from a word processor into a web page, it will typically copy in all of its internal formatting code. So what you see on the screen is:

"This is some formatted text" - but the code needed to produce "This is some formatted text" when pasted from a word processor is huge. Take a look yourself.

Hosting

It's very tempting to opt for cheap hosting - it's all the same, right? In reality that's not true. If you are using a shared hosting platform, it's possible that you are sharing an IP address (and hardware) with up to 2,000 other websites. To maximise the benefit to the service provider (did I say that? I meant to say the client), they impose restrictions on which optimisation functions a website can use, in order to limit the amount of CPU and memory used by any particular website. The hosting platform will also be sharing disk with other applications in the datacentre. All of this can contribute to either a slow or an inconsistent response time from the web server - Google likes your web server to respond in under 200 milliseconds - and remember, your web server has to serve every file required to render your web page. So hosting is an incredibly important factor in response time.
