Driving Traffic and Sales: The Imperative Role of Website Optimization in Modern Business Strategies

Reasons behind the growth of website optimization among companies seeking to attract traffic and generate sales

Abstract

The project aimed at establishing why website optimization is such a significant process when it comes to the generation of traffic and sales. Coverage of the background study led to the identification of gap areas such as code speed optimization, which is needed to boost the performance of a website. With the help of the wide angle approach and the agile methodology, the project developed a website for an online restaurant. An analysis of the performance of the website revealed that code speed optimization was necessary, with minifying scripts, removing unused libraries and reducing image sizes serving as ways of boosting web page responsiveness.

 

1. Introduction

1.1 Background

The current world of business demands technological considerations that upscale the quality of services. More attention is given to changes within the business environment, which is expected to embrace business dynamics. Research conducted by Rivera (2019) indicated that modern businesses consistently need a strong and reliable digital presence. The same research found that e-commerce accounted for almost $2.3 trillion in sales around the world, a figure projected to rise to around $4.88 trillion by 2021. Rivera (2019) further indicated that if any business enterprise wishes to be part of this projection, it has to make web optimization a priority. The scope of web optimization essentially covers the significant marketing techniques and approaches utilized in attaining traffic, converting prospects and engaging visitors in a more lucrative way. However, an understanding of web optimization alone is never enough, as stakeholders call for additional skills needed in handling corrective actions. Web optimization carries with it three significant approaches. The first approach covers search engine optimization, which touches on the fine-grained activities linked to the website content and the stability of advanced search tasks. Both in the United States and Europe, SEO is largely associated with Google. Close competitors to Google, such as Yahoo, are believed to have had their own ways of making entry into the markets.

Based on a study conducted in 2018, search engine optimization rests on a set of fundamental pillars. The first covers website content, in which the quality of the content and the nature of the resources determine the number of visitors who will be interested in the products or solutions provided. Content can be written, visual or video, with significant keywords utilized to boost ranking. SEO also depends on on-page optimization, which encompasses the technical aspects linked to ranking. The closing pillar covers links, which fuel the search engines as argued by Quinton and Khan (2009). Findings tapped from different case studies show that website optimization can be made possible through optimizing technical performance. In most cases, performance is measured in terms of site speed, which can be determined with the help of tools like the Pingdom Website Speed Test, PageSpeed, WebPageTest and GTmetrix. In a different set of findings, around 52% of world traffic is believed to have been generated through mobile devices in 2018. The emergence of landing pages was seen as one way of saving time while customizing web pages to attain separate displays. While studying landing pages, it is appropriate to link them to user experience as well as conversion optimization.

It is worth noting that conversion rates comprise what is extended to visitors so as to take note of their interaction and browsing experiences. Based on findings extracted through Google Analytics, the platform shares insights regarding conversion rates, abandonment rate, user type and bounce rate, which are commonly regarded as significant elements of UX mechanisms. A further recap of web design and performance testing takes note of critical aspects such as the time taken to deliver the requested material alongside the HTML content, the response of the browser to load requests and what end users see as the web page renders. Website performance has a subsequent impact on search engine rankings, which are developed under undisclosed and proprietary algorithms incorporating website responsiveness, page speed and user experience. Critical gap areas that are increasingly receiving attention from researchers and developers cover the maximization of website performance, which drags in the aspect of page speed optimization. The latter constitutes optimization plugins, final tweaks and server-side scripts. Engineers at Google Inc discovered that a perceptible load time of around 0.4 seconds can be long enough to make users search less. However, technology is dynamic and has its own impact on the enhancement of the internet experience. The crowded cyberworld of the 1990s was referred to as the "World Wide Wait", an incident that prompted a chain of innovations directed towards page speed optimization.

1.2 Academic Question

What are the reasons for the growth of website optimization among companies?

 

 

1.3 Aim

The aim of this project is to highlight and investigate the significant reasons behind the growth of website optimization across companies for the purposes of attracting traffic and generating sales.

 

1.4 Objectives

The above aim is supported by the following objectives:

  • To investigate common pitfalls and best practices behind website optimization, such as code speed optimization
  • To determine the essence of website speed in the world of the Internet of Things
  • To highlight reasons behind the generation of sales and attraction of traffic through website optimization
  • To design and implement a simple website before conducting performance testing

1.5 Artefact

The project is purposed to design and develop an online restaurant website which receives orders from customers before they are worked on. Developing such a website calls for basic knowledge of HTML, PHP, CSS, an internet browser and a text editor. Notably, the symbolic website is appropriate for giving room to tests on traffic generation and speed with the help of essential website optimization practices and tools. The project embraces both the agile approach and the wide angle method for the purposes of meeting the playful creativity and the sophisticated theoretical findings. The approaches embrace working software, response to change, customer collaboration, and team and individual interaction. Speed tests are conducted in iterations before attaining the specific goal attached to the project.

 

1.6 Research Methods

As mentioned before, the project takes advantage of the wide angle approach and the agile methodology for software development. The wide angle approach covers the playful creativity of the project, such as the arrangement of the functional elements of the online restaurant, the web page and the nature of the content that needs to be incorporated in the website. The wide angle approach also looks at the flexibility of tools being used in the course of website optimization. The procedural part of the project is covered by the agile methodology, which cites the executable functions of the web design. The significant elements covered under the agile methodology include working software, individuals and interactions, response to change, and customer or client collaboration. The define phase essentially determines the kind of work that needs to be done, while the design phase establishes the plan needed for building the requirements of a product. The build phase turns the design into reality, and tests verify the product functions.

1.7 Scope and Limitation

The scope of the project covers the pitfalls and the fallible practices that have previously been used alongside website optimization and page speed enhancement. This streamlines the purpose of the project while aligning it to the development of an online restaurant and paving the way for performance testing. The project is limited to the agile and wide angle approaches, which take care of the necessary tools and requisite practices needed for improving the performance of a web page. While there are many ways of optimizing a website, the project sets its limits on speed optimization while testing the load and the response of the page in the course of generating traffic.

 

2. Literature Review

The literature review plays a focal role in delving deep into past case studies aligned to the research topic. The review puts its focus on substantive findings as well as the methodological contributions made by a chain of relevant studies in relation to the research topic. In this context, the review focuses on different areas of the study, including the wide view of website optimization, code speed optimization, pitfalls and best practices in relation to website optimization, the essence of speed during loading times, and reasons why generating traffic in a short time is a significant phenomenon in most businesses.

2.1 Web optimization

The project widely deals with website optimization, which is a critical area of study in this context. Before even designing the website of interest, it is important to understand website optimization as covered in literature, the strategies and tools, and the elements that can be factored into the course of attaining optimization. According to Wei and Stata (2017), website optimization is a wide scope that can be covered in a range of dimensions. Collectively, website optimization is commonly regarded as a process of establishing changes to a website for the purposes of making it rank higher across the search engine results pages. This can be attained on the basis of a wide range of factors by which the search engines evaluate the relevance as well as the readability of the website before assigning it a rank on the SERPs. If the website has more relevance, then it is possible for it to be ranked above others. Based on the observations made by Wei and Stata (2017), over 60% of internet users rarely look beyond the opening page of the search results. This means that it is necessary for SEOs to push for better ranks by all means possible. For this to happen, Kennedy (2019) points out that web optimization has to focus on building links, resolving technical issues, selecting keywords, and adding as well as refreshing content. Hatab (2014) directs the review towards what search engines do and the components website optimization has to check before a site can attain better ranks. According to Hatab (2014), a search engine, in the context of web optimization, is regarded as a database of web pages commonly denoted as an index. The latter is essentially a method of finding web pages before indexing them. This means that search engines depend on spiders, software known for following hyperlinks.

Web users conduct simple searches with the help of a key phrase or keyword before the search engine returns the results. Major search engines consider the relevance of the term being searched, and there are algorithms used in determining that relevance. For instance, Google Inc focuses on an algorithm that has over 60 factors that streamline the relevance of the search term. Any web search engine has four basic software modules. The crawling module brings updated or new pages to the indexing module. The latter thereafter creates what is referred to as a compact searchable index with the help of preprocessed information linked to page ranking. The searching module thereafter makes use of the index in the course of establishing the ranked answers linked to the stream of queries. According to Hatab (2014), SEO constitutes four significant processes. The first process is referred to as keyword research, which involves measuring the frequency of appearance of a certain phrase or word. However, when the keyword density becomes too high, the search engine will start penalizing the site.
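To make the keyword research process concrete, the sketch below measures keyword density in PHP, the language used for the artefact later in this project; the function name, the example text and the rough 2-3% stuffing threshold are illustrative assumptions rather than a standard formula.

<?php
// Minimal keyword density sketch; names and thresholds are assumptions.
function keywordDensity(string $text, string $keyword): float
{
    $words = preg_split('/\W+/u', mb_strtolower($text), -1, PREG_SPLIT_NO_EMPTY);
    if (count($words) === 0) {
        return 0.0;
    }
    $hits = count(array_keys($words, mb_strtolower($keyword), true));
    return 100.0 * $hits / count($words); // percentage of all words
}

// A density far above roughly 2-3% is often treated as keyword stuffing,
// which search engines are said to penalize.
echo keywordDensity('Fresh pizza, fresh pasta and fresh salads daily.', 'fresh');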

The second process is indexing, which comprises attracting the search engine spiders towards the site with the sole purpose of getting it indexed. Major search engines commonly have a site set aside for users to submit their websites for consideration. However, this approach is commonly dismissed by search engine experts who, instead, recommend letting the spiders find links to the site. This implies that the most significant way of getting indexed is through acquiring links that point to the site of interest. The third process is on-site optimization, which is a process of introducing changes to the website with the central aim of bolstering the search engine rankings. Some of the on-site factors applied in this process commonly include bold text, keyword density, the H1 tag and the meta description tag. The website pages insist on the embedding structures of the pages, including the external and internal links, the directory structure and the naming of files. The second element of interest is the HTML code, which carries with it the sitemap, images, meta tags, links and even the contents; a hedged illustration of these on-page elements follows below. The fourth process is off-site optimization, commonly known as link building.
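The snippet shows how a page for the restaurant artefact might emit a descriptive title, meta description and a single H1; the copy and variable names are assumed for illustration only.

<?php
// Illustrative on-page SEO elements; all copy and variables are assumed.
$pageTitle   = 'Online Restaurant - Order Fresh Meals for Delivery';
$description = 'Browse the menu, add meals to your cart and pay online.';
?>
<!DOCTYPE html>
<html lang="en">
<head>
  <title><?= htmlspecialchars($pageTitle) ?></title>
  <meta name="description" content="<?= htmlspecialchars($description) ?>">
</head>
<body>
  <!-- One descriptive H1 per page helps engines infer the topic -->
  <h1><?= htmlspecialchars($pageTitle) ?></h1>
</body>
</html>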

Search engines essentially regard link building in the course of running their ranking algorithms. It is worth noting that a backlink is essentially a hyperlink, mostly from an external site, directed to the target site. Findings by Akram et al. (2010) established that website optimization has attracted a number of techniques. The first technique is directory submission, which creates incoming links attached to the website with the help of the relevant page and category. Manual submission to most of the directories has been determined as one sure way of ranking the website. Notably, an internet directory is subsequently a platform on the World Wide Web as far as links and information are put into consideration. The directory submission carries with it such information as email, category, description, URL, keywords and title, among others. The second technique highlighted by Akram et al. (2010) includes making use of keyword generation. Keywords are deemed necessary due to the fact that search engines need some of the words in the course of elaborating information. There is a chain of tools that can help one to generate keywords linked to different needs. Such tools include Google AdWords, Thesaurus, Wordtracker and the Yahoo keyword selector tool. The last technique covers link exchanges, which involve placing links on websites and having websites place links in return. Link exchange enhances web traffic while also trying to increase the rankings of the website.

2.2 Code Speed Optimization

Based on the study conducted by Stone et al. (2017), the entire concept of code speed optimization is directed at the critical process that covers transformation of the code. The transformation is intended to reflect aspects of speed and memory consumption, among others. The essence of optimizing the code revolves around discovering and compiling the time in relation to the runtime linked to the program. Additional findings established by Stone et al. (2017) denote that code speed optimization largely targets the involved code and makes it run faster. According to Shen et al. (2016), optimizers are likely to consider using a significant range of approaches and methods in improving the involved code. A significant look at code speed optimization emphasizes the idea of latency hiding, with one thread fetching data. When there is a case of latency, the GPU will end up switching to another thread said to have none of the overheads. However, the GPU can still access many threads in the course of searching for an alternative. When there are different threads awaiting execution, it is quite obvious that the system will end up hiding latency. On the basis of the findings established by Kuipers et al. (2015), code speed optimization aims at removing any redundant code without engaging any changes or altering the initial program. The intended speed can still be attained through reduction of the code size.

Manhas (2013) further noted that code speed optimization, or speed optimization generally, is necessary when trying to correct the page load time. The latter can be defined as the elapsed time between the moment a user requests a new page and the moment the page is rendered or displayed by the browser. Apparently, faster websites render their pages progressively, thereby giving the appropriate perception of the website. However, additional conditions can be introduced for the purposes of rendering clear content. Manhas (2013) categorically asserts that loading time is the most hated aspect of web pages, and concerns about speed optimization largely emanate from delays on web pages. In the course of optimization, one has to understand that a web page has external objects (EOs) and a container object (CO). The container object carries with it the XHTML file known for referencing the EOs, such as JavaScript files, images, video, external CSS and even audio. Most of the non-textual EOs appear in a pre-compressed form. The COs commonly cover around 50% of the total size of the page. It is also worth noting that web application performance can be determined by the response data, the size of the request and how fast the requests can be served with the help of the client script libraries. Some of the recommendations surfaced by Manhas (2013) include minifying the JavaScript, which involves removing the comments as well as the white space found in a large JavaScript file.

The removal should not interfere with the functionality of the code. The second approach surfaced by the developer includes removal of unused CSS. This means removing rules, thereby leading to fewer bytes being sent to the client without any significant changes in appearance. HTTP caching is another recommendable approach when revisiting web page loading. The approach includes reusing a local copy while saving the time that would have been used in setting up a fresh connection. HTTP cache control commonly facilitates two means of defining the validity of a resource: either the Cache-Control header or the Expires header. The Expires header puts a specification on the date on which the resource is rendered invalid; if this happens, the browser is expected to ask for the resource again. A minimal sketch of both mechanisms is given below.
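The sketch assumes a stylesheet served through a PHP script; the one-week lifetime and the file path are arbitrary assumptions.

<?php
// Hedged sketch: declaring resource validity via Cache-Control or Expires.
$week = 7 * 24 * 60 * 60;

header('Content-Type: text/css');

// Relative lifetime: the local copy stays fresh for a week.
header('Cache-Control: public, max-age=' . $week);

// Absolute expiry date: after this moment the browser must re-request.
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $week) . ' GMT');

readfile('styles/main.css'); // path assumed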

Tips on code speed optimization were further shared through the findings established by Bartuskova et al. (2016). In the findings, the researchers noted that JavaScript can offer the largest area for performance improvement of a website. Most websites depend on JavaScript, especially the ones with AJAX. Based on the experiments conducted by the authors, they noted that when JavaScript is removed from a website, there can be an improvement in the performance of the website. However, the imminent solution to faster loading does not absolutely lie with the removal of JavaScript but rather with the efficient use of the entire code. Some of the JavaScript techniques include minifying the code, which can help in terms of saving on the file size of the .js files. Both CSS and HTML minification are recommended and can be applied at the same time. It is worth noting that minifying either the CSS or the JavaScript carries with it significant benefits. Some of them include enhanced browser loading and execution, reduced network latency and enhanced compression. In most cases, compression gives room for the content to be ferried over the network in an even more compact form that reduces the download time. Code optimization may equally be aligned to the exclusion of functions and libraries which may not be needed by a given website. Coding in pure JavaScript without involving libraries can also be recommendable. A naive minification sketch is given below.
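The sketch is a simplification: real minifiers tokenize the source, whereas this regex version can break code that embeds comment-like sequences inside string literals, so it only serves to make the technique concrete; file names are assumptions.

<?php
// Naive minification sketch; file names are assumed.
function naiveMinify(string $source): string
{
    $source = preg_replace('!/\*.*?\*/!s', '', $source); // strip /* ... */ comments
    $source = preg_replace('!^\s*//.*$!m', '', $source); // strip whole-line // comments
    $source = preg_replace('/\s+/', ' ', $source);       // collapse runs of whitespace
    return trim($source);
}

file_put_contents('app.min.js', naiveMinify(file_get_contents('app.js')));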

Further findings introduced by Ericsson (2013) noted the essence of minimizing requests for the purposes of enhancing the performance of websites as far as loading time is put into consideration. Based on the assertions made by Ericsson (2013), it is important to give some feedback to users if the pages turn unresponsive instead of leaving a white screen. However, code speed optimization is not a subject that can be left out all the same. Most developers now put their preference on smaller files, which take a relatively short period of time to transfer. It is more recommendable if the server takes a relatively short period of time to evaluate the request. One solution when transferring many files is to combine them together, with images allowed to make use of CSS sprites. The latter is regarded as a technique applied in combining a number of images together before the browser is told how to render them at the destination. Therefore, code speed optimization is important as far as page responsiveness is put into consideration. A hedged sketch of the combining approach is given below.
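The PHP script below serves several script files (names assumed) as one response, so the browser issues a single HTTP request instead of three.

<?php
// Request-reduction sketch: concatenate assumed script files into one response.
$bundle = ['js/menu.js', 'js/cart.js', 'js/checkout.js'];

header('Content-Type: application/javascript');
foreach ($bundle as $file) {
    echo file_get_contents($file), ";\n"; // ';' guards against missing terminators
}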

2.3 Common pitfalls and best practices behind web optimization

Website optimization is one significant area that has received a lot of attention over the years. Most scholars have revisited some of the pitfalls as well as the best practices that have been realized across website optimization. Based on the findings established by Mattos et al. (2019), website optimization is increasingly emerging as one of the significant processes that take advantage of experimentation while enhancing the capacity to attain business goals. According to the authors, a large number of website owners are increasingly attracted to the commonly known A/B testing when establishing variations on web pages and responding to conversions. With attention given to web optimization, a number of measurable benefits have been aligned to the web page elements. Studies conducted by Ferris and Zhang (2016) noted that despite the series of benefits linked to web optimization, it is possible to also witness common pitfalls that emerge from the mistakes developers make on a daily basis. According to the findings established by Ferris and Zhang (2016), the first pitfall is noted in the course of making a wrong choice of words. The erroneous practice can be noticed in the negligence of the most preferred long-tail keywords. Besides, keyword stuffing, in which there is an overuse of the purposed keywords, may turn out to be crucial when determining the success of a performing website. Lack of quality links sometimes makes the website lose its meaning in an effort to stage an extended meaning. Other notable mistakes are commonly committed by developers who have the tendency of introducing wrong code in an attempt to enhance speed. Such attempts attract bugs that still drag down the speed of the website.

Despite the pitfalls and the skewed practices, it is evident that website optimization still has its positive side as far as the notable players are put into consideration. Based on the arguments raised by Krawczyk and Woda (2013), the best practice that has been aligned to web optimization includes what is referred to as source code optimization. The latter is essentially regarded as the significant stage of making adjustments for mobile devices. However, the process is always complicated due to the detailed technical coverage. At the same time, the process is believed to be necessary in the sense that practices can be analyzed over a significant range of parameters associated with the normal pages linked to the websites and the customized ones. Notably, the starting point of source code optimization takes note of the technical aspects linked to the internet standards associated with home pages and websites. The aspects are widely characterized through the popular markup language referred to as XHTML MP. At the same time, JavaScript is given the necessary attention in the course of enabling the web pages. Most of the tested websites may have a range of things in common. The most notable ones include CSS 2.1 style sheets which carry the JavaScript and take advantage of graphics among other relevant formats.

In some case scenarios, developers are thought to have embraced the idea of reducing the overall size of the entire website and the associated code, as well as the aligned resources. This move compels most developers to focus on code speed optimization. The practice is not only limited to the source code but extends in equal measure towards the images, which need to be reformatted while adopting versions that can easily be adjusted to suit the needs of the users. For instance, caching is likely to save on bandwidth and time as well. The entire process of optimizing most CSS style sheets may not be far from the necessary adjustments made on the markup code processes. Chotikitpat et al. (2015) established that a range of optimization practices essentially fall behind the SEO practices and suggestions. Some of the significant suggestions have been linked to on-page optimization, which is associated with the searching behaviors observed the moment terms or even sentences are typed into the search engine. With such attention in place, website improvement is emerging as an unavoidable necessity for most developers. A range of the improvements may not be limited to the keywords or the subdomains of the URL.

At the same time, the practice of structure optimization increasingly denotes the efforts aligned to the W3C standards which are normally associated with website development. Other observations established by Chotikitpat et al. (2015) point out that off-page optimization can still be pursued with the central focus of creating the most enabling environment needed for users. A notable example includes page ranking, which is common to the Google search engine. The website owners may have no permission to add the page rank; however, a directory website may be sufficient in making better rankings. Again, WordPress practices are increasingly gaining traction with time. One such practice includes the tendency among developers to choose the most convenient web hosting provider. In most cases, shared hosting increases the likelihood of encountering poorer performance due to shared themes and server space. With the current decline of prices noticed in the web hosting industry, developers can cheaply host their websites on individual servers. Some of the notable providers include Google Compute Engine, Amazon Web Services and DigitalOcean. Besides, a lightweight WordPress framework or theme is preferable due to the concerns raised along the social icons, sliders and dynamic elements. Advanced caching mechanisms are even more recommendable with the help of a caching plugin such as W3 Total Cache. Lastly, GZIP compression essentially reduces the usage of bandwidth as well as the time one takes before accessing the website; a per-script sketch is given below.
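In the sketch, ob_gzhandler only compresses when the client's Accept-Encoding header advertises gzip support; in production the compression is usually configured once in the web server (for example Apache's mod_deflate) rather than per script, so this is an illustration rather than a recommended setup.

<?php
// Hedged sketch: per-script GZIP via PHP output buffering.
ob_start('ob_gzhandler'); // negotiates compression from Accept-Encoding

echo '<!DOCTYPE html><html><body>';
echo str_repeat('<p>Menu item listing...</p>', 200); // highly compressible payload
echo '</body></html>';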

2.4 The essence of speed on websites

A range of case studies have already covered reasons as to why load time is such a critical aspect when analyzing the responsiveness of a web page. The same attention has been given to the necessity of conducting code speed optimization. However, one critical question is why speed is such a critical aspect to developers. Booth and Jansen (2010) delved into web analytics while taking note of the metrics that anyone would consider before visiting a website more than twice. Such metrics include the visitor type, the duration of the visit and the number of visitors realized on a page. Based on the history of website analytics, it is worth noting that the analysis of traffic on websites became more conspicuous in the 1990s. The visitor type became part of the metrics, which have essentially evolved to include the number of hits on a web page and the number of persons said to have visited the site. Notably, two categories of visitor types can be realized: the category that was there before and the one that prevails. In both categories, the individual behaviors attached to visitors eventually make them regarded as unique visitors (Aijaz et al. 2013). Analytics software takes advantage of cookies in the course of tracking users, but the approach may not work where the visitor decides to disable cookies or clear the cache.

Website owners are setting their preferences on tracking essential sessions which rarely depend on cookies. In relation to visit length as linked to the speed of the website, Falk et al. (2008) define it as the period spent by any visitor in one session on any given website. In the light of the measurements, preference is put on the breaking time and the chunks of time related to it. Additional properties under observation include the significant statistics related to the system, the demographics and the visitor path. Other properties include the internal search details, the keyword analysis and the top pages associated with the website. Based on the findings of Armstrong (2003), a delay of microseconds on a web page may attract losses amounting to millions. Other findings established by Khan et al. (2015) noted that the speed of any given website has an impact on the metrics. The assertions can still be linked to the page views, with speed known for being the most outstanding metric. Nevertheless, the speed of the website needs to create the right perception among clients or users, and should therefore be determined by other factors such as the load time and the responsiveness of the web page. The page load time has the capacity to either terminate the user experience or sometimes lead to a better relationship.

Therefore, delays that amount to a second are likely to attract over a 7% reduction in conversions. If such a case occurs to a company, the delays might even tarnish the reputation of the organization. Reduction of load time carries with it a positive impact on the performance of the company's website. A good example is Shopzilla, which is said to have boosted the speed of its site from 6 seconds to 1.2 seconds. The improvement is believed to have attracted a revenue change amounting to 12%, with page views going up by 25%. On the other hand, Mozilla is also believed to have realized an additional 60 million downloads by simply making its website run 2.2 seconds faster in a year (Hu et al. 2019). However, Amazon ended up counting losses amounting to $1.6 billion due to a slowdown of its pages by only a second. A survey conducted by Google showed that a load time below 100 milliseconds makes most users trust the instantaneous response of the page. Related observations suggest that Google Inc lost close to 20% of its traffic due to a load delay that lasted for only 100 milliseconds, while Walmart and Amazon noted a 1% increase in revenues as a result of improved page speed. In a collective view, the essence of speed is largely a technical concern that carries an impact on usability and accessibility as well as on search engine optimization. Therefore, both the workflow and the design decisions need to be informed by significant performance implications.

From time to time, performance needs to be measured, monitored and refined. This is important in the light of the dynamics and challenges posed by the web development process, which make it hard to track the significant metrics. The fluid nature of the metrics is due to their reliance on the network type, the device, latency, browsers, load balancers, proxies, caches and firewalls, among other significant components. The essence of speed in websites has been aligned to a chain of modern optimization efforts (Kranzusch et al. 2015). The most recent efforts include asset optimizations, as noted in the utilization of Brotli for plain text compression. Brotli, introduced by Google in 2015, is regarded as one of the most important new open source lossless formats and is supported by most modern browsers. Brotli is said to be more effective than Deflate and GZIP. Browsers are believed to accept Brotli when the user visits the page over HTTPS (Jackson 2017). It is worth noting that Brotli is supported by most CDNs such as Fastly, AWS, Akamai and KeyCDN. Compression of assets using Brotli can be expensive, which means that some servers cannot use it on the fly due to the cost overhead realized in the process. The essence of speed has also attracted the significant use of compression algorithms, which can be noticed from WebP and responsive images. Both the WebP options for images and the relevant plugins can be extended through further use of Cache Enabler and Optimus. A hedged negotiation sketch for Brotli is given below.
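Since PHP core cannot produce Brotli itself (that requires a PECL extension, the web server or a CDN), the sketch only performs the content negotiation, serving a file pre-compressed at deploy time when the browser advertises br; the file paths are assumptions.

<?php
// Hedged Brotli negotiation sketch; paths are assumed.
$accepts = $_SERVER['HTTP_ACCEPT_ENCODING'] ?? '';

header('Content-Type: text/css');
header('Vary: Accept-Encoding'); // caches must key on the encoding

if (strpos($accepts, 'br') !== false && is_file('styles/main.css.br')) {
    header('Content-Encoding: br');
    readfile('styles/main.css.br'); // pre-compressed at deploy time
} else {
    readfile('styles/main.css');    // uncompressed fallback
}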

2.5 Reasons for generation of traffic through web optimization

Most businesses around the world find it necessary to generate traffic on their websites with the help of website optimization. However, the critical question is what the reasons are for the generation of sales and traffic on a company's website. Based on the study conducted by Page (2012), it can be established that investments in the marketing area imply that, with only a little money, organizations or businesses can easily analyze and understand every visitor that makes his or her way to the website. Observations made in prior studies noted that visitors to the dot-com sites would rarely make any purchase. Following the untenable valuations of the stock market, businesses increasingly focused on the next move of the investments. The emergence of website analysis subsequently supported the increasing demand for such tools as WebTrends, among others. More focus on website visitors again prompted a refocus on the usability of websites. In 2007, optimization and testing became paramount with the launch of the commonly known Website Optimizer tool. Recent days have seen businesses pay much attention to website optimization as a result of conversion rates. Most conversion rates seem to be more vital in an era where daily functions count as significant components that determine organizational performance.

Based on the observations made by Grappone and Couzin (2011), attracting traffic via website optimization would not only boost sales but also make most businesses available or accessible for sales. A website having heavy traffic means that the business makes enough sales. Such a move equally implies that businesses can easily engage in extra leads as one way of tapping into information from relevant websites. Increased access to information keeps the business informed about consumers and the nature of the purchasing decisions such customers are likely to make while accessing online services. Content facilitated to the site is believed to play the focal role of the salesperson while helping most consumers get more details regarding the benefits of services and products being offered by the relevant organization (Kranzusch et al. 2015). Besides, a website having higher traffic tends to enhance its credibility, as clients or users feel confident or more secure while interacting with it. This is due to the history of the website. The era of User Generated Content has seen more websites attracting more users or visitors said to be more trustworthy, reliable and even educative. This means that web optimization plays the most fundamental roles in the course of generating sales and attracting traffic at the same time.

Other studies have cited a different version of the reasons that make the generation of traffic as well as sales necessary with the help of web optimization. The first reason includes enhancing product awareness: a better video or image attracts the attention of users. Secondly, generation of traffic obviously boosts the sales leads. This can be aligned to the fact that driving the appropriate traffic largely encourages most people to supply the qualified contact details deemed necessary for future relationship building. Another reason is to establish oneself as an expert in the field (Bitincka et al. 2010). This means visibility across the search engines largely develops into an implied endorsement effect in which searchers associate relevance, quality and trustworthiness with the relevant sites that have high ranks. It is also quite obvious that the generation of traffic on websites and sales helps website owners in encouraging repeat visitors. This means that most optimized pages help clients or customers in finding additional services and products, especially after making the initial purchase. Such a move has been thought to be an excellent strategy in boosting customer support as well as service. It is also possible for website owners to qualify visitors with the help of website optimization. This means that one can easily understand the buyer by getting a clear picture of the visitor or searcher intent, thereby helping one to adjust the site for the purposes of accommodating their needs.

Generation of traffic on a website and website optimization commonly work in tandem in the course of establishing a leaner and more effective website. The first reason has been aligned to making the website more useful to a specific audience. Sometimes, marketers feel at odds with search engines like Google. This is due to the failure of the engine to rank the content as the website owners would expect (Moreno and Martinez 2013). At other times, marketers notice a huge drop in traffic, especially after an algorithm update. However, the idea is to make sure that the goals of the search engine and the ones for the marketers are the same. Among the goals, it is important to note the essence of serving up the most comprehensive and best content to the audience, developing a more positive user experience and prioritizing the content that carries significant value. One thing marketers have to take note of revolves around the fact that Google Inc. is engaged in the analysis of millions of websites (Gardi et al. 2013). It therefore makes use of over 200 ranking signals in the course of deciding the piece of content that needs to land on the first page.

Apparently, the algorithm gives attention to the signs displayed by the web page that has the most clarified and relevant content. Secondly, marketers need to understand the reasons for growing organic traffic. Businesses spend around 1% of their revenues on developing ads. However, something that has rarely been discovered is the fact that attracting organic traffic with the help of SEO is almost free. Therefore, it is more advisable for marketers to invest time and money in growing their own organic traffic.

 

3. Main Body

3.1 Justification of Tools and Techniques

The project has already highlighted the purpose of establishing a website for an online restaurant that allows customers to place orders and make payments. The project has also noted the meaningful use of the wide angle approach, which invests in a range of tools and techniques needed in developing a robust website. On the other hand, the agile methodology describes the process needed in developing the website. Some of the tools and techniques needed in developing the website include XAMPP, a text editor, MySQL, PHP, HTML and CSS. The most important tool among all these is XAMPP, which is a free and open-source cross-platform web server package developed by Apache Friends. The package has the MariaDB database, the Apache HTTP server, as well as interpreters for scripts written in the Perl and PHP programming languages.

XAMPP is confirmed to be the easiest platform that can be deployed with the help of the LAMP or the WAMP stack. XAMPP is an acronym that stands for X – Cross-Platform, A – Apache, M – MariaDB (also known as MySQL), P – PHP and P – Perl. XAMPP is believed to have the capacity to create as well as manipulate databases in SQLite and MariaDB. With XAMPP, one can easily treat the local host the same way as a remote host, especially when using an FTP client. Components included in XAMPP include PHP 7.4.5, MariaDB 10.4.11, Apache 2.4.43, phpMyAdmin 5.0.2, Webalizer, FileZilla FTP Server 0.9.41, XAMPP Control Panel 3.2.4 and OpenLDAP client 2.4.48. Apart from XAMPP, PHP is regarded as one of the general purpose scripting languages suitable for facilitating server-side web development. PHP can easily run on the web server, and any file that is requested can easily be executed on the PHP runtime. The MySQL, PHP and HTML can all be hosted on XAMPP in the course of developing the software.
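The code sections that follow reference a PDO handle named $conn. A hedged sketch of how that connection to the XAMPP MariaDB instance might be created is shown below; the database name is assumed, while root with an empty password is XAMPP's default account.

<?php
// Hedged sketch of the $conn handle assumed by the later snippets.
try {
    $conn = new PDO(
        'mysql:host=localhost;dbname=restaurant;charset=utf8mb4', // dbname assumed
        'root', // XAMPP default account
        '',     // XAMPP default empty password
        [
            PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION, // throw on SQL errors
            PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC,
        ]
    );
} catch (PDOException $e) {
    exit('Database connection failed: ' . $e->getMessage());
}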

3.2 Development

3.2.1 Planning

The planning phase demanded that the project establish tasks and assign the appropriate time to each task before coming up with the schedule. With the help of a Gantt chart, it was possible to develop a clear picture of the main tasks that needed to be covered in the project. Five main stages were covered with the help of the agile methodology: ideation, design specification, prototyping, testing and implementation. Additional tasks like consultation, development of the literature and methodology are deemed supportive tasks that added more ideas to the project before reaching the completion stage. The Gantt chart is shown below.

Figure 1: The Gantt chart for the project schedule

3.2.2 Design and implementation

Design and implementation were covered in phases. The first phase was to gather information and develop a site map for the purpose of establishing a flow of activities. It is important to understand the business goals and the utility of the website. Some of the critical things that need to be considered include the purpose: the purpose of the website is to allow customers to place orders and make a payment before the meals are prepared and delivered. The second consideration narrows to the goals: the project wishes to attain the shortest load time for the purposes of attracting traffic to the restaurant's website. The online restaurant targets every market segment, especially online users, with the content tailored to attract their attention towards new meals that can easily be prepared and delivered on time. The site map takes care of the topic areas and the subtopics that avail the comprehensive navigational system. In the site map, customers only need to access the website ordering system. The admin can still provide the menu, where customers can pick more than one food item to be added to the cart. The system will then prompt a payment system.

Figure 2: The site map for the restaurant’s website

The next phase is the design phase, which captures the logic part of the project while highlighting the decisive areas. In the logic design, the menu management module allows the admin to add, update or even delete any given food category in the menu. The same can be done for a food item. Prices can still be updated on the menu management system, which can be done alongside any additional information that can be appended to the food item of concern. The web ordering system counts as the second module and allows clients to create an account, log into it, manage the account, place an order, provide the payment details, add items to the shopping cart and even remove an item. The order retrieval system is the closing module and helps in the display of the orders, retrieving new orders and ensuring the graphical compliance.

Figure 3: The decision flow diagram

Based on the logic flow of the functions, it is expected that once the client makes a visit to the website, he or she has to see the menu, the food items and where to register if necessary. A section of the code for the GUI, covering account activation after e-mail verification, is shown below.

{
    try {
        // Mark the account as active once the e-mail has been verified.
        $stmt = $conn->prepare("UPDATE users SET status=:status WHERE id=:id");
        $stmt->execute(['status' => 1, 'id' => $row['id']]);

        $output .= '
        <div class="alert alert-success">
            <h4><i class="icon fa fa-check"></i> Success!</h4>
            Account activated – Email: <b>' . $row['email'] . '</b>.
        </div>
        <h4>You may <a href="login.php">Login</a> or back to <a href="index.php">Homepage</a>.</h4>';
    } catch (PDOException $e) {
        // Report the database error back to the visitor.
        $output .= '
        <div class="alert alert-danger">
            <h4><i class="icon fa fa-warning"></i> Error!</h4>
            ' . $e->getMessage() . '
        </div>
        <h4>You may <a href="signup.php">Signup</a> or back to <a href="index.php">Homepage</a>.</h4>';
    }
}

 

Based on the code, one can access the home page before exploring what is being offered by the restaurant. Perhaps the website would show some of the food items that have received high orders, which is a first indication of generated traffic. The homepage is shown below.

Figure 4: The Dashboard

Registration may be prompted, where the client needs to provide the necessary details that can be stored in the memory of the system. Once the details are verified, the customer can access the food items, which allows one to open the shopping cart. The system registers a session in its memory before the shopping can be done. It is possible that a client may place two or three orders depending on the size of the family one has. Part of the code handling a single session is shown below.

if (isset($_SESSION['user'])) {
    // Check whether this product is already in the user's cart.
    $stmt = $conn->prepare("SELECT *, COUNT(*) AS numrows FROM cart WHERE user_id=:user_id AND product_id=:product_id");
    $stmt->execute(['user_id' => $user['id'], 'product_id' => $id]);
    $row = $stmt->fetch();

    if ($row['numrows'] < 1) {
        try {
            // Not in the cart yet, so insert it with the requested quantity.
            $stmt = $conn->prepare("INSERT INTO cart (user_id, product_id, quantity) VALUES (:user_id, :product_id, :quantity)");
            $stmt->execute(['user_id' => $user['id'], 'product_id' => $id, 'quantity' => $quantity]);
            $output['message'] = 'Item added to cart';
        } catch (PDOException $e) {
            $output['error'] = true;
            $output['message'] = $e->getMessage();
        }
    }
}

Figure 5:  Categories displayed on the admin’s end

The admin decides which category of food item is ready and can be added to the display. The customer is then allowed to add items to the cart depending on the food available at the restaurant.

Figure 6: The shopping cart

Immediately after closing the shopping cart, the system displays the price of a single item, and gives a total when there is more than one item.

Figure 7: The price display for a single item

When the shopping cart is verified and confirmed, the system opens another session for the payments. A number of payment options are provided before the order is covered. Part of the code for the session is shown below.

 

// Fetch every line item for this sale, joined to its product and sale rows.
$stmt = $conn->prepare("SELECT * FROM details LEFT JOIN products ON products.id=details.product_id LEFT JOIN sales ON sales.id=details.sales_id WHERE details.sales_id=:id");
$stmt->execute(['id' => $id]);

$total = 0;
foreach ($stmt as $row) {
    $output['transaction'] = $row['pay_id'];
    $output['date'] = date('M d, Y', strtotime($row['sales_date']));

    // Accumulate the running total from each line item.
    $subtotal = $row['price'] * $row['quantity'];
    $total += $subtotal;

    $output['list'] .= "
    <tr class='prepend_items'>
        <td>" . $row['name'] . "</td>
        <td>&#36; " . number_format($row['price'], 2) . "</td>
        <td>" . $row['quantity'] . "</td>
        <td>&#36; " . number_format($subtotal, 2) . "</td>
    </tr>";
}

 

 

If a client chooses PayPal as the option for making the payments, then the following display is likely to pop up, thereby allowing the client to close the order.

Figure 8: The shopping cart pay

Therefore, the website touches on every functional area of the modules covered both on the admin end and the user end. The most observed characteristic is browser compatibility as far as the significant tests are put into consideration. With a group of tasks incorporated in the website, it can be asserted that the site is all-inclusive as far as the online services are concerned.

3.2.3 Testing

The project aimed at performing code speed optimization. However, the most critical side encompasses speed testing with the help of the commonly known keyCDN tool. The performance test for the speed of the website presents the analysis of the load times attached to a single asset before returning a breakdown of a number of attributes. For instance, the status gives the HTTP status code, while cache gives the keyCDN caching status, which can be MISS or HIT or any other status. The connect attribute presents the time taken before connecting to the web server, and this needs to be below the 200ms mark given by a green indicator. Again, the TTFB denotes the time taken from when the client makes an HTTP request to when it receives the first byte of data emanating from the web server. The TTFB needs to be below 400ms for a properly working website. During the test, 10 logins were triggered in the system for the purposes of determining the response of the web page. The CDN tests determine the loading time and the idle time detected from the system. A local measurement of the same attributes is sketched below.
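The sketch uses PHP's cURL extension and mirrors the 200ms and 400ms thresholds mentioned above; the URL is an assumption for the local XAMPP deployment.

<?php
// Hedged sketch: local connect-time and TTFB measurement via cURL.
$ch = curl_init('http://localhost/restaurant/index.php'); // URL assumed
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
]);
curl_exec($ch);

$connect = curl_getinfo($ch, CURLINFO_CONNECT_TIME) * 1000;       // seconds -> ms
$ttfb    = curl_getinfo($ch, CURLINFO_STARTTRANSFER_TIME) * 1000; // seconds -> ms
curl_close($ch);

printf("Connect: %.0f ms (target < 200)\n", $connect);
printf("TTFB:    %.0f ms (target < 400)\n", $ttfb);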

3.2.4 Results

The CDN test and other inspection tests gave the impression of a working website with such parameters as the loading time, scripting, rendering and painting. The following diagrams give the impression of the performance of the website once traffic is generated to it.

Figure 9: The loading time

The loading graphs give a reflection of the response of the system to the requests. The diagram is shown below.

Figure 10: Active requests and throughputs

3.2.5 Analysis

The project has provided the online platform through which clients can place their orders before making payments. While developing the website seems to be an obvious task, the speed of the website remains a critical component in the analysis of the performance of the site. The results show that when traffic is generated to the website, the page response gives a loading time of 181ms. The loading time presents both the load performance and the runtime performance. While the time indicated is well below the 200ms mark, it is still more recommendable to have a load time of 100ms, especially when there are only 10 login requests. Perhaps code speed optimization needs to be implemented to boost the response of the web page.

The system also responds with a rendering rate of 178ms. Essentially, rendering implies the generation of the HTML output by the web server. In most cases, the render time can be confused with the load time. The difference is that the load time commonly measures the real time taken to download the CSS, the .js files and even the images into the browser used by the visitor, whereas the render time focuses on the time taken to process all the objects before showing the outcome to the visitor. For a well-functioning website, the render time should be close to the load time given that the end result of the two processes is almost the same. Therefore, code speed optimization done on the load time should have the same impact on the render time.

The test also takes note of the system, idle, painting and scripting times. The scripting time seemingly takes the longest period of time among all the components. The common script is JavaScript, which may take relatively long to execute (Stone et al. 2017). With a runtime of 1110ms for only 10 requests, it is possible that the scripts were poorly written, conflicting with the site elements and leading to slow performance. Therefore, the scripting time, in this context, needs to be investigated, and the scripts that take time or seem to be problematic need to be deactivated for the purposes of improving performance (Aijaz et al. 2013). Minification of the scripts needs to be considered, especially when the JavaScript and CSS files have empty spaces. On the other hand, the idle time denotes the time when the user is not interacting with the page. An idle time of 352ms means that users are actively interacting with the site and maybe placing orders.

Apart from the loading time and the performance analysis of the website, it is important to focus on the active requests, the VUs and the duration time. The VU, or virtual user, load time is one of the most aggregated result metrics, and it stands for the time taken for the simulated users to perform the HTTP transactions, less the sleep time. The VU from the load curve is almost instantaneous, which means that users load pages at discrete times. Perhaps this means that the web pages have more content that keeps most users exploring for some time before proceeding to the next page. On the other hand, the curve for active requests flattens and then goes up instantaneously (Stone et al. 2017). This is an indication that most users are considering wider options before placing any request or making an order. The curve points to other factors that determine the purchase decisions apart from how fast the page responds and how quickly the services can be delivered. Possible factors include peer decisions, affordability, available alternatives and security, especially when making payments or sharing personal information.

3.3 Academic Question

What are the challenges encountered during the development process in terms of tools, techniques and usability?

3.3.1 Tools and techniques

The parent tool used in the development of the website is XAMPP. The tool plays a significant role in setting up a PHP environment. However, the installation of XAMPP on Windows 10 can be problematic in some cases. For instance, the installation process could encounter scenarios where the XAMPP Apache server would not start. The only reason that could explain this case is the actively used default port 80; changing the ports would bring changes in the activation of XAMPP. At some point, the MySQL server could not be activated instantly, which prompted the adjustment of the directories.

3.3.2 Security

The website never incorporated any security measures as far as encryption methods are put into consideration. Apparently, this must have raised doubts among the virtual users, who would take a relatively long period of time before they could place an active request. Delays over the scripting time must have worsened the case, especially when users are conducting transactions or making payments.

3.3.3 Usability

During the tests, it could be established that there were records of idle time, which could not be dismissed due to the fact that users were spending a lot of time on one page before loading the next one. Perhaps this is not a good indication of usability, given the low conversion rates.

 

4. Conclusion

The project aimed at establishing reasons for the increasing growth of website optimization across companies that aim at generating both sales and traffic. The background information revisited a number of case studies that reflected on the changing business environment. Based on the observations made, search engine optimization and the associated practices are being used on an increasing trajectory. This is due to the meaningful use of the website content and the relevant resources that attract visitors to a particular website. However, the background information noted some of the gap areas, leading to the necessity of having code speed optimization. This prompted an artefact that focused on designing as well as developing an online restaurant which allows its customers to place orders. The research methods covered the meaningful use of the wide angle approach and the agile methodology.

The literature further expounded on the findings tapped from the case studies as far as web optimization, more so code speed optimization, is put into consideration. The main body focused on the justification of tools and techniques with a prime focus on XAMPP, PHP and HTML, among others. The development phase focused on planning, design and implementation, and testing. Analytical areas were not limited to the loading time, the idle time and the scripting time, among others. Attention was given to the active requests, the virtual users and the throughput, which reflected the pattern of use of the web page by the clients. Based on the analysis, it could be noted that code speed optimization was necessary to reduce the load time and increase the responsiveness of the web page. Some of the challenges cited in the project include the initial failure of the XAMPP tool to establish a PHP environment, the lack of security considerations in the course of developing the website and the doubted usability on the client end.

 

5. Critical Evaluation

5.1 Process

The process was effectively covered with the help of the Gantt chart. Despite the fact that the Gantt chart never provided a breakdown of the main tasks, it still provided enough time for each task to be covered comprehensively. The Gantt chart is shown below.

Figure 11: The Gantt chart

The activities were carried out within the stipulated time, with slight variations on some of them. The ideation process took slightly less than the time assigned to the task. This is due to the fact that the research area could easily be extracted from the gaps left by a range of case studies. Consultations were conducted within the time limits, and the literature covered a wide range of case studies that could be associated with web optimization and code speed optimization. However, the methodology section took slightly longer than the set time due to the collection of the necessary tools such as XAMPP and the definition of the agile process. Again, the installation of tools such as XAMPP took longer than expected due to an activation process that was impeded by the failure of the ports. Prototyping was well placed within the timelines because the website could easily be tested with 10 login requests. Tests conducted through the keyCDN tool focused on the load time, the idle time, the response of the web page and the scripting time. It could be noticed that the scripting time took longer than expected, which prompted minification of the JavaScript. Both the implementation and documentation fell within the acceptable timelines. The choice of the artefact was best aligned to the available tools, which prompted the development of an online restaurant that had its services supported through the website.

5.2 Product

Based on the tests carried out on the website, it could be noticed that minifying the JavaScript and reducing the libraries linked to the website, as ways of code speed optimization, essentially improve the load time and responsiveness of the web page. With such measures put in place, the load time would go as low as 100ms and the scripting time would be less than 800ms. When there is an improvement in page responsiveness, the website stands a chance of generating traffic and attracting more sales for the restaurant. Code speed optimization is therefore regarded as an important process that would boost the performance of businesses by building confidence among online clients or customers.

5.2.1 Future work

The future work does not focus on website optimization in general but narrows down to a range of ways that would help developers carry out code speed optimization. A further commitment would invest both time and resources in engaging compilers that would easily transform the programming languages into efficient and readable code.

5.3 Self Reflection

Developing the artefact and building a project out of it has been a long journey for me. Perhaps it gave me a platform for exploiting what I had and putting my creativity to the test. What began as a simple idea grew into a complicated project that needed more than just resources. I enjoyed the prerequisites and exploring materials that had the same ideas. However, upon developing the main body, the project took a new turn that defined the lanes of reasoning in its own way. I had to invest time and resources into the project, especially when it came to defining the tools and techniques. The course of web development was enjoyable but somewhat problematic at the start. Creating the PHP environment was such a big challenge that it attracted further attention in terms of troubleshooting the possible cause that stopped XAMPP from being activated. Solving this part of the project marked the greatest achievement for me, and this allowed a quick flow of the remaining activities. The last bit of the project was interesting because I was testing whether my idea would work or not. Following a chain of tests that went through successfully, it became obvious that the long process had come to an end. I believe that the project was both a success and a challenge at the same time.