SAMPLE REPORT – deliverable (Note: rename this document and remove the “sample” text before submitting, as it will no longer be a sample report)

Overall analysis and score for
SEO Score: 61 (29 passed; 4 warnings, explain; 14 failed, explain; 0 unresolved, explain)
Score based on
Pros (add your own comments; the analysis report will not provide these comments):
- The portal is very resourceful.
- The link to the knowledge center provides a place for research.
- Pull-down menus are helpful for accessibility.
- The site content and search box are unique.
- Multilanguage browsing is also unique.
- The membership page is easily accessible.
Cons (add your own comments; the analysis report will not provide these comments):
- Too many hyperlinks on one page.
- Multiple sections on one page are difficult to navigate.
The title of the page: The title has a length of 85 characters. Most search engines will truncate titles to 70 characters, so consider shortening it.
The meta description of the page: The meta description tag is meant to be a short and accurate summary of page content. This description can affect search engine rankings and can also show up directly in search engine results (and affect whether or not the user clicks through to the site). The meta description of the page has a length of 157 characters; most search engines will truncate meta descriptions to 160 characters.
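For reference, both elements live in the page head. A generic sketch (the wording and brand are placeholders, not taken from the audited site):

```html
<head>
  <!-- Keep under ~70 characters so search engines do not truncate it -->
  <title>Example Page Title | Example Brand</title>
  <!-- Keep under ~160 characters; may be shown directly in search results -->
  <meta name="description" content="A short, accurate summary of the page, written to fit within the roughly 160 characters search engines display.">
</head>
```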
Robots.txt Test: Search engines send out tiny programs called spiders or robots to search your site and bring information back so that your pages can be indexed in the search results and found by web users. If there are files and directories you do not want indexed by search engines, you can use the “robots.txt” file to define where the robots should not go. These files are very simple text files placed in the root folder of your website. There are two important considerations when using “robots.txt”: first, the “robots.txt” file is a publicly available file, so anyone can see what sections of your server you don’t want robots to use; second, robots can ignore your “robots.txt”, especially malware robots that scan the web for security vulnerabilities. The site uses a “robots.txt” file.
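A minimal “robots.txt” looks like the following (the paths and domain are illustrative placeholders, not taken from the audited site):

```text
User-agent: *
Disallow: /private/
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```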
Sitemap Test: This test checks if the website is using a “sitemap” file: sitemap.xml, sitemap.xml.gz or sitemapindex.xml. Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site. Found 1 sitemap file for the website.
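In its simplest form a sitemap entry looks like this (the URL, date, and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```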
Favicon Test and Validator: Checks if the site is using and correctly implementing a favicon. Favicons are small icons that appear in the browser’s URL navigation bar. They are also saved next to the URL’s title when bookmarking that page. They help brand the site and make it easy for users to navigate to the site among a list of bookmarks. The site doesn’t have a favicon, or it has not been referenced correctly.
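A common way to resolve this finding is to place a favicon.ico in the site root and reference it explicitly in the page head (a generic sketch):

```html
<!-- Explicit favicon reference in the document head -->
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon">
```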
Page Objects: Total objects: 120 (HTML pages: 1; Images: 40; CSS files: 8; Scripts: 22; CSS images: 49; Video files: 0). The page has more than 20 HTTP requests, which can slow down page loading.
Code To Text Ratio: Checks the webpage source code in order to measure the size of text content compared to the structure (HTML code). This percentage is not a direct ranking factor for search engines, but other factors depend on it, such as site loading speed and user experience. The page size (source code) is 89.94 Kb and the content text size is 6.80 Kb. Content text represents 7.56% of the webpage source code. This is a low ratio and you might need to add more content!
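The ratio itself is easy to reproduce. A rough Python sketch of the calculation (our own approximation, not the report tool's exact method):

```python
import re

def code_to_text_ratio(html: str) -> float:
    """Visible-text size as a percentage of total source size."""
    # Drop script/style blocks, then all remaining tags, to approximate visible text.
    text = re.sub(r'(?s)<(script|style)[^>]*>.*?</\1>', '', html)
    text = re.sub(r'<[^>]+>', '', text)
    text = re.sub(r'\s+', ' ', text).strip()
    return 100.0 * len(text) / len(html)

page = "<html><head><title>T</title></head><body><p>Hello world</p></body></html>"
print(round(code_to_text_ratio(page), 2))  # prints 16.44
```

Raising a low ratio usually means adding real text content rather than stripping markup.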
Google Analytics Test: Checks if the website is connected to Google Analytics. The website is using the asynchronous version of the Google Analytics tracking code.
Google Preview: Shows how the webpage might look on a Google search results page. A Google search result uses the webpage title, URL, and meta description to display the information. If these elements are too long, Google will truncate their content. Preview: Information Technology – Information Security – Information Assurance | ISACA. ISACA® is a nonprofit, independent association that advocates for professionals involved in information security, assurance, risk management and governance.
Keywords Cloud: The keyword cloud is a visual representation of keywords used on the website. It shows which words are frequently used in the content of the webpage: advanced america amp apply audit audit/assurance author become certified benefits best big blog browse business cacs career center certification cgeit chapter chinese cisa cism cobit conference conferences control courses cpe cpes crisc csx cybersecurity data day download earn education events exam feb focus football framework global governance implementing information isaca isaca’s isrm jan join june knowledge learn licensing local maintain malware manage management mar membership network new north objectives online opportunities papers privacy professional programs publications purpose register renew report research review risk sarbanesoxley save search share sign study today topics training user using value volunteer web week
Page Cache Test: Checks if the site is serving cached pages. Caches reduce server load (since pages are generated less often) and speed up page display (by caching page output instead of compiling the PHP page). Caches also reduce bandwidth requirements by up to 80%. Caching makes most sense for high-traffic pages whose content does not change on every page view. Common caching methods are Quickcache and jpcache. The site has a caching mechanism. Caching helps speed page loading times as well as reduce server load.
Image Expires Tag Test: Checks if the page is using an image expires tag, which specifies a future expiration date for images. Browsers will see this tag and cache the image in the user’s browser until the specified date (so that they do not keep re-fetching the unchanged image from your server). This speeds up the site the next time the user visits a page that requires the same image. This webpage uses the ‘Expires’ header for images, and browsers will display these images from the cache.
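On Apache, this kind of header is typically set with mod_expires. A sketch (assumes mod_expires is enabled; the one-month lifetime is an illustrative choice):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/gif  "access plus 1 month"
</IfModule>
```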
URL Canonicalization Test: Tests the site for potential URL canonicalization issues. Canonicalization describes how a site can use slightly different URLs for the same page (for example, if the “www” and non-“www” versions display the same page but do not resolve to the same URL). If this happens, search engines may be unsure which URL is the correct one to index. The tested URL variants resolve to the same URL.
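For sites that fail this test, one common fix on Apache is a 301 redirect from one form to the other. A sketch (mod_rewrite assumed; example.com is a placeholder, not the audited site):

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```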
Directory Browsing Test: Checks if the server allows directory browsing. If directory browsing is disabled, visitors will not be able to browse a directory by accessing it directly (if there is no index.html file). This protects files from being exposed to the public. The Apache web server allows directory browsing by default; disabling directory browsing is generally a good idea from a security standpoint. The server has disabled directory browsing.
Libwww-perl Access Test: Botnet scripts that automatically look for vulnerabilities in software sometimes identify themselves with the User-Agent libwww-perl. By blocking access from libwww-perl you can eliminate many simpler attacks. The server does not allow access from the libwww-perl User-Agent.
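For reference, this block is often implemented on Apache with a User-Agent rewrite rule. A sketch (mod_rewrite assumed):

```apache
RewriteEngine On
# Return 403 Forbidden to requests identifying as libwww-perl
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl [NC]
RewriteRule .* - [F,L]
```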
Server Signature Test: Checks if the server signature is on. Turning off the server signature is generally a good idea from a security standpoint. The server signature is on.
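On Apache, the usual remediation is two directives in the main configuration (a sketch; other web servers have their own equivalents):

```apache
# Hide the signature line on server-generated pages
ServerSignature Off
# Report only "Apache" in the Server response header, without version details
ServerTokens Prod
```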
Plaintext Emails Test: Checks the webpage for plaintext email addresses. Any email address posted in public is likely to be automatically collected by computer software used by bulk emailers (a process known as email address harvesting). A spam harvester can read through the pages of the site and extract email addresses, which are then added to bulk marketing databases, and the result is more spam in the inbox. The webpage does not include email addresses in plaintext.
IP Canonicalization Test: Tests the site for potential IP canonicalization issues. Canonicalization describes how a site can use slightly different URLs for the same page (for example, if the site’s IP address and domain name display the same page but do not resolve to the same URL). If this happens, search engines may be unsure which URL is the correct one to index. The site’s IP does not redirect to the site’s domain name. This could cause duplicate-content problems if a search engine indexes the site under both its IP and domain name.
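A typical fix on Apache is to 301-redirect requests addressed to the raw IP to the domain name. A sketch (mod_rewrite assumed; the IP and domain are placeholders, not the audited site's):

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^203\.0\.113\.10$
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```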
Safe Browsing Test: Checks if the website is listed for malware or phishing activity. This site is not currently listed as suspicious (no malware or phishing activity found).
Social Media Check: Tests if the website connects to at least one of the most important social networks. The website is not connected with social media using the APIs provided by Facebook, Google+, Twitter, or Pinterest.
Social Media Activity: Checks the activity of the website or URL on social media networks. This activity is measured in the total number of shares, likes, comments, tweets, PlusOnes, and pins, and it covers only the URL, not social media accounts linked with the webpage. This website has good activity on social media networks. Search engines are increasingly using social media activity to determine which pages are most relevant for keyword searches. Social media engagement helps increase page rank and the revenue generated through organic search.
- Facebook Likes: 122; Facebook Shares: 265; Facebook Comments: 51
- Tweets: 3
- Google PlusOnes: 2879
- No activity on Pinterest
Most Common Keywords Test: Fill in the results and explain all (below)
URL SEO Friendly Test: Fill in the results
Underscores in Links Test: Fill in the results
Image Alt Test: Fill in the results
Inline CSS Test: Fill in the results
Deprecated HTML Tags: Fill in the results
Nested Tables Test: Fill in the results
JS Minification Test: Fill in the results
CSS Minification Test: Fill in the results
MOBILE USABILITY: Fill in the results
Media Query Responsive Test: Fill in the results
