You have a page title and it does not exceed 70 characters. Well done!
About this SEO factor:
A page title is often treated as the most important on-page element. It is a strong relevancy signal for search engines because it tells them what the page is really about. It is of course important that the title includes your most important keywords. Beyond that, every page should have a unique title so that search engines have no trouble determining which of the website's pages is relevant for each query. Pages with duplicate titles have less chance of ranking high, and they can make it harder for the other pages to rank as well.
If a page doesn't have a title, or the title tag is empty (i.e. it just looks like this in the code: <title></title>), Google and other search engines will decide on their own what content to show on the results page. So even if the page ranks on Google for a keyword, searchers may not click on it because the listing is unclear. Every time you create a webpage, remember to add a meaningful title that will attract your audience.
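For example, a meaningful, keyword-bearing title for a hypothetical page could look like this (the wording is purely illustrative):
<title>Random Name Generator - Create Unique Names Online</title>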
The meta description is a short summary of your website's content and features. Most often it is a short paragraph that describes what features and information the website provides to its visitors; you can think of it as an advertisement for your website. It is not important for search engine ranking, but it is very important for the clicks or visits you get through search results. The description should be less than 150 characters, because search engines only show about that much text in a result snippet, and every page of the website should have a unique description to avoid duplication. Since the description defines your website for users, make it a complete but short and precise illustration of your site.
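A matching meta description, again purely illustrative, sits in the <head> of the page:
<meta name="description" content="Generate random names for characters, babies or businesses. Free, fast and no registration required.">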
Meta keywords are keywords placed inside a meta tag. They are unlikely to be used for search engine ranking, and the words from your title and description can simply be reused as meta keywords. It is still a reasonable practice for SEO purposes other than ranking.
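If you do add them, meta keywords simply reuse terms from the title and description, for example (illustrative values):
<meta name="keywords" content="name generator, random names, character names">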
Unique words are uncommon words that reflect your site's features and information. Search engines are not known to use unique words as a ranking factor, but they are still useful for getting a proper picture of your site's content. Using positive unique words like complete, perfect, or shiny is good for the user experience.
Stop words are common words: all the prepositions plus generic words like download, click me, offer, win, etc. Since heavily repeated keywords carry only slight weight with visitors, you are encouraged to use more unique words and fewer stop words.
The ideal ratio of text to HTML code lies between 20% and 60%. If the ratio falls below 20%, you need to write more text on your web page, while above 60% the page might be considered spam.
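For example, if a page's raw HTML weighs 80 KB and the visible text extracted from it weighs 24 KB, the text-to-HTML ratio is 24 / 80 = 30%, which falls comfortably inside the recommended 20-60% range.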
h1 status is whether any content exists inside an h1 tag. It is not as important for search engine ranking as the meta title and description, but it is still a good way to describe your content to search engines and visitors.
h2 status is less important, but h2 headings should still be used so visitors can properly understand your website.
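A typical heading structure uses one h1 for the page's main topic and h2 tags for its sections, for example (hypothetical content):
<h1>Random Name Generator</h1>
<h2>How the generator works</h2>
<h2>Tips for choosing a name</h2>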
robots.txt is a text file that resides in the website root directory and contains instructions for various robots (mainly search engine robots) on how to crawl and index the website's pages. It can name specific search bots or other bots, list the directories they are allowed or disallowed to crawl and index, set a crawl delay, and even point to the sitemap URL. Full access, full restriction, or any customized combination of the two can be imposed through robots.txt.
robots.txt is very important for SEO. Your website directories will be crawled and indexed by search engines according to its instructions, so add a robots.txt file to your website root directory. Write it properly: include your content-rich pages and other public pages, and exclude any pages that contain sensitive information. Remember that a robots.txt rule is not a security mechanism, so do not rely on it to protect sensitive information.
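A minimal robots.txt might look like the sketch below; the domain and paths are placeholders, and Crawl-delay is honored by some bots but ignored by others (Google ignores it):
# example.com and the paths below are placeholders
User-agent: *
Disallow: /admin/
Disallow: /private/
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml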
A sitemap is an XML file that contains a full list of your website's URLs. It tells search engines which parts of your website should be crawled and indexed, and it can help search engine robots index your website faster and more thoroughly. It is roughly the opposite of robots.txt.
You can create a sitemap.xml with various free and paid services, or you can write it yourself in the proper format (a minimal example follows the checklist below).
Also keep these things in mind:
1) A sitemap must be smaller than 10 MB (10,485,760 bytes) and can contain a maximum of 50,000 URLs. If you have more URLs than this, create multiple sitemap files and use a sitemap index file.
2) Put your sitemap in the website root directory and add the URL of your sitemap to robots.txt.
3) sitemap.xml can be compressed using gzip for faster loading.
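A minimal sitemap.xml with a single URL entry might look like this (the URL and date are placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<!-- example.com and the date below are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>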
Broken link: a broken link is an inaccessible link or URL on a website. A higher rate of broken links has a negative effect on search engine ranking due to reduced link equity, and it also hurts the user experience. There are several reasons for broken links, listed below.
1) An incorrect link entered by you.
2) The destination website removed the web page you linked to (a common 404 error).
3) The destination website has moved permanently or no longer exists (changed domain, or the site is blocked or defunct).
4) The user may be behind a firewall or similar security software that blocks access to the destination website.
5) You have provided a link to a site that is blocked from outside access by a firewall or similar software.
NoIndex: the noindex directive is a meta tag value that tells search engines not to show your page in their results. Do not set 'noindex' in your meta tags if you want your website to appear in search engine results.
By default, a webpage is set to "index." You should add a <meta name="robots" content="noindex" /> directive in the <head> section of the HTML if you do not want search engines to index a given page and include it in the SERPs (Search Engine Results Pages).
DoFollow & NoFollow: the nofollow directive is a meta tag value that tells search engine bots not to follow any of the links on your page. Do not set 'nofollow' in your meta tags if you want search engine bots to follow your links.
By default, links are set to “follow.” You would set a link to “nofollow” in this way: <a href="http://www.example.com/" rel="nofollow">Anchor Text</a> if you want to suggest to Google that the hyperlink should not pass any link equity/SEO value to the link target.
An SEO-friendly link roughly follows these rules: the URL uses a dash as the word separator, contains no parameters or numbers, and is static. To resolve this, use these techniques (an example follows the list below).
1) Replace underscores or other separators with dashes, and clean the URL by deleting or replacing numbers and parameters.
2) Merge your www and non-www URLs.
3) Do not use dynamic URLs or multiple related URLs for the same content. Create an XML sitemap for proper search engine indexing.
4) Block unfriendly and irrelevant links through robots.txt.
5) Declare your canonical URLs in a canonical tag.
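As an illustration (with hypothetical URLs), a parameter-heavy address and its SEO-friendly equivalent might look like this:
Unfriendly: https://example.com/index.php?id=123&cat=7
Friendly:   https://example.com/blog/seo-friendly-urls
The friendly page can then declare itself as the canonical version with <link rel="canonical" href="https://example.com/blog/seo-friendly-urls" />.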
The alt attribute provides alternate text for an image and describes it to search engine spiders, which also improves the accessibility of your website. Give a suitable alt text to every image that is part of your content; purely decorative images used for the site design can be left without one. To resolve this, put a suitable description in each alt attribute.
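A suitable alt text simply describes what the image shows, for example (file name and wording are hypothetical):
<img src="blue-ceramic-mug.jpg" alt="Blue ceramic coffee mug with white handle">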
Older HTML tags and attributes that have been superseded by other more functional or flexible alternatives (whether as HTML or as CSS) are declared as deprecated in HTML4 by the W3C, the consortium that sets the HTML standards. Browsers should continue to support deprecated tags and attributes, but eventually these tags are likely to become obsolete, so future support cannot be guaranteed.
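For instance, the deprecated <font> and <center> tags have CSS replacements; a small before/after sketch (the text is illustrative):
Deprecated: <center><font color="red">Sale ends soon</font></center>
Preferred:  <p style="text-align: center; color: red;">Sale ends soon</p>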
HTML page size is one of the main factors in webpage loading time. According to Google's recommendation, it should be less than 100 KB. Note that this size does not include external CSS, JS, or image files. The smaller the page, the shorter the loading time.
To reduce your page size, follow these steps:
1) Move all your CSS and JS code to external files.
2) Make sure your text content is at the top of the page so it can be displayed before the full page has loaded.
3) Reduce or compress all images, Flash media files, etc.; it is better if each of these files is less than 100 KB.
GZIP is a generic compressor that can be applied to any stream of bytes: under the hood it remembers some of the previously seen content and attempts to find and replace duplicate data fragments in an efficient way. In practice, GZIP performs best on text-based content, often achieving compression rates as high as 70-90% for larger files, whereas running GZIP on assets that are already compressed by other algorithms (e.g. most image formats) yields little to no improvement. It is also recommended that the GZIP-compressed size be <= 33 KB.
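How compression is enabled depends on the server; assuming an Apache server with mod_deflate available (a sketch, not the audited site's actual configuration), a minimal .htaccess rule looks like this:
# compress common text-based content types
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
</IfModule>
On nginx the equivalent is gzip on; together with a gzip_types list in the configuration.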
Inline CSS is CSS code that resides in the HTML page inside individual HTML tags rather than in an external .css file. Inline CSS increases the loading time of your webpage, and loading time is an important search engine ranking factor, so try not to use inline CSS.
Internal CSS is CSS code that resides in the HTML page inside a style tag. Internal CSS also increases loading time, since it cannot be cached separately from the page. Try to put your CSS code in an external file.
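To make the difference concrete, the three placements of the same rule look like this (the selector and file name are illustrative):
Inline:   <p style="color: #333;">Welcome</p>
Internal: <style> p { color: #333; } </style>
External: <link rel="stylesheet" href="styles.css"> with p { color: #333; } inside styles.css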
Microdata is information underlying an HTML string or paragraph. Consider the string "Avatar": it could refer to a profile picture on a forum, blog, or social networking site, or it could refer to a highly successful 3D movie. Microdata is used to specify which meaning an HTML string carries. It gives search engines and other applications a chance to understand your content better and to display it more prominently in search results.
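Sticking with the "Avatar" example, schema.org microdata can make the movie meaning explicit; a small sketch:
<!-- marks the string as the name of a Movie item -->
<div itemscope itemtype="https://schema.org/Movie">
  <span itemprop="name">Avatar</span>, directed by
  <span itemprop="director" itemscope itemtype="https://schema.org/Person">
    <span itemprop="name">James Cameron</span>
  </span>
</div>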
If multiple domain names are registered under a single IP address, search bots can label the other sites as duplicates of one site. IP canonicalization addresses this; it is a bit like URL canonicalization. To solve the problem, use redirects.
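Assuming an Apache server with mod_rewrite and a placeholder domain (a sketch, not the audited site's actual setup), a 301 redirect that consolidates the bare domain onto the www host could look like this in .htaccess:
# example.com is a placeholder domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]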
Canonical tags consolidate all the URLs that lead to a single page into one preferred URL. For example, if https://mywebsite.com/home and https://www.mywebsite.com/home serve the same content, both pages can carry the tag <link rel="canonical" href="https://mywebsite.com/home" />.
All the different URLs with the same content are then attributed to the single URL mywebsite.com/home, which helps your search engine ranking by eliminating content duplication.
Use a canonical tag on every URL variant that serves the same content.
Site passed plain text email test. No plain text email found.
About this SEO factor:
A plain text email address is vulnerable to email scraping agents. A scraping agent crawls your website and collects every email address written in plain text, so the presence of plain text email addresses on your website can help spammers with email harvesting. This can be a bad sign for search engines.
To fight this, you can obfuscate your email addresses in several ways (a small sketch follows the list):
1) CSS pseudo-classes.
2) Writing your email address backward.
3) Turning off display using CSS.
4) Using WordPress and PHP (WordPress sites only).
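A small sketch of technique 2: write the address backwards in the HTML and let a CSS bidi override flip it for display, so scrapers reading the raw source see only the reversed string (the address is hypothetical):
<span style="unicode-bidi: bidi-override; direction: rtl;">moc.elpmaxe@tcatnoc</span>
renders to the reader as contact@example.com.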
Compressing resources with gzip or deflate can reduce the number of bytes sent over the network.
Enable compression for the following resources to reduce their transfer size by 36.2 KiB (64% reduction):
https://vpnur.com/stat/piwik.js could save 34.5 KiB (65% reduction)
https://passwordsgenerator.net/name-generator/ could save 1.7 KiB (59% reduction)
Setting an expiry date or a maximum age in the HTTP headers for static resources instructs the browser to load previously downloaded resources from local disk rather than over the network.
Leverage browser caching for the following cacheable resources:
https://passwordsgenerator.net/name-generator/ could save 525 B (18% reduction)
https://passwordsgenerator.net/name-generator/main.js?v=1.3 could save 247 B (25% reduction)
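Assuming an Apache server with mod_expires enabled (a sketch, not the audited site's actual configuration), expiry headers for static resources can be set like this in .htaccess:
# cache static assets for the stated periods
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
Equivalently, the server can send a Cache-Control: max-age=2592000 header (30 days) on those responses.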
Your page has 1 blocking script resource and 1 blocking CSS resource. This causes a delay in rendering your page. None of the above-the-fold content on your page could be rendered without waiting for the following resources to load. Try to defer or asynchronously load blocking resources, or inline the critical portions of those resources directly in the HTML.
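For example, a render-blocking script can usually be deferred or loaded asynchronously (the file names are illustrative):
<script src="main.js" defer></script>   <!-- runs after HTML parsing, in document order -->
<script src="stats.js" async></script>  <!-- runs as soon as it has downloaded, order not guaranteed -->
Small, critical CSS can instead be inlined in a <style> tag in the <head> so above-the-fold content renders without waiting for the external stylesheet.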
Images are often the largest part of a page and account for most of the downloaded data. As a result, optimizing images can often be the largest savings and performance improvement. The less data the browser has to download the faster the page will load.
Image optimization works by removing unnecessary data saved in the images. This can reduce the total page load size by up to 80%.
Visible content is the portion of a webpage users see on their screen before they scroll; it is sometimes referred to as "above the fold" content. Websites that seem very fast and crisp to load are often just as large as slow websites; they have simply prioritized the visible content so the site appears to load faster. Google prefers webpages that show content quickly.
Google suggests two main strategies for prioritizing visible content. First, structure your HTML to load the critical content first, such as the main content rather than ads or navigation. Next, reduce the amount of data used by your resources through techniques such as image optimization, minification, and removing unnecessary elements.