The Liquid Purple Digital Marketing Agency can assist you with website creation, management, marketing, and SEO...
The following metrics are generated using performance data.
First Contentful Paint (FCP) marks the time at which the first text or image is painted onto your page. A good user experience is 0.9 s or less. Measured: 4.5 s (desktop: 1.1 s, score: 0.8).
Time to Interactive (TTI) is the amount of time it takes for the page to become fully interactive. A good user experience is 2.5 s or less. Measured: 8.5 s (desktop: 3.4 s).
Speed Index shows how quickly the contents of a page are visibly populated. A good user experience is 1.3 s or less. Measured: 4.6 s (desktop: 2.3 s, score: 0.49).
Total Blocking Time (TBT) is how much time is blocked by scripts during your page loading process: the sum of all blocking periods between First Contentful Paint and Time to Interactive. A good user experience is 150 ms or less.
Largest Contentful Paint (LCP) marks how long it takes for the largest content element (e.g. a hero image) to be painted on your page. A good user experience is 1.2 s or less.
Cumulative Layout Shift (CLS) measures the movement of visible elements within the viewport, i.e. how much your page's layout shifts as it loads. A good user experience is a score of 0.1 or less.
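These lab numbers come from automated test runs; if you want the same metrics from real visitors, the browser's PerformanceObserver API (which the web-vitals library wraps) can report them from the field. A minimal sketch, where reportMetric() is a placeholder for whatever analytics call you use:

```ts
// Minimal sketch: observe LCP and CLS in the browser with PerformanceObserver.
// reportMetric() is a placeholder for your own analytics endpoint.
function reportMetric(name: string, value: number): void {
  console.log(`${name}: ${value.toFixed(2)}`);
}

// Largest Contentful Paint: the last entry seen before user input is the final LCP.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const last = entries[entries.length - 1];
  if (last) reportMetric('LCP (ms)', last.startTime);
}).observe({ type: 'largest-contentful-paint', buffered: true });

// Cumulative Layout Shift: sum layout-shift values not caused by recent user input.
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as any[]) {
    if (!entry.hadRecentInput) cls += entry.value;
  }
  reportMetric('CLS', cls);
}).observe({ type: 'layout-shift', buffered: true });
```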
Total Page Size - 1.4 MB, split across JavaScript, images, fonts, CSS stylesheets, HTML, media, and other resources.
Total Page Requests - 128
Here is more detailed information about the page.
NoIndex: The noindex directive is a meta tag value that tells search engines not to show a page in their results. Do not set 'noindex' as a meta tag value if you want your website to appear in search engine results.
By default, a webpage is set to "index." Add a <meta name="robots" content="noindex" /> directive in the <head> section of a page's HTML if you do not want search engines to index that page and include it in the SERPs (Search Engine Results Pages).
<meta name="robots" content="noindex" />
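The same directive can also be sent as an HTTP response header, which is useful for non-HTML resources such as PDFs. A minimal sketch, assuming a Node/Express server and a hypothetical /private-report.pdf route:

```ts
// Minimal sketch (assumes an Express app): send the noindex directive as an
// HTTP header, which also works for non-HTML resources such as PDFs.
import express from 'express';

const app = express();

app.get('/private-report.pdf', (_req, res) => {
  res.setHeader('X-Robots-Tag', 'noindex'); // equivalent to the meta tag above
  res.sendFile('/srv/files/private-report.pdf'); // placeholder file path
});

app.listen(3000);
```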
DoFollow & NoFollow: The nofollow directive is a meta tag value (and a link attribute) that tells search engine bots not to follow the links on your website. Do not set 'nofollow' as a meta tag value if you want search engine bots to follow your links.
By default, links are set to "follow." If you want to suggest to Google that a hyperlink should not pass any link equity/SEO value to its target, mark the link "nofollow" like this:
<a href="http://www.example.com/" rel="nofollow">Anchor Text</a>
Canonical: A canonical link tells search engines which URL is the preferred version of a page when the same content is reachable at more than one address, for example with and without the www prefix:
<link rel="canonical" href="https://mywebsite.com/home" />
<link rel="canonical" href="https://www.mywebsite.com/home" />
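A site should standardize on one preferred host and reference it consistently; a common complement to the canonical tag is a permanent redirect at the server. A minimal sketch, assuming an Express app and the example hostnames above:

```ts
// Minimal sketch (assumes an Express app): 301-redirect the non-www host to the
// www host so only the canonical version of each URL is served.
import express from 'express';

const app = express();

app.use((req, res, next) => {
  const host = req.headers.host ?? '';
  if (host === 'mywebsite.com') {
    // Permanent redirect to the canonical www host, keeping path and query string.
    return res.redirect(301, `https://www.mywebsite.com${req.originalUrl}`);
  }
  next();
});

app.listen(3000);
```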
Additional audit values reported for the page: third-party code blocked the main thread for 640 ms and 630 ms; total size was 1,183 KiB and 1,428 KiB; 673 and 675 DOM elements; 50 request chains found; the root document took 430 ms and 1,630 ms; 20 and 10 long tasks found; individual audits reported potential savings ranging from 8 KiB to 341 KiB and from 30 ms to 3,460 ms.
The site's robots.txt file:
# If the Joomla site is installed within a folder
# eg www.example.com/joomla/ then the robots.txt file
# MUST be moved to the site root
# eg www.example.com/robots.txt
# AND the joomla folder name MUST be prefixed to all of the
# paths.
# eg the Disallow rule for the /administrator/ folder MUST
# be changed to read
# Disallow: /joomla/administrator/
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/orig.html
#
# For syntax checking, see:
# http://tool.motoricerca.info/robots-checker.phtml
Sitemap: https://www.jackmarvin.com/sitemap-xml
User-agent: *
Allow: /*.js*
Allow: /*.css*
Allow: /*.png*
Allow: /*.jpg*
Allow: /*.gif*
Disallow: /administrator/
Disallow: /bin/
Disallow: /cache/
Disallow: /cli/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /layouts/
Disallow: /libraries/
Disallow: /logs/
Disallow: /tmp/
Here are some additional factors that affect your SEO page score and rankings.
There are browser cache times that could be set longer.
Serve static assets with an efficient cache policy. A long cache lifetime can speed up repeat visits to your page.
There are 6 resources found with short cache lifetimes.
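A longer cache lifetime is usually set at the server. A minimal sketch, assuming a Node/Express app serving its own fingerprinted assets from a hypothetical /assets path:

```ts
// Minimal sketch (assumes an Express app serving its own static files): give
// fingerprinted assets a long, immutable cache lifetime so repeat visits are fast.
import express from 'express';

const app = express();

// e.g. /assets/app.3f9c2b.js -- safe to cache for a year because the file name
// changes whenever the contents change.
app.use('/assets', express.static('public/assets', {
  maxAge: '365d',
  immutable: true,
}));

app.listen(3000);
```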
Third-party code IS impacting load performance!
Third-party code can significantly impact load performance. Limit the number of redundant third-party providers and try to load third-party code after your page has primarily finished loading. [Learn how to minimize third-party impact](https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/loading-third-party-javascript/).
Third-party code blocked the main thread for 630 ms.
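One way to keep third-party code off the critical path is to inject it only after the page's own load event has fired. A minimal sketch, with a placeholder widget URL:

```ts
// Minimal sketch: inject a third-party script only after the page's own load
// event, so it cannot block the main thread during the initial render.
// The widget URL is a placeholder.
function loadThirdParty(src: string): void {
  const script = document.createElement('script');
  script.src = src;
  script.async = true;
  document.head.appendChild(script);
}

window.addEventListener('load', () => {
  loadThirdParty('https://widgets.example.com/chat.js');
});
```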
Resources ARE blocking the first paint of your page!
Resources are blocking the first paint of your page. Consider delivering critical JS/CSS inline and deferring all non-critical JS/styles. [Learn how to eliminate render-blocking resources](https://developer.chrome.com/docs/lighthouse/performance/render-blocking-resources/).
Potential savings of 730 ms.
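A common pattern is to inline the critical CSS for the first screen and load the remaining styles after first paint. A minimal sketch, assuming a hypothetical /css/non-critical.css file holds the deferred rules:

```ts
// Minimal sketch: load a non-critical stylesheet without blocking first paint.
// Critical above-the-fold rules are assumed to be inlined in a <style> tag.
function loadStylesheet(href: string): void {
  const link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = href;
  document.head.appendChild(link);
}

// Defer the full stylesheet until the browser has painted the initial view.
window.addEventListener('load', () => {
  loadStylesheet('/css/non-critical.css');
});
```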
The network round trip time is good!
Network round trip times (RTT) have a large impact on performance. If the RTT to an origin is high, it's an indication that servers closer to the user could improve performance. [Learn more about the Round Trip Time](https://hpbn.co/primer-on-latency-and-bandwidth/).
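If you want to see what real visitors experience, the Navigation Timing API exposes a rough proxy for the origin round trip (time from request sent to first response byte, which includes server processing). A minimal sketch:

```ts
// Minimal sketch: approximate the round trip to the origin from the Navigation
// Timing entry of the current page.
const [nav] = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[];
if (nav) {
  const ttfb = nav.responseStart - nav.requestStart;
  console.log(`Approximate origin RTT plus server time: ${ttfb.toFixed(0)} ms`);
}
```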
Main-thread work can be minimized!
Consider reducing the time spent parsing, compiling, and executing JS. You may find that delivering smaller JS payloads helps with this. [Learn how to minimize main-thread work](https://developer.chrome.com/docs/lighthouse/performance/mainthread-work-breakdown/).
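One way to reduce main-thread blocking is to split large batches of work into small chunks that yield back to the browser between chunks. A minimal sketch, where items and processItem() are placeholders for your own data and logic:

```ts
// Minimal sketch: break a large batch of work into small chunks so no single
// task blocks the main thread for long. items and processItem() are placeholders.
function processInChunks<T>(items: T[], processItem: (item: T) => void, chunkSize = 50): void {
  let index = 0;
  function runChunk(): void {
    const end = Math.min(index + chunkSize, items.length);
    for (; index < end; index++) {
      processItem(items[index]);
    }
    if (index < items.length) {
      // Yield to the browser before continuing with the next chunk.
      setTimeout(runChunk, 0);
    }
  }
  runChunk();
}
```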