I just read a great article from SEMrush about how SEO mistakes are made nowadays.
Here is a summary of it that may be useful for future reference.
The fight to keep pace with continual updates and avoid technical site issues is part of daily life for webmasters.
Even if you are already aware of a number of problems with your website, it can be a struggle to maintain its health in the ever-changing world of SEO.
With a firm grasp of the most common (and potentially harmful) mistakes, though, you can give yourself a fighting chance of keeping technical issues to a minimum and website performance to a maximum.
This guide gives you a comprehensive site audit checklist that will help you do exactly that as a webmaster, no matter how large or small your site might be.
We ran 250,000 websites from a range of niches, including health, travel, sports and science, through the SEMrush Site Audit tool to find the most prevalent SEO mistakes holding them back.
In total, we analyzed:
310,161,067 webpages
28,561,137,301 links
6,910,489,415 images
This breadth of analysis gave us enough insight to create a comprehensive site audit template that webmasters can use to avoid the mistakes themselves.
There is no escaping the fact that a properly conducted site audit is a time-consuming task.
Our study revealed 27 common mistakes, which obviously can't all be fixed at once, so we have broken the list down into digestible chunks to use as an actionable template.
The most critical technical issues with a website are often related to its HTTP status.
These include status codes like Error 404 (Page not found), which indicate the server’s response on the back of a request from a client, such as a browser or search engine.
When the dialogue between a client and a server - or, in simpler terms, a user and your website - gets interrupted and breaks down, so too does the trust the user has in the site.
Serious server issues may not only lead to lost traffic because of inaccessible content, but they may also damage your rankings in the long run if they leave Google unable to find any suitable results on your site for the searcher.
1. 4xx errors
4xx codes mean that a page is broken and cannot be reached. They can also apply to working pages when something is blocking them from being crawled.
2. Pages not crawled
This occurs when a page cannot be reached for one of two reasons: 1) the response time of your website is over five seconds; or 2) your server denied access to the page.
3. Broken internal links
These are links that lead users to a non-functioning page on your own site, which can damage both UX and SEO.
4. Broken external links
These are links that lead users to pages that don’t exist on another site, which sends negative signals to search engines.
5. Broken internal images
This is flagged when a picture file no longer exists, or its URL is misspelled.
Permanent redirects
Temporary redirects
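To illustrate the HTTP status issues above, here is a minimal Python sketch, assuming the requests library and placeholder URLs, that flags 4xx/5xx responses and pages that take longer than the five-second threshold mentioned earlier. It is a rough spot check, not a replacement for a full crawl.

```python
# Minimal sketch: flag 4xx/5xx responses and slow pages.
# Assumes the requests library is installed; the URLs are placeholders.
import requests

urls = [
    "https://example.com/",
    "https://example.com/old-page",
]

for url in urls:
    try:
        resp = requests.get(url, timeout=10)
        if 400 <= resp.status_code < 500:
            print(f"{url}: client error {resp.status_code} (broken or blocked page)")
        elif resp.status_code >= 500:
            print(f"{url}: server error {resp.status_code}")
        elif resp.elapsed.total_seconds() > 5:  # mirrors the five-second response threshold above
            print(f"{url}: responded in {resp.elapsed.total_seconds():.1f}s (slow)")
    except requests.RequestException as exc:
        print(f"{url}: could not be reached ({exc})")
```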
Your meta tags help search engines identify the subject matter of your pages so they can connect them with the keywords and phrases used by searchers.
Creating the right title tags means choosing the relevant keywords to form a unique and click-worthy link for users in the search engine results pages (SERPs).
The meta descriptions give you additional opportunities to include keywords and related phrases.
They should be as unique and tailored as possible - if you don’t create your own, Google will automatically generate them based on the keywords in users’ queries, which can sometimes lead to mismatched search terms and associated results.
Optimized title tags and meta descriptions need to include the most appropriate keywords, be the correct length and avoid duplication as much as possible.
Some industries, such as e-commerce fashion, are unable to create unique descriptions for every single product, so they need to offer unique value in other areas of their landing pages’ body copy.
If unique metadata is possible, though, you should head in that direction to give your site the best chance of maximizing its impact in the SERPs.
6. Duplicate title tags and meta descriptions
Two or more pages with the same titles and descriptions make it difficult for search engines to properly determine relevance and, in turn, rankings.
7. Missing H1 tags
H1 tags help search engines determine the topic of your content. If they are missing, there will be gaps in Google’s understanding of your website.
8. Missing meta descriptions
Well-written meta descriptions help Google understand relevance and encourage users to click on your result. If they are missing, click-through rates can fall.
9. Missing ALT attributes
ALT attributes provide search engines and visually impaired people with descriptions of the images in your content. Without them, relevance is lost and engagement can suffer.
10. Duplicate H1 tags and title tags
When H1 tags and title tags are the same on any given page, it can look over-optimized and it can mean opportunities to rank for other relevant keywords have been missed.
Short / long title elements
Multiple H1 tags
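If you want to spot-check individual pages by hand, a sketch along these lines, assuming the requests and beautifulsoup4 packages and a placeholder URL, pulls out the title tag, meta description and H1s so you can compare length and duplication across pages.

```python
# Sketch: extract the title, meta description and H1 tags of a page for manual review.
# Assumes requests and beautifulsoup4 are installed; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
meta = soup.find("meta", attrs={"name": "description"})
description = meta.get("content", "").strip() if meta else ""
h1s = [h1.get_text(strip=True) for h1 in soup.find_all("h1")]

print(f"Title ({len(title)} chars): {title or 'MISSING'}")
print(f"Meta description ({len(description)} chars): {description or 'MISSING'}")
print(f"H1 tags: {h1s or 'MISSING'}")
if title and title in h1s:
    print("Note: an H1 duplicates the title tag on this page")
```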
Duplicate content has the capacity to damage your rankings - and potentially for a while.
You should steer clear of duplicating any kind of content from any kind of site out there, whether they are a direct competitor or not.
Look out for duplicate descriptions, paragraphs and entire sections of copy, duplicate H1 tags across multiple pages and URL issues, such as www and non-www versions of the same page.
Pay attention to the uniqueness of every detail to make sure a page is not only rankable in Google’s eyes, but also clickable in users’ eyes.
11. Duplicate content
The Site Audit tool flags duplicate content when pages on your website have the same URL or copy, for instance. It can be resolved by adding a rel="canonical" link to one of the duplicates, or using a 301 redirect.
Duplicate H1 tags and title tags
Duplicate meta descriptions
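One rough way to surface exact duplicates across a small set of pages is to hash their visible text and group identical hashes, as in this sketch; the URL list and normalisation are my own assumptions, and a real audit tool uses far more sophisticated similarity checks.

```python
# Sketch: group pages whose visible text is identical after collapsing whitespace and case.
# Assumes requests and beautifulsoup4; the URLs are placeholders.
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = [
    "https://example.com/page-a",
    "https://example.com/page-b",
]

groups = defaultdict(list)
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    text = " ".join(soup.get_text().split()).lower()
    groups[hashlib.sha256(text.encode()).hexdigest()].append(url)

for pages in groups.values():
    if len(pages) > 1:
        print('Possible duplicates (consider rel="canonical" or a 301):', pages)
```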
The links that guide your visitors in and out of your customer journeys can damage your overall user experience and, in turn, your search performance. Google simply will not rank sites that deliver a poor user experience.
This study revealed that close to half of the sites we ran through the Site Audit tool have problems with both internal and external links, which would suggest that their individual link architectures are not optimized.
Some of the links themselves have underscores in the URLs, contain nofollow attributes, and are HTTP instead of HTTPS - this can impact rankings.
You can find broken links on your site with the Site Audit tool; the next step would be for you to identify which ones are having the biggest effect on your user engagement levels and to fix them in order of priority.
12. Links that lead to HTTP pages on an HTTPS site
Links to old HTTP pages may cause an unsafe dialog between users and a server, so be sure to check that all your links are up to date.
13. URLs containing underscores
Search engines may misinterpret underscores and incorrectly document your site index. Stick to using hyphens instead.
Broken internal links
Broken external links
Nofollow attributes in external links
Pages with only one internal link
Page Crawl Depths of more than 3 clicks
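To illustrate points 12 and 13, a short sketch like this one, again assuming requests and beautifulsoup4 and a placeholder URL, lists links on a page that still point to plain HTTP or whose paths contain underscores.

```python
# Sketch: list links that point to plain HTTP or contain underscores in the URL path.
# Assumes requests and beautifulsoup4; the audited page URL is a placeholder.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

page = "https://example.com/"
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    href = urljoin(page, a["href"])  # resolve relative links against the page URL
    parsed = urlparse(href)
    if parsed.scheme == "http":
        print(f"HTTP link on an HTTPS site: {href}")
    if "_" in parsed.path:
        print(f"Underscore in URL path: {href}")
```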
Crawlability sits alongside indexation issues as one of the crucial health indicators of a website.
There is ground to be both lost and gained in the SERPs when it comes to the crawlability of your site.
If you ignore any crawling issues from a technical SEO perspective, some of the pages on your site might not be as visible as they should be to Google.
If you fix any crawling issues, however, Google will be more likely to identify the right links for the right users in the SERPs.
You can avoid technical issues by assessing your site for broken or blocked elements that restrict its crawlability.
Kevin Indig, VP SEO & Content at G2.com, emphasizes the importance of synergy between sitemaps and robots.txt here:
What surprised me is that many XML sitemaps are not referenced in the robots.txt. That seems like a standard to me. What’s not surprising is the high degree of sites with only one internal link to pages or even orphaned pages. That’s a classic site structure issue that only SEOs have the awareness for.
An absence of a sitemap.xml file in your robots.txt file, for example, can lead to search engine crawlers misinterpreting your site architecture, as Matt Jones, SEO and CRO Manager at Rise at Seven, says:
As sitemap.xml files can help search engine crawlers identify and find the URLs that exist across your website, allowing them to crawl these [is] definitely a fantastic way to help search engines gain an in-depth understanding of your website and, in turn, gain higher rankings for more relevant terms.
14. Nofollow attributes in outgoing internal links
Internal links that contain the nofollow attribute block any potential link equity from flowing through your site.
15. Incorrect pages found in sitemap.xml
Your sitemap.xml should contain no broken pages. Check it for any redirect chains and non-canonical pages and make sure they return a 200 status code.
16. Sitemap.xml not found
Missing sitemaps make it more difficult for search engines to explore, crawl and index the pages of your site.
17. Sitemap.xml not specified in robots.txt
Without a link to your sitemap.xml in your robots.txt file, search engines will not be able to fully understand the structure of your site.
Pages not crawled
Broken internal images
Broken internal links
URLs containing underscores
4xx errors
Resources formatted as page links
Blocked external resources in robots.txt
Nofollow attributes in outgoing external links
Blocked from crawling
Pages with only one internal link
Orphaned sitemap pages
Page Crawl Depths more than 3 clicks
Temporary redirects
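A quick way to confirm that your robots.txt actually references a sitemap, as the quotes above recommend, is a check along these lines; the domain is a placeholder and the parsing is deliberately simple.

```python
# Sketch: check whether robots.txt references at least one sitemap,
# and whether each referenced sitemap URL responds with HTTP 200.
# Assumes the requests library; the domain is a placeholder.
import requests

domain = "https://example.com"
robots = requests.get(f"{domain}/robots.txt", timeout=10).text

sitemaps = [
    line.split(":", 1)[1].strip()
    for line in robots.splitlines()
    if line.lower().startswith("sitemap:")
]

if not sitemaps:
    print("No Sitemap: directive found in robots.txt")
for sitemap_url in sitemaps:
    status = requests.get(sitemap_url, timeout=10).status_code
    print(f"{sitemap_url}: HTTP {status}")
```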
Good indexability indicators are vital for SEO. Put simply, if a page is not indexed, it won’t be seen by a search engine, so it won’t be seen by users either.
There are many factors that can prevent your website from being indexed, even if you seem to have no issues with crawlability.
Duplicate meta data and content, for instance, can make it difficult for search engines to identify which pages to rank for certain similar search terms.
You can see from our research above that almost half of the sites we audited are suffering from indexing issues caused by duplicate title tags, descriptions and body content.
This may mean that Google is being forced into making decisions about which pages to rank, despite the fact that webmasters can preempt problems like these and tell Google what to do.
A range of different issues can affect the indexability of your site, from low word count to hreflang gaps or conflicts for multilingual websites.
18. Short / long title tags
Title tags of over 60 characters are cut short in the SERPs, while those under 60 characters might be missed opportunities for further optimization.
19. Hreflang conflicts within page source code
Multilingual websites can confuse search engines if the hreflang attribute is in conflict with the source code of any given page.
20. Issues with incorrect hreflang links
Broken hreflang links can create indexing issues if, for example, relative URLs are used instead of absolute ones: /blog/your-article instead of https://yourwebsite/blog/your-article.
21. Low word counts
The Site Audit tool can flag pages that appear to be lacking in content, so it is worth reviewing these to make sure they are as informative as possible.
22. Missing hreflang and lang attributes
This issue is triggered when a page on a multilingual site is missing the links or tags that tell search engines which version to serve users in each region.
23. AMP HTML issues
This issue concerns mobile users of your website and is flagged when the HTML code does not align with AMP standards.
Duplicate H1 tags
Duplicate content
Duplicate title tags
Duplicate meta descriptions
Missing H1 tags
Multiple H1 tags
Hreflang language mismatch issues
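To catch the relative-URL problem described in point 20, a sketch along these lines, assuming requests and beautifulsoup4 and a placeholder page, checks that every hreflang alternate in the head uses an absolute URL.

```python
# Sketch: flag hreflang alternates that use relative rather than absolute URLs.
# Assumes requests and beautifulsoup4; the page URL is a placeholder.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

page = "https://example.com/"
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for link in soup.select('link[rel~="alternate"][hreflang]'):
    href = link.get("href", "")
    parsed = urlparse(href)
    if not (parsed.scheme and parsed.netloc):
        print(f"hreflang '{link['hreflang']}' uses a relative URL: {href or 'MISSING'}")
```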
It is vital to gear your on-page SEO towards having a mobile-friendly site.
We know that mobile-friendliness will become a default ranking criterion for both mobile and desktop for Google in September 2020.
This means that, as webmasters, you need to make sure your site’s HTML code complies with Google’s AMP guidelines before then to be mobile-ready and avoid potential damage to your search performance.
Check for invalid AMP pages on your site with the Site Audit tool so you can see what needs fixing; it may come down to your HTML, your style and layout or your page templates.
AMP HTML issues can be related to style or layout and, as mentioned above, can affect the indexability of a site.
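The official AMP validator is the authoritative check here, but as a first pass you can at least confirm that a page declares itself as AMP, since valid AMP documents mark the root element with an amp (or ⚡) attribute. A minimal sketch, with a placeholder URL and requests plus beautifulsoup4 assumed:

```python
# Sketch: first-pass check that a page declares the AMP attribute on its <html> tag.
# This does not replace the official AMP validator; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/amp/article"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

html_tag = soup.find("html")
attrs = html_tag.attrs if html_tag else {}
if "amp" in attrs or "⚡" in attrs:
    print("Page declares itself as AMP; run it through the official validator next.")
else:
    print("No amp/⚡ attribute on <html>; this page is not served as an AMP document.")
```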
Page load time is becoming increasingly important in SEO. The slower your site, the less likely it is to engage the users who have the patience to wait for it to load.
You can get page speed suggestions for mobile and desktop directly from Google. Learn how to measure page speed, and identify opportunities to make your site faster.
The Google site speed test used in conjunction with the SEMrush Site Audit tool might reveal, for example, overcomplicated JavaScript or CSS files (as it did with many of the sites in our study).
Gerry White, SEO Director at Rise at Seven, suggests that code minifying is a quick win as far as site performance and user experience are concerned:
One of the things that stands out in the data is the amount of quick wins for page speed. It isn’t just about rankings but also about the user and conversion - simple quick wins that can usually be delivered without too much development effort is where I would focus my efforts on that front. Tasks such as compressing JavaScript and CSS take minutes to do, but can make huge improvements on many websites. This should be combined with ensuring that HTTPS is enabled with HTTP2.
25. Slow page (HTML) load speed
The time it takes for a page to be fully rendered by a browser should be as short as possible, as speed directly affects your rankings.
26. Uncached JavaScript and CSS files
This issue may be tied to your page load speed and happens if browser caching is not specified in the response header.
27. Unminified JavaScript and CSS files
This issue is about making JavaScript and CSS smaller. Remove unnecessary lines, comments, and white space to improve page load speed.
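As a rough complement to Google's PageSpeed report, a sketch like this, with placeholder asset URLs and an arbitrary whitespace threshold I've assumed, checks whether JS/CSS responses send caching headers and uses the share of whitespace as a hint that a file may be unminified.

```python
# Sketch: check JS/CSS assets for caching headers and a rough "is it minified?" hint.
# Assumes the requests library; asset URLs and the 15% threshold are assumptions.
import requests

assets = [
    "https://example.com/static/app.js",
    "https://example.com/static/styles.css",
]

for url in assets:
    resp = requests.get(url, timeout=10)
    if "Cache-Control" not in resp.headers and "Expires" not in resp.headers:
        print(f"{url}: no caching headers in the response")
    body = resp.text
    if body:
        whitespace_ratio = sum(ch.isspace() for ch in body) / len(body)
        if whitespace_ratio > 0.15:  # minified files are mostly dense single lines
            print(f"{url}: {whitespace_ratio:.0%} whitespace, probably unminified")
```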
In some cases, errors, warnings and notices picked up by the Site Audit tool will fall under several categories.
This means they can cause a range of problems for your website, as illustrated below, so it is recommended that they are addressed as priorities.
Committing any of these SEO mistakes can hold your website back from reaching its full potential, so it is vital that you keep on top of them as a webmaster with regular site audits.
Whether you are suffering from crawlability issues preventing pages from being indexed, or duplication issues risking possible penalties, you can use this checklist to stop molehills becoming mountains.
Make a habit of looking after your SEO and UX health with tools like the Site Audit tool and you will be rewarded with the kind of search visibility and user engagement that have positive impacts on your bottom line.