What is SEO?
SEO stands for search engine optimization: the process of increasing the quantity and quality of traffic to your website through organic search engine results.
Google's algorithm ranks your website based on three main factors: On-page SEO, Off-page SEO, and Technical SEO.
This SEO “trilogy” does not always divide into three clean sections: some of these SEO elements will overlap.
Here we will look at On-page SEO.
On-page SEO is the practice of optimizing entire website pages so that they rank higher and earn more relevant traffic in search engines. It is the process of optimizing the elements of a website, such as text, images, and video, so that they appear in search results and bring relevant traffic. On-page optimization alone is not enough for ranking; you also need to check the technical SEO of your blog or website.
1. Heading Tag
Heading tags are HTML tags used to distinguish headings and subheadings within your content from other types of text.
They range from H1 to H6: H1 is the main heading of a page, and H2–H6 are optional tags that organize the content in a way that is easy to navigate.
The following are best practices for headings:
- Give each page a unique H1 that covers the entire page content, like a book title: from the title alone, you can easily understand what the book is about.
- Most of the time there is no need to go deeper than the H3 tag. Use your secondary keywords in H2–H6 tags, and make sure each one is relevant to its paragraph.
- Don’t overuse keywords and tags in your content.
- Put the most important keyword in the H1 and H2 tags.
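As a minimal sketch, a page following these practices might structure its headings like this (the titles are hypothetical):

```html
<!-- One unique H1 that summarizes the whole page -->
<h1>A Beginner's Guide to On-Page SEO</h1>

<!-- H2s break the page into navigable sections -->
<h2>Heading Tags</h2>
  <!-- H3s subdivide a section; deeper levels are rarely needed -->
  <h3>Best Practices for Headings</h3>

<h2>HTTPS</h2>
```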
2. HTTPS
HTTPS (Hypertext Transfer Protocol Secure) is a secure version of HTTP. It works in conjunction with another protocol, Secure Sockets Layer (SSL), to transport data safely.
Users prefer to visit a site that is more secure; HTTPS provides privacy and security for your site visitors. There are three layers of protection for you and your users’ data: encryption, data integrity, and authentication.
In 2014, Google announced that moving your site to HTTPS would give you a slight ranking boost.
Trust is the most important factor anywhere, and here privacy and security are both a bridge of trust between you and your users.
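After installing an SSL certificate, a common step is redirecting all HTTP traffic to HTTPS so visitors never land on the insecure version. A minimal sketch for an Apache server using mod_rewrite (assuming .htaccess overrides are enabled on your host) might be:

```apacheconf
# Redirect every HTTP request to its HTTPS equivalent (301 = permanent)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Nginx and other servers have equivalent redirect directives; check your hosting provider's documentation for the exact setup.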
3. robots.txt file
A robots.txt file is a text file created to instruct web robots (typically search engine robots) how to crawl the pages on your website. It is saved in the root directory of a domain, and search engine spiders check it when crawling your site. However, it does not fully control crawling. You can also include a link to your sitemap, which gives search engine spiders an overview of all existing URLs on your website.
What role does robots.txt play in search engines? The robots.txt file tells search engine spiders which URLs or files on your website they are allowed or disallowed to crawl.
Example 1: allow your entire website to be crawled (the asterisk * means the rule applies to all search engine crawlers).
Example 2: if you do not want a particular file or folder to be crawled, disallow it, for example:
Disallow: /wp tpm file/
Before using robots.txt, make sure that all valuable parts of your website can be crawled and indexed by search bots, and ensure that your website’s crucial parts are not blocked from search engine bots.
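Putting these pieces together, a minimal robots.txt sketch could look like the following (the disallowed path and sitemap URL are hypothetical placeholders, not values from a real site):

```
# The rule block applies to all crawlers
User-agent: *
# Block a private folder from crawling (hypothetical path)
Disallow: /private-folder/

# Point crawlers at the sitemap (hypothetical URL)
Sitemap: https://www.example.com/sitemap.xml
```

The file must live at the root of the domain (e.g. example.com/robots.txt) for crawlers to find it.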
4. XML Sitemap
A sitemap is a list of the pages of a website, designed to help both users and search engines navigate the site. An XML sitemap helps search engines understand which pages of your website to crawl in a methodical manner. Upload an XML sitemap of your website to Google Search Console and Bing Webmaster Tools in order to provide an overview of all URLs that exist on your website.
Online XML sitemap generator tools, such as xml-sitemaps.com, allow you to automatically create an XML sitemap for your website and submit it to the webmaster tools. If you have in-depth coding skills, you can also build an XML sitemap manually.
You can also create specific sitemaps for images, videos, or other files. In simple terms, a sitemap is a road map that tells the search engine what content is available and how to reach it.
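A hand-built XML sitemap follows the sitemaps.org protocol; a minimal sketch with hypothetical URLs and dates might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Full URL of the page (hypothetical) -->
    <loc>https://www.example.com/</loc>
    <!-- Date the page was last modified -->
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/on-page-seo/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Each `<url>` entry describes one page; optional tags such as `<lastmod>` help crawlers prioritize recently updated content.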
5. Structured Data (Schema Markup)
Structured data is the process of “marking up” your website’s source code to make it easier for Google to find and understand the different elements of your content. Search results have evolved so much that you may not even need to click through a result to get the answer to your query.
A rich snippet with a 5-star rating, an attractive picture, a specific price range, stock status, opening hours, or whatever is useful to you is likely to catch the eye and attract more clicks than plain text.
Go through schema.org (founded by Google, Microsoft, Yahoo, and Yandex) for more information, work out which schema types best fit your site’s content, and assign them.
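As an illustration, product markup is commonly added as a JSON-LD script in the page source; a minimal sketch using hypothetical product details might be:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Wireless Mouse",
  "offers": {
    "@type": "Offer",
    "price": "24.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "132"
  }
}
</script>
```

This is the kind of markup that can power the star-rating, price, and stock-status rich snippets mentioned above; Google's Rich Results Test can validate it.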
6. Bounce Rate
Bounce rate is the percentage of visitors who land on a page of a website and then leave rather than continuing to other pages on the site.
If you want to decrease the bounce rate, the content on the site must be relevant to what visitors are searching for.
The page speed should be standard.
The best practice is to deliver exactly what people were looking for when they found your page.
7. Site Speed
Site speed is a small ranking factor for SEO. If site speed is good, search engine spiders will be able to crawl more pages on your website. Every additional second of loading time can drop conversions by up to 20%. Whether it is being viewed on a mobile device or a desktop, your site must load quickly.
You can easily check your site’s loading speed using tools such as GTmetrix.com and Google’s PageSpeed Insights, and fix issues according to their recommendations.
8. Quality Content
Content is king: high-quality content benefits both your website and your visitors. It tells people what your business is all about and how you can help them, and it increases your website’s value.
The best practice for creating high-quality content is to choose relevant keywords and topics:
- Use long-tail and short-tail keywords naturally
- Write for your specific niche
- Solve your audience’s problems
- Create shareable content
- Use synonyms for best results
9. Optimize Images
Here, the most important things to keep in mind are the image alt text, the image size, and the quality of the image. When you optimize images, the alt text should be relevant to the broader page content and use appropriate keywords without keyword stuffing. Second, reduce the size of the image as much as possible without sacrificing its quality; this helps reduce the loading time of the website.
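A minimal sketch of an optimized image tag, applying these points (the file name, dimensions, and alt text are hypothetical):

```html
<!-- Compressed modern format, explicit dimensions to avoid layout shifts,
     descriptive keyword-relevant alt text, lazy loading for below-the-fold images -->
<img src="/images/on-page-seo-checklist.webp"
     alt="On-page SEO checklist covering headings, sitemaps, and images"
     width="800" height="450" loading="lazy">
```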
10. Internal and External Linking
Linking your site internally and externally increases your website’s page authority (PA) and domain authority (DA), which boosts your ranking in the SERPs. Used strategically, a link can pass page authority and domain authority to important pages. In short, internal and external linking is key for any site that wants a higher ranking in Google.
The best practice is to add internal links (dofollow) to your old pages. For external links, make sure the website or domain you link to is trusted, and check the popularity of the linked page, the relevancy of the content between the source page and the target page, the anchor text used in the link, and the number of links to the same page on the source page.
Valuable links also help improve website authority by providing the viewer with references.
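For illustration, an internal dofollow link and a nofollow external link might be marked up like this (the URLs and anchor text are hypothetical):

```html
<!-- Internal link: dofollow by default, passes authority to the older post -->
<a href="/blog/heading-tags-guide/">our guide to heading tags</a>

<!-- External link: rel="nofollow" tells crawlers not to pass authority -->
<a href="https://example.com/study" rel="nofollow">an external study</a>
```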
11. URL Structure
URLs should be simple to understand for both readers and search engine robots.
URLs should be unique and short if possible, and use hyphens ( - ) to separate words.
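For example, compare a descriptive, hyphen-separated URL with a parameter-heavy one (both hypothetical):

```
Readable:   https://www.example.com/on-page-seo-checklist
Unreadable: https://www.example.com/index.php?p=4823&cat=2
```

Both readers and crawlers can tell at a glance what the first page is about; the second gives no clue.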