2020: Blogger is finally introducing new features!

Updated on 3 Feb 2020:

Blogger, the blogging platform maintained by Google, is often criticized for being slow in introducing new features. While WordPress has grown into the most-used CMS and blogging platform, Blogger has lagged behind.

Starting mid-December 2019, Blogger has been cleaning up some of its old stuff. It has changed the way Themes and Stats are offered, and promises to introduce modern-looking features in the days to come.

I hope the changes being introduced by Blogger are not just minor tweaks to the user interface. My wishlist includes a few new themes, some features in the post editor, and some built-in SEO (search engine optimization) features. It is high time Blogger rewarded bloggers who have stayed with it despite the onslaught of WordPress with productive blogging features.

Changes noticed so far:

  • POST EDITOR: In the right column, the LINK option has been changed to include an exposure link.
  • STATS: The look has been refreshed; a summary and the latest post's data are now shown on top.
  • COMMENTS: The look has been refreshed.
  • PAGE EDITOR: A search description field is available.
  • THEME: PREVIEW is no longer available on the opening screen. The options for backup, HTML editing, etc. are hidden behind the three dots on the right side. The HTML editor's buttons for saving/preview are at the bottom.
  • In SETTINGS > OTHER, there is an option for VIDEO MANAGEMENT.
  • You can change the LANGUAGE of BLOGGER through SETTINGS > USER SETTINGS. But that will change the language for other Google products, e.g. Gmail, too.

Is social media pushing you too much online? Try Pod and be social!

As a user of social networking platforms, I have realized that most of my friends and followers on WhatsApp or Instagram are in fact inconsequential when it comes to actual real-life socialization. My Facebook friends (except for a few actual relatives and friends who also happen to be on Facebook) would hardly come to my help when I am under duress. Nor do I feel satisfied enough when they 'like' something about me.

One bad thing is that our busyness on social networking apps leaves little time for real-world socialization. A related problem is that these apps have become a substitute for meeting and greeting, because we use them for sending greetings too. Sometimes we don't hesitate in sending bulk messages or cut-and-paste GIFs to everyone on our friend list/address book.

App that encourages you to meet social media friends

What about an app that would encourage online friends to meet and share news and feelings?

Some early platforms looked at online socialization as an extension of real-life networks but they were annihilated by Facebook, Whatsapp etc. There were some attempts at having real-life engagement through social media platforms, but they remained localized and faded away. So, except for a few niche ones (e.g. WhatsApp groups of close family-members/ friends), you won't find social apps that encourage you to meet up.

Pod, a new social networking platform/app that believes in nudging people toward real-world interactions, seems to be succeeding in its aim. It already has about 5 million users across 200 countries, making around 60 million connections. That is a big number, isn't it?

Pod utilizes location data and artificial intelligence to find people with similar interests who are in the geographical vicinity. It then tells them about such people, encouraging them to say hello and meet up.
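The core idea - find people nearby whose interests overlap with yours - can be sketched in a few lines. This is only a toy illustration (the names, coordinates, interest lists and 5 km radius below are my own assumptions; Pod's actual matching algorithm is not public):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def nearby_matches(me, others, radius_km=5):
    """People within radius_km who share at least one interest with me."""
    matches = []
    for person in others:
        dist = haversine_km(me["lat"], me["lon"], person["lat"], person["lon"])
        shared = set(me["interests"]) & set(person["interests"])
        if dist <= radius_km and shared:
            matches.append((person["name"], round(dist, 2), sorted(shared)))
    return matches

me = {"lat": 18.52, "lon": 73.86, "interests": ["blogging", "photography"]}
others = [
    {"name": "Asha", "lat": 18.53, "lon": 73.85, "interests": ["photography", "trekking"]},
    {"name": "Ravi", "lat": 19.07, "lon": 72.88, "interests": ["blogging"]},  # far away
]
print(nearby_matches(me, others))
```

A real app would, of course, add consent, privacy filters and far smarter matching than a simple interest overlap.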

location based social media

The app uses a map to show people around you. You can ping someone if your hobbies or professional interests match, or if you would buy something from or sell something to them. Meeting is, of course, the desired action, so that further socialization among people in the neighborhood takes place.

Pod sums up its philosophy thus:
At Pod we believe in the power of real connections. The first generation of social media apps have encouraged us all to take our friends, put them online and communicate with them online. At Pod we are on a mission to do the opposite. We want to bring people together.

Connecting for professional purposes.

More than personal engagement, it appears that Pod will later focus on professional engagement. It is already making a pitch for professionals and businessmen to join the platform for the exchange of information, ideas and transactions.

Security and safety in engaging with strangers.

Meeting strangers without much background check, experts say, is not without risks to one's safety and security. The risk may extend further to one's family and professional life when people meet up and exchange information in good faith. Criminals may exploit this situation, especially if children use this app for real-life socialization.

I hope Pod builds security features to take care of these concerns, but I have not found any on the app so far.

Mutual promotion of blogs/ websites can invite heavy penalties from Google. Is your website promotion safe?

Mutual promotion includes actions that are performed on different websites to promote one another. Such actions are taken by website owners/bloggers and SEO people to improve the search ratings of the linked websites.

Mutual actions can take many shapes. Many websites may be involved. Actions can range from just an occasional cross-link to brazen cross-promotion. People form communities and use members for cross-promotion of websites.

Well, not all mutual actions are bad. But search engines have burnt their fingers: unethical SEO experts have been promoting linked websites or blogs for years to manipulate search results. So, search engines are wary of any suspicious action beyond a very low threshold and don't hesitate in penalizing websites. Innocent bloggers are mostly the ones who are badly impacted, as they don't know what wrong they had done to invite the punishment.

Search engines try to go by the intent of the link: whether it gives a valuable cross-reference or just promotes the linked website. In reality, people may intend to genuinely provide value to visitors but end up taking actions that look suspicious. So, the idea is not to do mutual promotion on the website/blog to the extent that it raises search engines' suspicion.

I am giving here the main pointers on how to avoid being penalized by search engines for promoting one website/blog on another:

Too much cross-linking is bad SEO, no questions asked.

This is obvious. When there are too many and too frequent links between two or more websites/blogs, it suggests some hanky-panky between the involved sites. The sites could belong to the same person/firm, or to friends or community members, or there might be some other mutually beneficial arrangement among them. Whatever the reason, such cross-linking - beyond some point - distorts search results, and search engines punish it by devaluing such websites in their indexing/ranking systems.

Some bloggers have the impression that they would benefit in terms of search if they open more than one blog and link them with each other. Sometimes they open many blogs and try to link their posts to one primary website/ blog.

SEO through cross promotion

In the diagram here, there are 8 websites, whose imaginary values in terms of search ratings are given in red under the site names. The idea is to get links from all these websites to the one website you want to promote, so that their values get added and the target website is taken by search engines as a very valuable one.

This trick used to work long back, and it could work for a short time even now, but the chances of getting a slap are very high with this type of linking, even if the links are natural. It is still in use, especially by link farms and PBNs, as explained below.

Unnatural, unrelated linking among websites is undesirable, even if it is very little.

When you link your blog or website to another one that is not related to your website's topic, search engines take such links as unnatural. So, a link to a website about beef given on a website about bed covers is not natural, and search engines deprecate such links.

A few links on unrelated sites may simply be ignored (losing that link's juice), but search engines get alerted if such links are excessive and exist for no apparent reason.

Too much internal linking with unrelated posts is a no-no.

Internal linking (=linking resources on the same website) is desirable as it helps the visitor by bringing to his notice related content at one place. It also helps search crawlers to understand the depth of information on a topic on the same website.

There are many ways people link resources on the same website, e.g. by giving proper links, by categorising and displaying posts with the same label/ tag, by giving a list or thumbnails of related posts at the bottom of new posts, by having an index or archive of old posts, and so on.

All the above-mentioned ways of internal linking are ethical. However, excess of good things too is bad, and we don't know when a search engine takes internal linking as too much. So, it is better to be cautious and not indulge in too much internal linking.

Irrelevant outbound linking loses worth.

Not many people realize that when you put a link to an authoritative source on your website, that can be a positive signal for search engines - because you are guiding the visitor to good reference material.

Search engines are supposed to know when to take an external link as a positive signal and when not. Millions of websites have links to Google for one reason or another - sometimes just because they have used a Google code on their website - and that would not give any points to the website. But if you have a blog on earthquakes and you link to scientific resources on earthquakes in reputed international journals, that would matter. Moreover, such a link gives a better signal when it is part of the post than when it is part of a list.

The opposite of this too is true. When you give links to shady websites - those indulging in unethical SEO and hacking, trolling and online abuse, aggressively pushing products, pornography, etc - there is a fair chance that search engines would suspect you of promoting anti-social and other bad acts.

Too specific links may look unnatural, particularly when too many.

Cross-linking between websites becomes more suspect when specific anchor text is used for linking and is repeated many times. It is because such things do not happen naturally.

A related action would be linking to a specific page of a website too often.

It is therefore a good practice to use different anchor texts when we have to link to our other websites. When the other resource is really relevant to a discussion elsewhere, it is good to link the relevant expression to the related page, but resist the temptation to link it more than once within the same page/post.

Cross posting of content causes 'duplicate content', shows unwanted traffic intent.

When the same content is posted in more than one place, search engines have difficulty in deciding which of the available versions is original and which is a duplicate. So, in one round of crawling, the search engine might index one version and, in the next round, a different one. That reduces the value of both versions.

In addition, posting the same content again and again with no or small change is a bad practice because it is an artificial way to promote the same content. It also irritates visitors when they happen to find that the blogger/ content creator has himself copy-pasted his article in more than one place.

Link farms and link exchanges are very bad for SEO in the long run.

Link farms are not natural groups/communities that develop among people with the same interest. They are created with the sole intent of linking to and liking one another's websites so that members get 'link juice' from the rest of the community. Sometimes members are forced to like a few given websites, so that whether the members' sites get an advantage or not, the chosen ones get hundreds of links. The promoters and moderators of such communities adopt many shady ways to help themselves.

As a member of such a community, you may get some amount of traffic and likes, but that is worthless as the visitors are not the ones really interested in your content; in addition, you have the risk of your website's reputation going down.

All types of communities that promote unnatural link-building are bugbears for search engines.

Badges give useless links. Use them only if they serve a purpose beyond linking.

We often create badges that we give to others for being part of our community or for participating in some competition or on achieving a milestone.

Badges are generally linked to the website that issues the badge. Therefore, the two websites are often not on the same topic, and the link does not have a meaningful anchor text. For search engines, such links are either useless or undesirable.

For a blogger who puts badges on the blog, this is not a big SEO issue; but too many badges, especially when not from good websites/blogs, mar the blog's reputation and also clutter it.

PBN and other one-way links to a single website are highly undesirable SEO techniques.

You might have received an email offering you huge traffic to your blog if you become a member of a private blog network, or PBN.

PBNs are created by buying obsolete blogs that used to have good standing. These are then used for sending links to a target website. The PBN network works like that shown in the picture above, but is much more complex and layered so that it can fool search engines into believing that the website is getting natural links from good blogs/ websites.

The guy who creates such PBNs offers links to website owners and bloggers with the promise that the website or blog would get a jump in search ranking. Naturally, the website that gets links from the PBN gets an initial high ranking due to the good links, but its reputation plunges when the links are found to be from a PBN. To avoid being caught, PBN operators keep buying more old blogs.

PBN or similar link-building methods are frauds meant to cheat search engines, and there is no other reason for their existence. Therefore, the penalty for being part of a PBN can be very heavy and you must resist the temptation of joining a PBN for the sake of quick traffic.

Have you tried Mastodon, a distributed social media app?

Mastodon is the latest social media app that is making waves bigger than most of its siblings. That's for some valid reasons.

Before I come to Mastodon's features, which right now are slightly confusing for users of popular social media apps (Facebook, Twitter, etc), let me highlight where the app differs greatly from them: it is a community-owned app, not one owned by a commercial organization. To be more specific, it is 'open source'.

Federated microblogging: heard of this?

In Mastodon, you don't open your account on Mastodon's own server. You choose one of the thousands of servers that you see when you first open the app. Each server is like a node in a big network of servers, and each one is called an instance. So, if you open your account on the XYZ instance, you become part of its local community and are bound by its rules. In addition, you are free to interact with millions of other users on whichever server they might be. It is thus a decentralized form of social media in which thousands of server owners moderate their own communities.

Since it is an open-source system, the code is available on the web in places like GitHub. If you own a server on the web, you can host your own instance for free and write your own code, or use code freely available on the web, for sprucing up its features.

How instances network across Mastodon
The servers on the Mastodon network use a protocol called ActivityPub that allows them to communicate within the network. If you care, the server-side tech used in this network is Ruby on Rails and Node.js, and the front end is JavaScript-based (React.js, Redux).
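To give a flavour of what travels between instances: an ActivityPub message is a JSON "activity". Below is a simplified sketch of a 'Create' activity wrapping a Note (a toot), roughly as one server might deliver it to another. The actor and instance URLs are made-up examples, and real servers add unique IDs, timestamps and HTTP signatures:

```python
import json

# A minimal ActivityPub "Create" activity wrapping a Note (a toot).
# The instance domain and username below are illustrative assumptions.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://example-instance.social/users/alice",
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
    "object": {
        "type": "Note",
        "attributedTo": "https://example-instance.social/users/alice",
        "content": "Hello, fediverse!",
    },
}

# Serialize for delivery to another instance's inbox.
print(json.dumps(activity, indent=2))
```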

Mastodon is a micro-blogging platform, like Twitter. The character limit here is 500, and some instances allow an even bigger number. Messages here are called toots, parallel to tweets on Twitter.

Web media that is social, not commercial!

We have had many instances of big social platforms selling our data to analytics and other agencies, not keeping data safe and private, and serving us based on what serves their interests, not ours.

Unlike them, Mastodon claims to be much safer. One, since each instance owner moderates his community and there are also network-wide checks, trolling and abuse are less likely to occur. If they do occur, the platform is programmed to react faster and more in the interest of the community. Two, each message has a variety of privacy options, which gives the user much bigger control over privacy than on Facebook etc.

There are no advertisements. In addition, what is served on one's timeline is non-algorithmic and not based on the platform's commercial interests.

Mastodon can be quite useful for organizations as they can have their own instances on which they can have internal communication in a social way.

Can Mastodon challenge Twitter, etc?

The platform was created in October 2016 and started operating with first few instances in 2017. In nearly 3 years, its membership has gone up to 2.2 million. That is a good critical mass but nowhere near the numbers of Facebook, Instagram, Twitter etc. It is reported that a large number of Indians angry with Twitter's features have joined this network in 2019.

Mastodon has come up with many mobile and desktop apps. Some of these apps are available in local languages - giving it the opportunity to penetrate populations not comfortable with English. It has been coming up with new features every few days, thanks to voluntary contributions.

But is that enough to make you switch from other social media apps, say Twitter or Facebook? Frankly, unless the people you would like to interact with and/or follow are on the same platform, you might not like to make the switch. In addition, for a Twitter or Facebook user, Mastodon's system of joining a particular (unknown) instance, and the way instances communicate, looks confusing.

However, the platform has its own merits as mentioned above (particularly privacy), and you might like to experiment with it.

My own take is that it is difficult for any platform to dethrone the big social media platforms because they come out with very interesting and useful features now and then. People get hooked on these features and demand more. So, however public-spirited and safe a new platform may be, people will not abandon the big ones to join it. TikTok, even with a big Chinese company behind it, has not been able to dethrone any video-sharing site; in fact, all the text+image sites now offer video. WhatsApp and Facebook have added e-commerce and money-transfer features in addition to many others. Telegram offers a much bigger group size and claims to be safer than WhatsApp, but the latter rules the chat-app arena.

However, some of the social media biggies are likely to collapse in the years to come (due to their size, legal issues or whatever), and the time would then be ripe for open-source systems to flourish. In the meantime, such apps - including Mastodon - will create a niche for themselves and spring up when their time comes. Well, if they do not become extinct like the elephant-like creature called the mastodon. Haven't we actually witnessed many mighty apps wither away: GeoCities, Squidoo, Orkut, MySpace, Google Plus...

mastodon social media app
Mastodon screen on desktop app

The richest blogger shares social, environmental concerns.

Bill Gates, like only a few other billionaires, is a thinker and philanthropist. The co-founder of Microsoft, he is also a regular blogger.

Bill Gates happens to be the richest man on earth (in some years, he is tied with Jeff Bezos of Amazon). Just to put this blogger's net worth in perspective: his wealth is about $107 billion, while the GDP of Ethiopia is $95 billion, that of Nepal $35 billion and that of Afghanistan $20 billion.

On his blog, Bill often writes about social issues. In his two latest posts, he writes about these highly relevant matters facing the society.

1. Alzheimer’s Disease. This is how Gates himself describes the post: It features a clip from the excellent documentary Turning Point: The Quest for a Cure, which looks at why it’s so hard to run clinical trials that would help us develop treatments for Alzheimer’s. I’ve been talking to experts on that very subject, and I’ve even learned about some ways that each of us can contribute to stopping Alzheimer’s.

2. Energy and climate change. This one is part of a series he is running on the blog: This time I wrote about why buildings are so bad for the climate, and what we can do about it. It’s a sneak peek at one of the areas I’ll cover in the book I’m writing about climate change, which will come out next year.
Buildings can impact climate: from Bill Gates blog

Travel blogger Mariellen gets blogging award

Canada-born travel blogger Mariellen Ward, 59, who writes extensively on India, has received an Indian national tourism award for performance in 2017-18. The Vice President of India gave away these awards in different categories earlier this month.

She has bagged the award in the Best Foreign Journalist/Travel Writer/Blogger/Photographer for India category.

blogger gets tourism award
Mariellen writes extensively on Indian culture. She has explored tourism sites across the country and has immersed herself in local fashion and traditions.

She has been maintaining her blog BreatheDreamGo passionately for the last ten years. She describes her blog in this intro on the blog:
Breathedreamgo is an award-winning travel site published by Canadian travel writer and India travel expert Mariellen Ward. Breathedreamgo was launched in 2009 and focuses on transformative travel, travel in India, travel in Canada, responsible travel, and solo female travel. Our purpose is to encourage you with inspiration and information to live your travel dreams.

25 years of blogging marathon: and one of the very first blogs keeps running!

Blogging has completed 25 years, and what a journey it has been!

It was 1993 when people started keeping diaries on the web, and in the following year, Justin Hall and Dave Winer started blogging regularly.

Blogging gave the common man <no gender bias intended> a medium that he could use to express himself to the world, without the need of an intermediary (book publisher, newspaper/ magazine editor, television show producer...). 

It was in October 1994 that one of the first bloggers, Dave Winer, started his blog, Scripting News. It finds an honorable mention in The Manual of Blogging (screenshot of the Amazon ebook shown below).

Scripting News blog

Dave has been updating Scripting News almost daily, though with some gaps when he does not have access to the net. In his post celebrating 25 years of non-stop blogging, he writes:
A lot of other things happened while this blog was running. Needless to say there were no blogs when it started. There was only email, no instant messaging. No RSS or podcasting, no Twitter, Facebook, Google. Amazon and Netscape were less than a year old. Microsoft tried to take over the web and failed. Steve Jobs came back to Apple and brought us the iPhone. And much more.
In fact, when blogging was at its peak (in the last 2-3 years of the last millennium and first years of the current one), media pundits had even predicted that blogging would make mainstream press irrelevant.

But then came social networking and its other ruinous sisters. Serious discussion and even curation of long-form personal musings gave way to likes, votes, shares, pins, 140-character posts, quick comments and followers. The sober and sane voices on blogs seem to have submerged in this cacophony of words, images and numbers - and now videos.

But blogging has not stopped. While a large number of old blogs perish, an equal number of new blogs open. According to one estimate, there are more than a billion blogs on the www, written in different languages. Many surveys and studies have found that blogs are taken as much more credible sources of information and they provide an important medium of expression.

Restaurants and hotels to sue food bloggers for criticism: you read it right!

A Pune-based union of the Indian hospitality industry, NUHII, says it will certify food bloggers and take legal action against bloggers for negative reviews.

Representatives of the union held a meeting earlier this month and decided to start certifying food bloggers. A representative said there are about 500 food bloggers in Pune itself, and only 25% of them are genuine. In the eyes of the hotel and restaurant owners who are NUHII members, common reviewers who are not food experts or chefs post critical reviews without the required knowledge of the subject and thus hurt their image. Some reviewers, they say, blackmail them too.

The union also seems to be cut up with quick reviews on social media and aggregation sites, where users tend to make comments based on their one-off bad experience and ignore making comments when they are satisfied.

One of the biggest aggregators, Zomato, put formulas in place some years back to filter and hide what it perceived as biased comments by users. In a blog post, Zomato said that since bias comes not only through deliberate criticism but also through paid reviews, the platform would keep upgrading its systems to beat bias.

food blogger, hotel reviews

The other side of the coin

I must have browsed hundreds of food blogs, including Indian ones, for my blogging research. I find that mainstream food bloggers are usually passionate about cooking, recipes, nutrition, etc. They need to be taken separately from the casual reviewers on social media platforms, spammers, users and commenters.

Blogging has its own ills, and a few bloggers (and social media influencers) might be using their big social presence and following to ask for money or freebies, and even to blackmail hoteliers and restaurateurs. But most others have been blogging with commitment and fair play. Their criticism may not be palatable to the hotel and restaurant owners, but they must learn to swallow it.

There are also these questions: Who authorizes one of the many industry bodies to register bloggers? Will registration of publishers by an industry body not be seen as a ploy to stop criticism? What if a registered blogger decides to write a highly critical review? Who defines what is genuine and reasonable criticism, and what is defamation?

I think bloggers as a group, and also commenters and non-blogging reviewers, have the upper hand here. An industry body cannot act to suppress criticism. However, if someone deliberately defames or blackmails a firm, that firm has every legal right to sue the blogger; the industry body would in that case be within its rights to support the firm.

How does Google search work and what are its lessons for SEO?

Ever wondered how Google search works? If you are not a techie, you might wonder how you get '2.5 million search results in 0.35 seconds' on Google the moment you finish typing a query. Let us see how Google is able to do that magic and how you can use this knowledge for better search engine optimization of your website/blog.

Search engines show indexed web pages, not instant results

When we type (or speak) a search query in the Google search box, it comes out with thousands of search results. People generally think that Google sends its tools to search the entire www and that the tools come back with the best results - like Aladdin's genie.

No, Google and other search engines don't work like that. Even the search on your computer does not do an instant search. They all maintain an index of items.

What Google does is that
(i) its search bots keep crawling the www to find stuff based on its experience of what people actually search,
(ii) it maintains a huge index of such stuff, and
(iii) it uses a set of complex formulas to match your search query with the stuff in its index.
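Steps (ii) and (iii) can be illustrated with a toy inverted index in Python - a deliberately simplified sketch, with made-up pages, of the idea that search looks things up in a pre-built index rather than scanning the web at query time:

```python
from collections import defaultdict

# Step (ii): build an index mapping each word to the pages containing it.
# The pages and their text are made-up examples.
pages = {
    "page1": "how to repair a leaking faucet",
    "page2": "faucet designs for modern kitchens",
    "page3": "repair your bicycle at home",
}

index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(query):
    """Step (iii): return pages containing every word of the query."""
    results = None
    for word in query.split():
        results = index[word] if results is None else results & index[word]
    return sorted(results or [])

print(search("repair faucet"))  # → ['page1']
```

Real search engines, of course, layer hundreds of ranking signals on top of this lookup; the point here is only that the query hits an index, not the live web.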

What type of content does Google consider good for search?

Google does not put all web pages in its index; it filters them so that it does not throw up rubbish when someone makes a search. So, inappropriate content such as child porn is filtered out, as is very thin content, e.g. social media comments.

Even among the web pages that pass Google's initial filters, not all come up on search results. Google assigns points/ marks to many quality parameters and uses a series of very complicated formulas (=algorithms) to give a combined ranking to each web page. It is reported that there are many parameters that relate to the entire website and many more that relate to individual entities (web pages, files, images, etc). Web pages and websites with low ranking are not kept in the index and when a website's/ page's quality deteriorates, that is taken out of the index. A web page with high ranking is likely to come high on search results when someone makes a search on Google.

In Google's consideration, the quality of content comes on top when indexing web pages and other web entities. There are hundreds of quality parameters on which Google evaluates a webpage. These include:
  • Originality of content 
  • Thoroughness of content 
  • Usefulness of content 
  • How much the web page has been linked from authoritative websites 
  • Social media signals about the content 
  • Lack of grammatical and language errors 
  • Lack of unethical optimization (=artificial jacking up) done on the web page/ website 

Search engines like fresh content

Search engines like fresh content. So, webpages that are updated regularly are more likely to be served than ones that have not been updated for long. Blogs score a big point here.

However, evergreen content that does not change over time (e.g. 'the planetary system') has its own value for search engines, if it is written well. Compared to content that needs updating, evergreen high-quality content has a very long shelf life; it is searched for and shared again and again, and this sends a positive signal to Google.

Google has a special freshness algorithm to find out whether the searcher is seeking up-to-date information or evergreen content. When a search query has keywords such as 'latest', 'updates', 'score', etc., Google tries to show content that gives fresh information on that topic/event.
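A naive version of such a freshness check - purely illustrative, with an assumed word list that is nothing like Google's actual algorithm - might look like this:

```python
# Words that suggest the searcher wants up-to-date information.
# This list is an assumption for illustration only.
FRESHNESS_WORDS = {"latest", "updates", "update", "score", "news", "today"}

def wants_fresh_results(query):
    """True if the query contains any 'fresh-seeking' keyword."""
    return any(word in FRESHNESS_WORDS for word in query.lower().split())

print(wants_fresh_results("latest cricket score"))  # → True
print(wants_fresh_results("the planetary system"))  # → False
```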

How do search engines deliver search results so fast?

In indexing as well as serving results in response to queries, search engines look for meaningful expressions (=keywords). Once the search engine derives specific keywords in the search query, it looks for web pages that are valuable for those keywords. That takes only a fraction of a second because the search biggies have extremely powerful computers for that. That is how we get millions of results in a fraction of a second.

How do search engines find true meaning of search queries?

When typing a search query, we start with whatever comes to our mind first. If we feel that what we typed does not make full sense, we add qualifiers. The query is often not very clear and can sometimes have more than one meaning.

If you search for 'power solutions' on Google, it will try to find whether you are searching for an electricity solution near you, an article on electric power issues and their solutions, a liquid solution with good strength of cleaning etc, ways to deal with political power, or something else.

While deciding what keywords a web page contains, the same ambiguity arises.

How does Google search work?

Early search engines were 'dumb'; they just looked at the search query and matched it with entries in their index. Smart webmasters fooled them by stuffing keywords into their web pages and pushing useless pages to the top of search results. Then Google and others started punishing such artificially jacked-up content.

Search engines, especially Google, also built language models to better understand how phrases with the same word mean different things as the context or the framing of the query changes. Later, search engines started using machine learning to better understand the intent behind a search. Now they serve search results on the basis of many factors other than direct relevance, e.g. search settings, the searcher's location, what other searches were recently made on the same device, and which app the searcher had been using at the time.

The intent of search also matters: one query may be for information and another for buying, but both may have the same keywords. When the intent is clear, there is no issue; when it is not, location, earlier searches, etc come in handy. For example, 'Fix a faulty faucet' tells the search engine that [perhaps] the searcher wants to get his faulty faucet repaired by a technician. So, the top search results for this query will mostly be local plumbing services and faucet sellers. At the same time, Google will also serve articles that advise on fixing a faulty faucet yourself. Very rarely, it may show some web pages that talk about the engineering behind faucets, but only those it considers highly valuable.

Now, you might ask: did Google's index already have entries for both 'repairing' and 'faucet'? Yes. Instead of just indexing entries for 'faucet', Google has a way to index web pages that contain keywords with more than one word. In this case, Google will look for index entries in which 'repair' and 'faucet' come together in a meaningful sequence. So, Google is likely to serve similar search results for any of the following queries: how to repair a faucet, what to do when a faucet leaks, ways of fixing faucets, how to fix taps so that they do not leak.
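One simple way to index multi-word expressions is to store consecutive word pairs (bigrams) alongside single words. This is only a sketch of the general idea, not Google's actual method, and the sample pages are invented:

```python
# Illustrative bigram index: stores consecutive word pairs so that
# multi-word keywords like "faucet repair" can be looked up directly.
from collections import defaultdict

pages = {
    1: "how to repair a leaking faucet",
    2: "faucet repair services near you",
    3: "the engineering behind faucets",
}

bigram_index = defaultdict(set)
for page_id, text in pages.items():
    words = text.split()
    for a, b in zip(words, words[1:]):   # each adjacent word pair
        bigram_index[(a, b)].add(page_id)

print(bigram_index[("faucet", "repair")])  # {2}
print(bigram_index[("repair", "a")])       # {1}
```

A real engine would additionally normalize word forms (so 'repairing', 'repair' and 'fix' can match each other) and record word positions, which lets it match phrases of any length.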

Google goes a step further to know the real value of a web page in relation to a keyword. Besides exact keywords, Google also looks for other expressions and other elements on the web page that indicate whether the page is relevant to the search query. An example given by Google itself illustrates this: if you search for 'dog', it will not serve a web page just because 'dog' is written on it a hundred times. Instead, it checks whether the page really has useful information on dogs - for example, content on dog foods, dog breeds, dog diseases and pet care, with names of specific breeds, diseases, vet hospitals, etc. The search engine will also check whether the web page has photos, videos and links pertaining to dogs.

As mentioned above, the freshness algorithm smells if the query is for updates. If you want to know 'Chicago weather', it will give the latest weather updates on Chicago on top of the search page; if you query 'dollar-rupee rate', it will give the latest exchange rate between these currencies. Maybe it will also give links to forex dealers near you.

Google penalties for black hat SEO

Let me make one thing very clear: good content is no guarantee that Google will show a web page high in its search results. Google and other major search engines do not serve results just on the strength of quality and relevance. They give value to search engine optimization or SEO, and therefore search-optimized content is likely to come up even if it is not the best or most relevant.

All webmasters know about SEO. It includes measures taken on websites and specific web pages so that they rank high in search results.

Search engines welcome ethical SEO, which guides search engines about the content and relevance of web pages, but they hate black-hat SEO, which tries to fool them into believing that a poor-quality web page is of very high quality and relevance. Filtering out such bad, spammy web pages is one reason search engines keep changing their algorithms so regularly.

We do not know much about other search engines because they do not go public about their activities, but about Google we know this much for sure: Google routinely carries out numerous algorithmic changes to improve the efficiency of search and also to penalize actions that it regards as black-hat SEO. Google has reported making as many as 3,234 improvements to its search within one year (2018).

In the linked article, you can see a list of SEO actions Google likes and dislikes.

The future of search engines

It is estimated that, on average, nearly 2 trillion (=2000 billion) searches are made in a year. More and more searches are now made on mobile devices. This has posed new challenges as well as opportunities for search engines. Mobile phones have made Google available all the time and everywhere. New developments in localization have also made it easy for users to search everything around them on the go. Earlier, when we did not know details about something or someone, we looked it up on Google from our desktops. Now we go to Google when we plan a trip, look for a restaurant nearby, or forget the way home while driving. The results for such queries have to be exact, instant and with useful links.

Another major change that search engines are seeing is 'voice search'. More and more people are using voice for search - especially people on the move, those not comfortable with typing fast, and users of virtual assistants. The mind works differently when one types a query and when one speaks it into a microphone. There are also issues relating to pronunciation and noise. Search engines thus have to be even smarter in getting the query and its intent right.

Search queries are getting longer, and that is challenging the way web pages are indexed and served (e.g. plain/ with snippets/ with images/ video first/ etc). Though SEO experts have been emphasizing optimizing web pages for long keywords, just putting long keywords on web pages may not work for long if search engines prioritize or discard certain types of long phrases.

Google says, it favors authority and quality, but it also seems to be favoring big firms and brands - sometimes more than quality - and the argument seems to be that results relating to them are useful to searchers. Competing search engines do the same. Of course, for search engines focused on marketplaces (e.g. Amazon), buying intent is what matters.

SEO will continue to matter. It will adjust to the latest web technologies and algorithmic changes. Many SEO actions that looked natural earlier will come to be treated as black hat. Search engines will keep making arbitrary changes now and then so that SEO experts cannot precisely guess their methods.

Does Google give preference to paid results? 

Search engines need huge resources. So, they must earn while giving the results free. They serve paid results before organic (= naturally occurring) search results. In the sidebar (on wide screens) and at the top and bottom (especially on smaller screens), they stuff advertisements. They collect your browsing data, ostensibly to refine search but also with the intent to serve you targeted ads. There are many other ways search engines play with results; sometimes to help the searcher and sometimes for commercial reasons. Can you really blame them when you are getting so much information and convenience for free?

If you are interested to learn more about SEO, you may like to visit these resources: 

Fundamentals of search engine optimization: What? Why? How?

This is a knowledge base article. It would answer your questions on search engine optimization (SEO) and supplement other SEO posts on this website. In the area of search engine optimization, you will find the same set of tips, with one less here and one more there, on most expert websites (and thousands of websites that copy-paste them).

I would advise that you go through the present post in full before you look for actual tips. Why? Because reading tips does not lead to success unless we know the 'why' and 'how' behind the tips.

    What actually is SEO and how to improve search engine ranking?

    SEO or search engine optimization is the sum of all the actions you take so that your webpages (or blog posts) appear high on search engines when someone searches on the web for something related to the webpage.

    So, SEO has two important aspects:

    1. Content's relevance is very important.

    Search engines (Google, Bing, Yahoo!, DuckDuckGo, etc) must be able to relate your website with the search query. If someone searches for 'yoga', your site will be shown by search engines only if it has content on yoga. 

    Take care that your website/ blog has enough content, and of good quality, on the subject on which you want the site to come up in search pages.

    2. Website's authority is equally important.

    For deciding which website to give a higher position on the search pages, search engines must rank websites according to their authority. Authority comes from the quality of content and from how other authoritative sites talk about the site.

    For getting a high ranking for your website in a search engine's index, you must ensure high-quality content, have links (=backlinks) and recommendations from other good websites, and get people talking favorably about your website on social media.

    Thus, search engine optimization or SEO is mostly the set of actions we take to improve a website's relevance and authority.

    Does search optimization matter, now that most website traffic comes through social media?

    Social media is important for getting traffic to a website, but search remains one of the top sources of traffic. What actually happens is that if the content is relevant and of high value, SEO and social media work in synergy: people share the content on social media, which gives authority signals to search engines, website starts coming high on search pages, and the virtuous cycle moves on and on.

    Search engine optimization remains very important, and its importance has only increased over the years - because now people do not remember URLs or ask others for references; they just type their search query on Google and then click on search results.

    Since search engines as well as searchers have become smarter, many old-style SEO techniques do not work well now. In fact, applying some of these techniques in 2020 hurts websites. So, what is important is that we apply SEO techniques that are natural and keep searchers in mind more than search engines.

    We hear of frequent Google algorithm updates. What are the latest developments in this field?

    What new developments search engines, especially Google, carry out is anybody's guess unless they themselves announce a major update. However, some companies spend hours analyzing search data from different websites and deciphering what Google might have quietly done. Based on both of these sources, these are the updates as of early 2020:
    • Google keeps updating its algorithms (=formulae) now and then. Such small changes can sometimes be done on daily basis! In a post in July 2019, Google shared that in the preceding one year, it had made 'more than 3200 small changes to its search systems'. 
    • Once in a while, Google comes out with 'broad core' algorithm updates, which are important and may upset a website's search ranking. Google's latest broad core algorithm change came in January 2020.
    • Google advises that website owners need not worry if their search volume suddenly goes down due to algorithm updates, unless the webmaster himself has taken undesirable SEO actions. Yet, many sites lose traffic badly when major updates take place. 
    • It is believed that other search engines follow Google and carry out their own algorithm changes now and then. 
    • Search engines now use machine learning and other artificial intelligence techniques to find the exact intent of a search query (e.g. whether it is for data, finding a location, buying, comparing or something else) and to serve the most relevant results. Google has announced that it now uses BERT, a natural language processing algorithm, to better understand the intent of search queries.
    • Mobile search has gained priority. Google now tags its indexed pages also for mobile so that pages with mobile optimization are served when somebody searches on his mobile phone.
    • Video and image are getting importance as web content, and search engines are finding ways to rank them high. Google has started showing videos on top of search pages.
    • Search pages are now full of content that is not organic or naturally obtained. Advertisements or paid search results often come on top of search pages, and thus even the top organic search results come below paid content.

    Do unethical, black-hat SEO techniques work?

    Frankly, they work. But only some of them work and they work for a short time. They badly harm the website in the long run.

    Please remember that not all SEO is natural; some of it is technical and much of it is deliberate. But that does not make it unethical. Good SEO is like advertising your website and its webpages to search engines. Problems arise when people take undesirable actions, e.g. fooling search engines into believing that poor-quality content is of very high quality. Such actions are called 'black-hat' SEO.

    Google is known to slap penalties on websites when it finds them using black-hat SEO techniques. Many SEO agencies reportedly promise a quick rise in search rankings through such techniques; one should not fall prey to their allurements. Google itself has said that attempts to instantly raise search rankings may land one in trouble.

    Should I go for an SEO service? Will it quickly multiply my search traffic? 

    Part of this question has been answered in the previous reply: be wary when choosing someone who calls himself an 'SEO expert'. There are numerous fake SEO experts around. In fact, the Australian small business ombudsman recently called SEO 'a minefield of dodgy practitioners' and cautioned businesses against hiring SEO experts mindlessly.

    When you do not know the tricks of the trade, or you find some actions too technical and confusing, or you fear that a small wrong step might hurt your website, you can certainly think of hiring an SEO expert. If you run the website for earning purposes and need a steady stream of traffic, the need is much greater than when you run a personal/ hobby blog.

    If you have time and inclination to learn, you can learn SEO fundamentals from the web or take an online course. You can also make use of free and paid tools, but do not get swayed by their jargon and the enormous data they generate.

    What top SEO strategy would you suggest to a blogger who is not too tech-savvy?

    All major search engines advise web publishers to focus on the quality of content and on updating it regularly. At the same time, at least some SEO tactics are essential: in the enormous crowd of websites, a blog/ website cannot come out on top on the strength of quality alone - it must apply some SEO techniques to show search engines that it is worth being shown on top of search results.

    In fact, Google and Bing themselves suggest some technical actions that website owners should not ignore, e.g. relevant URL, title and description; proper site structure; use of ALT attribute with images; linking with authoritative references; proper use of keywords; and so on.  

    However, if you feel that you need a regular stream of search traffic and you are not too tech-savvy, you can go for an SEO expert. As mentioned above, be careful when choosing one.

    Why is my blog not coming high in Google search results? 

    Many bloggers get frustrated when they find that their blog is not coming high in Google search results despite all their SEO efforts. If you are one of them, you might be getting one or another SEO action wrong. For example, your content might be of poor quality, or it may be good but not useful. Or your content may be of top quality but not relevant to what people are looking for.

    Have you optimized your individual posts for visitors as well as search engines? For example: Have you given the post a great, interesting heading and sub-divided the text into sections with sub-headings? Have you added keywords to the post's URL, introduction, body, search description, etc? Have you put the alt attribute on images? Have you added a compelling description to the post? If you are on Wordpress, have you installed an SEO plugin (e.g. Yoast) and optimized the post according to its suggestions?
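Some of these on-page checks can even be automated. Below is a rough sketch using Python's standard html.parser module; the three rules checked (title present, meta description present, every image has an alt attribute) are a simplification of common SEO advice, not any search engine's official criteria:

```python
# Rough on-page SEO checks: title present, meta description present,
# and count of <img> tags missing an alt attribute. Illustrative only.
from html.parser import HTMLParser

class SEOCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_description = False
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.has_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.has_description = bool(attrs.get("content"))
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

checker = SEOCheck()
checker.feed('<html><head><title>Yoga tips</title>'
             '<meta name="description" content="Simple yoga tips"></head>'
             '<body><img src="pose.jpg"></body></html>')
print(checker.has_title, checker.has_description, checker.images_missing_alt)
# True True 1
```

Tools like Yoast and the webmaster consoles mentioned later in this post run far more elaborate versions of checks like these.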

    If your blog and posts do not come high in search pages in spite of all your efforts, look closely at the competition. You might be optimizing your posts for keywords that are highly competitive: what it means is that there may already be other web pages with very high authority and relevance for those keywords. 

    One mistake bloggers and website owners usually make is to try to optimize the website but not individual posts and pages. Google and other search engines show individual web pages more than websites when someone makes a search - so individual web pages need to be optimized. Page-level optimization helps greatly in improving relevance (while site-level optimization is more effective in improving authority).

    Also remember that SEO has to be a continuous process; it is not done once in five years and then forgotten. You need not - and should not - go overboard with SEO, but it always helps to optimize main posts when publishing them and to revisit site-wide SEO every six months or so.

    Best SEO techniques are simple but focused.

    How good are Google and Bing webmaster tools, from SEO point of view?  

    Both Google Search Console and Bing Webmaster Tools are good, as they give data on which keywords rank high, where your site gets traffic from, technical issues that might be hurting your website's SEO, etc. There are some good inbuilt tools for analytics as well as optimization, and they give genuine and practical technical SEO tips.

    Go for either of the two. Do not get bogged down with technicalities. Just see if your website has some serious issue and if your search graph is progressing positively. Take the actions that they suggest.

    Should I go for paid SEO tools available on the web? 

    If you are attracted towards SEO tools available on the web, start with free and trial versions of one or two top SEO companies. Only after that should you take a final decision whether to buy one. 

    In general, you should not go for such tools, if 

    • You are a hobby blogger. 
    • You find that these SEO tools are complicated and useless. 
    • You do not have much technical expertise and you dislike technology. 
    • You already have an SEO expert by your side. 

    You can think of buying SEO tools, if 

    • You have time for technical actions that SEO tools require.
    • You do not hate doing technical work yourself. 
    • You do not want to spend money on hiring an SEO expert but would like SEO strongly implemented on your website. 
    • You find the available free SEO tools useful and think that you would have more options in the paid versions.

    If you like this page, would you mind tweeting about it?