Travel blogger Mariellen gets blogging award

Canada-born travel blogger Mariellen Ward, 59, who writes extensively on India, has received an Indian national tourism award for her performance in 2017-18. The Vice President of India gave away the awards in different categories earlier this month.

She has bagged the award in the Best Foreign Journalist / Travel Writer / Blogger / Photographer for India category.

Mariellen writes extensively on Indian culture. She has explored tourism sites across the country and has indulged in local fashion and traditions.

She has been passionately maintaining her blog, BreatheDreamGo, for the last ten years. She describes it in this intro on the blog:
Breathedreamgo is an award-winning travel site published by Canadian travel writer and India travel expert Mariellen Ward. Breathedreamgo was launched in 2009 and focuses on transformative travel, travel in India, travel in Canada, responsible travel, and solo female travel. Our purpose is to encourage you with inspiration and information to live your travel dreams.

A 25-year blogging marathon: and one of the very first blogs keeps running!

Blogging has completed 25 years, and what a journey it has been!

It was in 1993 that diaries updated on the web first appeared, and in the following year Justin Hall and Dave Winer started blogging regularly.

Blogging gave the common person a medium to express themselves to the world, without the need of an intermediary (book publisher, newspaper/ magazine editor, television show producer...).

It was in October 1994 that one of the first bloggers, Dave Winer, started his blog, Scripting News. It finds an honorable mention in The Manual of Blogging (screenshot of the Amazon ebook shown below).


Dave has been updating Scripting News almost daily, though with some gaps when he does not have access to the net. In his post celebrating 25 years of non-stop blogging, he writes:
A lot of other things happened while this blog was running. Needless to say there were no blogs when it started. There was only email, no instant messaging. No RSS or podcasting, no Twitter, Facebook, Google. Amazon and Netscape were less than a year old. Microsoft tried to take over the web and failed. Steve Jobs came back to Apple and brought us the iPhone. And much more.
In fact, when blogging was at its peak (in the last two or three years of the last millennium and the first years of the current one), media pundits even predicted that blogging would make the mainstream press irrelevant.

But then came social networking and its other ruinous sisters. Serious discussion and even curation of long-form personal musings gave way to likes, votes, shares, pins, 140-character posts, quick comments and followers. The sober and sane voices on blogs seem to have been submerged in this cacophony of words, images and numbers - and now videos.

But blogging has not stopped. While a large number of old blogs perish, an equal number of new ones open. According to one estimate, there are more than a billion blogs on the www, written in many languages. Many surveys and studies have found that blogs are seen as much more credible sources of information and that they provide an important medium of expression.

Restaurants and hotels to sue food bloggers for criticism: you read it right!

A Pune-based union of the Indian hospitality industry, NUHII, says it will certify food bloggers and take legal action against bloggers for negative reviews.

Representatives of the union held a meeting earlier this month and decided to start certifying food bloggers. A representative said there are about 500 food bloggers in Pune alone, and only 25% of them are genuine. In the eyes of the hotel and restaurant owners who are NUHII members, common reviewers who are not food experts or chefs post critical reviews without the required knowledge of the subject and thus hurt their image. Some reviewers, they say, even blackmail them.

The union also seems to be cut up with quick reviews on social media and aggregation sites, where users tend to make comments based on their one-off bad experience and ignore making comments when they are satisfied.

One of the biggest aggregators, Zomato, some years back put formulas in place to filter and hide what it perceived as biased comments by users. In its blog post, Zomato said that since bias comes not only through deliberate criticism but also through paid reviews, the platform would keep upgrading its systems to beat bias.


The other side of the coin

I must have browsed hundreds of food blogs, including Indian ones, for my blogging research. I find that mainstream food bloggers are usually passionate about cooking, recipes, nutrition etc. They need to be taken separately from the casual ones on social media platforms, and from spammers, users and commenters.

Blogging has its own ills, and a few bloggers (and social media influencers) might be using their big social presence and following to ask for money or freebies and even to blackmail hoteliers and restaurateurs. But most others have been blogging with commitment and fair play. Their criticism may not be palatable to hotel and restaurant owners, but the owners must learn to swallow it.

There are also these questions: Who authorizes one of the many industry bodies to register bloggers? Will registration of publishers by an industry body not be seen as a ploy to stop criticism? What if a registered blogger decides to write a highly critical review? Who defines what is genuine and reasonable criticism, and what is defamation?

I think bloggers as a group, along with commenters and non-blogging reviewers, have the upper hand here. An industry body cannot act to suppress criticism. However, if someone deliberately defames or blackmails a firm, that firm has every legal right to sue the blogger; the industry body would in that case be within its rights to support the firm.

How does Google search work and what are its lessons for SEO?

Ever wondered how Google search works? If you are not a techie, you might wonder how you get '2.5 million search results in 0.35 seconds' on Google the moment you finish typing a query. Let us see how Google does that magic and how you can use this knowledge for better search engine optimization of your website/ blog.

Search engines show indexed web pages, not instant results

When we type (or speak) a search query into the Google search box, it comes back with thousands of search results. People generally think that Google sends its tools out to search the entire www and that the tools return with the best results - like Aladdin's genie.

No, Google and other search engines don't work like that. Even the search on your computer does not do an instant search. They all maintain an index of items.

What Google does is this:
(i) its search bots keep crawling the www for stuff, guided by its experience of what people actually search for,
(ii) it maintains a huge index of such stuff, and
(iii) it uses a set of complex formulas to match your search query with the stuff in its index.
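The three steps above can be made concrete with a toy inverted index. This is only a sketch - the page names and texts are invented, and real search engines use vastly bigger indexes and far more complex matching formulas:

```python
# A toy inverted index: steps (ii) and (iii) in miniature.

def build_index(pages):
    """Map each word to the set of pages containing it (step ii)."""
    index = {}
    for name, text in pages.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(name)
    return index

def search(index, query):
    """Return pages containing every query word (step iii)."""
    sets = [index.get(w, set()) for w in query.lower().split()]
    return set.intersection(*sets) if sets else set()

# Hypothetical crawled pages (step i), for illustration only.
pages = {
    "travel-tips": "solo travel tips for india",
    "recipes": "easy indian recipes for travel snacks",
}
index = build_index(pages)
print(search(index, "travel india"))  # {'travel-tips'}
```

Notice that the expensive work (crawling and indexing) happens before any query arrives; answering a query is then just a quick lookup.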

What type of content does Google consider good for search?

Google does not put all web pages in its index, but filters them so that it does not throw up rubbish when someone makes a search. So, inappropriate content such as child porn is filtered out, as is very thin content, e.g. social media comments.

Even among the web pages that pass Google's initial filters, not all come up in search results. Google assigns points to many quality parameters and uses a series of very complicated formulas (= algorithms) to give a combined ranking to each web page. It is reported that many parameters relate to the entire website and many more relate to individual entities (web pages, files, images, etc). Web pages and websites with low rankings are not kept in the index, and when a website's or page's quality deteriorates, it is taken out of the index. A web page with a high ranking is likely to come high in search results when someone makes a relevant search on Google.

In Google's consideration, the quality of content comes on top when indexing web pages and other web entities. There are hundreds of quality parameters on which Google evaluates a webpage. These include:
  • Originality of content 
  • Thoroughness of content 
  • Usefulness of content 
  • How much the web page has been linked from authoritative websites 
  • Social media signals about the content 
  • Lack of grammatical and language errors 
  • Lack of unethical optimization (=artificial jacking up) done on the web page/ website 
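To picture how such parameters could be combined into one ranking, here is a minimal sketch. The parameter names, weights and per-page scores below are entirely invented for illustration; Google's real signals and weights are secret and far more numerous:

```python
# Hypothetical weighted quality score: each parameter gets a 0-1 score,
# and a weighted sum produces one combined ranking number.

WEIGHTS = {
    "originality": 3.0,
    "thoroughness": 2.0,
    "usefulness": 2.5,
    "authoritative_links": 2.0,
    "social_signals": 1.0,
    "language_quality": 1.5,
}

def quality_score(signals):
    """Combine per-parameter scores (0 to 1) into one ranking score."""
    return sum(WEIGHTS[k] * signals.get(k, 0.0) for k in WEIGHTS)

# An imaginary page strong on originality, weaker on links.
page = {"originality": 0.9, "usefulness": 0.8, "authoritative_links": 0.4}
print(round(quality_score(page), 2))
```

A page scoring below some threshold could then be left out of the index, matching the filtering described above.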

Search engines like fresh content

Search engines like fresh content. So, web pages that are updated regularly are more likely to be served than ones that have not been updated for long. Blogs score a big point here.

However, evergreen content that does not change over time (e.g. 'the planetary system') has its own value for search engines, if it is written well. Compared with content that needs updating, evergreen high-quality content has a very long shelf life; it is searched for and shared again and again, and this sends a positive signal to Google.

Google has special freshness algorithms to find out whether the searcher is seeking up-to-date information or evergreen content. When a search query has keywords such as 'latest', 'updates', 'score' etc, Google tries to surface content that gives fresh information on that topic/ event.
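A very naive version of that keyword test can be sketched in a few lines. The hint-word list is made up for illustration; real freshness algorithms also look at news spikes, trends and how fast results for a topic change:

```python
# Toy 'fresh intent' detector: does the query contain words that
# usually signal a demand for up-to-date information?

FRESH_HINTS = {"latest", "today", "updates", "score", "live", "news"}

def wants_fresh_results(query):
    """True if any query word hints at a freshness-seeking search."""
    return any(word in FRESH_HINTS for word in query.lower().split())

print(wants_fresh_results("latest cricket score"))  # True
print(wants_fresh_results("history of cricket"))    # False
```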

How do search engines deliver search results so fast?

In indexing, as well as in serving results in response to queries, search engines look for meaningful expressions (= keywords). Once the search engine extracts specific keywords from the search query, it looks up web pages that are valuable for those keywords. That takes only a fraction of a second because the search giants have extremely powerful computers for the job. That is how we get millions of results in a fraction of a second.

How do search engines find the true meaning of search queries?

When typing a search query, we start with whatever comes to mind first. If we feel that what we typed does not make full sense, we add qualifiers. The query is often not very clear and can sometimes have more than one meaning.

If you search for 'power solutions' on Google, it will try to work out whether you are searching for an electricity provider near you, an article on electric power issues and their solutions, a liquid solution with strong cleaning power, ways to deal with political power, or something else.

The same confusion occurs when deciding what keywords a web page contains.

Early search engines were 'dumb'; they just looked at the search query and matched it with entries in their index. Smart webmasters made fools of them by stuffing keywords into their web pages and pushing useless pages to the top of search results. Then Google and others started punishing such artificially jacked-up content.

Search engines, especially Google, also built language models to better understand how different phrases with the same word can mean different things as the context or the way the query is made changes. Later, search engines started using machine learning to better understand the intent behind a search. Now they serve search results on the basis of many factors besides direct relevance, e.g. search settings, the searcher's location, what other searches were recently made on the same device, and which app the searcher had been using at the time.

The intent of the search also matters: one query may be for information and another for buying, yet both may have the same keywords. When the intent is clear, there is no issue; when it is not, location, earlier searches, etc come in handy. For example, 'fix a faulty faucet' tells the search engine that [perhaps] the searcher wants to get a faulty faucet repaired by a technician. So, the top search results for this query will mostly be local plumbing services and faucet sellers. At the same time, Google will also serve articles that advise on fixing a faulty faucet yourself. Very rarely, there may be some web pages that talk about the engineering behind faucets, but only ones that Google considers highly valuable.

Now, you might ask whether Google's index already had entries for 'repairing' and 'faucet' together. Yes. Instead of just indexing single words like 'faucet', Google has a way to index web pages against keywords of more than one word. In this case, it looks for index entries in which 'repair' and 'faucet' come together in a meaningful sequence. So, Google is likely to serve similar search results for any of the following queries: how to repair a faucet, what to do when a faucet leaks, ways of fixing faucets, how to fix taps so that they do not leak.
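One reason such differently worded queries can land on the same results is normalization: word forms and synonyms are mapped to a common term and filler words are dropped, so many phrasings reduce to the same multi-word keyword. Here is a toy sketch of that idea; the synonym and stop-word lists are invented for illustration and real engines use learned models instead:

```python
# Toy query normalizer: several phrasings reduce to one keyword set.

STOP = {"how", "to", "a", "the", "what", "do", "when", "of", "so",
        "that", "they", "not", "ways", "leaks", "leak"}
SYNONYMS = {"fixing": "repair", "fix": "repair", "repairing": "repair",
            "faucets": "faucet", "taps": "faucet", "tap": "faucet"}

def key_terms(query):
    """Reduce a query to its normalized, filler-free key terms."""
    words = [SYNONYMS.get(w, w) for w in query.lower().split()]
    return frozenset(w for w in words if w not in STOP)

queries = [
    "how to repair a faucet",
    "ways of fixing faucets",
    "how to fix taps so that they do not leak",
]
print({key_terms(q) for q in queries})  # all reduce to the same set
```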

Google goes a step further to judge the real value of a web page in relation to a keyword. Besides the exact keywords, it also looks for other expressions and elements on the page that indicate whether the page is relevant to the search query. An example given by Google itself illustrates this: if you search for 'dog', Google will not serve a page simply because 'dog' is written on it a hundred times. Instead, it checks whether the page really has useful information on dogs - information on dog foods, breeds, diseases and pet care, names of some breeds, diseases and vet hospitals - and whether the page has photos, videos and links pertaining to dogs.
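The dog example can be sketched as a crude topical score: count how many distinct dog-related terms a page covers rather than how often 'dog' appears. The related-term list here is invented for illustration; real relevance models are learned from data:

```python
# Toy topical relevance: distinct related terms beat raw keyword count.

RELATED = {"breed", "food", "disease", "vet", "care", "puppy", "training"}

def topical_score(text):
    """Count distinct dog-related terms present on the page."""
    words = set(text.lower().split())
    return len(words & RELATED)

spam = " ".join(["dog"] * 100)   # 'dog' stuffed 100 times
useful = "dog breed guide with food care and vet advice"
print(topical_score(spam), topical_score(useful))  # prints: 0 4
```

The keyword-stuffed page scores zero despite a hundred mentions of 'dog', while the genuinely informative page scores well.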

As mentioned above, the freshness algorithm senses whether the query is for updates. If you want to know 'Chicago weather', Google will give the latest weather updates for Chicago at the top of the search page; if you query 'dollar-rupee rate', it will give the latest exchange rate between those currencies. Maybe it will also give links to forex dealers near you.

Google penalties for black hat SEO

Let me make one thing very clear: good content is no guarantee that Google will show a web page high in its search results. Google and other major search engines do not serve results on the strength of quality and relevance alone. They give value to search engine optimization, or SEO, and therefore search-optimized content is likely to come up even if it is not the best and most relevant.

All webmasters know about SEO: the measures taken on websites and on specific web pages so that they come high in search results.

Search engines welcome ethical SEO - which guides search engines about content and relevance of web pages, but they hate black-hat SEO - which tries to fool search engines into believing that a poor quality web page is of very high quality and relevance. Filtering out such bad, spammy, web pages is one reason search engines keep changing their algorithms very regularly.

We do not know much about other search engines because they do not go public about their activities, but about Google we know this for sure: Google routinely carries out numerous algorithmic changes to improve the efficiency of search and also to penalize actions it regards as black hat SEO. Google has reported making as many as 3234 improvements to its search within the last year (i.e. 2018).

In the linked article, you can see a list of SEO actions Google likes and dislikes.

The future of search engines

It is estimated that, on average, nearly 2 trillion (= 2000 billion) searches are made in a year. More and more searches are now made on mobile devices, which has posed new challenges as well as opportunities for search engines. Mobile phones have made Google available all the time and everywhere. New developments in localization have also made it easy for users to search for everything around them on the go. Earlier, when we did not know details about something or someone, we looked it up on Google on our desktops. Now we go to Google when we plan a trip, look for a restaurant nearby or forget the way home while driving. The results for such queries have to be exact, instant and accompanied by useful links.

Another major change search engines are seeing is voice search. More and more people are searching by voice, especially on the move, when they are not comfortable typing fast, or on virtual assistants. The mind works differently when one types a query and when one speaks it into a microphone, and there are also issues of pronunciation and background noise. Search engines thus have to be even smarter in getting the query and its intent right.

Search queries are getting longer, and that is challenging the way web pages are indexed and served (e.g. plain/ with snippets/ with images/ video first/ etc). Though SEO experts have been emphasizing optimizing web pages for long keywords, just putting long keywords on web pages may not work for long if search engines prioritize or discard certain types of long phrases.

Google says it favors authority and quality, but it also seems to favor big firms and brands - sometimes more than quality - the argument apparently being that results relating to them are useful to searchers. Competing search engines do the same. Of course, for search engines focused on marketplaces (e.g. Amazon), buying intent is what matters.

SEO will continue to matter. It will adjust to the latest web technologies and algorithmic changes. Many SEO actions that looked natural earlier will come to be treated as black hat. Search engines will keep making arbitrary changes now and then so that SEO experts cannot precisely guess their methods.

Does Google give preference to paid results? 

Search engines need huge resources, so they must earn even while giving results away free. They serve paid results before organic (= naturally occurring) search results. In the sidebar (on wide devices) and at the top and bottom (especially on smaller devices), they place advertisements. They collect your browsing data, ostensibly to refine search but also with the intent of serving you targeted ads. There are many other ways search engines play with results - sometimes to help the searcher and sometimes for commercial reasons. Can you really blame them when you are getting so much information and convenience for free?

If you are interested in learning more about SEO, you may like to visit these resources: