A Brief History Of The Search Engine Phenomenon
It’s hard to believe that anyone in the ’90s actually knew about the old comic book characters Archie, Jughead, and Veronica (except for collectors, of course). But technology was changing rapidly at that time, and the Internet promised to become a big player. So, an entrepreneur had an idea.
He decided that the Internet could be used to search for information and websites through a central clearinghouse. He named this very first search engine “Archie.” (Actually it was “Archive” without the “v,” but that’s just a detail.)
Now the Archie search engine (if you can really call it that) was like an online phone book. It was simply a directory listing of websites based upon a search term – cars, movies, etc. – no information other than the site URL.
A year later, in 1991, “Veronica” and “Jughead” came along, followed in 1993 by Excite, Bot, and Aliweb – still just phone books that pulled up site names and URLs. Users had to keep pen and paper handy to write down the addresses they wanted to visit. Anyone who came of age in the early 2000s laughs at how archaic all of this sounds, but evolution, not revolution, is the “name of the game.”
It’s All About the Keyword Density
Jump Station, Alta Vista, Lycos, Web Crawler, and Magellan soon followed these old indexes with some innovations of their own. The designers decided that, rather than just pull up a listing of websites in no particular order, these sites should be ranked in some fashion, based upon the search keyword term.
If someone were searching with the keyword “car,” for example, the search engine would locate the sites that used the term “car” most often, and those sites would appear first in the results. You can only imagine what happened next. Entrepreneurs in the car business with a web presence merely had to get the word “car” into the text of their sites a huge number of times, and they would come up first in the search results. Thus began the phenomenon of “keyword stuffing.”
And savvy site owners also learned that packing the keyword into the early part of a page’s text was even more valuable, because search engines gave extra weight to terms near the top!
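To make the idea concrete, here is a toy sketch of that early term-frequency ranking. This is purely illustrative (no real engine’s code, and the site names are made up): pages that repeat the search term most often rank first, which is exactly why keyword stuffing worked.

```python
# Toy illustration of early keyword-frequency ranking.
# Pages that repeat the search term most often rank first --
# which is exactly what made "keyword stuffing" so effective.

def keyword_rank(pages, term):
    """Return page names ordered by how often `term` appears in their text."""
    term = term.lower()
    counts = {name: text.lower().split().count(term)
              for name, text in pages.items()}
    return sorted(counts, key=counts.get, reverse=True)

# Hypothetical pages: one honest, one stuffed with the keyword.
pages = {
    "honest-dealer.com": "We sell quality used cars at fair prices",
    "stuffed-site.com":  "car car car buy car cheap car best car deals car",
}
print(keyword_rank(pages, "car"))  # the stuffed site wins
```

Run against the keyword “car,” the stuffed site ranks first despite offering far less to the reader – the flaw that pushed engines toward better signals.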
“Value” is Added
By the late 1990s, innovators like Larry Page and Sergey Brin, then students at Stanford University, along with the founders of Yahoo and Ask, decided that there had to be a better way to rank sites – based upon the value of those sites to users. The rankings, they believed, should be a result of user reviews and this thing called “link analysis.”
So, if a site could get its “users” (or paid writers, actually) to write and add great reviews, then page ranking could be impacted positively. Also, if those reviews could be placed outside of the site, in article directories, for example, with links back to the website, all the better. So, the early attempts at “ranking” the value of a web site were human-driven – based upon the number of links back to a site that an entrepreneur could get.
The more links, the higher the ranking – links became like a ‘vote,’ and, as with any election, the most votes wins!
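The “links as votes” idea above can be sketched in a few lines. This is a hedged simplification (real link analysis, such as PageRank, also weighs *who* is doing the voting, and the site names here are invented): each inbound link counts as one vote, and the most-linked site wins.

```python
# Simplified "links as votes" ranking: every inbound link is one vote.
# Real link analysis (e.g. PageRank) also weights the linking site's
# own importance; this sketch just counts raw votes.

def vote_rank(links):
    """links: list of (from_site, to_site) pairs.
    Returns sites ordered by inbound-link count."""
    votes = {}
    for _, to_site in links:
        votes[to_site] = votes.get(to_site, 0) + 1
    return sorted(votes, key=votes.get, reverse=True)

# Hypothetical link graph: three sites point at shop.com, one at rival.com.
links = [
    ("blog-a.com", "shop.com"),
    ("blog-b.com", "shop.com"),
    ("directory.com", "shop.com"),
    ("blog-a.com", "rival.com"),
]
print(vote_rank(links))  # shop.com's 3 votes beat rival.com's 1
```

You can see immediately why article directories full of backlinks became such a popular (and abusable) tactic.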
Now, to be fair, Google did try to look at the quality of site content, but the algorithms to do this were pretty “prehistoric.” Other search engines (e.g., Direct Hit) began to massage their page rankings based upon the number of “click-throughs.”
Basically, they would monitor the number of times users clicked on a site that came up in a search, and alter page rankings based upon that. What’s not to love? This is democracy in action, right? Except that search-savvy site owners knew this and even paid people to conduct searches and click through to their sites!
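The click-through idea described above amounts to a simple re-ranking step. The sketch below is illustrative only (the site names and click counts are made up, and real engines combined this with many other signals): results are reordered so the most-clicked sites come first.

```python
# Illustrative click-through re-ranking, in the spirit of the
# Direct Hit approach described above: reorder search results
# by how often users have clicked each one.

def rerank_by_clicks(results, clicks):
    """Reorder `results` so the most-clicked sites come first.
    Sites with no recorded clicks default to zero."""
    return sorted(results, key=lambda site: clicks.get(site, 0),
                  reverse=True)

# Hypothetical results and observed click counts.
results = ["site-a.com", "site-b.com", "site-c.com"]
clicks = {"site-a.com": 12, "site-b.com": 340, "site-c.com": 57}
print(rerank_by_clicks(results, clicks))
```

Which also makes the abuse obvious: pay a few people to click your listing repeatedly, and the “votes” pile up.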
Going Vertical
Now, up until this time, searches were generally conducted using a single word – cars, movies, news, etc. But those in the know realized quite quickly that these “horizontal” searches would result in pages upon pages of websites that were of no relevance to the user.
So, the era of getting more specific with keywords was born. Instead of “news,” for example, you now type a more specific term, like “terrorism,” and you get news from every outlet with the word “terrorism” embedded within.
This is called “vertical” searching – rather than thousands upon thousands of pages related to “news” (horizontal), you can now get very specific with the type of news you want. And media was added as well! Now, a user can search for pictures of something.
Instead of “wall art,” for example, one can search for “pictures of contemporary wall art,” and get not just related sites but images as well!
Return Of The Human Factor
The first decade of the 21st century focused on search algorithms based upon keywords and links. These were objective, non-human factors that determined page rankings. Now, personalizing searches – remembering which sites an individual user has visited before – has become an important factor as well.
Suppose, for example, I want to go to Belize and look for property to purchase. I have conducted a number of searches with titles such as “Real Estate in Belize.” My search engine remembers this, so when I type in Belize again, the first sites presented will not be about the country, its history, or even hotels and tourist attractions. It will be about property in Belize!
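The Belize example can be sketched as a simple personalization boost. This is a hypothetical simplification (real personalization draws on far richer profiles; the result titles and history terms here are invented): results whose titles match a user’s past interests are moved to the top of an ambiguous query.

```python
# Hedged sketch of query personalization: if past searches suggest
# an interest (here, real estate), boost matching results for an
# ambiguous query like "Belize".

def personalize(results, history_terms):
    """Reorder results so titles matching past interests come first."""
    def score(title):
        # Count how many remembered interest terms appear in the title.
        return sum(term in title.lower() for term in history_terms)
    return sorted(results, key=score, reverse=True)

# Hypothetical search history and results for the query "Belize".
history = ["real estate", "property"]
results = [
    "Belize: History and Culture",
    "Top Hotels in Belize",
    "Belize Real Estate Listings - Property for Sale",
]
print(personalize(results, history))  # the property listing now leads
```

With no history, the same query would have led with the country’s history and hotels; the remembered searches change what “Belize” means for this user.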
Another really cool aspect of current searches is the ability of a search engine to know my location. So if, for example, I am in St. Louis, and I am searching for historical or art museums, the engine pinpoints my location and gives me, first of all, museum locations in the St. Louis area. Pretty cool actually!
Oh, Those Sneaky Search Engine Tactics!
In the ongoing pursuit of making Internet searches better for users, things have come a long way, baby! Today, here are some of the really critical features that major engines are now incorporating!
- Use of social media: This increases the human consumer factor significantly! Rather than simply relying on articles, blogs, keywords, and links, engines are now crawling through social media websites to discover content that has been produced and posted by humans who have experienced products and/or services. If a particular Italian restaurant in Atlanta, Georgia is getting rave reviews all over social media sites, Google and others take note, and this will improve page rankings! Real democracy in action!
- Focus on Content: Search engines have always looked for facts and information based upon search terms. Now, however, they have algorithms that look for facts and information that is unique and “fresh” – stuff that is not commonly found everywhere. When they find such content within a website, they take greater interest, and that interest improves page rankings! Thus, sites with blogs today would be wise to focus less on keywords and more on great content!
- New algorithms: Basically, algorithms as they relate to search engines refer to the rules and formulae used to determine the relevance of a site to a search that is conducted. These will search out and discover websites with too much advertising and too little content; they will reward sites with good, unique, and high-quality content; they will penalize sites that continue to rely on keyword “stuffing” and “Black Hat” linking schemes. Algorithm updates such as Penguin and Panda will only get better at this, so site owners beware!
See related post: Beware Of Old Link Building Techniques!
A Word About Twitter
Created by Jack Dorsey in 2006, Twitter has become far more than just famous people setting up accounts and reaching out to their “followers.” It has now become a “micro-blogging” platform that is having a huge impact on the future of marketing! When one considers that there are about 90 million tweets a day, that account holders can now receive suggestions of other individuals and businesses to follow, and that people can share news, articles, site content, blogs, and media with all of their followers, think of the endless possibilities!
As Twitter continues to expand, refine, and update its capabilities, it will become a wildly popular search tool in its own right! And the sharing of information, the making of recommendations, and the opinions on all sorts of products and services will drive entrepreneurs to establish a positive Twitter image. Imagine, for example, the following that Magic Johnson has on Twitter. If a company can contract with him to tweet endorsements of its products and/or services, think of the audience that would reach!
The Future? Put On Your Seat Belts!
As much as search engine technology is evolving, and at the rate it is evolving, it is impossible to predict the next phases accurately! The only advice is to stay “current,” and to adapt to the changes as they come!
Julie Ellis, a regular blogger for Premier Essay, has never found a topic to be uninteresting. “What I don’t know, I find out, and my creative ‘juices’ take care of the rest!” she states.